
Rigetti and Oxford Instruments Participate and Sponsor The City – Marketscreener.com

5 November 2021

Rigetti Computing, a pioneer in full-stack quantum computing, and Oxford Instruments are gold sponsors of the upcoming City Quantum Summit London 2021, which is taking place on Wednesday 10th November at Mansion House. The Summit will bring together founders and CEOs of quantum computing companies with the aim of clearly demonstrating the need for quantum computing in all sectors and industries, from sustainable development to medical revelations. The event is being hosted by William Russell, Lord Mayor of the City of London, in collaboration with Robinson Hambro.

"The City Quantum Summit is a great example of the level of collaboration required to secure ongoing quantum commercialisation here in the UK and we're pleased to be actively participating in the discussions and working closely with other leading players like Rigetti to accelerate real applications in quantum computing today," says Stuart Woods, Managing Director at Oxford Instruments NanoScience. Rigetti and Oxford Instruments are partners in a public-private consortium to deliver the first commercial quantum computer in the U.K.

Woods and Marco Paini, Technology Partnerships Director for Europe at Rigetti, will both be participating in the event as part of a panel discussion focused on financial modelling. The discussion will cover quantum computing applications for the financial sector and a project with Standard Chartered to analyse use cases including, for example, synthetic financial data generation and classification for implied volatility.

"Some of the most promising use cases for quantum computing are based in the financial sector. The ability to develop practical applications, like financial modelling, on real hardware puts us in a strong position to accelerate the commercialisation of quantum computing in the UK. Our Innovate UK consortium brings together industry experts and full-stack quantum expertise to make practical quantum computing a reality."

The Summit will be conducted as a hybrid event with the opportunity to join virtually as well as in person. You can find out more about how to register for the event here.

Disclaimer

Oxford Instruments plc published this content on 09 November 2021 and is solely responsible for the information contained therein. Distributed by Public, unedited and unaltered, on 09 November 2021 09:32:06 UTC.


Lost in Space-Time newsletter: Will a twisted universe save cosmology? – New Scientist

By Richard Webb

Albert Einstein's general theory of relativity didn't have to be

Hello, and welcome to November's Lost in Space-Time, the monthly physics newsletter that unpicks the fabric of the universe and attempts to stitch it back together in a slightly different way. To receive this free, monthly newsletter in your inbox, sign up here.

There's a kind of inevitability about the fact that, if you write a regular newsletter about fundamental physics, you'll regularly find yourself banging on about Albert Einstein. As much as it comes with the job, I also make no apology for it: he is a towering figure in the history of not just fundamental physics, but science generally.

A point that historians of science sometimes make about his most monumental achievement, the general theory of relativity, is that, pretty much uniquely, it was a theory that didn't have to be. When you look at the origins of something like Charles Darwin's theory of evolution by natural selection, for example (not to diminish his magisterial accomplishment in any way), you'll find that other people had been scratching around similar ideas surrounding the origin and change of species for some time as a response to the burgeoning fossil record, among other discoveries.

Even Einstein's special relativity, the precursor to general relativity that first introduced the idea of warping space and time, responded to a clear need (first distinctly identified with the advent of James Clerk Maxwell's laws of electromagnetism in the 1860s) to explain why the speed of light appeared to be an absolute constant.

When Einstein presented general relativity to the world in 1915, there was nothing like that. We had a perfectly good working theory of gravity, the one developed by Isaac Newton more than two centuries earlier. True, there was a tiny problem in that it couldn't explain some small wobbles in the orbit of Mercury, but they weren't of the size that demanded we tear up our whole understanding of space, time, matter and the relationship between them. But pretty much everything we know (and don't know) about the wider universe today stems from general relativity: the expanding big bang universe and the standard model of cosmology, dark matter and energy, black holes, gravitational waves, you name it.

So why am I banging on about this? Principally because, boy, do we need a new idea in cosmology now, and in a weird twist of history, it might just be Einstein who supplies it. I'm talking about an intriguing feature by astrophysicist Paul M. Sutter in the magazine last month. It deals with perhaps general relativity's greatest (perceived, at least) weakness: the way it doesn't mesh with other bits of physics, which are all explained by quantum theory these days. The mismatch exercised Einstein a great deal, and he spent much of his later years engaged in a fruitless quest to unify all of physics.

Perhaps his most promising attempt came with a twist (literally) on general relativity that Einstein played about with early on. By developing a mathematical language not just for how space-time bends (which is the basis of how gravity is created within relativity) but for how it twists, he hoped to create a theory that also explained the electromagnetic force. He succeeded in the first bit, creating a description of how massive, charged objects might twist space-time into mini-cyclones around them. But it didn't create a convincing description of electromagnetism, and Einstein quietly dropped the theory.
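For readers who want the mathematical flavour of that "twist": it is captured by the torsion tensor, the antisymmetric part of an affine connection. This is the standard textbook definition, not something drawn from the feature itself:

```latex
% Torsion: the antisymmetric part of an affine connection.
% In general relativity the Levi-Civita connection is symmetric, so T = 0;
% teleparallel gravity instead sets the curvature to zero and encodes
% gravity entirely in the torsion.
T^{\lambda}{}_{\mu\nu} = \Gamma^{\lambda}{}_{\mu\nu} - \Gamma^{\lambda}{}_{\nu\mu}
```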

Well, the really exciting bit, as Sutter describes, is that this "teleparallel gravity" seems to be back in a big way. Many cosmologists now think it could be a silver bullet to explain away some of the most mysterious features of today's universe, such as the nature of dark matter and dark energy, and the troublesome period of faster-than-light inflation right at the moment of the big bang that is invoked to explain features of today's universe, such as its extraordinary smoothness. Not only that, but there could be a way to test the theory soon. I'd recommend reading the feature to get all the details, but in the meantime, it's about as exciting a development as you'll get in cosmology these days.

Let's take just a quick dip into the physics arXiv preprint server, where the latest research is put up. One paper that caught my eye recently has the inviting title "Life, the universe and the hidden meaning of everything". It's by Zhi-Wei Wang at the College of Physics in China and Samuel L. Braunstein at the University of York in the UK, and it deals with a question that's been bugging a lot of physicists and cosmologists ever since we started making detailed measurements of the universe and developing cogent theories to explain what we see: why does everything in the universe (the strengths of the various forces, the masses of fundamental particles, etc.) seem so perfectly tuned to allow the existence of observers like us to ask the question?

This has tended to take cosmologists and physicists down one of two avenues. The first says things are how they are because that's how they're made. For some, that sails very close to an argument via intelligent design, aka the existence of god. The other avenue tends to be some form of multiverse argument: our universe is as it is because we are here to observe it (we could hardly be here to observe it if it weren't), but it is one of a random subset of many possible universes that happen to be conducive to intelligent life arising.

This paper examines more closely a hypothesis from British physicist Dennis Sciama (doctoral supervisor to the stars: among his students in the 1960s and 1970s were Stephen Hawking, quantum computing pioneer David Deutsch and the UK's astronomer royal, Martin Rees) that if ours were a random universe, there would be a statistical pattern in its fundamental parameters that would give us evidence of that. In this paper, the researchers argue that the logic is actually reversed. In their words: "Were our universe random, it could give the false impression of being intelligently designed, with the fundamental constants appearing to be fine-tuned to a strong probability for life to emerge and be maintained."

Full disclosure: I'm writing something on this very subject for New Scientist's 65th-anniversary issue, due out on 20 November. Read more there!

While I'm banging on about Einstein, I stumbled across one of my favourite features I've worked on while at the magazine the other day, and thought it was worth sharing. Called "Reality check: Closing the quantum loopholes", it's from 2011, a full 10 years ago, but the idea it deals with stretches back way before that and is still a very live one.

The basic question is: is quantum theory a true description of reality, or are its various weirdnesses (not least the entanglement of quantum objects over vast distances) indications of goings-on in an underlying layer of reality not described by quantum theory (or indeed any other theory to date)? I talked about entanglement quite a bit in last month's newsletter, so I won't go into its workings here.

The alternative idea of "hidden variables" explaining the workings of the quantum world goes back to a famous paper published by Einstein and two collaborators, Nathan Rosen and Boris Podolsky, back in 1935. It led Einstein into a long-drawn-out debate about the nature of quantum theory with another of its pioneers, Niels Bohr, that continued decorously right until Einstein's death in 1955. It wasn't until the 1980s that we began to have the theoretical and experimental capabilities to actually pit the two pictures against one another.

The observatories atop the volcano Teide on Tenerife were one scene of a bold test of quantum reality.


I love the story not just for this rich history, but also for the way that, after each iteration of the experiments (every time showing that quantum theory, and entanglement, are the right explanation for what is going on, whatever they might mean), the physicists found another loophole in the experiments that might allow Einstein's hidden-variable idea back into the frame again.

That led them to some pretty impressive feats of experimental derring-do to close the loopholes again. The feature opens with a group of modern physicists shooting single photons between observatories on Tenerife and La Palma in the Canary Islands. In an update to the story that we published in 2018 (with the rather explicit title "Einstein was wrong: Why normal physics can't explain reality"), they even reproduced the result with photons coming at us from galaxies billions of light years away, proving that, if not the whole universe, then a goodly proportion of it follows quantum rules. You can't win 'em all, Einstein.
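What these experiments actually test is the CHSH form of Bell's inequality: any local hidden-variable theory keeps the combination S of correlations within |S| ≤ 2, while quantum mechanics predicts up to 2√2. The sketch below (an illustration of the textbook prediction, not the experimenters' analysis code) computes S for a spin singlet, whose correlation at measurement angles a and b is E(a, b) = -cos(a - b):

```python
import math

def E(a, b):
    """Quantum correlation for a spin singlet measured along angles a and b (radians)."""
    return -math.cos(a - b)

# Angle settings that maximise the quantum violation of CHSH.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: local hidden variables require |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # ~2.828, i.e. 2*sqrt(2): the quantum prediction exceeds the classical bound
```

The measured value in the loophole-closing experiments lands on the quantum side of that bound, which is why the hidden-variable picture keeps losing.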

One reason I've been thinking particularly frequently about Einstein and his work lately is that I've been putting together the latest New Scientist Essential Guide, called Einstein's Universe. It's a survey of his theories of relativity and all those things that came out of them: the big bang universe and the standard model of cosmology, dark matter and energy, gravitational waves, black holes and, of course, the search for that elusive unifying theory of physics. I'm just putting the finishing touches to the Essential Guide with my left hand as I type this, and I think it's a fair expectation that you'll find me banging on about that (and Einstein) a lot more next month.

1. Talking of fine-tuned universes, if you haven't done so already, you can still catch up with Brian Clegg's New Scientist Event talk, The Patterns That Explain the Universe, from last month, available on demand.

2. If you're a fan of big ideas (I hope that's why you're here) and like casting your net a little wider than just physics, then a ticket to our Big Thinkers series of live events gives you access to 10 talks from top researchers from across the board, including Harvard astronomer Avi Loeb on the search for extraterrestrial life and Michelle Simmons and John Martinis on quantum computing.

3. It happened just after my last newsletter, but it would be remiss not to mention the awarding of this year's Nobel prize to three researchers who played a leading role in advancing our understanding of chaotic systems, notably the climate. You can find out more about what they did here.


Report: Southeast New Mexico’s rural school districts struggle in computer science – Carlsbad Current Argus


A recent report shows New Mexico has made progress in improving access to computer science education but may be struggling in some areas.

According to the 2021 State of Computer Science Education report, 44% of public high schools offered a foundational computer science course during the 2020-21 school year, compared to 32% in 2019-20 and 23% in 2018-19.

In Southeast New Mexico, 17 out of the region's 33 school districts do not offer foundational computer science classes, according to Code.org's Computer Science Access Report.

Some of these districts include Jal Public Schools, Artesia Public Schools and Tularosa Municipal Schools.

In suburban areas, 50% of students have access to computer science classes. That number drops to 46% in urban areas, 43% in rural areas and 38% in towns.

Schools with a high number of students on the Free or Reduced Lunch (FRL) program also have less access, per the report.


Schools that have more than half of their students on FRL are 13% to 25% less likely to have computer science classes, according to the report.

Carlsbad Municipal Schools Superintendent Dr. Gerry Washburn said computer science and IT have become an essential part of the region's industries, like oil and gas, and there is a need for professionals in the field.

According to the report, there are an average of 2,925 job openings for computer science positions in New Mexico each month.

High schools across the region have created programs aimed at preparing students for these positions.


Carlsbad High School offers a variety of computer science classes as part of its new academy system that was implemented this year.

Under the Academy of Business Information Technology, students can take computer programming classes and even get paid to work as computer technicians for the district, according to the CHS curriculum guide.

High schools in Hobbs and Loving also offer computer science classes on topics like web design and computer graphics as electives. Alamogordo High School and Roswell High School also have computer science classes as part of their career and technical training programs.

New Mexico is one of the states that have adopted plans to improve access to computer science education, along with Alabama, Massachusetts and Oklahoma.

"Computer science is vital for each New Mexico student's education, empowering them to skillfully navigate life, education and career opportunities," Public Education Secretary (Designate) Kurt Steinhaus said. "We are pleased to see continued improvement to computer science education in New Mexico. NMPED will continue to prioritize and promote computer science so all students have access to this promising career path."


The PED's computer science task force released a five-year plan in June 2021, which includes the creation of new policies and teacher certifications for computer science. Under the plan, every high school in the state will have higher-level computer science and IT classes by 2026.

In February 2021, state legislators passed House Bill 188, which aims to create a license endorsement in secondary computer science. New Mexico is also the first state to create two positions that oversee computer science, in the Math and Science Bureau and the College and Career Readiness Bureau.

Claudia Silva is a reporter from the UNM Local Reporting Fellowship. She can be reached at csilva2@currentargus.com, by phone at (575) 628-5506 or on Twitter @thewatchpup.


Washington People: Chenyang Lu – The Source – Washington University in St. Louis Newsroom

Chenyang Lu is not a civil engineer.

For a computer scientist, though, he builds a lot of bridges. Particularly between the fields of computer science and health care.

Lu is the Fullgraf Professor in the Department of Computer Science & Engineering at Washington University in St. Louis' McKelvey School of Engineering. His research is concerned with the Internet of Things (IoT), cyber-physical systems and artificial intelligence, and he is particularly keen on how these technologies can improve health care.

As part of several teams with surgeons and physicians, Lu has been testing Fitbit activity trackers in studies that have shown that these relatively inexpensive wearable devices can play a valuable role in improving patient health.

"We can collect data such as step count, heart rate and sleep cycles, which we use with our machine-learning models to predict deterioration or improvement in a patient's health status," Lu said. "These efforts demonstrate tremendous potential for wearables and machine learning to improve health care."

Using data from Fitbits, for example, Lu and his collaborators have demonstrated the ability to predict surgical outcomes of pancreatic cancer patients with higher success than the current risk assessment tool.

The goal is improved health care, but where some of the most challenging problems arise, Lu finds engineering solutions. Obstacles can include subpar data, or simply not enough data to get useful information from wearable devices.

"You have to extract features using engineering techniques," Lu said. "How do we take this noisy, lousy data from wearables and extract robust and predictive features to generate something clinically meaningful and informative so we can actually predict something?"
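Turning a noisy wearable stream into stable model inputs is largely a feature-engineering exercise. The sketch below is a minimal, hypothetical illustration of the idea (not Lu's actual pipeline): it rolls a window over a daily step-count series, drops missing days rather than treating them as zero steps, and emits summary features a downstream model could consume.

```python
import statistics

def daily_features(step_counts, window=7):
    """Compute simple rolling features from a noisy daily step-count series.

    None entries represent days the wearable was not worn; they are
    dropped from each window rather than counted as zero steps.
    """
    features = []
    for i in range(window - 1, len(step_counts)):
        valid = [s for s in step_counts[i - window + 1 : i + 1] if s is not None]
        if len(valid) < 3:  # too few observations to trust this window
            features.append(None)
            continue
        features.append({
            "mean": statistics.mean(valid),
            "stdev": statistics.stdev(valid),   # day-to-day variability
            "trend": valid[-1] - valid[0],      # crude slope across the window
        })
    return features

# A week and a half of step counts, with two missed days and a late slump.
counts = [4200, None, 3900, 5100, 4800, None, 1200, 900, 1100]
feats = daily_features(counts)
```

Here the sharp drop in the rolling mean and the negative trend are the kind of "robust and predictive features" a clinical model might flag, even though individual daily readings are noisy or missing.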

Getting useful information from messy data is one of the reasons his colleagues at the School of Medicine value his partnership.

"Chenyang has established himself as an expert in how to interpret and connect the dots for this high-dimensional data. That's why I think he's so prolific," said Philip Payne, the Janet and Bernard Becker Professor and director of the Institute for Informatics, associate dean for health information and data science and chief data scientist at the School of Medicine.

Prolific he may be, but Lu is eager to do more.

"We can scale this up to perfect the technology and broaden the scope so it can be used on larger groups of different types of patients," Lu said. "I look forward to collaborating with even more physicians and surgeons to expand this work."

Read more on the engineering website.


GA Dept of Ed Awards Third Round of Grants to Build Computer Science Teacher Capacity – All On Georgia

The Georgia Department of Education is awarding a third round of grants to school districts to help them build teacher capacity around computer science education, State School Superintendent Richard Woods announced today. Seventeen school districts will receive the grants, for a total allocation of $279,000.

Computer science (CS) has become a high-demand career field across multiple industries, and it includes skills all students need to learn. Thus far, the largest challenge for school districts in building this new discipline has been building teaching capacity, though that capacity is expanding. There are currently 615 credentialed CS teachers and just under 1,000 middle and high schools in Georgia. That's up from 403 teachers in 2020 and 250 in 2019.

The grant provides funding for teachers to participate in professional learning opportunities, including credential programs, to help mitigate the remaining gap.

"Technology has emerged as a vital component of our daily lives, with its impact growing stronger each day," Lieutenant Governor Geoff Duncan said. "For Georgia to maintain its role as a national leader in economic innovation, we must continue to prioritize investments in education and workforce development. I commend the Georgia Department of Education for working to meet this growing demand for computer science professionals in the Peach State."

"We must continue preparing our children for the jobs and opportunities they'll encounter in the future, not just for the economy of today," Superintendent Woods said. "These CS Capacity Grants help school systems build a pipeline of qualified computer science teachers within their district to ensure children are prepared with the technical skills and the experience in problem-solving and real-world thinking that will serve them well in any career they choose."

Funds were awarded through a competitive application process, with priority given to school systems serving highly impoverished and/or rural communities.

The grant is aligned to GaDOE's Roadmap to Reimagining K-12 Education, which calls for setting the expectation that every child, in every part of the state, has access to a well-rounded education, including computer science.

The grant recipients are: Cherokee County Schools, Colquitt County Schools, Dade County Schools, Decatur County Schools, Effingham County Schools, Emanuel County Schools, Fayette County Schools, Gainesville City Schools, Glynn County Schools, Greene County Schools, Jefferson City Schools, McIntosh County Schools, Pike County Schools, Pulaski County Schools, Seminole County Schools, Walker County Schools and Wayne County Schools.


Tableau to add new business science tools to analytics suite – TechTarget

Self-service data science capabilities and integrations with Slack highlight the latest Tableau analytics platform upgrades.

Tableau, founded in 2003 and based in Seattle, unveiled new capabilities -- not all of which are generally available -- on Tuesday during Tableau Conference 2021, the vendor's virtual user conference.

Both the self-service data science capabilities and Slack integrations are slated for general availability in 2022. And according to Francois Ajenstat, Tableau's chief product officer, the impetus for developing both was Tableau's mission of enabling people to better see and understand data.

"What we're trying to do is make analytics easier to use for anyone, anywhere, with any data, and we're trying to do that by broadening access to analytics so that it's available to everyone," he said on Nov. 8 during a virtual press conference. "We need to make analytics dramatically easier to use -- more accessible and more collaborative."

The self-service data science capabilities aim to extend predictive analytics capabilities beyond data scientists while the integrations with Slack enable collaboration, Ajenstat continued.

"We're going to extend the advanced capabilities of the platform, [and] we're going to bring analytics in the flow of business so that everyone can be a data person," he said.

Tableau, which was acquired by Salesforce in June 2019, introduced the concept of business science in March 2021 when it unveiled its first integration with Salesforce's Einstein Analytics in Tableau 2021.1. Business science, as defined by Tableau, is enabling business users without training in statistics and computer science with data science capabilities using augmented intelligence and machine learning.

The integration added Einstein Discovery -- a no-code tool in Salesforce's Einstein Analytics platform -- to Tableau to give Tableau customers the ability to do predictive modeling and generate prescriptive recommendations.

In 2022, Tableau analytics platform updates will build on the initial business science capabilities enabled by Einstein Discovery with the additions of Model Builder and Scenario Planning.

Model Builder is designed to help teams collaboratively develop and consume predictive models within their Tableau workflows using Einstein Discovery. Scenario Planning, meanwhile, takes advantage of Einstein's AI capabilities to enable users to compare scenarios, build models and understand expected outcomes to better make data-driven decisions.


Once generally available next year, the two capabilities will be significant for users, according to Mike Leone, an analyst at Enterprise Strategy Group.

"The business science capabilities are very interesting to me," he said. "With solutions that enable business agility being so important today, Model Builder and Scenario Planning will prove to be a difference-maker in the future direction of modern businesses."

The key to their effectiveness will be their ability to simplify what are otherwise complex tasks, Leone continued.

"What-if analysis can be a difficult endeavor without the right data and tools, but with Tableau expanding their business science capabilities, they are looking to drastically simplify planning, prediction and outcome analysis in a collaborative way rooted in AI and self-service," he said.

Likewise, Boris Evelson, an analyst at Forrester Research, noted that Tableau is at the forefront of adding augmented analytics capabilities to its platform, but noted that the vendor is not alone in that respect.

"What we now call Augmented BI -- BI infused with AI -- is a capability that most modern BI platforms have already invested in," he said.

"In our recent [research], Tableau was found as one of the leaders in that evaluation, but so were Microsoft, Oracle, Tibco, Sisense and Domo," Evelson said, referring to other tech giants and independent vendors with popular analytics systems.

In addition to the added business science capabilities, Tableau analytics platform updates next year will include three new integrations with Slack, which Salesforce acquired for $27.7 billion in December 2020.

When ready for release, Tableau users will have access to Ask Data in Slack, Explain Data in Slack and Einstein Discovery in Slack.

Ask Data is an AI capability that enables users to query data using natural language rather than requiring them to write code, while Explain Data automatically generates explanations of data points. Both were first introduced in 2019 and were upgraded in Tableau's June 2021 analytics platform update.

"The Slack integration is very notable," Evelson said.

It's notable, he continued, because most people who use data to inform their decisions don't spend the majority of their time in their analytics platform environments. Instead, they spend most of their time in work applications and collaboration platforms.

"BI platforms are not what we call systems of work -- digital workspaces where people live 9 to 5," Evelson said. "These decision-makers' systems of work are ERP, CRM, planning/budgeting or other business applications, or productivity applications such as email, calendar, and increasingly -- due to remote work -- collaboration tools like Slack."

Similarly, Leone noted that Tableau's integrations with Slack are in line with what data workers now require and have the potential to enable new discovery.

"I think collaboration enablement is critical to the next level of data-centric adoption, and Tableau's announcements are aligned perfectly," he said. "Imagine being in a Slack channel and seeing a team member ask a data question with an instant answer. That answer could introduce a new data angle to attack a problem or spark an open discussion that leads down a new innovation path."

Beyond the expansion of its business science capabilities and Slack integrations, Tableau unveiled a host of other features both generally available now and scheduled for upcoming analytics platform updates.

Among them are security, administration, community engagement and collaboration tools.

Virtual Connections, Centralized Row-Level Security, the Tableau Exchange and Hire Me button in Tableau Public are now generally available.

Connected Applications will be available with the release of Tableau 2021.4 before the end of the year, and all other capabilities are expected to be generally available in 2022, according to Tableau.

Beyond helping users better see and understand their data, data governance is a priority for Tableau, according to Ajenstat. Features such as Centralized Row-Level Security and Share Prep Flows are aimed at addressing the trustworthiness and security of data.

"Having access to data is really important, but we need to also make sure people can trust the data," Ajenstat said. "If you can't trust the data, you can't trust the insight, and this is why we need to make trust a fundamental part of the Tableau platform."

Enterprise Strategy Group is a division of TechTarget.


Facebook Isn’t Shutting Down Its Facial Recognition System After All – News @ Northeastern – News@Northeastern

Facebook recently announced that it would be shutting down its facial recognition system and deleting its store of face-scan data from the billion people who opted in to the system. In a press release from Meta, the newly minted parent company of Facebook, the vice president of artificial intelligence heralded the change as "one of the largest shifts in facial recognition usage in the technology's history."

Ari E. Waldman is a professor of law and computer science. He is also the faculty director for Northeastern's Center for Law, Information and Creativity. Courtesy photo

But Northeastern legal scholars see it differently, and are renewing their calls for government oversight of facial recognition and other advanced technologies.

"This is yet another example of Facebook misdirection," says Ari E. Waldman, professor of law and computer science at the university. "They're deleting the face scans but keeping the algorithmic tool that forms the basis of those scans, and are going to keep using the tool in the metaverse."

Indeed, in an interview shortly after the announcement, Meta spokesperson Jason Grosse told a Vox reporter that the company's commitment not to use facial recognition doesn't apply to its metaverse products, a suite of interactive tools that Meta engineers are building to create a virtual, simulated environment layered over the physical one we interact in day to day.

"What this announcement amounts to is that Facebook is throwing out the data they don't need anymore," says Waldman, who also serves as faculty director for Northeastern's Center for Law, Information and Creativity. "This is like when your neighbor is playing music really loud, and they stop not because you're concerned about the sound, but because they're done practicing."

Woodrow Hartzog is a professor of law and computer science at Northeastern University. Photo by Matthew Modoono/Northeastern University

While the announcement isn't the paradigm-shifting change its Meta authors might have users believe, it does have an upside, says Woodrow Hartzog, professor of law and computer science at Northeastern.

"I appreciate the fact that this announcement gives a little bit more fuel for privacy advocates who argue that facial recognition is not inevitable, it's not invaluable, and it's not toothpaste that's out of the tube; we can change the way we use this technology," he says.

Far from signaling an end to widespread facial recognition technology, the move by Facebook may create a vacuum in the market that other tech companies will race to fill with their own databases of paired face-name data, Hartzog says. "We may see other companies hit the accelerator now that there's a market opportunity opened up by Facebook," he says.

Taken together, the law scholars say, the landscape of facial recognition and other biometric data-collection software is one that requires lawmakers to create oversight rather than relying on tech companies to self-regulate.

"It shows that existing laws continue to drive a need for a more robust and stable regulatory framework for biometrics and other surveillance tools," Hartzog says.

For media inquiries, please contact Shannon Nargi at s.nargi@northeastern.edu or 617-373-5718.


Top 10 Best Women Programmers of All Time – Analytics Insight

In a patriarchal society, female achievers are often forgotten by the world. We are well versed in the fact that there are many successful male programmers, but how many of us know about the women programmers whose contributions to computer science and technology are just as significant? Quite a few, right? Many female programmers have contributed a great deal to computer programming: women were among the first programmers in the early 20th century and contributed largely to the industry. With the advancement of technology, the role of women as programmers has grown enormously, yet it remains obscure. Since the 19th century, women have worked in programming and scientific computation, but gender disparity has kept them from the limelight enjoyed by their male counterparts. Nevertheless, women continue to work hard and make significant contributions to the programming and IT industry, and they hold significant roles at multi-tech companies such as Yahoo, Google, Microsoft, and many more.

Here's the list of the top 10 best women programmers of all time:

Ada Lovelace, also known as Augusta Ada King, is regarded as the first female computer programmer. An extremely good mathematician and writer, she was an influential pioneer in the field of computing research and programming. She is primarily known for her work on Charles Babbage's proposed mechanical general-purpose computer, the Analytical Engine. She was the first person to recognize that such a machine could do far more than calculation, and she published the first algorithm intended to be carried out by one.
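That first algorithm, published in Lovelace's Note G, computed Bernoulli numbers. A modern Python sketch of the same computation, using the standard recurrence rather than her original tabular method (the `bernoulli` helper is our own illustration, not from the article):

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list:
    """Return the Bernoulli numbers B_0..B_n as exact fractions.

    Uses the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    rearranged to B_m = -(1 / (m + 1)) * sum_{j<m} C(m+1, j) * B_j.
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B
```

For example, `bernoulli(4)` ends in B_4 = -1/30, one of the values Lovelace's table of operations was designed to produce.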

Grace Brewster Murray Hopper was an American computer scientist and programmer who wrote one of the first linkers and popularized the term "debugging" for fixing programming errors and technical glitches. She wrote one of the first compilers and publicized the idea of machine-independent programming languages. She created the FLOW-MATIC programming language and later helped in the development of COBOL, a high-level programming language.

Joan Clarke was a cryptanalyst who became famous for her role as a codebreaker in the Second World War, the only woman to work alongside Alan Turing on breaking the German Enigma messages. Due to gender bias, she was paid less than her male co-workers in the same position; to get around the lack of a suitable job title for a female cryptanalyst, she was reclassified as a linguist. She is regarded as one of Britain's code-breaking geniuses.

Margaret Heafield Hamilton is well known for her work as a computer scientist, systems engineer, and business owner. She introduced the term "software engineering" and became head of the software engineering division at the MIT Instrumentation Laboratory, which developed the onboard flight software for the Apollo space program. She designed the system to be asynchronous, giving priority to the most important functions and rejecting the rest under overload.

Adele Goldberg is a computer scientist known for developing the programming language Smalltalk-80 and various other object-oriented programming concepts. She introduced a programming environment of overlapping windows on graphical display screens and was involved in designing the templates and patterns used in modern software. Apple implemented her methods in its Macintosh computers.

Frances Elizabeth Allen was a computer scientist best known for her work on optimizing compilers, program optimization, and parallelization. She worked on programming-language compilers for IBM Research and was named an IBM Fellow, the company's highest recognition for its scientists, engineers, and programmers. She also introduced many algorithms and implementations for automatic program-optimization technology.

Barbara Liskov is a successful programmer who won the Turing Award and is known for developing the Liskov substitution principle. She has worked on various important projects, including the Venus operating system, an affordable and interactive timesharing system; Argus, a high-level language supporting distributed programs that demonstrated the technique of promise pipelining; and Thor, an object-oriented database system. Barbara also led the Programming Methodology Group at MIT.
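The Liskov substitution principle holds that objects of a subtype must be usable anywhere the supertype is expected without breaking the caller. A minimal Python illustration (the shapes example is our own, not from the article):

```python
import math
from abc import ABC, abstractmethod

class Shape(ABC):
    """Supertype contract: every Shape reports its area."""
    @abstractmethod
    def area(self) -> float: ...

class Rectangle(Shape):
    def __init__(self, width: float, height: float):
        self.width, self.height = width, height
    def area(self) -> float:
        return self.width * self.height

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius
    def area(self) -> float:
        return math.pi * self.radius ** 2

def total_area(shapes: list) -> float:
    # Relies only on the Shape contract, so any well-behaved subtype
    # can be substituted here without this caller changing or breaking.
    return sum(s.area() for s in shapes)
```

A subtype that violated the contract, say by raising an unexpected error or returning a negative area, would break `total_area` and thus the principle.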

She was an American computer scientist and the only woman on the ten-member IBM team that developed FORTRAN, a high-level programming language. She examined the flow of programs produced by the compiler and developed the first syntactic analyzer of arithmetic expressions.

Shafrira Goldwasser is an American computer scientist and another winner of the Turing Award, which she received in 2012. She is well known for her work in computational complexity theory, cryptography, and computational number theory, and she helped create probabilistic encryption and zero-knowledge proofs, a cryptographic protocol. She is currently a professor in the department of electrical engineering and computer science at MIT.

Anita Borg was a renowned American computer scientist and the founder of the Institute for Women and Technology. She started the Systers mailing list in 1987. While working at Digital Equipment Corporation, she developed a technique for generating and analyzing address traces used in designing high-speed memory systems. President Bill Clinton also appointed her to the Commission on the Advancement of Women and Minorities in Science, Engineering and Technology.


Playing safe: Ways to manage cyber threats – The Financial Express

Protect your keys to the digital world; create complex passwords for all your accounts and applications, use a different password for each, and change them regularly.

By Ritesh Chopra

The real world has become inextricably intertwined with the digital world, and an online presence comes with the risk of exposure to cyberthreats. Yet people's behaviour, by and large, seems to reflect a disregard for cyber safety: from readily clicking on SMS links and using public Wi-Fi to bank and shop online, through to sharenting and oversharing on social media. The Norton Cyber Safety Insights Report, a study conducted online by The Harris Poll on behalf of NortonLifeLock from May 20 to June 8 among more than 1,000 Indian adults, revealed that 82% of Indians say the amount of time they spend in front of a screen, aside from work or school purposes, has increased significantly during the pandemic, likely owing to the new norms of working, learning, and shopping from home. During this time, cybercriminals have been targeting consumers with an increased number of sophisticated attacks and well-coordinated scams, and daily news reports about data breaches, identity theft, and cyberbullying are now worryingly common.

One possible reason for this situation is that not all consumers are well informed about how to protect their internet-enabled devices and online activities. While they may have heard terms such as phishing, malware, and creeping, not everyone knows the protective measures they could take. The Dark Web, as an unregulated space where operators and users can act anonymously to avoid being traced, has become a breeding ground for illegal online activities, including the trading of people's personal data, and the impact often spills over into the real world.

It is difficult for the user to know if the information they think is securely on their device is being tracked or monitored by a third party. Often, users themselves unwittingly provide access to their information when they select options like accept all cookies, save password/account details for future use or grant activity surveillance across platforms without a second thought. The price for such momentary convenience can sometimes be a heavy one to pay.

The Norton Cyber Safety Insights 2021 Report showed that four in five Indian adults (82%) admitted to using personal information in their passwords, most notably their own name (38%), their child(ren)'s name (27%), their pet's name (23%), or a current (22%) or former (19%) partner's name. Indian adults do, however, take some security precautions: 72% of those with a Wi-Fi router change their Wi-Fi password more than once a year.

Protect your keys to the digital world: create complex passwords for all your accounts and applications, use a different password for each, and change them regularly. Do not share your passwords with anyone. Use a trusted Virtual Private Network (VPN) to secure your network and ensure the privacy of your digital activities. Practice safe computing: do not click on suspicious links or visit unsecured websites, and download software updates only from trusted, secure sources. Finally, use a comprehensive internet security solution; it is a small but worthy investment. Nobody wants their personal and financial data accessed and misused by cybercriminals.
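The password advice above can be sketched in Python. The standard-library `secrets` module provides cryptographically strong randomness suited to this purpose (the `make_password` helper and the site names are illustrative, not from the article):

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per account, as the article advises:
vault = {site: make_password() for site in ("bank", "email", "shopping")}
```

In practice a password manager does this job; the point is that each password is long, random, and unique, so a breach at one site does not expose the others.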

Cyberattacks are on the rise against businesses, governments, and individuals alike. Many of the existing threats can be warded off by being aware and mindful of our actions and by keeping abreast of developments in the cyber landscape. Taking adequate care at the individual level and installing internet security solutions on devices, together, can go a long way.

The writer is director, Sales and Field Marketing, India & SAARC Countries, NortonLifeLock



Internet Security Software Market Survey 2021 with Top Countries Data: Trend, Future Demand and Leading Players Updates by Forecast, Impact of…

United States/WA: Market Will Boom In Near Future

According to recent research published on the Internet Security Software market, the report covers industry size, status, market trends, and forecasts, and also provides brief information about specific growth opportunities among competitors and key market drivers.

Market Overview:

The Internet Security Software Market Report 2021 contains a comprehensive industry analysis of development components, patterns, flows, and sizes. The report also uses current and past market values to forecast potential market performance over the period 2021 to 2027. The research study makes widespread use of both primary and secondary data sources.

Major Prominent Key Vendors are:Juniper Networks, Inc., Trend Micro Inc., Symantec Corporation, IBM Corporation, Kaspersky Lab, McAfee Inc., Cipher Cloud, CA Technologies, Cisco system Inc., Websense, Inc.( Forcepoint), Fortinet, Inc., Sophos Ltd., Dell, Check Point Software Technologies Ltd., SafeNet, Inc., and Cyren Ltd

This includes investigating various parameters that affect the industry, such as government policy, market environment, competitive environment, historical data, current market trends, innovations, future technologies, and technological advances in related industries. The report is specifically focused on the Internet Security Software market in North America, Europe, Asia Pacific, Latin America, the Middle East, and Africa, and fully categorizes the market by region, type, and application.

Request A Sample Copy:https://www.coherentmarketinsights.com/insight/request-sample/1469

About Coherent Market Insights:

Coherent Market Insights is a prominent market research and consulting firm offering action-ready syndicated research reports, custom market analysis, consulting services, and competitive analysis through various recommendations related to emerging market trends, technologies, and potential absolute dollar opportunity.

Contact Us:

Name: Mr. Shah

Phone: US +12067016702 / UK +4402081334027

Email: [emailprotected]
