
Why Quantum Computing Gets Special Attention In The Trump Administration’s Budget Proposal – KUT

From Texas Standard:

The Trump administration's fiscal year 2021 budget proposal includes significant increases in funding for artificial intelligence and quantum computing, while cutting overall research and development spending.

If Congress agrees to it, funding for artificial intelligence, or AI, would nearly double, and quantum computing would receive a 50% boost over last year's budget, with funding slated to double to $860 million by 2022. The administration says these two fields of research are important to U.S. national security, in part because China also invests heavily in them.

Quantum computing uses quantum mechanics to solve highly complex problems more quickly than they can be solved by standard or classical computers. Though fully functional quantum computers don't yet exist, scientists at academic institutions, as well as at IBM, Google and other companies, are working to build such systems.

Scott Aaronson is a professor of computer science and the founding director of the Quantum Information Center at the University of Texas at Austin. He says applications for quantum computing include simulation of chemistry and physics problems. These simulations enable scientists to design new materials, drugs, superconductors and solar cells, among other things.

Aaronson says the government's role is to support basic scientific research, the kind needed to build and perfect quantum computers.

"We do not yet know how to build a fully scalable quantum computer. The quantum version of the transistor, if you like, has not been invented yet," Aaronson says.

On the software front, researchers have not yet developed applications that take full advantage of quantum computing's capabilities.

"That's often misrepresented in the popular press, where it's claimed that a quantum computer is just a black box that does everything," Aaronson says.

Competition between the U.S. and China in quantum computing revolves, in part, around the role such a system could play in breaking the encryption that makes things secure on the internet.

Truly useful quantum computing applications could be as much as a decade away, Aaronson says. Initially, these tools would be highly specialized.

"The way I put it is that we're now entering the very, very early, vacuum-tube era of quantum computers," he says.

Written by Shelly Brisbin.

Continued here:
Why Quantum Computing Gets Special Attention In The Trump Administration's Budget Proposal - KUT


The $600 quantum computer that could spell the end for conventional encryption – BetaNews

Concerns that quantum computing could place current encryption techniques at risk have been around for some time.

But now cybersecurity startup Active Cypher has built a password-hacking quantum computer to demonstrate that the dangers are very real.

Using easily available parts costing just $600, Active Cypher's founder and CTO, Dan Gleason, created a portable quantum computer dubbed QUBY (named after qubits, the basic unit of quantum information). QUBY runs recently open-sourced quantum algorithms within a quantum emulator capable of performing cryptographic cracking routines. Calculations that would otherwise have taken years on conventional computers are performed in seconds on QUBY.

Gleason explains, "After years of foreseeing this danger and trying to warn the cybersecurity community that current cybersecurity protocols were not up to par, I decided to take a week and move my theory to prototype. I hope that QUBY can increase awareness of how the cyberthreats of quantum computing are not reserved to billion-dollar state-sponsored projects, but can be seen on a much smaller, localized scale."

The concern is that quantum computing will lead to the sunset of AES-256 (the current encryption standard), meaning all encrypted files could one day be decrypted. "The disruption that will come about from that will be on an unprecedented, global scale. It's going to be massive," says Gleason. Modelled after the SADM, a man-portable nuclear weapon deployed in the 1960s, QUBY was downsized so that it fits in a backpack and is therefore untraceable. Low-level 'neighborhood hackers' have already been using portable devices that can surreptitiously swipe credit card information from an unsuspecting passerby. Quantum-compute-emulating devices will open the door to significantly more cyberthreats.

In response to the threat, Active Cypher has developed advanced dynamic cyphering encryption that is built to be quantum resilient. Gleason explains: "Our encryption is not based on solving a mathematical problem. It's based on a very large, random key which is used in creating the obfuscated cyphertext, without any key information within the cyphertext, and is thus impossible to be derived through prime factorization -- traditional brute force attempts which use the cyphertext to extract key information from patterns derived from the key material."

Active Cypher's completely random cyphertext cannot be deciphered using even large quantum computers since the only solution to cracking the key is to try every possible combination of the key, which will produce every known possible output of the text, without knowledge of which version might be the correct one. "In other words, you'll find a greater chance of finding a specific grain of sand in a desert than cracking this open," says Gleason.
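
As a rough back-of-the-envelope illustration of that key-space argument, the sketch below compares the number of possible 256-bit keys with a commonly cited estimate of the number of grains of sand on Earth. The sand figure and the comparison itself are assumptions for illustration, not numbers from Active Cypher.

```python
# Back-of-the-envelope comparison: 256-bit key space vs. grains of sand.
# The sand estimate (~7.5e18) is a commonly cited rough figure, used here
# purely for scale; it is not taken from the article.

key_space = 2 ** 256          # number of possible 256-bit keys
grains_of_sand = 7.5e18       # rough estimate of grains of sand on Earth

print(f"256-bit key space:      {key_space:.3e}")
print(f"Grains of sand:         {grains_of_sand:.3e}")
print(f"Keys per grain of sand: {key_space / grains_of_sand:.3e}")
```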

Active Cypher showcased QUBY in early February at Ready -- an internal Microsoft conference held in Seattle. The prototype will also be presented at RSA in San Francisco later this month.

Read more from the original source:
The $600 quantum computer that could spell the end for conventional encryption - BetaNews


Correcting the jitters in quantum devices – MIT News

Labs around the world are racing to develop new computing and sensing devices that operate on the principles of quantum mechanics and could offer dramatic advantages over their classical counterparts. But these technologies still face several challenges, and one of the most significant is how to deal with noise: random fluctuations that can eradicate the data stored in such devices.

A new approach developed by researchers at MIT could provide a significant step forward in quantum error correction. The method involves fine-tuning the system to address the kinds of noise that are the most likely, rather than casting a broad net to try to catch all possible sources of disturbance.

The analysis is described in the journal Physical Review Letters, in a paper by MIT graduate student David Layden, postdoc Mo Chen, and professor of nuclear science and engineering Paola Cappellaro.

"The main issues we now face in developing quantum technologies are that current systems are small and noisy," says Layden. Noise, meaning unwanted disturbance of any kind, is especially vexing because many quantum systems are inherently highly sensitive, a feature underlying some of their potential applications.

And there's another issue, Layden says: quantum systems are affected by any observation. So, while one can detect that a classical system is drifting and apply a correction to nudge it back, things are more complicated in the quantum world. "What's really tricky about quantum systems is that when you look at them, you tend to collapse them," he says.

Classical error correction schemes are based on redundancy. For example, in a communication system subject to noise, instead of sending a single bit (1 or 0), one might send three copies of each (111 or 000). Then, if the three bits don't match, that shows there was an error. The more copies of each bit get sent, the more effective the error correction can be.
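
A minimal sketch of that classical repetition scheme follows; the flip probability and helper names are invented for the example, not taken from the article.

```python
import random

def encode(bit: int, copies: int = 3) -> list[int]:
    """Repetition code: send several copies of the same bit."""
    return [bit] * copies

def transmit(codeword: list[int], flip_prob: float = 0.1) -> list[int]:
    """Simulate a noisy channel that flips each bit independently."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(received: list[int]) -> int:
    """Majority vote: the error is detected and corrected as long as
    fewer than half the copies were flipped."""
    return int(sum(received) > len(received) // 2)

# Example: send a 1 through the noisy channel and recover it.
sent = encode(1)
received = transmit(sent)
print(received, "->", decode(received))
```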

The same essential principle could be applied to adding redundancy in quantum bits, or qubits. But, Layden says, "If I want to have a high degree of protection, I need to devote a large part of my system to doing these sorts of checks. And this is a nonstarter right now because we have fairly small systems; we just don't have the resources to do particularly useful quantum error correction in the usual way." So instead, the researchers found a way to target the error correction very narrowly at the specific kinds of noise that were most prevalent.

The quantum system they're working with consists of carbon nuclei near a particular kind of defect in a diamond crystal called a nitrogen-vacancy center. These defects behave like single, isolated electrons, and their presence enables the control of the nearby carbon nuclei.

But the team found that the overwhelming majority of the noise affecting these nuclei came from one single source: random fluctuations in the nearby defects themselves. This noise source can be accurately modeled, and suppressing its effects could have a major impact, as other sources of noise are relatively insignificant.

"We actually understand quite well the main source of noise in these systems," Layden says. "So we don't have to cast a wide net to catch every hypothetical type of noise."

The team came up with a different error correction strategy, tailored to counter this particular, dominant source of noise. As Layden describes it, "The noise comes from this one central defect, or this one central electron, which has a tendency to hop around at random. It jitters."

That jitter, in turn, is felt by all those nearby nuclei, in a predictable way that can be corrected.

"The upshot of our approach is that we're able to get a fixed level of protection using far fewer resources than would otherwise be needed," he says. "We can use a much smaller system with this targeted approach."

The work so far is theoretical, and the team is actively working on a lab demonstration of this principle in action. If it works as expected, this could make up an important component of future quantum-based technologies of various kinds, the researchers say, including quantum computers that could potentially solve previously unsolvable problems, or quantum communications systems that could be immune to snooping, or highly sensitive sensor systems.

"This is a component that could be used in a number of ways," Layden says. "It's as though we're developing a key part of an engine. We're still a ways from building a full car, but we've made progress on a critical part."

"Quantum error correction is the next challenge for the field," says Alexandre Blais, a professor of physics at the University of Sherbrooke, in Canada, who was not associated with this work. "The complexity of current quantum error correcting codes is, however, daunting as they require a very large number of qubits to robustly encode quantum information."

Blais adds, "We have now come to realize that exploiting our understanding of the devices in which quantum error correction is to be implemented can be very advantageous. This work makes an important contribution in this direction by showing that a common type of error can be corrected for in a much more efficient manner than expected. For quantum computers to become practical we need more ideas like this."

The research was supported by the U.S. Army Research Office and the National Science Foundation.

Read more:
Correcting the jitters in quantum devices - MIT News


Quantum Internet Workshop Begins Mapping the Future of Quantum Communications – Quantaneo, the Quantum Computing Source

Building on the efforts of the Chicago Quantum Exchange at the University of Chicago, Argonne and Fermi National Laboratories, and LiQuIDNet (Long Island Quantum Distribution Network) at Brookhaven National Laboratory and Stony Brook University, the event was organized by Brookhaven. The technical program committee was co-chaired by Kerstin Kleese Van Dam, director of the Computational Science Initiative at Brookhaven, and Inder Monga, director of ESnet at Lawrence Berkeley National Lab.

"The dollars we have put into quantum information science have increased by about fivefold over the last three years," Paul Dabbar, the U.S. Department of Energy's Under Secretary for Science, told the New York Times on February 10, after the Trump administration announced a new budget proposal that includes significant funding for quantum information science, including the quantum Internet.

In parallel with the growing interest and investment in creating viable quantum computing technologies, researchers believe that a quantum Internet could have a profound impact on a number of application areas critical to science, national security, and industry. Application areas include upscaling of quantum computing by helping connect distributed quantum computers, quantum sensing through a network of quantum telescopes, quantum metrology, and secure communications.

Toward this end, the workshop explored the specific research and engineering advances needed to build a quantum Internet in the near term, along with what is needed to move from today's limited local network experiments to a viable, secure quantum Internet.

"This meeting was a great first step in identifying what will be needed to create a quantum Internet," said Monga, noting that ESnet engineers have been helping Brookhaven and Stony Brook researchers build the fiber infrastructure to test some of the initial devices and techniques that are expected to play a key role in enabling long-distance quantum communications. "The group was very engaged and is looking to define a blueprint. They identified a clear research roadmap with many grand challenges and are cautiously optimistic on the timeframe to accomplish that vision."

Berkeley Lab's Thomas Schenkel was the Lab's point of contact for the workshop, a co-organizer, and co-chair of the quantum networking control hardware breakout session. ESnet's Michael Blodgett also attended the workshop.

Excerpt from:
Quantum Internet Workshop Begins Mapping the Future of Quantum Communications - Quantaneo, the Quantum Computing Source


The EU is preparing to invest €2bn in a cloud computing alliance – NS Tech


[Image credit: FREDERICK FLORIN/AFP via Getty Images]

The European Commission has set out plans to invest €2bn in a "trustworthy and energy efficient" cloud computing alliance as part of a drive to unshackle the bloc from US digital infrastructure.

The initiative was included in a package of measures unveiled by senior Commission officials today in a bid to restore the continent's "technological sovereignty".

The funding, which would form part of a €15bn investment in Europe's "Digital, Industry and Space" cluster, will be funnelled into a "High Impact project on European data spaces", according to the Commission.

Details of the plans remain elusive, but it's expected that the funding will go towards the Gaia-X programme, a French- and German-led initiative aimed at bringing together cloud providers from across the continent. The initiative has attracted criticism from the US tech industry, which, primarily thanks to Amazon Web Services, dominates the global infrastructure-as-a-service market.

Speaking in Brussels on Wednesday, the Commission's industry czar Thierry Breton said that a key plank of the plans would focus on creating shared trusts for industrial data. "The battle for industrial data starts now and Europe will be the main battlefield. Europe has the largest industrial base. The winners of today will not be the winners of tomorrow," he told reporters.

The Commission also published proposals on Wednesday to redraft antitrust laws, police online content and create legislation governing artificial intelligence, amid concerns that the EU is failing to keep pace with the US and China on technology and that existing measures to rein in firms such as Facebook, Google, Amazon and Apple have failed to effect long-lasting change.

In a statement issued on Wednesday, new Commission president Ursula von der Leyen (pictured) said: "Europe's digital transition must protect and empower citizens, businesses and society as a whole. It has to deliver for people so that they feel the benefits of technology in their lives. To make this happen, Europe needs to have its own digital capacities, be it quantum computing, 5G, cybersecurity or artificial intelligence (AI)."

The Commission plans to consult on the plans over the coming months, before bringing forward legislation later in the year.

Link:
The EU is preparing to invest €2bn in a cloud computing alliance - NS Tech


Govt creates tech group to chart the tech landscape for India – Economic Times

New Delhi: The government has created an empowered Technology Group consisting of 12 members, which will be headed by the Principal Scientific Adviser to the Government of India, Professor K. VijayRaghavan. The Union Cabinet, chaired by Prime Minister Narendra Modi, approved the decision today.

"This Group is mandated to render timely policy advice on latest technologies; mapping of technology and technology products; commercialisation of dual use technologies developed in national laboratories and government R&D organisations; developing an indigenisation road map for selected key technologies; and selection of appropriate R&D programs leading to technology development," said the statement issued by the Press Information Bureau.

The constitution of the Technology Group is intended to address several issues: silo-centric approaches to technology development; technology standards that are either not developed or not applied, leading to sub-optimal industrial development; dual-use technologies that are not optimally commercialised; R&D programmes that are not aligned with technology-development efforts; and the need to map technologies important for applications in society and industry.

ET had reported earlier that VijayRaghavan also heads a committee on AI which is tasked with developing a roadmap for AI in the country. The committee also has representation from the secretary in the department of science and technology, the CEO of Niti Aayog and the secretary of the ministry of electronics and IT. The government also announced Rs 8,000 crore in funding for quantum computing in Budget 2020.

View original post here:
Govt creates tech group to chart the tech landscape for India - Economic Times


Quantum Computing Market Analysis With Key Players, Applications, Trends and Forecast To 2026 – Instant Tech News

Quantum Computing Market Overview:

Verified Market Research has added a new research report, Quantum Computing Market Development Overview 2020. The report includes an in-depth analysis of the quantum computing market, taking into account market dynamics, segmentation, geographic expansion, the competitive landscape and other key issues. The market analysts who prepared the report have thoroughly examined the quantum computing market and provided reliable and accurate data. They understand the needs of the industry and customers, which makes it easier for them to focus on the problems end users have been looking to solve. The research report provides an assessment of existing and future trends in which players can invest. It also includes an assessment of the players' financial outlooks and the nature of the competition.

Global Quantum Computing Market was valued at USD 89.35 million in 2016 and is projected to reach USD 948.82 million by 2025, growing at a CAGR of 30.02% from 2017 to 2025.
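
Those headline figures are internally consistent; a quick sanity check of the stated CAGR (a simple arithmetic sketch, not part of the report) is shown below.

```python
# Verify the stated CAGR from the 2016 base value to the 2025 projection.
start_value = 89.35      # USD million, 2016
end_value = 948.82       # USD million, 2025 (projected)
years = 9                # 2016 -> 2025

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.2%}")   # ~30.02%, matching the report
```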

Request a Report Brochure @ https://www.verifiedmarketresearch.com/download-sample/?rid=24845&utm_source=ITN&utm_medium=003

Top 10 Companies in the Quantum Computing Market Research Report:

Competitive Landscape

The competitive landscape is must-have information for market players to withstand the competition present in the global Quantum Computing market. This further helps the market participants to develop effective strategies to optimize their market positions. Moreover, the competitive analysis helps them to determine potential advantages as well as barriers within the global Quantum Computing market. This way, they can monitor how their competitors are implementing various strategies including pricing, marketing, and distribution.

The report analyses the current trends, growth opportunities, competitive pricing, restraining factors, and boosters that may have an impact on the overall dynamics of the global Quantum Computing market. The report analytically studies the microeconomic and macroeconomic factors affecting the global Quantum Computing market growth. New and emerging technologies that may influence the global Quantum Computing market growth are also being studied in the report.

Global Quantum Computing Market: Regional Segmentation

For a deeper understanding, the research report includes geographical segmentation of the global Quantum Computing market. It provides an evaluation of the volatility of the political scenarios and amends likely to be made to the regulatory structures. This assessment gives an accurate analysis of the regional-wise growth of the global Quantum Computing market.

Regions Covered by the global market for Quantum Computing :

Middle East and Africa (GCC countries and Egypt)
North America (USA, Mexico and Canada)
South America (Brazil, etc.)
Europe (Turkey, Germany, Russia, Great Britain, Italy, France, etc.)
Asia Pacific (Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia and Australia)

Ask for Discount @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=24845&utm_source=ITN&utm_medium=003

The study objectives are:

Important Questions Answered in this Report:-

Get a Complete Market Report in your Inbox within 24 hours @ https://www.verifiedmarketresearch.com/product/Quantum-Computing-Market/?utm_source=ITN&utm_medium=003

About Us:

Verified Market Research partners with clients to provide insight into strategic and growth analytics; data that help achieve business goals and targets. Our core values include trust, integrity, and authenticity for our clients.

Analysts with high expertise in data gathering and governance utilize industry techniques to collate and examine data at all stages. Our analysts are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research reports.

Contact Us:

Mr. Edwyne Fernandes
Call: +1 (650) 781 4080
Email: [emailprotected]


Go here to read the rest:
Quantum Computing Market Analysis With Key Players, Applications, Trends and Forecast To 2026 - Instant Tech News


EU Proposes Rules for Artificial Intelligence to Limit Risks – The New York Times

LONDON: The European Union unveiled proposals Wednesday to regulate artificial intelligence that call for strict rules and safeguards on risky applications of the rapidly developing technology.

The report is part of the bloc's wider digital strategy aimed at maintaining its position as the global pacesetter on technological standards. Big tech companies seeking to tap Europe's vast and lucrative market, including those from the U.S. and China, would have to play by any new rules that come into force.

The EU's executive Commission said it wants to develop a framework for "trustworthy artificial intelligence." European Commission President Ursula von der Leyen had ordered her top deputies to come up with a coordinated European approach to artificial intelligence and data strategy 100 days after she took office in December.

"We will be particularly careful where essential human rights and interests are at stake," von der Leyen told reporters in Brussels. "Artificial intelligence must serve people, and therefore artificial intelligence must always comply with people's rights."

EU leaders, keen on establishing "technological sovereignty," also released a strategy to unlock data from the continent's businesses and the public sector so it can be harnessed for further innovation in artificial intelligence. Officials in Europe, which doesn't have any homegrown tech giants, hope to catch up with the U.S. and China by using the bloc's vast and growing trove of industrial data for what they anticipate is a coming wave of digital transformation.

They also warned that even more regulation for foreign tech companies is in store with the upcoming Digital Services Act, a sweeping overhaul of how the bloc treats digital companies, including potentially holding them liable for illegal content posted on their platforms. A steady stream of Silicon Valley tech bosses, including Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Microsoft President Brad Smith, have visited Brussels in recent weeks as part of apparent lobbying efforts.

"It is not us that need to adapt to today's platforms. It is the platforms that need to adapt to Europe," said Thierry Breton, commissioner for the internal market. "That is the message that we delivered to CEOs of these platforms when they come to see us."

"If the tech companies aren't able to build systems for our people, then we will regulate, and we are ready to do this in the Digital Services Act at the end of the year," he said.

The EU's report said clear rules are needed to address high-risk AI systems, such as those in recruitment, healthcare, law enforcement or transport, which should be transparent, traceable and guarantee human oversight. Other artificial intelligence systems could come with labels certifying that they are in line with EU standards.

Artificial intelligence uses computers to process large sets of data and make decisions without human input. It is used, for example, to trade stocks in financial markets, or, in some countries, to scan faces in crowds to find criminal suspects.

While it can be used to improve healthcare, make farming more efficient or combat climate change, it also brings risks. It can be unclear what data artificial intelligence systems work off. Facial recognition systems can be biased against certain social groups, for example. There are also concerns about privacy and the use of the technology for criminal purposes, the report said.

"Human-centered guidelines for artificial intelligence are essential because none of the positive things will be achieved if we distrust the technology," said Margrethe Vestager, the executive vice president overseeing the EU's digital strategy.

Under the proposals, which are open for public consultation until May 19, EU authorities want to be able to test and certify the data used by the algorithms that power artificial intelligence in the same way they check cosmetics, cars and toys.

It's important to use unbiased data to train high-risk artificial intelligence systems so they can avoid discrimination, the commission said.

Specifically, AI systems could be required to use data "reflecting gender, ethnicity and other possible grounds of prohibited discrimination."

Other ideas include preserving data to help trace any problems and having AI systems clearly spell out their capabilities and limitations. Users should be told when they're interacting with a machine and not a human, while humans should be in charge of the system and have the final say on decisions such as rejecting an application for welfare benefits, the report said.

EU leaders said they also wanted to open a debate on when to allow facial recognition in remote identification systems, which are used to scan crowds and check people's faces against those on a database. It's considered "the most intrusive form" of the technology and is prohibited in the EU except in special cases.

___

For all of the AP's technology coverage: https://apnews.com/apf-technology.

___

See the rest here:
EU Proposes Rules for Artificial Intelligence to Limit Risks - The New York Times


EASA Expects Certification of First Artificial Intelligence for Aircraft Systems by 2025 – Aviation Today

The European Aviation Safety Agency expects to certify the first integration of artificial intelligence technology in aircraft systems by 2025.

The European Aviation Safety Agency (EASA) has published its Artificial Intelligence Roadmap in anticipation of the first certification for the use of AI in aircraft systems coming in 2025.

EASA published the 33-page roadmap after establishing an internal AI task force in October 2018 to identify staff competency, standards, protocols and methods to be developed ahead of moving forward with actual certification of new technologies. A representative for the agency confirmed in an emailed statement to Avionics International that they have already received project submissions from industry designed to provide certification for AI pilot assistance technology.

"The Agency has indeed received its first formal applications for the certification of AI-based aircraft systems in 2019. It is not possible to be more specific on these projects at this stage due to confidentiality. The date in our roadmap, 2025, corresponds to the project certification target date anticipated by the applicants," the representative for EASA said.

In the roadmap document, EASA notes that, moving forward, the agency will define AI as any technology that appears to emulate the performance of a human. The roadmap further divides AI applications into model-driven AI and data-driven AI, while linking these two forms of AI to breakthroughs in machine learning, deep learning and the use of neural networks to enable applications such as computer vision and natural language processing.

"In order to be ready by 2025 for the first certification of AI-based systems, the first guidance should be available in 2021, so that the applicant can be properly guided during the development phase. The guidance that EASA will develop will apply to the use of AI in all domains, including aircraft certification as well as drone operations," the representative for EASA said.

Eight specific domains of aviation are identified as potentially being impacted by the introduction of AI to aviation systems, including the following:

The roadmap foresees machine learning being applied to flight control law optimization, sensor calibration, fuel tank quantity evaluation and icing detection, aircraft systems where the need for human analysis of possible combinations of parameter values could be replaced by machine learning.

The roadmap for EASA's certification of AI in aircraft systems. Photo: EASA

EASA also points to several research and development projects and prototypes featuring the use of artificial intelligence for air traffic management that are already available. These include the Singapore ATM Research Institute's application that generates resolution proposals to assist controllers in resolving airspace conflicts. There is also the Single European Sky ATM Research Joint Undertaking's BigData4ATM project, tasked with analyzing passenger-centric geo-located data to identify patterns in airline passenger behavior, and the Machine Learning of Speech Recognition Models for Controller Assistance (MALORCA) project, which has developed a speech recognition tool for use by controllers.

Several aviation industry research and development initiatives have been looking at the integration of AI and ML into aircraft systems and air traffic management infrastructure in recent years as well. During a November visit to its facility in Toulouse, Thales showed some of the technologies it is researching and developing, including a virtual assistant that will provide both voice and flight-intention recognition to pilots as part of its next-generation FlytX avionics suite.

Zurich, Switzerland-based startup Daedalean is also developing what it describes as the aviation industry's first autopilot system to feature an advanced form of artificial intelligence (AI) known as deep convolutional feed-forward neural networks. The system is to feature software that can replicate a human pilot's level of decision-making and situational awareness.

NATS, the U.K.'s air navigation service provider (ANSP), is also pioneering an artificial intelligence platform for aviation. At Heathrow Airport, the company has installed 18 ultra-HD 4K cameras on the air traffic control tower and others along the airport's northern runway that feed images to a platform developed by Searidge Technologies called AIMEE. The goal is for AIMEE's advanced neural network framework to become capable of identifying when a runway is cleared for takeoffs and arrivals in low-visibility conditions.

As the industry moves forward with more AI developments, EASA plans to continually update its roadmap with new insights. The roadmap proposes a possible classification of AI and ML applications into three levels, based on the degree of human oversight of the machine. Level 1 covers the use of artificial intelligence for routine tasks; Level 2 covers applications where a human performs a function and the machine monitors it; and Level 3 is full autonomy, where machines perform functions with no human intervention.
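
One way to picture that proposed classification is as a simple mapping from oversight level to who performs and who monitors the function. The sketch below paraphrases the levels as described above; the class and field names are illustrative, not EASA's own nomenclature.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OversightLevel:
    """Illustrative representation of the roadmap's human-oversight levels."""
    level: int
    description: str
    performs_function: str
    monitors_function: str

# Paraphrased from the roadmap description above; wording is illustrative.
EASA_AI_LEVELS = [
    OversightLevel(1, "AI assists with routine tasks", "human", "human"),
    OversightLevel(2, "Human performs, machine monitors", "human", "machine"),
    OversightLevel(3, "Full autonomy, no human intervention", "machine", "none"),
]

for lvl in EASA_AI_LEVELS:
    print(f"Level {lvl.level}: {lvl.description}")
```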

"At this stage version 1.0 identifies key elements that the Agency considers should be the foundation of its human-centric approach: integration of the ethical dimension, and the new concepts of trustworthiness, learning assurance and explainability of AI," the representative for EASA said. "This should be the main takeaway for the agency's industry stakeholders. In essence, the roadmap aims at establishing the baseline for the Agency's vision on the safe development of AI."

More here:
EASA Expects Certification of First Artificial Intelligence for Aircraft Systems by 2025 - Aviation Today


Artificial Intelligence and Machine Learning in the Operating Room – 24/7 Wall St.

Most applications of artificial intelligence (AI) and machine learning technology provide only data to physicians, leaving the doctors to form a judgment on how to proceed. Because AI doesn't actually perform any procedure or prescribe a course of medication, the software that diagnoses health problems does not have to pass a randomized clinical trial as do devices such as insulin pumps or new medications.

A new study published Monday at JAMA Network discusses a trial including 68 patients undergoing elective noncardiac surgery under general anesthesia. The object of the trial was to determine if a predictive early warning system for possible hypotension (low blood pressure) during the surgery might reduce the time-weighted average of hypotension episodes during the surgery.

In other words, not only would the device and its software keep track of the patient's mean arterial blood pressure, but it would also sound an alarm if there was an 85% or greater risk of the patient's blood pressure falling below 65 mm of mercury (Hg) within the next 15 minutes. The device also encouraged the anesthesiologist to take preemptive action.
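
In outline, the warning logic described here amounts to comparing a model-estimated probability against a fixed threshold. The sketch below is a simplified illustration of that decision rule; the function name and the probability model are placeholders, not Edwards Lifesciences' actual algorithm.

```python
# Simplified illustration of the early-warning decision rule described above.
# The predicted risk would come from a proprietary model that estimates the
# probability of mean arterial pressure dropping below 65 mmHg within the
# next 15 minutes; here it is just a number passed in for illustration.

RISK_THRESHOLD = 0.85   # alarm if predicted risk is 85% or greater

def should_alarm(predicted_risk: float) -> bool:
    """Return True when the predicted 15-minute hypotension risk
    meets or exceeds the 85% threshold described in the study."""
    return predicted_risk >= RISK_THRESHOLD

# Example: a model output of 0.90 would trigger the alarm and prompt
# the anesthesiologist to consider preemptive action.
print(should_alarm(0.90))  # True
print(should_alarm(0.40))  # False
```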

Patients in the control group were connected to the same AI device and software, but only routine pulse and blood pressure data were displayed. That means that the anesthesiologist had no early warning about a hypotension event and could take no action to prevent the event.

Among patients fully connected to the device and software, the median time-weighted average of hypotension was 0.1 mm Hg, compared to an average of 0.44 mm Hg in the control group. In the control group, the median time of hypotension per patient was 32.7 minutes, while it was just 8.0 minutes among the other patients. Most important, perhaps, two patients in the control group died from serious adverse events, while no patients connected to the AI device and software died.
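
The outcome metric reported here, the time-weighted average of hypotension, is essentially the area of the blood-pressure curve below the 65 mmHg threshold divided by the total monitoring time. A minimal sketch of that calculation follows; the sample readings are invented purely for illustration.

```python
# Minimal sketch: time-weighted average (TWA) of hypotension, i.e. the area
# below the 65 mmHg threshold divided by total monitoring time.
# The sample readings below are invented purely for illustration.

THRESHOLD_MMHG = 65

def time_weighted_avg_hypotension(map_readings, interval_min=1.0):
    """map_readings: mean arterial pressure samples taken every interval_min
    minutes. Depth below the threshold is accumulated and averaged over the
    whole monitoring period, giving a result in mmHg."""
    total_time = len(map_readings) * interval_min
    area_below = sum(max(0.0, THRESHOLD_MMHG - m) * interval_min
                     for m in map_readings)
    return area_below / total_time

readings = [72, 70, 66, 63, 61, 64, 68, 71]   # invented MAP samples (mmHg)
print(f"TWA of hypotension: {time_weighted_avg_hypotension(readings):.2f} mmHg")
```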

The algorithm used by the device was developed by a separate group of researchers, who trained the software on thousands of waveform features to identify a possible hypotension event 15 minutes before it occurs during surgery. The devices used were a FloTrac IQ sensor with the early warning software installed and a HemoSphere monitor. Both are made by Edwards Lifesciences, and five of the algorithm's eight developers were affiliated with Edwards. The study itself was conducted in the Netherlands at Amsterdam University Medical Centers.

In an editorial at JAMA Network, associate editor Derek Angus wrote:

The final model predicts the likelihood of future hypotension via measurement of multiple variables characterizing dynamic interactions between left ventricular contractility, preload, and afterload. Although clinicians can look at arterial pulse pressure waveforms and, in combination with other patient features, make educated guesses about the possibility of upcoming episodes of hypotension, the likelihood is high that an AI algorithm could make more accurate predictions.

Among the past decades biggest health news stories were the development of immunotherapies for cancer and a treatment for cystic fibrosis. AI is off to a good start in the new decade.

By Paul Ausick

See original here:
Artificial Intelligence and Machine Learning in the Operating Room - 24/7 Wall St.
