
When Might The Use Of AI, Machine Learning, Or Robotic Process-Enabled Insurance Models Result In An Adverse Action Under The FCRA?


As insurers consider augmenting the quoting process with algorithmic predictive models, including those aided by artificial intelligence, machine learning, and/or robotic process automation ("Models"), for which core inputs are, or could be considered, a consumer report, one question that may arise is whether the Fair Credit Reporting Act, 15 U.S.C. §§ 1681-1681x (the "FCRA"), dictates the distribution of an adverse action notice when a Model is implemented not for the purpose of making coverage and rating decisions (determining whether to accept or decline a particular risk or the premium charged), but instead for the purpose of determining whether other actions can be taken with respect to consumers, such as routing applicants to certain payment methods or other designations unrelated to coverage and rating decisions ("administrative decisions").

Under the FCRA, an "adverse action" can mean different things in the context of different industries or uses. In the context of insurance, an adverse action is defined to mean "a denial or cancellation of, an increase in any charge for, or a reduction or other adverse or unfavorable change in the terms of coverage or amount of, any insurance, existing or applied for, in connection with the underwriting of insurance."1 Under a different section of the FCRA, if any person "takes any adverse action with respect to any consumer that is based in whole or in part on any information contained in a consumer report," that person must, among other things, provide an adverse action notice to the consumer.2

A "consumer report" is defined to mean "any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer's credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer's eligibility for . . . (A) credit or insurance to be used primarily for personal, family, or household purposes; or . . . (C) any other purpose authorized [as a permissible purpose of consumer reports]."3 The permissible purposes of consumer reports include, in relevant part, the furnishing of a consumer report by a consumer reporting agency to a person that it "has reason to believe . . . intends to use the information in connection with the underwriting of insurance involving the consumer."4

First, insurers should consider whether an administrative decision could be considered [1] "an increase in any charge for . . . or other adverse or unfavorable change in the terms of coverage . . . applied for," [2] "in connection with the underwriting of insurance."

An administrative decision could be considered an increase in the charge for coverage, because applicants subject to an administrative decision could be giving more value for the same level of coverage in some way. Such additional value could be minimal to the point of appearing nominal, but could theoretically be construed as an increase.

An administrative decision could also be considered an adverse or unfavorable change in the terms of coverage, because the burden of having to pay premium in a different way, or to obtain or interact with coverage in a different way, could be construed as adverse or unfavorable from the perspective of the applicant. In many circumstances, particularly those affecting applicants with fewer resources, paying more at one time or in a different manner could mean the applicant has less funds on hand to contribute to other needs. An administrative decision could therefore be considered adverse or unfavorable.

Depending on the nature of the administrative decision, it could be construed as being undertaken in connection with the underwriting of insurance. The only permissible purpose for which a consumer report may be provided to an insurer is to use the information in connection with the underwriting of insurance. Further, it seems counterintuitive that the legislative intent of the FCRA would be to permit the provision of consumer reports without the attachment of attendant restrictions and obligations, like the FCRA's requirements in respect of adverse actions.

As stated above, according to the FCRA, if any person "takes any adverse action with respect to any consumer that is based in whole or in part on any information contained in a consumer report," the person must, among other things, provide an adverse action notice to the consumer.5 Insurers must therefore consider whether an administrative decision could be construed as being (1) based in whole or in part on (2) any information contained in a consumer report.

The phrase "based in whole or in part on" has been interpreted to apply only when there is a but-for causal relationship. An adverse action is not considered to be based "in whole or in part on" the consumer report unless the report was a necessary condition of the adverse action.6

Under certain case law, the baseline or benchmark for considering whether there has been a disadvantageous increase in rate (and, therefore, an adverse action requiring notice to the applicant) has been interpreted to be "what the applicant would have had if the company had not taken his[/her] credit score into account."7 It may be that the only purpose of a Model's use of a consumer report is to determine whether an administrative decision will be engaged. In that case, the baseline could be considered to be the absence of the result of the administrative decision. In other words, without use of the Model that integrates the consumer report, there might not be any possibility of the administrative decision impacting the applicant.
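To make the but-for comparison concrete, here is a minimal sketch of how that test could be reasoned about programmatically. Every name, score, and cutoff below is hypothetical and purely illustrative; the point is only that the Model's outcome with the consumer report is compared against the Safeco-style baseline of what would have happened without it, not that this reflects any actual Model or a definitive reading of the FCRA.

```python
# Minimal sketch of the "but-for" comparison described above.  All fields,
# scores, and thresholds are hypothetical illustrations.

def administrative_decision(applicant, consumer_report=None):
    """Hypothetical Model: route the applicant to a restricted payment plan
    (an adverse outcome) when a blended score falls below a cutoff."""
    penalty = consumer_report["risk_penalty"] if consumer_report else 0
    return (applicant["internal_score"] - penalty) < 600

def notice_indicated(applicant, consumer_report):
    """But-for test: notice is indicated only if the adverse outcome occurs
    with the report and would not have occurred without it (the baseline)."""
    with_report = administrative_decision(applicant, consumer_report)
    without_report = administrative_decision(applicant, None)
    return with_report and not without_report

if __name__ == "__main__":
    applicant = {"internal_score": 620}
    report = {"risk_penalty": 30}               # report data drags the score to 590
    print(notice_indicated(applicant, report))  # True: report was a but-for cause
```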

An insurer must analyze whether particularized information used in a Model has been obtained from a consumer reporting agency based on the insurer's permissible purpose. An insurer should also analyze whether the information is: (i) a written communication of information derived from a consumer reporting agency; (ii) bearing on a consumer's credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living; (iii) which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer's eligibility for insurance to be used primarily for personal, family, or household purposes.

Finally, an insurer should consider whether the above analysis would differ, or whether additional considerations arise, under state insurance scoring laws promulgated based on the National Council of Insurance Legislators' Model Act Regarding Use of Credit Information in Personal Insurance (the "NCOIL Model"). The NCOIL Model defines what constitutes an "insurance score" (which is similar to the FCRA's definition of consumer report), what constitutes an "adverse action" in respect of such insurance scores (which is similar to the FCRA's definition of adverse action), and when an adverse action notice must be sent in respect of such adverse actions (the trigger language of which is similar to the FCRA's trigger language). This analysis will depend on the state-specific implementation of the NCOIL Model (where applicable), or on other related state laws and regulations addressing this subject matter (for those states that have not adopted some form of the NCOIL Model).

Of course, in analyzing these issues, insurers should consult extensively with insurance and federal regulatory counsel as to the specific nature of the administrative decisions, how Models are created and used, and what the impact of such administrative decisions and Models is on applicants and consumers.

1 15 U.S.C.A. § 1681a(k)(1)(B)(i).

2 15 U.S.C.A. § 1681m(a).

3 15 U.S.C.A. § 1681a(d)(1)(A) and (C).

4 15 U.S.C.A. § 1681b(a)(3)(C).

6 Safeco Ins. Co. of Am. v. Burr, 551 U.S. 47, 63, 127 S. Ct. 2201, 2212, 167 L. Ed. 2d 1045 (2007). This case is also sometimes referred to as Geico v. Edo.

7 Id. at 2213.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.


What is The Role of Machine Learning in Bio-Technology? – Analytics Insight

ML is transforming biological research, resulting in new discoveries in healthcare and biotechnology.

Machine Learning and Artificial Intelligence have taken the world by storm, changing the way people live and work. Advances in these fields have elicited both praise and criticism. AI and ML, as they're colloquially known, offer several applications and advantages across a wide range of sectors. Most importantly, they are transforming biological research, resulting in new discoveries in healthcare and biotechnology.

Here are some use cases of ML in biotech:

Next-generation sequencing has greatly accelerated the study of genomics by making it possible to sequence a gene in a short period of time. As a result, machine learning approaches are being used to discover gene coding regions in a genome. Such machine learning-based gene prediction techniques can be more sensitive than traditional homology-based sequence analyses.
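As a rough illustration of what such a learned gene-prediction step can look like, the sketch below trains a classifier on k-mer frequency features. Everything here is synthetic and simplified (a crude GC-content bias stands in for real coding-region signal, and scikit-learn's logistic regression stands in for a production model); it is meant only to show the shape of the approach, not any published gene finder.

```python
# Toy sketch of ML-based gene prediction: classify sequence windows as
# "coding" or "non-coding" from k-mer frequencies.  Data and model are
# illustrative stand-ins only.
import random
from collections import Counter

from sklearn.linear_model import LogisticRegression

BASES = "ACGT"
ALL_3MERS = [a + b + c for a in BASES for b in BASES for c in BASES]

def kmer_features(seq, k=3):
    """Normalised k-mer frequency vector for one sequence window."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts.values()), 1)
    return [counts[m] / total for m in ALL_3MERS]

def synthetic_window(coding, length=300):
    """Toy data: 'coding' windows are GC-rich, 'non-coding' windows AT-rich."""
    weights = [1, 3, 3, 1] if coding else [3, 1, 1, 3]
    return "".join(random.choices(BASES, weights=weights, k=length))

random.seed(0)
labels = [1] * 100 + [0] * 100
features = [kmer_features(synthetic_window(bool(y))) for y in labels]

model = LogisticRegression(max_iter=1000).fit(features, labels)
print(model.predict([kmer_features(synthetic_window(coding=True))]))  # -> [1]
```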

Protein-protein interaction (PPI) was mentioned before in the context of proteomics. However, the application of ML in structure prediction has increased accuracy from 70% to more than 80%. The application of ML in text mining is also extremely promising, with training sets used to find new or unique pharmacological targets by searching large numbers of journal articles and secondary databases.

Deep learning is an extension of neural networks and is a relatively new topic in ML. The term "deep" in deep learning refers to the number of layers through which data is transformed. As a result, deep learning is analogous to a multi-layer neural structure. These multi-layer nodes attempt to simulate how the human brain works in order to solve problems. ML already uses neural networks, but neural network-based ML algorithms require refined or meaningful data from raw data sets to undertake analysis, and the rising amount of data generated by genome sequencing makes it harder to extract significant information. Multiple layers of a neural network filter information and interact with each other, allowing the output to be progressively refined.
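The layering idea can be shown in a few lines of code. The sketch below is a bare-bones illustration (arbitrary layer sizes, random weights, no training loop, no real genomics data) of how input passes through stacked layers, each re-representing the output of the previous one; it is not a model of any particular pipeline mentioned here.

```python
# Bare-bones numpy illustration of the "multiple layers" idea: each layer
# re-represents the output of the previous one before passing it on.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0)

# Three stacked layers: 16 raw features -> 32 -> 8 -> 1 output value.
weights = [rng.standard_normal(shape) * 0.1
           for shape in [(16, 32), (32, 8), (8, 1)]]

def forward(x):
    for i, w in enumerate(weights):
        x = x @ w
        if i < len(weights) - 1:   # hidden layers apply a nonlinearity
            x = relu(x)
    return x

batch = rng.standard_normal((4, 16))   # 4 samples of 16 raw features each
print(forward(batch).shape)            # (4, 1): one refined output per sample
```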

Anxiety, stress, substance use disorders, eating disorders, and other mental health conditions are another example. The bad news is that most people go undiagnosed, since they are not sure whether they have a problem. That is a stark but harsh reality. Until now, doctors and scientists have not been especially effective at predicting mental illness. Technological innovation, however, has enabled healthcare professionals to create smart solutions that not only detect mental illness but also recommend appropriate diagnostic and treatment techniques.

Machine learning and artificial intelligence (AI) are widely employed by hospitals and healthcare providers to improve patient satisfaction, administer individualized treatments, make accurate forecasts, and improve quality of life. They are also being utilized to improve the efficiency of clinical trials and to accelerate the process of drug development and distribution.

The growth of digitization has rendered the twenty-first century data-centric, affecting every business and sector. The healthcare, biology, and biotech industries are not immune to these effects. Enterprises are seeking a solution that can integrate their operations and provide the capacity to record, exchange, and transmit data in a systematic, faster, and smoother manner. Bioinformatics, biomedicine, network biology, and other biological subfields have long struggled with biological data processing challenges.



Mathematicians to Build New Connections With Machine Learning: Know-How – Analytics Insight

Machine learning makes it possible to generate more data than a mathematician can study in a lifetime.

For the first time, mathematicians have partnered with artificial intelligence to suggest and prove new mathematical theorems. While computers have long been used to generate data for mathematicians, the task of identifying interesting patterns has relied mainly on the intuition of the mathematicians themselves. However, it's now possible to generate more data than any mathematician can reasonably expect to study in a lifetime, which is where machine learning comes in.

Two separate groups of mathematicians worked alongside DeepMind, a branch of Alphabet, Google's parent company, dedicated to the development of advanced artificial intelligence systems. András Juhász and Marc Lackenby of the University of Oxford taught DeepMind's machine learning models to look for patterns in geometric objects called knots. The models detected connections that Juhász and Lackenby elaborated to bridge two areas of knot theory that mathematicians had long speculated should be related. In separate work, Geordie Williamson of the University of Sydney used machine learning to refine an old conjecture that connects graphs and polynomials.


"The most amazing thing about this work, and it really is a big breakthrough, is the fact that all the pieces came together and that these people worked as a team," said Radmila Sazdanović of North Carolina State University.

Some observers, however, view the collaboration as less of a sea change in the way mathematical research is conducted. While the computers pointed the mathematicians toward a range of possible relationships, the mathematicians themselves needed to identify the ones worth exploring.




New study looks at machine learning and palliative care – RNZ

Those working in the health sector will tell you of the patient who's sick - but doesn't want to be a bother, so doesn't ask for help, even though they really need it.

Or the family that is desperately worried about the health of their loved one, who is pretending that everything's OK, when it's not.

Kathryn speaks with Dr Margaret Sandham, who's spear-headed a study into how machine learning could help in the palliative care sector, picking up crucial symptoms that can mark a change in the health of a patient, so appropriate care can be given.

The research, conducted by AUT, analysed the symptoms of 800 patients at an Auckland hospice, using a combination of statistical tools, machine learning, and network visualisation.

Margaret explains how the data could have application for mobile apps and wearable technology - a much less intrusive way of keeping tabs on the health of a patient, than constant phone calls or visits from health workers.



Mytide Therapeutics Raises $7 Million Series A Round to Transform Peptide Manufacturing with Machine Learning – Business Wire

BOSTON--(BUSINESS WIRE)--Mytide Therapeutics, a company transforming peptide manufacturing with predictive analytics and machine learning, has raised $7 million in Series A financing. The round was led by Alloy Therapeutics, a biotechnology ecosystem company, and was joined by Uncommon Denominator and the Mytide founding team. As part of the financing, Alloy Therapeutics CEO Errik Anderson will join Mytide's Board of Directors. This financing will allow Mytide to scale its AI-enabled Gen2 platform to support cost-effective, scalable, and decentralized manufacturing for a wide variety of peptide and peptide conjugate applications for therapeutic discovery and personal peptide vaccines (PPV).

Mytide's Gen2 platform produces both natural and non-natural peptides 30 times faster than traditional manufacturing practices by eliminating bottlenecks throughout the entire process of synthesis, analysis, purification, and lyophilization. Through rigorous in-process data collection, Mytide's continuously learning AI-guided engine enables higher purity, production reliability, and speed by controlling a proprietary set of chemical processes, analytical tools, and robotics. These tools enable access to a novel peptide space, including difficult-to-manufacture non-canonical amino acids, constrained peptides, and short proteins that are inaccessible or uneconomical to produce and screen using traditional peptide manufacturing processes.

Mytide's robust data capture and processing techniques represent one of the largest and fastest-growing peptide manufacturing data repositories in the world. Through unparalleled manufacturing speed and precision, Mytide's technology has addressed the high-throughput screening and library generation needs of computational biology modeling to support in vivo and in vitro studies, as well as clinical trial studies.

"At Mytide, we aim to overcome the time-consuming and labor-intensive organic chemistry processes limiting peptide and other biopolymer production. Our goal is to speed drug developers' ability to translate therapeutic innovations into clinical impact," said Mytide co-founder Dale Thomas. "Our platform takes a holistic view of the entire manufacturing process and couples it with a fully closed-loop computational biology platform, unlocking therapeutic development at unprecedented speeds and precision. The investment from Alloy Therapeutics brings our quick-turn manufacturing technology into a broad drug discovery ecosystem to further accelerate the development of new peptide therapeutics."

Peptides are a high-growth drug discovery modality of interest within the pharma industry, with multiple PPVs in Phase III clinical trials. To validate its technology, Mytide has actively partnered its continuous manufacturing platform with pharmaceutical companies requiring scalable and time-sensitive manufacturing for both research and clinical programs. Mytide's Gen2 platform is designed to be easily integrated into cGMP manufacturing environments to allow for scalable and decentralized clinical trial manufacturing of a partner's lead peptide-based therapeutic candidates. Mytide continues to advance upon the progress in molecular access and analysis being made by the likes of Integrated DNA Technologies (IDT), Illumina, and Thermo Fisher Scientific.

"Mytide represents an exciting opportunity to bring down barriers in drug development further, by providing Alloy's ecosystem of industry partners with access to high-quality, AI-enabled peptide manufacturing," said Errik Anderson, Alloy Therapeutics CEO and founder. "Together, we are excited to empower developers of peptide and combination therapeutics and enable rapid innovation in this promising modality, for the ultimate benefit of patients."

About Mytide Therapeutics:

Mytide Therapeutics is a Boston, MA-based peptide and biopolymer manufacturing and computational biology company focused on eliminating the time-consuming and labor-intensive chemical and screening processes preventing innovative therapeutics from reaching the clinic. Mytide's quick-turn manufacturing technology, coupled with AI-enabled predictive analytics, is providing access to a novel peptide space of difficult-to-make natural and non-natural peptides and peptide conjugates for discovery and therapeutic manufacturing. The company is focused on the translation of life-saving therapeutics for serious conditions ranging from metabolic conditions to oncology to inflammatory disorders to infectious diseases.

Learn more about Mytide Therapeutics by visiting Mytide.io or following Mytide on LinkedIn.

About Alloy Therapeutics

Alloy Therapeutics is a biotechnology ecosystem company empowering the global scientific community to make better medicines together. Through a community of partners across academia, biotech, and the largest biopharma, Alloy democratizes access to tools, technologies, services, and company creation capabilities that are foundational for discovering and developing therapeutic biologics. Alloy's foundational technology, the ATX-Gx, is a human therapeutic antibody discovery platform consisting of a growing suite of proprietary transgenic mice strains. Alloy is a leader in bispecific antibody discovery and engineering services, utilizing its proprietary ATX-CLC common light chain platform integrating novel transgenic mice and phage display. DeepImmune integrates Alloy's full complement of proprietary in vivo, in vitro, and in silico discovery and optimization technologies into one comprehensive offering for fully human antibody, bispecific, and TCR discovery. DeepImmune is also available for royalty-free access as part of Alloy's novel Innovation Subscription model. Alloy is headquartered in Boston, MA with labs in Cambridge, UK; Basel, CH; San Francisco, CA; and Athens, GA. As a reflection of Alloy's relentless commitment to the scientific community, Alloy reinvests 100% of its revenue in innovation and access to innovation.

Join the Alloy Therapeutics community by visiting alloytx.com, following Alloy on LinkedIn, or scheduling a 15-minute chat with Alloy's Founder and CEO at alloytx.com/ceo.


Pasqal and ARAMCO developing quantum computing applications for the energy industry – WorldOil

3/9/2022

Pasqal, a developer of neutral atom-based quantum technology, and ARAMCO announced the signing of an MoU to collaborate on quantum computing capabilities and applications in the energy sector. Objectives include accelerating the design and development of quantum-based machine learning models, as well as identifying and advancing other use cases for the technology across the Saudi Aramco value chain. To that end, both companies plan to explore ways of collaborating and cultivating the quantum information sciences ecosystem in the Kingdom of Saudi Arabia.

Quantum computing can be used to address a wide range of upstream, midstream and downstream challenges in the oil and gas industry including network optimization and management, reaction network generation and refinery linear programming. The collaboration will explore potential applications for quantum computing and artificial intelligence in these areas as well.

As part of the project, Pasqal will provide both its quantum expertise and platform to develop new use cases. The companies will also explore the applicability and benefit of augmenting Aramco's training programs with Pasqal's quantum technologies as part of these joint efforts.

For its part, ARAMCO is focused on pioneering the use of quantum computing in the energy sector, positioning itself as an early beneficiary of quantum advantage over classical computers. Pasqal aims to establish operations in the Middle East and grow its business both in Saudi Arabia and across the region.

"This is a very promising initiative for Pasqal and a perfect opportunity for us to show not only the energy sector, but the entire world, what our technology can do," said Georges-Olivier Reymond, CEO of Pasqal. "It further confirms that our neutral atom technology is one of the most promising in the world."


Cooling quantum computers: a challenge that will shape the industry – Tech Monitor

Last month quantum computing start-up Quantum Motion opened what it says is the UK's largest independent quantum laboratory. The Islington lab, which represents a multi-million-pound investment for the University College London (UCL) spin-out, is home to specialist equipment for its scientists and engineers to use. This includes dilution refrigerators, which allow quantum technology to be developed at a temperature close to absolute zero, or about -273 degrees Celsius, some 100 times colder than outer space.

(Image caption: The dilution refrigerator at Quantum Motion's London lab. Cooling is key to effective quantum processing. Photo courtesy of Quantum Motion.)

"Islington is officially now the coolest part of London," quipped James Palles-Dimmock, the company's chief operating officer, at the time. "We're working with technology that is colder than deep space and pushing the boundaries of our knowledge to turn quantum theory into reality."

Keeping quantum chips cold is key to ensuring they work accurately and fulfil their promise to outperform classical computers for certain tasks. But as the technology matures and develops, doing this in a sustainable and scalable fashion may prove a challenge. With several types of qubit technology (the building blocks on which quantum computers operate) in development, the one which solves the cooling puzzle most effectively may gain a significant advantage in the race for commercialisation.

Qubits are the way information is represented in quantum form within a quantum computer. So where a classical computer, which runs on bits, represents data as a one or a zero, quantum data can simultaneously be a one and a zero. In theory, this means a quantum computer can process information much faster and more efficiently than a classical machine.
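As a rough numerical picture of that statement, the short sketch below represents a single qubit as a two-component state vector and reads off the measurement probabilities. It is a toy illustration of superposition only, with arbitrarily chosen amplitudes, and is not tied to any particular hardware discussed here.

```python
# A qubit state as a two-component unit vector: amplitude on |0> and on |1>
# at the same time.  Squared magnitudes give the measurement probabilities.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

psi = (ket0 + ket1) / np.sqrt(2)      # equal superposition of |0> and |1>

print(np.abs(psi) ** 2)               # [0.5 0.5]: 50/50 chance of reading 0 or 1
```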

The technology remains at an early stage, and in November IBM announced it had developed what it claims is the most powerful processor yet, the 127-qubit Eagle. According to Big Blue's quantum roadmap, it expects to reach quantum advantage (the point where quantum machines outperform traditional computers on certain tasks) within two years.

To achieve accurate processing, quantum computers need to operate at extremely low temperatures. This is because the heat generated by the surrounding equipment can interfere with the qubits, says Harrison Ball, quantum engineer at UK quantum computer developer Universal Quantum.

"When we talk about the temperature of a material, what we're really referring to is the motion of the constituent particles, the atoms," says Ball. "The colder the temperature, the less motion of those atoms, which means they are contributing less variation to their environment."


"The obsession of quantum engineers and physicists over the last few years has been attempting to make the most pristine qubits possible, and the way in which you do that is try and produce an environment for the qubit where it interacts with absolutely nothing. That's why, broadly speaking, colder is better."

Universal Quantum is developing its quantum machine using trapped ions, or individually charged atoms, as its qubits. This is one of a number of methods for generating and controlling qubits which are in development, and John Morton, professor of nanoelectronics at UCL and co-founder of Quantum Motion, says each of them has its own reasons for needing to operate at a low temperature. Superconducting quantum computers have dominated early deployments.

"The superconducting qubit approach that Google and IBM are following needs low temperatures so they don't accidentally create cubit errors," Professor Morton says. "Ion traps use low temperatures because they need to create an incredibly good vacuum in which to operate. In the photonics approach, photons travel around quite happily at room temperature, but if you want to detect the types of photons that are being used you often need superconducting detectors, which work better at extremely low temperatures."

While the enormous carbon footprint of classical computing, particularly when it comes to the emissions of the rapidly increasing number of cloud data centres around the world, is well known, quantum computing promises a more sustainable alternative, despite the ultra-low temperatures that are required.

Professor Morton explains that the new Quantum Motion lab is housed in a standard commercial unit. "Our power requirement is not very different to that of a typical office," he says.

While energy requirements will increase as quantum machines become more powerful, they are still likely to remain more efficient than their classical counterparts. "In general we anticipate workloads where we'll have quantum advantage to be more efficient than the classical route," says Jean-Francois Bobier, partner and director at Boston Consulting Group.

The key factor in this is speed. "Cooling down one of these fridges to a fraction of a degree above absolute zero takes about 10-15 kilowatts," says Professor Morton. "But with that quantum chip, you can do things that would take vast computing resources to achieve. These machines are not designed to replace a desktop computer, which can use less than a kilowatt of energy a day. They are a replacement for something that consumes much more."

Google demonstrated this in 2019 with Sycamore, its 53-qubit quantum processor, which it benchmarked against IBM's Summit, at the time the world's most powerful classical supercomputer. Sycamore was able to complete a random number problem in three minutes 20 seconds. Summit took two and a half days to solve the same problem. This increased speed meant the power consumed by Sycamore to achieve this milestone was orders of magnitude lower: 30 kilowatts, compared to the 25 megawatts required by Summit.
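A quick back-of-the-envelope calculation using the figures quoted above makes the gap explicit. The run times and power draws are the published approximations, so the result is only indicative:

```python
# Rough energy comparison using the figures quoted above
# (~30 kW for ~200 seconds vs ~25 MW for ~2.5 days).
sycamore_kwh = 30 * (200 / 3600)          # about 1.7 kWh
summit_kwh = 25_000 * (2.5 * 24)          # about 1.5 million kWh
print(round(sycamore_kwh, 1), round(summit_kwh))
print(f"~{round(summit_kwh / sycamore_kwh):,}x less energy for Sycamore")
```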

Though the nascent quantum computing industry is focused on the 'fidelity' (meaning quality and reliability) of qubits, Bobier says this does not need to be at the expense of energy efficiency. "Given all the advantages of quantum computing, exact computation is the priority over energy efficiency right now; fidelity is the key bottleneck," he says. "We might possibly find a new way to control qubits that is both exact and consumes a lot of energy, but right now we don't see that, even with superconducting qubits which require dilution fridges. The ratio relative to the calculation speed-up should remain massively in favour of quantum computing."

But quantum computing's cooling requirements bring with them practical challenges.

IBM's roadmap anticipates that it will release a 433-qubit quantum chip this year, with a 1,000-qubit version to follow. This number will need to grow exponentially to realise the full benefits of quantum computing, Professor Morton says.

"The 100 qubit chip IBM released recently is about 2.5cm square," he says. "So if you ask yourself what that chip will look like if you have one million qubits, which is likely to be the amount you need to establish a fault-tolerant architecture, then you're looking at chip which is 2.5m square. The kind of cooling technology required to go to that sort of size hasn't been worked out, and certainly, if you're working in superconducting qubits one of the things you'll need to think about is how to scale the cooling system. It's definitely one of the challenges."

IBM's solution to this is to build its own enormous fridge. The company is currently constructing what it says will be the world's largest dilution refrigerator. Code-named Goldeneye, it will have a licence to chill a quantum computer of up to one million qubits, and measure some 3m tall by 1.8m wide. The project was announced in 2020 and construction is due to be completed next year. Once operational it will take between 5-14 days to reach the temperature required for a large quantum computer to operate.

Such a sizeable investment may not be practical for companies without IBM's resources, but other techniques are being investigated. Quantum computing start-up IonQ, for example, is building quantum computers on the Ion Trap architecture, and cools its qubits by using a laser to cool the individual atoms which are required to be in a quantum state, a process known as laser doppler cooling.

Professor Morton says that whoever comes up with the best cooling solution could have a significant advantage as commercial applications for quantum computers start to emerge. "At the moment there are three or four different architectures which are being most actively investigated," he says. "I think it's certainly possible that the practicalities of cooling may well influence which qubit technology ends up winning."


Matthew Gooding is news editor for Tech Monitor.


Quantum Computers: Why They Are Hard To Build And Worth The Effort – Swarajya

This is Part 2 of the two-part article on quantum computing. Read Part 1 here.

Quantum Cryptography And Post-Quantum Cryptography

Present-day systems are protected by Rivest-Shamir-Adleman (RSA) encryption, which is based on the fact that it is practically impossible for classical computers to factorise large integers.

Peter Shor surprised the world with his polynomial-time quantum algorithm, which made it theoretically possible for a quantum computer to factorise large positive integers, thereby putting present-day encryption, and hence computer and communications systems, at risk.
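To see why factoring matters, here is a toy RSA example with deliberately tiny primes (real moduli are thousands of bits long). Anyone who can factor the public modulus n back into p and q can recompute the private key, which is exactly the step Shor's algorithm would make feasible on a large quantum computer. This sketch is for illustration only and omits padding and every other practical detail of real RSA.

```python
# Toy RSA: the private exponent d is computed from phi(n), which requires
# knowing the factors p and q of the public modulus n.  (Requires Python 3.8+
# for modular inverse via pow.)
p, q = 61, 53
n = p * q                       # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                          # public exponent
d = pow(e, -1, phi)             # private exponent -- recoverable by anyone who factors n

message = 42
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
print(ciphertext, recovered)    # recovered == 42
```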

A quantum computer powerful enough to run the algorithm to factor large integers may be several decades away, but the effort to build the next generation of encryption schemes resistant to a breach using quantum computers is already ongoing.

There are two approaches. The first one, called post-quantum cryptography, is based on constructing classical cryptographic algorithms that are hard for a quantum computer to break.

The other approach, quantum cryptography, is to use the properties of quantum mechanics itself to secure data.

Quantum cryptography is defined as using quantum mechanical properties for cryptography tasks, such as quantum key distribution (QKD).

Keys are large binary strings of numbers used to provide security to most cryptographic protocols, like encryption and authentication. Though classical key distribution algorithms like Diffie-Hellman provide the secure exchange of keys between two parties, they will be vulnerable in the future to quantum computers.

The first QKD scheme was proposed by Bennett and Brassard in 1984. It is called the BB84 protocol and is based on Heisenberg's Uncertainty Principle. The basic idea of this protocol is that Alice can send a secret key to Bob encoded in the polarisation of a string of photons. If an eavesdropper tries to intercept and read it, the state of the photons will change, revealing the presence of the eavesdropper.
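The core of BB84 can be sketched in ordinary code. The toy simulation below assumes a noiseless channel with no eavesdropper, and it skips the sample-comparison step that would actually detect one; it only shows the encode-in-random-bases, measure-in-random-bases, and sift steps described above.

```python
# Toy, noiseless BB84 sketch: Alice encodes random bits in random bases,
# Bob measures in his own random bases, and they keep ("sift") only the
# positions where the bases matched.
import random

random.seed(1)
n = 16
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # + rectilinear, x diagonal
bob_bases = [random.choice("+x") for _ in range(n)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                   # matching basis: outcome is certain
    else:
        bob_results.append(random.randint(0, 1))  # wrong basis: 50/50 outcome

sifted_alice = [b for b, a, c in zip(alice_bits, alice_bases, bob_bases) if a == c]
sifted_bob = [b for b, a, c in zip(bob_results, alice_bases, bob_bases) if a == c]
print(sifted_alice == sifted_bob, sifted_alice)   # True, plus the shared key bits
```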

Why Are Quantum Computers Hard To Build?

Qubits, unlike classical bits, need to interact strongly among themselves to form entangled states, which in turn form the basis for computation in quantum computers. But to achieve this experimentally is incredibly hard.

We don't want qubits to interact with the environment, because such interaction causes decoherence. Decoherence is the phenomenon due to which quantum effects are visible in the microscopic world but not in the macroscopic world. The main difference between classical information and quantum information is that we can't observe a quantum state without damaging it in some uncontrolled way. Even if we do not look at quantum computers, nature continuously interferes with them. That is why the information a quantum computer is processing needs to be almost completely isolated from the outside world.

Why We Believe Quantum Computers Can Be Built

Since Richard Feynman's talk 40 years ago, we have come a long way, but the quantum computers present today are not very useful yet.

We are presently in the NISQ era of quantum computers. NISQ stands for 'Noisy Intermediate-Scale Quantum'. 'Intermediate scale' means that the qubit count is greater than 50 and it cannot be simulated using the most powerful classical supercomputers. 'Noisy' means that these devices are not yet error-corrected.

Through the discovery of polynomial-time factorisation and discrete logarithm algorithms by Shor, interest in quantum computing skyrocketed, but scepticism regarding quantum computing remained, captured in the saying that it is "the computer scientist's dream [but] the experimenter's nightmare".

Again, it was Shor who showed the way. He discovered quantum error-correcting codes and fault-tolerant methods for executing quantum computations reliably on noisy hardware.

In classical error correction, we measure bits to find out errors, but measuring a qubit will destroy the state of the qubit. Shor found a way to detect errors in the qubit without measuring the state of the qubit itself.
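The flavour of this idea can be seen in the three-qubit bit-flip code, the simplest building block of the kind of code Shor introduced. The numpy sketch below is a simplified illustration (not Shor's full nine-qubit code): the two parity checks Z1Z2 and Z2Z3 identify which qubit was flipped while revealing nothing about the encoded amplitudes, so the logical state is never measured directly.

```python
# numpy sketch of the three-qubit bit-flip code.  Real amplitudes are assumed
# for simplicity; the parity checks pinpoint a flipped qubit without
# disturbing the encoded amplitudes alpha and beta.
import numpy as np

alpha, beta = 0.6, 0.8                        # arbitrary logical state, a^2 + b^2 = 1
state = np.zeros(8)
state[0b000], state[0b111] = alpha, beta      # encoded state a|000> + b|111>

def flip(psi, qubit):
    """Apply an X (bit-flip) error to one qubit; qubit 0 is the leftmost bit."""
    out = np.zeros_like(psi)
    for idx, amp in enumerate(psi):
        out[idx ^ (1 << (2 - qubit))] = amp
    return out

def parity(psi, q1, q2):
    """Eigenvalue (+1 or -1) of the stabilizer Z_q1 Z_q2 on this state."""
    signs = [(-1) ** (((i >> (2 - q1)) & 1) ^ ((i >> (2 - q2)) & 1)) for i in range(8)]
    return int(round(sum(s * amp ** 2 for s, amp in zip(signs, psi))))

corrupted = flip(state, 1)                    # bit-flip error on the middle qubit
syndrome = (parity(corrupted, 0, 1), parity(corrupted, 1, 2))
which = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[syndrome]
recovered = corrupted if which is None else flip(corrupted, which)
print(syndrome, np.allclose(recovered, state))   # (-1, -1) True
```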

The discovery of error-correcting codes showed that we will be able to scale up quantum computers to the degree that they can solve practical problems, but we will need a lot more qubits and a lower inherent error rate before any such correction is useful.

Industry, Governments Are Interested

The promise of quantum computing has propelled major industry players like IBM, Google, Microsoft, Amazon, Honeywell and Alibaba into pouring billions of dollars into quantum computing research.

Google plans to build a full-scale quantum computer by 2029, one that can be used for solving practical business problems. Companies like IBM have laid out technology development milestones to develop a scalable and fault-tolerant quantum computer.

Startups are not being left behind in this wave of investment. Several millions of dollars have been invested into startups like Rigetti Computing, IonQ, Xanadu and PsiQuantum to develop quantum computers.

Governments across the world are pumping billions of dollars into quantum computing research. In 2019, the United States National Science Foundation (NSF) and Department of Energy (DOE) committed to spending $1.2 billion over a period of five years to support quantum computing research.

Similarly, China has included quantum technology as one of the high-technology investment areas in its 14th five-year plan. India too has announced a National Mission on Quantum Technologies & Applications (NM-QTA).

While investment of billions of dollars into quantum computing will not immediately result in a practically usable quantum computer, the future promise of the power that quantum computing may deliver has set in motion a flurry of investments into the field.

Quantum Technology In India, In Brief

The Indian government in its 2020 budget announced the Rs 8,000 crore ($1.2 billion) NM-QTA. The mission aims to focus on fundamental science and technology development, and to help prepare the next generation of workforce, encourage entrepreneurship, and address issues concerning national priorities.

In India, the Defence Research and Development Organisation (DRDO) and Indian Space Research Organisation (ISRO) have made strides on the quantum communication front.

Not long ago, DRDO demonstrated QKD between Prayagraj and Vindhyachal in Uttar Pradesh over a 100 km fibre optic link.

ISRO, on the other hand, demonstrated quantum entanglement-based real-time QKD over a 300 m atmospheric channel. This is a step towards the development of the planned satellite-based quantum communication (SBQC).

Efforts at building a quantum computer in India presently seem to be limited to academic efforts.

The Future

Noise severely limits the scale of computations in NISQ-era devices. We expect to overcome this issue in the long run using quantum error correction and fault-tolerant quantum computing (FTQC), but the number of qubits required to run these error-correcting schemes is very high and depends on the algorithms we are trying to run and the quality of the hardware.

Present-day quantum computers are not capable enough to replace supercomputers, given the fact that the scaling of the number of qubits remains a challenge.

In 2019, Google demonstrated quantum supremacy using its 53-qubit quantum computer. This means that a programmable quantum device solved a problem that no classical computer can solve in any feasible amount of time. This may give the impression that quantum computers have become more powerful than classical computers, but the problem that was solved on the quantum computer is random in nature and doesn't have any practical significance in real life.

The path to fault-tolerant and error-corrected quantum computers will remain difficult due to the fragile nature of qubits, but the possibilities quantum computers offer makes the pursuit worthwhile.

This concludes the two-part article on quantum computing. Read Part 1 if you haven't already.

This article has been published as part of Swasti 22, the Swarajya Science and Technology Initiative 2022. We are inviting submissions towards the initiative.



QuantWare will build you a custom 25-qubit quantum processor in 30 days – TechCrunch

It's still very early days for quantum computing, but even so, we're already seeing early signs of hardware and an ecosystem that's starting to resemble the classical computing space, with different startups specializing in the different components that make up a quantum computer. Delft, Netherlands-based QuantWare basically wants to become the chip manufacturer for this ecosystem, and today the company announced that it can now offer researchers and other startups in the space a custom 25-qubit quantum processing unit (QPU). And in an industry with long lead times, QuantWare says it can deliver this new QPU, dubbed the Contralto, in 30 days.

The company launched its first processor last year. That was a five-qubit affair, with each qubit reaching 99.9% fidelity. That made for a nice proof of concept, and QuantWare co-founder and managing director Matthijs Rijlaarsdam noted that it has already been used to build full-stack quantum computers. "The five-qubit QPU allows people who are not able to make qubits to, for the first time, build a quantum computer because they can now get qubits. The 25-qubit QPU allows anyone in the field to get to the state of the art of the best laboratories in the world," he explained, adding that there are actually very few laboratories currently able to build a similar QPU (think ETH and Lincoln Labs).


The Netherlands is investing heavily in quantum startups, and QuantWare, with its heritage as a spin-off of Delft University, has been able to attract a group of highly qualified researchers and engineers. Alessandro Bruno, another co-founder and the company's director of engineering, previously spent more than 10 years working on different aspects of quantum computing, including at the DiCarlo lab at Delft University's QuTech.

While Delft may not be the first place you think about when you think about quantum computing, it's worth noting that it has become a hotbed for quantum innovation. In addition to a wide variety of startups that are, like QuantWare, often associated with the school, Microsoft set up a lab at the university in 2019, too, though we haven't heard all that much about the company's own efforts to build qubits lately. Maybe it's no surprise that QuantWare has also hired engineers away from Microsoft.

Because of this existing tech ecosystem, QuantWare can get access to state-of-the-art cleanroom facilities to produce its superconducting QPUs. But even more importantly, the company has been able to collaborate with a lot of other quantum startups, too. "What also helps is this ecosystem of partners that we find ourselves in," Rijlaarsdam said. "We are able to collaborate, for instance, with a control hardware maker that needs to test their control hardware. We can provide them with the chip that we need measured anyway."

For the new QPU, potential buyers can choose from a library of components and decide how the qubits are wired together based on their specific needs. Because every qubit features multiple lines to control and read its state, it's this hardware control system that also limits the size of the chip. "We chose to go with this particular layout because it shows what is basically the maximum you can do at this size," he said. "Beyond this, those lines will become an issue. You'll run out of space at the edges." The team expects to shift to a different technology for its next-generation chip, though Rijlaarsdam wasn't quite ready to provide any details about that yet.

A quantum computer with a 25-qubit QPU can't quite keep up with what IBM, IonQ, Rigetti and others can currently offer, but it is also QuantWare's first play at selling its unit to the systems integrator market, and especially to new players in this market. Rijlaarsdam told me that the company is already talking to a few companies that plan to build full-stack quantum systems based on its design. "We're trying to enable people to become Dell, the Dell of quantum," he said.


NATO and White House recognize post-quantum threats and prepare for Y2Q – VentureBeat


Over the past decade, encryption has emerged as one of the key solutions that organizations use to secure enterprise communications, services and applications. However, the development of quantum computing is putting these defenses at risk, with the next generation of computers having the capability to break the public-key cryptography (PKC) algorithms underpinning them.

While quantum computing technology is still in its infancy, the potential threat of PKC decryption remains. Yesterday, the NATO Cyber Security Center (NCSC) announced that it had tested a post-quantum VPN from U.K.-based company Post-Quantum to secure its communication flows.

Post-Quantum's VPN uses post-quantum cryptography that it claims is complex enough to prevent an attacker armed with a quantum computer from decrypting transmissions.

The development of these post-quantum cryptographic solutions offers a solution that enterprises and technical decision makers can use to protect their encrypted data from quantum computers.

NATO isn't alone in taking post-quantum cyber attacks seriously. The U.S. National Institute of Standards and Technology (NIST) recently announced that it was developing a standard to migrate to post-quantum cryptography to begin replacing hardware, software, and services that rely on public-key algorithms.

At the same time, the White House is also concerned over the threat raised by post-quantum computing, recently releasing a National Security Memorandum which gave the National Security Agency (NSA) 30 days to update the Commercial National Security Algorithm Suite (CNSA Suite) and to add quantum-resistant cryptography.

The memorandum also noted that within 180 days, agencies that handle national security systems must identify all instances of encryption not in compliance with NSA-approved Quantum Resistant Algorithms and chart a timeline to transition these systems to use compliant encryption, to include quantum resistant encryption.

While quantum computers aren't yet capable of breaking modern public-key algorithms like RSA, Post-Quantum's CEO Andersen Cheng believes that as quantum technology develops we will reach a "Y2Q" scenario, where all these security measures are obsolete in the face of the computational power of weaponized quantum computers.

"People frequently talk about commercial quantum computers when referencing this Y2Q moment, and that's a long way off, potentially 10-15 years away. But from a cybersecurity perspective, we're not talking about slick commercial machines; a huge, poorly functioning prototype in the basement is all that's needed to break today's encryption," Cheng said.

"It does not need to go through any benchmark review or certification, and this prospect is much closer; it could happen within the next three to five years," Cheng said.

If Cheng is correct that non-commercial quantum computing solutions could be developed to weaponize quantum computing in just a few years, then organizations have a narrow window in which to enhance their encryption protections, or they risk handing malicious entities and nation-states a skeleton key to their private data.

However, it's not just data exposed post-Y2Q that's at risk; potentially any encrypted data that's been harvested in the past could be decrypted as part of a retrospective attack.

"Quantum decryption can be applied retrospectively, in that the groundwork for a 'harvest now, decrypt later' attack could be laid today. This means that, if a rogue nation-state or bad actor intercepted data today, they could decrypt this harvested data once quantum computers' capabilities exceed those of classical computers," he said.

As more enterprises recognize the need for quantum cryptography in a post-quantum world, the post-quantum cryptography market is anticipated to reach $9.5 billion by 2029, with more than 80% of revenues from the market coming from web browsers, the IoT, machine tools, and the cybersecurity industry.

While quantum computing could pose a substantial threat to enterprises down the line, there are a wide range of solution providers emerging who are developing state-of-the-art post-quantum cryptographic solutions to mitigate this.

One such provider is UK-based post-quantum specialist PQShield, which offers a range of quantum-secure solutions, from IoT firmware to PKI, mobile and server technologies, as well as end-user applications.

Some of PQShield's most recent developments include researchers and engineers contributing to the NIST Post-Quantum Cryptography Standardization Process, and the company recently raised $20 million as part of a Series A funding round.

Another promising provider is Crypta Labs, which raised £5.5 million ($7.4 million) in seed funding in 2020, and recently developed the world's first space-compliant Quantum Random Number Generator, which will be used to securely encrypt satellite data.

Post-Quantum itself is also in a strong position, with its encryption algorithm NTS-KEM becoming the only code-based finalist in the NIST process to identify a cryptographic standard to replace RSA and Elliptic Curve for PKC in the post-quantum world.

In any case, the wave of providers developing state-of-the-art cryptographic algorithms means there are plenty of solutions for enterprises to deploy to mitigate the risk of quantum computing, now and in the future, and to ensure that their private data stays protected.

