Category Archives: Quantum Computer
Is quantum computing the next technological frontier? – The Week
As technology continues to advance toward higher realms, a new mechanism has entered the crosshairs of scientists: quantum computing. This process uses the principles of fundamental physics to "solve extremely complex problems very quickly," according to McKinsey & Company.
Using logic-based computing to solve problems isn't a new phenomenon; it was (and remains) the basis for artificial intelligence and digital computers. However, quantum computers are "poised to take computing to a whole new level," McKinsey said, because the introduction of physics into computing has the "potential to solve very complex statistical problems that are beyond the limits of today's computers." Quantum computing alone "could account for nearly $1.3 trillion in value by 2035."
However, while organizations like McKinsey are clearly high on the potential for quantum computing, others say that it could create a slew of new problems.
Quantum computing is a huge leap forward because "complex problems that currently take the most powerful supercomputer several years could potentially be solved in seconds," said Charlie Campbell for Time. This could open "hitherto unfathomable frontiers in mathematics and science, helping to solve existential challenges like climate change and food security."
Quantum computing is already being used for more practical purposes. One company, D-Wave Systems, has "used its quantum computer to help clients determine driver schedules for grocery-store deliveries, the routing of cross-country promotional tours and cargo-handling procedures at the port of Los Angeles," said Bob Henderson for The Wall Street Journal. It could even help optimize seemingly minute problems, such as arranging planes at airport gates. Arranging just 50 planes among 100 gates yields "10 to the hundredth power" possibilities, "far more than the number of atoms in the visible universe," said Henderson. No standard computer "could keep track of all these possibilities. But a quantum computer potentially could."
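The arithmetic behind that figure is easy to verify. The sketch below assumes, as the quoted example implies, that each of the 50 planes can independently be assigned to any of the 100 gates:

```python
# Rough check of the gate-assignment arithmetic from the example above:
# 50 planes, each independently assignable to any of 100 gates.
assignments = 100 ** 50           # equals 10 to the hundredth power
atoms_estimate = 10 ** 80         # commonly cited rough count of atoms
                                  # in the visible universe

print(assignments == 10 ** 100)      # True
print(assignments > atoms_estimate)  # True
```

Even at a billion assignments checked per second, enumerating them all is hopeless, which is the point of Henderson's comparison.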
While ubiquitous usage of quantum computers is a long way away, there are some strides being made, as Google "has built a quantum computer that's about 158 million times faster than the world's fastest supercomputer," said Luke Lango, a senior investment analyst at InvestorPlace. And quantum theory in general "has led to huge advancements over the past century. That's especially true over the past decade," as scientists "have started to figure out how to harness the power of quantum mechanics to make a new generation of superquantum computers."
But with new advancements come new sets of problems. Case in point: Quantum computers have "become a national security migraine," said Campbell for Time, because their ability to solve problems "will soon render all existing cryptography obsolete, jeopardizing communications, financial transactions and even military defenses."
This would be "potentially a completely different kind of problem than one we've ever faced," Glenn S. Gerstell, a former general counsel for the National Security Agency, said to The New York Times. There may be "only a 1% chance of that happening, but a 1% chance of something catastrophic is something you need to worry about." This risk "extends not just to future breaches but to past ones: Troves of encrypted data harvested now and in coming years could ... be unlocked," said Zach Montague for the Times.
Even as the risks are documented, investors are working to ensure quantum computers can be used on a widespread scale. Curtis Priem, the co-founder of AI chip manufacturer Nvidia, is "looking to establish New York's Hudson Valley as an epicenter of quantum-computing research in the country," the Journal said. Priem has already donated more than $75 million to develop a quantum computing system at Rensselaer Polytechnic Institute, making it the first college campus in the world with such a device.
Others are looking at the future of the industry through a more financial lens; Illinois legislators will soon be "asked to consider a series of incentives" as part of the state's "intensifying push to become the nation's hub for quantum computing," said Crain's Chicago Business. One of these major proposals is the creation of an "'enterprise zone' that would allow the state to provide quantum companies exemptions from sales, payroll and utility taxes for up to 40 years." If lawmakers in Illinois pass these incentives, there is a high chance that other states could follow.
NXP, eleQtron and ParityQC Reveal Quantum Computing Demonstrator – Embedded Computing Design
By Ken Briodagh
Senior Technology Editor
Embedded Computing Design
May 30, 2024
News
According to a recent release, NXP Semiconductors has partnered with eleQtron and ParityQC, within the QSea consortium of the DLR Quantum Computing Initiative (DLR QCI), to create what is reportedly the first full-stack, ion-trap based quantum computer demonstrator made entirely in Germany. The new quantum computer demonstrator is located in Hamburg.
"Hamburg is one of our most important R&D locations. We are proud that, together with DLR and our partners eleQtron and ParityQC, we are able to present the first ion-trap based quantum computer demonstrator developed entirely in Germany," said Lars Reger, CTO at NXP Semiconductors. "We are convinced that industry and research communities in Hamburg and throughout Germany will benefit from this project. It will help to build up and expand important expertise in quantum computing, to use it for the economic benefit of us all, and also to further strengthen our digital sovereignty in Germany and the EU."
The goal of this demonstrator is to enable early access to quantum computing resources and help companies and research teams leverage it for applications like climate modeling, global logistics and materials sciences, the companies said.
DLR QCI says it aims to build necessary skills by creating a quantum computing ecosystem in which economy, industry and science cooperate closely to fully leverage the potential of this technology. Quantum computers are expected to tackle complex problems across industries, and will likely dramatically change the cybersecurity landscape.
NXP, eleQtron and ParityQC have used their expertise to build this ion-trap based quantum computer demonstrator by combining eleQtron's MAGIC hardware, ParityQC's architecture, and NXP's chip design and technology. To speed innovation and iteration, they have also developed a digital twin, which reportedly will be used to help the QSea I demonstrator evolve into a quantum computer with a modular architecture, scalable design, and error correction capabilities. That evolution is the goal of the project's ongoing work.
The demonstrator is set up at the DLR QCI Innovation Center in Hamburg and will be made available to industry partners and DLR research teams, the release said. The three partners and the DLR QCI say they aim to foster and strengthen the development of an advanced quantum computing ecosystem in Germany.
"To achieve a leading international position in quantum computing, we need a strong quantum computing ecosystem. Only together will research, industry and start-ups overcome the major technological challenges and successfully bring quantum computers into application," said Dr.-Ing. Robert Axmann, Head of the DLR Quantum Computing Initiative (DLR QCI). "The QSea I demonstrator is an important step for the DLR Quantum Computing Initiative and for Hamburg. It enables partners from industry and research to run quantum algorithms on real ion trap qubits in a real production environment for the first time. This hands-on experience will enable them to leverage the advantages of quantum computers and become part of a strong and sovereign quantum computing ecosystem in Germany and Europe."
Ken Briodagh is a writer and editor with two decades of experience under his belt. He is in love with technology and if he had his druthers, he would beta test everything from shoe phones to flying cars. In previous lives, he's been a short order cook, telemarketer, medical supply technician, mover of the bodies at a funeral home, pirate, poet, partial alliterist, parent, partner and pretender to various thrones. Most of his exploits are either exaggerated or blatantly false.
More from Ken
A combination of tech and medicine – Spectrum News 1
CLEVELAND - The Cleveland Clinic and IBM have published findings on using quantum computing to better understand how diseases spread, and thus how to develop effective therapies.
Specifically, this work was published in the Journal of Chemical Theory and Computation. It sought to learn how quantum computing could be used to predict protein structures, according to a Cleveland Clinic release.
"For decades, researchers have leveraged computational approaches to predict protein structures," the release reads. "A protein folds itself into a structure that determines how it functions and binds to other molecules in the body. These structures determine many aspects of human health and disease."
This work came from the Cleveland Clinic-IBM Discovery Accelerator partnership and is its first peer-reviewed paper on quantum computing. The team was led by Cleveland Clinic postdoctoral fellow Dr. Bryan Raubenolt and IBM researcher Dr. Hakan Doga.
"One of the most unique things about this project is the number of disciplines involved," Raubenolt said in the release. "Our team's expertise ranges from computational biology and chemistry, structural biology, software and automation engineering, to experimental atomic and nuclear physics, mathematics, and of course quantum computing and algorithm design. It took the knowledge from each of these areas to create a computational framework that can mimic one of the most important processes for human life."
The release notes that machine learning has resulted in major strides when it comes to predicting protein structures, explaining that the way this works comes down to the training data.
The limitation is that the models only know what they're taught, leading to lower accuracy when the algorithms encounter a protein that is mutated or very different from those on which they were trained, which is common with genetic disorders.
An alternative option is to rely on simulations to emulate the physics of protein folding. Using these simulations, the goal is to find the most stable shape, which the release describes as crucial for designing drugs.
Once a protein reaches a certain size, however, this becomes quite difficult on a standard computer. Raubenolt explained in the release that even a small protein with just 100 amino acids would take a classical computer time equal to the age of the universe to exhaustively search all the possible outcomes.
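The scale of that claim can be reproduced with a Levinthal-style back-of-envelope calculation. The figures below (three conformations per peptide bond, a sampling rate of 10^15 states per second) are illustrative assumptions, not numbers from the release:

```python
# A 100-amino-acid protein has ~99 peptide bonds; assume 3 plausible
# conformations per bond, exhaustively checked at 10**15 states/second.
conformations = 3 ** 99               # ~1.7e47 candidate shapes
rate = 10 ** 15                       # assumed states checked per second
seconds_needed = conformations // rate
age_of_universe_s = int(4.35e17)      # ~13.8 billion years in seconds

# The search time exceeds the age of the universe by many orders
# of magnitude.
print(seconds_needed > age_of_universe_s)   # True
```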
That's why the researchers utilized both quantum and classical computing methods in their work. The release states that this hybrid approach outperformed previous methods and resulted in increased accuracy.
According to the release, the researchers will continue working on and improving these algorithms.
"This work is an important step forward in exploring where quantum computing capabilities could show strengths in protein structure prediction," Doga said in the release. "Our goal is to design quantum algorithms that can find how to predict protein structures as realistically as possible."
JPMorgan Chase, Argonne National Laboratory and Quantinuum Show Theoretical Quantum Speedup with the … – JP Morgan
NEW YORK, NY; BROOMFIELD, CO; LEMONT, IL; MAY 29, 2024 - In a new paper in Science Advances on May 29, researchers at JPMorgan Chase, the U.S. Department of Energy's (DOE) Argonne National Laboratory and Quantinuum have demonstrated clear evidence of a quantum algorithmic speedup for the quantum approximate optimization algorithm (QAOA).
This algorithm has been studied extensively and has been implemented on many quantum computers. It has potential applications in fields such as logistics, telecommunications, financial modeling, and materials science.
"This work is a significant step towards reaching quantum advantage, laying the foundation for future impact in production," says Marco Pistoia, Head of Global Technology Applied Research at JPMorgan Chase.
The team examined whether a quantum algorithm with low implementation costs could provide a quantum speedup over the best-known classical methods. QAOA was applied to the Low Autocorrelation Binary Sequences (LABS) problem, which has significance in understanding the behavior of physical systems, signal processing and cryptography. The study showed that if the algorithm was asked to tackle increasingly larger problems, the time it would take to solve them would grow at a slower rate than that of a classical solver.
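For readers unfamiliar with LABS, the objective is compact: for a sequence s in {+1, -1}^N, minimize the sidelobe energy E(s), the sum over lags k of the squared autocorrelation C_k(s). The sketch below illustrates the problem itself (it is not the paper's implementation) and brute-forces a tiny instance:

```python
from itertools import product

def labs_energy(s):
    """Sidelobe energy of a +/-1 sequence: sum over lags k >= 1 of
    the squared aperiodic autocorrelation C_k(s) = sum_i s[i]*s[i+k]."""
    n = len(s)
    return sum(
        sum(s[i] * s[i + k] for i in range(n - k)) ** 2
        for k in range(1, n)
    )

# Length-4 Barker sequence: a known low-energy example.
print(labs_energy([1, 1, 1, -1]))   # 2

# Brute force over all 2**8 length-8 sequences is easy, but the search
# space doubles with every added element -- the growth QAOA targets.
best = min(product([1, -1], repeat=8), key=labs_energy)
print(labs_energy(best))
```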
To explore the quantum algorithm's performance in an ideal noiseless setting, JPMorgan Chase and Argonne jointly developed a simulator to evaluate the algorithm at scale. It was built on the Polaris supercomputer, accessed through the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility. The ALCF is supported by DOE's Advanced Scientific Computing Research program.
"The large-scale quantum circuit simulations efficiently utilized the DOE petascale supercomputer Polaris located at the ALCF. These results show how high-performance computing can complement and advance the field of quantum information science," says Yuri Alexeev, a computational scientist at Argonne.
To take the first step toward practical realization of the speedup in the algorithm, the researchers demonstrated a small-scale implementation on Quantinuum's System Model H1 and H2 trapped-ion quantum computers. Using algorithm-specific error detection, the team reduced the impact of errors on algorithmic performance by up to 65%.
"Our long-standing partnership with JPMorgan Chase led to this meaningful and noteworthy three-way research experiment that also brought in Argonne National Lab. The results could not have been achieved without the unprecedented and world-leading quality of our H-Series quantum computer, which provides a flexible device for executing error-correcting and error-detecting experiments on top of gate fidelities that are years ahead of other quantum computers," says Ilyas Khan, Founder and Chief Product Officer of Quantinuum.
About JPMorgan Chase
JPMorgan Chase & Co. (NYSE: JPM) is a leading financial services firm based in the United States of America (U.S.), with operations worldwide. JPMorgan Chase had $4.1 trillion in assets and $337 billion in stockholders' equity as of March 31, 2024. With over 63,000 technologists globally and an annual tech spend of $17 billion, JPMorgan Chase is dedicated to improving the design, analytics, development, coding, testing and application programming that goes into creating high-quality software and new products. Under the J.P. Morgan and Chase brands, the Firm serves millions of customers in the U.S., and many of the world's most prominent corporate, institutional and government clients globally. Visit http://www.jpmorganchase.com/tech for more information.
About Argonne National Laboratory
Argonne National Laboratory seeks solutions to pressing national problems in science and technology by conducting leading-edge basic and applied research in virtually every scientific discipline. Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
About Quantinuum
Quantinuum, the world's largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum's technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With almost 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
Quantinuum recently closed an equity fundraise anchored by JPMorgan Chase with additional participation from Mitsui & Co., Amgen and Honeywell, which remains the company's majority shareholder, bringing the total capital raised by Quantinuum since inception to approximately $625 million.
The Honeywell trademark is used under license from Honeywell International Inc. Honeywell makes no representations or warranties with respect to this service.
Glimpse of Next-Generation Internet – The Good Men Project
By Anne J. Manning, Harvard Gazette
It's one thing to dream up a next-generation quantum internet capable of sending highly complex, hacker-proof information around the world at ultra-fast speeds. It's quite another to physically show it's possible.
That's exactly what Harvard physicists have done, using existing Boston-area telecommunication fiber, in a demonstration of the world's longest fiber distance between two quantum memory nodes. Think of it as a simple, closed internet carrying a signal encoded not by classical bits like the existing internet, but by perfectly secure, individual particles of light.
The groundbreaking work, published in Nature, was led by Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics, in collaboration with Harvard professors Marko Lončar and Hongkun Park, who are all members of the Harvard Quantum Initiative. The Nature work was carried out with researchers at Amazon Web Services.
The Harvard team established the practical makings of the first quantum internet by entangling two quantum memory nodes separated by an optical fiber link deployed over a roughly 22-mile loop through Cambridge, Somerville, Watertown, and Boston. The two nodes were located a floor apart in Harvard's Laboratory for Integrated Science and Engineering.
Quantum memory, analogous to classical computer memory, is an important component of a quantum computing future because it allows for complex network operations and information storage and retrieval. While other quantum networks have been created in the past, the Harvard team's is the longest fiber network between devices that can store, process, and move information.
Each node is a very small quantum computer, made out of a sliver of diamond that has a defect in its atomic structure called a silicon-vacancy center. Inside the diamond, carved structures smaller than a hundredth the width of a human hair enhance the interaction between the silicon-vacancy center and light.
The silicon-vacancy center contains two qubits, or bits of quantum information: one in the form of an electron spin used for communication, and the other in a longer-lived nuclear spin used as a memory qubit to store entanglement, the quantum-mechanical property that allows information to be perfectly correlated across any distance.
(In classical computing, information is stored and transmitted as a series of discrete binary signals, say on/off, that form a kind of decision tree. Quantum computing is more fluid, as information can exist in stages between on and off, and is stored and transferred as shifting patterns of particle movement across two entangled points.)
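That parenthetical can be made concrete with a standard textbook sketch (generic qubit math, nothing specific to the Harvard hardware): a qubit holds two complex amplitudes, and measurement collapses it to 0 or 1 with probabilities set by those amplitudes.

```python
import math

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measurement yields 0 with probability |a|^2 and 1 with |b|^2.
a = b = 1 / math.sqrt(2)    # equal superposition: "between on and off"
p0, p1 = abs(a) ** 2, abs(b) ** 2

print(round(p0, 3), round(p1, 3))   # 0.5 0.5
print(abs(p0 + p1 - 1.0) < 1e-12)   # True: probabilities sum to one
```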
Using silicon-vacancy centers as quantum memory devices for single photons has been a multiyear research program at Harvard. The technology solves a major problem in the theorized quantum internet: signal loss that cant be boosted in traditional ways.
A quantum network cannot use standard optical-fiber signal repeaters because simple copying of quantum information as discrete bits is impossible, making the information secure but also very hard to transport over long distances.
Silicon-vacancy-center-based network nodes can catch, store, and entangle bits of quantum information while correcting for signal loss. After cooling the nodes to close to absolute zero, light is sent through the first node and, by nature of the silicon-vacancy center's atomic structure, becomes entangled with it and is thus able to carry the information.
"Since the light is already entangled with the first node, it can transfer this entanglement to the second node," explained first author Can Knaut, a Kenneth C. Griffin Graduate School of Arts and Sciences student in Lukin's lab. "We call this photon-mediated entanglement."
Over the last several years, the researchers have leased optical fiber from a company in Boston to run their experiments, fitting their demonstration network on top of the existing fiber to indicate that creating a quantum internet with similar network lines would be possible.
"Showing that quantum network nodes can be entangled in the real-world environment of a very busy urban area is an important step toward practical networking between quantum computers," Lukin said.
A two-node quantum network is only the beginning. The researchers are working diligently to extend the performance of their network by adding nodes and experimenting with more networking protocols.
The paper is titled "Entanglement of Nanophotonic Quantum Memory Nodes in a Telecom Network." The work was supported by the AWS Center for Quantum Networking's research alliance with the Harvard Quantum Initiative, the National Science Foundation, the Center for Ultracold Atoms (an NSF Physics Frontiers Center), the Center for Quantum Networks (an NSF Engineering Research Center), the Air Force Office of Scientific Research, and other sources.
This story is reprinted with permission from The Harvard Gazette.
Ripple publishes math prof’s warning: ‘public-key cryptosystems should be replaced’ – Cointelegraph
Professor Massimiliano Sala, of the University of Trento in Italy, recently discussed the future of blockchain technology, as it relates to encryption and quantum computing, with the crew at Ripple as part of the company's ongoing university lecture series.
Sala's discussion focused on the potential threat posed by quantum computers as the technology matures. According to the professor, current encryption methods could be easy for tomorrow's quantum computers to break, thus putting entire blockchains at risk.
What the professor is referring to is a hypothetical paradigm called Q-day, a point at which quantum computers become sufficiently powerful and available for bad actors to break classical encryption methods.
While this would have far-reaching implications for any field where data security is important including emergency services, infrastructure, banking, and defense it could theoretically devastate the world of cryptocurrency and blockchain.
Specifically, Sala warns that all classical public-key cryptosystems should be replaced with counterparts secure against quantum attacks. The idea is that a future quantum computer, or quantum attack algorithm, could crack the encryption on these keys using mathematical brute force.
It bears mention that Bitcoin, the worlds most popular cryptocurrency and blockchain, would fall under this category.
While there currently exists no practical quantum computer capable of such a feat, governments and science institutions around the globe have been preparing for Q-day as if it's an eventuality. For his part, Sala says that such an event may not be imminent. However, physicists at dozens of academic and commercial laboratories have demonstrated breakthroughs that have led many in the field to believe such systems could arrive within a matter of years.
Ultimately, Sala says he's satisfied with the progress being made in the sector and recommends that blockchain developers continue to work with encryption experts who understand the standards and innovations surrounding quantum-proofing modern systems.
The 3 Best Quantum Computing Stocks to Buy in May 2024 – InvestorPlace
Despite taking a hit in April, quantum computing stocks have a long runway ahead
Quantum computing stocks on the whole took a generous hit in April, mostly due to the market's realization that interest rates will not be going down any time soon. Moreover, the U.S. Labor Department's most recent report on inflation has stymied any hopes of a near-term rate cut, with the most likely date for cuts now September 2024 if inflation continues to cool. According to the report, year-over-year inflation decreased from 3.5% to 3.4%. Most consumers won't notice, but it shows the right trend and might motivate larger corporations to rethink expenditures.
After all, the world of quantum computing relies heavily on expenditures in the esoteric. Many of the investments made today to progress the technology may not see maturation for decades to come. As such, investors should be very picky about which quantum computing stocks to buy. Not all are created equal, and not all will claim an equal share of this revolutionary technology's future.
I've been bullish before on International Business Machines (NYSE:IBM) due to its contributions to artificial intelligence and nanomaterials. Many of the company's advantages in the world of technology come from its mastery of computer design and manufacturing. Nowhere is this more evident than in IBM's quantum computing projects, which recall the massive supercomputers of the 20th century yet run at exponentially more powerful speeds.
From its Quantum System Two to the new Heron Processor, IBM is constantly on the cutting edge of quantum technology and shows no signs of slowing down. For investors, this means investing in a company with both the capital and reputation to lead in new quantum technologies.
Furthermore, IBM is incredibly well-diversified, touching several industries from consumer to corporate computers, to AI models and beyond. Thus, even if its quantum computing projects underperform, IBM has diverse ways to make it up to investors.
Branding itself as "the practical quantum computing company," D-Wave Quantum (NYSE:QBTS) still sits at a significant discount from its special purpose acquisition company (SPAC) merger price of around $10. Two years after going public via the SPAC, the company has lost 87% of its value, spelling trouble for anyone who bought in then. However, the tides are turning for D-Wave Quantum's stock, as the company has one of the most pertinent business models among quantum computing stocks.
By leveraging its resources in quantum computing data centers, the company offers the Leap quantum cloud service, which makes the power of quantum computers available for a fraction of the price. Subscribers can use quantum cloud computing to tackle mathematical problems in a fraction of the time required by traditional computing.
Ultimately, this business model lifted the company's Q1 2024 revenue 56% year-over-year (YOY), with Q1 bookings up 54% YOY and gross profit up 294% YOY. As such, QBTS could easily be a quantum computing stock to buy on sheer value for money.
From a quantum computing technology standpoint, IonQ (NYSE:IONQ) still offers some of the most compelling computers on the market. That's because IonQ's proprietary design relies on electromagnetism and atomic interactions to perform sustained calculations. From a quantum mechanics standpoint, this allows for solving far more complex and time-consuming algorithms. These computers are also exceptionally scalable to customer needs, making them versatile.
This is why I recommended the stock back in March 2024. Now I'm doubling down on my recommendation thanks to its recent Q1 2024 report showing decent revenue growth. Though somewhat meager in absolute terms, its $7.6 million in revenue for the quarter represents 77% year-over-year growth. Moreover, the company is keeping generous cash reserves of $434.4 million to maintain operations and research around its projects.
Bearing all this in mind, analysts have awarded the stock "strong buy" ratings for now. As a result, investors looking for a pure play in quantum computing should not pass up IONQ.
On the date of publication, Viktor Zarev did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Viktor Zarev is a scientist, researcher, and writer specializing in explaining the complex world of technology stocks through dedication to accuracy and understanding.
Quantum Computing Enters the Atomic Realm – Optics & Photonics News
Atom-based architectures may have a scalability advantage over other platforms in the quest to build more powerful quantum processors.
An experimental scheme demonstrated by researchers at Princeton and Yale universities is able to convert physical noise into errors that can be corrected more easily. [F. Wojciechowski / Princeton University]
Quantum computers built from arrays of ultracold atoms have recently emerged as a serious contender in the quest to create qubit-powered machines that can outperform their classical counterparts. While other hardware architectures have yielded the first fully functioning quantum processors to be available for programming through the cloud, recent developments suggest that atom-based platforms might have the edge when it comes to future scalability.
That scalability advantage stems from the exclusive use of photonic technologies to cool, trap and manipulate the atomic qubits. Side-stepping the need for complex cryogenic systems or the intricacies of chip fabrication, neutral-atom quantum computers can largely be built from existing optical components and systems that have already been optimized for precision and reliability.
"The traps are optical tweezers, the atoms are controlled with laser beams and the imaging is done with a camera," says Jeff Thompson, a physicist at Princeton University, USA, whose team has been working to build a quantum computer based on arrays of ytterbium atoms. "The scalability of the platform is limited only by the engineering that can be done with the optical system, and there is a whole industry of optical components and megapixel devices where much of that work has already been done."
Jeff Thompson and his team at Princeton University, USA, have pioneered the use of ytterbium atoms to encode and manipulate quantum information. [S.A. Khan / Fotobuddy]
Such ready availability of critical components and systems has enabled both academic groups and commercial companies to scale their quantum processors from tens of atomic qubits to several hundred in the space of just a few years. Then, in November 2023, the California-based startup Atom Computing announced that it had populated a revamped version of its commercial system with almost 1,200 qubits, more than had yet been reported for any hardware platform. "It's exciting to be able to showcase the solutions we have been developing for the past several years," says Ben Bloom, who founded the company in 2018 and is now its chief technology officer. "We have demonstrated a few firsts along the way, but while we have been building, the field has been getting more and more amazing."
Neutral atoms offer many appealing characteristics for encoding quantum information. For a start, they are all identical, completely free of any imperfections that may be introduced through fabrication, which means that they can be controlled and manipulated without the need to tune or calibrate individual qubits. Their quantum states and interactions are also well understood and characterized, while crucial quantum properties such as superposition and entanglement are maintained over long enough timescales to perform computational tasks.
However, early attempts to build quantum computers from neutral atoms met with two main difficulties. The first was the need to extend existing methods for trapping single atoms in optical tweezers to create large-scale atomic arrays. Although technologies such as spatial light modulators enable laser beams to produce a regular pattern of microtraps, loading the atoms into the tweezers is a stochastic process, which means that each trap has only a 50% probability of being occupied. As a result, the chance of creating a defect-free array containing large numbers of atoms becomes vanishingly small.
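The arithmetic behind that vanishingly small probability is easy to check. A minimal Python sketch (the 50% fill probability comes from the article; the function name is ours):

```python
# Probability that every one of N stochastically loaded tweezers is
# occupied, given that each trap fills independently with probability 0.5.
def defect_free_probability(n_traps: int, p_fill: float = 0.5) -> float:
    return p_fill ** n_traps

# Even a modest array is essentially never filled in a single shot:
print(defect_free_probability(10))   # 0.0009765625
print(defect_free_probability(100))  # ~7.9e-31
```

At 100 traps the one-shot success probability is far smaller than one in the number of atoms in the visible universe, which is why rearrangement was needed.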
The solution came in 2016, when three separate groups, based at the Institut d'Optique, France; Harvard University, USA; and the Korea Advanced Institute of Science and Technology (KAIST), Republic of Korea, demonstrated a concept called rearrangement. In this scheme, an image is taken of the atoms when they are first loaded into the tweezers, identifying which sites are occupied and which are empty. All the vacant traps are switched off, and the loaded ones are then moved to fill the gaps in the array. This shuffling procedure can be achieved, for example, by using acousto-optic deflectors to alter the positions of the trapping laser beams, creating dynamic optical tweezers that can be combined with real-time control to assemble large arrays of single atoms in less than a second.
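As an illustration only (not any group's actual control code), the rearrangement logic can be sketched in a few lines of Python: image the traps, keep the occupied ones, and compute the tweezer moves that assemble a defect-free block:

```python
import random

def rearrange(occupancy, target_size):
    """Toy version of the 2016 rearrangement protocol: image the array,
    switch off empty traps, and move loaded atoms to fill a contiguous
    defect-free target region at the start of the array."""
    loaded = [i for i, occ in enumerate(occupancy) if occ]  # "camera image"
    if len(loaded) < target_size:
        return None  # not enough atoms were captured on this shot
    moves = []
    for target_site, source_site in zip(range(target_size), loaded):
        if source_site != target_site:
            moves.append((source_site, target_site))  # dynamic tweezer move
    return moves

# Stochastic loading: each of 20 traps fills with probability 0.5.
random.seed(1)
occupancy = [random.random() < 0.5 for _ in range(20)]
print(rearrange(occupancy, target_size=8))
```

Real systems solve a more elaborate assignment problem in two dimensions, but the image-then-shuffle structure is the same.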
[Enlarge image]Large defect-free arrays of single atoms can be created through the process of rearrangement. In this example, demonstrated by a team led by Antoine Browaeys of the Institut d'Optique, France, an ordered array of 324 atoms was created from 625 randomly filled traps. [Reprinted with permission from K.-N. Schymik, Phys. Rev. A 106, 022611 (2022); © 2022 by the American Physical Society]
"Before that, there were lots of complicated ideas for generating single-atom states in optical tweezers," remembers Thompson. This rearrangement technique enabled the creation of large arrays containing one hundred or so single atoms without defects, and that has since been extended to much higher numbers.
In these atomic arrays, the qubits are encoded in two long-lived energy states that are controlled with laser light. In rubidium, for example, which is often used because its well-understood atomic transitions can be manipulated relatively easily, the single outermost electron occupies one of two distinct energy levels in the ground state, caused by the coupling between the electron spin and the nuclear spin. The atoms are easily switched between these two energy states by flipping the spins relative to each other, which is achieved with microwave pulses tuned to 6.8 GHz.
While atoms in these stable low-energy levels offer excellent single-qubit properties, the gate operations that form the basis of digital computation require the qubits to interact and form entangled states. Since the atoms in a tweezer array are too far apart for them to interact while remaining in the ground state, a focused laser beam is used to excite the outermost electron into a much higher energy state. In these highly excited Rydberg states, the atom becomes physically much larger, generating strong interatomic interactions on sub-microsecond timescales.
One important effect of these interactions is that the presence of a Rydberg atom shifts the energy levels in its nearest neighbors, preventing them from being excited into the same high-energy state. This phenomenon, called the Rydberg blockade, means that only one of the atoms excited by the laser will form a Rydberg state, but it's impossible to know which one. Such shared excitations are the characteristic feature of entanglement, providing an effective mechanism for controlling two-qubit operations between adjacent atoms in the array.
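A toy calculation makes the shared-excitation picture concrete. Assuming a perfect blockade, the laser drives the pair into the state (|gr⟩ + |rg⟩)/√2; tracing out one atom then leaves a maximally mixed state, the signature of maximal entanglement. A NumPy sketch (illustrative, not from the article):

```python
import numpy as np

# Two-atom basis ordering: |gg>, |gr>, |rg>, |rr>
g, r = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Under perfect blockade the doubly excited |rr> state is forbidden, so
# the laser drives the pair into a single shared excitation:
bell = (np.kron(g, r) + np.kron(r, g)) / np.sqrt(2)

# Reduced state of atom 1: trace out atom 2 and check its purity.
rho = np.outer(bell, bell).reshape(2, 2, 2, 2)
rho_1 = np.einsum('ijkj->ik', rho)          # partial trace over atom 2
purity = np.trace(rho_1 @ rho_1).real
print(round(purity, 3))  # 0.5 -> atom 1 is maximally mixed, pair entangled
```

A purity of 1 would mean no entanglement; 0.5 is the minimum possible for a qubit, confirming the blockade state is maximally entangled.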
Until recently, however, the logic gates created through two-atom entanglement were prone to errors. "For a long time, the fidelity of two-qubit operations hovered at around 80%, much lower than could be achieved with superconducting or trapped-ion platforms," says Thompson. "That meant that neutral atoms were not really taken seriously for gate-based quantum computing."
The sources of these errors were not fully understood until 2018, when breakthrough work by Antoine Browaeys and colleagues at the Institut d'Optique and Mikhail Lukin's team at Harvard University analyzed the effects of laser noise on the gate fidelities. "People had been using very simple models of the laser noise," says Thompson. "With this work, they figured out that phase fluctuations were the major contributor to the high error rates."
At a stroke, these two groups showed that suppressing the laser phase noise could extend the lifetime of the Rydberg states and boost the fidelity of preparing two-qubit entangled states to 97%. Further enhancements since then have yielded two-qubit gate fidelities of more than 99%, the minimum threshold for fault-tolerant quantum computing.
That fundamental advance established atomic qubits as a competitive platform for digital quantum computing, catalyzing academic groups and quantum startups to explore and optimize the performance of different atomic systems. While rubidium continues to be a popular choice, several groups believe that ytterbium could offer some crucial benefits for large-scale quantum computing. "Ytterbium has a nuclear spin of one half, which means that the qubit can be encoded purely in the nuclear spin," explains Thompson. "While all qubits based on atoms or ions have good coherence by default, we have found that pure nuclear-spin qubits can maintain coherence times of many seconds without needing any special measures."
Pioneering experiments in 2022 by Thompson's Princeton group, as well as by a team led by Adam Kaufman at JILA in Boulder, CO, USA, first showed the potential of the ytterbium-171 isotope for producing long-lived atomic qubits. Others have followed their lead, with Atom Computing replacing the strontium atoms in its original prototype with ytterbium-171 in the upgraded 1,200-qubit platform. "Strontium also supports nuclear qubits, but we found that we needed to do lots of quantum engineering to achieve long coherence times," says Bloom. "With ytterbium, we can achieve coherence times of tens of seconds without the need for any of those extra tricks."
Atom Computing's first-generation quantum computer exploited around 100 qubits of single strontium atoms, while its next-generation platform can accommodate around 1,200 ytterbium atoms. [Atom Computing]
The rich energy-level structure of ytterbium also provides access to a greater range of atomic transitions from the ground state, offering new ways to manipulate and measure the quantum states. Early experiments have shown, for example, that this additional flexibility can be exploited to measure some of the qubits while a quantum circuit is being run but without disturbing the qubits that are still being used for logical operations.
Indeed, the ability to perform these mid-circuit measurements is a critical requirement for emerging schemes to locate and correct physical errors in the system, which have so far compromised the ability of quantum computers to perform complex computations. These physical errors are caused by noise and environmental factors that perturb the delicate quantum states, with early estimates suggesting that millions of physical qubits might be needed to provide the redundancy needed to achieve fault-tolerant quantum processing.
More recently, however, it has become clear that fewer qubits may be needed if the physical system can be engineered to limit the impact of the errors. One promising approach is the concept of erasure conversion, demonstrated in late 2023 by a team led by Thompson and Shruti Puri at Yale University, USA, in which the physical noise is converted into errors with known locations, also called erasures.
In their scheme, the qubits are encoded in two metastable states of ytterbium, for which most errors will cause them to decay back to the ground state. Importantly, those transitions can easily be detected without disturbing the qubits that are still in the metastable state, allowing failures to be spotted while the quantum processor is still being operated. "We just flash the atomic array with light after a few gate operations, and any light that comes back illuminates the position of the error," explains Thompson. "Just being able to see where they are could ultimately reduce the number of qubits needed for error correction by a factor of ten."
Experiments by the Princeton researchers show that their method can currently locate 56% of the errors in single-qubit gates and 33% of those in two-qubit operations, which can then be discarded to reduce the effects of physical noise. The team is now working to increase the fidelity that can be achieved when using these metastable states for two-qubit operations, which currently stands at 98%.
A team led by Mikhail Lukin (right) at Harvard University, USA, pictured with lab member Dolev Bluvstein, created the first programmable logical quantum processor, capable of encoding up to 48 logical qubits. [J. Chase / Harvard Staff Photographer]
Meanwhile, Lukin's Harvard group, working with several academic collaborators and Boston-based startup QuEra Computing, has arguably made the closest approach yet to error-corrected quantum computing. One crucial step forward is the use of so-called logical qubits, which mitigate the effects of errors by sharing the quantum information among multiple physical qubits.
Previous demonstrations with other hardware platforms have yielded one or two logical qubits, but Lukin and his colleagues showed at the end of 2023 that they could create 48 logical qubits from 280 atomic qubits. They used optical multiplexing to illuminate all the rubidium atoms within a logical qubit with identical light beams, allowing each logical block to be moved and manipulated as a single unit. Since each atom in the logical block still evolves independently under these parallel operations, this hardware-efficient control mechanism prevents an error in any one physical qubit from escalating into a logical fault.
For more-scalable processing of these logical qubits, the researchers also divided their architecture into three functional zones. The first is used to store and manipulate the logical qubits, along with a reservoir of physical qubits that can be mobilized on demand, ensuring that these stable quantum states are isolated from processing errors in other parts of the hardware. Pairs of logical qubits can then be moved, or shuttled, into the second entangling zone, where a single excitation laser drives two-qubit gate operations with a fidelity of more than 99.5%. In the final readout zone, the outcome of each gate operation is measured without affecting the ongoing processing tasks.
[Enlarge image]Schematic of the logical processor, split into three zones: storage, entangling and readout. Logical single-qubit and two-qubit operations are realized transversally with efficient, parallel operations. [D. Bluvstein et al., Nature 626, 58 (2024); CC-BY-NC 4.0]
The team also configured error-resistant quantum circuits to run on the logical processor, which in one example yielded a fidelity of 72% when operating on 10 logical qubits, increasing to 99% when the gate errors detected in the readout zone at the end of each operation were discarded. When running more complex quantum algorithms requiring hundreds of logical gates, the performance was up to 10 times better when logical qubits were used instead of their single-atom counterparts.
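The jump from 72% to 99% comes from post-selection: discarding the runs in which the readout zone flagged a gate error. A toy Monte Carlo (all rates here are made-up illustrative numbers, not the Harvard team's) shows the mechanism:

```python
import random

def run_circuit(n_gates, p_error, p_detect, rng):
    """One shot of a toy circuit: each gate may fail; a failed gate is
    flagged (detected in the readout zone) with probability p_detect."""
    failed = detected = False
    for _ in range(n_gates):
        if rng.random() < p_error:
            failed = True
            if rng.random() < p_detect:
                detected = True
    return failed, detected

rng = random.Random(0)
shots = [run_circuit(20, 0.015, 0.9, rng) for _ in range(100_000)]
raw_fidelity = sum(not f for f, _ in shots) / len(shots)
kept = [(f, d) for f, d in shots if not d]          # discard flagged shots
postselected = sum(not f for f, _ in kept) / len(kept)
print(raw_fidelity, postselected)  # post-selected fidelity exceeds the raw one
```

The price of post-selection is throughput: every discarded shot must be rerun, which is why full real-time error correction remains the long-term goal.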
While this is not yet full error correction, which would require the faults to be detected and reset in real time, this demonstration shows how a logical processor can work in tandem with error-resistant software to improve the accuracy of quantum computations. The fidelities could be improved still further by sharing the quantum information among more physical qubits, with QuEra's technology roadmap suggesting that by 2026 it will be using as many as 10,000 single atoms to generate 100 logical qubits. "This is a truly exciting time in our field as the fundamental ideas of quantum error correction and fault tolerance start to bear fruit," Lukin commented. "Although there are still challenges ahead, we expect that this new advance will greatly accelerate the progress toward large-scale, useful quantum computers."
In another notable development, QuEra has also won a multimillion-dollar contract to build a version of this logical processor at the UK's National Quantum Computing Centre (NQCC). The QuEra system will be one of seven prototype quantum computers to be installed at the national lab by March 2025, with others including a cesium-based neutral-atom system from Infleqtion (formerly ColdQuanta) and platforms exploiting superconducting qubits and trapped ions.
Once built, these development platforms will be used to understand and benchmark the capabilities of different hardware architectures, explore the types of applications that suit each one, and address the key scaling challenges that stand in the way of fault-tolerant quantum computing. "We know that much more practical R&D will be needed to bridge the gap between currently available platforms and a fully error-corrected neutral-atom quantum computer with hundreds of logical qubits," says Nicholas Spong, who leads the NQCC's activities in tweezer-array quantum computing. "For neutral-atom architectures, the ability to scale really depends on engineering the optics, lasers and control systems."
Researchers at the Boston-based startup QuEra, which collaborates on neutral-atom quantum computing with Mikhail Lukins group at Harvard University, USA. [Courtesy of QuEra]
One key goal for hardware developers will be to achieve the precision needed to control the spin rotations of individual atoms as they become more closely packed into the array. While global light fields and qubit shuttling provide efficient and precise control mechanisms for bulk operations, single-atom processes must typically be driven by focused laser beams operating on the scale of tens of nanometers.
To relax the strict performance criteria for these local laser beams, Thompson's group has demonstrated an alternative solution that works for divalent atoms such as ytterbium. "We still have a global gate beam, but then we choose which atoms experience that gate by using a focused laser beam to shift specific atoms out of resonance with the global light field," he explains. "It doesn't really matter how big the light shift is, which means that this approach is more robust to variations in the laser. Being able to control small groups of atoms in this way is a lot faster than moving them around."
Another key issue is the number of single atoms that can be held securely in the tweezer array. Current roadmaps suggest that arrays containing 10,000 atoms could be realized by increasing the laser power, but scaling to higher numbers could prove tricky. "It's a challenge to get hundreds of watts of laser power into the traps while maintaining coherence across the array," explains Spong. "The entire array of traps should be identical, but imperfect optics makes it hard to make the traps around the edge work as well as those in the center."
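The power budget scales linearly with the number of traps, which is why the watt counts climb so quickly. A back-of-envelope sketch (the 10 mW-per-trap figure is an assumption for illustration; real trap-depth requirements vary widely with atom and trap design):

```python
# Rough tweezer-array power budget: total laser power grows linearly
# with the number of traps. The per-trap figure is illustrative only.
mw_per_trap = 10  # assume ~10 mW of optical power per trap
for n_traps in (1_000, 10_000, 100_000):
    watts = n_traps * mw_per_trap / 1000
    print(f"{n_traps:>7} traps -> {watts:g} W")
```

At that assumed per-trap power, a 10,000-atom array already needs on the order of 100 W delivered into the vacuum chamber, consistent with the "hundreds of watts" challenge Spong describes.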
With that in mind, the team at Atom Computing has deployed additional optical technologies in its updated platform to provide a pathway to larger-scale machines. "If we wanted to go from 100 to 1,000 qubits, we could have just bought some really big lasers," says Bloom. "But we wanted to get on a track where we can continue to expand the array to hundreds of thousands of atoms, or even a million, without running into issues with the laser power."
A quantum engineer measures the optical power of a laser beam at Atom Computing's research and development facility in Boulder, CO, USA. [Atom Computing]
The solution for Atom Computing has been to combine the atomic control provided by optical tweezers with the trapping ability of optical lattices, which are most commonly found in the world's most precise atomic clocks. These optical lattices exploit the interference of laser beams to create a grid of potential wells on the subwavelength scale, and their performance can be further enhanced by adding an optical buildup cavity to generate constructive interference between many reflected laser beams. "With these in-vacuum optics, we can create a huge array of deep traps with only a moderate amount of laser power," says Bloom. "We chose to demonstrate an array that can trap 1,225 ytterbium atoms, but there's no reason why we couldn't go much higher."
Importantly, in a modification of the usual rearrangement approach, this design also allows the atomic array to be continuously reloaded while the processor is being operated. Atoms held in a magneto-optical trap are first loaded into a small reservoir array, from which they are transferred into the target array that will be used for computation. The atoms in both arrays are then moved into the deep trapping potential of the optical lattice, where rapid and low-loss fluorescence imaging determines which of the sites are occupied. Returning the atoms to the optical tweezers then allows empty sites within the target array to be filled from the reservoir, with multiple loading cycles yielding an occupancy of 99%.
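A simple recurrence captures why repeated loading cycles push the occupancy up. In this toy model (the fill and survival probabilities are illustrative assumptions, not Atom Computing's measured values), each cycle refills empty target sites from the reservoir while nearly all filled sites survive imaging and transfer:

```python
def occupancy_after_cycles(n_cycles, p_fill=0.9, survival=0.999):
    """Toy model of iterative reloading: each cycle, an empty target site
    is refilled from the reservoir with probability p_fill, while a filled
    site survives imaging/transfer with probability `survival`.
    (Both numbers are illustrative, not measured values.)"""
    p = 0.0
    for _ in range(n_cycles):
        p = p * survival + (1 - p) * p_fill
    return p

for k in (1, 2, 5, 10):
    print(k, round(occupancy_after_cycles(k), 4))
```

Under these assumed rates the occupancy converges within a handful of cycles to a steady state above 99%, in line with the figure reported for the multi-cycle loading scheme.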
Repeatedly replenishing the reservoir with fresh atoms ensures that the target array is always full of qubits, which is essential to prevent atom loss during the execution of complex quantum algorithms. "Large-scale error-corrected computations will require quantum information to survive long past the lifetime of a single qubit," Bloom says. "It's all about keeping that calculation going when you have hundreds of thousands of qubits."
While many challenges remain, researchers working in the field believe the pace of progress in recent years is already propelling the technology toward the day when a neutral-atom quantum computer will be able to outperform a classical machine. "Neutral atoms allow us to reach large numbers of qubits, achieve incredibly long coherence times and access novel error-correction codes," says Bloom. "As an engineering firm, we are focused on improving the performance still further, since all that's really going to matter is whether you have enough logical qubits and sufficiently high gate fidelities to address problems that are interesting for real-world use cases."
Susan Curtis is a freelance science and technology writer based in Bristol, UK.
Read more:
Quantum Computing Enters the Atomic Realm - Optics & Photonics News
Europe’s Race towards Quantum-HPC Integration and Quantum Advantage – HPCwire
What an interesting panel: Quantum Advantage, Where are We and What is Needed? While the panelists looked slightly weary (theirs was, after all, one of the last panels at ISC 2024), the discussion was fascinating and the panelists knowledgeable. No such panel would be complete without also asking when QA will be achieved. The broad, unsurprising answer to that question is: not especially soon.
The panel included: Thomas Lippert, head of Jülich Supercomputing Centre (JSC) and director at the Institute for Advanced Simulation; Laura Schulz, acting head of quantum computing and technologies, Leibniz Supercomputing Centre (LRZ); Stefano Mensa, advanced computing and emerging technologies group leader, STFC Hartree Centre; and Sabrina Maniscalco, CEO and co-founder, Algorithmiq Ltd. The moderator was Heike Riel, IBM Fellow, head of science & technology and lead of IBM Research Quantum Europe.
Missing from the panel was a pure-play quantum computer developer that might have added a different perspective. Maybe next year. Topics included quantum-HPC integration, the need for benchmarks (though when and how was not clear), the likely role for hybrid quantum-HPC applications in the NISQ world; familiar discussion around error mitigation and error correction, and more.
Of the many points made, perhaps the strongest was around the idea that Europe has mobilized to rapidly integrate quantum computers into its advanced HPC centers.
Schulz said, "The reason that our work in the Munich Quantum Valley (MQV) is so important is because of what happens when we look at the European level. We have the EuroHPC Joint Undertaking. We have the six quantum systems that are going to be placed in hosting centers European-wide, and we all [have] different modalities, and we all have to integrate. We have to think about this at the European level for how we're going to bring these systems together. We do not want multiple schedulers. We do not want multiple solutions that could then clash with one another. We want to try to find unity where it makes sense, and be able to amplify the user experience and smooth the user experience European-wide for them."
The idea is to connect all of these EuroHPC JU systems and make them widely available to academia and industry. LRZ and JSC, for example, have already fielded or are about to field several quantum computers in their facilities (see slides below).
Lippert emphasized that, at least for this session, the focus is on how to achieve quantum advantage: "When we talk about quantum utility, when this becomes useful, then the quantum computer is able to solve problems of practical usage significantly faster than any classical computer [based on] CPUs, GPUs, of comparable size, weight and power in similar environments. We think this is the first step to be made with quantum-HPC hybrid types of simulation, optimization and machine-learning algorithms. Now, how do you realize such quantum advantage? You build HPC-hybrid compute systems. We have the approach that we talk about: the modular supercomputing architecture."
"Our mission is to establish a vendor-agnostic, comprehensive, public quantum computer user infrastructure integrated into our modular complex of supercomputers. [It] is user-friendly and peer-reviewed access. So, like we do with supercomputing."
Schulz drilled down into the software stack being developed at LRZ in collaboration with many partners. On the left side of the slide below are the traditional parts: "co-scheduling, co-resource management, all those components that we need to think of, and that we do think of, with things like disaggregated acceleration," said Schulz.
When you get to the right side, she noted, "we have to deal with the new physics environment, or the new quantum computing environment. So we have a quantum compiler that we are developing; we have a quantum representation moving between them. We've got a robust, customized, comprehensive toolkit with things like the debuggers, the optimizers, all of those components that's built with our partners in the ecosystem. Then we have an interface, this QBMI (quantum back-end manager interface), and this is what connects the systems individually into our whole framework."
"Now, this is really important, and this is part of the evolution. We've been working on this for two years, actively building this up, and we're already starting to see the fruits of our labor. In our Quantum Integration Centre (QIC), we are already able to go from our HPC environment, so our HPC testbed that we have, using our Munich quantum software stack, to an access node on the HPC system, the same hardware, and call to the quantum system. We have that on-prem, these systems are co-located, and it is an integrated effort with our own software stack. So we are making great strides," Schulz said.
The pan-European effort to integrate quantum computing into HPC centers is impressive, and perhaps the furthest along worldwide. Its emphasis is on handling multiple quantum modalities (superconducting, trapped ion, photonic, neutral atom) and approaches (gate-based and annealing), and on trying to develop, relatively speaking, a common, easy-to-use software stack connecting HPC and quantum systems.
Mensa of the U.K.'s STFC zeroed in on benchmarking. Currently there are many efforts but few widely agreed-upon benchmarks. Roughly, the quantum community talks about system benchmarks (low and middle level) that evaluate a system's basic attributes (fidelity, speed, connectivity, etc.) and application-oriented benchmarks intended to look more at time-to-solution, quantum resources needed, and accuracy.
No one disputes the need for quantum benchmarks. Mensa argued for a coordinated effort and suggested the SPEC model as something to look at. "The SPEC Consortium for HPC is a great example, because it's a nonprofit and it establishes and maintains and endorses standardized benchmarks. We need to seek something like that," he said.
He took a light shot at the Top500 metric, arguing it is not the best approach because it doesn't represent practical workloads today, and added: "You know that your car can go up to 260. But on a normal road, we never do that." Others noted that the Top500, based on Linpack, does at least show you can actually get your system up and running correctly. Moreover, noted Lippert and Schulz, the truth is that the Top500 score is not on the criteria lists they use to evaluate advanced systems procurements.
Opinions on benchmarking varied, but it seems that the flurry of separate benchmark initiatives is likely to continue and remain disparate for now. One point of agreement is that quantum technology is moving so fast that it's hard to keep up with, and maybe it's too early to settle on just a few benchmarks. Benchmarking hybrid quantum-HPC systems is even more confusing. All seem to favor a suite of benchmarks over a single metric. This is definitely a stay-tuned topic.
Turning to efforts to achieve practical uses, Maniscalco presented two use cases that demonstrate the ability to combine quantum and HPC resources by using classical computing to mitigate errors. Her company, Algorithmiq Ltd, is developing algorithms for use in bioscience. She provided a snapshot of a technique Algorithmiq has developed that uses tensor processing, in post-processing on classical systems, to mitigate errors on the quantum computer.
"HPC and quantum computers are seen almost as antagonists in the sense that we can use, for example, tensor network methods to simulate quantum systems, and this is, of course, very important for benchmarking," said Maniscalco. "But what we are interested in is bringing these two together, and the quantum-centric supercomputing idea brought forward by IBM is important for us, and what we do is specifically focused on this interface between the quantum computer and the HPC."
"We develop techniques that are able to measure or extract information from the quantum computers in a way that allows [you] to optimize the efficiency in terms of number of measurements; this eventually corresponds to shorter wall-time overhead overall, and also allows [you] to optimize the information that you extract from the quantum computer, and, importantly, allows [this] in post-processing," she said. (Best to read the associated papers for details.)
At the end of Q&A, moderator Heike Riel asked the panel, Where will we be in five years? Here are their brief answers in the order given:
Read more here:
Europe's Race towards Quantum-HPC Integration and Quantum Advantage - HPCwire
‘Quantum-inspired’ laser computing is more effective than both supercomputing and quantum computing, startup claims – Livescience.com
Engineers have developed an optical computer, about the size of a desktop PC, that can purportedly execute complex artificial intelligence (AI) calculations in nanoseconds, rivaling the performance of both quantum and classical supercomputers.
The computer, dubbed the LPU100, uses an array of 100 lasers to perform calculations through a process called laser interference, LightSolver representatives said in a March 19 statement.
In this process, an optimization problem that requires solving is encoded onto physical obstacles on the lasers' paths using a device called a programmable spatial light modulator. These obstacles prompt the lasers to adjust their behavior to minimize energy loss, similar to how water naturally finds the easiest route downhill by following the path of least resistance.
By quickly altering their state to minimize energy waste, the lasers achieve a state of minimal energy loss. This directly corresponds to the problem's solution.
The LPU100 then uses conventional cameras to detect and interpret these laser states, translating them into a mathematical solution to the original optimization problem.
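A classical analogy for this relaxation is simulated annealing on an Ising energy, where a spin configuration settles into its lowest-energy state much as the lasers settle into the lowest-loss one. The sketch below is purely an illustration of that analogy and says nothing about the LPU100's actual optics:

```python
import math
import random

def anneal_ising(J, n_steps=20_000, t0=2.0, seed=0):
    """Classical simulated annealing of an Ising energy
    E = -sum_{i<j} J[i][j] * s_i * s_j -- a loose classical analogy for a
    system relaxing toward its minimal-energy-loss configuration."""
    rng = random.Random(seed)
    n = len(J)
    spins = [rng.choice((-1, 1)) for _ in range(n)]

    def energy(s):
        return -sum(J[i][j] * s[i] * s[j]
                    for i in range(n) for j in range(i + 1, n))

    for step in range(n_steps):
        t = t0 * (1 - step / n_steps) + 1e-9   # cooling schedule
        i = rng.randrange(n)
        before = energy(spins)
        spins[i] = -spins[i]                   # propose a single spin flip
        delta = energy(spins) - before
        if delta > 0 and rng.random() > math.exp(-delta / t):
            spins[i] = -spins[i]               # reject the uphill move
    return spins, energy(spins)

# Ferromagnetic couplings: the optimum is all spins aligned, E = -3.
J = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]
spins, e = anneal_ising(J)
print(spins, e)
```

The key difference claimed for the laser approach is that the physics performs this settling in parallel and in nanoseconds, rather than through sequential trial moves.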
According to the company, the LPU100 can perform complex operations such as vector-matrix multiplications (a demanding computational workload) in just 10 nanoseconds. That is hundreds of times quicker than the fastest graphics processing units (GPUs) can perform the same task.
Bob Sorensen, senior vice president of research and chief analyst for quantum computing at Hyperion Research, said in a statement that LightSolver's technology presented "a low barrier to entry for a wide range of advanced computing users."
Vector-matrix multiplication is key to handling complex tasks involving a large number of potential outcomes. One example is the vehicle routing problem, a logistics challenge used in the transportation and delivery sector to determine the most efficient routes for vehicle fleets.
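The difficulty is the combinatorial explosion: with n stops there are on the order of n! candidate routes, so exhaustive search stops scaling almost immediately. A toy brute-force version over made-up distances shows the structure of the problem:

```python
from itertools import permutations

# Toy route optimization: brute-force the shortest delivery loop over a
# handful of stops. The symmetric distance matrix uses made-up numbers;
# stop 0 is the depot. Factorial growth in routes is what defeats
# exhaustive search at realistic fleet sizes.
dist = [
    [0, 4, 9, 7],
    [4, 0, 6, 3],
    [9, 6, 0, 5],
    [7, 3, 5, 0],
]

def route_length(route):
    stops = (0,) + route + (0,)  # depot -> stops -> back to depot
    return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

best = min(permutations(range(1, 4)), key=route_length)
print(best, route_length(best))
```

Three stops mean only six candidate routes; a few dozen stops already put the count beyond any exhaustive enumeration, which is why heuristic solvers such as Gurobi (and, LightSolver claims, laser-based hardware) are used instead.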
In benchmark tests published by LightSolver, the LPU100 identified the most efficient route for vehicle fleets in less than a tenth of a second, outperforming Gurobi, a commonly used logistics tool, which often failed to find a solution within 10 seconds.
Previous studies published by researchers at Cornell University found that the LPU100 outperformed traditional GPUs in Max-2-SAT challenges, which are used for testing the efficiency of logic-solving algorithms, as well as in the 3-Regular 3-XORSAT problem, a test for evaluating the performance of algorithms used for handling difficult problems that involve sorting through numerous combinations to find the best solution.
While the LPU100 employs what LightSolver dubs "quantum-inspired" technology, it relies on neither qubits nor the laws of quantum mechanics. Instead, it borrows the principle of processing multiple operations simultaneously at very high speeds, which classical computers cannot do.
According to LightSolver, the LPU100's laser array can handle 100 continuous variables, theoretically allowing it to address computational problems involving an astronomically large number of variable combinations (120 to the power of 100).
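That figure is simple to verify in Python, and it dwarfs the commonly cited order-of-magnitude estimate of 10^80 atoms in the observable universe:

```python
# The claimed search-space size: 120 settings for each of 100 variables.
search_space = 120 ** 100
print(len(str(search_space)))            # 208 digits long
atoms_in_universe = 10 ** 80             # common order-of-magnitude estimate
print(search_space > atoms_in_universe)  # True
```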
This makes it particularly well suited for industries such as finance, aerospace, logistics and manufacturing, all of which have resource-intensive data demands, the company said.
Quantum computers require extremely cold temperatures to operate and remain highly experimental, whereas supercomputers typically consume large amounts of energy and need to be housed in purpose-built facilities. By contrast, because the LPU100 lacks electronics, it can operate efficiently at room temperature and maintain a compact size similar to a desktop computer.
It is also built entirely from "well-understood laser technology and commercially available components." This makes it a more practical alternative to resource-intensive quantum computers and supercomputers, LightSolver representatives said.
LightSolver now offers select enterprise customers the ability to use the LPU100 through its cloud platform for problems involving up to 1 million variables.