Category Archives: Quantum Computer
Europe’s Race towards Quantum-HPC Integration and Quantum Advantage – HPCwire
What an interesting panel, "Quantum Advantage: Where are We and What is Needed?" While the panelists looked slightly weary (theirs was, after all, one of the last panels at ISC 2024), the discussion was fascinating and the panelists knowledgeable. No such panel would be complete without also asking when QA will be achieved. The broad, unsurprising answer to that question is: not especially soon.
The panel included: Thomas Lippert, head of Jülich Supercomputing Centre (JSC) and director at the Institute for Advanced Simulation; Laura Schulz, acting head of quantum computing and technologies, Leibniz Supercomputing Centre (LRZ); Stefano Mensa, advanced computing and emerging technologies group leader, STFC Hartree Centre; and Sabrina Maniscalco, CEO and co-founder, Algorithmiq Ltd. The moderator was Heike Riel, IBM Fellow, head of science & technology and lead of IBM Research Quantum Europe.
Missing from the panel was a pure-play quantum computer developer that might have added a different perspective. Maybe next year. Topics included quantum-HPC integration; the need for benchmarks (though when and how was not clear); the likely role for hybrid quantum-HPC applications in the NISQ world; familiar discussion around error mitigation and error correction; and more.
Of the many points made, perhaps the strongest was around the idea that Europe has mobilized to rapidly integrate quantum computers into its advanced HPC centers.
Schulz said, "The reason that our work in the Munich Quantum Valley (MQV) is so important is because of what happens when we look at the European level. We have the EuroHPC Joint Undertaking. We have the six quantum systems that are going to be placed in hosting centers Europe-wide, and these all [have] different modalities, and we all have to integrate. We have to think about this at the European level, about how we're going to bring these systems together. We do not want multiple schedulers. We do not want multiple solutions that could then clash with one another. We want to try to find unity where it makes sense, and be able to amplify and smooth the user experience European-wide."
The idea is to connect all of these EuroHPC JU systems and make them widely available to academia and industry. LRZ and JSC, for example, have already fielded or are about to field several quantum computers in their facilities (see slides below).
Lippert emphasized that, at least for this session, the focus is on how to achieve quantum advantage: "When we talk about quantum utility, when this becomes useful, then the quantum computer is able to solve problems of practical usage significantly faster than any classical computer [based on] CPUs, GPUs, of comparable size, weight and power in similar environments. We think this is the first step to be made with quantum-HPC hybrid types of simulation, optimization, and machine-learning algorithms. Now, how do you realize such quantum advantage? You build HPC-hybrid compute systems. We have the approach that we talk about: the modular supercomputing architecture."
"Our mission is to establish a vendor-agnostic, comprehensive, public quantum computer user infrastructure integrated into our modular complex of supercomputers. [It offers] user-friendly and peer-reviewed access, just like we do with supercomputing."
Schulz drilled down into the software stack being developed at LRZ in collaboration with many partners. "On the left side of the slide below are traditional parts: co-scheduling, co-resource management, all those components that we need to think of, and that we do think of, with things like disaggregated acceleration," said Schulz.
When you get to the right side, she noted, "we have to deal with the new physics environment, or the new quantum computing environment. So we have a quantum compiler that we are developing; we have a quantum representation moving between them. We've got a robust, customized, comprehensive toolkit with things like the debuggers, the optimizers, all of those components, that's built with our partners in the ecosystem. Then we have an interface, this QBMI (quantum back-end manager interface), and this is what connects the systems individually into our whole framework."
"Now, this is really important. And this is part of the evolution. We've been working on this for two years, actively building this up, and we're already starting to see the fruits of our labor. In our Quantum Integration Centre (QIC), we are already able to go from our HPC environment, so our HPC testbed that we have, using our Munich quantum software stack, to an access node on the HPC system, the same hardware, and call to the quantum system. We have that on-prem, these systems are co-located, and it is an integrated effort with our own software stack. So we are making great strides," Schulz said.
The pan-European effort to integrate quantum computing into HPC centers is impressive and perhaps the furthest along worldwide. Its emphasis is on handling multiple quantum modalities (superconducting, trapped ion, photonic, neutral atom) and approaches (gate-based and annealing), and on trying to develop a (relatively speaking) common, easy-to-use software stack connecting HPC and quantum systems.
Mensa of the U.K.'s STFC zeroed in on benchmarking. Currently there are many efforts but few widely agreed-upon benchmarks. Roughly, the quantum community talks about system benchmarks (low- and middle-level) that evaluate a system's basic attributes (fidelity, speed, connectivity, etc.) and application-oriented benchmarks intended to look more at time-to-solution, quantum resources needed, and accuracy.
No one disputes the need for quantum benchmarks. Mensa argued for a coordinated effort and suggested the SPEC model as something to look at. "The SPEC consortium for HPC is a great example, because it's a nonprofit and it establishes and maintains and endorses standardized benchmarks. We need to seek something like that," he said.
He took a light shot at the Top500 metric not being the best approach, noting it doesn't represent practical workloads today, and added: "You know that your car can go up to 260. But on a normal road, we never do that." Others noted the Top500, based on Linpack, does at least show you can actually get your system up and running correctly. Moreover, noted Lippert and Schulz, the truth is that the Top500 score is not on the criteria lists they use to evaluate advanced systems procurements.
Opinions on benchmarking varied, but it seems that the flurry of separate benchmark initiatives is likely to continue and remain disparate for now. One point folks agree on is that quantum technology is moving so fast that it's hard to keep up with, and maybe it's too early to settle on just a few benchmarks. Moreover, benchmarking hybrid quantum-HPC systems becomes even more confusing. All seem to favor use of a suite of benchmarks over a single metric. This is definitely a stay-tuned topic.
Turning to efforts to achieve practical uses, Maniscalco presented two use cases that demonstrate the ability to combine quantum and HPC resources by using classical computing to mitigate errors. Her company, Algorithmiq Ltd, is developing algorithms for use in bioscience. She provided a snapshot of a technique Algorithmiq has developed that uses tensor processing in post-processing on classical systems to mitigate errors on the quantum computer.
"HPC and quantum computers are seen almost as antagonists in the sense that we can use, for example, tensor network methods to simulate quantum systems, and this is, of course, very important for benchmarking," said Maniscalco. "But what we are interested in is bringing these two together, and the quantum-centric supercomputing idea brought forward by IBM is important for us. What we do is specifically focused on this interface between the quantum computer and the HPC."
"We develop techniques that are able to measure or extract information from the quantum computers in a way that allows [you] to optimize the efficiency in terms of number of measurements; this eventually corresponds to shorter wall-time overhead overall, and also allows you to optimize the information that you extract from the quantum computer and, importantly, allows [error mitigation] in post-processing," she said. (It's best to read the associated papers for details.)
At the end of the Q&A, moderator Heike Riel asked the panel, "Where will we be in five years?" Here are their brief answers, in the order given:
Read more here:
Europe's Race towards Quantum-HPC Integration and Quantum Advantage - HPCwire
Researchers at the SQMS Center achieve leading performance in transmon qubits – Fermi National Accelerator Laboratory
Scientists and engineers at the Superconducting Quantum Materials and Systems Center, hosted by the U.S. Department of Energy's Fermi National Accelerator Laboratory, have achieved reproducible improvements in superconducting transmon qubit lifetimes, with record values of 0.6 milliseconds. The result was achieved through an innovative materials technique that eliminated a major loss source in the devices.
These results have been published in npj Quantum Information, a Nature Partner Journal.
Quantum devices such as qubits are critical for storing and manipulating quantum information. A qubit's lifespan, known as its coherence time, determines how long data can be stored and processed before an error occurs. This loss of quantum information, called quantum decoherence, is a key obstacle to operating quantum processors and sensors.
Electron microscopy images show the surface of the various superconducting transmon qubits fabricated at SQMS with the novel encapsulation technique. The qubit with the native niobium oxide is compared to the tantalum and gold capping layers that prevent the re-growth of the niobium oxide. Graphic: SQMS Center, Fermilab
The novel process, called surface encapsulation, protects key layers of the qubit during fabrication and prevents the formation of problematic, lossy oxides at the surfaces and interfaces of these devices. By carefully investigating and comparing various materials and deposition techniques, SQMS researchers have studied different oxides that lead to longer qubit lifetimes and fewer losses.
"SQMS is pushing the envelope of qubit performance," said Alexander Romanenko, a senior scientist at Fermilab and the SQMS Center's quantum technology thrust leader. "These efforts show that undergoing a systematic review of processes and materials, and attacking what matters most first, is the key to pushing qubit coherence. Pursuing device fabrication and characterization hand in hand with materials science is the right recipe to deepen our scientific understanding of loss mechanisms and improve quantum devices in the future."
Anna Grassellino, Fermilab senior scientist and SQMS Center director, and Akshay Murthy, SQMS Materials Focus area leader and Materials Characterization group leader, apply state-of-the-art characterization techniques in the Materials Science Lab, such as X-ray photoelectron spectroscopy and time-of-flight secondary ion mass spectrometry, to examine the effectiveness of niobium surface capping. Photo: Ryan Postel, Fermilab
There are many types of qubits. These basic building blocks of quantum computers process information differently and potentially faster than classical computers. The longer a qubit can store quantum information, the better its potential for use in a quantum computer.
Since its inception in 2020, the SQMS research team has focused on understanding the source of errors and decoherence in transmon qubits. This type of qubit is patterned on a chip consisting of a metallic niobium layer on top of a substrate, such as silicon or sapphire. Many consider these superconducting qubits to be the most advanced platform for quantum computers. Tech companies in the United States and around the world are also exploring them.
Mustafa Bal, nanofabrication group leader at the Fermilab SQMS division and leader of the SQMS Center national nanofabrication taskforce (left), and graduate student Francesco Crisa hold leading-performance transmon chips they produced at the Pritzker Nanofabrication Facility. Photo: Dan Svoboda, Fermilab
However, scientists must still overcome some challenges before quantum computers can fulfill their promise of solving previously unsolvable problems. Specific properties of the materials used to create these qubits can lead to the decoherence of quantum information. At SQMS, developing a deeper scientific understanding of these properties and loss mitigation strategies is an active area of research.
Shaojiang Zhu, qubit design and simulation group leader at the Fermilab SQMS Division, holds transmon qubits prepared with the surface encapsulation technique ready to be measured at the SQMS Quantum Garage at Fermilab. Photo: Dan Svoboda, Fermilab
SQMS scientists studying the losses in transmon qubits pointed to the niobium surface as the primary culprit. These qubits are fabricated in a vacuum, but when exposed to air, an oxide forms on the surface of the niobium. Though this oxide layer is thin (only about 5 nanometers), it is a major source of energy loss and leads to shorter coherence times.
"Our prior measurements indicate that niobium is the best superconductor for these qubits. While the metal losses are near zero, the niobium surface oxide is problematic and the main driver of losses in these circuits," Romanenko said.
SQMS scientists proposed encapsulating the niobium during fabrication so it would never be exposed to air and, therefore, its oxide would not form. While they had a hypothesis on which materials would work best for capping, determining the optimal material required a detailed study. So, they systematically tested this technique with different materials, including aluminum, tantalum, titanium nitride, and gold.
With each capping-layer attempt, SQMS scientists analyzed the materials using several advanced characterization techniques at materials science labs at Fermilab, Ames National Laboratory, Northwestern University, and Temple University. Qubit performance was measured inside a dilution refrigerator at the SQMS Quantum Garage at Fermilab; this cryogenic device cools qubits to just a tick above absolute zero. The results demonstrated that the researchers could prepare qubits with a two- to five-fold coherence improvement compared to samples prepared without a capping layer (which retain the niobium oxide layer).
The team found that the capping process improved coherence times for all materials explored in the study. Of these, tantalum and gold proved the most effective at enabling higher coherence times, with an average of 0.3 milliseconds and maximum values as high as 0.6 milliseconds. These results shed further light on the nature, hierarchy, and mechanism of losses in these qubits, which are found to be driven by the presence of amorphous oxides and interfaces.
"When fabricating a qubit, there are many variables, more or less hidden, that can impact performance," said Mustafa Bal, a scientist at Fermilab and head of the SQMS nanofabrication group and task force. "This is a first-of-its-kind study that very carefully compares one material change and one process change at a time, on a chip of a fixed geometry, across different fabrication facilities. This approach ensures that we develop reproducible techniques for improvement in qubit performance."
The teams fabricated and tested qubits at different facilities as part of the SQMS Center's National Nanofabrication Taskforce. Fermilab led the way with the SQMS nanofabrication group headed by Bal, making qubits at the Pritzker Nanofabrication Facility at the University of Chicago. Other facilities included Rigetti Computing, a quantum computing company with its own quantum foundry, and the National Institute of Standards and Technology (NIST) Boulder Laboratories; both are flagship partners of the SQMS Center. Fabricating the chip at Rigetti's commercial foundry proved that the technique is easily reproducible and scalable for industry.
"At Rigetti Computing, we want to make the best possible superconducting qubits to make the best possible quantum computers, and extending the lifetimes of qubits in a reproducible way has been one of the hardest problems," said Andrew Bestwick, senior vice president of quantum systems at Rigetti. "These are among the leading transmon coherence times that the field has been able to achieve on a two-dimensional chip. Most importantly, the study has been guided by the scientific understanding of qubit loss, leading to reproducibility across different labs and in our fabrication facility."
Rigetti's Fab-1 is the industry's first dedicated and integrated quantum device and manufacturing facility, located in Fremont, California. The qubit surface encapsulation technique was easily reproduced at the Rigetti facilities. Photo: Rigetti Computing
At NIST, scientists are interested in using quantum technology to make fundamental measurements of photons, microwave radiation, and voltage. "This has been a great team effort and a good planting of a flag that shows both how far we have come and the challenges that remain to be faced," said Peter Hopkins, a physicist at NIST who leads the superconductive electronics group and is a lead member of the SQMS Center National Nanofabrication Taskforce.
Following this work, SQMS researchers continue to push the qubit performance frontier further. The next steps include engineering creative and robust nanofabrication solutions to apply this technique to other transmon qubit surfaces, eliminating all lossy interfaces present in these devices. The underlying substrate upon which these qubits are prepared also represents the next major source of losses. SQMS researchers are already hard at work characterizing and developing better silicon wafers and other lower-loss substrates suitable for quantum applications.
Moreover, SQMS scientists are working to ensure these advances in the coherence studies can be preserved in more complex chip architectures with several interconnected qubits.
Given the breadth of the SQMS Center collaboration, the Center's vision and mission are multi-fold. The researchers seek to improve the performance of the building blocks of a quantum computer and apply these innovations in mid-scale prototypes of quantum processors.
At SQMS, two main superconducting quantum computing platforms are under exploration: 2D transmon qubit chip-based and 3D cavity-based architectures. For the chip-based processors, SQMS researchers work hand in hand with industry partners such as Rigetti to advance performance and scalability of these platforms.
Currently, SQMS researchers from Fermilab and Rigetti have co-developed a 9-qubit processor incorporating these surface encapsulation advances. The chip is being installed in the SQMS Quantum Garage at Fermilab. Its performance will be evaluated and benchmarked in the upcoming weeks.
This timeline shows a roadmap for the SQMS Center's development of 2D transmon qubits and 3D cavity-based platforms. Graphic: Samantha Koch, Fermilab
For the 3D cavity-based platforms, Fermilab scientists have been working to integrate these qubits with superconducting radio-frequency (SRF) cavities. Scientists initially developed these cavities for particle accelerators, and Fermilab builds upon decades of experience in making the world's best SRF cavities, demonstrating photon lifetimes of up to 2 seconds. When combined with transmon qubits, these cavities can also be used as building blocks of quantum computing platforms. Such an approach promises potentially better coherence, scalability, and qubit connectivity. To date, Fermilab scientists have achieved up to several milliseconds of coherence in these combined cavity-qubit systems.
"We know how to make the world's best cavities, but the success of the 3D platforms under construction at Fermilab also heavily depends on how far we can keep pushing the performance of these transmon qubits used to control and manipulate the quantum states in the cavities," said Romanenko. "So, it's kind of two birds with one stone. As we push to advance our transformational 3D technologies, we also work alongside industry to enable important advances in 2D chip-based quantum computing platforms."
The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.
The Superconducting Quantum Materials and Systems Center is one of the five U.S. Department of Energy National Quantum Information Science Research Centers. Led by Fermi National Accelerator Laboratory, SQMS is a collaboration of more than 30 partner institutions (national labs, academia, and industry) working together to bring transformational advances in the field of quantum information science. The center leverages Fermilab's expertise in building complex particle accelerators to engineer multiqubit quantum processor platforms based on state-of-the-art qubits and superconducting technologies. Working hand in hand with embedded industry partners, SQMS will build a quantum computer and new quantum sensors at Fermilab, which will open unprecedented computational opportunities. For more information, please visit sqmscenter.fnal.gov.
Fermi National Accelerator Laboratory is America's premier national laboratory for particle physics research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC. Visit Fermilab's website at https://www.fnal.gov and follow us on Twitter @Fermilab.
See the original post here:
Researchers at the SQMS Center achieve leading performance in transmon qubits - Fermi National Accelerator Laboratory
USask partners with PINQ to access Canada’s only IBM Quantum System One – USask News
Scientists at USask are at the forefront of groundbreaking research thanks to a partnership with PINQ (the Québec Digital and Quantum Innovation Platform), the sole administrator of Canada's only IBM Quantum System One, a utility-scale quantum computer located at IBM's research facility in Bromont, Quebec.
USask's three-year agreement with PINQ enables faculty and students affiliated with USask's Centre for Quantum Topology and Its Applications (quanTA) to access the machine via PINQ's quantum computing platform. This collaboration significantly enhances the existing quantum computing research activities at USask.
IBM Quantum System One is powered by a 127-qubit processor, which has achieved utility-scale performance, a point at which quantum computers could serve as scientific tools to explore a new scale of problems that classical systems may never be able to solve. Under ideal circumstances, a qubit can be astoundingly powerful in comparison to the ordinary bits in conventional computers.
One of the partnerships first projects will be a study of complex health data in children suffering from chronic diseases, including juvenile arthritis. Using patient-derived data, researchers will deploy quantum-enhanced data analysis and machine learning techniques to uncover and understand hidden factors that may lead to such diseases, leading potentially to future preventatives and therapies. The nature of this work augments what is possible using traditional computing methods.
The groundbreaking research made possible through this partnership will see USasks quantum scientists working together with many other scientists from diverse fields, further showcasing the interdisciplinary work currently underway at USask.
"Our government is supporting quantum computing capacity in Canada through this unique collaboration between USask and Quantum System One, the next-generation quantum computer at IBMs research facility. Todays investment gives USask access to that system and the computing power that will allow them to tackle difficult problems in key areas like health care, climate sciences, and beyond. This access will lead to exponential growth in research and development, while boosting innovation and further solidifying USask as a scientific centre of excellence.
- The Honourable Dan Vandal, Minister for PrairiesCan
"USask is a leader in quantum computing, and this exciting new partnership allows us to further our influential work in the quantum ecosystem. We are committed to training the next generation of researchers, leaders and changemakers. Access to IBM Quantum System One will be a central component to recruit highly qualified students and build the skills of the next generation of quantum leaders."
- Baljit Singh, Vice-President Research, USask
"Over the past 60 years or so, computers have become one of the most important tools in a scientist's back pocket, on par with the microscope. But the time has come where ordinary computing can no longer keep up with the problems that society needs to solve today, such as climate change and accelerated vaccine design. While still in its infancy, quantum computing promises to be the next indispensable tool in science. Some of the first real-world use cases for this technology will be developed right here at USask, thanks to this one-of-a-kind partnership with IBM and PINQ and owing to the strong interdisciplinary culture on our campus."
- Dr. Steven Rayan, director of USask's quanTA Centre and lead of USask's Quantum Innovation Signature Area of Research
"We are delighted to collaborate with USask, granting their researchers access to one of the world's most powerful quantum computers. This partnership promises groundbreaking research and innovation, and we eagerly anticipate the outcomes arising from this collaboration. Our mission is to facilitate accelerated digital transformation for organizations and empower individuals in utilizing the capabilities of quantum computing. This partnership exemplifies our commitment to achieving that goal."
- Éric Capelle, Managing Director, PINQ (Québec Digital and Quantum Innovation Platform)
Excerpt from:
USask partners with PINQ to access Canada's only IBM Quantum System One - USask News
Why China, the U.S., and Big Tech Are Racing to Harness Quantum Computing and AI – TIME
When Lawrence Gasman was looking for a PhD topic back in the 1970s, computing labs were already abuzz with smart people proposing clever studies in artificial intelligence. "But the problem was we had nothing to run them on," he says. The processors needed just didn't exist.
It took half a century for computing power to catch up with AI's potential. Today, thanks to high-powered chips such as GPUs from California-based Nvidia, generative artificial intelligence, or gen AI, is revolutionizing the way we work, study, and consume entertainment, empowering people to create bespoke articles, images, videos, and music in the blink of an eye. The technology has spawned a bevy of competing consumer apps offering enhanced voice recognition, graphic design, and even coding.
Now AI stands poised to get another boost from a radical new form of computing: quantum. "Quantum could potentially do some really remarkable things with AI," says Gasman, founder of Inside Quantum Technology.
Rather than relying on traditional computing's binary bits (switches denoted as 1s and 0s), quantum computers use qubits that exist in some percentage of both states simultaneously, akin to a coin spinning in midair. The result is exponentially boosted computing power, as well as an enhanced ability to intuitively mimic natural processes, which rarely conform to a binary form.
Whereas gen AI's consumer-targeted applications have made its impact more widespread and immediate, quantum is geared more towards industry, meaning several recent milestones have slipped under the radar. However, they stand to potentially turbocharge the AI revolution.
"Generative AI is one of the best things that has happened to quantum computing," says Raj Hazra, CEO of Colorado-based quantum start-up Quantinuum. "And quantum computing is one of the best things to happen to the advance of generative AI. They are two perfect partners."
Ultimately, AI relies on the ability to crunch huge stacks of information, which is where quantum excels. In December, IBM unveiled its latest processor, dubbed Heron, which boasts 133 qubits, the firm's best-ever error reduction, and the ability to be linked together within its first modular quantum computer, System Two. In addition, IBM unveiled another chip, Condor, which has 1,121 superconducting qubits arranged in a honeycomb pattern. They're advances that mean "now we're entering what I like to call quantum utility, where quantum is getting used as a tool," Jay Gambetta, vice president of IBM Quantum, tells TIME.
Since qubits are incredibly delicate, they don't always behave in the same way, meaning quantum relies both on increasing the overall number of qubits to check their calculations and on boosting the fidelity of each individual qubit. Different technologies used to create a quantum effect prioritize different sides of this equation, making direct comparisons very tricky and enhancing the arcane nature of the technology.
IBM uses superconducting qubits, which require cooling to almost absolute zero to mitigate thermal noise, preserve quantum coherence, and minimize environmental interactions. Quantinuum, however, uses alternative trapped-ion technology that holds ions (charged atoms) in a vacuum using electromagnetic fields. This technology doesn't require the same extreme cooling, though it is thought to be more difficult to scale. In April, Quantinuum claimed it had achieved 99.9% fidelity for its qubits.
"The trapped-ion approach is miles ahead of everybody else," says Hazra. Gambetta, in turn, argues that superconducting quantum has advantages in scaling, speed of quantum interactions, and leveraging existing semiconductor and microwave technology to make advances quicker.
For impartial observers, the jury is still out, since the raft of competing, non-linear metrics makes it impossible to tell who's actually ahead in this race. "They are very different approaches, both are showing promise," says Scott Likens, global AI and innovation technology lead for the PwC business consultancy. "We still don't see a clear winner, but it's exciting."
Where Gambetta and Hazra agree is that quantum has the potential to mesh with AI to produce truly awesome hybrid results. "I would love to see quantum for AI and AI for quantum," says Gambetta. "The synergies between them, and the advancement in general in technology, makes a lot of sense."
Hazra concurs, saying generative AI needs the power of quantum computing to make fundamental advances. For Hazra, the Fourth Industrial Revolution will be led by generative AI but underpinned by the power of quantum computing: "The workload of AI and the computing infrastructure of quantum computing are both necessary."
It's a view shared across the Pacific in China, where investments in quantum are estimated at around $25 billion, dwarfing the rest of the world. China's top quantum expert, Prof. Pan Jianwei, has developed a Jiuzhang quantum computer that he claims can perform certain kinds of AI-related calculations some 180 million times faster than the world's top supercomputer.
In a paper published in the peer-reviewed journal Physical Review Letters last May, Jiuzhang processed over 2,000 samples of two common AI-related algorithms (Monte Carlo and simulated annealing) in under a second; the same workload would take the world's fastest classical supercomputer five years. In October, Pan unveiled Jiuzhang 3.0, which he claims is 10 quadrillion times faster than a classical supercomputer in solving certain problems.
Jiuzhang utilizes yet a third form of quantum technology (light, or photons), and Pan is widely lauded as China's king of quantum. A physics professor at the University of Science and Technology of China, Pan in 2016 launched Micius, the world's first quantum communication satellite, which a year later beamed entangled photons to ground stations for the world's first quantum-secured video call.
Micius is considered quantum's Sputnik moment, prompting American policymakers to funnel hundreds of millions of dollars into quantum information science via the National Quantum Initiative. Bills such as the Innovation and Competition Act of 2021 have provided $1.5 billion for communications research, including quantum technology. The Biden Administration's proposed 2024 budget includes $25 billion for emerging technologies, including AI and quantum. Ultimately, quantum's awesome computing power will soon render all existing cryptography obsolete, presenting a security migraine for governments and corporations everywhere.
Quantum's potential to turbocharge AI also applies to the simmering technology competition between the world's superpowers. In 2021, the U.S. Commerce Department added eight Chinese quantum computing organizations to its Entity List, claiming they support the military modernization of the People's Liberation Army and adopt American technologies to develop counter-stealth and counter-submarine applications, and the ability to break encryption.
These restrictions dovetail with a raft of measures targeting China's AI ambitions, including blocking Nvidia last year from selling AI chips to Chinese firms. The question is whether competition between the world's top two economies stymies overall progress on AI and quantum, or pushes each nation to accelerate these technologies. The answer could have far-reaching consequences.
"AI can accelerate quantum computing, and quantum computing can accelerate AI," Google CEO Sundar Pichai told the MIT Technology Review in 2019. "And collectively, I think it's what we would need to, down the line, solve some of the most intractable problems we face, like climate change."
Still, both the U.S. and China must overcome the same hurdle: talent. While only a few universities around the world offer quantum physics or mechanics, dedicated courses on quantum computing are even rarer, let alone expertise in the various specialties within. "Typically, the most valuable and scarcest resource becomes the basis of your competitive advantage," says Hazra. "And right now in quantum it's people with that knowledge."
Read more here:
Why China, the U.S., and Big Tech Are Racing to Harness Quantum Computing and AI - TIME
Quantum computing takes a giant leap forward with breakthrough discovery – Earth.com
Scientists have produced an enhanced, ultra-pure form of silicon that allows the construction of high-performance qubit devices. This fundamental component is crucial for paving the way towards scalable quantum computing.
The finding, published in the journal Communications Materials (part of the Nature Portfolio), could define and push forward the future of quantum computing.
The research was led by Professor Richard Curry from the Advanced Electronic Materials group at The University of Manchester, in collaboration with the University of Melbourne in Australia.
"What we've been able to do is effectively create a critical brick needed to construct a silicon-based quantum computer," Professor Curry excitedly proclaimed.
"It's a crucial step to making feasible a technology that has the potential to be transformative for humankind; a technology that could give us the capability to process data at such a scale that we will be able to find solutions to complex issues such as addressing the impact of climate change and tackling healthcare challenges," Curry continued.
One of the biggest challenges in the development of quantum computers is that qubits, the building blocks of quantum computing, are highly sensitive and require a stable environment to maintain the information they hold. Even tiny changes in their environment, including temperature fluctuations, can cause computer errors.
Another issue is their scale, both their physical size and processing power. Ten qubits have the same processing power as 1,024 bits in a normal computer and can potentially occupy a much smaller volume.
Scientists believe a fully performing quantum computer needs around one million qubits, which would provide capability unattainable by any classical computer.
Qubits, or quantum bits, are the fundamental building blocks of quantum computers, analogous to bits in classical computers. However, qubits have several unique properties that differentiate them from classical bits (a minimal simulation sketch follows this list):
Superposition: While classical bits can only be in one of two states (0 or 1), qubits can exist in a superposition of multiple states simultaneously. This means that a qubit can represent a combination of both 0 and 1 at the same time, enabling quantum computers to perform many calculations in parallel.
Entanglement: Qubits can be entangled with each other, meaning that their quantum states are correlated even if they are physically separated. This property allows quantum computers to perform certain computations much faster than classical computers.
Decoherence: Qubits are highly sensitive to their environment and can easily lose their quantum state, a process called decoherence. This is one of the main challenges in building stable, large-scale quantum computers.
Quantum gates: Operations on qubits are performed using quantum gates, the quantum equivalent of logic gates in classical computers. These gates manipulate the quantum states of qubits to perform computations.
Measurement: When a qubit is measured, it collapses from its superposition into a definite state of either 0 or 1. The outcome of the measurement is probabilistic and depends on the qubit's quantum state.
Error correction: Due to the fragility of qubits, quantum error correction techniques are necessary to maintain the integrity of quantum computations. These techniques involve using multiple physical qubits to encode and protect the information stored in a single logical qubit.
Physical implementations: Researchers are exploring various physical systems to implement qubits, such as superconducting circuits, trapped ions, photons, and silicon-based spin qubits.
Each approach has its own advantages and challenges, and the choice of qubit technology depends on factors such as scalability, error rates, and ease of manipulation.
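To make those properties concrete, here is a minimal, self-contained NumPy sketch (not from the article; the state vectors, gate matrices, and Born-rule sampling are standard textbook definitions) simulating superposition, a quantum gate, probabilistic measurement, and entanglement:

import numpy as np

rng = np.random.default_rng(seed=7)

# A qubit state is a 2-vector of complex amplitudes; |0> = (1, 0).
ket0 = np.array([1.0, 0.0])

# Quantum gate: the Hadamard matrix puts |0> into an equal superposition.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
psi = H @ ket0                        # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement: outcomes are probabilistic, with P(k) = |amplitude_k|^2.
probs = np.abs(psi) ** 2              # [0.5, 0.5]
samples = rng.choice([0, 1], size=1000, p=probs)
print("fraction of 0s:", np.mean(samples == 0))   # ~0.5

# Entanglement: Hadamard on the first qubit, then CNOT, gives a Bell state.
CNOT = np.array([[1, 0, 0, 0],        # basis order |00>, |01>, |10>, |11>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
bell = CNOT @ np.kron(H, np.eye(2)) @ np.array([1.0, 0.0, 0.0, 0.0])
print("Bell amplitudes:", bell)       # (1/sqrt(2)) * (|00> + |11>)
# Measuring either qubit now fixes the other: outcomes are perfectly correlated.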
Silicon is the underpinning material in classical computing due to its semiconductor properties, and researchers believe it could be the answer to scalable quantum computers.
However, natural silicon is made up of atoms of three different masses (called isotopes): silicon-28, -29, and -30. The Si-29, making up around 5% of silicon, causes a nuclear "flip-flopping" effect that causes the qubit to lose information.
Scientists at the University of Melbourne have come up with a way to engineer silicon to remove the silicon-29 and -30 atoms, making it the perfect material to make quantum computers at scale, and with high accuracy.
The result, the world's purest silicon, provides a pathway to the creation of one million qubits, which may be fabricated to the size of a pinhead.
"The great advantage of silicon quantum computing is that the same techniques that are used to manufacture the electronic chips currently within an everyday computer, consisting of billions of transistors, can be used to create qubits for silicon-based quantum devices," noted Ravi Acharya, a PhD researcher who performed experimental work in the project.
"The ability to create high-quality silicon qubits has in part been limited to date by the purity of the silicon starting material used. The breakthrough purity we show here solves this problem," Acharya continued.
The new capability offers a roadmap towards scalable quantum devices with unparalleled performance and capabilities and holds promise of transforming technologies in ways hard to imagine.
"Our technique opens the path to reliable quantum computers that promise step changes across society, including in artificial intelligence, secure data and communications, vaccine and drug design, and energy use, logistics and manufacturing," explained project co-supervisor Professor David Jamieson, from the University of Melbourne.
"Now that we can produce extremely pure silicon-28, our next step will be to demonstrate that we can sustain quantum coherence for many qubits simultaneously. A reliable quantum computer with just 30 qubits would exceed the power of today's supercomputers for some applications," Jamieson concluded.
While quantum computing is still in its early stages, once fully developed, quantum computers will be used to solve complex real-world problems, such as drug design, and to provide more accurate weather forecasts; calculations too difficult for today's supercomputers.
In summary, the pioneering discovery of ultra-pure silicon by scientists at The University of Manchester and the University of Melbourne marks a significant milestone in the journey towards scalable quantum computing.
This achievement aligns with the 200th anniversary of The University of Manchester, which has been at the forefront of science innovation throughout its history, including Ernest Rutherford's "splitting the atom" discovery in 1917 and the first-ever real-life demonstration of electronic stored-program computing with "The Baby" in 1948.
The research produced by these brilliant scientists paves the way for the construction of high-performance qubit devices, bringing us closer to a future where quantum computers can solve complex real-world problems that are beyond the capabilities of todays supercomputers.
As researchers continue to push the boundaries of quantum computing, we can expect to see transformative advancements across various fields, from artificial intelligence and secure communications to vaccine design and weather forecasting.
The quantum revolution is on the horizon, and the creation of the world's purest silicon is a crucial step towards making it a reality.
The full study was published in the journal Communications Materials.
The rest is here:
Quantum computing takes a giant leap forward with breakthrough discovery - Earth.com
China creates its largest ever quantum computing chip and it could be key to building the nation’s own ‘quantum cloud’ – Livescience.com
Scientists in China have developed a 504-qubit quantum computing chip that will be made available to researchers worldwide via a new quantum computing cloud platform.
The new chip, called "Xiaohong," is the biggest built by China to date and is designed to improve systems that manage the behavior and interaction of quantum bits, or qubits, in quantum computers, state-owned China Daily reported. The scientists hope the chip will help to scale up existing quantum computers so they can handle more complex tasks.
Xiaohong was developed by scientists at the Center for Excellence in Quantum Information and Quantum Physics, part of the Chinese Academy of Sciences (CAS). Chinese quantum computing company QuantumCTek, which received the first Xiaohong chip, will now reportedly work alongside China Telecom Quantum Group to integrate the 504-qubit chip into a new quantum computer.
This system will then be made available to researchers worldwide via a quantum computing cloud platform developed by China Telecom Quantum Group, according to the report.
Wang Zhen, deputy general manager of China Telecom Quantum Group, said in a statement the new system would "allow users in various fields to conduct research on problems and algorithms of practical value efficiently, and accelerate the application of quantum computing in actual scenarios."
Xiaohong is designed to meet the performance standards of cloud-enabled quantum computing platforms like those made by IBM or AWS. But it's not intended as a technical rival to cutting-edge U.S. technology such as the 1,121-qubit IBM Quantum Condor chip, said Gong Ming, a researcher at the Center for Excellence in Quantum Information and Quantum Physics.
Instead, the scientists hope access to Xiaohong via the cloud will promote the development of large-scale quantum computing measurement and control systems (QCMCSs).
Quantum computers work fundamentally differently from classical computers. Unlike classical bits, which can only be represented as 0 or 1, qubits can exist in multiple states simultaneously. This enables quantum computers to perform calculations in parallel and at near-unimaginable speeds if qubits are stitched together through quantum entanglement.
QCMCSs, meanwhile, are components that play a crucial role in quantum computing, acting as a bridge that connects traditional computers with quantum computers. This connection enables quantum computers to interpret commands received from classical computing environments and manage the state of qubits accordingly.
QuantumCTek will use Xiaohong to test the "kilo-qubit" quantum computing measurement and control systems developed in-house. This would "greatly influence the overall performance of quantum computers," Liang Futian, an associate professor at the Center for Excellence in Quantum Information and Quantum Physics, said in the statement.
While the 504-qubit Xiaohong chip is China's largest quantum chip to date, it's not the largest in the world. That title currently belongs to Atom Computing, which announced its behemoth 1,125-qubit quantum computer in October 2023.
Previous notable contributions from China include the Jiuzhang 2.0 and Zuchongzhi 2.1 quantum computers. When China launched its Jiuzhang quantum computer in 2020, it claimed it was the world's fastest, reportedly surpassing Google's Sycamore quantum processor by 10 billion times.
World’s Purest Silicon Paves the Way for Next-Gen Quantum Computers – SciTechDaily
Researchers at the University of Manchester and the University of Melbourne have developed an ultra-pure silicon crucial for creating scalable quantum computers, which could potentially address global challenges such as climate change and healthcare issues.
A major breakthrough in quantum computing has been achieved with the development of ultra-pure silicon, setting the stage for the creation of powerful, scalable quantum computers.
More than 100 years ago, scientists at The University of Manchester changed the world when they discovered the nucleus in atoms, marking the birth of nuclear physics.
Fast forward to today, and history repeats itself, this time in quantum computing.
Building on the same pioneering method forged by Ernest Rutherford, the founder of nuclear physics, scientists at the University, in collaboration with the University of Melbourne in Australia, have produced an enhanced, ultra-pure form of silicon that allows construction of high-performance qubit devices: a fundamental component required to pave the way towards scalable quantum computers.
The finding, published in the journal Communications Materials, could define and push forward the future of quantum computing.
Richard Curry, Professor of Advanced Electronic Materials at The University of Manchester, said:
"What we've been able to do is effectively create a critical brick needed to construct a silicon-based quantum computer. It's a crucial step to making feasible a technology that has the potential to be transformative for humankind; a technology that could give us the capability to process data at such a scale that we will be able to find solutions to complex issues such as addressing the impact of climate change and tackling healthcare challenges."
Prof Rich Curry (right) and Dr. Mason Adshead (left). Credit: The University of Manchester
"It is fitting that this achievement aligns with the 200th anniversary of our University, where Manchester has been at the forefront of science innovation throughout this time, including Rutherford's splitting the atom discovery in 1917, then in 1948 with The Baby, the first-ever real-life demonstration of electronic stored-program computing, and now with this step towards quantum computing."
One of the biggest challenges in the development of quantum computers is that qubits, the building blocks of quantum computing, are highly sensitive and require a stable environment to maintain the information they hold. Even tiny changes in their environment, including temperature fluctuations, can cause computer errors.
Another issue is their scale, both their physical size and processing power. Ten qubits have the same processing power as 1,024 bits in a normal computer and can potentially occupy a much smaller volume. Scientists believe a fully performing quantum computer needs around one million qubits, which would provide capability unattainable by any classical computer. The sketch below illustrates the scaling involved.
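The arithmetic behind those scale claims is easy to check: n qubits span 2^n basis states (2^10 = 1,024, which is where the ten-qubit figure comes from), so the memory a classical machine needs just to store the full state doubles with every added qubit. A short illustrative Python snippet (not from the article):

# n qubits require 2**n complex amplitudes to describe classically.
# 2**10 = 1,024 (the ten-qubit figure above); by ~50 qubits the state
# vector alone outgrows any classical machine's memory.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9   # 16 bytes per complex128 amplitude
    print(f"{n} qubits: {amplitudes:,} amplitudes, ~{gigabytes:,.1f} GB")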
Prof Rich Curry. Credit: The University of Manchester
Silicon is the underpinning material in classical computing due to its semiconductor properties, and the researchers believe it could be the answer to scalable quantum computers. Scientists have spent the last 60 years learning how to engineer silicon to make it perform to the best of its ability, but in quantum computing it has its challenges.
Natural silicon is made up of atoms of three different masses (called isotopes): silicon-28, -29, and -30. However, the Si-29, making up around 5% of silicon, causes a nuclear "flip-flopping" effect that causes the qubit to lose information.
In a breakthrough at The University of Manchester, scientists have come up with a way to engineer silicon to remove the silicon-29 and -30 atoms, making it the perfect material to make quantum computers at scale, and with high accuracy.
The result, the world's purest silicon, provides a pathway to the creation of one million qubits, which may be fabricated to the size of a pinhead.
Ravi Acharya, a PhD researcher who performed experimental work in the project, explained: "The great advantage of silicon quantum computing is that the same techniques that are used to manufacture the electronic chips currently within an everyday computer, consisting of billions of transistors, can be used to create qubits for silicon-based quantum devices. The ability to create high-quality silicon qubits has in part been limited to date by the purity of the silicon starting material used. The breakthrough purity we show here solves this problem."
The new capability offers a roadmap towards scalable quantum devices with unparalleled performance and capabilities and holds promise of transforming technologies in ways hard to imagine.
Project co-supervisor Professor David Jamieson, from the University of Melbourne, said: "Our technique opens the path to reliable quantum computers that promise step changes across society, including in artificial intelligence, secure data and communications, vaccine and drug design, and energy use, logistics and manufacturing."
"Now that we can produce extremely pure silicon-28, our next step will be to demonstrate that we can sustain quantum coherence for many qubits simultaneously. A reliable quantum computer with just 30 qubits would exceed the power of today's supercomputers for some applications."
All computers operate using electrons. As well as having a negative charge, electrons have another property known as spin, which is often compared to a spinning top.
The combined spin of the electrons inside a computer's memory can create a magnetic field. The direction of this magnetic field can be used to create a code where one direction is called 0 and the other direction is called 1. This then allows us to use a number system that only uses 0 and 1 to give instructions to the computer. Each 0 or 1 is called a bit.
In a quantum computer, rather than the combined effect of the spin of many millions of electrons, we can use the spin of single electrons, moving from working in the classical world to the quantum world; from using bits to qubits.
While classical computers do one calculation after another, quantum computers can do all the calculations at the same time, allowing them to process vast amounts of information and perform very complex calculations at an unrivalled speed.
While quantum computing is still in its early stages, once fully developed, quantum computers will be used to solve complex real-world problems, such as drug design, and to provide more accurate weather forecasts; calculations too difficult for today's supercomputers.
Reference: Highly 28Si enriched silicon by localised focused ion beam implantation by Ravi Acharya, Maddison Coke, Mason Adshead, Kexue Li, Barat Achinuq, Rongsheng Cai, A. Baset Gholizadeh, Janet Jacobs, Jessica L. Boland, Sarah J. Haigh, Katie L. Moore, David N. Jamieson and Richard J. Curry, 7 May 2024, Communications Materials. DOI: 10.1038/s43246-024-00498-0
This work was supported by the UK Engineering and Physical Sciences Research Council (EPSRC), specifically the Programme Grant "Nanoscale Advanced Materials Engineering" led by Prof. Curry. Professor Jamieson's collaboration with the University of Manchester is supported by a Royal Society Wolfson Visiting Fellowship and the Australian Research Council. Ravi Acharya is a joint University of Manchester and University of Melbourne PhD student supported by a Cookson Scholarship.
The rest is here:
World's Purest Silicon Paves the Way for Next-Gen Quantum Computers - SciTechDaily
Finland develops quantum algorithms for the future – ComputerWeekly.com
Many experts believe that once quantum computers are big enough and reliable enough to solve useful problems, the most common deployment architecture will be to have them serve as accelerators for supercomputers. Several algorithms have already been developed to run on this architecture, including the most famous one, Shor's algorithm, which will one day crack many of today's public-key cryptosystems.
To experiment with this arrangement in Finland, VTT Technical Research Centre of Finland cooperated with CSC, which runs LUMI, Europe's fastest supercomputer. VTT and CSC have been connecting progressively larger quantum computers to LUMI to allow users to experiment with algorithms that make the best use of both types of computer. The classical computer stores most of the data and executes most of the steps in the algorithm, handing off the remaining steps to the quantum computer and then reading back the results; the sketch below illustrates this division of labour.
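Here is a minimal sketch of that division of labour (not VTT's actual software; the function names are hypothetical, and the quantum step is stubbed out with classical random sampling purely so the control flow is runnable). A real deployment would submit a parameterised circuit to the QPU at that point:

import random

def run_on_qpu(theta, shots=500):
    """Hypothetical stand-in for the hand-off to the quantum computer.
    A real version would execute a parameterised circuit on the QPU and
    return measured bitstrings; here sampling is faked classically."""
    p_one = min(max(theta, 0.0), 1.0)
    return [1 if random.random() < p_one else 0 for _ in range(shots)]

def cost(bits):
    # Classical post-processing: score how far the measured average
    # is from a target value.
    return abs(sum(bits) / len(bits) - 0.25)

# The classical side stores the data, proposes parameters, and hands
# only the sampling step to the quantum side (a variational-style loop).
random.seed(1)
theta, best = random.random(), float("inf")
for _ in range(100):
    trial = min(max(theta + random.uniform(-0.05, 0.05), 0.0), 1.0)
    c = cost(run_on_qpu(trial))
    if c < best:
        best, theta = c, trial
print(f"best theta ~ {theta:.2f}, cost {best:.3f}")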
VTT's first quantum computer was completed in 2021 and used 5 qubits. They then upgraded to 20 qubits last year and are aiming for 50 qubits by the end of 2024. Universities and research organisations have already started using the hybrid service to solve trivial problems.
"This is sort of a rehearsal for the future," says Ville Kotovirta, leader of the VTT team that develops quantum algorithms and software. "It allows people who develop supercomputing algorithms to start thinking about what they will be able to do when a bigger quantum computer works alongside a classical computer. They can use the existing service to practice writing algorithms."
"We were the first to set up the hybrid configuration in Europe, but others are following," says Kotovirta. In June 2023, the European HPC Joint Undertaking accepted six other projects to build similar architectures. These will be in Czechia, France, Germany, Italy, Poland and Spain.
Kotovirta is responsible for research in quantum algorithms and development of software to enable access to VTT's quantum computing. Part of his job is to hire new talent, which he says is not easy. Some of the graduating students already in Finland are interested in quantum computing, but they don't have real-world experience.
The people who have work experience in Finland already have a job somewhere else in the ecosystem, and people from outside Finland are hesitant to move to a cold climate. Having said that, some of the outsiders are impressed enough with the Finnish ecosystem to overcome concerns they may have about the weather.
"We're all learning because it's a new field and it's changing all the time," says Kotovirta. "There are new inventions, new platforms, new devices, new algorithms and new ways of programming. To keep up, we try to hire mathematicians, physicists and computer scientists."
Kotovirta's team is developing several types of algorithms for hybrid architectures. One targets a set of optimisation problems called quadratic unconstrained binary optimisation (QUBO), which can be solved using quantum annealing or quantum approximate optimisation algorithms (QAOA).
"We have built quantum algorithms for analysing graph data and identifying the community structure of networks," he says. "The data comes from complex networks, such as technological or biological network systems."
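To make the QUBO idea concrete, here is a toy illustration, not VTT's code: a two-community split of a small graph, scored by Newman's modularity and found by brute force. On quantum hardware, the same objective encoded as a QUBO would be handed to an annealer or a QAOA routine; the brute-force loop simply stands in for that step.

```python
import itertools
import numpy as np

# Toy graph: two triangles joined by a single edge (adjacency matrix).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
])

k = A.sum(axis=1)                   # node degrees
m = A.sum() / 2                     # number of edges
B = A - np.outer(k, k) / (2 * m)    # modularity matrix

best_Q, best_split = -np.inf, None
# Brute-force search over all two-community assignments; an annealer or
# QAOA would explore this same landscape encoded as a QUBO.
for bits in itertools.product([0, 1], repeat=len(A)):
    s = 2 * np.array(bits) - 1      # map {0,1} -> {-1,+1} spins
    Q = (s @ B @ s) / (4 * m)       # modularity of this split
    if Q > best_Q:
        best_Q, best_split = Q, bits

print(f"best modularity {best_Q:.3f} with communities {best_split}")
```

On this toy graph the search recovers the two triangles as the two communities, which is exactly the kind of community structure the team's algorithms look for in much larger networks.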
The team is also developing algorithms for quantum chemistry, with a focus on reducing the complexity of a molecular Hamiltonian to improve simulations of molecules. Similarly, they are working on synthetic biology, generating new proteins with certain desirable features.
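As a rough picture of what a molecular Hamiltonian looks like once mapped to qubits, the sketch below builds a small Hamiltonian as a weighted sum of Pauli terms and finds its ground-state energy by exact diagonalisation. The coefficients are illustrative placeholders, not real molecular integrals; shrinking the number and weight of such terms is the kind of complexity reduction the team targets.

```python
import numpy as np

# Single-qubit Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_term(coeff, ops):
    # Tensor product of single-qubit Paulis, scaled by a coefficient.
    mat = np.array([[1.0 + 0j]])
    for op in ops:
        mat = np.kron(mat, op)
    return coeff * mat

# Illustrative 2-qubit Hamiltonian of the common molecular form
# H = g1*ZI + g2*IZ + g3*ZZ + g4*XX + g5*YY.
# The coefficients are placeholders, not real molecular data.
terms = [
    (-0.5, [Z, I]), (-0.5, [I, Z]),
    (0.25, [Z, Z]), (0.1, [X, X]), (0.1, [Y, Y]),
]
H = sum(pauli_term(c, ops) for c, ops in terms)

# Ground-state energy via exact diagonalisation; a hybrid algorithm such
# as VQE estimates this same quantity on quantum hardware.
print(f"ground-state energy: {np.linalg.eigvalsh(H).min():.4f}")
```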
Another area of focus is quantum machine learning, especially quantum generative machine learning: models that learn from existing data to produce novel samples.
"Most people have heard of generative AI in the context of fake AI, where it is used to create images, text and sound," says Kotovirta. "These same techniques can be applied to science, learning from something that already exists to create something new. We are finding ways of improving these techniques with quantum computing to generate new proteins."
"The most difficult part is proving that quantum machines have benefits over their classical counterparts," says Kotovirta. "Current quantum computers are real computers, and they can do real calculations and solve real problems."
"In that sense, they are already doing useful things, but the problem is that sizes are very small, because the current systems are inefficient in comparison to their classical counterparts. In order for quantum computers to solve something more efficiently than classical computers, fidelities need to improve."
Fidelity refers to the success rate of a single operation on a quantum computer. The higher the fidelities, the better the success rate of the overall computation. We are currently in an era of noisy intermediate-scale quantum (NISQ) devices, which have already shown that quantum devices can simulate things classical computers struggle to simulate. However, so far, the results are too trivial to solve real problems.
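A back-of-envelope calculation shows why fidelity dominates the discussion. Assuming, purely for illustration, that each gate succeeds independently with the same fidelity, the success probability of a whole circuit falls off exponentially with gate count:

```python
# Success probability of a whole circuit, assuming each gate succeeds
# independently with the same fidelity (an idealised error model).
for fidelity in (0.99, 0.999, 0.9999):
    for n_gates in (100, 1000):
        p_success = fidelity ** n_gates
        print(f"fidelity {fidelity}: {n_gates} gates -> {p_success:.1%}")
```

Under this model, a 1,000-gate circuit at 99% gate fidelity almost never succeeds, while at 99.99% it succeeds roughly nine times out of ten, which is why fraction-of-a-percent fidelity gains matter so much.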
"As fidelities improve, we're approaching the era of utility-scale algorithms, utility-scale quantum computing," says Kotovirta. "This will happen when the fidelities are good enough to run certain algorithms that are tailored for the topology of the device you're using. That will give us results that classical computers cannot replicate, but only for very specific use cases."
These algorithms could be used to simulate quantum systems related to material sciences or chemistry, for example. Although you can't claim general quantum advantage, for those specific use cases you can demonstrate advantage. Finland's strategy is to make a difference on the world stage through the use cases in which quantum advantage can be achieved in the not-so-distant future.
It isn't easy for smaller countries to compete with larger economies. However, it is possible for them to find a niche that allows them to actively contribute on a global scale in one or more specific areas. "In that regard, we are doing the right things," he says. "Hopefully, we will continue to do so in the future."
Read more:
Finland develops quantum algorithms for the future - ComputerWeekly.com
Advancing Spin Qubit Technology: Cryogenic Probing – AZoQuantum
In an article recently published in the journal Nature, researchers developed a 300-mm cryogenic probing process to obtain high-volume data on spin qubit devices across full wafers. They also optimized an industry-compatible process for fabricating spin qubit devices on a low-disorder host material, enabling automated probing of single electrons in spin qubit arrays across 300-mm wafers.
Substantial numbers of physical qubits are required to build a fault-tolerant quantum computer. Recently, silicon quantum dot spin qubits, qubits based on electrons confined in silicon, have displayed two-qubit and single-qubit fidelities above 99%, satisfying error-correction thresholds.
Integrated spin qubit arrays have so far reached sizes of six quantum dots, and larger quantum dot platforms in one-dimensional (1D) and two-dimensional (2D) configurations have also been demonstrated. However, the number of physical qubits must increase significantly to realize real-world applications of spin qubit technology. Thus, spin qubit devices must be fabricated with uniformity, volume, and density comparable to classical computing chips, which currently contain billions of transistors.
Fabricating spin qubit devices with the same infrastructure as classical computing chips can accelerate the development of fault-tolerant quantum computers and unlock the technology's potential for scaling.
This is because spin qubit technology possesses inherent advantages for scaling: qubits roughly 100 nm in size and, for silicon-based devices, built-in compatibility with modern complementary metal-oxide-semiconductor (CMOS) manufacturing infrastructure.
Currently, yield and process variation are the major challenges for spin qubits. Additionally, the cryogenic electrical testing bottleneck hinders the scaling of solid-state quantum technologies such as superconducting, topological, and spin qubits. The scale of cryogenic device testing must therefore keep pace with increasing fabrication complexity to enable efficient device screening, combining process changes with statistical measurements of indicators such as component yield and voltage variation.
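As a schematic of the kind of statistics such wafer-scale probing produces, the snippet below computes device yield and operating-voltage spread from simulated probe data. The numbers are invented for illustration and are not Intel's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated probe results for 400 devices on one wafer: whether each
# device turned on, and its single-electron operating voltage in volts.
# All numbers are invented for illustration.
working = rng.random(400) < 0.95
voltages = rng.normal(loc=1.2, scale=0.03, size=400)

v = voltages[working]
print(f"device yield: {working.mean():.1%}")
print(f"operating voltage: {v.mean():.3f} V, spread {v.std():.3f} V")
```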
In this study, researchers proposed a testing process using a cryogenic 300-mm wafer prober to obtain high-volume data on the performance of hundreds of industry-manufactured spin qubit devices at 1.6 K across full wafers. Additionally, they combined low process variation with a low-disorder host material to optimize an industry-compatible process for spin qubit device fabrication on silicon/silicon-germanium (Si/SiGe) heterostructures.
These advancements were synergistic: the full-wafer cryogenic test capability enabled optimization of the complex 300-mm fabrication process, and the optimized fabrication process improved device reliability, enabling high-fidelity automated measurements across wafers.
Collectively, these advancements culminated in automated single-electron probing in spin qubit arrays across 300-mm wafers. The spin qubit devices were fabricated in Intel's D1 factory, where its CMOS logic processes are developed. A Si/SiGe heterostructure grown on 300-millimeter silicon wafers served as the host material.
Researchers selected this structure to exploit the prolonged electron spin coherence in silicon and its applicability to multiple qubit encodings. All patterning was performed using optical lithography, with the quantum dot gates patterned in a single pass of extreme ultraviolet lithography.
Additionally, all device sub-components were fabricated using fundamental industry techniques such as chemical-mechanical polishing, etching, and deposition. The cryogenic wafer prober (cryo-prober) used in this work, manufactured by AEM Afore and Bluefors, can load and cool 300-millimeter wafers to a base temperature of 1.0 K at the chuck and an electron temperature of 1.6 ± 0.2 K. Thousands of test structures and spin qubit arrays on the wafer were measured after cooldown.
The proposed cryogenic testing method provided fast feedback for optimizing the CMOS-compatible fabrication process, and low process variation and high yield were successfully achieved across the 300-mm wafer.
Using this system, measurements of the spin qubits' operating point were automated, and single-electron transitions were thoroughly investigated across full wafers. Analysis of the random variation in single-electron operating voltages demonstrated that the optimized fabrication process produces low levels of disorder at the 300-mm scale.
The high device yield combined with the cryogenic wafer prober enabled a simple path from device fabrication to the investigation of spin qubits, eliminating failures due to electrostatics or yield at the dilution refrigerator stage. Overall, an extensible large unit cell of up to 12 qubits was realized using a high-volume cryogenic testing method, an all-CMOS-industry-compatible fabrication process with low process variation, and a low-disorder Si/SiGe host material.
To summarize, the findings of this study established a new standard for the reliability and scale of spin qubit devices and paved the way for more complex and much larger spin qubit arrays of the future.
Neyens, S., et al. (2024). Probing single electrons across 300-mm spin qubit wafers. Nature, 629(8010), 80-85. https://doi.org/10.1038/s41586-024-07275-6
Go here to read the rest:
Advancing Spin Qubit Technology: Cryogenic Probing - AZoQuantum
Quantum computing may be the future of technology: what does that really mean? – cuindependent
Physics Professor David Wineland in his lab at the National Institute of Standards and Technology (NIST) in Boulder. Professor Wineland was awarded the 2012 Nobel Prize for his ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems. (Courtesy of The University of Colorado)
Envision a computer that runs faster and more securely than ever before. A new age of technology is coming, and it's called quantum computing.
Global investors are pouring billions of dollars into quantum research and technologies. Many research universities are pursuing quantum computing development. The University of Colorado Boulder is among them, researching quantum computing through its CUbit Quantum Initiative.
"It's an exciting field," said Scott Sternberg, the executive director of the CUbit Quantum Initiative at CU Boulder. "There's not [only] just a lot of money that the United States is putting forward to this."
Quantum in a nutshell
Quantum mechanics refers to the laws of physics that were discovered in the 1920s, said Karl Mayer, who received his Ph.D. in quantum engineering from CU Boulder.
"These are the laws of physics that apply to really small things, like electrons, atoms and things like that. The laws of physics at that small scale behave differently than what physicists call Newtonian physics," Mayer said.
Newton's Laws of Motion describe how an object responds when forces act upon it. While Newton's laws describe everyday objects accurately, they break down for very small bodies, such as electrons, and for objects moving close to the speed of light. That's where quantum mechanics comes into play.
"Quantum physics starts to present itself in a very different mathematical way," Sternberg said. "It becomes a window into some of the fundamental forces of the universe."
CUbit Quantum Initiative
Under the quantum umbrella, CUbit is working on three main focus areas: quantum sensing and measurement, quantum networks and communications, and quantum computing and simulation.
CUbit aims to advance science and build a foundation for quantum technology. With Colorado being a hub for quantum research, the program has an advantage in partnering with local companies and startups.
Quantum sensing and measurement and qubits
Quantum sensing and measurement allow technological devices to be more efficient, accurate and productive. GPS systems, for example, depend on quantum physics to provide their mapping tools.
According to Sternberg, sensors based on quantum techniques help form atomic clocks, very precise timekeepers, in addition to mapping tools. In 2017, CU Boulder scientists created a new design for an atomic clock that outperformed previous attempts.
Part of what allows for devices to be more efficient and productive is the use of bits, an essential part of both classical and quantum computing.
Bits, the smallest units of information in a computer, have two states: 0 and 1, corresponding to off and on. Using combinations of bits, engineers turn simple electrical signals into complex logic gates. A traditional bit holds a single definite value, whereas a qubit can exist in a superposition, a weighted combination of 0 and 1. This ability to occupy both values at once is what makes qubits more powerful than traditional bits for certain computations.
"Instead of a traditional computer that encodes into one or zero, you have this bit, this qubit, which in this particular state operates very differently than a traditional transistor one/zero would," Sternberg said. "If you think about a one or a zero, it [a qubit] could be a one and a zero at the same time and everything in between."
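A small numerical sketch makes the "one and a zero at the same time" idea concrete: a qubit state is a pair of complex amplitudes, and a measurement returns 0 or 1 with probabilities given by the squared magnitudes of those amplitudes. The specific 80/20 state below is just an example, not anything from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# A qubit state is a normalised pair of complex amplitudes (a, b):
# |psi> = a|0> + b|1>. This one is "mostly 0, partly 1".
state = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)

probs = np.abs(state) ** 2              # Born rule: P(0), P(1)
samples = rng.choice([0, 1], size=10000, p=probs)

print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
print(f"measured 0 in {np.mean(samples == 0):.1%} of 10,000 shots")
```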
Quantum computing and simulation
Quantum physics deals with electrons and other minuscule moving objects. According to Mayer, electrons do not have exact positions; instead, they are described by wave functions. This is important for making predictions.
A wave function is an equation that describes the behavior of quantum particles, mainly electrons. It is useful for finding the probability that a particle will end up in a specific place. While probabilities can be calculated, the exact outcome of a single measurement cannot be predicted.
Sternberg used a deck of cards as an example. When drawing from a deck of cards, there is a one in 52 chance of getting a particular card, a certain number or face in a certain suit. In this scenario, a wave function would describe the probability of each card appearing.
Quantum computing and simulation use qubits to speed up computing processes. With that speed, computers can simulate materials and chemical processes. This is where probability and wave functions come into play.
Using electrons and the principles of quantum physics, researchers have found that experiments run on quantum computers give a better chance of estimating such probabilities, predicting the likelihood of certain events, such as the sun rising tomorrow.
Quantum networks and communications
According to Mayer, quantum computing also encompasses cryptography.
Cryptography is the process that protects digital information. Encryption and decryption are its two essential parts. Encryption turns data from plaintext into ciphertext; decryption transforms ciphertext back into plaintext. A message in its readable form is considered plaintext.
To block third-party viewers from reading messages, they are encrypted and turned into ciphertext, which is not easily readable. When someone sends a text message, it is encrypted, and when the receiver opens the message, it is decrypted.
Traditional cryptography uses math to encrypt messages, whereas quantum cryptography uses physics. It sends subatomic particles known as photons through a filter to a receiver. When an eavesdropper tries to read the photons, the state of the photons changes. This makes quantum cryptography more secure than traditional cryptography.
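The photon-and-filter scheme described above can be caricatured in a few lines of code. The simulation below is loosely in the spirit of the well-known BB84 protocol, an assumption on our part, since the article does not name a specific protocol: the sender and receiver pick random filter bases, keep only the bits where their bases matched, and an eavesdropper who measures in the wrong basis disturbs the photons, showing up as roughly 25% errors in the shared key.

```python
import random

def bb84(n_photons=2000, eavesdrop=False):
    # Sender encodes random bits in random bases (0 = rectilinear, 1 = diagonal).
    sender_bits = [random.randint(0, 1) for _ in range(n_photons)]
    sender_bases = [random.randint(0, 1) for _ in range(n_photons)]
    photons = list(zip(sender_bits, sender_bases))

    if eavesdrop:
        # Eve measures in random bases; a wrong basis randomises the bit,
        # and re-sending the photon in her basis disturbs its state.
        intercepted = []
        for bit, basis in photons:
            eve_basis = random.randint(0, 1)
            eve_bit = bit if eve_basis == basis else random.randint(0, 1)
            intercepted.append((eve_bit, eve_basis))
        photons = intercepted

    # Receiver measures in random bases; a wrong basis gives a random bit.
    key_sender, key_receiver = [], []
    for (bit, basis), orig_bit, orig_basis in zip(photons, sender_bits, sender_bases):
        recv_basis = random.randint(0, 1)
        recv_bit = bit if recv_basis == basis else random.randint(0, 1)
        if recv_basis == orig_basis:    # bases compared publicly; keep matches
            key_sender.append(orig_bit)
            key_receiver.append(recv_bit)

    errors = sum(a != b for a, b in zip(key_sender, key_receiver))
    return errors / len(key_sender)

print(f"error rate, no eavesdropper: {bb84():.1%}")                  # ~0%
print(f"error rate, with eavesdropper: {bb84(eavesdrop=True):.1%}")  # ~25%
```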
"The context of quantum networks and communication is being able to create a secure connection that is so unbreakable," Sternberg said.
Problems with quantum
Experts are hopeful about the future of quantum, but the research is still developing. Quantum computers still exhibit an array of operational errors, and researchers don't yet know the most effective way to build them.
"The hope with quantum computers is that you would kind of do this experimental research of building the material and studying it," Mayer said. "Quantum computers enable you to do that simulation efficiently where classical computers do not."
Mayer works at Quantinuum, a company that develops technology and conducts research in quantum computing. One of their products, known as Quantum Origin, is designed to use quantum cryptography in a way that better secures data.
Part of Mayer's research at Quantinuum involves identifying errors in the system.
"One problem present with current quantum computers is that they are very noisy or very prone to errors," Mayer said, "and there's a whole zoo of possible errors that can occur to quantum bits. And what I do is I design experiments to get run on our system that measure things like what types of errors are occurring and with what rates."
Additionally, Sternberg said that quantum computers are hard to build due to the advanced cooling technology the process requires.
"It's an engineering problem to be able to manipulate these and scale them to the point where quantum computers could now grow," Sternberg said.
While quantum computers have the potential to improve security, they can also pose a threat.
Sternberg said that some members of the quantum community fear that advanced quantum computers in other nations could access private information in the United States.
"And so there's sort of this half security side of it and half benefit for overall mankind," Sternberg said.
CU Boulder and the future of quantum
CU Boulder has a particular advantage in quantum research as a university based in Colorado.
"We have industries that are quantum-like. Meaning that we have aerospace, we have biosciences, we have IT as part of our major industry sectors here in the state," Sternberg said. "We have those skill sets that are adjacent to the quantum space in other industries that we have here in the state."
Although quantum computing is not yet the industry standard, technology is heading in that direction. Companies around the globe are working to progress humanity toward an era of quantum technology.
However, as quantum technology develops, potential users shouldn't expect to see any quantum laptops or computers in their local tech stores anytime soon.
"We're at this kind of frontier, this beginning of engineering quantum computers," Sternberg said. "I don't know if we'll see them on a laptop, but we'll certainly be able to use a laptop to access a quantum computer for a computation."
Contact Guest Writer Hope Munoz at Hope.Munoz-Stanley@colorado.edu.
Go here to see the original:
Quantum computing may be the future of technology: what does that really mean? - cuindependent