Category Archives: Quantum Computer

This University in New York Is the First With a Full-Fledged Quantum Computer – PCMag

On Friday, April 5, I attended the ribbon-cutting for the first quantum computer installed on a university campus, an IBM Quantum System One machine at Rensselaer Polytechnic Institute. While quantum computing has the potential to solve some problems that traditional computers can't and has been advancing at a steady rate, there are still many questions and challenges around the technology. Installing the machine on a college campus will allow researchers to examine many of these issues and give students hands-on experience with the technology.

RPI President Martin A. Schmidt (Credit: Michael J. Miller)

RPI President Martin A. Schmidt says that with this quantum computer, "we will explore applications, develop algorithms, and in so doing help humanity solve some very large problems." He states that while it's easy to predict that quantum systems will rapidly become essential because of their computational power, we don't yet fully know how best to use them. He says we can anticipate important applications in biomedicine, in modeling climate and predicting weather, and in materials design, but that there will be applications in many other fields.

With IBM's research in Yorktown Heights, manufacturing in Poughkeepsie, and partnerships with the University at Albany as well as RPI, he hopes for "an agglomeration effect," in which organizations in a region working together can create something where the whole is greater than the sum of the parts. Schmidt notes that there are already partnerships in the area for semiconductor research, which have led to new factories being built in upstate New York: "Adding 'quantum valley' aspects to 'tech valley' is not only going to draw new businesses here and encourage startups, but also offer the region's existing businesses early insights into what it means to be quantum advantaged."

Schmidt hopes the system and its use by RPI and the University at Albany will help answer the question of how the United States educates a quantum-ready workforce for the near future. He notes RPI's history of 'hands-on' education and says that students at all levels will be encouraged to use the machine.

Separately, Schmidt also tells me that he believes the quantum computer will be useful in attracting both faculty and students.

Curtis Priem, a cofounder of Nvidia, vice chairman of RPI, and the donor who arranged for the machine to come to RPI, notes that he enrolled at RPI initially because of this 'hands-on' approach, and remarks that today even undergraduates can use RPI's supercomputer.

IBM CEO Arvind Krishna (Credit: Michael J. Miller)

IBM CEO Arvind Krishna says that quantum systems will solve problems that we cannot solve on today's computers: problems in materials, carbon sequestration, drug discovery, lightweight materials, lubricants, and EV battery materials. "When you think about it intuitively," he says, "they come from a world of physical chemistry, which means that they are subject to the principles of quantum mechanics, which is why these systems, which kind of simulate nature, are the ones that are going to let us make progress on these problems." He adds that quantum systems also have the potential to solve problems around stochastics and financial risk.

Krishna believes that the university could uniquely help with workforce development, saying, "Students are going to imagine using these systems in ways that even the inventors of these systems can't conceive." Listing a set of potential use cases, he says, "I'll make a bet that within five years students and faculty here are going to bring up use cases that are far beyond what we are imagining."

The unveiling was preceded by a day of discussions about the opportunities and the many challenges facing quantum computing before it is ready for commercial applications. I'll talk about those in my next post.


See the original post here:
This University in New York Is the First With a Full-Fledged Quantum Computer - PCMag

New method of measuring qubits promises ease of scalability in a microscopic package – EurekAlert

Image: An artistic illustration shows how microscopic bolometers (depicted on the right) can be used to sense very weak radiation emitted from qubits (depicted on the left).

Credit: Aleksandr Käkinen/Aalto University

Chasing ever-higher qubit counts in near-term quantum computers constantly demands new feats of engineering.

Among the troublesome hurdles of this scaling-up race is refining how qubits are measured. Devices called parametric amplifiers are traditionally used to do these measurements. But as the name suggests, the device amplifies weak signals picked up from the qubits to conduct the readout, which causes unwanted noise and can lead to decoherence of the qubits if not protected by additional large components. More importantly, the bulky size of the amplification chain becomes technically challenging to work around as qubit counts increase in size-limited refrigerators.

Cue the Aalto University research group Quantum Computing and Devices (QCD). They have a hefty track record of showing how thermal bolometers can be used as ultrasensitive detectors, and they just demonstrated in an April 10 Nature Electronics paper that bolometer measurements can be accurate enough for single-shot qubit readout.

A new method of measuring

To the chagrin of many physicists, the Heisenberg uncertainty principle dictates that one cannot simultaneously know a signal's position and momentum, or voltage and current, with perfect accuracy. So it goes with qubit measurements conducted with parametric voltage-current amplifiers. But bolometric energy sensing is a fundamentally different kind of measurement, serving as a means of evading Heisenberg's infamous rule. Since a bolometer measures power, or photon number, it is not bound to add quantum noise stemming from the Heisenberg uncertainty principle in the way that parametric amplifiers are.
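The distinction can be illustrated with a toy numerical sketch (not the paper's analysis; all numbers are illustrative): a quantum-limited phase-preserving amplifier must add at least half a photon of noise per quadrature, referred to the input (the Caves bound), whereas an ideal power detector is not obliged to add that quadrature noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n_shots = 100_000

# Toy model of reading out a weak microwave signal of mean amplitude alpha,
# in units where the vacuum variance is half a photon (0.5) per quadrature.
alpha = 1.0
vacuum_var = 0.5

# Phase-preserving parametric amplifier: the Caves bound forces it to add at
# least half a photon of noise per quadrature, referred to the input.
added_var = 0.5  # the quantum-limited case
I_amp = rng.normal(alpha, np.sqrt(vacuum_var + added_var), n_shots)

# Ideal power (photon-number) detector, such as a bolometer: it senses energy
# directly and is not forced to add that extra half photon of quadrature noise.
I_bolo = rng.normal(alpha, np.sqrt(vacuum_var), n_shots)

# The amplifier chain shows roughly twice the variance on the measured signal.
print(np.var(I_amp), np.var(I_bolo))
```

In this simplified picture, the factor-of-two variance penalty is exactly the added quantum noise that a power detector can, in principle, avoid.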

Unlike amplifiers, bolometers very subtly sense microwave photons emitted from the qubit via a minimally invasive detection interface. This form factor is roughly 100 times smaller than its amplifier counterpart, making it extremely attractive as a measurement device.

"When thinking of a quantum-supreme future, it is easy to imagine that high qubit counts in the thousands or even millions could be commonplace. A careful evaluation of the footprint of each component is absolutely necessary for this massive scale-up. We have shown in the Nature Electronics paper that our nanobolometers could seriously be considered as an alternative to conventional amplifiers. In our very first experiments, we found these bolometers accurate enough for single-shot readout, free of added quantum noise, and they consume 10,000 times less power than the typical amplifiers, all in a tiny bolometer, the temperature-sensitive part of which can fit inside of a single bacterium," says Aalto University Professor Mikko Möttönen, who heads the QCD research group.

Single-shot fidelity is an important metric physicists use to determine how accurately a device can detect a qubit's state in just one measurement, as opposed to an average of multiple measurements. In the QCD group's experiments, they were able to obtain a single-shot fidelity of 61.8% with a readout duration of roughly 14 microseconds. When correcting for the qubit's energy relaxation time, the fidelity jumps up to 92.7%.
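As a rough illustration of how such a correction works (a toy error-budget model, not the procedure used in the paper; the T1 value below is hypothetical), one can split the raw single-shot error into a part caused by the qubit relaxing during the readout window and a part attributable to the detector itself:

```python
import math

def decay_probability(t_read, T1):
    """Probability that an excited qubit relaxes during a readout of length t_read."""
    return 1.0 - math.exp(-t_read / T1)

def detector_fidelity(f_raw, t_read, T1):
    """Toy error budget: split the raw single-shot error into a T1-decay part
    and a detector part, assuming independent error channels and that decay
    only affects the excited-state branch (weight 1/2 averaged over the two
    preparations): (1 - f_raw) = p_decay + (1 - f_det) * (1 - p_decay)."""
    p_decay = 0.5 * decay_probability(t_read, T1)
    return 1.0 - (1.0 - f_raw - p_decay) / (1.0 - p_decay)

# Illustrative numbers only; the 28-microsecond T1 is a hypothetical value,
# not a figure from the paper.
print(round(detector_fidelity(0.618, 14e-6, 28e-6), 3))  # → 0.769
```

The point of the sketch is only the direction of the correction: once the unavoidable T1 decay during the readout window is accounted for, the fidelity attributed to the detector itself is higher than the raw number.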

"With minor modifications, we could expect to see bolometers approaching the desired 99.9% single-shot fidelity in 200 nanoseconds. For example, we can swap the bolometer material from metal to graphene, which has a lower heat capacity and can detect very small changes in its energy quickly. And by removing other unnecessary components between the bolometer and the chip itself, we can not only make even greater improvements on the readout fidelity, but we can achieve a smaller and simpler measurement device that makes scaling up to higher qubit counts more feasible," says András Gunyhó, the first author on the paper and a doctoral researcher in the QCD group.

Prior to demonstrating the high single-shot readout fidelity of bolometers in their most recent paper, the QCD research group first showed that bolometers can be used for ultrasensitive, real-time microwave measurements in 2019. They then published in 2020 a paper in Nature showing how bolometers made of graphene can shorten readout times to well below a microsecond.

The work was carried out in the Research Council of Finland Centre of Excellence for Quantum Technology (QTF) using OtaNano research infrastructure in collaboration with VTT Technical Research Centre of Finland and IQM Quantum Computers. It was primarily funded by the European Research Council Advanced Grant ConceptQ and the Future Makers Program of the Jane and Aatos Erkko Foundation and the Technology Industries of Finland Centennial Foundation.

Full paper:

András M. Gunyhó, Suman Kundu, Jian Ma, Wei Liu, Sakari Niemelä, Giacomo Catto, Vasilii Vadimov, Visa Vesterinen, Priyank Singh, Qiming Chen, Mikko Möttönen, Single-Shot Readout of a Superconducting Qubit Using a Thermal Detector, Nature Electronics, https://doi.org/10.1038/s41928-024-01147-7 (2024).

Contact information:

Mikko Möttönen

Professor, QCD group leader

Aalto University

mikko.mottonen@aalto.fi

m. +358505940950

András Gunyhó

Doctoral researcher

Aalto University

andras.gunyho@aalto.fi

Nature Electronics

Experimental study

Single-Shot Readout of a Superconducting Qubit Using a Thermal Detector

10-Apr-2024

M.M. declares that he is a co-founder and shareholder of the quantum-computer company IQM Finland Oy. M.M. declares that he is an inventor on granted patents FI122887B, US9255839B2, JP5973445B2, and EP2619813B1, titled "Detector of single microwave photons propagating in a guide," applied for by Aalto-korkeakoulusäätiö (the Aalto University Foundation) and invented by M.M. and Jukka Pekola. This patent family describes an ultrasensitive microwave detector concept. M.M. declares that he is an inventor on pending patent applications WO2022248759A1 and TW202303466A, titled "Quantum-state readout arrangement and method," applied for by IQM Finland Oy and invented by M.M. and Juha Hassel. This patent family describes a concept for measuring the states of qubits using bolometers. The other authors declare no competing interests.


Original post:
New method of measuring qubits promises ease of scalability in a microscopic package - EurekAlert

Breakthrough in Quantum Computing: ETH Zurich Innovates with Static Fields in Ion Trapping – yTech

ETH Zurich's recent foray into the realm of ion trapping has yielded promising advancements for quantum computing. A team of researchers at the esteemed institution has developed a method for trapping ions that could potentially enable the creation of quantum computers with greater numbers of qubits than currently possible. Utilizing static electric and magnetic fields, the group has taken quantum operations a step further, signaling a leap forward in computing capabilities.

**Summary:** Researchers at ETH Zurich have made a significant stride in quantum computing by devising an ion trapping technique that employs static electric and magnetic fields. This novel approach, utilizing Penning traps on a microfabricated chip, allows for arbitrary ion transport and offers a scalable solution that promises to increase the number of qubits in quantum computers considerably.

Quantum computer scientists are working tirelessly to overcome the limitations imposed by traditional oscillating-field ion traps, such as the Paul trap, which restricts ions to linear motion and complicates the integration of multiple traps on a single chip. By means of steady fields, the ETH team's Penning traps have unlocked new potential for maneuvering ions in two dimensions without the constraints of oscillating fields, offering a boon for future quantum computing applications.

The ETH researchers, led by Jonathan Home, have reimagined the ion trap architecture, traditionally used in precision experiments, to suit the demands of quantum computing. Despite encountering initial skepticism, the team constructed an advanced Penning trap incorporating a superconducting magnet that produces a field strength of 3 tesla. They effectively implemented precise control over the ions' energy states, proving their method's viability for quantum computation.

The trapped ions' ability to stay put for several days within this new system marks a remarkable achievement. This stable trapping environment, free from oscillating fields and external disturbances, allowed the researchers to maintain the quantum mechanical superpositions essential for operations in quantum computers.

Looking ahead, the ETH group aims to harness these innovations for multi-qubit operations by trapping two ions in adjacent Penning traps on the same chip. This ambitious endeavor would illustrate the practicality of large-scale quantum computers using static field ion traps, potentially leading to more powerful computing technologies than any seen before.

The research at ETH Zurich represents an exciting development in quantum computing, an industry that is expected to revolutionize computing as we know it. With the progress made in ion trapping techniques, the scalability of quantum computers could rise sharply, culminating in machines far exceeding the capabilities of today's supercomputers.

Industry Background: Quantum computing harnesses the phenomena of quantum mechanics to perform computation. Unlike classical bits, quantum computers use qubits, which can exist in states of 0, 1, or any quantum superposition of these states. This allows quantum computers to solve certain problems, such as factoring large numbers or simulating quantum materials, much faster than classical computers.
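The Born-rule statistics behind that description can be sketched in a few lines of Python (a minimal numerical illustration, not how real quantum hardware is programmed):

```python
import numpy as np

rng = np.random.default_rng(42)

# A qubit state |psi> = a|0> + b|1> with |a|^2 + |b|^2 = 1.
# The equal superposition produced by a Hadamard gate acting on |0>:
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Born rule: measurement outcome probabilities are the squared amplitudes.
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # each is about 0.5

# A single measurement yields a random 0 or 1; only the statistics over many
# shots reveal the underlying superposition.
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
print(samples.mean())  # close to 0.5
```

The computational advantage comes not from this randomness itself but from interference between amplitudes, which algorithms like Shor's factoring algorithm exploit.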

Market Forecasts: The quantum computing market is projected to grow significantly in the coming years. According to industry analysis, the global market size, which was valued at several hundred million dollars, is expected to reach into the billions by the end of the decade, with a compound annual growth rate (CAGR) often cited in strong double digits. This growth is driven by increasing investments from both public and private sectors and advancements in quantum computing technologies.

Industry-Related Issues: There are several challenges that the quantum computing industry faces. One of the main hurdles is quantum decoherence, where the qubits lose their quantum state due to environmental interference, posing a significant issue for maintaining quantum superpositions. Another challenge involves error rates in quantum calculations that require complex error correction methods. Furthermore, the creation and maintenance of qubits are technically demanding and expensive, requiring precise control over the physical systems that host them, like ions or other particles.

The breakthrough by ETH Zurich's researchers addresses some of these challenges by using static fields, which can potentially improve the stability and coherence times of the qubits. This could lead to advancements in quantum error correction and enable the implementation of more complex quantum algorithms.

As the demand for quantum computing continues to rise, collaboration and investment in research and development are crucial. Successful implementation of quantum computers can impact various industries, including cryptography, materials science, pharmaceuticals, and finance. For those interested in the cutting-edge developments in this field, the following sources offer valuable insights:

IBM Quantum: IBM is one of the companies at the forefront of quantum computing. It provides access to quantum computers through the cloud and is actively involved in advancing quantum computation technology.

D-Wave Systems Inc.: D-Wave is known for developing quantum annealing-based computers, specializing in solving optimization and sampling problems.

Google Quantum AI: Google's Quantum AI lab is working on developing quantum processors and novel quantum algorithms to help researchers and developers solve near-term problems across various sectors.

The innovations from the team at ETH Zurich are poised to contribute significantly to this burgeoning industry, potentially overcoming some of the critical challenges and pushing us closer to the realization of fully functional quantum computers.

Marcin Frąckiewicz is a renowned author and blogger, specializing in satellite communication and artificial intelligence. His insightful articles delve into the intricacies of these fields, offering readers a deep understanding of complex technological concepts. His work is known for its clarity and thoroughness.

Follow this link:
Breakthrough in Quantum Computing: ETH Zurich Innovates with Static Fields in Ion Trapping - yTech

Shaping the Future: South Carolina’s Quantum Computing Education Initiative – yTech

A summary of the new initiative by the South Carolina Quantum Association reveals the state's forward-thinking investment in quantum computing expertise. South Carolina is funneling resources into a groundbreaking educational partnership aimed at equipping University of South Carolina students with real-world quantum computing skills. Backed by taxpayer dollars, this project is providing a platform for students to train on a cutting-edge quantum supercomputer, fostering their growth into in-demand tech professionals and invigorating local industries with innovative solutions.

In a significant development for South Carolina's aspiring quantum scientists, the state's Quantum Association is collaborating with the University of South Carolina to offer an extraordinary educational experience. The venture is supported by a $20,000 research project fund and is already yielding promising outcomes.

Finance and computer science majors at the university are piloting a quantum computing project, enhancing investment strategies for a regional bank. The quantum computer, funded through a substantial $15 million state budget allocation and accessed remotely from the University of Maryland, serves as a cornerstone for the state's burgeoning intellectual and industrial advancements.

The initiative's participants are already making waves on the national stage, having secured a top position at a notable MIT hackathon. With aspirations extending well into the investment sphere, these students are founding a hedge fund to apply their unique quantum computing insights.

South Carolina's project goes beyond mere technological enhancement. It aims to nurture top-tier talent through an expansive quantum computing curriculum and an online training platform, positioning the state to become a nexus of high-tech finance and industry professionals.

The global quantum computing industry is poised for exponential growth as this powerful technology promises to transform diverse sectors. The South Carolina initiative reflects a strategic movement to prepare for a future that demands advanced computational knowledge, amidst challenges like hardware stability and the need for specialists.

By marrying academic learning with practical application, South Carolina's initiative is setting the stage for building a proficient quantum workforce. This workforce would be adept at addressing industry challenges and leveraging the opportunities offered by this emergent technological field.

Industry and Market Forecasts

The quantum computing industry represents one of the most exciting frontiers in technology and science. As of 2023, the industry is expected to grow substantially in the coming years. According to market research, the global quantum computing market is projected to reach billions of dollars by the end of this decade, with a compound annual growth rate (CAGR) that underscores its dynamic potential. This growth is fueled by increased investments from both public and private sectors, along with breakthroughs in quantum algorithms and hardware.

Companies across various industries, such as finance, pharmaceuticals, automotive, and aerospace, are exploring quantum computing to gain a competitive advantage. This technology holds the promise of solving complex problems that are currently intractable by classical computers, such as optimizing enormous data sets, modeling molecular interactions for drug discovery, and improving encryption methods.

Current Issues in Quantum Computing

One of the most significant issues facing the quantum computing industry is the stabilization of qubits, the basic units of quantum information. Unlike classical bits, qubits can exist in multiple states simultaneously, through a phenomenon known as superposition. However, they are also highly susceptible to interference from their environment, which can lead to errors in computation. Overcoming this challenge, often referred to as quantum decoherence, is a key focus for researchers.
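A minimal numerical sketch of dephasing, one common form of decoherence, shows how random environmental phase kicks wash out a superposition's coherence (the noise strength and time units here are arbitrary, chosen only for illustration):

```python
import numpy as np

# Toy model of dephasing: prepare the superposition (|0> + |1>)/sqrt(2), whose
# density matrix has off-diagonal ("coherence") element 0.5, and let random
# environmental phase kicks accumulate over time. The coherence decays while
# the 0/1 populations are untouched by pure dephasing.
rng = np.random.default_rng(7)
n_trajectories = 20_000
kick_std = 0.1   # std dev of the random phase per step, in radians (arbitrary)
n_steps = 50

phases = np.cumsum(rng.normal(0.0, kick_std, (n_trajectories, n_steps)), axis=1)

# Ensemble-averaged coherence; its magnitude decays as 0.5 * exp(-k * sigma^2 / 2)
# after k kicks of standard deviation sigma.
coherence = 0.5 * np.abs(np.exp(1j * phases).mean(axis=0))

print(coherence[0], coherence[-1])  # the coherence shrinks as noise accumulates
```

Quantum error correction exists precisely to fight this kind of decay: it spreads one logical qubit's information over many physical qubits so that such errors can be detected and undone.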

Another issue is the need for a highly specialized workforce, as quantum computing requires not just expertise in computer science but also in quantum mechanics. The intricacies of quantum algorithms and the underpinning physical principles necessitate a new breed of professionals who can bridge the gap between theory and practical application.

Moreover, the industry is also working on making quantum computing more accessible. Currently, only a handful of organizations have the resources to develop and maintain a quantum computer. However, the rise of quantum computing as a service (QCaaS) models has begun to democratize access to quantum resources, allowing more players to explore the potential of this technology.

South Carolina's Role in the Quantum Computing Ecosystem

South Carolina's initiative to invest in quantum computing education highlights the importance of building a smart workforce that can contribute to and benefit from this promising industry. With their practical projects, such as improving banking investment strategies through quantum computing, students are not only contributing to innovation but are also showcasing how these complex technologies can have real-world applications.

Such initiatives prepare a new generation to play an active role in the industry, ensuring that the U.S. remains at the forefront of technological advancements. For more information on related topics, you may consider visiting the website of professional organizations and industry leaders in quantum computing. Here are a couple of valid related links:

IBM Quantum and Google Quantum AI

By promoting education and funding in quantum computing, South Carolina is positioning itself not only as a contributor to the global quantum revolution but also as a beneficiary of the quantum economy to come. It is an example of how regional initiatives can have significant outcomes in a rapidly evolving high-tech landscape.

Jerzy Lewandowski, a visionary in the realm of virtual reality and augmented reality technologies, has made significant contributions to the field with his pioneering research and innovative designs. His work primarily focuses on enhancing user experience and interaction within virtual environments, pushing the boundaries of immersive technology. Lewandowski's groundbreaking projects have gained recognition for their ability to merge the digital and physical worlds, offering new possibilities in gaming, education, and professional training. His expertise and forward-thinking approach mark him as a key influencer in shaping the future of virtual and augmented reality applications.

View post:
Shaping the Future: South Carolina's Quantum Computing Education Initiative - yTech

Could quantum computing be South Carolina’s next economic draw? This statewide initiative says yes – Columbia … – columbiabusinessreport.com

The future of cutting-edge computer technology in South Carolina is getting a huge boost from an initiative announced March 25.

The South Carolina Quantum Association has launched an effort to develop quantum computing technology and talent in the state through $15 million approved by the South Carolina legislature in the fiscal year 2023-24 budget, the state's largest-ever investment in a tech initiative, according to information from SCQA.

SC Quantum hopes to increase collaboration among academia, entrepreneurs, industry and government to further the advancement of this technology in the Midlands and South Carolina in general, officials said.

Columbia Mayor Daniel Rickenmann, state Sen. Dick Harpootlian, and Joe Queenan, executive director of SC Quantum, announced the landmark project at an event held at the Boyd Innovation Center on Saluda Avenue in Columbia's Five Points district.

Quantum computing is a concept that many people haven't even heard of, and one that is still in development. In a nutshell, it's a computing system that uses the principles of quantum physics to simulate and solve problems that are difficult for traditional digital systems to manage, according to MIT's Sloan School of Management. Quantum computing was first proposed in the 1980s, and the first well-known quantum algorithm emerged from MIT in the 1990s.

Unlike traditional computers, which use binary electric signals, quantum computers use quantum bits, or qubits, which can represent combinations of both ones and zeroes. Experts say the technology could help scientists, businesses, economists and others work through complex problems and find solutions more efficiently.

The funds will go toward education including workforce development, certificate and micro-credential programs, entrepreneurship support and engagement projects such as gatherings of experts and quantum demonstration projects.

"Quantum is a new way of solving problems, and this initiative will allow us to build out a quantum-ready workforce able to solve important real-world problems with this cutting-edge technology," Queenan said.

Queenan noted the importance of funding quantum development because massive efforts are already underway overseas. China has recently dedicated $15 billion to development of quantum technology, and the European Union is devoting $8 billion.

The U.S. government has named quantum an industry of the future on par with artificial intelligence and 5G, and committed more than $1.2 billion for quantum research and development budgets in 2022, according to information from SCQA.

Work in the quantum computing field is already underway at the University of South Carolina, where students recently came in third at a quantum competition held at MIT and others have recently developed a prototype quantum-based hedge fund, which is showing strong returns, Queenan said.

Mayor Rickenmann said the new initiative will help develop Columbia's role as a technology research hub for the state.

"This is the right project at the right time," he said. "This is an investment in the intellectual capital of our city and state. I think we're going to see a renaissance of intellectual development here in this community."

Sen. Harpootlian, who has lived in the Five Points area for more than 50 years, said the quantum initiative being launched from there is just the latest marker of the dramatic change that has transformed the neighborhood since he first moved to the area to attend law school in 1971.

"I look back fondly on the days when this was a sleepy little village, of going to get breakfast at Gibson's and then a hot dog at Frank's Hot Dogs," Harpootlian said, referencing two iconic eateries that were symbols of the area's previous incarnation. "But those days are long gone, and they aren't coming back. What's coming is much better. South Carolina Quantum is putting South Carolina ahead of the curve. Columbia could be a major hub of innovation for this technology that is rapidly growing in use across the globe."

See the rest here:
Could quantum computing be South Carolina's next economic draw? This statewide initiative says yes - Columbia ... - columbiabusinessreport.com

Call for Participation in Workshop on Potential NSF CISE Quantum Initiative – HPCwire

Editor's Note: Next month there will be a workshop to discuss what a quantum initiative led by NSF's Computer, Information Science and Engineering (CISE) directorate could entail. The details are posted below in a Call for Participation announcement. A key contact for interested quantum community members is Frank Mueller, N.C. State University.

Call for Participation: Planning Workshop on Quantum Computing (PlanQC 2024)
April 28, 2024, https://www.asplos-conference.org/asplos2024/
In conjunction with the ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2024), San Diego, CA (USA)

Funding for quantum computing has come from a variety of programs at the National Science Foundation (NSF), which have been multi-disciplinary and have cut across multiple NSF divisions. However, no NSF quantum initiative has been led by the Computer, Information Science and Engineering (CISE) directorate within NSF. Undoubtedly, there is a surge in demand driven by open positions in academia and industry focused on the computing side of quantum. There is arguably a need for a focused program on quantum computing, led by CISE in cooperation with other directorates, to enable the next generation of quantum algorithms, quantum architectures, quantum communication, quantum systems, and quantum software and compilers. The objective of this workshop is to identify areas of quantum computing that can enable new discoveries in quantum science and engineering, and to meaningfully contribute to creating a quantum-computing-ready workforce.

To articulate this need and to develop a plan for a new CISE-led quantum program, we plan to bring together several leading senior and junior researchers in quantum computing for a planning workshop, complemented by selected participants from beyond the CISE community to foster interdisciplinary interactions.

This workshop will lead to a comprehensive report that will provide a detailed assessment of the need for a new CISE program and will provide a description of the research areas that such a program should focus on. With such a new program, NSF would be able to focus its efforts on the computing aspects of quantum science and greatly enhance its ability to sustain the leadership role of the United States in this area of strategic interest. This workshop and report will be the first stepping stone in bootstrapping such a program.

Call for participation

We invite researchers, both those already active in quantum computing and those aspiring to become active, to participate in a one-day workshop, including some participants from the quantum application domains. The focus of the workshop will be to engage in discussions and summarize findings in writing to create a report on open quantum problems and motivate workforce development in the area. The workshop format will alternate plenary and break-out sessions to provide a unifying vision while identifying research challenges in a diverse set of subareas.

Quantum Topics

Algorithms

Architectures

Communication

Compilers/Languages

Simulation

Software

Theory

Applications

Classical control and peripheral hardware

Workshop Chairs and Organizers

Frank Mueller North Carolina State University

Fred Chong University of Chicago

Vipin Chaudhary Case Western Reserve University

Samee Khan Mississippi State University

Gokul Ravi University of Michigan

Read the original post:
Call for Participation in Workshop on Potential NSF CISE Quantum Initiative - HPCwire

Quantum computing just got hotter: 1 degree above absolute zero – The Conversation

For decades, the pursuit of quantum computing has struggled with the need for extremely low temperatures, mere fractions of a degree above absolute zero (0 kelvin, or −273.15°C). That's because the quantum phenomena that grant quantum computers their unique computational abilities can only be harnessed by isolating them from the warmth of the familiar classical world we inhabit.

A single quantum bit, or qubit, the equivalent of the binary zero-or-one bit at the heart of classical computing, requires a large refrigeration apparatus to function. However, in many areas where we expect quantum computers to deliver breakthroughs, such as designing new materials or medicines, we will need large numbers of qubits, or even whole quantum computers, working in parallel.

Quantum computers that can manage errors and self-correct, essential for reliable computations, are anticipated to be gargantuan in scale. Companies like Google, IBM and PsiQuantum are preparing for a future of entire warehouses filled with cooling systems and consuming vast amounts of power to run a single quantum computer.

But if quantum computers could function at even slightly higher temperatures, they could be much easier to operate and much more widely available. In new research published in Nature, our team has shown that a certain kind of qubit, the spins of individual electrons, can operate at temperatures around 1 kelvin, far hotter than in earlier examples.

Cooling systems become less efficient at lower temperatures. To make matters worse, the systems we use today to control the qubits are intertwined messes of wires, reminiscent of ENIAC and other huge computers of the 1940s. These systems add heat and create physical bottlenecks to making qubits work together.

Read more: How long before quantum computers can benefit society? That's Google's US$5 million question

The more qubits we try to cram in, the more difficult the problem becomes. At a certain point the wiring problem becomes insurmountable.

After that, the control systems need to be built into the same chips as the qubits. However, these integrated electronics use even more power and dissipate more heat than the big mess of wires.

Our new research may offer a way forward. We have demonstrated that a particular kind of qubit, one made with a quantum dot printed with metal electrodes on silicon, using technology much like that used in existing microchip production, can operate at temperatures around 1K.

This is only one degree above absolute zero, so it's still extremely cold. However, it's significantly warmer than previously thought possible. This breakthrough could condense the sprawling refrigeration infrastructure into a more manageable single system, drastically reducing operational costs and power consumption.

The necessity for such technological advancements isn't merely academic. The stakes are high in fields like drug design, where quantum computing promises to revolutionise how we understand and interact with molecular structures.

The research and development expenses in these industries, running into billions of dollars, underscore the potential cost savings and efficiency gains from more accessible quantum computing technologies.

Hotter qubits offer new possibilities, but they will also introduce new challenges in error correction and control. Higher temperatures may well mean an increase in the rate of measurement errors, which will create further difficulties in keeping the computer functional.

It is still early days in the development of quantum computers. They may one day be as ubiquitous as today's silicon chips, but the path to that future will be filled with technical hurdles.

Read more: Explainer: quantum computation and communication technology

Our recent progress in operating qubits at higher temperatures is a key step towards simplifying the requirements of the system.

It offers hope that quantum computing may break free from the confines of specialised labs into the broader scientific community, industry and commercial data centres.

Follow this link:
Quantum computing just got hotter: 1 degree above absolute zero - The Conversation

IBM Quantum Computing Blog | Landmark IBM error correction paper on Nature cover – IBM

Today, the paper detailing those results was published as the cover story of the scientific journal Nature.

Last year, we demonstrated that quantum computers had entered the era of utility, where they are now capable of running quantum circuits better than classical computers can. Over the next few years, we expect to find speedups over classical computing and extract business value from these systems. But there are also algorithms with mathematically proven speedups over leading classical methods that require tuning quantum circuits with hundreds of millions to billions of gates. Expanding our quantum computing toolkit to include those algorithms requires us to find a way to compute that corrects the errors inherent to quantum systems: what we call quantum error correction.

Read how a paper from IBM and UC Berkeley shows a path toward useful quantum computing

Quantum error correction requires that we encode quantum information into more qubits than we would otherwise need. However, achieving quantum error correction in a scalable and fault-tolerant way has, to this point, been out of reach without considering scales of one million or more physical qubits. Our new result published today greatly reduces that overhead, and shows that error correction is within reach.

While quantum error correction theory dates back three decades, theoretical error correction techniques capable of running valuable quantum circuits on real hardware have, to this point, been too impractical to deploy on real quantum systems. In our new paper, we introduce a new code, which we call the gross code, that overcomes that limitation.

This code is part of our broader strategy to bring useful quantum computing to the world.

While error correction is not a solved problem, this new code makes clear the path toward running quantum circuits with a billion gates or more on our superconducting transmon qubit hardware.

Quantum information is fragile and susceptible to noise: environmental noise, noise from the control electronics, hardware imperfections, state preparation and measurement errors, and more. In order to run quantum circuits with millions to billions of gates, quantum error correction will be required.

Error correction works by building redundancy into quantum circuits. Many qubits work together to protect a piece of quantum information that a single qubit might lose to errors and noise.

On classical computers, the concept of redundancy is pretty straightforward. Classical error correction involves storing the same piece of information across multiple bits. Instead of storing a 1 as a 1 or a 0 as a 0, the computer might record 11111 or 00000. That way, if an error flips a minority of bits, the computer can treat 11001 as 1, or 10001 as 0. It's fairly easy to build in more redundancy as needed for finer error correction.
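The classical repetition scheme described above can be sketched in a few lines. This is a minimal illustration of the idea, not anything from the paper: each logical bit is stored as five copies, and a majority vote recovers it even when a minority of the copies flip.

```python
def encode(bit: int, n: int = 5) -> list[int]:
    """Store one logical bit redundantly across n physical bits."""
    return [bit] * n

def decode(bits: list[int]) -> int:
    """Majority vote: whichever value most bits hold wins."""
    return 1 if sum(bits) > len(bits) / 2 else 0

# An error flips a minority of bits; the vote still recovers the value.
assert decode([1, 1, 0, 0, 1]) == 1   # 11001 is read as 1
assert decode([1, 0, 0, 0, 1]) == 0   # 10001 is read as 0
```

With five copies, up to two flipped bits can be tolerated; adding more copies buys finer error correction, exactly as the text notes.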

Things are more complicated on quantum computers. Quantum information cannot be copied and pasted like classical information, and the information stored in quantum bits is more complicated than classical data. And of course, qubits can decohere quickly, forgetting their stored information.

Research has shown that quantum fault tolerance is possible, and there are many error correcting schemes on the books. The most popular one is called the surface code, where qubits are arranged on a two-dimensional lattice and units of information are encoded into sub-units of the lattice.

But these schemes have problems.

First, they only work if the hardware's error rates are better than some threshold determined by the specific scheme and the properties of the noise itself, and beating those thresholds can be a challenge.

Second, many of those schemes scale inefficiently: as you build larger quantum computers, the number of extra qubits needed for error correction far outpaces the number of qubits the code can store.

At practical code sizes where many errors can be corrected, the surface code uses hundreds of physical qubits per encoded qubit's worth of quantum information, or more. So, while the surface code is useful for benchmarking and learning about error correction, it's probably not the end of the story for fault-tolerant quantum computers.

The field of error correction buzzed with excitement in 2022, when Pavel Panteleev and Gleb Kalachev at Moscow State University published a landmark paper proving that there exist asymptotically good codes: codes where the number of extra qubits needed levels off as the quality of the code increases.

This has spurred a lot of new work in error correction, especially in the same family of codes that the surface code hails from, called quantum low-density parity check, or qLDPC codes. These qLDPC codes are quantum error correcting codes where the operations responsible for checking whether or not an error has occurred only have to act on a few qubits, and each qubit only has to participate in a few checks.

But this work was highly theoretical, focused on proving that this kind of error correction is possible. It didn't take into account the real constraints of building quantum computers. Most importantly, some qLDPC codes would require many qubits in a system to be physically linked to high numbers of other qubits. In practice, that would require quantum processors folded in on themselves in psychedelic hyper-dimensional origami, or entombed in wildly complex rat's nests of wires.


Bravyi, S., Cross, A., Gambetta, J., et al. High-threshold and low-overhead fault-tolerant quantum memory. Nature (2024). https://doi.org/10.1038/s41586-024-07107-7

In our Nature paper, we specifically looked for fault-tolerant quantum memory with a low qubit overhead, high error threshold, and a large code distance.

Let's break that down:

Fault-tolerant: The circuits used to detect errors won't spread those errors around too badly in the process, and errors can be corrected faster than they occur.

Quantum memory: In this paper, we are only encoding and storing quantum information. We are not yet doing calculations on the encoded quantum information.

High error threshold: The higher the threshold, the more hardware errors the code can tolerate while remaining fault tolerant. We were looking for a code that allowed us to operate the memory reliably at physical error rates as high as 0.001, so we wanted a threshold close to 1 percent.

Large code distance: Distance is the measure of how robust the code is: how many errors it takes to completely flip the value from 0 to 1 and vice versa. In the case of 00000 and 11111, the distance is 5. We wanted a code with a large distance that corrects more than just a couple of errors. Large-distance codes can suppress noise by orders of magnitude even if the hardware quality is only marginally better than the code threshold. In contrast, codes with a small distance become useful only if the hardware quality is significantly better than the code threshold.
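The distance-5 example from the text can be checked directly. This is an illustrative sketch using the standard definitions (distance as the minimum number of flips between valid codewords; a distance-d code corrects up to (d − 1) // 2 errors), not code from the paper:

```python
def hamming_distance(a: str, b: str) -> int:
    """Count the positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

d = hamming_distance("00000", "11111")
assert d == 5                # the distance-5 example from the text
assert (d - 1) // 2 == 2     # up to 2 flipped bits are correctable
```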

Low qubit overhead: Overhead is the number of extra qubits required for correcting errors. We want the number of qubits required to do error correction to be far less than we need for a surface code of the same quality, or distance.

We're excited to report that our team's mathematical analysis found concrete examples of qLDPC codes that met all of these required conditions. These fall into a family of codes called Bivariate Bicycle (BB) codes. And they are going to shape not only our research going forward, but how we architect physical quantum systems.

While many qLDPC code families show great promise for advancing error correction theory, most aren't necessarily pragmatic for real-world application. Our new codes lend themselves better to practical implementation because each qubit needs only to connect to six others, and the connections can be routed on just two layers.

To get an idea of how the qubits are connected, imagine they are put onto a square grid, like a piece of graph paper. Curl up this piece of graph paper so that it forms a tube, and connect the ends of the tube to make a donut. On this donut, each qubit is connected to its four neighbors and two qubits that are farther away on the surface of the donut. No more connections needed.

The good news is we don't actually have to embed our qubits onto a donut to make these codes work. We can accomplish this by folding the surface differently and adding a few other long-range connectors to satisfy the mathematical requirements of the code. It's an engineering challenge, but much more feasible than a hyper-dimensional shape.

We explored some codes that have this architecture and focused on a particular [[144,12,12]] code. We call it the gross code because 144 is a gross (a dozen dozen). It requires 144 qubits to store data, but in our specific implementation it also uses another 144 qubits to check for errors, so this instance of the code uses 288 qubits. It stores 12 logical qubits with a code distance of 12, meaning any error affecting fewer than 12 qubits can be detected. Thus: [[144,12,12]].

Using the gross code, you can protect 12 logical qubits for roughly a million cycles of error checks using 288 qubits. Doing roughly the same task with the surface code would require nearly 3,000 qubits.
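The overhead comparison above works out to a roughly tenfold saving. The 288 and ~3,000 qubit figures are taken from the article; the per-logical-qubit arithmetic below is our own back-of-the-envelope check:

```python
logical_qubits = 12
gross_code_qubits = 288        # 144 data qubits + 144 check qubits
surface_code_qubits = 3000     # approximate figure cited for the surface code

# Physical qubits needed per logical qubit under each scheme
print(gross_code_qubits // logical_qubits)      # 24 per logical qubit
print(surface_code_qubits // logical_qubits)    # 250 per logical qubit
print(surface_code_qubits / gross_code_qubits)  # roughly a 10x reduction
```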

This is a milestone. We are still looking for qLDPC codes with even more efficient architectures, and our research on performing error-corrected calculations using these codes is ongoing. But with this publication, the future of error correction looks bright.


Fig. 1 | Tanner graphs of surface and BB codes. a, Tanner graph of a surface code, for comparison. b, Tanner graph of a BB code with parameters [[144, 12, 12]] embedded into a torus. Any edge of the Tanner graph connects a data and a check vertex. Data qubits associated with the registers q(L) and q(R) are shown by blue and orange circles. Each vertex has six incident edges, including four short-range edges (pointing north, south, east and west) and two long-range edges. We only show a few long-range edges to avoid clutter. Dashed and solid edges indicate two planar subgraphs spanning the Tanner graph; see the Methods. c, Sketch of a Tanner graph extension for measuring the logical Z and X operators following ref. 50, attaching to a surface code. The ancilla corresponding to the logical X measurement can be connected to a surface code, enabling load-store operations for all logical qubits by means of quantum teleportation and some logical unitaries. This extended Tanner graph also has an implementation in a thickness-2 architecture through the A and B edges (Methods).


Fig. 2 | Syndrome measurement circuit. Full cycle of syndrome measurements relying on seven layers of CNOTs. We provide a local view of the circuit that only includes one data qubit from each register q(L) and q(R). The circuit is symmetric under horizontal and vertical shifts of the Tanner graph. Each data qubit is coupled by CNOTs with three X-check and three Z-check qubits: see the Methods for more details.


Today, our users benefit from novel error mitigation techniques: methods for reducing or eliminating the effect of noise when calculating observables, alongside our work suppressing errors at the hardware level. This work brought us into the era of quantum utility. IBM researchers and partners all over the world are exploring practical applications of quantum computing today with existing quantum systems. Error mitigation lets users begin looking for quantum advantage on real quantum hardware.

But error mitigation comes with its own overhead, requiring the same executions to be run repeatedly so that classical computers can use statistical methods to extract an accurate result. This limits the scale of the programs you can run, and increasing that scale requires tools beyond error mitigation, like error correction.

Last year, we debuted a new roadmap laying out our plan to continuously improve quantum computers over the next decade. This new paper is an important example of how we plan to continuously increase the complexity (number of gates) of the quantum circuits that can be run on our hardware. It will allow us to transition from running circuits with 15,000 gates to circuits with 100 million, or even 1 billion, gates.

Read the rest here:
IBM Quantum Computing Blog | Landmark IBM error correction paper on Nature cover - IBM

Quantum computing progress: Higher temps, better error correction – Ars Technica

There's a strong consensus that tackling most useful problems with a quantum computer will require that the computer be capable of error correction. There is absolutely no consensus, however, about what technology will allow us to achieve that. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have all committed to different technologies to get there, while a collection of startups are exploring an even wider range of potential solutions.

We probably won't have a clearer picture of what's likely to work for a few years. But there's going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we're going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

Error correction will require connecting multiple hardware qubits to act as a single unit, termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error-correction schemes require over a hundred hardware qubits for each logical qubit, meaning we'd need tens of thousands of hardware qubits before we could do anything practical.

A number of companies have looked at that problem and decided we already know how to create hardware on that scale: just look at any silicon chip. So, if we could etch useful qubits through the same processes we use to make current processors, then scaling wouldn't be an issue. Typically, this has meant fabricating quantum dots on the surface of silicon chips and using these to store single electrons that can hold a qubit in their spin. The rest of the chip holds more traditional circuitry that performs the initiation, control, and readout of the qubit.

This creates a notable problem. Like many other qubit technologies, quantum dots need to be kept below 1 Kelvin in order to keep the environment from interfering with the qubit. And, as anyone who has ever owned an x86-based laptop knows, all the other circuitry on the silicon generates heat. So, there's the very real prospect that trying to control the qubits will raise the temperature to the point that the qubits can't hold onto their state.

That might not be as big a problem as we thought, according to some work published in Wednesday's Nature. A large international team that includes people from the startup Diraq has shown that a silicon quantum dot processor can work well at the relatively toasty temperature of 1 Kelvin, up from the usual millikelvin temperatures at which these processors normally operate.

The work was done on a two-qubit prototype made with materials that were specifically chosen to improve noise tolerance; the experimental procedure was also optimized to limit errors. The team then performed normal operations starting at 0.1 K and gradually ramped up the temperatures to 1.5 K, checking performance as they did so. They found that a major source of errors, state preparation and measurement (SPAM), didn't change dramatically in this temperature range: "SPAM around 1 K is comparable to that at millikelvin temperatures and remains workable at least until 1.4 K."

The error rates they did see depended on the state they were preparing. One particular state (both spin-up) had a fidelity of over 99 percent, while the rest were less constrained, at somewhere above 95 percent. States had a lifetime of over a millisecond, which qualifies as long-lived in the quantum world.

All of which is pretty good and suggests that the chips can tolerate reasonable operating temperatures, meaning on-chip control circuitry can be used without causing problems. The error rates of the hardware qubits are still well above those that would be needed for error correction to work. However, the researchers suggest that they've identified error processes that can potentially be compensated for. They expect that the ability to do industrial-scale manufacturing will ultimately lead to working hardware.

Read this article:
Quantum computing progress: Higher temps, better error correction - Ars Technica

3 Quantum Computing Stocks to Buy on the Dip: March 2024 – InvestorPlace

While classical computers have enjoyed tremendous capacity gains over the past few decades, it's time for a paradigm shift, which brings the discussion to quantum computing stocks to buy. Here, we're not just talking about shifting gears but moving from a race car to a rocket ship.

To be sure, it's difficult to explain the various intricacies that help propel quantum computers over their traditional counterparts. But in a nutshell, it comes down to exponentially quicker processing. An attribute called superposition enables quantum computers to evaluate multiple possibilities simultaneously. That lets the new innovation run circles around classical processes.

Further, you can't argue with the numbers. In 2022, the quantum market reached a valuation of $1.9 billion. By 2032, this sector could jump to $42.1 billion, representing a compound annual growth rate of 36.4%.
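The growth-rate claim checks out. A quick sanity calculation (our own arithmetic, using the article's $1.9 billion and $42.1 billion figures over the 2022 to 2032 span):

```python
# CAGR = (end / start) ** (1 / years) - 1
start, end, years = 1.9, 42.1, 10  # values in billions of dollars
cagr = (end / start) ** (1 / years) - 1
print(round(cagr * 100, 1))  # roughly 36.3, in line with the cited 36.4%
```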

Who knows? That might end up being a conservative estimate. With so much anticipation, these are the quantum computing stocks to buy for speculators.

Source: JHVEPhoto / Shutterstock.com

One of the top names in the tech ecosystem, Intel (NASDAQ:INTC) could be one of the underappreciated quantum computing stocks to buy. According to its public profile, the company designs, develops, manufactures, markets and sells computing and related products and services worldwide. It operates through the Client Computing Group; Data Center and AI (artificial intelligence); Network and Edge; Mobileye; and Intel Foundry Services segments.

Last year, Intel manufactured a quantum chip, making it available to university and federal research labs to grow the underlying community. While it might not be the most exciting play among quantum computing stocks to buy, its continued research and development makes it a worthy idea to consider.

Financially, the company has performed quite well against expected bottom-line targets. Specifically, Intel mitigated the expected loss per share in the first quarter of 2023 while delivering earnings in Q2 through Q3. Overall, the average positive surprise came out to 177.65% in the past four quarters.

For fiscal 2024, analysts anticipate earnings per share to land at $1.24 on sales of $53.1 billion. That's a solid improvement over last year's 97 cents per share on sales of $50.18 billion.

Source: Amin Van / Shutterstock.com

Falling under the computer hardware segment of the broader tech ecosystem, IonQ (NASDAQ:IONQ) engages in the development of general-purpose quantum computing systems. Per its corporate profile, the company sells access to quantum computers of various qubit capacities, offered through cloud platforms run by enterprises like Amazon (NASDAQ:AMZN), Microsoft (NASDAQ:MSFT) and Alphabet (NASDAQ:GOOGL).

Since the start of the year, IONQ has slipped 25%. However, in the past 52 weeks, it has gained 78%. Therefore, those who are willing to tolerate near-term volatility may benefit from a possible discounted opportunity. On the financials, the company has started to improve its performance.

For example, in Q2 last year, IonQ incurred a negative surprise of 69.2%. In Q3, the metric was 22.2% in the red. However, in Q4, the company met the expected loss per share of 20 cents.

For fiscal 2024, analysts believe the tech firm could generate revenue of $38.93 million. If so, that would represent a 76.6% increase from last year's print of $22 million. Thus, it's one of the exciting ideas among quantum computing stocks to buy.

Source: Bartlomiej K. Wroblewski / Shutterstock.com

Another name within the computer hardware subsector, Rigetti Computing (NASDAQ:RGTI), through its subsidiaries, builds quantum computers and superconducting quantum processors. Per its public profile, Rigetti offers cloud services in the form of quantum processing units and sells access to its quantum systems via a Quantum Computing as a Service business model.

Now, RGTI generates plenty of attention among quantum computing stocks to buy because of its tremendous performance. Since the beginning of the year, Rigetti shares have popped more than 64%. In the trailing 52 weeks, the stock is up almost 175%. However, RGTI is also down 15% in the trailing five sessions, potentially providing speculators with a discount.

Interestingly, Rigetti's quarterly disclosures show some hits and misses. In Q2 and Q4, the company beat per-share expectations while missing in Q1 and Q3. For fiscal 2024, Rigetti could generate $16.1 million in revenue. If so, that would be 34.1% higher than last year's print of $12.01 million.

It's no wonder, then, that analysts rate RGTI a unanimous strong buy with a $3.25 price target, implying 115% upside potential.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare. Tweet him at @EnomotoMedia.

Read the original:
3 Quantum Computing Stocks to Buy on the Dip: March 2024 - InvestorPlace