
Elusive Ghost Particles from Space May Solve A Longstanding Mystery In Quantum Physics – Inverse

Sensors buried deep in the Antarctic ice may help physicists figure out how gravity works in the quantum realm, the smallest scale of existence.

A team of physicists says they've figured out how to use a sensor-laden patch of ice near the South Pole to measure tiny changes in particles called neutrinos as they pass through space. Those changes could help reveal how gravity affects matter on the scale of subatomic particles like neutrinos. If they're right, it could be a step toward resolving a problem physicists have wrangled with for decades.

Physicist Tom Stuttard of the University of Copenhagen and his colleagues published their work in the journal Nature Physics.

The IceCube lab stands near the Amundsen-Scott South Pole Station in Antarctica. Beneath the surface, neutrino detectors embedded in a cubic kilometer of ice wait for a neutrino to streak past.

Stuttard and his colleagues studied data from the IceCube Neutrino Observatory in Antarctica, whose sensors have recorded the passage of around 300,000 neutrinos so far. The physicists wanted to know whether they could measure tiny, subtle changes in the neutrinos' properties, which could be caused by an elusive force called quantum gravity. Stuttard and his colleagues showed that their method for measuring those small changes worked, but they need more data to find what they are looking for.

The universe as we know it works according to a set of rules described by physicist Albert Einstein in his theory of general relativity. Those rules explain everything from why a dropped pen falls to the floor, to why galaxy clusters can become cosmic magnifying lenses. But if you zoom in to very tiny scales (the realm of subatomic particles like electrons, quarks, and neutrinos), Einstein's rules don't seem to apply anymore. At the smallest scales, the universe runs on a different set of rules called quantum mechanics.

Somewhere, there's an exact scale at which the rules change over from general relativity to quantum mechanics, and somewhere, there's a way to connect the two sets of rules. That's the kind of problem that has kept generations of physicists awake at night. Right now, there isn't a good description of how (or if) gravity works at the tiny scales of quantum mechanics. Stuttard and his colleagues hope neutrinos can help them learn those rules.

But they're going to need a lot more neutrinos.

Neutrinos are like tiny ghosts; they have no electrical charge and almost no mass, so they barely interact with the universe around them. Like ghosts, they can move through walls, mountains, or even entire planets, undisturbed and often undetected. Most of the neutrinos IceCube's sensors detect have passed through the roughly 7,800-mile diameter of Earth to arrive in the Antarctic ice.

Even IceCube doesn't really detect neutrinos; instead, its sensors (thousands of them, embedded throughout a cubic kilometer of deep, ancient ice) measure the quick flash of radiation that happens when a neutrino bumps into an atom. That radiation holds clues about the neutrino's energy and the direction it came from. Stuttard and his colleagues hoped that information could also reveal whether certain properties had changed slightly during the neutrino's journey.

Here's where it gets weird. (No, it wasn't weird before. You haven't seen weird yet.)

Neutrinos come in three types, or flavors: electron neutrinos, muon neutrinos, and tau neutrinos. Those flavors are named for the types of subatomic particles that tend to be spawned when the neutrino finally collides with an atom. And as a neutrino flies through space, it changes flavors constantly.

"Imagine purchasing a carton of chocolate ice cream at the store, driving home, and opening it only to find it was vanilla! So you put a scoop of vanilla in your bowl and walk into the other room to eat it, where you are surprised to find it is now strawberry," explains the Fermi National Accelerator Laboratory on its website. That's what happens with neutrinos.

However, those changes aren't random; physicists can generally predict how far a neutrino should travel between flavor changes (a predictability called quantum coherence). Stuttard and his colleagues were looking for deviations from that predictable pattern. If those deviations happened, they had to be caused by something actually having an effect on the neutrino, something like a version of gravity that works at the quantum scale.
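As an illustration of what "deviations from the predictable pattern" means, the textbook two-flavor oscillation probability can be damped by a hypothetical coherence length. This is only a sketch: the function name, parameter values, and the simple exponential damping model are our assumptions, not the analysis the IceCube team actually used.

```python
import numpy as np

def oscillation_probability(L_km, E_GeV, sin2_2theta=1.0,
                            dm2_eV2=2.5e-3, L_coh_km=None):
    """Two-flavor appearance probability with optional decoherence.

    Illustrative only: typical atmospheric-neutrino parameters and a
    simple exponential damping model (an assumption for this sketch).
    """
    delta = 1.27 * dm2_eV2 * L_km / E_GeV  # standard oscillation phase
    damping = 1.0 if L_coh_km is None else np.exp(-L_km / L_coh_km)
    # damping = 1 recovers the textbook sin^2(2 theta) * sin^2(delta);
    # strong damping washes the pattern out toward sin^2(2 theta) / 2.
    return sin2_2theta * 0.5 * (1.0 - damping * np.cos(2.0 * delta))

# A 25 GeV atmospheric neutrino crossing the Earth (~12,700 km):
p_coherent = oscillation_probability(12700.0, 25.0)
p_decohered = oscillation_probability(12700.0, 25.0, L_coh_km=1000.0)
```

The decohered case settles near the 50/50 average regardless of baseline, which is exactly the kind of washed-out signature a quantum-gravity effect might leave.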

Stuttard and his colleagues didn't measure any quantum-gravity-induced changes in their neutrinos, but they did show that it's possible to measure whether a neutrino's quantum coherence has been tampered with mid-flight.

Most of the neutrinos that IceCube has measured so far formed in Earth's atmosphere when high-energy particles from space collided with atoms there. At most, those atmospheric neutrinos had traveled about 7,800 miles through Earth to reach IceCube. That's not very far in the cosmic scheme of things.

"Apparently, a much longer distance is needed for quantum gravity to make an impact, if it exists," says Stuttard in a recent statement.

And that means physicists are going to need data from many, many neutrinos that began their journeys in distant space, traveling billions of light years to reach Earth. Those neutrinos are rarer and harder to detect than atmospheric neutrinos, but Stuttard and his colleagues are optimistic.

"With future measurements with astrophysical neutrinos, as well as more precise detectors being built in the coming decade, we hope to finally answer this fundamental question," says Stuttard.

See the article here:

Elusive Ghost Particles from Space May Solve A Longstanding Mystery In Quantum Physics - Inverse


Graviton modes observed for the first time by Chinese scientists – China Daily

Light probing chiral graviton modes in fractional quantum Hall effect liquids. [Photo/Official Wechat account of Nanjing University]

Chinese scientists have for the first time observed graviton modes that are condensed-matter analogs of gravitons, a major discovery for understanding new correlated quantum physics.

The research team, led by Du Lingjie from Nanjing University, spent more than three years designing and assembling an experimental apparatus that utilized resonant inelastic scattering of circularly polarized light to study low-energy collective excitations of the fractional quantum Hall liquids in a gallium arsenide quantum well.

The graviton modes in fractional quantum Hall liquids manifested as chiral spin-2 long-wavelength magnetorotons, collective excitations that appear similar to the graviton, a hypothetical spin-2 boson whose existence was pointed out as early as 1939.

"Gravitons correspond to gravitational waves, the latter of which has been experimentally confirmed, while gravitons have not been directly observed." Du Lingjie was quoted as saying by Xinhua News Agency.

"Gravitons are a product of the combination of general relativity and quantum mechanics. If the existence of this mysterious particle can be confirmed, it may help to achieve the unification of these two major theories, which is of great significance for contemporary physics," said Du.

The discovery was published in Nature on March 27.

Read more from the original source:

Graviton modes observed for the first time by Chinese scientists - China Daily


The Cosmic Quandary: Stability of the Universe’s Vacuum – yTech

Summary: Contemporary scientific inquiry delves into the mysteries of the vacuum in space, uncovering its dynamic energy potential and the existential uncertainties it presents. This article examines the Casimir effect, the concept of a metastable vacuum, and the implications for the universe's longevity, balancing the fascination with potential cataclysm against the enduring stability observed in cosmic history.

The notion of nothingness in space is a misapprehension; the expanse that we call a vacuum is, in fact, brimming with transient energy and ephemeral particles. These findings build upon the groundwork laid down by Hendrik Casimir in the mid-20th century, unveiling the forceful attraction between uncharged metal plates (now known as the Casimir effect) that manifests from vacuum fluctuations. This peculiar quantum behavior contradicts classical physics, revealing a universe whirring with imperceptible activity.

Contrary to the basic assumption that a vacuum occupies the minimum energy state, theoretical physics entertains the prospect of a higher, though precarious, energy plateau. Here lies the potential for a metastable vacuum, where the fabric of space teeters on a thin line between its current form and a more relaxed, fundamental state. The hypothetical consequences of this shift would precipitate a universe-ending event, a conjecture both terrifying and intriguing.

Yet such hypothetical doom seems remote, as evidence points to our universe having weathered an impressive 14-billion-year existence without succumbing to such a phase change. Calculations suggest the enduring nature of this fine-tuned cosmic setup may well extend for billions of years to come, offering reassurance against the backdrop of more tangible existential risks.

This exploration into the metastability of the universe's vacuum points to a vast universe that is still yielding its secrets, and with that, seeding speculative realms in science fiction narratives like the novel Feuermondnacht, which capture the human imagination around the unfathomable mysteries of existence and the latent forces at play in the quantum fields.

Exploring the Enigmatic Vacuum: Energy, Stability, and Cosmic Longevity

The vacuum of space, long thought to be a void of emptiness, has proven to be a hotbed of quantum activity and a field of great scientific interest. This seemingly empty space is filled with energy and particles that blink in and out of existence, challenging traditional conceptions of nothingness. One of the key phenomena associated with the quantum vacuum is the Casimir effect, named after Dutch physicist Hendrik Casimir, who first predicted this quantum occurrence in 1948. Employing advanced techniques, physicists have since confirmed that uncharged metal plates can attract each other in a vacuum due to the pressure exerted by quantum fluctuations, an insight contradicting classical physics.
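For ideal, perfectly conducting plates, the predicted attraction has a simple closed form; the sketch below evaluates the textbook expression P = pi^2 * hbar * c / (240 * d^4). The function name is ours, and real plates (finite conductivity, roughness, nonzero temperature) deviate from this ideal result.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C_LIGHT = 2.99792458e8  # speed of light, m/s

def casimir_pressure(separation_m):
    """Attractive pressure (N/m^2) between ideal parallel conducting
    plates a distance `separation_m` apart: the textbook result
    P = pi^2 * hbar * c / (240 * d^4) for perfect conductors."""
    return math.pi ** 2 * HBAR * C_LIGHT / (240.0 * separation_m ** 4)

# The d^-4 scaling makes the effect dramatic only at tiny separations:
p_1um = casimir_pressure(1e-6)      # on the order of 1e-3 N/m^2
p_100nm = casimir_pressure(100e-9)  # on the order of 13 N/m^2
```

Halving the separation increases the pressure sixteenfold, which is why the effect only became measurable with modern nanoscale instrumentation.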

Beyond its scientific fascination, the vacuum's dynamic energy has implications for various industry sectors, such as nanotechnology, where understanding quantum forces can enhance the creation and manipulation of materials on an atomic scale. Additionally, companies in aerospace engineering are invested in researching these quantum effects, as they could have practical applications in space travel and satellite technology.

Metastability and Cosmic Uncertainties

In theoretical physics, the concept of a metastable vacuum suggests our universe could be in a delicate state of balance. If the vacuum were to transition to a lower energy level, it could lead to cosmic restructuring on a grand scale, an event that could spell the end of the universe as we know it. While this notion is worrying, there is currently no evidence to suggest that such a phase transition is imminent. Our universe's track record of 14 billion years without undergoing such a transformation lends weight to the belief in its stability.

Market Forecasts and Industry Implications

Ongoing research and development into the quantum mechanics of the vacuum holds significant market potential. Market forecasts in industries like quantum computing, space exploration, and energy production are taking into account the advancements made in understanding the quantum vacuum. These insights can lead to transformative technologies with the power to redefine our energy consumption, computational capabilities, and even our approach to space endeavors.

Furthermore, there are budding areas of exploration such as vacuum energy extraction (sometimes mentioned in speculative science and literature), which entertain the idea of harnessing the quantum vacuum's energy. If made technically feasible, this could lead to groundbreaking renewable energy sources.

Issues and Challenges in the Field

Despite the exciting potential, there are complex challenges facing researchers studying the vacuum of space. Any practical application of the energy within the vacuum must hurdle the limitations imposed by the laws of thermodynamics and quantum field theory. Also, due to the highly abstract nature of these phenomena, public understanding and therefore investment can be hard to secure. Ethical considerations concerning the manipulation of fundamental forces, and international regulation, also present hurdles that require careful navigation as we advance in this domain.

As humankind continues to push the boundaries of its understanding of the universe, the foundations of everything we know hang in the balance. Through meticulous research, the industries built upon these foundations may propel us into a future that once existed only within the realms of science fiction. For those intrigued by the impact of quantum physics on our world, a visit to authoritative resources on science and technology could offer deeper insights into these profound topics.

Michał Rogucki is a pioneering figure in the field of renewable energy, particularly known for his work on solar power innovations. His research and development efforts have significantly advanced solar panel efficiency and sustainability. Rogucki's commitment to green energy solutions is also evident in his advocacy for integrating renewable sources into national power grids. His groundbreaking work not only contributes to the scientific community but also plays a crucial role in promoting environmental sustainability and energy independence. Rogucki's influence extends beyond academia, impacting industry practices and public policy regarding renewable energy.

Read the original here:

The Cosmic Quandary: Stability of the Universe's Vacuum - yTech


Breakthrough in Quantum Chemistry: The Creation of the Coldest Complex Molecule – yTech

Summary: In a groundbreaking experiment, scientists have managed to synthesize an ultra-cold four-atom molecule, providing new insights into quantum behavior and the potential for developing materials with unprecedented properties, such as high-temperature superconductors.

The world of quantum mechanics often surprises with its counterintuitive nature, and researchers have now reached a new milestone by creating an ultracold four-atom molecule, the coldest of its kind, at a temperature near absolute zero. Utilizing a sophisticated multi-step cooling technique, including laser and evaporative cooling, scientists were able to construct this molecule at 134 nanokelvin, just a whisper above the lowest temperature theoretically possible in the universe.

This breakthrough comes with significant challenges. At such low temperatures, quantum mechanics becomes the dominant force, and manipulating the complex array of quantum states in molecules, as opposed to simpler atoms or ions, is an intricate task. Roman Bause, a quantum optics researcher, has likened the plethora of quantum states in molecules to a thick book, reflecting the complex nature of their behavior.

Yet, overcoming these obstacles has immense potential benefits. A key application of understanding ultracold molecules lies in the simulation of other quantum systems. This research could pave the way for the development of materials critical to future technology, such as high-temperature superconductors or advanced lithium batteries.

The team, led by Tao Shi at the Chinese Academy of Sciences, circumvented the molecules' tendency to clump together and escape experimental control by carefully orchestrating microwaves to finely tune the bonding process. As a result, they were able to form this unique four-atom molecule with a bond length a thousand times longer than that within its component parts.

The success of forging such a molecule is a testament to the innovative techniques in the field of ultracold science, offering a new perspective on the quantum world and hinting at revolutionary advancements in materials science.

The Synthesis of Ultracold Four-Atom Molecules and Its Impact on the Quantum Mechanics Industry

The synthesis of the ultracold four-atom molecule represents a significant achievement in the realm of quantum mechanics and underlines the rapid progression within this industry. The ability to create and manipulate matter at near absolute zero temperatures is not only a triumph of experimental physics but also a milestone that could influence market dynamics by driving the development of new technologies.

Industry and Market Forecasts

The quantum mechanics industry has been evolving with research progressing towards practical applications such as quantum computing, precision sensors, and advanced materials. The creation of such molecules could factor importantly into the development of high-temperature superconductors, which promise to revolutionize industries by substantially reducing energy losses in electrical systems. The market for superconductors alone has significant growth potential, with forecasts projecting substantial expansion fueled by advancements in sectors like medical imaging, scientific research, and power utilities.

Moreover, as the technology matures, it can lead to improvements in the manufacturing of advanced lithium batteries, which are pivotal to the green energy transition, particularly in the electric vehicle market. The synthesis of ultracold molecules contributes to understanding how quantum systems interact, paving the way for innovative battery technologies with higher energy capacities and faster charging times.

The quantum mechanics industry is also interconnected with other fields, like materials science and nanotechnology, and this synergy could lead to a host of unforeseen products and applications that rely on the manipulation of quantum states.

Issues Related to the Industry or Product

Despite the exciting possibilities, there are significant challenges facing the industry. The complexity of quantum states presents a considerable barrier to commercialization, requiring sophisticated and expensive equipment for experimentation and development. Furthermore, quantum materials and technologies might face regulatory and safety concerns due to their novel properties and the potential impact on existing markets and infrastructures.

To further explore the field of quantum mechanics and its potential implications, one might visit reputable scientific research organizations or industry-leading technology companies that are pioneering quantum technologies. A reliable source for such information could be found through the official links of these organizations or companies, such as:

The European Quantum Industry Consortium: qurope.eu
The National Quantum Initiative: quantum.gov
The Quantum Technology Hub: quantumcommshub.net

Please note that these are suggested domains for further information on quantum technology and should be verified for accuracy before use.

The synthesis of an ultracold four-atom molecule is a blueprint for our understanding of quantum interactions. It carries with it the promise of breakthroughs in numerous applications. As researchers and industries delve deeper into this field, the impact on both science and the global market will continue to unfold, bearing witness to the transformative power of quantum mechanics.

Roman Perkowski is a distinguished name in the field of space exploration technology, specifically known for his work on propulsion systems for interplanetary travel. His innovative research and designs have been crucial in advancing the efficiency and reliability of spacecraft engines. Perkowski's contributions are particularly significant in the development of sustainable and powerful propulsion methods, which are vital for long-duration space missions. His work not only pushes the boundaries of current space travel capabilities but also inspires future generations of scientists and engineers in the quest to explore the far reaches of our solar system and beyond.

View post:

Breakthrough in Quantum Chemistry: The Creation of the Coldest Complex Molecule - yTech


For the 1st Time Scientists Found Experimental Evidence of Graviton-like Particle – IndianWeb2.com

Gravitons are fascinating hypothetical particles that play a pivotal role in our understanding of gravity. These are the fundamental particles that mediate the force of gravitational interaction in the realm of quantum field theory.

In simpler terms, they carry the gravitational force, much like how photons carry the electromagnetic force. When you toss something upward, and it gracefully descends due to gravity, it's essentially the gravitons at work.

Like photons, gravitons are expected to be massless and electrically uncharged. Gravitons too travel at the speed of light, zipping through the fabric of spacetime. Their existence is rooted in the quest for a unified theory that combines quantum mechanics and gravity.

Gravitons are the focus of the search for the "theory of everything", which would unify Einstein's General Relativity (GR) theory of gravity with quantum theory.

Gravitons remain elusive and unobserved and continue to intrigue scientists as we seek to unravel the mysteries of gravity and the cosmos.

In a recent development, however, scientists have glimpsed graviton-like particles; these particles of gravity have shown signs of their existence in a semiconductor.

An international research team led by Chinese scientists has, for the first time, presented experimental evidence of a graviton-like particle called chiral graviton modes (CGMs), with the findings published in the scientific journal Nature on Thursday.

By putting a thin layer of semiconductor under extreme conditions and exciting its electrons to move in concert, researchers from eastern China's Nanjing University, the United States and Germany found the electrons to spin in a way that is only expected to exist in gravitons.

Despite the breakthrough, Loren Pfeiffer at Princeton University, who co-wrote the paper on these findings, said, "This is a needle in a haystack [finding]. And the paper that started this whole thing is from way back in 1993." He wrote that paper with several colleagues, including Aron Pinczuk, who passed away in 2022 before they could find hints of the gravitons.

The term "graviton" was coined in 1934 by Soviet physicists Dmitrii Blokhintsev and F. M. Gal'perin. Paul Dirac later reintroduced the term, envisioning that the energy of the gravitational field should come in discrete quantathese quanta he playfully dubbed "gravitons."

Just as Newton anticipated photons, Laplace also foresaw "gravitons," albeit with a greater speed than light and no connection to quantum mechanics or special relativity.

Continue reading here:

For the 1st Time Scientists Found Experimental Evidence of Graviton-like Particle - IndianWeb2.com


Quantum computing just got hotter: 1 degree above absolute zero – The Conversation

For decades, the pursuit of quantum computing has struggled with the need for extremely low temperatures, mere fractions of a degree above absolute zero (0 kelvin, or −273.15°C). That's because the quantum phenomena that grant quantum computers their unique computational abilities can only be harnessed by isolating them from the warmth of the familiar classical world we inhabit.

A single quantum bit or qubit, the equivalent of the binary zero or one bit at the heart of classical computing, requires a large refrigeration apparatus to function. However, in many areas where we expect quantum computers to deliver breakthroughs, such as in designing new materials or medicines, we will need large numbers of qubits, or even whole quantum computers, working in parallel.

Quantum computers that can manage errors and self-correct, essential for reliable computations, are anticipated to be gargantuan in scale. Companies like Google, IBM and PsiQuantum are preparing for a future of entire warehouses filled with cooling systems and consuming vast amounts of power to run a single quantum computer.

But if quantum computers could function at even slightly higher temperatures, they could be much easier to operate and much more widely available. In new research published in Nature, our team has shown that a certain kind of qubit (the spins of individual electrons) can operate at temperatures around 1 kelvin, far hotter than earlier examples.

Cooling systems become less efficient at lower temperatures. To make it worse, the systems we use today to control the qubits are intertwining messes of wires reminiscent of ENIAC and other huge computers of the 1940s. These systems increase heating and create physical bottlenecks to making qubits work together.

Read more: How long before quantum computers can benefit society? That's Google's US$5 million question

The more qubits we try to cram in, the more difficult the problem becomes. At a certain point the wiring problem becomes insurmountable.

After that, the control systems need to be built into the same chips as the qubits. However, these integrated electronics use even more power and dissipate more heat than the big mess of wires.

Our new research may offer a way forward. We have demonstrated that a particular kind of qubit, one made with a quantum dot printed with metal electrodes on silicon, using technology much like that used in existing microchip production, can operate at temperatures around 1 kelvin.

This is only one degree above absolute zero, so it's still extremely cold. However, it's significantly warmer than previously thought possible. This breakthrough could condense the sprawling refrigeration infrastructure into a more manageable, single system. It would drastically reduce operational costs and power consumption.

The necessity for such technological advancements isn't merely academic. The stakes are high in fields like drug design, where quantum computing promises to revolutionise how we understand and interact with molecular structures.

The research and development expenses in these industries, running into billions of dollars, underscore the potential cost savings and efficiency gains from more accessible quantum computing technologies.

Hotter qubits offer new possibilities, but they will also introduce new challenges in error correction and control. Higher temperatures may well mean an increase in the rate of measurement errors, which will create further difficulties in keeping the computer functional.

It is still early days in the development of quantum computers. Quantum computers may one day be as ubiquitous as today's silicon chips, but the path to that future will be filled with technical hurdles.

Read more: Explainer: quantum computation and communication technology

Our recent progress in operating qubits at higher temperatures is a key step towards making the requirements of the system simpler.

It offers hope that quantum computing may break free from the confines of specialised labs into the broader scientific community, industry and commercial data centres.

Follow this link:
Quantum computing just got hotter: 1 degree above absolute zero - The Conversation


Quantum computing progress: Higher temps, better error correction – Ars Technica

There's a strong consensus that tackling most useful problems with a quantum computer will require that the computer be capable of error correction. There is absolutely no consensus, however, about what technology will allow us to achieve that. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have all committed to different technologies to get there, while a collection of startups are exploring an even wider range of potential solutions.

We probably won't have a clearer picture of what's likely to work for a few years. But there's going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we're going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

Error correction will require connecting multiple hardware qubits to act as a single unit termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error-correction schemes require over a hundred hardware qubits for each logical qubit, meaning we'd need tens of thousands of hardware qubits before we could do anything practical.
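The idea of spreading one bit of information across several physical carriers can be illustrated with a toy classical three-bit repetition code, sketched below. This is only an analogy: real quantum codes such as the surface code are far more involved, since quantum states cannot simply be copied, and the parameter values here are arbitrary.

```python
import random

def encode(bit):
    """Repetition code: store one logical bit in three physical bits."""
    return [bit, bit, bit]

def apply_noise(codeword, p_flip, rng):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if rng.random() < p_flip else b for b in codeword]

def decode(codeword):
    """Majority vote: corrects any single bit-flip error."""
    return 1 if sum(codeword) >= 2 else 0

def logical_error_rate(p_flip, trials=100_000, seed=0):
    """Monte Carlo estimate of how often decoding gets it wrong."""
    rng = random.Random(seed)
    errors = sum(decode(apply_noise(encode(0), p_flip, rng)) != 0
                 for _ in range(trials))
    return errors / trials

# With p = 0.05 per physical bit, two or more flips are needed to fool
# the vote, so the logical rate falls to about 3p^2 - 2p^3, near 0.007.
rate = logical_error_rate(0.05)
```

The key takeaway carries over to the quantum case: as long as the physical error rate is below a threshold, adding redundancy makes the logical error rate lower, which is why real schemes are willing to spend a hundred hardware qubits per logical qubit.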

A number of companies have looked at that problem and decided we already know how to create hardware on that scale: just look at any silicon chip. So, if we could etch useful qubits through the same processes we use to make current processors, then scaling wouldn't be an issue. Typically, this has meant fabricating quantum dots on the surface of silicon chips and using these to store single electrons that can hold a qubit in their spin. The rest of the chip holds more traditional circuitry that performs the initiation, control, and readout of the qubit.

This creates a notable problem. Like many other qubit technologies, quantum dots need to be kept below 1 Kelvin in order to keep the environment from interfering with the qubit. And, as anyone who has ever owned an x86-based laptop knows, all the other circuitry on the silicon generates heat. So, there's the very real prospect that trying to control the qubits will raise the temperature to the point that the qubits can't hold onto their state.

That might not be the problem we thought, according to some work published in Wednesday's Nature. A large international team that includes people from the startup Diraq has shown that a silicon quantum dot processor can work well at the relatively toasty temperature of 1 kelvin, up from the millikelvin range at which these processors normally operate.

The work was done on a two-qubit prototype made with materials that were specifically chosen to improve noise tolerance; the experimental procedure was also optimized to limit errors. The team then performed normal operations starting at 0.1 K and gradually ramped up the temperatures to 1.5 K, checking performance as they did so. They found that a major source of errors, state preparation and measurement (SPAM), didn't change dramatically in this temperature range: "SPAM around 1 K is comparable to that at millikelvin temperatures and remains workable at least until 1.4 K."

The error rates they did see depended on the state they were preparing. One particular state (both spin-up) had a fidelity of over 99 percent, while the rest were less constrained, at somewhere above 95 percent. States had a lifetime of over a millisecond, which qualifies as long-lived in the quantum world.

All of which is pretty good and suggests that the chips can tolerate reasonable operating temperatures, meaning on-chip control circuitry can be used without causing problems. The error rates of the hardware qubits are still well above those that would be needed for error correction to work. However, the researchers suggest that they've identified error processes that can potentially be compensated for. They expect that the ability to do industrial-scale manufacturing will ultimately lead to working hardware.

Read this article:
Quantum computing progress: Higher temps, better error correction - Ars Technica


IBM Quantum Computing Blog | Landmark IBM error correction paper on Nature cover – IBM

Today, the paper detailing those results was published as the cover story of the scientific journal Nature.

Last year, we demonstrated that quantum computers had entered the era of utility, where they are now capable of running quantum circuits better than classical computers can. Over the next few years, we expect to find speedups over classical computing and extract business value from these systems. But there are also algorithms with mathematically proven speedups over leading classical methods that require tuning quantum circuits with hundreds of millions to billions of gates. Expanding our quantum computing toolkit to include those algorithms requires us to find a way to compute that corrects the errors inherent to quantum systems: what we call quantum error correction.

Read how a paper from IBM and UC Berkeley shows a path toward useful quantum computing

Quantum error correction requires that we encode quantum information into more qubits than we would otherwise need. However, achieving quantum error correction in a scalable and fault-tolerant way has, to this point, been out of reach without considering scales of one million or more physical qubits. Our new result published today greatly reduces that overhead, and shows that error correction is within reach.

While quantum error correction theory dates back three decades, theoretical error correction techniques capable of running valuable quantum circuits on real hardware have been too impractical to deploy on real quantum systems. In our new paper, we introduce a new code, which we call the gross code, that overcomes that limitation.

This code is part of our broader strategy to bring useful quantum computing to the world.

While error correction is not a solved problem, this new code makes clear the path toward running quantum circuits with a billion gates or more on our superconducting transmon qubit hardware.

Quantum information is fragile and susceptible to noise: environmental noise, noise from the control electronics, hardware imperfections, state preparation and measurement errors, and more. In order to run quantum circuits with millions to billions of gates, quantum error correction will be required.

Error correction works by building redundancy into quantum circuits. Many qubits work together to protect a piece of quantum information that a single qubit might lose to errors and noise.

On classical computers, the concept of redundancy is pretty straightforward. Classical error correction involves storing the same piece of information across multiple bits. Instead of storing a 1 as a 1 or a 0 as a 0, the computer might record 11111 or 00000. That way, if an error flips a minority of bits, the computer can treat 11001 as 1, or 10001 as 0. It's fairly easy to build in more redundancy as needed to introduce finer error correction.
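The majority-vote idea described above is easy to demonstrate in a few lines of Python (a sketch of the classical scheme only; quantum encoding works differently, as the next paragraph explains):

```python
from collections import Counter

def encode(bit, n=5):
    """Encode one logical bit as n identical physical bits."""
    return [bit] * n

def decode(bits):
    """Majority vote: the most common value wins."""
    return Counter(bits).most_common(1)[0][0]

# A 5-bit repetition code survives up to two flipped bits.
assert decode([1, 1, 0, 0, 1]) == 1  # two errors, still reads as 1
assert decode([1, 0, 0, 0, 1]) == 0  # two errors, still reads as 0
```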

Things are more complicated on quantum computers. Quantum information cannot be copied and pasted like classical information, and the information stored in quantum bits is more complicated than classical data. And of course, qubits can decohere quickly, forgetting their stored information.

Research has shown that quantum fault tolerance is possible, and there are many error correcting schemes on the books. The most popular one is called the surface code, where qubits are arranged on a two-dimensional lattice and units of information are encoded into sub-units of the lattice.

But these schemes have problems.

First, they only work if the hardware's error rates are better than some threshold determined by the specific scheme and the properties of the noise itself, and beating those thresholds can be a challenge.

Second, many of those schemes scale inefficiently: as you build larger quantum computers, the number of extra qubits needed for error correction far outpaces the number of qubits the code can store.

At practical code sizes where many errors can be corrected, the surface code uses hundreds of physical qubits per encoded qubit worth of quantum information, or more. So, while the surface code is useful for benchmarking and learning about error correction, it's probably not the end of the story for fault-tolerant quantum computers.

The field of error correction buzzed with excitement in 2022, when Pavel Panteleev and Gleb Kalachev at Moscow State University published a landmark paper proving that there exist asymptotically good codes: codes where the number of extra qubits needed levels off as the quality of the code increases.

This has spurred a lot of new work in error correction, especially in the same family of codes that the surface code hails from, called quantum low-density parity check, or qLDPC codes. These qLDPC codes are quantum error correcting codes where the operations responsible for checking whether or not an error has occurred only have to act on a few qubits, and each qubit only has to participate in a few checks.
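The "low-density" property has a direct classical analogue that is easy to visualize. The sketch below (illustrative only, not one of the quantum codes discussed here) builds the parity-check matrix of a simple ring code, where every check touches just two bits and every bit sits in just two checks:

```python
import numpy as np

# Parity-check matrix of a classical "ring" code: check i compares
# bits i and i+1 (mod n). Low-density means sparse rows and columns:
# each check acts on only a few bits, and each bit appears in only a
# few checks -- the property qLDPC codes carry to the quantum case.
n = 8
H = np.zeros((n, n), dtype=int)
for i in range(n):
    H[i, i] = H[i, (i + 1) % n] = 1

print(H.sum(axis=1).max(), H.sum(axis=0).max())  # 2 2 -- independent of n
```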

But this work was highly theoretical, focused on proving the possibility of this kind of error correction. It didn't take into account the real constraints of building quantum computers. Most importantly, some qLDPC codes would require many qubits in a system to be physically linked to high numbers of other qubits. In practice, that would require quantum processors folded in on themselves in psychedelic hyper-dimensional origami, or entombed in wildly complex rat's nests of wires.


Bravyi, S., Cross, A., Gambetta, J., et al. High-threshold and low-overhead fault-tolerant quantum memory. Nature (2024). https://doi.org/10.1038/s41586-024-07107-7

In our Nature paper, we specifically looked for fault-tolerant quantum memory with a low qubit overhead, high error threshold, and a large code distance.

Let's break that down:

Fault-tolerant: The circuits used to detect errors won't spread those errors around too badly in the process, and the errors can be corrected faster than they occur.

Quantum memory: In this paper, we are only encoding and storing quantum information. We are not yet doing calculations on the encoded quantum information.

High error threshold: The higher the threshold, the higher the hardware error rate the code can tolerate while remaining fault tolerant. We were looking for a code that could operate the memory reliably at physical error rates as high as 0.001, so we wanted a threshold close to 1 percent.

Large code distance: Distance is the measure of how robust the code is: how many errors it takes to completely flip the value from 0 to 1 and vice versa. In the case of 00000 and 11111, the distance is 5. We wanted one with a large code distance that corrects more than just a couple of errors. Large-distance codes can suppress noise by orders of magnitude even if the hardware quality is only marginally better than the code threshold. In contrast, codes with a small distance become useful only if the hardware quality is significantly better than the code threshold.

Low qubit overhead: Overhead is the number of extra qubits required for correcting errors. We want the number of qubits required to do error correction to be far less than we need for a surface code of the same quality, or distance.

We're excited to report that our team's mathematical analysis found concrete examples of qLDPC codes that met all of these required conditions. These fall into a family of codes called Bivariate Bicycle (BB) codes. And they are going to shape not only our research going forward, but how we architect physical quantum systems.

While many qLDPC code families show great promise for advancing error correction theory, most aren't necessarily pragmatic for real-world application. Our new codes lend themselves better to practical implementation because each qubit needs to connect to only six others, and the connections can be routed on just two layers.

To get an idea of how the qubits are connected, imagine they are put onto a square grid, like a piece of graph paper. Curl up this piece of graph paper so that it forms a tube, and connect the ends of the tube to make a donut. On this donut, each qubit is connected to its four neighbors and two qubits that are farther away on the surface of the donut. No more connections needed.
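As a rough sketch of that connectivity, the Python below wires up a grid with periodic (donut) boundaries, four nearest-neighbor links, and two long-range links per qubit. The long-range offsets here, (3, 0) and (0, 3), are placeholders: in the actual BB codes the offsets are fixed by the code's defining polynomials.

```python
# Illustrative torus connectivity: four short-range links plus two
# long-range links per site, as described above.
L = 12  # grid side length (illustrative)

def neighbors(r, c):
    near = [((r - 1) % L, c), ((r + 1) % L, c),
            (r, (c - 1) % L), (r, (c + 1) % L)]
    far = [((r + 3) % L, c), (r, (c + 3) % L)]  # placeholder offsets
    return near + far

# With periodic boundaries, every qubit has exactly six connections.
degrees = {len(set(neighbors(r, c))) for r in range(L) for c in range(L)}
print(degrees)  # {6}
```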

The good news is we dont actually have to embed our qubits onto a donut to make these codes work we can accomplish this by folding the surface differently and adding a few other long-range connectors to satisfy mathematical requirements of the code. Its an engineering challenge, but much more feasible than a hyper-dimensional shape.

We explored some codes that have this architecture and focused on a particular [[144,12,12]] code. We call this code the gross code because 144 is a gross (or a dozen dozen). It requires 144 qubits to store data, and in our specific implementation it also uses another 144 qubits to check for errors, so this instance of the code uses 288 qubits. It stores 12 logical qubits at a distance of 12, meaning any error affecting fewer than 12 qubits can be detected. Thus: [[144,12,12]].

Using the gross code, you can protect 12 logical qubits for roughly a million cycles of error checks using 288 qubits. Doing roughly the same task with the surface code would require nearly 3,000 qubits.
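In round numbers, the figures quoted above amount to roughly a tenfold reduction in physical qubits per logical qubit:

```python
# Physical-qubit overhead implied by the numbers quoted above.
gross_total, surface_total, logical_qubits = 288, 3000, 12

gross_per_logical = gross_total / logical_qubits      # 24.0
surface_per_logical = surface_total / logical_qubits  # 250.0
print(f"gross: {gross_per_logical:.0f} qubits/logical, "
      f"surface: {surface_per_logical:.0f} qubits/logical, "
      f"saving: {surface_per_logical / gross_per_logical:.1f}x")
```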

This is a milestone. We are still looking for qLDPC codes with even more efficient architectures, and our research on performing error-corrected calculations using these codes is ongoing. But with this publication, the future of error correction looks bright.


Fig. 1 | Tanner graphs of surface and BB codes. a, Tanner graph of a surface code, for comparison. b, Tanner graph of a BB code with parameters [[144, 12, 12]] embedded into a torus. Any edge of the Tanner graph connects a data and a check vertex. Data qubits associated with the registers q(L) and q(R) are shown by blue and orange circles. Each vertex has six incident edges, including four short-range edges (pointing north, south, east and west) and two long-range edges. We only show a few long-range edges to avoid clutter. Dashed and solid edges indicate two planar subgraphs spanning the Tanner graph; see the Methods. c, Sketch of a Tanner graph extension for measuring the logical Z̄ and X̄ operators following ref. 50, attaching to a surface code. The ancilla corresponding to the X̄ measurement can be connected to a surface code, enabling load-store operations for all logical qubits by means of quantum teleportation and some logical unitaries. This extended Tanner graph also has an implementation in a thickness-2 architecture through the A and B edges (Methods).


Fig. 2 | Syndrome measurement circuit. Full cycle of syndrome measurements relying on seven layers of CNOTs. We provide a local view of the circuit that only includes one data qubit from each register q(L) and q(R). The circuit is symmetric under horizontal and vertical shifts of the Tanner graph. Each data qubit is coupled by CNOTs with three X-check and three Z-check qubits: see the Methods for more details.


Today, our users benefit from novel error mitigation techniques (methods for reducing or eliminating the effect of noise when calculating observables), alongside our work suppressing errors at the hardware level. This work brought us into the era of quantum utility. IBM researchers and partners all over the world are exploring practical applications of quantum computing today with existing quantum systems. Error mitigation lets users begin looking for quantum advantage on real quantum hardware.

But error mitigation comes with its own overhead, requiring running the same executions repeatedly so that classical computers can use statistical methods to extract an accurate result. This limits the scale of the programs you can run, and increasing that scale requires tools beyond error mitigation like error correction.

Last year, we debuted a new roadmap laying out our plan to continuously improve quantum computers over the next decade. This new paper is an important example of how we plan to continuously increase the complexity (number of gates) of the quantum circuits that can be run on our hardware. It will allow us to transition from running circuits with 15,000 gates to 100 million, or even 1 billion gates.


3 Quantum Computing Stocks to Buy on the Dip: March 2024 – InvestorPlace

While classical computers have enjoyed tremendous capacity gains over the past few decades, it's time for a paradigm shift, which brings the discussion to quantum computing stocks to buy. Here, we're not just talking about shifting gears but moving from a race car to a rocket ship.

To be sure, it's difficult to explain the various intricacies that help propel quantum computers over their traditional counterparts. But in a nutshell, it comes down to exponentially quicker processing. An attribute called superposition enables quantum computers to evaluate multiple possibilities simultaneously. That makes the new innovation run circles around classical processes.

Further, you can't argue with the numbers. In 2022, the quantum market reached a valuation of $1.9 billion. By 2032, this sector could jump to $42.1 billion, representing a compound annual growth rate of 36.4%.
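A quick sanity check of that growth figure from the quoted start and end values:

```python
# CAGR check: $1.9B in 2022 growing to $42.1B by 2032 (10 years).
start, end, years = 1.9, 42.1, 10
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 36.3%, in line with the quoted 36.4%
```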

Who knows? That might end up being a conservative estimate. With so much anticipation, these are the quantum computing stocks to buy for speculators.


One of the top names in the tech ecosystem, Intel (NASDAQ:INTC) could be one of the underappreciated quantum computing stocks to buy. According to its public profile, the company designs, develops, manufactures, markets and sells computing and related products and services worldwide. It operates through Client Computing Group, Data Center and AI [artificial intelligence], Network and Edge, Mobileye and Intel Foundry Services segments.

Last year, Intel manufactured a quantum chip, availing it to university and federal research labs to grow the underlying community. While it might not be the most exciting play among quantum computing stocks to buy, its continued research and development makes it a worthy idea to consider.

Financially, the company has performed quite well against expected bottom-line targets. Specifically, Intel mitigated the expected loss per share in the first quarter of 2023 while delivering earnings beats in Q2 through Q3. Overall, the average positive surprise came out to 177.65% in the past four quarters.

For fiscal 2024, analysts anticipate earnings per share to land at $1.24 on sales of $53.1 billion. That's a solid improvement over last year's 97 cents per share on sales of $50.18 billion.


Falling under the computer hardware segment of the broader tech ecosystem, IonQ (NASDAQ:IONQ) engages in the development of general-purpose quantum computing systems. Per its corporate profile, the company sells access to quantum computers of various qubit capacities. It provides that access through cloud platforms offered by enterprises like Amazon (NASDAQ:AMZN), Microsoft (NASDAQ:MSFT) and Alphabet (NASDAQ:GOOGL).

Since the start of the year, IONQ slipped 25%. However, in the past 52 weeks, it has gained 78%. Therefore, those who are willing to tolerate volatility in the near term may benefit from a possible discounted opportunity. On the financials, the company has started to improve its performance.

For example, in Q2 last year, IonQ incurred a negative surprise of 69.2%. In Q3, the metric was 22.2% in the red. However, in Q4, the company met the expected loss per share of 20 cents.

For fiscal 2024, analysts believe that the tech firm could generate revenue of $38.93 million. If so, that would represent a 76.6% increase from last year's print of $22 million. Thus, it's one of the exciting ideas for quantum computing stocks to buy.


Another name within the computer hardware subsector, Rigetti Computing (NASDAQ:RGTI), through its subsidiaries, builds quantum computers and superconducting quantum processors. Per its public profile, Rigetti offers cloud services in the form of quantum processing units. It also sells access to its quantum systems via a Cloud Computing as a Service business model.

Now, RGTI generates plenty of attention among quantum computing stocks to buy because of its tremendous performance. Since the beginning of the year, Rigetti shares have popped more than 64%. In the trailing 52 weeks, the stock is up almost 175%. However, RGTI is also down 15% in the trailing five sessions, potentially providing speculators with a discount.

Interestingly, Rigetti has posted some hits and misses in its quarterly disclosures. In Q2 and Q4, the company beat per-share expectations while missing in Q1 and Q3. For fiscal 2024, Rigetti could generate $16.1 million in revenue. If so, that would be 34.1% higher than last year's print of $12.01 million.

It's no wonder, then, that analysts rate RGTI a unanimous strong buy with a $3.25 price target. That implies 115% upside potential.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare. Tweet him at @EnomotoMedia.


Revolutionizing Quantum Computing: Breakthroughs in Quantum Error Correction – AZoQuantum

Despite their great potential, quantum computers are delicate devices. Unlike classical bits, qubits (their quantum counterparts) are prone to errors from noise and decoherence. Addressing this challenge, Quantum Error Correction (QEC) is a crucial branch of quantum computing development that focuses on resolving qubit errors.


The world of atoms and subatomic particles is governed by the laws of quantum mechanics. Quantum computing harnesses these principles, performing calculations in a completely different way from traditional computers.

Regular computers use bits, which can be either 0 or 1. Quantum computers, however, exploit the bizarre property of superposition, allowing qubits to be 0, 1, or both at the same time. The ability to be in multiple states simultaneously enhances the processing power of quantum computers.
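In the textbook formalism, that property is captured by a pair of amplitudes whose squared magnitudes give the odds of measuring 0 or 1. A minimal sketch:

```python
import math

# A qubit is described by two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2. Equal superposition:
alpha = beta = 1 / math.sqrt(2)
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(round(p0, 2), round(p1, 2))  # 0.5 0.5
```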

Qubits are made from quantum particles like electrons or photons. By controlling properties like electrical charge or spin, data can be represented as 0, 1, or a combination of both. To unlock the true power of quantum computers, scientists rely on two unique properties: superposition and entanglement.

There is no preferred qubit technology; instead, a range of physical systems, such as photons, trapped ions, superconducting circuits, and semiconductor spins, are being investigated for use as qubits.1

All these methods face the common challenge of isolating qubits from external noise, making errors during quantum computation inevitable. In contrast, classical computer bits, realized by the on/off states of transistor switches carrying billions of electrons, have substantial noise margins that make errors from physical defects vanishingly rare.

There is no equivalent built-in protection for quantum computers, where qubits are realized as fragile physical systems. Thus, active error correction is necessary for any quantum computer relying on qubit technology.

In 1995, Peter Shor introduced the first quantum error-correcting method. Shor's approach demonstrated how quantum information could be redundantly encoded by entangling it across a larger system of qubits.

Subsequent findings then showed that, if specific physical requirements on the qubits themselves are satisfied, extensions of this technique can in principle lower the quantum error rate arbitrarily.

While diverse efforts are being undertaken in the field of QEC, the fundamental approach to QEC implementation involves the following steps.

Quantum information is encoded across several physical, distributed qubits. These qubits act as 'information holders' for a 'logical qubit,' which is more robust and contains the data used for computation.

Additional physical qubits are then entangled with the information holders according to a specific QEC code. These extra qubits serve as sentinels for the logical qubit.

QEC identifies errors in the encoded data by measuring the sentinel qubits in a way that does not directly disturb the data in the logical qubit. The measurement yields a pattern of results, the error syndrome, that indicates the type and location of the error.

Different QEC codes are available for the various types of errors that could occur. Based on the detected error, the chosen QEC system applies an operation to correct the error in the data qubits.

Error correction itself has the potential to generate noise. Therefore, additional physical qubits are required to maintain the delicate balance of correcting errors and limiting the introduction of new ones.
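The cycle described above can be caricatured with classical bits using the logic of the three-qubit bit-flip code. This sketch deliberately omits what makes the quantum case hard (real syndrome measurements must extract parities without collapsing superpositions), but the encode/detect/correct loop is the same:

```python
def encode(bit):
    """Step 1: spread one logical bit across three physical carriers."""
    return [bit, bit, bit]

def syndrome(q):
    """Step 2: parity checks on pairs (0,1) and (1,2) reveal where an
    error sits without reading out the logical value itself."""
    return (q[0] ^ q[1], q[1] ^ q[2])

def correct(q):
    """Step 3: each nonzero syndrome pattern implicates one carrier."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    s = syndrome(q)
    if s in flip:
        q[flip[s]] ^= 1
    return q

q = encode(1)
q[0] ^= 1                        # a bit-flip error hits carrier 0
assert syndrome(q) == (1, 0)     # the syndrome points at carrier 0
assert correct(q) == [1, 1, 1]   # and the error is undone
```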

To realize the full potential of a quantum computer, the number of logical qubits has to be increased. However, since each logical qubit requires several physical qubits for error correction, the complexity and resources needed to isolate and manage high-quality qubits become considerable obstacles to scalability.

In recent years, quantum error correction has seen significant advancements, and the community's focus has shifted from noisy applications to the potential uses of early error-corrected quantum computers. Though research on superconducting circuits, reconfigurable atom arrays, and trapped ions has made significant strides, several platform-specific technological obstacles remain to be solved.

Some notable recent advancements in QEC include:

Despite the challenges, QEC is essential for building large-scale, fault-tolerant quantum computers. Researchers are constantly developing new and improved QEC codes and techniques.

As quantum technology progresses, QEC will play a critical role in unlocking the true potential of this revolutionary field.

More from AZoQuantum: Harnessing Quantum Computing for Breakthroughs in Artificial Intelligence

Disclaimer: The views expressed here are those of the author, expressed in their private capacity, and do not necessarily represent the views of AZoM.com Limited T/A AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.
