
Unveiling the Mysteries of Quantum Entanglement: Harnessing the Power of Spooky Action for the Future | by Tariq … – Medium

Quantum entanglement, the phenomenon famously dubbed "spooky action at a distance" by Albert Einstein, has long been a source of fascination and intrigue in the world of physics. As we delve into the intricate realm of quantum mechanics, we uncover the mesmerizing characteristics of entangled particles and the tantalizing possibilities they hold for the future of technology.

Einstein's skepticism about quantum entanglement was rooted in his discomfort with the idea that particles could instantaneously influence each other's states, regardless of the vast distances that separated them. He famously referred to this as "spooky action at a distance." However, despite Einstein's reservations, experiments conducted since then have consistently demonstrated the reality of quantum entanglement, paving the way for groundbreaking discoveries.

Characteristics of Quantum Entanglement

Instantaneous Connection: Quantum entanglement defies our classical intuition, as particles become interconnected in such a way that the state of one particle instantaneously influences the state of its entangled partner, no matter the distance between them.

Non-locality: The entanglement phenomenon exhibits non-locality, where the properties of one particle are dependent on the state measurements of its entangled partner, even when separated by vast distances. This seemingly faster-than-light connection challenges our understanding of causality, although it cannot be used to transmit information faster than light.

Quantum Superposition: Entangled particles exist in a state of superposition, meaning they can exist in multiple states simultaneously until a measurement is made. This unique characteristic forms the basis for quantum computing and quantum communication.
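A minimal numerical sketch of this idea (plain NumPy, no quantum SDK assumed): the Bell state is a superposition of two two-qubit basis states, and the Born rule gives its measurement statistics, which are perfectly correlated between the two particles.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Bell state |Phi+> = (|00> + |11>) / sqrt(2): a superposition of two
# two-qubit basis states that cannot be factored into independent
# single-qubit states -- the particles are entangled.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Measurement probabilities in the computational basis (Born rule),
# in the order |00>, |01>, |10>, |11>.
probs = np.abs(bell) ** 2
print(probs)
# Only 00 and 11 have nonzero probability (0.5 each): the two
# measurement outcomes always agree, however far apart the qubits are.
```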

Potential Applications

Quantum Computing: One of the most promising applications of quantum entanglement lies in the realm of quantum computing. For certain problems, entangled qubits can perform calculations exponentially faster than classical bits, potentially revolutionizing fields such as cryptography, optimization, and simulation.

Quantum Communication: Harnessing entangled particles for quantum communication enables quantum key distribution (QKD) systems that are, in principle, unbreakable. These systems leverage the properties of entanglement to secure communication channels, offering a new era of ultra-secure communication.

Quantum Sensing: Quantum entanglement can be harnessed for ultra-precise sensing and measurement devices. Quantum sensors based on entanglement could revolutionize fields like navigation, imaging, and gravitational wave detection.
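The entanglement-based key distribution mentioned above can be illustrated with a toy simulation in the spirit of the E91 protocol. Everything here is a simplified, noiseless model: the function names and the perfect same-basis correlations are illustrative assumptions, not a real QKD implementation.

```python
import random

random.seed(7)

def e91_round():
    """One hypothetical key round: Alice and Bob share a maximally
    entangled pair and each measures in a randomly chosen basis."""
    alice_basis = random.choice("ZX")
    bob_basis = random.choice("ZX")
    # For a |Phi+> pair, same-basis measurements are perfectly
    # correlated; we model only that ideal, noiseless case here.
    if alice_basis == bob_basis:
        bit = random.randint(0, 1)  # shared random outcome
        return alice_basis, bob_basis, bit, bit
    # Mismatched bases give uncorrelated bits; they will be discarded.
    return alice_basis, bob_basis, random.randint(0, 1), random.randint(0, 1)

# Sifting step: keep only rounds where the two bases happened to match.
rounds = [e91_round() for _ in range(1000)]
key_a = [a for ba, bb, a, b in rounds if ba == bb]
key_b = [b for ba, bb, a, b in rounds if ba == bb]

assert key_a == key_b  # sifted keys agree in the noiseless model
print(f"kept {len(key_a)} of {len(rounds)} rounds")
```

In a real protocol, a subset of the sifted bits is sacrificed to estimate the error rate; an eavesdropper necessarily disturbs the correlations, which is what makes the scheme secure in principle.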

Realizing the full potential of quantum entanglement requires overcoming significant challenges. Scientists are actively working on improving the fidelity and distance over which entanglement can be maintained. Additionally, mitigating the effects of environmental factors that can disrupt entanglement is crucial for practical applications.

As we continue to unravel the mysteries of quantum entanglement, we stand on the brink of a technological revolution. The characteristics that once perplexed Einstein may soon power the next generation of computing, communication, and sensing technologies. Embracing "spooky action at a distance" opens the door to a future where the unimaginable becomes achievable, and the quantum world becomes a playground for innovation and discovery.

Read the rest here:

Unveiling the Mysteries of Quantum Entanglement: Harnessing the Power of Spooky Action for the Future | by Tariq ... - Medium

Read More..

Radical theory could unite Albert Einstein’s concept of gravity with quantum mechanics – Study Finds

LONDON: Albert Einstein once said, "Logic will get you from A to B. Imagination will take you everywhere." Now, thanks to the creative power of University College London physicists, a new theory is challenging the longstanding discrepancy between Einstein's theory of general relativity and quantum theory. This new theory reconciles the concepts of gravity and quantum mechanics, while still holding onto the famous physicist's understanding of spacetime.

At the core of modern physics lie these two fundamental pillars: quantum theory, governing the behavior of the tiniest particles, and Einstein's theory of general relativity, explaining gravity through the bending of spacetime. However, these theories clash, presenting a challenge that has persisted for over a century.

Scientists traditionally believed that reconciling these theories required modifying Einstein's theory to fit within quantum theory, as attempted by leading theories like string theory and loop quantum gravity. However, the new theory is challenging this consensus.

In the theory, termed a "postquantum theory of classical gravity," it is quantum theory rather than spacetime that undergoes modification. This suggests that spacetime might not be governed by quantum theory at all. Instead, the theory predicts unpredictable fluctuations in spacetime larger than those predicted by quantum theory, causing objects' apparent weights to become unpredictable under precise measurements.

A companion paper explores the theory's implications and proposes an experiment to test it. This experiment involves precisely measuring the weight of a one-kilogram mass over time at the International Bureau of Weights and Measures in France. If fluctuations in measurements surpass expected limits, it could refute the theory.
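The proposed weighing test boils down to a statistics question: do repeated measurements of a fixed mass fluctuate more than instrument noise alone allows? A toy version of that comparison, with entirely made-up noise figures, might look like this:

```python
import random
import statistics

random.seed(0)

# Illustrative numbers only: repeated weighings of a 1 kg mass with
# Gaussian instrument noise (sigma_meas) plus a hypothetical extra
# spacetime-fluctuation noise (sigma_extra; zero means no new physics).
sigma_meas = 1e-9   # kg, assumed instrument noise
sigma_extra = 0.0   # kg, set > 0 to model the predicted excess
n = 10000

readings = [1.0 + random.gauss(0, sigma_meas) + random.gauss(0, sigma_extra)
            for _ in range(n)]

observed_var = statistics.pvariance(readings)
expected_var = sigma_meas ** 2  # variance explained by instrument noise alone

# A crude test: an excess is flagged only if the observed variance
# clearly exceeds what instrument noise by itself can account for.
excess = observed_var / expected_var
print(f"variance ratio: {excess:.2f}")  # ~1.0 here, since sigma_extra = 0
```

A real analysis would of course model drifts and systematics far more carefully; the point is only that the theory stands or falls on a measurable variance excess.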

For five years, the UCL research group has scrutinized and tested this theory.

"Quantum theory and Einstein's theory of general relativity are mathematically incompatible with each other, so it's important to understand how this contradiction is resolved. Should spacetime be quantized, or should we modify quantum theory, or is it something else entirely?" questions study author Jonathan Oppenheim, professor of physics and astronomy at UCL, in a university release. "Now that we have a consistent fundamental theory in which spacetime does not get quantized, it's anybody's guess."

Study co-author Zach Weller-Davies, who as a PhD student at UCL helped develop the experimental proposal and made key contributions to the theory itself, emphasized the experimental implications.

"In both quantum gravity and classical gravity, spacetime must be undergoing violent and random fluctuations all around us, but on a scale which we haven't yet been able to detect," explains Weller-Davies. "But if spacetime is classical, the fluctuations have to be larger than a certain scale, and this scale can be determined by another experiment where we test how long we can put a heavy atom in superposition of being in two different locations."

Study co-author Dr. Barbara Šoda expressed hope that these experiments could settle the debate about pursuing a quantum theory of gravity.

"Because gravity is made manifest through the bending of space and time, we can think of the question in terms of whether the rate at which time flows has a quantum nature, or classical nature," notes Dr. Šoda. "And testing this is almost as simple as testing whether the weight of a mass is constant, or appears to fluctuate in a particular way."

The theorys origins lie in resolving the black hole information problem, allowing for information destruction due to a breakdown in predictability, a departure from standard quantum theory.

This novel theory not only challenges current paradigms but also offers new perspectives for experimental testing, potentially reshaping our understanding of gravity and spacetime.

The study is published in the journal Physical Review X.

Go here to read the rest:

Radical theory could unite Albert Einstein's concept of gravity with quantum mechanics - Study Finds

Read More..

IBM Offers Quantum Error Suppression Out Of The Box – IEEE Spectrum

The error-prone nature of today's quantum computers can make doing any useful computation on them a major headache. IBM has announced that, as of this past week, it has integrated error suppression technology from Q-CTRL into IBM cloud quantum services, letting users slash error rates by simply flicking a switch.

Computers that harness the unusual properties of quantum mechanics will ultimately be capable of computational feats beyond even the most powerful supercomputers. But the quantum states that make this possible are incredibly fragile and susceptible to noise, which means carrying out operations before they are overwhelmed by errors is a significant challenge.


It's widely accepted that large-scale quantum computers will require some form of error correction. The leading schemes involve spreading information over a large number of physical qubits to create more robust logical qubits. But this can require as many as a thousand physical qubits for each logical one. Given that today's largest processors feature just hundreds of qubits, error-corrected quantum computing is still a distant prospect.

In the meantime, the start-up Q-CTRL, based in Sydney, Australia, says the best way to tame unruly, near-term quantum processors is error suppression, which involves altering how you operate the underlying hardware to reduce the likelihood of errors. Using a combination of techniques, the company says its software can boost the chances of an algorithm running successfully by several orders of magnitude. And now, IBM has integrated the technology into its quantum cloud offerings.

"It's a bit of a dirty secret in our sector that the typical user experience is very rarely at the limit of what the hardware could provide," says Q-CTRL CEO and founder Michael Biercuk. "That's because the performance is effectively buried by all the sources of noise and interference and error. We, through our performance management solution, suppress that error in a way that allows a user to immediately access just about the best the hardware can theoretically deliver."

The company's software requires no configuration by IBM's end users. Customers accessing Big Blue's quantum hardware over the cloud via its Pay-As-You-Go plan will simply see a performance management option that can be toggled on and off. Flicking the switch will engage an automated set of several different software modules that run in the background to optimize the way the user's algorithm runs on the hardware.

According to Biercuk, Q-CTRL's quantum compiler mathematically optimizes the number of logic gates required to run an algorithm before subjecting this minimal circuit to several further error-suppressing steps. For a start, the gates are mapped onto the hardware's qubits in such a way as to avoid the most error-prone layouts, based on a catalog of pre-performed test measurements. The circuit is also interleaved with operations that use a technique known as dynamical decoupling, in which qubits are subjected to control pulses designed to cancel out the effects of cross-talk from other nearby qubits.
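Gate-level dynamical decoupling can be sketched as follows. This is not Q-CTRL's actual pulse-level implementation; it is a toy illustration in which idle windows in a circuit schedule are padded with pairs of X gates that compose to the identity but echo away slow dephasing noise.

```python
# A toy circuit is a list of per-timestep gate assignments, one entry
# per qubit; None marks an idle qubit at that timestep.

def insert_dd(schedule):
    """Pad idle (None) slots with X pulses, always in pairs per qubit,
    so the inserted gates multiply to the identity."""
    out = [row[:] for row in schedule]  # copy each timestep
    n_qubits = len(schedule[0])
    for q in range(n_qubits):
        idle = [t for t, row in enumerate(out) if row[q] is None]
        # Pair up idle slots; an odd leftover slot stays idle so the
        # net effect on the logical circuit is unchanged.
        for i in range(0, len(idle) - 1, 2):
            out[idle[i]][q] = "X"
            out[idle[i + 1]][q] = "X"
    return out

# Two qubits, four timesteps: H on qubit 0, then a CNOT, with idle gaps.
schedule = [["H", None], [None, None], ["CX0", "CX1"], [None, None]]
padded = insert_dd(schedule)
print(padded)
```

Real decoupling sequences (CPMG, XY4, and so on) choose pulse timing and axes much more carefully, but the invariant is the same: the inserted pulses must cancel in the ideal circuit while refocusing environmental noise.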

Separately, Biercuk says, AI designed by Q-CTRL regularly redefines the machine language used to implement circuits on the hardware. Every six to twelve hours, he says, the AI runs through multiple test circuits to check how various potential gate implementations contribute to errors. The results are then compiled in a lookup table that the compiler uses to build the circuit.


Finally, once the circuit has run, the software carries out a final post-processing step on the results designed to catch measurement errors. "One of the ways that quantum computers fail is that the algorithm actually runs correctly, but when you look at the answer to read it out, that process is faulty," says Biercuk, who is also a professor of quantum physics at The University of Sydney. Q-CTRL's software uses a neural network to learn patterns in the way measurement errors occur in the hardware. The system then uses these patterns to offset any errors in the readout.
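Q-CTRL's production system reportedly uses a neural network for this step; a simpler, widely used stand-in is confusion-matrix readout mitigation, sketched below with made-up calibration numbers.

```python
import numpy as np

# Hypothetical single-qubit calibration: a confusion matrix M where
# M[i, j] = P(measure i | prepared j). Here 2% of prepared 0s read as
# 1, and 5% of prepared 1s read as 0 -- illustrative values only.
M = np.array([[0.98, 0.05],
              [0.02, 0.95]])

# Observed outcome frequencies from the noisy readout.
p_observed = np.array([0.53, 0.47])

# Mitigated estimate of the true distribution: invert the calibrated
# error model, then clip and renormalize, since a plain inverse can
# leave small negative entries on real data.
p_true = np.linalg.solve(M, p_observed)
p_true = np.clip(p_true, 0, None)
p_true /= p_true.sum()
print(p_true.round(4))  # [0.5161 0.4839]
```

The same idea scales to multi-qubit readout, though the matrix grows exponentially, which is one motivation for learned models like the neural network the article describes.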

With all these error-mitigation efforts working in concert, these different modules boost the chances that an algorithm will run successfully, which is not a surefire thing when it comes to quantum computers. In peer-reviewed research published in Physical Review Applied in August, the company tested its error suppression technology on several popular quantum algorithms and showed that it could boost performance by as much as 1,000 times. (The quantum circuit Q-CTRL tested its error-correction protocols on in the August paper was more limited in size. However, contacted by Spectrum about the discrepancy between the size of the quantum circuits described in the graph above and those depicted in the August paper, Biercuk emailed back, "It wasn't published in the previous paper, because at the time those larger devices didn't exist yet.")

The company's software works with any kind of quantum computing hardware, says Biercuk, be that trapped ions, superconducting qubits, or cold atoms. And while configuring the software for a particular processor takes some time and effort, there is no extra computational cost for the user at runtime.

Biercuk thinks an out-of-the-box solution for error suppression could be a boon for many of the companies interested in quantum computing. Typically, they have focused on algorithm development, but error-suppression requires expertise in low-level hardware manipulation.

"It's a little bit like asking a web developer who's building Salesforce or Facebook, can you please start programming at the level of the voltages on the transistors," says Biercuk. "We break the link between the underlying quantum processor and the way it has to be manipulated, and what the application-focused end-user programs. That's a huge change."

Quantum hardware is still some way from being able to compete with classical computers on practical problems, admits Biercuk, but a number of companies have been testing out the technology ahead of the integration with IBM.

"We have previously explored Q-CTRL's performance management capabilities and were impressed by the order-of-magnitude improvement seen across both the inverse quantum Fourier transform and quantum phase estimation," Julian van Velzen, CTIO and Head of Quantum Lab at Capgemini in Utrecht, the Netherlands, said in a statement. "With this technology natively embedded within IBM Quantum services, we can get more value from current hardware and push our applications further."


Go here to read the rest:

IBM Offers Quantum Error Suppression Out Of The Box - IEEE Spectrum

Read More..

Superconductors’ Secret: Old Physics Law Stands the Test of Time in Quantum Material Conundrum – SciTechDaily

A new study argues that the Wiedemann-Franz law, linking electronic and thermal conductivity, is still valid for copper oxide superconductors. The research suggests that discrepancies in quantum materials stem from non-electronic factors like lattice vibrations. This finding is significant for understanding unconventional superconductors and may lead to advancements in the field.

This surprising result is important for understanding unconventional superconductors and other materials where electrons band together to act collectively.

Long before researchers discovered the electron and its role in generating electrical current, they knew about electricity and were exploring its potential. One thing they learned early on was that metals were great conductors of both electricity and heat.

In 1853, two scientists showed that those two admirable properties of metals were somehow related: At any given temperature, the ratio of electronic conductivity to thermal conductivity was roughly the same in any metal they tested. This so-called Wiedemann-Franz law has held ever since, except in quantum materials, where electrons stop behaving as individual particles and glom together into a sort of electron soup. Experimental measurements have indicated that the 170-year-old law breaks down in these quantum materials, and by quite a bit.
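For reference, the law states that the ratio of thermal to electrical conductivity, divided by temperature, equals a universal Lorenz number. A quick numerical check of the free-electron value:

```python
import math

# Wiedemann-Franz law: kappa / (sigma * T) = L, the Lorenz number,
# with L = (pi^2 / 3) * (k_B / e)^2 for free-electron-like metals.
k_B = 1.380649e-23   # Boltzmann constant, J/K
e = 1.602176634e-19  # elementary charge, C

L = (math.pi ** 2 / 3) * (k_B / e) ** 2
print(f"Lorenz number: {L:.4e} W Ohm / K^2")  # ~2.44e-8

# Example: predicted electronic thermal conductivity of copper at
# 300 K from its electrical conductivity (~5.96e7 S/m) -- close to
# copper's measured ~400 W/(m K), since copper is a good free-electron
# metal.
sigma, T = 5.96e7, 300.0
kappa = L * sigma * T
print(f"kappa ~ {kappa:.0f} W/(m K)")
```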

An illustration shows strongly interacting electrons carrying heat and charge from warmer to cooler regions of a quantum material. A theoretical study by SLAC, Stanford, and the University of Illinois found that the ratio of heat transport to charge transport in cuprates, quantum materials like this one where electrons glom together and act cooperatively, should be similar to the ratio in normal metals, where electrons behave as individuals. This surprising result overturns the idea that the 170-year-old Wiedemann-Franz law does not apply to quantum materials. Credit: Greg Stewart/SLAC National Accelerator Laboratory

Now, a theoretical argument put forth by physicists at the Department of Energy's SLAC National Accelerator Laboratory, Stanford University and the University of Illinois suggests that the law should, in fact, approximately hold for one type of quantum material: the copper oxide superconductors, or cuprates, which conduct electricity with no loss at relatively high temperatures.

In a paper published in the journal Science on November 30, they propose that the Wiedemann-Franz law should still roughly hold if one considers only the electrons in cuprates. They suggest that other factors, such as vibrations in the material's atomic latticework, must account for experimental results that make it look like the law does not apply.

This surprising result is important to understanding unconventional superconductors and other quantum materials, said Wen Wang, lead author of the paper and a PhD student with the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC.

"The original law was developed for materials where electrons interact with each other weakly and behave like little balls that bounce off defects in the material's lattice," Wang said. "We wanted to test the law theoretically in systems where neither of these things was true."

Superconducting materials, which carry electric current without resistance, were discovered in 1911. But they operated at such extremely low temperatures that their usefulness was quite limited.

That changed in 1986, when the first family of so-called high-temperature or unconventional superconductors the cuprates was discovered. Although cuprates still require extremely cold conditions to work their magic, their discovery raised hopes that superconductors could someday work at much closer to room temperature making revolutionary technologies like no-loss power lines possible.

After nearly four decades of research, that goal is still elusive, although a lot of progress has been made in understanding the conditions in which superconducting states flip in and out of existence.

Theoretical studies, performed with the help of powerful supercomputers, have been essential for interpreting the results of experiments on these materials and for understanding and predicting phenomena that are out of experimental reach.

For this study, the SIMES team ran simulations based on what's known as the Hubbard model, which has become an essential tool for simulating and describing systems where electrons stop acting independently and join forces to produce unexpected phenomena.
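A minimal, self-contained illustration of the Hubbard model (nothing like the large-scale simulations the SIMES team ran): two sites and two electrons can be diagonalized exactly, and the numerical ground-state energy matches the textbook closed form. Basis-state phases are chosen so the Hamiltonian matrix is symmetric.

```python
import numpy as np

# Two-site Hubbard model at half filling: hopping t, on-site
# repulsion U. In the S_z = 0 sector the basis is
# {|up,dn>, |dn,up>, |updn,0>, |0,updn>}, giving a 4x4 Hamiltonian.
t, U = 1.0, 4.0
H = np.array([[0.0, 0.0, -t, -t],
              [0.0, 0.0,  t,  t],
              [ -t,   t,  U, 0.0],
              [ -t,   t, 0.0,  U]])

E0 = np.linalg.eigvalsh(H).min()
exact = (U - np.sqrt(U ** 2 + 16 * t ** 2)) / 2  # known closed form
print(E0, exact)  # both ~ -0.8284
```

The doubly occupied states cost energy U, so at large U the ground state is dominated by singly occupied sites: a two-site caricature of the Mott physics the paper's title refers to.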

"The results show that when you only take electron transport into account, the ratio of electronic conductivity to thermal conductivity approaches what the Wiedemann-Franz law predicts," Wang said. "So, the discrepancies that have been seen in experiments should be coming from other things like phonons, or lattice vibrations, that are not in the Hubbard model."

SIMES staff scientist and paper co-author Brian Moritz said that although the study did not investigate how vibrations cause the discrepancies, "somehow the system still knows that there is this correspondence between charge and heat transport amongst the electrons. That was the most surprising result."

"From here," he added, "maybe we can peel the onion to understand a little bit more."

Reference: "The Wiedemann-Franz law in doped Mott insulators without quasiparticles" by Wen O. Wang, Jixun K. Ding, Yoni Schattner, Edwin W. Huang, Brian Moritz and Thomas P. Devereaux, 30 November 2023, Science. DOI: 10.1126/science.ade3232

Major funding for this study came from the DOE Office of Science. Computational work was carried out at Stanford University and on resources of the National Energy Research Scientific Computing Center, which is a DOE Office of Science user facility.

Read the original:

Superconductors' Secret: Old Physics Law Stands the Test of Time in Quantum Material Conundrum - SciTechDaily

Read More..

IBM Quantum Debuts 133-Qubit Processor and Quantum System Two – High-Performance Computing News Analysis – insideHPC

Today, at its annual IBM Quantum Summit in New York, the company debuted the 133-qubit Quantum Heron, the first in what IBM said will be a series of utility-scale quantum processors whose architecture has been engineered over the past four years. IBM said the processor "deliver(s) IBM's highest performance metrics and lowest error rates of any IBM Quantum processor to date."

IBM also unveiled IBM Quantum System Two, a modular quantum computer and part of IBM's quantum-centric supercomputing architecture. The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

The consensus among industry analysts is that in its current stage of development, quantum has attracted attention from organizations exploring its potential to solve real-world workloads. Touting demonstrations of its 127-qubit Quantum Eagle processor earlier this year, IBM said more organizations are looking at quantum as a scientific tool for utility-scale classes of problems in chemistry, physics, and materials beyond brute-force classical simulation of quantum mechanics. Those organizations include the U.S. Department of Energy's Argonne National Lab, the universities of Tokyo, Washington, Cologne, Harvard, Qedma, Algorithmiq, UC Berkeley, Q-CTRL, Fundacion Ikerbasque, Donostia International Physics Center, and the University of the Basque Country, as well as IBM.

IBM Quantum System Two

This includes experiments already running on the new IBM Quantum Heron, which IBM is making available for users today via the cloud. IBM said Heron has significantly improved error rates, offering a five-times improvement over the previous best records set by IBM Eagle.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM SVP and Director of Research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems."

The company said the System Two is the foundation of IBM's next-generation quantum system architecture, combining scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics. The architecture brings together quantum communication and computation, assisted by classical computing resources, and leverages a middleware layer to appropriately integrate quantum and classical workflows, according to the company.

IBM also announced an update to its 10-year quantum development roadmap, in which Quantum System Two will house IBM's future generations of quantum processors, adding that these future processors are intended to gradually improve the quality of operations they can run to significantly extend the complexity and size of workloads they are capable of handling.

The company also released details for a new generation of its software stack, within which Qiskit 1.0 will include Qiskit Patterns, designed to serve as a mechanism to allow quantum developers to more easily create code. It is based on a collection of tools intended to map classical problems, optimize them to quantum circuits using Qiskit, execute those circuits using Qiskit Runtime, and then postprocess the results.

With Qiskit Patterns, combined with Quantum Serverless, users will be able to build and deploy workflows integrating classical and quantum computation in different environments, such as cloud or on-prem scenarios, according to IBM.
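The map-optimize-execute-postprocess workflow described above can be sketched with plain-Python placeholders. None of these functions are real Qiskit APIs; they are hypothetical stand-ins that only show the shape of the four-stage pattern.

```python
# Hypothetical stand-ins for the four Qiskit Patterns stages: map a
# classical problem to a circuit, optimize it, execute it, and
# postprocess the raw counts back into a classical answer.

def map_problem(problem):
    """Map a classical problem to an abstract circuit description."""
    return {"gates": [("h", 0), ("cx", 0, 1)], "measure": [0, 1]}

def optimize(circuit):
    """Transpile/optimize the circuit for the target hardware."""
    circuit["optimized"] = True
    return circuit

def execute(circuit, shots=1024):
    """Run on a backend; here, fake counts for a Bell-state circuit."""
    return {"00": shots // 2, "11": shots - shots // 2}

def postprocess(counts):
    """Turn raw counts into outcome probabilities for the caller."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

result = postprocess(execute(optimize(map_problem("toy problem"))))
print(result)  # {'00': 0.5, '11': 0.5}
```

The value of fixing this interface is that each stage can run in a different environment (cloud, on-prem, or serverless), which is exactly the deployment flexibility the article describes.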

IBM also said it is developing the use of generative AI for quantum code programming through watsonx, IBM's enterprise AI platform, achieved through the fine-tuning of the IBM Granite model series.

"Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration," said Jay Gambetta, Vice President and IBM Fellow at IBM. "This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration."

The rest is here:

IBM Quantum Debuts 133-Qubit Processor and Quantum System Two - High-Performance Computing News Analysis - insideHPC

Read More..

Unlocking the Quantum Realm: A New Tool for Uncharted Phenomena – SciTechDaily

The temperature profiles obtained by the researchers show that particles that interact strongly with the environment are hot (red) and those that interact little are cold (blue). Entanglement is therefore large where the interaction between particles is strong. Credit: Helene Hainzer

Predictions of quantum field theory experimentally confirmed for the first time.

Entanglement is a quantum phenomenon where the properties of two or more particles become interconnected in such a way that one cannot assign a definite state to each individual particle anymore. Rather, we have to consider all particles at once that share a certain state. The entanglement of the particles ultimately determines the properties of a material.

"Entanglement of many particles is the feature that makes the difference," emphasizes Christian Kokail, one of the first authors of the paper now published in Nature. "At the same time, however, it is very difficult to determine."

The researchers led by Peter Zoller at the University of Innsbruck and the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences (ÖAW) now provide a new approach that can significantly improve the study and understanding of entanglement in quantum materials. In order to describe large quantum systems and extract information from them about the existing entanglement, one would naively need to perform an impossibly large number of measurements.

"We have developed a more efficient description that allows us to extract entanglement information from the system with drastically fewer measurements," explains theoretical physicist Rick van Bijnen.

In an ion trap quantum simulator with 51 particles, the scientists have imitated a real material by recreating it particle by particle and studying it in a controlled laboratory environment. Few research groups worldwide can control so many particles as precisely as the Innsbruck experimental physicists led by Christian Roos and Rainer Blatt.

"The main technical challenge we face here is how to maintain low error rates while controlling 51 ions trapped in our trap and ensuring the feasibility of individual qubit control and readout," explains experimentalist Manoj Joshi. In the process, the scientists witnessed for the first time effects in the experiment that had previously only been described theoretically.

"Here we have combined knowledge and methods that we have painstakingly worked out together over the past years. It's impressive to see that you can do these things with the resources available today," says an excited Christian Kokail, who recently joined the Institute for Theoretical Atomic Molecular and Optical Physics at Harvard.

In a quantum material, particles can be more or less strongly entangled. Measurements on a strongly entangled particle yield only random results. If the results of the measurements fluctuate very much, i.e., if they are purely random, then scientists refer to this as "hot." If the probability of a certain result increases, it is a "cold" quantum object. Only the measurement of all entangled objects reveals the exact state.

In systems consisting of very many particles, the effort for the measurement increases enormously. Quantum field theory has predicted that subregions of a system of many entangled particles can be assigned a temperature profile. These profiles can be used to derive the degree of entanglement of the particles.

In the Innsbruck quantum simulator, these temperature profiles are determined via a feedback loop between a computer and the quantum system, with the computer constantly generating new profiles and comparing them with the actual measurements in the experiment. The temperature profiles obtained by the researchers show that particles that interact strongly with the environment are hot and those that interact little are cold. "This is exactly in line with expectations that entanglement is particularly large where the interaction between particles is strong," says Christian Kokail.

"The methods we have developed provide a powerful tool for studying large-scale entanglement in correlated quantum matter. This opens the door to the study of a new class of physical phenomena with quantum simulators that already are available today," says quantum mastermind Peter Zoller. With classical computers, such simulations can no longer be computed with reasonable effort.

The methods developed in Innsbruck will also be used to test new theories on such platforms.

The results have been published in Nature.

Reference: "Exploring large-scale entanglement in quantum simulation" by Manoj K. Joshi, Christian Kokail, Rick van Bijnen, Florian Kranzl, Torsten V. Zache, Rainer Blatt, Christian F. Roos and Peter Zoller, 29 November 2023, Nature. DOI: 10.1038/s41586-023-06768-0

Financial support for the research was provided by the Austrian Science Fund FWF, the Austrian Research Promotion Agency FFG, the European Union, the Federation of Austrian Industries Tyrol and others.

Originally posted here:

Unlocking the Quantum Realm: A New Tool for Uncharted Phenomena - SciTechDaily

Read More..

IBM Unveils New Series of Utility-Scale Quantum Processors – The Fast Mode

At the annual IBM Quantum Summit in New York, IBM debuted 'IBM Quantum Heron,' the first in a new series of utility-scale quantum processors with an architecture engineered over the past four years to deliver IBM's highest performance metrics and lowest error rates of any IBM Quantum processor to date.

IBM also unveiled IBM Quantum System Two, the company's first modular quantum computer and cornerstone of IBM's quantum-centric supercomputing architecture. The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

With this critical foundation now in place, along with other breakthroughs in quantum hardware, theory, and software, the company is extending its IBM Quantum Development Roadmap to 2033 with new targets to significantly advance the quality of gate operations.

As demonstrated by IBM earlier this year on a 127-qubit 'IBM Quantum Eagle' processor, IBM Quantum systems can now serve as a scientific tool to explore utility-scale classes of problems in chemistry, physics, and materials beyond brute force classical simulation of quantum mechanics.

IBM is also detailing plans for a new generation of its software stack, within which Qiskit 1.0 will be a pivot point defined by stability and speed. Additionally, and with the goal of democratizing quantum computing development, IBM is announcing Qiskit Patterns.

Dario Gil, IBM SVP and Director of Research

We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science. As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems.

Jay Gambetta, Vice President and IBM Fellow at IBM

Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration. This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration.


Superconducting nanowires detect single protein ions – Phys.org


An international research team led by quantum physicist Markus Arndt (University of Vienna) has achieved a breakthrough in the detection of protein ions: Due to their high energy sensitivity, superconducting nanowire detectors achieve almost 100% quantum efficiency and exceed the detection efficiency of conventional ion detectors at low energies by a factor of up to 1,000.

In contrast to conventional detectors, they can also distinguish macromolecules by their impact energy. This allows for more sensitive detection of proteins and it provides additional information in mass spectrometry. The results of this study were recently published in the journal Science Advances.

The detection, identification, and analysis of macromolecules is of interest in many areas of the life sciences, including protein research, diagnostics, and analytics. Mass spectrometry is often used as a detection system, a method that typically separates charged particles (ions) according to their mass-to-charge ratio and measures the intensity of the signals generated by a detector. This provides information about the relative abundance of the different types of ions and therefore the composition of the sample.
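The separate-by-m/z-then-count logic described above can be sketched in a few lines of Python. All masses, charges, and counts below are invented for illustration, not measured data:

```python
# Toy mass-spectrometry bookkeeping: group detector counts by
# mass-to-charge ratio (m/z) and report relative abundances.
# All numbers below are illustrative, not real measurements.

def relative_abundance(ions):
    """ions: list of (mass_in_daltons, charge, detector_counts) tuples."""
    by_mz = {}
    for mass, charge, counts in ions:
        mz = round(mass / charge, 2)           # m/z is what the filter separates on
        by_mz[mz] = by_mz.get(mz, 0) + counts  # detector signal intensity per peak
    total = sum(by_mz.values())
    return {mz: counts / total for mz, counts in by_mz.items()}

# Hypothetical sample: one protein seen in two charge states, plus a heavier one.
spectrum = relative_abundance([
    (14300.0, 1, 600),   # lysozyme-like ion, z = 1
    (14300.0, 2, 200),   # same protein, z = 2 (appears as a different m/z peak)
    (66400.0, 1, 200),   # albumin-like ion, z = 1
])
for mz, frac in sorted(spectrum.items()):
    print(f"m/z {mz:>8}: {frac:.0%}")
```

The same molecule can appear at several m/z peaks (one per charge state), which is why the spectrum alone details composition only after the charge states are assigned.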

However, conventional detectors have only been able to achieve high detection efficiency and spatial resolution for particles with high impact energy, a limitation that has now been overcome by an international team of researchers using superconducting nanowire detectors.

In the current study, a European consortium coordinated by the University of Vienna, with partners in Delft (Single Quantum), Lausanne (EPFL), Almere (MSVision) and Basel (University), demonstrates for the first time the use of superconducting nanowires as excellent detectors for protein beams in so-called quadrupole mass spectrometry. Ions from the sample to be analyzed are fed into a quadrupole mass spectrometer where they are filtered.

"If we now use superconducting nanowires instead of conventional detectors, we can even identify particles that hit the detector with low kinetic energy," explains project leader Markus Arndt from the Quantum Nanophysics Group at the Faculty of Physics at the University of Vienna. This is made possible by a special material property (superconductivity) of the nanowire detectors.

The key to this detection method is that nanowires enter a superconducting state at very low temperatures, in which they lose their electrical resistance and allow lossless current flow. Excitation of the superconducting nanowires by incoming ions causes a return to the normal conducting state (quantum transition). The change in the electrical properties of the nanowires during this transition is interpreted as a detection signal.
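The detection mechanism described above amounts to a threshold trigger: an impact that deposits enough energy briefly drives the wire out of its zero-resistance state, and the resulting resistance pulse is counted. A minimal sketch of that logic, with a made-up threshold value rather than any real device parameter:

```python
# Toy model of a superconducting-nanowire detector: the wire sits in a
# zero-resistance (superconducting) state; an ion impact depositing more
# than a switching threshold drives it briefly normal, which registers
# as one resistance pulse, i.e. one count.

THRESHOLD_EV = 0.1  # illustrative switching threshold, not a real device value

def detect(impact_energies_ev, threshold_ev=THRESHOLD_EV):
    """Return (counts, detection_efficiency) for a list of impact energies."""
    counts = sum(1 for e in impact_energies_ev if e > threshold_ev)
    return counts, counts / len(impact_energies_ev)

# Low-energy ions that a conventional detector might miss entirely can
# still exceed the nanowire's tiny threshold.
counts, efficiency = detect([0.5, 0.3, 0.05, 1.2, 0.2])
print(counts, efficiency)  # 4 of 5 impacts trigger the wire
```

Because the pulse depends on the deposited energy, the same mechanism also underlies the energy classification mentioned below, not just yes/no counting.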

"With the nanowire detectors we use," says first author Marcel Strau, "we exploit the quantum transition from the superconducting to the normal conducting state and can thus outperform conventional ion detectors by up to three orders of magnitude."

Indeed, nanowire detectors have a remarkable quantum yield at exceptionally low impact energies, and redefine the possibilities of conventional detectors: "In addition, a mass spectrometer adapted with such a quantum sensor can not only distinguish molecules according to their mass-to-charge state, but also classify them according to their kinetic energy. This improves the detection and offers the possibility of better spatial resolution," says Marcel Strauß.

Nanowire detectors can find new applications in mass spectrometry, molecular spectroscopy, molecular deflectometry, or quantum interferometry of molecules, where high efficiency and good resolution are required, especially at low impact energy.

More information: Marcel Strauß et al., Highly sensitive single-molecule detection of macromolecule ion beams, Science Advances (2023). DOI: 10.1126/sciadv.adj2801

Journal information: Science Advances


Researchers Discover a Breakthrough in Protein Ion Detection – AZoNano

A multinational research team at the University of Vienna, under the direction of quantum physicist Markus Arndt, has made significant progress in the identification of protein ions. At low energies, superconducting nanowire detectors outperform conventional ion detectors in detection efficiency by a factor of up to 1,000 due to their high energy sensitivity, which allows for about 100% quantum efficiency.

Unlike traditional detectors, they are also capable of differentiating macromolecules based on their impact energy. This gives mass spectrometry extra information and enables more sensitive protein identification. The journal Science Advances published the study's findings.

Protein research, diagnostics, and analytics are just a few of the life-science fields in which the detection, identification, and analysis of macromolecules is of interest. Mass spectrometry is a widely used detection method that measures the strength of signals produced by a detector and typically separates charged particles (ions) based on their mass-to-charge ratio.

This details the relative abundance of the various ion types and thus the sample's composition. Superconducting nanowire detectors have allowed an international team of researchers to overcome the constraint that traditional detectors could only achieve high detection efficiency and spatial resolution for particles with high impact energy.

A European consortium led by the University of Vienna and including partners from Delft (Single Quantum), Lausanne (EPFL), Almere (MSVision), and Basel (University) demonstrates for the first time the potential of superconducting nanowires as superior detectors for protein beams in quadrupole mass spectrometry. A quadrupole mass spectrometer receives the ions from the sample to be examined and filters them.

If we now use superconducting nanowires instead of conventional detectors, we can even identify particles that hit the detector with low kinetic energy.

Markus Arndt, Project Leader, Quantum Nanophysics Group, Faculty of Physics, University of Vienna

Superconductivity, a unique material feature of the nanowire detectors, makes this feasible.

The secret to this detecting technique is that at extremely low temperatures, nanowires transition into a superconducting state where they lose their electrical resistance and permit lossless current flow.

A quantum transition occurs when incoming ions excite the superconducting nanowires, resulting in a return to the normal conducting state. During this transition, the nanowires' altered electrical characteristics are interpreted as a detection signal.

With the nanowire detectors we use, we exploit the quantum transition from the superconducting to the normal conducting state and can thus outperform conventional ion detectors by up to three orders of magnitude.

Marcel Strauß, Study First Author and Project Staff, University of Vienna

In fact, nanowire detectors redefine the capabilities of conventional detectors, offering a remarkable quantum yield at exceptionally low impact energies.

In addition, a mass spectrometer adapted with such a quantum sensor can not only distinguish molecules according to their mass-to-charge state, but also classify them according to their kinetic energy. This improves the detection and offers the possibility of better spatial resolution, Strauß added.

Nanowire detectors can find new uses in fields requiring high efficiency and good resolution, particularly at low impact energy, such as mass spectrometry, molecular spectroscopy, molecular deflectometry, or quantum interferometry of molecules.

Single Quantum heads the superconducting nanowire detector research, experts from EPFL-Lausanne provide the ultracold electronics, MSVision specializes in mass spectrometry, and University of Basel experts handle chemical synthesis and protein functionalization. The University of Vienna unites all the elements with its proficiency in superconductivity, molecular beams, and quantum optics.

The SuperMaMa project (860713), which aims to investigate superconducting detectors for mass spectrometry and molecular analysis, provided funding for the study. Funding for the study of the modified proteins was provided by the Gordon & Betty Moore Foundation (10771).

Strauß, M., et al. (2023) Highly sensitive single-molecule detection of macromolecule ion beams. Science Advances. doi:10.1126/sciadv.adj2801.

Source: http://www.univie.ac.at/


IBM Debuts Next-Generation Quantum Processor & IBM Quantum System Two, Extends Roadmap to Advance Era of … – IBM Newsroom

-University of Tokyo, Argonne National Laboratory, Fundacion Ikerbasque, Qedma, Algorithmiq, University of Washington, University of Cologne, Harvard University, UC Berkeley, Q-CTRL demonstrate new research to explore power of utility-scale quantum computing

-IBM Quantum Heron is released as IBM's most performant quantum processor in the world, with newly built architecture offering up to five-fold improvement in error reduction over IBM Quantum Eagle

-IBM Quantum System Two begins operation with three IBM Heron processors, designed to bring quantum-centric supercomputing to reality

-Expansion of IBM Quantum Development Roadmap for next ten years prioritizes improvements in gate operations to scale with quality towards advanced error-corrected systems

-Qiskit 1.0 announced, the world's most widely used open-source quantum programming software, with new features to help computational scientists execute quantum circuits with ease and speed

-IBM showcases generative AI models engineered to automate quantum code development with watsonx and optimize quantum circuits

Dec 4, 2023

At IBM Quantum Summit 2023, IBM Quantum System Two was debuted as the company's first modular quantum computer and cornerstone of IBM's quantum-centric supercomputing architecture. (Credit: Ryan Lavine for IBM)

NEW YORK, Dec. 4, 2023 /PRNewswire/ -- Today, at the annual IBM Quantum Summit in New York, IBM (NYSE: IBM) debuted 'IBM Quantum Heron,' the first in a new series of utility-scale quantum processors with an architecture engineered over the past four years to deliver IBM's highest performance metrics and lowest error rates of any IBM Quantum processor to date.

IBM also unveiled IBM Quantum System Two, the company's first modular quantum computer and cornerstone of IBM's quantum-centric supercomputing architecture. The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

At IBM Quantum Summit 2023, IBM Quantum Heron was released as IBM's best performing quantum processor to date, with newly built architecture offering up to five-fold improvement in error reduction. (Credit: Ryan Lavine for IBM)

With this critical foundation now in place, along with other breakthroughs in quantum hardware, theory, and software, the company is extending its IBM Quantum Development Roadmap to 2033 with new targets to significantly advance the quality of gate operations. Doing so would increase the size of quantum circuits able to be run and help to realize the full potential of quantum computing at scale.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM SVP and Director of Research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems."

As demonstrated by IBM earlier this year on a 127-qubit 'IBM Quantum Eagle' processor, IBM Quantum systems can now serve as a scientific tool to explore utility-scale classes of problems in chemistry, physics, and materials beyond brute force classical simulation of quantum mechanics.

Since that demonstration, leading researchers, scientists, and engineers from organizations including the U.S. Department of Energy's Argonne National Laboratory, the University of Tokyo, the University of Washington, the University of Cologne, Harvard University, Qedma, Algorithmiq, UC Berkeley, Q-CTRL, Fundacion Ikerbasque, Donostia International Physics Center, and the University of the Basque Country, as well as IBM, have expanded demonstrations of utility-scale quantum computing to confirm its value in exploring uncharted computational territory.

This includes experiments already running on the new IBM Quantum Heron 133-qubit processor, which IBM is making available for users today via the cloud. IBM Heron is the first in IBM's new class of performant processors with significantly improved error rates, offering a five-times improvement over the previous best records set by IBM Eagle. Additional IBM Heron processors will join IBM's industry-leading, utility-scale fleet of systems over the course of the next year.

IBM Quantum System Two and Extended IBM Quantum Development Roadmap

IBM Quantum System Two is the foundation of IBM's next generation quantum computing system architecture. It combines scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics. The new system is a building block for IBM's vision of quantum-centric supercomputing. This architecture combines quantum communication and computation, assisted by classical computing resources, and leverages a middleware layer to appropriately integrate quantum and classical workflows.

As part of the newly expanded ten-year IBM Quantum Development Roadmap, IBM plans for this system to also house IBM's future generations of quantum processors. Also, as part of this roadmap, these future processors are intended to gradually improve the quality of operations they can run to significantly extend the complexity and size of workloads they are capable of handling.

At IBM Quantum Summit 2023, the company extended the IBM Quantum Development Roadmap to 2033, and has established an IBM Quantum Innovation Roadmap through 2029. (Credit: IBM)

Qiskit and Generative AI to Increase Ease of Quantum Software Programming

Today, IBM is also detailing plans for a new generation of its software stack, within which Qiskit 1.0 will be a pivot point defined by stability and speed. Additionally, and with the goal of democratizing quantum computing development, IBM is announcing Qiskit Patterns.

Qiskit Patterns will serve as a mechanism to allow quantum developers to more easily create code. It is based on a collection of tools to simply map classical problems, optimize them to quantum circuits using Qiskit, execute those circuits using Qiskit Runtime, and then postprocess the results. With Qiskit Patterns, combined with Quantum Serverless, users will be able to build, deploy, and execute workflows integrating classical and quantum computation in different environments, such as cloud or on-prem scenarios. All of these tools will provide building blocks for users to build and run quantum algorithms more easily.
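The four stages named above (map, optimize, execute, postprocess) form a simple pipeline. The sketch below mimics only that pipeline shape in plain Python; every function in it is a hypothetical stand-in, not the actual Qiskit Patterns API, whose interfaces are not detailed in this announcement:

```python
# Stand-in for the four-stage Qiskit Patterns workflow:
# map -> optimize -> execute -> postprocess.
# All functions here are placeholders illustrating the data flow,
# not real Qiskit code.

def map_problem(problem):
    # Stage 1: translate a classical problem into an abstract "circuit".
    return {"gates": ["h", "cx", "noop"], "shots": problem["shots"]}

def optimize(circuit):
    # Stage 2: transpile/optimize the circuit for the target hardware
    # (here: drop do-nothing gates).
    circuit["gates"] = [g for g in circuit["gates"] if g != "noop"]
    return circuit

def execute(circuit):
    # Stage 3: run on a backend; here we fake a Bell-state counts dict.
    shots = circuit["shots"]
    return {"00": shots // 2, "11": shots - shots // 2}

def postprocess(counts):
    # Stage 4: turn raw counts back into a classical answer (probabilities).
    total = sum(counts.values())
    return {bits: n / total for bits, n in counts.items()}

result = postprocess(execute(optimize(map_problem({"shots": 1000}))))
print(result)  # {'00': 0.5, '11': 0.5}
```

The point of the pattern is that each stage has a single input and output, so the classical stages (1, 2, 4) can run locally or serverless while only stage 3 touches quantum hardware.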

Additionally, IBM is pioneering the use of generative AI for quantum code programming through watsonx, IBM's enterprise AI platform. IBM will integrate generative AI available through watsonx to help automate the development of quantum code for Qiskit. This will be achieved through the finetuning of the IBM Granite model series.

"Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration," said Jay Gambetta, Vice President and IBM Fellow at IBM. "This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration."

With advanced hardware across IBM's global fleet of 100+ qubit systems, as well as easy-to-use software that IBM is debuting in Qiskit, users and computational scientists can now obtain reliable results from quantum systems as they map increasingly larger and more complex problems to quantum circuits.

About IBM

IBM is a leading provider of global hybrid cloud and AI, and consulting expertise. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. More than 4,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and consulting deliver open and flexible options to our clients. All of this is backed by IBM's long-standing commitment to trust, transparency, responsibility, inclusivity and service.

Visit http://www.ibm.com for more information.

MEDIA CONTACTS

Erin Angelini, IBM Communications, edlehr@us.ibm.com

Hugh Collins, IBM Communications, hughdcollins@ibm.com

SOURCE IBM
