Category Archives: Quantum Computer

UBC electrical and computer engineering expert named Canada … – UBC Applied Science

Dr. Olivia Di Matteo, an assistant professor in UBC's department of electrical and computer engineering (ECE), has been awarded a Tier 2 Canada Research Chair (CRC) in Quantum Software and Algorithms. She is one of 15 UBC researchers appointed to new Canada Research Chairs.

The Canada Research Chairs Program (CRCP) is part of Canada's national strategy to be one of the world's top countries in research and development. Chairholders improve our depth of knowledge and quality of life, strengthen Canada's international competitiveness, and help train the next generation of highly skilled people through student supervision, teaching, and the coordination of other researchers' work.

Tier 2 Chairs are tenable for five years and renewable once. These are awarded to exceptional emerging researchers acknowledged by their peers as having the potential to lead in their field. New Tier 2 chairs receive a $20,000 annual stipend for research.

Di Matteo's work spans developing and implementing new methods for characterizing quantum systems, synthesizing quantum circuits, and applications of quantum computing in physics, as well as many contributions to PennyLane, an open-source quantum software framework that other researchers use for their own work.

She speaks about her work at ECE and future research plans (full interview).

My group works on quantum software and algorithms, so the day-to-day is a lot of programming. On the software side, one area of focus is developing tools for automating and improving quantum compilation, which is the pipeline that translates high-level algorithms into the language of quantum hardware. On the algorithms side, we are exploring the potential use of qutrits (instead of qubits) in quantum algorithms and working on some techniques for noise mitigation.
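To make the compilation step concrete, here is a minimal sketch of what a compilation pass does, using the PennyLane framework mentioned above. This is my own illustration rather than Dr. Di Matteo's code, and it assumes a recent PennyLane release in which the qml.compile transform can be applied directly to a QNode.

```python
# Illustrative only: a toy circuit with obvious redundancies, simplified by a
# compilation pipeline (by default, qml.compile cancels adjacent inverse gates
# and merges adjacent rotations, among other passes).
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(theta):
    qml.Hadamard(wires=0)
    qml.Hadamard(wires=0)      # redundant pair: should be cancelled
    qml.RZ(theta, wires=1)
    qml.RZ(theta, wires=1)     # adjacent rotations: should be merged into one RZ
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(1))

compiled = qml.compile(circuit)        # assumed transform-on-QNode usage
print(qml.draw(circuit)(0.5))          # original gate sequence
print(qml.draw(compiled)(0.5))         # simplified, hardware-friendlier sequence
```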

That there are problems that are hard even for quantum computers. There's a serious amount of hype around my field right now, and quantum computers are often presented as super-advanced machines that will solve every problem exponentially faster. There are definitely some specific (but important!) problems for which we expect this will be the case (once we overcome the major engineering hurdles of building them, of course). But there are classes of problems we believe will remain hard.

Lately, I've been diving into some applications of quantum computing to nuclear theory and particle physics, which has been really fun, since my training is actually as a physicist. The mapping of those problems to quantum algorithms in software through the compilation and optimization process is really interesting, and I'm hopeful that with some advances on the software front, we'll soon be able to leverage the hardware to solve more realistic problems.

I'm also thinking about how we can make quantum computing software more accessible (e.g. through better abstraction and helpful debugging tools), so that more people can use the technology in their own work.

Read more here:
UBC electrical and computer engineering expert named Canada ... - UBC Applied Science

Qubits Unleashed: NIST’s Toggle Switch and the Future of Quantum Computing – SciTechDaily

Scientists at NIST have introduced a toggle switch device for quantum computers that adjusts connections between qubits and a readout resonator. The device tackles challenges like noise and reprogramming limitations, paving the way for more flexible and accurate quantum computing.

The novel device could lead to more versatile quantum processors with clearer outputs.

What good is a powerful computer if you can't read its output? Or readily reprogram it to do different jobs? People who design quantum computers face these challenges, and a new device may make them easier to solve.

Introduced by a team of scientists at the National Institute of Standards and Technology (NIST), the device includes two superconducting quantum bits, or qubits, which are a quantum computer's analog to the logic bits in a classical computer's processing chip. The heart of this new strategy relies on a toggle switch device that connects the qubits to a circuit called a readout resonator that can read the output of the qubits' calculations.

This toggle switch can be flipped into different states to adjust the strength of the connections between the qubits and the readout resonator. When toggled off, all three elements are isolated from each other. When the switch is toggled on to connect the two qubits, they can interact and perform calculations. Once the calculations are complete, the toggle switch can connect either of the qubits to the readout resonator to retrieve the results.

Having a programmable toggle switch goes a long way toward reducing noise, a common problem in quantum computer circuits that makes it difficult for qubits to make calculations and show their results clearly.

This photo shows the central working region of the device. In the lower section, the three large rectangles (light blue) represent the two quantum bits, or qubits, at right and left and the resonator in the center. In the upper, magnified section, driving microwaves through the antenna (large dark-blue rectangle at bottom) induces a magnetic field in the SQUID loop (smaller white square at center, whose sides are about 20 micrometers long). The magnetic field activates the toggle switch. The microwaves' frequency and magnitude determine the switch's position and the strength of connection among the qubits and resonator. Credit: R. Simmonds / NIST

"The goal is to keep the qubits happy so that they can calculate without distractions, while still being able to read them out when we want to," said Ray Simmonds, a NIST physicist and one of the paper's authors. "This device architecture helps protect the qubits and promises to improve our ability to make the high-fidelity measurements required to build quantum information processors out of qubits."

The team, which also includes scientists from the University of Massachusetts Lowell, the University of Colorado Boulder, and Raytheon BBN Technologies, describes its results in a paper published recently in the journal Nature Physics.

Quantum computers, which are still at a nascent stage of development, would harness the bizarre properties of quantum mechanics to do jobs that even our most powerful classical computers find intractable, such as aiding in the development of new drugs by performing sophisticated simulations of chemical interactions.

However, quantum computer designers still confront many problems. One of these is that quantum circuits are kicked around by external or even internal noise, which arises from defects in the materials used to make the computers. This noise is essentially random behavior that can create errors in qubit calculations.

Present-day qubits are inherently noisy by themselves, but that's not the only problem. Many quantum computer designs have what is called a static architecture, where each qubit in the processor is physically connected to its neighbors and to its readout resonator. The fabricated wiring that connects qubits together and to their readout can expose them to even more noise.

Such static architectures have another disadvantage: They cannot be reprogrammed easily. A static architecture's qubits could do a few related jobs, but for the computer to perform a wider range of tasks, it would need to swap in a different processor design with a different qubit organization or layout. (Imagine changing the chip in your laptop every time you needed to use a different piece of software, and then consider that the chip needs to be kept a smidgen above absolute zero, and you get why this might prove inconvenient.)

The team's programmable toggle switch sidesteps both of these problems. First, it prevents circuit noise from creeping into the system through the readout resonator and prevents the qubits from having a conversation with each other when they are supposed to be quiet.

"This cuts down on a key source of noise in a quantum computer," Simmonds said.

Second, the opening and closing of the switches between elements are controlled with a train of microwave pulses sent from a distance, rather than through a static architecture's physical connections. Integrating more of these toggle switches could be the basis of a more easily programmable quantum computer. The microwave pulses can also set the order and sequence of logic operations, meaning a chip built with many of the team's toggle switches could be instructed to perform any number of tasks.

"This makes the chip programmable," Simmonds said. "Rather than having a completely fixed architecture on the chip, you can make changes via software."

One final benefit is that the toggle switch can also turn on the measurement of both qubits at the same time. This ability to ask both qubits to reveal themselves as a couple is important for tracking down quantum computational errors.

The qubits in this demonstration, as well as the toggle switch and the readout circuit, were all made of superconducting components that conduct electricity without resistance and must be operated at very cold temperatures. The toggle switch itself is made from a superconducting quantum interference device, or SQUID, which is very sensitive to magnetic fields passing through its loop. Driving a microwave current through a nearby antenna loop can induce interactions between the qubits and the readout resonator when needed.

At this point, the team has only worked with two qubits and a single readout resonator, but Simmonds said they are preparing a design with three qubits and a readout resonator, and they have plans to add more qubits and resonators as well. Further research could offer insights into how to string many of these devices together, potentially offering a way to construct a powerful quantum computer with enough qubits to solve the kinds of problems that, for now, are insurmountable.

Reference: "Strong parametric dispersive shifts in a statically decoupled two-qubit cavity QED system" by T. Noh, Z. Xiao, X. Y. Jin, K. Cicak, E. Doucet, J. Aumentado, L. C. G. Govia, L. Ranzani, A. Kamal and R. W. Simmonds, 26 June 2023, Nature Physics. DOI: 10.1038/s41567-023-02107-2

Read this article:
Qubits Unleashed: NIST's Toggle Switch and the Future of Quantum Computing - SciTechDaily

How Will Quantum Computers Change The World? – IFLScience

Quantum computers are the next step in computation. These devices can harness the peculiarities of quantum mechanics to dramatically boost the power of computers. Not even the most powerful supercomputer can compete with this approach. But to deliver on that incredible potential, the road ahead remains long.

Still, in the last few years, big steps have been taken, with simple quantum processors coming online. New breakthroughs have shown solutions to the major challenges in the discipline. The road is still long, but now we can see several opportunities along the way. For The Big Questions, IFLScience's podcast, we spoke to Professor Winfried Hensinger, Professor of Quantum Technology at the University of Sussex and the Chief Scientific Officer for Universal Quantum, about the impact these devices will have.

Among experts, what's the current timeline they envision for quantum computers?

Winfried Hensinger: People always ask me: when are we going to have a useful quantum computer? I am always going to reply with the same answer. I am going to ask them: when do you think we had the first useful conventional computer? Some people say in the 60s, but the really clever people look to history. It was 1945. In 1945, the English Army decided the Second World War by building the first computer that could break the German Enigma code and that was arguably a reason why they could win this world war. So, you can see, in 1945 we had the first high-impact application of a conventional computer.

We really have to qualify that question, "For what applications?" For calculating projectiles or for breaking encryptions, we had a classical computer in 1945. For me to do my word processing or to get a ticket at the train station, certainly not even in the 70s. So, the same thing applies to quantum computers. In the next five or 10 years we're going to see one first useful application for a quantum computer, and that might be a really high-impact application that will change certainly everything, maybe in one particular industry sector.

What will happen once the potential has been demonstrated?

WH: Then we're going to build more powerful quantum computers, and not just that. We also work on the algorithms, and the software, because that is equally important for a quantum computer. The way quantum computers work is by making use of these very strange quantum phenomena and in order to really fully capitalize on these phenomena the software has to work in one particular way.

For every problem we want to solve with a quantum computer you don't just have to have a quantum computer, you also have to write software that you also need to develop. So, what we are going to see over the next five or 10 years is people are going to more and more develop the software.

It seems like a major change from the focus on building a machine, to actually working on the different quantum algorithms, the software, to solve the specific problems we might want to solve.

WH: A good friend of mine told me that even five or 10 years ago you couldn't even get a job at a university when you said you were going to develop quantum algorithms. That's because nobody even felt that it was worthwhile, because people didn't think we could build such a machine. In the next five or 10 years you are going to see plenty of new applications. We talk now about simulating molecules and drug discovery or breaking encryptions; in 10 years' time we are going to talk about plenty more things and plenty of different things.

But let's focus on the now. What are some exciting uses of quantum computers that are in the works now?

WH: As a company, for example, we work together with others on the first quantum computing operating system. We work with theorists on solving really important problems like, for example, simulating the FeMoco molecule. That's an example we've just recently worked on. The FeMoco molecule is important for nitrogen fixation and nitrogen fixation is really important when you want to make fertilizer. It turns out 2 percent of the world's energy is right now being used for making fertilizer. If you can make nitrogen fixation a little bit more efficient, imagine how much energy you can save. That's one problem for a quantum computer. Now we've just done a lot of work trying to exactly understand what the resources required for that are and we can build machines for that purpose.

I'll give you another example. We work right now with Rolls-Royce towards building quantum computers that are capable of developing better aircraft engines, and more fuel-efficient aircraft engines. This is all about fluid dynamics, and so using quantum computers to really simulate the flows inside such an engine. That will have a big impact, but we first need to start understanding what's required and then we can streamline the development of the machines so we get there.

These are very exciting applications. Is there anything more that we can expect with these machines?

WH: You're just going to see more and more applications like this coming through one after the other. If we're going to have this interview in two years, in five years, or in 10 years' time, there will be a much more powerful quantum computer than there is now.

But there will still be many applications for quantum computers that are completely inaccessible to the machines we will have available then. We're always going to make more powerful machines, but I think the first really cool and really interesting applications we're going to see on the timescale of five or 10 years, and then in 20 years there is going to be yet another array of really interesting applications that will slowly grow as these machines become more and more powerful.

This interview was part of IFLScience's The Big Questions and has been edited for length and clarity. Subscribe to our newsletter so you don't miss out on the biggest stories each week.

More here:
How Will Quantum Computers Change The World? - IFLScience

Building a quantum computer was always hard. It just got harder. – The Australian Financial Review

Phil Morle, a partner at the CSIRO's Main Sequence Ventures fund who specialises in this kind of investment, says what happened to SQC is typical of what's happening across the sector: investments are chunked down into smaller rounds with shorter, more demanding timeframes.

"There's a sort of meta dynamic that's happening across investments ... where there's a lot more caution, and a lot more expectation of evidence for early revenue. It's something which deep-tech companies need to pay very strict attention to," he says.

"Where a couple of years ago we may have collectively been able to have a longer range plan that drove the technology towards commercial outcomes a bit later, that's not on the table today."

That shift in investor expectations has caused a re-alignment at SQC, both in terms of its rate of growth (had it raised $150 million, there would have been more ambition to expand the company more quickly, Professor Simmons says) and in terms of the technology it's focusing on.

The company now talks about quantum simulation, as opposed to quantum computation, much more than it did in the past.

"I think everyone ultimately wants the programmable computer, but on the way to getting there everyone recognises that nearer-term commercial products are more likely to come out in the simulation space," Professor Simmons told AFR Weekend.

Luckily, it turns out that SQC's plan to build a quantum computer, first proposed by the American physicist Bruce Kane while he was visiting UNSW in 1998, is exceptionally well suited to quantum simulation, too, she adds.

Kane's ambitious plan was to place phosphorus atoms onto a silicon chip, aligning them with such precision that they would interact with each other to form a quantum computer.

The plan, should SQC manage to pull it off, would mean a quantum computer with vastly more qubits than any other. Each of SQC's qubits is only a single atom in size, whereas competitors such as Google and Microsoft are building qubits out of superconducting circuits that, while tiny, are measured in tenths of millimetres rather than in tenths of nanometres.

Some 25 years on, SQC has developed that atomic alignment technology to the point where the company can now churn out experimental new versions of its quantum computer every week, Professor Simmons says, inching its way towards building an error-corrected quantum computer by 2033.

Error correction, which for the most part involves entangling tens or hundreds of physical qubits together to form a single logical qubit, is how the industry deals with the perturbation problem.

A stray cosmic ray might disrupt one physical qubit, but shouldn't be enough to throw off an entire logical qubit, the theory goes.
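The redundancy idea can be illustrated with the classical analogue of the simplest error-correcting code. The sketch below is my own toy example, not SQC's scheme: real quantum codes must protect quantum states without measuring them directly, which is why they need the tens to hundreds of physical qubits per logical qubit mentioned above.

```python
# Toy classical 3-bit repetition code: one corrupted copy is outvoted by the other two.
import random

def encode(bit):
    return [bit, bit, bit]                                 # one "logical" bit, three physical copies

def add_noise(copies, p=0.1):
    return [b ^ (random.random() < p) for b in copies]     # each copy flips with probability p

def decode(copies):
    return int(sum(copies) >= 2)                           # majority vote recovers the logical bit

print(decode(add_noise(encode(1))))   # prints 1 unless two or more copies flip at once
```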

SQC's most recently revised road map has it delivering a single, error-corrected qubit made up of 100 physical qubits by 2028. By 2033, it plans to have built an error-corrected computer that's powerful enough to be useful to a broad audience of users across multiple use cases.

(The roadmap has shifted over the years. Professor Simmons had initially hoped to build a prototype 10-qubit quantum computer by 2020, but later revised that date to 2023. While SQC is still on track to produce such a prototype this year, it's the error-corrected milestone that's now the important one, she says.)

The rapid iteration that would get them to the error-corrected quantum computer happens to work well for quantum simulation, too. Etching a model of a molecule on to silicon just to perform a single calculation isn't such a grind when you can churn out new models quickly.

"Our approach does translate unusually (or exceptionally) well to simulation. We manufacture devices quickly, within a week," she says.

Even so, revenue from the easier quantum simulation technology is still three to five years away, says Simmons.

Whether that's fast enough for investors, only time will tell. But, with some quantum computing start-ups already putting revenues on the books, the pressure on SQC to produce revenues has only increased in the past 12 months, making its task more difficult than ever.

Q-CTRL, a quantum computing startup at Sydney University that makes error correction software for quantum systems including quantum computers, announced sales of $15 million in 2022, and was cashflow positive for the first half of 2023, its founder Michael Biercuk says.

Almost all the $US54 million ($84 million) Q-CTRL raised this year in its Series B fundraising is still sitting in the bank, he adds.

"Investors are no longer saying 'Gosh, one day when quantum computing is going to be big, but who knows when?' That is no longer the conversation," says Main Sequence's Morle, whose fund invested in Q-CTRL.

"Quantum computing is absolutely commercialising now, and investors are starting to look at it through that lens."

Go here to see the original:
Building a quantum computer was always hard. It just got harder. - The Australian Financial Review

IonQ: At The Verge Of Delivering A Commercial Quantum Computer (NYSE:IONQ) – Seeking Alpha


I think IonQ could be a hundred-billion-dollar company and represent this decade's largest return on investment. There will be casualties and disappointments in this sector, but in my mind, IonQ is already approaching the winners' circle; a quantum advantage in 2025 is realistic for IONQ. It is years ahead of its big company competitors and lightyears ahead of the other Quantum startups.

Quantum Computing companies have exhibited extreme share price volatility in recent weeks; Rigetti (RGTI), D-Wave (QBTS), and IonQ (NYSE:IONQ) have seen share price changes of over 30% in single days, both up and down. The image below shows recent weekly and monthly movements (counting backward from today, Aug 17th).

Quantum Price Movement (Author Database)

This volatility could represent a massive opportunity if we pick the right company. The markets believe these companies have a tremendous opportunity ahead of them. I believe IONQ is the right company and will garner this volatility into considerable gains in the coming years.

This is my third article on Quantum computer startups. I have already written about D-Wave (whom I like and have invested in) and Rigetti (whom I did not like and have not invested in). I have tried to explain the concepts, science, and mathematics behind quantum computers as I go.

In the Rigetti article, I tried to explain the critical concept of superposition and the history of mathematics leading to the search for quantum computers. In this article, I will try to explain the theory of entanglement and IONQ's concept of Algorithmic Qubits. Some of this article will refer to concepts in the Rigetti article. I will use *RA* to mark this.

The difference is Quantum, IONQ, and QBTS are building Quantum computers, and the rest are not. Rigetti, International Business Machines (IBM), Alphabet (GOOGL), Baidu (BIDU) (HKG9888), and others are trying to develop synthetic Qubits (often called QuDots *RA*) using superconducting materials at low temperatures to give them zero resistance. The synthetic method is making good progress. IBM has moved a long way and has a machine with over 1,000 synthetic qubits *RA*.

The QuDots provide one of the critical requirements of a Quantum Computer, superposition *RA*. However, the second crucial requirement, entanglement, is another story, and without entanglement, you cannot have the exponential growth in computing power that Quantum computers promise.

IONQ have developed a trapped-ion technology: they begin with a ytterbium atom and use lasers to remove one of its two outer electrons. (Ytterbium is a rare earth metal, chemical symbol Yb, soft and silvery in appearance, and mined in China and the US in large quantities.) Once the electron is removed, the atom is known as an ion. Hundreds of electrodes hold the ion precisely in place, producing an oscillating electromagnetic field that traps the ion inside a vacuum.

The vacuum is inside the quantum processing chip, and the ion is isolated from the environment by the vacuum. This approach is not unique to IONQ. The physics department of Oxford University in the UK is one of many academic institutions looking at this area of research.

Entanglement is what gives quantum computers their exponentially greater computing power compared with today's classical computers. Without entanglement, quantum computers would have only superposition as an advantage, meaning they would amount to faster and larger memory *RA*.

I think entanglement will be the most important scientific concept for the coming decades; it could be the fundamental force that holds space and time together. Entanglement's why and how remain elusive; however, its existence has been proven beyond doubt.

Einstein did not believe entanglement was possible; he argued against it in his famous paper with Podolsky and Rosen (now generically referred to as EPR) and later dismissed it as "spooky action at a distance." Einstein thought that entanglement meant particles were communicating over vast distances faster than the speed of light, something he had shown to be impossible.

It was not until the 1970s, when experiments by John Clauser and Stuart Freedman at UC Berkeley confirmed the existence of entanglement, that it became widely accepted.

Experiments show that entanglement does not involve communication; it is a deep connection between particles that begins at the moment of entanglement. Particles become entangled when they are so close together that they become indistinguishable. When separated, they appear almost identical and remain a correlated version of each other over any distance. The Chinese communication satellite Micius uses entangled quantum particles for encryption and has shown that particles remain entangled even when the distance between them is over 1,000 km.

In the Rigetti article, I tried to use a radar analogy to explain superposition; here, I will use another analogy to give readers an insight into how entanglement can be thought of.

Think of two identical twins separated at birth; they began as a single cell that split in half, and each half developed into a separate baby. After delivery, the babies were adopted and could have gone to very different families: one rich, living in the Hamptons; one poor, in the projects. However, if you saw them aged 20, they would still look similar, and they would probably have similar interests, perhaps preferring music to sport. They could both have chosen a dog as a pet and hate the taste of ketchup. Looking at one of the twins would give a great deal of information about the other even though they have never communicated; measuring one twin's characteristics tells you about the other's. For example, if one twin is 5 ft 7 in tall, you can be pretty sure how tall the other is.

Entangled particles are like twins inextricably linked, with many correlated but not necessarily identical features.

Entanglement can occur in systems of particles; millions, even billions, of particles can be entangled so that every particle in the system is entangled with every other particle. It is thought to occur among the atoms in living beings and other materials, the particles becoming a single entity.

You must be careful how you measure entangled particles; if you looked at the back of the Hamptons twin, you would get no information about the face of the other. Similarly, the weight of one is not the same as the height of the second. The same is true in quantum entanglement: the angle and position of the measurement relative to the particle are crucially important.

One of the twins may have learned to play Chess, but the other may not; they can have separate knowledge and skills but remain correlated in other ways.

When thinking of entanglement, think of the twins; they began as a single cell and will carry similarities with them.

An 11-qubit system would require 55 pairwise entanglements, as shown below.

Mystic Rose Diagram (IONQ website)

Each blue dot is a Qubit, and each orange line is an entanglement.

100 qubits need 4,950 pairwise entanglements, and 1,000 qubits need 499,500.
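These counts are just the number of ways to pick a pair out of n qubits, n(n-1)/2. A quick check (my own snippet, added for reference):

```python
# Pairwise (all-to-all) entanglement counts for n qubits: n choose 2.
from math import comb

for n in (11, 100, 1000):
    print(n, "qubits ->", comb(n, 2), "pairwise entanglements")
# 11 -> 55, 100 -> 4950, 1000 -> 499500
```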

Entanglement is IONQ's great advantage. Synthetic superconducting qubits need synthetic superconducting entanglements: each QuDot must be connected by superconducting material to every other QuDot in the system. That is an immense task, and controlling all these superconducting electronics may be impossible. Errors will creep in at every change in temperature and with every passing photon. Plus, you can't even measure things to ensure it is going well *RA*; information leakage is a real problem.

IONQ's qubits are entangled by nature. IONQ do not have to build or control entanglement; it is endemic to their machine and is potentially a huge advantage.

In the short-seller report, Scorpion (see below) claimed that the CEO of IONQ kept repeating misleading statistics about the size of the IONQ computer. I don't believe he was; no industry-standard measurement of quantum computing power has yet been developed. The CEO used IONQ's measure rather than the usual quantum space figure.

Simply talking about qubits and space ignores the power of entanglement, which accounts for most of a quantum computer's power. Some QuDots only entangle adjacent particles, so the image above would have only 11 orange lines around the outside.

The combinations of entanglements are also essential. The 11-qubit diagram above has 165 possible three-qubit entanglements, all of which should give separate and additional computing power. The power of a quantum computer is a function of its qubits, their entanglements, how they can operate separately, and their fidelity (error rate).

To measure quantum computing power, IONQ has devised the AQ (Algorithmic Qubits) measure, which attempts to provide a benchmarking scale.

AQ aims to measure what computing power the computer can deliver rather than what physical components it has.

AQ measures how many encoded states the system can exist in and how many can be used to output information. The AQ measure includes error correction and the ability to perform gate operations (gate operations are the mathematical manipulations needed to perform calculations); adding one extra qubit that is thoroughly entangled and error-corrected effectively adds 1 to the AQ, doubling the number of encoded states that can exist and the amount of work that can be done.

Doubling is a powerful tool.

1 AQ = 2 encoded states

51 AQ = 2,251,799,813,685,248 encoded states

The AQ measurement is helpful but not industry-accepted. Presently, only IONQ use it.

AQ35 corresponds to 34,359,738,368 encoded states; it is the point at which classical computers cannot simulate the operations of the quantum algorithm, and the point at which quantum computers can solve problems that hybrid quantum/classical approaches cannot. It will be a seminal moment: Quantum Advantage will have arrived when an AQ35 computer is available for sale.
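The arithmetic behind these figures is simply 2 raised to the AQ value. The snippet below is my own check of the numbers quoted in this article, not an IONQ calculation:

```python
# AQ n corresponds to 2**n encoded states, per the figures quoted above.
for aq in (1, 35, 51, 64):
    print(f"AQ {aq}: {2**aq:,} encoded states")
# AQ 35 -> 34,359,738,368 (the quoted advantage threshold);
# AQ 64 -> 18,446,744,073,709,551,616, i.e. roughly the "18 quintillion" figure cited later.
```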

IONQ expects to produce the first AQ35 quantum computer in 2024, and last quarter announced it had sold one for $18 million; it is due for delivery in 2024 (earnings call Q&A session).

The same customer who bought the AQ35 computer has placed an order for the AQ 64 IONQ (due end of 2025)

AQ64 is another important step: it would be the first quantum computer more powerful than the world's largest supercomputer.

The world's largest supercomputer is the Hewlett Packard OLCF-5, based in Oak Ridge, Tennessee. Its power is measured in floating-point operations per second, or FLOPS, and the OLCF-5 has achieved 1.1 quintillion FLOPS. The AQ64 will be able to operate at the equivalent of 18 quintillion FLOPS (18,000,000,000,000,000,000 operations per second). With a price tag of $18 million, the AQ64 will outperform the $600 million OLCF-5 by an enormous margin.

IONQ is selling access to its AQ29 machine via its cloud operation. IONQ produced the AQ29 seven months ahead of schedule and announced its successful implementation during the Q2 earnings call. On the call, IONQ maintained its guidance of adding 1 AQ per month, something it has said repeatedly and which is evidenced by the timeline on which new devices have arrived.

There is still a great deal of debate about whether quantum computers will ever work, when they will arrive, and how useful they will be. IONQ currently has its AQ29 machine working and is producing impressive commercial success and scientific research.

Recent customers include Hyundai (OTCPK:HYMTF), Airbus (OTCPK:EADSF), GE Research, Goldman Sachs, and the US Airforce Research Lab. (Q2 earnings)

The Hyundai deal was first signed in January 2022 and expanded in December. I consider the extension to be commercial validation. After 12 months with the machine, Hyundai extended and increased the deal's scope. Hyundai is now looking at two areas; in the search for new battery materials, Hyundai is using the IONQ AQ29 to run electrochemical metal simulations; previously, it was limited to lithium only. Hyundai is also using IONQ tech to improve the object detection its autonomous vehicles use. IONQ and Hyundai have shown that their machines are better at object detection and learn faster than classical computing alternatives.

Airbus signed a deal in August 2022 to develop an aircraft loading and machine learning application with IONQ. The project should run for 12 months, and its results will be significant. If Airbus extends the agreement and looks at additional areas such as aerodynamics like Hyundai, it will be another sign that quantum is making a significant difference. I hope to hear about the progress of this deal at the upcoming investors' day.

New demand for quantum computing arrives all the time; on Aug 15th, SA reported that IONQ had signed a deal to develop AI with Zapata.

The moment IONQ releases the AQ64, I expect a significant increase in demand. At that point, no current computer, including the world's largest supercomputers, will come close to the computing power available from an IONQ AQ64.

The world currently has 500 supercomputers, with around 150 new ones arriving each year. That represents a market of 150 x 18 million = $2.7 billion in revenue per year for IONQ. IONQ may capture all of this business within two or three years.

A recent report from Markets and Markets forecast a 38% CAGR for the quantum computing industry, with the market estimated to reach $4.4 billion by 2028.

At $18 million, the AQ64 computer will be affordable for large companies, government research agencies, large-scale data centers, and the existing market for Quantum/Supercomputers. If the IONQ AQ64 delivers on price and performance, current forecasts could be out by a factor of 10.

The potential market is enormous. Due to its natural entanglement, IONQ will likely have a first-mover advantage and enhanced AQ. And that is ignoring the temperature issue: currently, the synthetic QuDots of the competition must be cooled close to absolute zero, while IONQ's trapped-ion technology works at room temperature. (Recent research claims a room-temperature superconducting material may be possible. The new material is called LK-99; having read the paper, I think it is bad science that will not prove accurate, and I am not the only one.)

IONQ delivered their AQ29 computer seven months ahead of schedule and appear to be increasing computing power at 1 AQ per month. That would imply that AQ35 will arrive before the end of 2023 and the all-important AQ64 by Q1 2026. So far, one AQ35 machine and one AQ64 machine have been sold, to the same customer in Basel.

I expect the bookings figure to grow as IONQ approaches the AQ 64 threshold. When it arrives, it will be the most powerful computer man has ever produced and will be at least three years ahead of the comparable machine on the IBM roadmap *RA*.

In January, IONQ announced its intention to build the world's first quantum computer manufacturing site in Bothell, Washington State. In the Q&A section of the Q2 earnings call, the CEO said the factory had its permits in place and construction was underway; he also said it was on time and on budget. The Basel machine will be built in this facility, so we can expect the facility to be operational in Q1 2024, but we will have to wait for confirmation.

IONQ is a publishing powerhouse; its output of scientific papers is much larger than that of the other quantum computing companies I have looked at. Scientific papers dating back to 2016 track the company's technical milestones, from its first single-qubit programmable computer to optimizing entanglement, quantum gates, and circuits.

The introduction of the current trapped-ion technology and the techniques of fault-tolerant error correction are covered. It is all there if you want to read the full history of technological progress. The most recent article is a peer-reviewed publication in the journal Nature showing how to use a quantum machine for cryptographic uses beyond the capability of classical computers.

I think I have made it clear that I believe quantum computing is very close and is an inevitability. I also feel that IONQ is leading the field, but this is just my view and is not universally accepted.

The outlook for Quantum Computing companies is uncertain, and there is much disagreement about IONQ.

The short seller Scorpion Capital published a report in 2022 claiming that IONQ was a Ponzi scheme and a scam whose products do not work, whose CEO knew nothing about the technology and lied publicly, and whose founders have not left their previous teaching positions because they do not believe IONQ will work.

Short Seller reports have kept me out of trouble in the past. I always read them before investing and find them of varying quality. In my opinion, the Scorpion research is poor. It is based on unnamed experts and former employees.

The evidence that IONQ products do not work comes from an expert who wrote a program to add small numbers together and ran it on the IONQ cloud service. The expert's name, qualifications, and experience are not provided, the program they allegedly wrote is not made available for checking, and the IONQ machine's output is not presented. I read the Scorpion report in detail and could not verify its claims.

The misleading statements from the CEO all relate to the Algorithmic Qubit measure I have discussed, and the argument about the founders not leaving their teaching roles is invalid. No mathematics lecturer would ever leave his doctoral students without a mentor; it is just not the done thing. I remained in contact with my mentor until his death. (In a future article, I will tell the story of how he wrote to NASA saying the recently published designs for a shuttle, in the 1980s, would crash because it was a hairy orange.)

Recent articles on SA have described IONQ as a story stock without clear evidence that it will ever produce a meaningful product or a profit.

Price targets from Wall Street suggest only a 15% upside (SA), and only half of analysts recommend buying.

Ratings page (Seeking Alpha)

Using Wall Street analysts' forecasts, Simplywall.st believes IONQ is overvalued by 400%.

DCF value (Simplywall.st)

IONQ has a strong balance sheet with zero debt and enough cash to last well into 2026 using its current spending and income figures.

Balance Sheet Summary (Author Database)

Revenue has been multiplying.

Revenue by Quarter (Author Model)

The image above is from my mathematical model for IONQ and shows the significant revenue growth both quarter-on-quarter and year-on-year. The figures in white are my forecasts for the rest of the year. I update them each quarter; Q2 blew away my projections, as did Q1. It adds up to a forecast of $19 million in revenue for FY 2023 against $3.8 million in 2022, a 480% increase. I do not yet have a firm grip on the IONQ margin; it dropped in Q2, and I will await Q3 to see how it is progressing. In these early-stage revenue companies, margin is always bumpy, as it is a mix of service agreements and milestone payments. IONQ do report bookings each quarter; they increased to $28 million in Q2 from $4 million, and I expect this figure to be very bumpy for at least two or three years.

Last quarter's bookings were most likely the sales of the AQ35 and AQ64 computers to Europe (Q2 earnings Q&A).

The model shows positive EBITDA in 2027; the assumptions are still too general to develop a DCF, but as time goes on and I gain more confidence in the projections, I will be able to calculate a fair value. This model is currently only useful for tracking performance. When the AQ35 and AQ64 arrive, we will get early sales data to project into the future.

Selected Line Items (Author Model)

Conclusion

IONQ has entanglement, it has Qubits, it has customers, it has peer-reviewed papers confirming its science, and it has cash.

By 2025 it should have the world's most powerful computer and be two years away from Quantum Supremacy.

I am long IONQ at $5.01 and $14.88; my technical plan has two possibilities: turning higher in the coming days or a pullback to around $12 before a significant move higher. In both scenarios I am targeting a return in the hundreds of percent.

I will update you in the comments section as things progress on the technical front.

Link:
IonQ: At The Verge Of Delivering A Commercial Quantum Computer (NYSE:IONQ) - Seeking Alpha

Bigger and better quantum computers possible with new ion trap, dubbed the Enchilada – EurekAlert

Image: The Enchilada Trap, manufactured in Sandia National Laboratories' Microsystems Engineering, Science and Applications fabrication facility.

Credit: Craig Fritz, Sandia National Laboratories

ALBUQUERQUE, N.M. - Sandia National Laboratories has produced its first lot of a new world-class ion trap, a central component for certain quantum computers. The new device, dubbed the Enchilada Trap, enables scientists to build more powerful machines to advance the experimental but potentially revolutionary field of quantum computing.

In addition to traps operated at Sandia, several traps will be used at Duke University for performing quantum algorithms. Duke and Sandia are research partners through the Quantum Systems Accelerator, one of five U.S. National Quantum Information Science Research Centers funded by the Department of Energy's Office of Science.

An ion trap is a type of microchip that holds electrically charged atoms, or ions. With more trapped ions, or qubits, a quantum computer can run more complex algorithms.

With sufficient control hardware, the Enchilada Trap could store and transport up to 200 qubits using a network of five trapping zones inspired by its predecessor, the Roadrunner Trap. Both versions are produced at Sandia's Microsystems Engineering, Science and Applications fabrication facility.

According to Daniel Stick, a Sandia scientist and leading researcher with the Quantum Systems Accelerator, a quantum computer with up to 200 qubits and current error rates will not outperform a conventional computer for solving useful problems. However, it will enable researchers to test an architecture with many qubits that in the future will support more sophisticated quantum algorithms for physics, chemistry, data science, materials science and other areas.

"We are providing the field of quantum computing room to grow and explore larger machines and more complicated programming," Stick said.

Sandia has researched, built and tested ion traps for 20 years. To overcome a series of design challenges, the team combined institutional knowledge with new innovations.

For one, they needed space to hold more ions and a way to rearrange them for complex calculations. The solution was a network of electrodes that branches out similar to a family tree or tournament bracket. Each narrow branch serves as a place to store and shuttle ions.

Sandia had experimented with similar junctions in previous traps. The Enchilada Trap tiles the same design so researchers can explore how the properties of the smaller trap scale up. Stick believes the branching architecture is currently the best solution for rearranging trapped-ion qubits and anticipates that future, even larger versions of the trap will feature a similar design.

Another concern was the dissipation of electrical power on the Enchilada Trap, which could generate significant heat, leading to increased outgassing from surfaces, a higher risk of electrical breakdown and elevated levels of electrical field noise. To address this issue, production specialists designed new microscopic features to reduce the capacitance of certain electrodes.

"Our team is always looking ahead," said Sandia's Zach Meinelt, the lead integrator on the project. "We collaborate with scientists and engineers to learn about the kind of technology, features and performance improvements they will need in the coming years. We then design and fabricate traps to meet those requirements and constantly seek ways to further improve."

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.


View post:
Bigger and better quantum computers possible with new ion trap, dubbed the Enchilada - EurekAlert

UNM Engineering researcher leading Department of Energy project … – UNM Newsroom

These projects, announced Aug. 10, are focused on exploratory research for extreme-scale science that will leverage emerging trends and advances in high-end computing, massive datasets, scientific machine learning, artificial intelligence, and novel computing architectures.

"There is a wide expanse of exciting opportunities as we reach beyond exascale computing," said Ceren Susut, Department of Energy Acting Associate Director of Science for Advanced Scientific Computing Research. "These projects will help us find promising directions to realize the full potential of scientific computing from emerging technologies."

Milad Marvian, an assistant professor in the Department of Electrical and Computer Engineering and a member of the Center for Quantum Information and Control (CQuIC) at UNM, is receiving $500,000 for a project called Bridging Between Quantum Circuit Model and Constrained Hamiltonian-based Computation.

Marvian said that quantum computers hold the potential to offer a substantial advantage compared to classical computers for certain high-impact computational tasks. This project will investigate optimal methods to translate quantum algorithms between various models of quantum computation, motivated by current hardware capabilities, which will facilitate the implementation of practical applications on near-term quantum devices.

"New Mexico has always been at the forefront of scientific advancements," said U.S. Sen. Martin Heinrich, who represents New Mexico. "This federal investment will continue to expand that leadership by leveraging our unique assets, supporting UNM's leading role in maximizing the benefits of emerging quantum computing technology to solve important scientific and engineering challenges." Earlier this year, Marvian received a National Science Foundation CAREER Award for Design and Analysis of Low-Overhead Fault-Tolerant Quantum Circuits, which explores the minimum requirements for reliable quantum computation and also aims to develop low-overhead quantum error correction and fault-tolerant schemes.

The Center for Quantum Information and Control (CQuIC) is an interdisciplinary research center located at UNM, with activities across the departments of Physics and Astronomy, Electrical and Computer Engineering, and Chemistry and Chemical Biology. Research at CQuIC is focused on quantum information science, including quantum computation, quantum simulation and complexity, quantum control and measurement, quantum metrology, and quantum optics and communication.

Quantum information science has strong roots in New Mexico. UNM has been one of the pioneers in the burgeoning field of quantum information science, with major contributions to research and education for over 30 years.

Here is the original post:
UNM Engineering researcher leading Department of Energy project ... - UNM Newsroom

Variational quantum and quantum-inspired clustering | Scientific … – Nature.com

Let us start by assuming that we have N datapoints, each being described by m features. The goal is to classify these datapoints into k clusters. Without loss of generality, datapoints are described by m-dimensional vectors \(\vec{x}_i\), with \(i = 1, 2, \ldots, N\). To implement a clustering of the data we could, for instance, use classical bit variables \(q_i^a = 0, 1\), with \(i = 1, 2, \ldots, N\) and \(a = 1, 2, \ldots, k\), so that \(q_i^a = 0\) if datapoint i is not in cluster a, and \(q_i^a = 1\) if it is in the cluster. Let us also call \(d(\vec{x}_i, \vec{x}_j)\) some distance measure between datapoints \(\vec{x}_i\) and \(\vec{x}_j\). With this notation we build a classical cost function H such that points very far away tend to fall into different clusters [4]:

$$ H = \frac{1}{2} \sum_{i,j=1}^{N} d(\vec{x}_i, \vec{x}_j) \sum_{a=1}^{k} q_i^a q_j^a. $$

(1)

Additionally, one must impose the constraint that every point falls into one and only one cluster, i.e.,

$$ \sum_{a=1}^{k} q_i^a = 1 \qquad \forall i. $$

(2)

The bit configuration optimizing Eq. (1) under the above constraint provides a solution to the clustering of the data. As explained in Ref. [4], this can be rephrased naturally as a Quadratic Unconstrained Binary Optimization (QUBO) problem of \(k \times N\) bit variables, so that it can be solved by a quantum annealer. However, on a gate-based quantum computer, we can use a Variational Quantum Eigensolver (VQE) [7] with fewer qubits as follows. Let us call \(f_i^a \equiv |\langle \psi_i | \psi^a \rangle|^2\) the fidelity between a variational quantum state \(|\psi_i\rangle\) for datapoint \(\vec{x}_i\) and a reference state \(|\psi^a\rangle\) for cluster a. In a VQE algorithm, we could just sample terms \(h_{ij}^a\),

$$ h_{ij}^a = d(\vec{x}_i, \vec{x}_j)\, f_i^a f_j^a, $$

(3)

for all datapoints i, j and clusters a, together with penalty terms \(c_i\),

$$ c_i = \left( \sum_{a=1}^{k} f_i^a - 1 \right)^{2}, $$

(4)

which are taken into account via Lagrange multipliers for all datapoints i. This last term must only be taken into account if several configurations of the qubits forming the VQE circuit allow for multiple clusters a simultaneously for the same datapoint, e.g., if we codified one qubit per cluster as in Eq. (1).

Our approach here, though, is not to relate the number of qubits to the number of clusters. Instead, we work with some set of predefined states \(|\psi^a\rangle \in \mathcal{H}\), not necessarily orthogonal, where \(\mathcal{H}\) is whichever Hilbert space is being used for the VQE. This provides us with enormous flexibility when designing the algorithm. For instance, we could choose the states \(|\psi^a\rangle\) to be a set of maximally mutually-orthogonal states [2] in \(\mathcal{H}\). In the particular case of one qubit only, we would have \(\mathcal{H} = \mathbb{C}^2\), and the set of maximally-orthogonal states would correspond to the k vertices of a platonic solid inscribed within the Bloch sphere. The corresponding VQE approach would then correspond to a simple quantum circuit of just one qubit involving the fine-tuning of a single one-qubit rotation, and no sampling of the constraints in Eq. (4) would be needed at all, since they would be satisfied by construction. And for more qubits, the corresponding generalization would involve interesting entangled states in \(\mathcal{H}\).

In addition to this, the terms to be sampled can be further refined to improve algorithmic performance. One can for instance introduce modified cost functions, such as

$$ h_{ij}^a = d(\vec{x}_i, \vec{x}_j)^{-1} \left( 1 - f_i^a f_j^a \right) $$

(5)

$$ h_{ij}^a = \left( d(\vec{x}_i, \vec{x}_j)^{\alpha} + \lambda\, d(\vec{x}_i, \vec{c}_i) \right) f_i^a f_j^a $$

(6)

$$ h_{ij}^a = \left( d(\vec{x}_i, \vec{x}_j)^{\alpha} + \lambda\, d(\vec{x}_i, \vec{c}_i) \right) \left( 1 - f_i^a \right) \left( 1 - f_j^a \right). $$

(7)

In the above cost functions, the first one tends to aggregate together in the same cluster those datapoints that are separated by a short distance, which is the complementary view to the original cost function in Eq. (3). The second one includes two regularization hyperparameters \(\alpha\) and \(\lambda\), where \(\alpha\) allows for modified penalizations of the distances between points, and \(\lambda\) accounts for the relative importance of the distance between datapoint \(\vec{x}_i\) and the centroid formed by the elements belonging to the same cluster as point i, which we call \(\vec{c}_i\). This centroid can be re-calculated self-consistently throughout the running of the algorithm. Additionally, one can consider cost functions with a different philosophy, such as the third one, where datapoints with a large separation distance tend to fall into different clusters, without ruling out the chance of their being in the same cluster. On top of all these possibilities, one could also consider combining them in a suitable way to build even more plausible cost functions. Eventually, the goodness of a cost function depends on the actual dataset, so for each particular case it is worth trying several of them.

The rest of the algorithm follows the standards in unsupervised learning. After a preprocessing of the data (e.g., normalization), we define a suitable set of states \(|\psi^a\rangle\) and set the characteristics of the variational quantum circuit, including the parameters to be optimized. We then set the classical optimizer for the VQE loop (e.g., the Adam optimizer) and its main features (learning rate, batch size, etc.). After initialization, and if needed, we compute the centroids \(\vec{c}_i\) and distances \(d(\vec{x}_i, \vec{x}_j)\), \(d(\vec{x}_i, \vec{c}_i)\). We then perform the VQE optimization loop for a fixed number of epochs, where new parameters of the variational quantum circuit are computed at each epoch. To accelerate performance in the VQE loop, one can include in the sampling only those terms that have a non-negligible contribution. The final step involves estimating, for a given datapoint, the cluster to which it belongs. This can be done by implementing quantum state tomography (either classical or quantum), so that we can read out the final state \(|\psi_i\rangle\) for a given datapoint \(\vec{x}_i\) and determine to which cluster it belongs by looking for the maximum of the fidelities \(f_i^a\) over all clusters a.
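The single-qubit version of this scheme is simple enough to sketch classically. The snippet below is my own illustration of the idea described above, not the authors' code: each datapoint is assigned a one-qubit variational state, fidelities against fixed reference states play the role of soft cluster memberships, the cost follows Eq. (1)/(3) summed over pairs and clusters, and the final assignment picks the reference state with the largest fidelity. All function and variable names are illustrative.

```python
# Classical simulation of the one-qubit variational clustering idea (illustrative sketch).
import numpy as np

def qubit_state(theta, phi):
    """One-qubit state |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])

def fidelities(params, refs):
    """Matrix F[i, a] = |<psi_i|psi^a>|^2 for variational states vs reference states."""
    states = [qubit_state(t, p) for t, p in params]
    return np.array([[abs(np.vdot(s, r)) ** 2 for r in refs] for s in states])

def cost(params, X, refs):
    """Eq. (1)/(3)-style cost: far-apart datapoints are penalised for sharing a cluster."""
    F = fidelities(params, refs)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances d(x_i, x_j)
    return 0.5 * np.einsum("ij,ia,ja->", D, F, F)

def assign_clusters(params, refs):
    return fidelities(params, refs).argmax(axis=1)               # cluster with the largest fidelity

# Three reference states spread maximally over the Bloch sphere (k = 3).
refs = [qubit_state(0.0, 0.0),
        qubit_state(2 * np.pi / 3, 0.0),
        qubit_state(2 * np.pi / 3, np.pi)]

X = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0], [5.1, 5.1]])   # two obvious clusters
params = [(0.1, 0.0), (0.2, 0.0), (2.0, 0.0), (2.1, 0.0)]        # e.g. after a VQE-style optimisation
print(cost(params, X, refs), assign_clusters(params, refs))
```

In the full algorithm, the role of the parameter optimisation above is played by the VQE loop running on quantum hardware or a simulator, with the fidelities estimated by sampling.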

View original post here:
Variational quantum and quantum-inspired clustering | Scientific ... - Nature.com

Rigetti Computing: Advancements in Quantum Computing and … – Clayton County Register

Rigetti Computing is a company that specializes in developing and providing quantum computing solutions. Founded in 2013 by former IBM quantum researcher Chad Rigetti, the company designs and fabricates its own quantum chips, which are superconducting circuits that operate at extremely low temperatures.

Rigetti's quantum chips are integrated with a classical computing architecture and a software platform called Quantum Cloud Services. This platform allows users to access the quantum processors online and run quantum algorithms. The company has deployed quantum processors with up to 80 qubits.

The company's ultimate goal is to build the first quantum computer that can outperform classical computers on real-world problems, known as achieving quantum advantage or quantum supremacy.

Rigetti Computing has made significant strides in the field of quantum computing. The company holds a portfolio of 165 patents, including both issued and pending patents. This strong intellectual property base solidifies Rigetti's position in the market and paves the way for the practical application of quantum computers.

In the second quarter, Rigetti recorded total revenue of $3.3 million, an increase of over 56% compared to the same quarter of the previous year. Because Rigetti is a young company in a rapidly growing market, investors should focus on its technological milestones, collaborations with partners, and the overall prospects of the quantum computing industry to evaluate the company's future potential.

During the second quarter, Rigetti achieved significant milestones and partnerships. The company sold its first Quantum Processing Unit (QPU) to a renowned national laboratory. This QPU consists of nine qubits with a unique square lattice design, enhanced with adjustable couplers. Additionally, Rigetti partnered with ADIA Lab to develop a quantum computing solution.

Looking ahead, Rigetti has a clear plan for enhancing its quantum systems. The company is refining the Ankaa-1 system to reach 98% two-qubit gate fidelity, which will lay the foundation for the upcoming 84-qubit Ankaa-2 system. Rigetti also aims to achieve 99% fidelity with the Ankaa-2 system, projected to launch in 2024. The company's long-term plan includes the creation of the Lyra system, a 336-qubit quantum computer.

The market potential for quantum computing is immense. Quantum computers have the ability to address challenges beyond the reach of classical computers, making them highly appealing to industries and researchers. Rigetti's vision aligns with this potential, as the company aims to develop quantum computers that can meet the requirements of practical workloads spanning domains such as cryptography and drug discovery.

In order to fulfill these requirements, Rigetti is focused on reducing error rates in its quantum systems. With a commitment to achieving less than 0.5% error rates for next-generation systems, the company aims to make quantum computers reliable and consistent tools for complex problem-solving.
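To see why fractions of a percent matter, consider a rough back-of-the-envelope estimate (our own illustration, not a Rigetti figure): if each two-qubit gate fails independently with probability p, a circuit containing N such gates succeeds with probability of roughly (1 - p)^N.

```python
# Rough illustration of how gate error rate compounds over a circuit.
for p in (0.02, 0.01, 0.005):                 # 2%, 1%, 0.5% error per two-qubit gate
    for n_gates in (100, 1000):
        success = (1 - p) ** n_gates          # probability the whole circuit runs error-free
        print(f"p={p:.3f}, gates={n_gates}: success ~ {success:.3f}")
```

At a 0.5% error rate a 100-gate circuit still succeeds about 60% of the time, whereas at 2% it succeeds only about 13% of the time, which is why sub-percent error rates are treated as a prerequisite for practical workloads.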

Rigetti has not only made advancements in theory and algorithms but has also addressed engineering challenges. With clock speeds exceeding 1 MHz and demonstrated manufacturability, the company is bridging the gap between theoretical promise and the physical realization of quantum computing.

The development of a co-processor that integrates with classical computers indicates Rigetti's belief in hybrid computing as the future. This integration has the potential to accelerate the adoption of quantum computing in various industries.

With the cloud hardware and high-performance computing markets currently valued at roughly $120 billion and $40 billion respectively, Rigetti Computing has significant market potential for its quantum computing solutions. However, investing in the company's stock carries high risk and is suitable only for investors with a high risk appetite.

As the quantum computing sector continues to evolve, Rigetti remains committed to its long-term goals. The company plans to introduce a 1,000-qubit multichip system by 2025 and a 4,000-qubit multichip system by 2027. Investors will have more data on the company's progress in the coming quarters, allowing for a better assessment of its potential and performance.

Here is the original post:
Rigetti Computing: Advancements in Quantum Computing and ... - Clayton County Register

Quantum Avalanche A Phenomenon That May Revolutionize … – SciTechDaily

Unraveling the mystery of insulator-to-metal transitions, new research into the quantum avalanche uncovers new insights into resistive switching and offers potential breakthroughs in microelectronics.

New Study Solves Mystery on Insulator-to-Metal Transition

A study explored insulator-to-metal transitions, uncovering discrepancies in the traditional Landau-Zener formula and offering new insights into resistive switching. By using computer simulations, the research highlights the quantum mechanics involved and suggests that electronic and thermal switching can arise simultaneously, with potential applications in microelectronics and neuromorphic computing.

Looking only at their subatomic particles, most materials can be placed into one of two categories.

Metals like copper and iron have free-flowing electrons that allow them to conduct electricity, while insulators like glass and rubber keep their electrons tightly bound and therefore do not conduct electricity.

Insulators can turn into metals when hit with an intense electric field, offering tantalizing possibilities for microelectronics and supercomputing, but the physics behind this phenomenon, called resistive switching, is not well understood.

Questions like how large an electric field is needed are fiercely debated by scientists such as University at Buffalo condensed matter theorist Jong Han.

"I have been obsessed by that," he says.

Han, PhD, professor of physics in the College of Arts and Sciences, is the lead author on a study that takes a new approach to answering a long-standing mystery about insulator-to-metal transitions. The study, "Correlated insulator collapse due to quantum avalanche via in-gap ladder states," was published in May in Nature Communications.

University at Buffalo physics professor Jong Han is the lead author on a new study that helps solve a longstanding physics mystery on how insulators transition into metals via an electric field, a process known as resistive switching. Credit: Douglas Levere, University at Buffalo

"The difference between metals and insulators lies in quantum mechanical principles, which dictate that electrons are quantum particles and their energy levels come in bands that have forbidden gaps," Han says.

Since the 1930s, the Landau-Zener formula has served as a blueprint for determining the size of the electric field needed to push an insulator's electrons from its lower bands to its upper bands. But experiments in the decades since have shown that materials require a much smaller electric field, approximately 1,000 times smaller than the Landau-Zener formula estimates.
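For orientation (this is the generic textbook form, not necessarily the exact expression used in the new study): for a two-level avoided crossing with minimum gap \(2\Delta\) whose diabatic energy difference is swept at a rate \(v\), the Landau-Zener tunneling probability is

$$P_{\mathrm{LZ}} = \exp\!\left(-\frac{2\pi\,\Delta^{2}}{\hbar\, v}\right).$$

For a band insulator with lattice constant \(a\) and bandwidth \(W\) in a static field \(E\), the sweep rate is of order \(v \sim e E a W/\hbar\), so appreciable tunneling requires a threshold field that scales, up to model-dependent numerical factors, as

$$E_{\mathrm{th}} \sim \frac{\Delta_{\mathrm{gap}}^{2}}{e\, a\, W},$$

and it is this kind of estimate that the experiments mentioned above undershoot by roughly three orders of magnitude.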

"So, there is a huge discrepancy, and we need to have a better theory," Han says.

To solve this, Han decided to consider a different question: What happens when electrons already in the upper band of an insulator are pushed?

Han ran a computer simulation of resistive switching that accounted for the presence of electrons in the upper band. It showed that a relatively small electric field could trigger a collapse of the gap between the lower and upper bands, creating a quantum path for the electrons to go up and down between the bands.

To make an analogy, Han says, "Imagine some electrons are moving on a second floor. When the floor is tilted by an electric field, electrons not only begin to move, but previously forbidden quantum transitions open up and the very stability of the floor abruptly falls apart, making the electrons on different floors flow up and down."

"Then, the question is no longer how the electrons on the bottom floor jump up, but the stability of higher floors under an electric field."

This idea helps solve some of the discrepancies in the Landau-Zener formula, Han says. It also provides some clarity to the debate over whether insulator-to-metal transitions are caused by the electrons themselves or by extreme heat. Han's simulation suggests the quantum avalanche is not triggered by heat. However, the full insulator-to-metal transition doesn't happen until the separate temperatures of the electrons and the phonons (quantum vibrations of the crystal's atoms) equilibrate. This shows that the mechanisms for electronic and thermal switching are not exclusive of each other, Han says, but can instead arise simultaneously.
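The interplay between electron and lattice temperatures described here is often illustrated, in generic treatments unrelated to this specific paper, with a two-temperature model in which the field heats the electrons and the electrons in turn heat the phonons. A minimal sketch with made-up parameters:

```python
# Generic two-temperature relaxation model (illustrative only, with invented
# parameters; this is not the simulation from the Nature Communications study).
C_e, C_ph = 1.0, 10.0        # electron / phonon heat capacities (arbitrary units)
G = 0.5                      # electron-phonon coupling
dt, steps = 0.01, 4000
T_e = T_ph = 1.0             # start in equilibrium

for step in range(steps):
    P_in = 2.0 if step < steps // 2 else 0.0   # field heats the electrons, then is switched off
    dT = T_e - T_ph
    T_e += dt * (P_in - G * dT) / C_e          # electrons gain energy from the field, lose it to phonons
    T_ph += dt * (G * dT) / C_ph               # phonons are heated only through the coupling

print(f"T_e = {T_e:.2f}, T_ph = {T_ph:.2f}")   # once the drive stops, the two temperatures converge
```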

"So, we have found a way to understand some corner of this whole resistive switching phenomenon," Han says. "But I think it's a good starting point."

The study was co-authored by Jonathan Bird, PhD, professor and chair of electrical engineering in UB's School of Engineering and Applied Sciences, who provided experimental context. His team has been studying the electrical properties of emergent nanomaterials that exhibit novel states at low temperatures, which can teach researchers a lot about the complex physics that govern electrical behavior.

"While our studies are focused on resolving fundamental questions about the physics of new materials, the electrical phenomena that we reveal in these materials could ultimately provide the basis of new microelectronic technologies, such as compact memories for use in data-intensive applications like artificial intelligence," Bird says.

The research could also be crucial for areas like neuromorphic computing, which tries to emulate the electrical stimulation of the human nervous system. "Our focus, however, is primarily on understanding the fundamental phenomenology," Bird says.

Since publishing the paper, Han has devised an analytic theory that matches the computer's calculation well. Still, there's more for him to investigate, like the exact conditions needed for a quantum avalanche to happen.

"Somebody, an experimentalist, is going to ask me, 'Why didn't I see that before?'" Han says. "Some might have seen it, some might not have. We have a lot of work ahead of us to sort it out."

Reference: "Correlated insulator collapse due to quantum avalanche via in-gap ladder states" by Jong E. Han, Camille Aron, Xi Chen, Ishiaka Mansaray, Jae-Ho Han, Ki-Seok Kim, Michael Randle and Jonathan P. Bird, 22 May 2023, Nature Communications. DOI: 10.1038/s41467-023-38557-8

Other authors include UB physics PhD student Xi Chen; Ishiaka Mansaray, who received a PhD in physics and is now a postdoc at the National Institute of Standards and Technology; and Michael Randle, who received a PhD in electrical engineering and is now a postdoc at the RIKEN research institute in Japan. Other authors include international researchers representing the École Normale Supérieure and the French National Centre for Scientific Research (CNRS) in Paris; Pohang University of Science and Technology; and the Center for Theoretical Physics of Complex Systems, Institute for Basic Science.

Original post:
Quantum Avalanche A Phenomenon That May Revolutionize ... - SciTechDaily