Quantum Computing Is Coming Faster Than You Think

The IBM Quantum data center in Poughkeepsie, NY. (Image credit: IBM)

It seems that for every proponent of quantum computing there is also a detractor. The detractors often refer to quantum computing as a science project, hype, a hoax, or even a failed cause. The history of the technology industry is littered with technologies that failed for various technical or business reasons, so there is reason to be skeptical. However, just as many technologies went on to chart the future direction of innovation because of major advancements that enabled them. Some faced a similar level of skepticism, if not more, and were likewise dismissed as science projects, artificial intelligence (AI) among them. AI is a concept that had been theorized about long before the development of the first silicon transistor, but it wasn't until the past decade that it became a reality through advancements in silicon technology, processing architectures, and deep learning techniques. Similarly, quantum computing technology is real now and is on the verge of a breakout over the next decade.

Even describing the concept of quantum computing is not easy. Classical computers use bits to represent a one (on state) or a zero (off state), while quantum computers use qubits that can represent multiple states simultaneously through superposition and link with other qubits through entanglement. The result is a computer whose compute capacity scales exponentially with the number of qubits. While this makes quantum computers ideally suited for large mathematical models, they are not suited for handling the simple overhead tasks associated with computing. As a result, quantum computing is better positioned as a new accelerator technology, similar to a Graphics Processing Unit (GPU), Digital Signal Processor (DSP), or Field-Programmable Gate Array (FPGA), but on a much larger scale in terms of computing performance. However, quantum computers require specialized control logic and memory because of the unique compute architecture on which quantum computing is based. Large refrigeration units are also required because the qubits operate near absolute zero, meaning 0 kelvin, or -273.15 degrees Celsius.
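To make superposition, entanglement, and the exponential scaling a little more concrete, here is a minimal sketch in Python, using only NumPy rather than an actual quantum SDK, that simulates a two-qubit system classically. The gate names and matrices are standard textbook definitions, not anything specific to IBM's hardware; the key takeaway is that describing an n-qubit register classically requires 2**n complex amplitudes.

```python
import numpy as np

# Statevector |00...0> for n qubits: 2**n complex amplitudes.
def n_qubit_zero_state(n):
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0
    return state

# Hadamard gate: puts a single qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT gate: flips the target qubit when the control qubit is |1>,
# which is what entangles the two qubits.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = n_qubit_zero_state(2)   # |00>
state = np.kron(H, I) @ state   # superposition on the first qubit
state = CNOT @ state            # entangle: Bell state (|00> + |11>)/sqrt(2)
print(np.round(state, 3))       # [0.707, 0, 0, 0.707]

# Why classical simulation breaks down: the amplitude count doubles per qubit.
print(f"Amplitudes needed to describe 50 qubits: {2**50:,}")
```

Running this prints the two equal amplitudes of the Bell state, and the final line makes the scaling argument plain: at 50 qubits, a classical simulator already needs on the order of a quadrillion amplitudes.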

Quantum computing also faces two major challenges: accuracy and scaling. Errors are introduced both through the instability of qubits and through potential interference from neighboring qubits. Maintaining the stability, or lifespan, of a qubit in a superposition state is challenging; it may be limited to a few milliseconds or even microseconds. As a result, error suppression, correction, and mitigation techniques are being developed to work both individually and together to increase computational accuracy. Error suppression performs front-end processing based on knowledge of the system and circuits to offset potential errors, such as altering the pulses that control the qubits. Error mitigation corrects errors in postprocessing based on a noise model. Error correction, on the other hand, uses many additional qubits to correct errors during execution. While error correction may be the most effective way to eliminate errors, it comes at a significant cost. Even with only error suppression and mitigation, however, quantum computing allows processing at a level that cannot be easily accomplished on even the largest classical supercomputers.
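As a rough illustration of why error correction demands so many extra qubits, the toy sketch below uses the classical 3-bit repetition code as an analogy: one logical bit is encoded into three physical bits, and a majority vote corrects any single flip. Real quantum codes such as the surface code are far more sophisticated and must handle errors without directly reading the qubits, so treat this as a conceptual illustration of the redundancy cost, not IBM's method.

```python
import random

# Encode 1 logical bit into 3 physical bits (the redundancy cost).
def encode(bit):
    return [bit, bit, bit]

# Flip each bit independently with probability p_flip (a toy noise model).
def noisy_channel(bits, p_flip):
    return [b ^ (random.random() < p_flip) for b in bits]

# Majority vote recovers the logical bit if at most one bit flipped.
def decode(bits):
    return int(sum(bits) >= 2)

trials, p = 100_000, 0.05
raw = sum(noisy_channel([0], p)[0] for _ in range(trials))
corrected = sum(decode(noisy_channel(encode(0), p)) for _ in range(trials))
print(f"error rate, unprotected bit: {raw / trials:.4f}")        # ~0.05
print(f"error rate, 3-bit code:      {corrected / trials:.4f}")  # ~0.007
```

The tradeoff the article describes shows up directly: tripling the physical bit count cuts the logical error rate from p to roughly 3p squared, and driving errors low enough for long computations multiplies the overhead further.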

Scaling quantum computers is also a significant challenge. While there are several different quantum solutions, many do not use standard CMOS manufacturing processes, which means they do not scale with the advanced semiconductor processes used for other high-end processors and accelerators. Additionally, the entire system needs to scale with the number of qubits, which means more wires connecting each individual qubit to the control logic, plus the associated cooling elements. Viewed outside its refrigerator, a current quantum computer looks more like a jumble of tubes and wires than a silicon-based system. Scaling these systems is not an easy task.

If quantum computing is so fraught with challenges, the natural question is why I believe we are on the cusp of major advances. One reason is the level of investment in quantum computing. The prospect of a single computer that can outperform many supercomputers is so valuable that the scientific community, technology industry, governments, and enterprises are investing billions in the development and use of quantum computing. This includes industry leaders like Alibaba, Amazon, IBM, Intel, Google, Honeywell, Microsoft, Nvidia, and Toshiba, among many other companies. Likewise, the US Government has a National Quantum Initiative to accelerate quantum research and development for the economic and national security of the United States. A key example of this investment is evident walking through the IBM quantum data center in Poughkeepsie, New York, which I had the opportunity to tour earlier this year.

Another reason is the continued advancement in quantum chips, control logic, systems, and software. These advancements are especially evident in the development tools for error suppression, mitigation, and correction. For example, IBM holds the lead in quantum scaling with the 433-qubit Osprey processor introduced in 2022 and is slated to introduce the 1,121-qubit Condor processor later this year. According to IBM's quantum processor roadmap, the number of qubits will increase by approximately 2-3x every year. IBM is also networking quantum computers together to further increase qubit capacity, and it has stated a goal of 100,000-qubit systems by 2033. Industry and academia are already working on practical applications with current quantum computers, and this development will accelerate as qubit capacity increases in the latter half of this decade.
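As a back-of-the-envelope check on that roadmap math, the sketch below projects qubit counts forward from the 433-qubit Osprey under a hypothetical constant yearly multiplier. The multipliers are assumptions taken from the 2-3x figure above; IBM's actual path to 100,000 qubits also relies on networking multiple processors together, so the dates are illustrative only.

```python
# Years of compounding needed to grow from `start` qubits to `target`
# at a fixed yearly multiplier (a simplifying assumption).
def years_to_reach(target, start=433, multiplier=2.0):
    years, qubits = 0, start
    while qubits < target:
        qubits *= multiplier
        years += 1
    return years

for m in (2.0, 3.0):
    y = years_to_reach(100_000, multiplier=m)
    print(f"at {m:.0f}x per year: ~{y} years from 2022 -> around {2022 + y}")
# 2x/year -> ~8 years (around 2030); 3x/year -> ~5 years (around 2027).
# IBM's stated 2033 goal is more conservative than this naive compounding,
# reflecting that later gains come partly from networked systems.
```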

The final reason, and the one I believe will be critical to the next step in quantum computing, is AI. Thus far, the focus has been on integrating classical computers with quantum computers. However, AI holds the potential both to improve the capabilities and performance of quantum computers and to be improved by them, but the work in this area is just beginning.

When and how will quantum computing become available for practical applications? With thousands of universities, research organizations, and enterprises already learning and experimenting with quantum computing, the answer is now, for some limited applications. As published in the scientific journal Nature, IBM partnered with UC Berkeley to demonstrate the ability of a quantum computer with just 127 qubits to outperform classical computers in materials modeling. However, IBM believes that the 100,000-qubit capacity level will drive an inflection point for the industry. With quantum systems networked together, that threshold is rapidly approaching.

How the quantum computing industry will take shape is a little easier to predict. Because of the high investment required in supporting systems and infrastructure, quantum computing is likely to be a cloud service provided by the leading hyperscalers and/or technology providers for the vast majority of the market, at least in the foreseeable future. There will be some university and enterprise installations, but these are likely to be few and far between.

Given the amount of quantum computing investment, advancement, and activity, the industry is set for a dynamic change similar to that caused by AI: increased performance, functionality, and intelligence. It also comes with the same challenges presented by AI, such as security, as outlined in the recent Quantum Safe Cryptography article. But just like AI, quantum computing is coming. You might say that quantum computing is where AI was in 2015: fascinating but not widely utilized. Fast forward just five years, and AI was being integrated into almost every platform and application. In just five years, quantum computing could take computing and humanity to a new level of knowledge and understanding.

The author and members of the Tirias Research staff do not hold equity positions in any of the companies mentioned. Tirias Research tracks and consults for companies throughout the electronics ecosystem, from semiconductors to systems and sensors to the cloud. Tirias Research has consulted for IBM, Intel, Microsoft, Nvidia, Toshiba, and companies throughout the quantum computing ecosystem.

Jim is a principal analyst and partner at TIRIAS Research, a high-tech research and advisory firm consisting of experienced analysts. Jim has over 30 years of technical and business experience with leading high-tech companies including Intel, Motorola, ON Semiconductor, STMicroelectronics, and General Dynamics Space Systems. Jim focuses on the market inflection points where new technology, usage models and business models collide to drive innovation and growth.
