Category Archives: Quantum Computing
Quantum computers are like kaleidoscopes – why unusual metaphors help illustrate science and technology – The Conversation Indonesia
Quantum computing is like Forrest Gump's box of chocolates: You never know what you're gonna get. Quantum phenomena, the behavior of matter and energy at the atomic and subatomic levels, are not definite, one thing or another. They are opaque clouds of possibility or, more precisely, probabilities. When someone observes a quantum system, it loses its quantum-ness and collapses into a definite state.
Quantum phenomena are mysterious and often counterintuitive. This makes quantum computing difficult to understand. People naturally reach for the familiar to attempt to explain the unfamiliar, and for quantum computing this usually means using traditional binary computing as a metaphor. But explaining quantum computing this way leads to major conceptual confusion, because at a base level the two are entirely different animals.
This problem highlights the often mistaken belief that common metaphors are more useful than exotic ones when explaining new technologies. Sometimes the opposite approach is more useful. The freshness of the metaphor should match the novelty of the discovery.
The uniqueness of quantum computers calls for an unusual metaphor. As a communications researcher who studies technology, I believe that quantum computers can be better understood as kaleidoscopes.
The conceptual gap between classical and quantum computers is a wide chasm. Classical computers store and process information via transistors, which are electronic devices that take binary, deterministic states: one or zero, yes or no. Quantum computers, in contrast, handle information probabilistically at the atomic and subatomic levels.
Classical computers use the flow of electricity to sequentially open and close gates to record or manipulate information. Information flows through circuits, triggering actions through a series of switches that record information as ones and zeros. Using binary math, bits are the foundation of all things digital, from the apps on your phone to the account records at your bank and the Wi-Fi signals bouncing around your home.
In contrast, quantum computers use changes in the quantum states of atoms, ions, electrons or photons. Quantum computers link, or entangle, multiple quantum particles so that changes to one affect all the others. They then introduce interference patterns, like multiple stones tossed into a pond at the same time. Some waves combine to create higher peaks, while some waves and troughs combine to cancel each other out. Carefully calibrated interference patterns guide the quantum computer toward the solution of a problem.
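A toy sketch of this interference, using the standard single-qubit Hadamard gate and assuming basic NumPy, makes the cancellation visible:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
zero = np.array([1.0, 0.0])                   # qubit prepared in state |0>

superposed = H @ zero        # an equal mix of |0> and |1>
interfered = H @ superposed  # the two paths interfere

# Born rule: outcome probabilities are the squared magnitudes of amplitudes.
print(np.abs(superposed) ** 2)  # [0.5 0.5] -- like ripples spreading out
print(np.abs(interfered) ** 2)  # [1. 0.]  -- the |1> paths cancel each other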
The term bit is a metaphor. The word suggests that during calculations, a computer can break up large values into tiny ones, bits of information, which electronic devices such as transistors can more easily process.
Using metaphors like this has a cost, though. They are not perfect. Metaphors are incomplete comparisons that transfer knowledge from something people know well to something they are working to understand. The bit metaphor suggests, as common sense might, that computers deal with many different kinds of bits at once. In reality, the binary method does not: all bits are the same.
The smallest unit of a quantum computer is called the quantum bit, or qubit. But transferring the bit metaphor to quantum computing is even less adequate than using it for classical computing. Transferring a metaphor from one use to another blunts its effect.
The prevalent explanation of quantum computing is that while classical computers can store or process only a zero or one in a transistor or other computational unit, quantum computers supposedly store and handle both zero and one (and other values in between) at the same time through the process of superposition.
Superposition, however, does not store one or zero or any other number simultaneously. There is only an expectation that the values might be zero or one at the end of the computation. This quantum probability is the polar opposite of the binary method of storing information.
Driven by quantum science's uncertainty principle, the probability that a qubit stores a one or zero is like Schrödinger's cat, which can be either dead or alive, depending on when you observe it. But the two different values do not exist simultaneously during superposition. They exist only as probabilities, and an observer cannot determine when or how frequently those values existed before the observation ended the superposition.
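A toy simulation makes the point: sample an equal superposition repeatedly and each observation yields a single definite value, never both. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
amplitudes = np.array([1.0, 1.0]) / np.sqrt(2)  # superposition of |0> and |1>
probs = np.abs(amplitudes) ** 2                 # expectations, not stored values

# Each observation collapses the state to one definite outcome -- never both.
print(rng.choice([0, 1], size=12, p=probs))
```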
Leaving behind these challenges to using traditional binary computing metaphors means embracing new metaphors to explain quantum computing.
The kaleidoscope metaphor is particularly apt to explain quantum processes. Kaleidoscopes can create infinitely diverse yet orderly patterns using a limited number of colored glass beads, mirror-dividing walls and light. Rotating the kaleidoscope enhances the effect, generating an infinitely variable spectacle of fleeting colors and shapes.
The shapes not only change but can't be reversed. If you turn the kaleidoscope in the opposite direction, the imagery will generally remain the same, but the exact composition of each shape or even their structures will vary as the beads randomly mingle with each other. In other words, while the beads, light and mirrors could replicate some patterns shown before, these are never absolutely the same.
Using the kaleidoscope metaphor, the solution a quantum computer provides (the final pattern) depends on when you stop the computing process. Quantum computing isn't about guessing the state of any given particle but using mathematical models of how the interaction among many particles in various states creates patterns, called quantum correlations.
Each final pattern is the answer to a problem posed to the quantum computer, and what you get in a quantum computing operation is a probability that a certain configuration will result.
Metaphors make the unknown manageable, approachable and discoverable. Approximating the meaning of a surprising object or phenomenon by extending an existing metaphor is a method that is as old as calling the edge of an ax its bit and its flat end its butt. The two metaphors take something we understand well from everyday life and apply it to a technology that needs a specialized explanation of what it does. Calling the cutting edge of an ax a bit suggestively indicates what it does, adding the nuance that it changes the object it is applied to. When an ax shapes or splits a piece of wood, it takes a bite from it.
Metaphors, however, do much more than provide convenient labels and explanations of new processes. The words people use to describe new concepts change over time, expanding and taking on a life of their own.
When encountering dramatically different ideas, technologies or scientific phenomena, it's important to use fresh and striking terms as windows to open the mind and increase understanding. Scientists and engineers seeking to explain new concepts would do well to seek out originality and master metaphors; in other words, to think about words the way poets do.
Quantum control’s role in scaling quantum computing – McKinsey
June 14, 2024 | by Henning Soller and Niko Mohr, with Elisa Becker-Foss, Kamalika Dutta, Martina Gschwendtner, Mena Issler, and Ming Xu
Quantum computing can leverage the states of entangled qubits[1] to solve problems that classical computing cannot currently solve and to substantially improve existing solutions. These qubits, which are typically constructed from photons, atoms, or ions, can only be manipulated using specially engineered signals with precisely controlled energy that is barely above that of a vacuum and that changes within nanoseconds. This control system for qubits, referred to as quantum control, is a critical enabler of quantum computing because it ensures quantum algorithms perform with optimal efficiency and effectiveness.
While the performance and scaling limitations of current quantum control systems preclude large-scale quantum computing, several promising technological innovations may soon offer scalable control solutions.
A modern quantum computer comprises various hardware and software components, including quantum control components that require extensive space and span meters. In quantum systems, qubits interact with the environment, causing decoherence and decay of the encoded quantum information. Quantum gates (building blocks of quantum circuits) cannot be implemented perfectly at the physical system level, resulting in accumulated noise. Noise leads to decoherence, which lowers qubits' superposition and entanglement properties. Quantum control minimizes the quantum noise (for example, thermal fluctuations and electromagnetic interference) caused by the interaction between the quantum hardware and its surroundings. Quantum control also addresses noise by improving the physical isolation of qubits, using precise control techniques, and implementing quantum error correction codes. Control electronics use signals from the classical world to provide instructions for qubits, while readout electronics measure qubit states and transmit that information back to the classical world. Thus, the control layer in a quantum technology stack is often referred to as the interface between the quantum and classical worlds.
Components of the control layer include the following:
A superconducting- or spin-qubit-based computer, for example, includes physical components such as quantum chips, cryogenics (cooling electronics), and control and readout electronics.
Quantum computing requires precise control of qubits and manipulation of physical systems. This control is achieved via signals generated by microwaves, lasers, and optical fields or other techniques that support the underlying qubit type. A tailored quantum control system is needed to achieve optimal algorithm performance.
In the context of a quantum computing stack, control typically refers to the hardware and software system that connects the qubits to the application software used to solve real-world problems such as optimization and simulation (Exhibit 1).
At the top of the stack, software layers translate real-world problems into executable instructions for manipulating qubits. The software layer typically includes middleware (such as a quantum transpiler[2]) and control software comprising low-level system software that provides compilation, instrument control, signal generation, qubit calibration, and dynamical error suppression.[3] Below the software layer is the hardware layer, where high-speed electronics and physical components work together to send signals to and read signals from qubits and to protect qubits from noise. This is the layer where quantum control instructions are executed.
Quantum control hardware systems are highly specialized to accommodate the intricacies of qubits. Control hardware interfaces directly with qubits, generating and reading out extremely weak and rapidly changing electromagnetic signals that interact with qubits. To keep qubits functioning for as long as possible, control hardware systems must be capable of adapting in real time to stabilize the qubit state (feedback calibration) and keep qubits from decaying to a completely decoherent state[4] (quantum error correction).
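The error-correction idea can be illustrated classically. The sketch below uses a three-copy repetition code with majority voting; this is only an analogy, since real quantum error correction measures stabilizer syndromes rather than copying qubit states (which quantum mechanics forbids), but the redundancy intuition carries over:

```python
import random

def encode(bit):
    return [bit] * 3            # protect one logical bit with redundancy

def noisy_channel(bits, p=0.1):
    return [b ^ (random.random() < p) for b in bits]  # flip each bit w.p. p

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote corrects any single flip

random.seed(1)
trials = 100_000
failures = sum(decode(noisy_channel(encode(0))) != 0 for _ in range(trials))
print(failures / trials)  # close to 3p^2 - 2p^3 = 0.028, down from raw p = 0.1
```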
Although all quantum control hardware is based on similar fundamental principles, it can differ widely depending on the qubit technology with which it is designed to be used (Exhibit 2).
For example, photonic qubits operate at optical frequencies (similar to fiber internet), while superconducting qubits operate at microwave frequencies (similar to a fifth-generation network). Different types of hardware using laser technology or electronic circuits are needed to generate, manipulate, and transmit signals to and from these different qubit types. Additional hardware may be needed to provide environmental control. Cryostats, for example, cool superconducting qubits to keep them in a working state, and ion trap devices are used in trapped-ion qubit systems to confine ions using electromagnetic fields.
Quantum control is critical to enable fault-tolerant quantum computing: quantum computing in which as many errors as possible are prevented or suppressed. But realizing this capability on a large scale will require substantial innovation. Existing control systems are designed for a small number of qubits (1 to 1,000) and rely on customized calibration and dedicated resources for each qubit. A fault-tolerant quantum computer, on the other hand, needs to control 100,000 to 1,000,000 qubits simultaneously. Consequently, a transformative approach to quantum control design is essential.
Specifically, to achieve fault-tolerant quantum computing on a large scale, there must be advances to address issues with current state-of-the-art quantum control system performance and scalability, as detailed below.
Equipping quantum systems to perform at large scales will require the following:
The physical space requirements and power costs of current quantum computing systems restrict the number of qubits that can be controlled with existing architecture, thus hindering large-scale computing.
Challenges to overcoming these restrictions include the following:
Several technologies show promise for scaling quantum control, although many are still in early-research or prototyping stages (Exhibit 3).
Multiplexing could help reduce costs and prevent overheating. The cryogenic complementary metal-oxide-semiconductor (cryo-CMOS) approach also helps mitigate overheating; it is the most widely used approach across industries because it is currently the most straightforward way to add control lines, and it works well in a small-scale R&D setup. However, cryo-CMOS is close to reaching the maximum number of control lines, creating form factor and efficiency challenges to scaling. Even with improvements, the number of control lines would only be reduced by a few orders of magnitude, which is not sufficient for scaling to millions of qubits. Another option to address overheating is single-flux quantum technology, while optical links for microwave qubits can increase efficiency in interconnections as well as connect qubits between cryostats.
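The scaling pressure behind these approaches is easy to see with back-of-envelope arithmetic. In the sketch below, the lines-per-qubit count and the multiplexing factor are illustrative assumptions, not figures from the article:

```python
def control_lines(n_qubits, lines_per_qubit=3, mux_factor=1):
    """Rough count of signal lines into the cryostat; both parameters
    are illustrative assumptions, not vendor specifications."""
    return n_qubits * lines_per_qubit // mux_factor

for n in (1_000, 100_000, 1_000_000):
    print(f"{n:>9,} qubits: {control_lines(n):>9,} lines "
          f"({control_lines(n, mux_factor=100):>7,} with 100x multiplexing)")
# Even a generous 100x multiplexing factor leaves tens of thousands of
# lines at the million-qubit scale -- still a hard engineering problem.
```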
Whether weighing options to supply quantum control solutions or to invest in or integrate quantum technologies into companies in other sectors, leaders can better position their organizations for success by starting with a well-informed and strategically focused plan.
The first strategic decision leaders in the quantum control sector must make is whether to buy or build their solutions. While various levels of quantum control solutions can be sourced from vendors, few companies specialize in control, and full-stack solutions for quantum computing are largely unavailable. The prevailing view is that vendors can offer considerable advantages in jump-starting quantum computing operations, especially those with complex and large-scale systems. Nevertheless, a lack of industrial standardization means that switching between quantum control vendors could result in additional costs down the road. Consequently, many leading quantum computing players opt to build their own quantum control.
Ideally, business leaders also determine early on which parts of the quantum tech stack to focus their research capacities on and how to benchmark their technology. To develop capabilities and excel in quantum control, it is important to establish KPIs that are tailored to measure how effectively quantum control systems perform to achieve specific goals, such as improved qubit fidelity.[5] This allows for the continuous optimization and refinement of quantum control techniques to improve overall system performance and scalability.
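As one concrete example of such a KPI, the fidelity between a desired and an achieved pure state can be computed as their squared overlap. A minimal sketch (a standard textbook definition, not a vendor tool):

```python
import numpy as np

def state_fidelity(psi, phi):
    """F = |<psi|phi>|^2, the squared overlap between two pure states."""
    return abs(np.vdot(psi, phi)) ** 2

desired = np.array([1.0, 0.0])                     # the target state |0>
achieved = np.array([np.cos(0.05), np.sin(0.05)])  # slightly over-rotated
print(round(state_fidelity(desired, achieved), 5)) # ~0.9975, a gate-quality KPI
```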
Quantum control is key to creating business value. Thus, the maturity and scalability of control solutions are the chief considerations for leaders exploring business development related to quantum computing, quantum solutions integration, and quantum technologies investment. In addition to scalability (the key criterion for control solutions), leaders will need to consider and address the other control technology challenges noted previously. And as control technologies mature from innovations to large-scale solutions, establishing metrics for benchmarking them will be essential to assess, for example, ease of integration, cost effectiveness, error-suppression effectiveness, software offerings, and the possibility of standardizing across qubit technologies.
Finally, given the shortage of quantum talent, recruiting and developing the highly specialized capabilities needed for each layer of the quantum stack is a top priority to ensure quantum control systems are properly developed and maintained.
Henning Soller is a partner in McKinsey's Frankfurt office, and Niko Mohr is a partner in the Düsseldorf office. Elisa Becker-Foss is a consultant in the New York office, Kamalika Dutta is a consultant in the Berlin office, Martina Gschwendtner is a consultant in the Munich office, Mena Issler is an associate partner in the Bay Area office, and Ming Xu is a consultant in the Stamford office.
[1] Entangled qubits are qubits that remain in a correlated state in which changes to one affect the other, even if they are separated by long distances. This property can enable massive performance boosts in information processing.
[2] A quantum transpiler converts code from one quantum language to another while preserving and optimizing functionality to make algorithms and circuits portable between systems and devices.
[3] Dynamical error suppression is one approach to suppressing quantum error and involves the periodic application of control pulse sequences to negate noise.
[4] A qubit in a decoherent state is losing encoded quantum information (superposition and entanglement properties).
[5] Qubit fidelity is a measure of the accuracy of a qubit's state or the difference between its current state and the desired state.
Riverlane, the company making quantum computing useful far sooner than anticipated – Maddyness
You have recently been selected for Tech Nation's Future Fifty programme. What are your expectations and how does it feel to be identified as a future unicorn?
We're delighted to have been selected as the sole representative of a rich and diverse UK quantum tech industry. The quantum computing market is expected to grow to $28-72B over the next decade, so I expect many unicorns to emerge, and we certainly hope to be one of them. Tech Nation has an excellent track record of picking and supporting high-growth leaders. We're excited to make the most of the opportunities the programme offers.
Quantum computing is an amazing idea: the ability to harness the power of the atom to perform computation will transform many industries. Back in 2016, I was a research fellow at the University of Cambridge, and at that time, the majority view was that building a useful quantum computer wouldn't be possible in our lifetime - it was simply too big and too hard a problem. I disagreed but needed to validate this. By meeting with teams building quantum computers, I saw an amazing rate of progress, a 'Moore's Law' of quantum computing, with a doubling in power every two years, just like classical computers have done. That was the catalyst moment for me, and it became clear that if that trend continued, the next big problem would be quantum error correction. I founded Riverlane to make useful quantum computers a reality sooner!
We're building a technology called the quantum error correction stack, which corrects errors in quantum computers. Today's quantum computers can only perform a thousand or so operations before they fail under the weight of these errors. Quantum error correction technology will ultimately enable trillions of error-free operations, unlocking their full and transformative potential.
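The "thousand or so operations" figure is straightforward arithmetic. Assuming an illustrative per-operation error rate of about 0.1% (a ballpark, not a Riverlane number), the chance a circuit runs error-free collapses with depth:

```python
# Back-of-envelope arithmetic with an assumed per-operation error rate.
p = 0.001  # ~0.1% per gate, a rough ballpark for today's better hardware
for k in (1_000, 10_000, 100_000):
    print(f"{k:>7,} operations -> error-free probability {(1 - p) ** k:.2e}")
# 1,000 ops already fails roughly two times in three; the trillions of
# operations useful algorithms need are hopeless unless error correction
# drives the effective error rate many orders of magnitude lower.
```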
Implementing quantum error correction to achieve this milestone requires specialised knowledge of quantum science, engineering, software development and chip manufacturing. That makes quantum error correction systems difficult for each quantum computer maker to develop independently. Our strategy is not dissimilar to NVIDIA in providing a core enabling technology for an entirely new computing category.
When Riverlane was founded in 2016, there was a lot of focus on developing software applications to solve novel problems on small-scale quantum computers, a phase known as the noisy intermediate-scale quantum (NISQ) era. However, after the limits of NISQ became apparent due to considerable error rates hindering calculations, the industry shifted focus to building large and reliable quantum computers that could overcome the error problem.
This is something we've been working on from the start through the invention of our quantum error correction stack, but we're now doubling down on its development to meet this growing demand from the industry. An important part of this has been scaling our team to nearly 100 people across our two offices in Cambridge (UK) and Boston (US) - two world-leading centres for quantum computing research and development.
It's a common misconception that you need a PhD in quantum physics or computer science to work in our field. The reality is we need people with a wide range of skills and from the broadest possible mix of backgrounds and demographics. Collectively, we're a group that loves tackling hard and complex problems, if not the hardest! This requires a culture that blends extremes of creativity, curiosity, problem-solving and analytical skills, plus an alchemy of driving urgency and zen-like patience. I'm also proud of the extraordinary openness and diversity of our team, including a healthy gender mix in a field where this is the exception not the norm.
I've been fascinated with quantum physics since I was a student. Back then, the idea of building a computer that applied the unique properties of subatomic particles to transform our understanding of nature and the universe was pure science fiction. Building a company that is now achieving this feels almost miraculous. Building a company with the right mix of skills and shared focus to do this far faster than previously imaginable is brutally tricky and joyously rewarding in equal parts.
Last September, we launched the world's first quantum error correction chip. As the quantum computing industry develops, these chips will get better and better, faster and faster. They'll ultimately enable the quantum industry to scale beyond its current limitations to achieve its full potential to solve currently impossible problems in areas like healthcare, climate science and chemistry. At a recent quantum conference, someone stood up and said quantum computing will be bigger than fire. I wouldn't go quite that far! But they'll unlock a fundamental new era of human knowledge and that's super exciting.
Have a bold and ambitious vision that's underpinned by a proven insight and data. In my case, it was that the presumption that a quantum computer was simply too hard to ever build could be disproven and overcome. Once you have this, be ready to learn fast and pivot fast in your tactics but never lose sight of your goal.
I spend at least a third of my time travelling. Meeting global leaders in our field face to face to hear their ideas, track their progress and build partnerships is priceless. When I'm home, I'm lucky enough to live about a mile from our office in Cambridge. No matter the weather, I walk to and from work every day. Cambridge is a beautiful place - the thinking time and fresh air give me energy and a calm headspace.
Steve Brierley is the CEO of Riverlane.
Tech Nation's Future Fifty Programme is designed to support late-stage companies with access and growth opportunities. The programme has supported some of the UK's most prominent unicorns, including Monzo, Darktrace, Revolut, Starling, Skyscanner and Deliveroo.
Quantum Computing and AI: A Perfect Match? – InformationWeek
It's a marriage that could only happen in cyberspace -- quantum computing and artificial intelligence.
Quantum AI is a burgeoning computer science sector, dedicated to exploring the potential synergy that exists between quantum computing and AI, says Gushu Li, a professor at the University of Pennsylvania School of Engineering and Applied Science, in an email interview. "It seeks to apply principles from quantum mechanics to enhance AI algorithms." A growing number of researchers now believe that AI models developed with quantum computing will soon outpace classical computing AI development.
Quantum AI creates an intersection between quantum computing and artificial intelligence, observes Román Orús, chief scientific officer at quantum computing software development firm Multiverse Computing, via email. He notes that quantum computing has the potential to take AI to entirely new levels of performance. "For instance, it's possible to develop quantum neural networks that teach a quantum computer to detect anomalies, do image recognition, and other tasks." Orús adds that it's also possible to improve traditional AI methods by using quantum-inspired approaches to dramatically reduce the development and training costs of large language models (LLMs).
Combine the quantum physics properties of superposition and entanglement, which can perform limitless processes simultaneously, with machine learning and AI, and suddenly it's possible to do more than ever imagined, says Tom Patterson, emerging technology security lead at business advisory firm Accenture, via email. "Unfortunately, that includes being used by adversaries to crack our encryption and develop new and insidious ways to separate us from our information, valuables, and anything else we hold dear."
Still, Patterson is generally optimistic. Like ChatGPT, he expects quantum AI to arrive gradually, and then all at once. "While full use of an AI-relevant quantum computer remains years away, the benefits of thinking about AI with quantum information science capabilities are exciting and important today," he states. "The opportunities are here and now, and the future is brighter than ever with quantum AI."
For his part, Li believes that quantum AI's biggest initial impact will be in four specific areas:
Drug Discovery: Simulating molecules to design new drugs and materials with superior properties.
Financial Modeling: Optimizing complex financial portfolios and uncovering hidden trends in the market.
Materials Science: Developing new materials with specific properties for applications like superconductors or ultra-efficient solar cells.
Logistics and Optimization: Finding the most efficient routes for transportation and optimizing complex supply chains.
Quantum AI is already here, but it's a silent revolution, Orús says. "The first applications of quantum AI are finding commercial value, such as those related to LLMs, as well as in image recognition and prediction systems," he states. More quantum AI applications will become available as quantum computers grow more powerful. "It's expected that in two-to-three years there will be a broad range of industrial applications of quantum AI."
Yet the road ahead may be rocky, Li warns. "It's well known that quantum hardware suffers from noise that can destroy computation," he says. "Quantum error correction promises a potential solution, but that technology isn't yet available."
Meanwhile, while quantum AI algorithms are being developed, classical computing competitors are achieving new AI successes. "While progress is being made, it's prudent to acknowledge that the integration of quantum computing with AI is a complex endeavor that will unfold gradually," Li says.
Patterson notes that many of the most promising quantum AI breakthroughs aren't arriving from university and corporate research teams, but from various regional developer and support communities that closely mirror natural ecosystems. "Regions that have decided that quantum and AI are too big and too important to leave to one group or another have organized around providing everything progress demands -- from investment to science to academics to entrepreneurs, growth engines, and tier-one buyers," he says. "These regional ecosystems are where the magic happens with quantum AI."
GenAI and quantum computing are mind-blowing advances in computing technology, says Guy Harrison, enterprise architect at cybersecurity technology company OneSpan, in a recent email interview. "AI is a sophisticated software layer that emulates the very capabilities of human intelligence, while quantum computing is assembling the very building blocks of the universe to create a computing substrate," he explains. "We're pushing computing both into the realm of the mind and the realm of the sub-atomic."
The transition to quantum AI won't be optional, Orús warns, since current AI is fundamentally flawed due to excessive energy costs. New models and methods will be needed to lower energy demands and to make AI feasible in the long term. "Early adopters of quantum AI will get a competitive advantage and will survive, as opposed to those that do not adopt or adopt it too late."
The 3 Best Quantum Computing Stocks to Buy in June 2024 – InvestorPlace
Technology firms, both public and private, have been working hard to develop quantum computing technologies for decades. The reasons for that are straightforward. Quantum machines, which harness the quantum mechanics undergirding subatomic particles, have a number of advantages over classical computers. Tasks such as portfolio optimization and climate prediction, whose algorithms improve with added complexity, are better handled by quantum computers.
U.S. equities markets have surged with the rise of generative artificial intelligence (AI) and its potential to create enormous efficiencies and profits for firms across various industries. While AI has brought quantum computing back into the spotlight, a lack of practical ways to scale these complex products has severely dented the performance of pure-play quantum computing stocks, such as IonQ (NYSE:IONQ) and Rigetti Computing (NASDAQ:RGTI).
Fortunately, not every public company invested in quantum computing has seen doom and gloom. Below are the three best quantum computing stocks investors should buy in June.
International Business Machines (NYSE:IBM) is a legacy American technology business. It has its hands in everything from cloud infrastructure, artificial intelligence, and technology consulting services to quantum computers.
The firm committed to developing quantum computing technologies in the early 2000s and tends to publish new findings in the burgeoning field frequently. In December 2023, IBM released a new quantum chip system, Quantum System Two, that leverages the firm's Heron processor, which has 133 qubits. Qubits are analogous to bits on a classical computer. But instead of being confined to states of 0s and 1s, qubits, by way of superposition, can assume both states at the same time.
Moreover, what makes Quantum System Two particularly innovative is its use of both quantum and classical computing technologies. In a press release, IBM states, "It combines scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics." IBM believes the combination of quantum computation and communication with classical computing resources can create a scalable quantum machine.
IBM's innovations in quantum computing technologies as well as AI have not gone unnoticed either. Shares have risen 31.3% over the past 12 months. The computing giant's relatively cheap valuation coupled with its exposure to novel, high-growth fields could boost the value of its shares in the long term.
Investors have given Nvidia (NASDAQ:NVDA) attention and praise over the past 12 months due to its critical role in AI computing technologies. The chipmaker's advanced GPUs, including the H100 and H200 processors, are some of the most coveted chips on the market. The new Blackwell chips, coming to the market in the second half of 2024, bring to the table even better performance.
Though Nvidia's prowess in the world of AI captures much of the headlines, the firm has already made inroads into the next stage of computing. In 2023, Nvidia announced a new quantum system in conjunction with startup Quantum Machines. It leverages what Nvidia calls the Grace Hopper Superchip (GH200) as well as the chipmaker's advanced CUDA Quantum (CUDA-Q) developer software.
In 2024, Nvidia released its Quantum Cloud platform, which allows users to build and test quantum computing algorithms in the cloud. The chipmakers GPUs and its open-source CUDA platform will likely be essential to scaling up the quantum computing space.
Nvidia's share price has surged 214.2% over the past 12 months.
Quantum computers are complex machines that require all kinds of components. Furthermore, quantum systems must be kept at extremely low temperatures in order to operate efficiently.
FormFactor (NASDAQ:FORM) specializes in developing cryogenic systems, which are designed to operate at very low temperatures. The firm provides everything from wafer-testing probes and low-vibration probe stations to sophisticated refrigerators called cryostats. Also, the firm's analytical probe tools are useful for developing advanced chips, such as NAND flash memory.
With quantum computing systems and advanced memory chips in greater demand these days, FormFactor could see revenues and earnings rise in the near and medium terms. FormFactor's share price has surged 77.5% over the past 12 months, underscoring that investors are taking notice of the company's long-term value.
At the beginning of May, FormFactor released first quarter results for fiscal year 2024 and topped revenue estimates while EPS came in line with market expectations. The firm expects strong demand for advanced memory chips, such as DRAM, will help propel revenue growth in the following quarters.
On the date of publication, Tyrik Torres did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Tyrik Torres has been studying and participating in financial markets since he was in college, and he has particular passion for helping people understand complex systems. His areas of expertise are semiconductor and enterprise software equities. He has work experience in both investing (public and private markets) and investment banking.
Better Qubits: Quantum Breakthroughs Powered by Silicon Carbide – SciTechDaily
By U.S. Department of Energy June 14, 2024
Artist's representation of the formation pathway of vacancy complexes for spin-based qubits in the silicon carbide host lattice and, to the right, the associated energy landscape. Credit: University of Chicago
Quantum computers, leveraging the unique properties of qubits, outperform classical systems because qubits can exist in multiple states simultaneously. Focused research on silicon carbide aims to optimize qubits for scalable application, with studies revealing new methods to control and enhance their performance. This could lead to breakthroughs in large-scale quantum computing and sensor technologies.
While conventional computers use classical bits for calculations, quantum computers use quantum bits, or qubits, instead. While classical bits can have the values 0 or 1, qubits can exist in a mix of probabilities of both values at the same time. This makes quantum computing extremely powerful for problems conventional computers arent good at solving. To build large-scale quantum computers, researchers need to understand how to create and control materials that are suitable for industrial-scale manufacturing.
Semiconductors are very promising qubit materials. Semiconductors already make up the computer chips in cell phones, computers, medical equipment, and other applications. Certain types of atomic-scale defects, called vacancies, in the semiconductor silicon carbide (SiC) show promise as qubits. However, scientists have a limited understanding of how to generate and control these defects. By using a combination of atomic-level simulations, researchers were able to track how these vacancies form and behave.
Quantum computing could revolutionize our ability to answer challenging questions. Existing small scale quantum computers have given a glimpse of the technologys power. To build and deploy large-scale quantum computers, researchers need to know how to control qubits made of materials that make technical and economic sense for industry.
The research identified the stability and molecular pathways to create the desired vacancies for qubits and determine their electronic properties.
These advances will help the design and fabrication of spin-based qubits with atomic precision in semiconductor materials, ultimately accelerating the development of next-generation large-scale quantum computers and quantum sensors.
The next technological revolution in quantum information science requires researchers to deploy large-scale quantum computers that ideally can operate at room temperature. The realization and control of qubits in industrially relevant materials is key to achieving this goal.
In the work reported here, researchers studied qubits built from vacancies in silicon carbide (SiC) using various theoretical methods. Until now, researchers knew little about how to control and engineer the selective formation process for the vacancies. The involved barrier energies for vacancy migration and combination pose the most difficult challenges for theory and simulations.
In this study, a combination of state-of-the-art materials simulations and a neural-network-based sampling technique led researchers at the Department of Energy's (DOE) Midwest Center for Computational Materials (MICCoM) to discover the atomistic generation mechanism of qubits from spin defects in a wide-bandgap semiconductor.
The team showed the generation mechanism of qubits in SiC, a promising semiconductor with long qubit coherence times and all-optical spin initialization and read-out capabilities.
MICCoM is one of the DOE Computational Materials Sciences centers across the country that develops open-source, advanced software tools to help the scientific community model, simulate, and predict the fundamental properties and behavior of functional materials. The researchers involved in this study are from Argonne National Laboratory and the University of Chicago.
Reference: "Stability and molecular pathways to the formation of spin defects in silicon carbide" by Elizabeth M. Y. Lee, Alvin Yu, Juan J. de Pablo and Giulia Galli, 3 November 2021, Nature Communications. DOI: 10.1038/s41467-021-26419-0
This work was supported by the Department of Energy (DOE) Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division and is part of the Basic Energy Sciences Computational Materials Sciences Program in Theoretical Condensed Matter Physics. The computationally demanding simulations used several high-performance computing resources: Bebop in Argonne National Laboratory's Laboratory Computing Resource Center; the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility; and the University of Chicago's Research Computing Center. The team was awarded access to ALCF computing resources through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. Additional support was provided by NIH.
Quantum data assimilation: A quantum leap in weather prediction – EurekAlert
Image: The novel quantum data assimilation method can significantly reduce the computation time required for numerical weather prediction, enabling deeper understanding and improved predictions. Credit: Brett Jordan from Openverse (https://openverse.org/image/563410ca-1385-475c-a7f6-fd521f910623)
Data assimilation is a mathematical discipline that integrates observed data and numerical models to improve the interpretation and prediction of dynamical systems. It is a crucial component of earth sciences, particularly in numerical weather prediction (NWP). Data assimilation techniques have been widely investigated in NWP in the last two decades to refine the initial conditions of weather models by combining model forecasts and observational data. Most NWP centers around the world employ variational and ensemble-variational data assimilation methods, which iteratively reduce cost functions via gradient-based optimization. However, these methods require significant computational resources.
Recently, quantum computing has emerged as a new avenue of computational technology, offering a promising solution for overcoming the computational challenges of classical computers. Quantum computers can take advantage of quantum effects such as tunneling, superposition, and entanglement to significantly reduce computational demands. Quantum annealing machines, in particular, are powerful for solving optimization problems.
In a recent study, Professor Shunji Kotsuki from the Institute for Advanced Academic Research/Center for Environmental Remote Sensing/Research Institute of Disaster Medicine, Chiba University, along with his colleagues Fumitoshi Kawasaki from the Graduate School of Science and Engineering and Masanao Ohashi from the Center for Environmental Remote Sensing, developed a novel data assimilation technique designed for quantum annealing machines. "Our study introduces a novel quantum annealing approach to accelerate data assimilation, which is the main computational bottleneck for numerical weather predictions. With this algorithm, we successfully solved data assimilation on quantum annealers for the first time," explains Prof. Kotsuki. Their study has been published in the journal Nonlinear Processes in Geophysics on June 07, 2024.
In the study, the researchers focused on the four-dimensional variational data assimilation (4DVAR) method, one of the most widely used data assimilation methods in NWP systems. However, since 4DVAR is designed for classical computers, it cannot be directly used on quantum hardware. Prof. Kotsuki clarifies, "Unlike the conventional 4DVAR, which requires a cost function and its gradient, quantum annealers require only the cost function. However, the cost function must be represented by binary variables (0 or 1). Therefore, we reformulated the 4DVAR cost function, a quadratic unconstrained optimization (QUO) problem, into a quadratic unconstrained binary optimization (QUBO) problem, which quantum annealers can solve."
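To make the QUBO form concrete: an annealer minimizes a quadratic cost over binary variables. The toy problem below is solved by brute force and uses made-up coefficients; the paper's actual 4DVAR encoding is far larger and must also represent continuous variables as strings of bits:

```python
import itertools
import numpy as np

# A tiny, made-up QUBO: minimize x^T Q x over binary vectors x -- the form
# a quantum annealer accepts.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

def energy(x):
    x = np.asarray(x)
    return x @ Q @ x

best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 0, 1) -2.0 -- the couplings penalize adjacent 1s
```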
The researchers applied this QUBO approach to a series of 4DVAR experiments using a 40-variable Lorenz-96 model, which is a dynamical system commonly used to test data assimilation. They conducted the experiments using the D-Wave Advantage physical quantum annealer, or Phy-QA, and the Fixstars Amplify's simulated quantum annealer, or Sim-QA. Moreover, they tested the conventionally utilized quasi-Newton-based iterative approaches, using the Broyden-Fletcher-Goldfarb-Shanno formula, in solving linear and nonlinear QUO problems and compared their performance to that of quantum annealers.
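The Lorenz-96 model itself fits in a few lines. Below is a minimal sketch using a crude Euler integrator for illustration; production experiments typically use higher-order schemes such as Runge-Kutta:

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

rng = np.random.default_rng(0)
x = 8.0 + 0.01 * rng.standard_normal(40)  # 40 variables, slightly perturbed
for _ in range(2000):                     # crude Euler steps, dt = 0.005
    x = x + 0.005 * lorenz96_tendency(x)
print(np.round(x[:5], 3))  # a chaotic trajectory to test assimilation against
```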
The results revealed that quantum annealers produced analysis with comparable accuracy to conventional quasi-Newton-based approaches but in a fraction of the time they took. The D-Wave's Phy-QA required less than 0.05 seconds for computation, much faster than conventional approaches. However, it also exhibited slightly larger root mean square errors, which the researchers attributed to the inherent stochastic quantum effects. To address this, they found that reading out multiple solutions from the quantum annealer improved stability and accuracy. They also noted that the scaling factor for quantum data assimilation, which is important for regulating the analysis accuracy, was different for the D-Wave Phy-QA and the Sim-QA, owing to the stochastic quantum effects associated with the former annealer.
These findings signify the role of quantum computers in reducing the computational cost of data assimilation. "Our approach could revolutionize future NWP systems, enabling a deeper understanding and improved predictions with much less computational time. In addition, it has the potential to advance the practical applications of quantum annealers in solving complex optimization problems in earth science," remarks Prof. Kotsuki.
Overall, the proposed innovative method holds great promise for inspiring future applications of quantum computers in advancing data assimilation, potentially leading to more accurate weather predictions.
About Professor Shunji Kotsuki
Dr. Shunji Kotsuki is currently a Professor at the Institute for Advanced Academic Research (IAAR), Chiba University, leading "Environmental Prediction Science." He received his B.S. (2009), M.S. (2011), and Ph.D. (2013) degrees in civil engineering from Kyoto University. He has over 40 publications and received over 500 citations. Dr. Kotsuki is a leading scientist in data assimilation, deep learning numerical weather prediction with over ten years of research experience in the development of the global atmospheric data assimilation system (a.k.a. NICAM-LETKF). His research interests include data assimilation mathematics, model parameter estimation, observation diagnosis including impact estimates, satellite data analysis, hydrological modeling, and atmospheric and hydrological disaster predictions. He is currently the project manager for Goal 8 of Japan's Moonshot Program, where he leads an interdisciplinary research team. This team includes experts in meteorology, disaster mathematics, information science, computer vision, ethics, and legal studies, all working together to achieve a weather-controlled society.
Journal: Nonlinear Processes in Geophysics
Method of Research: Computational simulation/modeling
Subject of Research: Not applicable
Article Title: Quantum Data Assimilation: A New Approach to Solve Data Assimilation on Quantum Annealers
Article Publication Date: 7-Jun-2024
COI Statement: The authors have no competing interests to declare.
Quantum, AI Combine to Transform Energy Generation, AI Summit London – AI Business
The electrical grid is very complicated. Nobody ever thinks about it until it doesn't work. But it is critical infrastructure that runs minute to minute: energy being consumed now was generated milliseconds ago, somewhere far away, and instantaneously shot through power lines and delivered.
This gets more complicated when locally generated sustainable energy joins the mix, pushing it beyond the capabilities of classical computing solutions. Home energy supplier E.ON is trialing quantum computer solutions to manage this future grid.
Speaking at the AI Summit London, E.ON chief quantum scientist Corey O'Meara explained the challenges presented by future decentralized grids.
"The way grids are changing now is, if buildings have solar panels on the roofs, you want to use that renewable energy yourself, or you might want to inject that back into the grid to power your neighbor's house," he said.
This decentralized energy production and peer-to-peer energy-sharing model presents a massive overhead for an aging grid that was never meant to be digital. E.ON is working on solving this renewable energy integration optimization problem using quantum computing.
E.ON also uses AI extensively and some functions could in the future be enhanced using quantum computing. An important example is AI-driven predictive maintenance for power plants.
"Power plants are complex objects that have thousands of sensors that measure and monitor factors such as temperatures and pressures and store the data in the cloud. We have AI solutions to analyze them to make sure that they're functioning correctly," said O'Meara.
"We published a paper where we invented a novel anomaly detection algorithm using quantum computing as a subroutine. We used it with our gas turbine data as well as academic benchmark data sets from the computer science field and found that the quantum-augmented solution did perform better but only for certain metrics."
E.ON plans to develop this trial into an integrated quantum software solution that could run on today's noisy, intermediate-scale quantum computers rather than waiting for next-generation fully error-corrected devices.
Quantum Computers May Break Bitcoin by 2030, But We Won’t Know About It – Cryptonews
Last updated: June 13, 2024 09:00 EDT
Quantum computers might sound like another buzzword in the tech world, yet their threat to cryptocurrency is very real and approaching fast. Scientists may differ on the timeline, but they all agree: Q-day is not a matter of if, but when.
We've spoken to quantum experts around the world to hear the latest estimates on when it will happen, what can be done to protect cryptocurrency, and whether these powerful machines could somehow benefit the crypto world.
Unlike traditional computers, which use bits as the smallest unit of data, each bit being a 1 or a 0, quantum computers use quantum bits, or qubits. These qubits can exist in 0 and 1 states or in multiple states at once, a property called superposition.
This allows quantum computers to perform calculations simultaneously and process large amounts of data much faster than standard computers.
Because quantum computers can hold and process many possible outcomes at once, they reduce the time needed to solve problems that depend on trying many different solutions, such as factoring large numbers, which is the foundation of most cryptocurrency encryption.
Factoring large numbers, or integer factorization, is a mathematical process of breaking down a large number into smaller, simpler numbers called factors, which, when multiplied together, result in the original number. The process is called prime factorization if these integers are further restricted to prime numbers.
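Trial division, the naive classical approach, shows why this is hard: the work grows with the square root of the number, which is exponential in its digit count. A toy sketch:

```python
def smallest_factor(n):
    """Classical trial division: cost scales with sqrt(n), which is
    exponential in the number of digits of n."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n  # n itself is prime

print(smallest_factor(15))          # 3 (so 15 = 3 * 5) -- instant
print(smallest_factor(2**31 - 1))   # 2147483647 is prime; still fast here,
# but a 617-digit RSA-2048 modulus would outlast the universe this way,
# while Shor's algorithm would need only polynomial time.
```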
In cryptocurrency, security heavily relies on the mathematical relationship between private and public keys. A public key is a long string of characters associated with the wallet address. It can be shared openly. A private key, used to sign transactions, must remain confidential. This mathematical relationship is one-way, meaning that a public key can be derived from the private key but not the other way around.
Itan Barmes, who is the Global quantum cyber readiness capability lead at Deloitte, explained in a conversation with Cryptonews:
The quantum computer breaks this one-way relationship between the two. So, if you have someones public key, you can calculate their private key, impersonate them, transfer their funds elsewhere.
The task is currently nearly impossible for conventional computers. However, in 1994, mathematician Peter Shor showed that a quantum computer could solve the factoring problem much faster. Shor's algorithm can also solve the Discrete Logarithm Problem, which is the basis for the security of most blockchains. This means if such a powerful quantum computer existed, it could break the cryptocurrency security model.
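A toy version of that one-way relationship can be written with modular exponentiation (illustrative numbers; real wallets use 256-bit elliptic curves, but the asymmetry is the same):

```python
# Illustrative numbers only: real wallets use 256-bit elliptic curves.
p = 2_147_483_647      # a small prime modulus (2^31 - 1)
g = 7                  # a generator modulo p
private_key = 1_234_567

public_key = pow(g, private_key, p)   # easy direction: microseconds

def brute_force_dlog(target):
    """Recover the exponent by exhaustive search -- hopeless at real sizes."""
    acc = 1
    for k in range(p - 1):
        if acc == target:
            return k
        acc = acc * g % p
    return None

print(public_key)                    # anyone may see this value
print(brute_force_dlog(public_key))  # 1234567 -- but only because p is tiny
```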
Not all cryptocurrencies would face the same level of risk from quantum attacks. In 2020, Itan Barmes and a team of Deloitte researchers examined the entire Bitcoin blockchain to determine how many coins were vulnerable. They discovered that about 25% of Bitcoins could be at risk.
Pay To Public Key (P2PK): These addresses directly use the public key, making them visible and vulnerable to quantum attacks.
Pay to Pubkey Hash (P2PKH): These addresses use a cryptographic hash of the public key. They don't expose the public key directly until coins are moved.
Vulnerable coins include those held in P2PK (Pay To Public Key) addresses, which directly expose the public key, making them easy targets for a quantum attack. Coins in reused P2PKH (Pay to Pubkey Hash) addresses are also at risk because these addresses display their public key when the owner moves the funds. This attack is called the storage attack, as it applies to coins residing in static addresses. Itan Barmes further explained:
A quantum attack only applies to specific coins, not everything. If we conducted the same research today, the percentage of vulnerable coins would be lower because the number of vulnerable addresses remains more or less the same, but due to mining, there are more coins in circulation.
Itan Barmes added that in addition to the storage attack, there is also an attack on active transactions, as the public key is exposed for the first time.
Such an attack must be performed within the mining time (for Bitcoin, around 10 minutes), which adds a requirement for the quantum computer to not only be powerful enough but also fast. This so-called transit attack is likely to be possible later than the storage attack due to this additional requirement.
Ideally, Bitcoin users should generate a new address for each transaction. Yet, recent research by BitMEX suggests that about 50% of transaction outputs still go to previously used addresses, which means the practice of address reuse is more common in Bitcoin transactions than we may think.
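The protection that unused P2PKH addresses enjoy comes from hashing. A simplified sketch follows; real addresses add a version byte and Base58Check encoding on top of this step, and the ripemd160 algorithm is only available where the local OpenSSL build provides it:

```python
import hashlib

# Simplified HASH160 step behind P2PKH addresses (version byte and
# Base58Check omitted); "ripemd160" requires OpenSSL support.
pubkey = bytes.fromhex("02" + "11" * 32)  # placeholder 33-byte compressed key

hash160 = hashlib.new("ripemd160", hashlib.sha256(pubkey).digest()).digest()
print(hash160.hex())  # the address commits to this hash, not to the key

# A quantum attacker needs the public key itself, which a P2PKH address
# reveals only when the coins are spent -- hence the narrow, ~10-minute
# window of the "transit attack" described above.
```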
Are we nearing the point where quantum computers can pose a real threat? In 2017, a group of researchers, including Divesh Aggarwal and Gavin Brennen, published an article warning that the elliptic curve signature scheme used by Bitcoin could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates.
Cryptonews reached out to the authors to ask whether their estimation has shifted. Gavin Brennen from Macquarie University in Australia replied that although a lot has changed in quantum computing space since then, the basic message is still the same:
Quantum computers pose a threat to blockchains, primarily by attacks on digital signatures, and cryptocurrencies should get started sooner rather than later to upgrade their systems to use post-quantum cryptography before their asset valuations are threatened.
To be able to break cryptocurrency security, quantum computers will likely need thousands, if not millions, of qubits. Currently, the most advanced machines have around 1000.
Another critical challenge is error reduction. Quantum bits are highly sensitive to their environment; even the slightest disturbance, like a change in temperature or vibration, can cause errors in computations, a problem known as quantum decoherence.
Dozens of companies, both public and private, are now actively advancing the development of large quantum computers. IBM has ambitious plans to build a 100,000-qubit chipset and 100 million gates by the end of this decade.
PsiQuantum aims to achieve 1 million photonic qubits within the same timeframe. Quantum gate fidelities and quantum error correction have also significantly advanced. Gavin Brennen continued:
What all this means is that estimates on the size of quantum computers needed to crack the 256-bit elliptic curve digital signatures used in Bitcoin have dropped from 10-20 million qubits to around a million. One article published by the French quantum startup Alice & Bob estimates that it could be cracked with 126,000 physical qubits, though that does assume a highly specialized error model for the quantum computer. In my opinion, a plausible timeline for cracking 256-bit digital signatures is by the mid-2030s.
Gavin Brennen added that substantial technological improvements would be required to reduce all types of gate errors, connect modules, and combine fast classical and quantum control, which is a challenging but surmountable problem.
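To see where the around-a-million figure comes from, here is a back-of-the-envelope sketch. The roughly 2,330 logical qubits needed to run Shor's algorithm against a 256-bit elliptic curve is a published resource estimate; the surface-code distance below is an assumption chosen for illustration, not a figure from Brennen or the Alice & Bob paper.

```python
# Back-of-the-envelope surface-code overhead (illustrative assumptions).
logical_qubits = 2330        # published logical-qubit estimate for Shor's
                             # algorithm on a 256-bit elliptic curve
code_distance = 15           # assumed code distance for the error budget
physical_per_logical = 2 * code_distance ** 2  # standard surface-code scaling

total = logical_qubits * physical_per_logical
print(f"{total:,} physical qubits")  # ~1,048,500, i.e. around a million
```

A lower assumed gate-error rate, or a more specialized error model like the one in the Alice & Bob estimate, shrinks the required code distance and with it the physical-qubit count.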
Yet, if quantum technology becomes powerful enough to break cryptocurrency security, we may not even know about it, believes Marcos Allende, a quantum physicist and CTO of the LACChain Global Alliance. In an email conversation with Cryptonews, Allende wrote:
What is certain is that those who reach that power first will use it silently, making it impossible to guess that selected hackings are happening because of having quantum computers.
Many scientists remain skeptical about the quantum threat to cryptocurrency. Winfried Hensinger, a physicist at the University of Sussex in Brighton, UK, speaking to Nature magazine, dismissed today's quantum computers bluntly: "They're all terrible. They can't do anything useful."
Several challenges keep quantum computing from reaching its full potential. The delicate nature of qubits makes it difficult to maintain them in a quantum state for extended periods. Another challenge is cooling requirements. Many quantum processors must operate at temperatures close to absolute zero, which means they need complicated and costly refrigeration technology. Finally, the quantum systems would need to be integrated with the existing classical ones.
"Just having 200 million qubits not connected to each other is not going to do anything. There are a lot of fundamental physics problems that need to be resolved before we get there. We are still very much at the beginning. But even in the past year, there's been tremendous improvement. The technology can accelerate in a way that all the timelines will be much shorter than we expect," Itan Barmes told Cryptonews.
Tommie van der Bosch, Partner at Deloitte and Blockchain & Digital Asset Leader for Deloitte North and South Europe, believes the question is not if quantum computing will break cryptocurrency security but when: "The fact that it's a possibility is enough to start taking action. You should have a plan."
Indeed, this year several key crypto companies and the World Economic Forum (WEF) have shared concerns about the implications of quantum computing on cryptocurrency security.
The WEF, in its post published in May, warned that central bank digital currency (CBDC) could become a prime target for quantum attacks. Ripple's recent report has also said that quantum computers could break the digital signatures that currently protect blockchain assets.
Earlier this year, Ethereum co-founder Vitalik Buterin suggested the Ethereum blockchain would need to undergo a recovery fork to avoid a scenario in which bad actors already have access to quantum computers and are able to use them to steal users' funds.
To protect against these potential quantum attacks, blockchain systems will need to integrate post-quantum cryptographic algorithms. However, incorporating them into existing blockchain protocols is not easy.
New cryptographic methods must first be developed, tested, and standardized. This process can take years and requires the consensus of the cryptographic community to ensure the new methods are secure and efficient.
In 2016, the National Institute of Standards and Technology (NIST) started a project to set new standards for post-quantum cryptography. The project aims to finalize these standards later this year. In 2022, three digital signature methods, CRYSTALS-Dilithium, FALCON, and SPHINCS+, were chosen for standardization.
Once standardized, these new cryptographic algorithms need to be implemented within a blockchain's existing framework. After that, all network participants need to adopt the updated protocol.
Itan Barmes explained: "Let's say someone could tell us exactly the date, three years from now, when we will have these kinds of quantum computers. How quickly do you think we can change the Bitcoin protocol to make it resilient to these attacks?" Bitcoin's decentralized governance can turn out to be a double-edged sword, preventing timely action.
Quantum-resistant algorithms often require more processing power and larger key sizes, which could lead to performance issues on the blockchain. These include slower transaction times and increased computational requirements for mining and verification processes.
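The size penalty is easy to quantify. The figures below are the published public-key and signature sizes (in bytes) for the smallest parameter sets of the three NIST-selected schemes, compared against the compressed-key ECDSA signatures Bitcoin uses today; the comparison script itself is only illustrative.

```python
# Published public-key and signature sizes in bytes (smallest parameter sets).
schemes = {
    "ECDSA (secp256k1)":   {"pubkey": 33,   "sig": 72},
    "CRYSTALS-Dilithium2": {"pubkey": 1312, "sig": 2420},
    "Falcon-512":          {"pubkey": 897,  "sig": 666},
    "SPHINCS+-128s":       {"pubkey": 32,   "sig": 7856},
}

baseline = schemes["ECDSA (secp256k1)"]["sig"]
for name, s in schemes.items():
    print(f"{name:20}  pk={s['pubkey']:5} B  sig={s['sig']:5} B  "
          f"({s['sig'] / baseline:5.1f}x ECDSA signature)")
```

Every transaction carries at least one signature, so a roughly 10x to 100x blow-up in signature size translates directly into larger blocks, more bandwidth, and more storage.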
Tommie van der Bosch told Cryptonews that, ultimately, the rise of quantum computing could affect the entire economic model of cryptocurrencies.
Coins that upgrade to quantum-resistant protocols in time might gain a competitive advantage. Investors and users could prefer these quantum-safe cryptocurrencies, as they may see them as more secure long-term holdings. This shift could lead to an increase in demand for such cryptocurrencies, potentially enhancing their value and market share compared to those that are slower to adapt. Van der Bosch continued:
Let's draw a parallel with the banking system. We've all seen the effects of a bank collapsing, or even the rumor of one. Your money suddenly seems at risk. How quickly do people shift their assets? It can trigger a domino effect.
The development of quantum computing could also bring regulatory changes. Regulators could start enforcing stricter standards around trading and custody of cryptocurrencies that havent updated their cryptographic protocols. Such measures would aim to protect investors from sinking funds into potentially vulnerable assets.
Itan Barmes remarked: "Not many people are aware that the cryptographic algorithm used in Bitcoin and essentially all cryptocurrencies is not part of the NIST recommendation (NIST SP 800-186). The issue is already present if organizations require compliance with NIST standards. The issue becomes even more complex if algorithms need to be replaced: whose responsibility is it to replace them?"
Could quantum computing actually benefit the cryptocurrency industry? Gavin Brennen suggests it might. In an email exchange with Cryptonews, Brennen discussed the development of quantum-enabled blockchains.
Quantum computers could accelerate mining, although Brennen notes that the improvement over traditional mining rigs would be limited and require quantum computers with hundreds of millions of qubits, far beyond current capabilities.
New computational problems have been suggested, like the boson sampling problem, that are slow for all types of classical computers but would be fast on a quantum device. Interestingly, the boson sampler is a small, specialized processor using photons of light that is not as powerful as a full quantum computer but much cheaper to build, and it solves a problem immune to ASIC speedups with an energy footprint that is orders of magnitude lower for reaching PoW consensus.
Currently, proof-of-work (PoW) requires vast amounts of electrical power for mining, raising concerns about sustainability and environmental impact. Boson sampling could become a greener alternative, significantly reducing the energy footprint of blockchain operations while maintaining security and efficiency.
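The classical hardness behind boson sampling comes from matrix permanents: a boson sampler's output probabilities are proportional to squared permanents of submatrices of the interferometer's unitary, and computing permanents exactly is #P-hard. Here is a tiny sketch of that exponential cost using Ryser's formula; it shows the underlying primitive, not a boson-sampling simulator.

```python
from itertools import combinations

def permanent(matrix):
    # Exact permanent via Ryser's formula: the number of terms grows
    # as 2^n, so the cost doubles with every row added to the matrix.
    n = len(matrix)
    total = 0.0
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            prod = 1.0
            for row in matrix:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** size * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10.0
```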
More here:
Quantum Computers May Break Bitcoin by 2030, But We Won't Know About It - Cryptonews
European telecoms leading the way in quantum tech adoption, report finds – TNW
Say "quantum technologies" and most people probably still imagine something decades into the future. But, as a new report released today demonstrates, quantum is already here, especially in the telecom industry.
After years of incremental progress confined to research institutions, the emerging quantum technology sector has begun to gather commercial momentum. While most of the developments have been related to the quantum computing domain and its future promises, there are many other use cases for quantum tech applicable already today.
Quantum communications, including networks and forms of encryption, are currently being commercialised by a growing number of major telecom industry players and startups throughout the world. And Europe has a major part to play.
According to a report released today by Infinity, a startup and ecosystem support branch of Quantum Delta NL, 32% of the 100 quantum startups, scaleups, and SMEs servicing the telecom and telecom infrastructure sector are based in continental Europe. Germany, the Netherlands, France, Switzerland, and Spain are the strongest ecosystems. An additional 14% are in the UK and Ireland.
In addition, 50% of the enterprises that serve as consumers of the technology are located in continental Europe, with a further 11% in the UK and Ireland. Indeed, there are already more than 25 quantum networks being deployed in Europe today.
This includes a commercial quantum network in London, launched through a partnership between BT and Toshiba Europe, and an EU-wide quantum communications network being developed by Deutsche Telekom and two consortia called Petrus and Nostradamus.
"Telecom companies are becoming a driving force for real-world adoption of quantum technology," said Teun van der Veen, Quantum Lead at the Netherlands Organisation for Applied Scientific Research (TNO). "They are at the forefront of integrating quantum into existing infrastructures, and for them it is all about addressing end-user needs."
Quantum networks utilise the unique properties of quantum mechanics, such as superposition and entanglement, to connect systems and transmit data securely. This is done through quantum channels, which can be implemented using optical fibres, free-space optics, or satellite links.
The promise of quantum networks and quantum encryption is that they would be near-impossible, if not entirely impossible, to hack, thus offering ultra-secure forms of communication.
As Infinity's report states, they can be used to establish quantum-secure links between data centres; between Earth, spacecraft, and satellites; between military and government sites; between trains and rail network control centres; between hospital and health care sites; and more.
Quantum networks can also form the backbone of a global quantum internet, connecting quantum computers in different locations. Furthermore, they can enable blind cloud quantum computing, which keeps quantum operations secret from everyone but the user.
With geopolitical tensions on the rise and looming cybersecurity threats, companies and governments are increasingly looking into ways of securing IT infrastructure and data.
Perhaps unsurprisingly, then, Infinity's report finds that Quantum Key Distribution (QKD) is the most popular use of quantum technology in the telecom sector. QKD utilises quantum mechanics to let two parties generate a shared key known only to them, which is then used to encrypt and decrypt messages.
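For intuition, here is a toy BB84-style sketch in Python: the sender encodes random bits in random bases, the receiver measures in independently random bases, and only the positions where the bases happen to match are kept as the shared key. Real QKD systems add eavesdropper detection, error correction, and privacy amplification on top of this.

```python
import secrets

n = 32  # photons in this toy run

# Alice encodes random bits in random bases (0 = rectilinear, 1 = diagonal).
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]

# Bob measures in his own random bases; a basis mismatch gives a coin flip.
bob_bases = [secrets.randbelow(2) for _ in range(n)]
bob_bits  = [bit if ab == bb else secrets.randbelow(2)
             for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# They publicly compare bases (never bits) and keep matching positions.
sifted = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print("shared key:", "".join(map(str, sifted)))
```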
One startup that knows a lot about QKD technology is Q*Bird. The Delft-based communications security company just raised €2.5mn to further develop its QKD product Falqon, which is already in trial with the Port of Rotterdam, the largest port in Europe.
"Quantum communications solutions see increased interest across digital infrastructure in the EU," said Ingrid Romijn, co-founder and CEO of Q*Bird. "Together with partners like Cisco, Eurofiber, Intermax, Single Quantum, Portbase and InnovationQuarter, Q*Bird is already testing quantum secure communications in the Port of Rotterdam using our novel quantum cryptography (QKD) technology."
Romijn further stated that, moving forward, more industries and companies will be able to implement scalable solutions protecting data communications, leveraging next-generation QKD technology.
Another technology garnering interest is post-quantum cryptography (PQC). Q-day (the day when a quantum computer breaks the internet) is, in all probability, still some way into the future.
However, most classical cryptography methods will be vulnerable to a sufficiently powerful quantum computer sooner than that. PQC algorithms are designed to be secure against both classical and quantum attacks.
Other technologies with potential applications for the telecom industry are quantum sensors, clocks, simulation, random number generation, and, naturally, quantum computing.
Meanwhile, despite the increasing market interest, the report also finds that Europe's quantum technology startups require more support and investment to achieve the technical and market breakthroughs that will drive the field forward.
Currently, only 42% of the quantum-tech-for-telecom startups worldwide have external funding, having raised a total of €1.9bn between them. And despite the relatively forward-thinking approach of the EU, as demonstrated by the Deutsche Telekom network project, the US still leads in terms of private-sector activity and investment.
Other challenges include raising awareness among business leaders, growing the skilled workforce, overcoming technical limitations, and building a stronger business narrative.
These can be surmounted partially through more regulatory standardisation, more collaboration with industry, and more early-stage support and investment for startups, the report says.
The key market opportunities for the quantum communications sector going forward are in government bodies including military and security services, financial institutions, and critical infrastructure departments, as well as companies in the energy, defence, space, and technology sectors.
"Growing collaboration between enterprises and startups in telecom signals the industry's commitment to integrating quantum solutions into commercial applications," said Pavel Kalinin, Operations and Platforms Lead at Infinity. "Successful implementation of such technologies will depend on coordinated efforts to prepare the workforce, facilitate collaborations, and set industry benchmarks and standards."
You can read the report in its entirety here.
The rest is here:
European telecoms leading the way in quantum tech adoption, report finds - TNW