Quantum computing: What are the data storage challenges? – ComputerWeekly.com

Quantum computing will process massive amounts of information. Workloads could include diagnostic simulations and analysis performed at speeds far beyond what existing computers can deliver. But, to be fully effective, quantum computing will need to access, analyse and store huge amounts of data.

There is an expectation that quantum computing will be the next step in the evolution of IT systems. Just as the multicore processor allowed computers to perform multiple tasks in parallel, quantum processors will be a leap forward in compute power and allow performance of complex tasks in a fraction of the time required now.

Quantum computers, as the name implies, use quantum mechanics (the branch of physics concerned with atomic and subatomic particles) to overcome the limitations of existing computer systems.

The principles of the superposition of states and quantum entanglement enable a different computation method from that used currently. A quantum computer can potentially store more states per unit of information (called quantum bits, or qubits) and operate with much more efficient algorithms at the numerical level.

Qubits are a two-state quantum-mechanical system. However, because of superposition, they can also be in both of the two states, 1 and 0, at the same time. In a classical computer system, a bit would have to be in one state or the other: 1 or 0. Quantum mechanics allows a qubit to be in a coherent superposition of both states simultaneously, a property that is fundamental to quantum mechanics and therefore to quantum computing.
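
As a rough illustration of that idea, the sketch below models a single qubit in plain Python (not any real quantum SDK): two amplitudes describe the state, and their squared magnitudes give the odds of reading a 0 or a 1 when the qubit is measured. The equal split used here is an assumption chosen for simplicity.

    # Minimal sketch, plain Python only: a qubit's state is two complex amplitudes,
    # one for |0> and one for |1>. Superposition means both can be non-zero at once.
    import math

    alpha = 1 / math.sqrt(2)  # amplitude for the |0> state
    beta = 1 / math.sqrt(2)   # amplitude for the |1> state

    # Squared magnitudes give the probability of each classical outcome on measurement.
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    print(round(p0, 3), round(p1, 3))  # 0.5 0.5: an equal superposition of both classical states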

Ultimately, this will allow quantum computers to process complex tasks using large datasets far more quickly than a classical computer, especially in the realms of big data and pattern recognition. For example, quantum computers have potential applications in the pharmaceutical industry, where they could screen larger and more complex molecules than was previously possible, and map the complex interactions between a pharmaceutical product and its intended target.

"At the core of the quantum computer's potential for achieving exponentially greater computational power lies the qubit's capacity to exist in a state of superposition," explains Martin Weides, professor of quantum technologies at Glasgow University. "It gives you a statistical answer of likelihoods and then you repeat the calculation a number of times, and amplify that result. At the end, you get some outcome, but it's not with 100% certainty."
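
Weides's point about likelihoods can be pictured with the ordinary Monte Carlo sketch below. It is not a real quantum runtime; the 80% success probability and the bit-string answers are invented purely for illustration. The calculation is run many times ("shots") and the most frequent outcome is taken as the answer, which is why the result is statistical rather than certain.

    # Hedged illustration of "repeat the calculation and amplify the result":
    # each run returns one probabilistic outcome, so the job is executed many
    # times and the most common reading wins. Plain Python, no quantum hardware.
    import random
    from collections import Counter

    def one_run():
        # Assumed for illustration: the algorithm has boosted the probability of
        # the correct answer "101" to 80%, with the rest spread over wrong answers.
        return "101" if random.random() < 0.8 else random.choice(["000", "011", "110"])

    shots = 1000
    counts = Counter(one_run() for _ in range(shots))
    answer, hits = counts.most_common(1)[0]
    print(f"{answer} seen in {hits}/{shots} shots")  # likely, but not 100% certain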

One of the core challenges of quantum computers is that their storage systems are unsuitable for long-term storage due to quantum decoherence, the effect of which can build up over time. Decoherence occurs when quantum computing data is brought into existing data storage frameworks and causes qubits to lose their quantum state, resulting in corrupted data and data loss.

"Quantum mechanical bits can't be stored for long times as they tend to decay and collapse after a while," says Weides. "Depending on the technology used, they can collapse within seconds, but the best ones are in a minute. You don't really achieve 10 years of storage. Maybe in two decades we might get there, but it's not required either."
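
A simple way to picture that decay is the textbook exponential model sketched below. The 60-second coherence time is a hypothetical figure chosen only to echo the "best ones are in a minute" remark, not a measured value from the article.

    # Rough, textbook-style model of decoherence: stored quantum information fades
    # roughly exponentially with a characteristic coherence time (often called T2).
    # The numbers here are assumptions for illustration only.
    import math

    def coherence_remaining(t_seconds, t2_seconds):
        """Fraction of the original coherence left after t_seconds."""
        return math.exp(-t_seconds / t2_seconds)

    T2 = 60  # hypothetical one-minute coherence time
    for t in (1, 10, 60, 600):
        print(f"after {t:>3}s: {coherence_remaining(t, T2):.3f} of coherence remains")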

Quantum computers will need data storage during computation, but that needs to be a quantum memory capable of holding superposed or entangled states, and storage durations are going to present a challenge.

So, it's likely data storage for quantum computing will need to rely on conventional storage, such as that used in high-performance computing (HPC).

Considering the massive financial investment required for quantum computing, it would be counter-productive to hold such systems back with cheap data storage as a cost-saving exercise.

Given the data storage challenges and the requirement to process large datasets, quantum computing is likely to be best accessed through the cloud. IBM's current quantum systems are cloud-connected, for example. Naturally, the effectiveness of cloud storage is dependent on network connectivity to the cloud.

Although quantum computing faces challenges in scalability and decoherence, its ability to perform multiple simultaneous operations in a fraction of the time it would take conventional processors means it is likely to become a powerful tool for analytics workloads.

Quantum computing and quantum storage are unlikely to replace existing compute and storage systems.

Using classical compute and storage infrastructure will remain the easiest and most economical solution for tackling everyday problems, especially those that involve small, simple, repeatable tasks.

That said, quantum computing promises to deliver incredible advances in fields that include materials science, climate research and pharmaceutical development. Organisations are already experimenting with quantum computing to develop lighter and more powerful batteries for electric cars, and to help create new medicines.

The limited storage capabilities associated with quantum computers mean they will continue to be dependent on classical storage systems for data extraction and information output. However, these would have to be capable of handling large datasets. Some of today's high-end storage systems, especially those that are cloud-based, should be more than adequate for the task.

"A quantum computer, being so expensive, would almost certainly be operated in a dedicated facility with lots of new hardware, including storage," concludes Weides.
