
What is quantum computing? – Definition from WhatIs.com

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum theory’s development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called “quanta”), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don’t look to check.

To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrodinger’s Cat. First, we have a living cat and place it in a thick lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both dead and alive, according to quantum law – in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory.

Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, usually operating with a seven-mode logic-gate principle, though it is possible to get by with only three modes (AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time each transistor or capacitor needs to remain in its 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch state. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges that are presented.
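The claim that three gate modes suffice can be checked directly: AND, NOT, and COPY generate the rest of Boolean logic. A minimal sketch (the OR and XOR constructions below are our own illustration, not taken from the article):

```python
# Build Boolean logic from the three-gate set the article mentions.

def AND(a: int, b: int) -> int:
    return a & b

def NOT(a: int) -> int:
    return 1 - a

def COPY(a: int) -> tuple:
    # Fan-out: classical bits can be freely duplicated
    # (qubits cannot, by the no-cloning theorem).
    return a, a

def OR(a: int, b: int) -> int:
    # De Morgan's law: a OR b = NOT(NOT a AND NOT b)
    return NOT(AND(NOT(a), NOT(b)))

def XOR(a: int, b: int) -> int:
    # a XOR b = (a OR b) AND NOT(a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
```

Every two-input gate can be composed this way, which is why the three-mode set is logically complete.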

The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Think of a qubit as an electron in a magnetic field. The electron’s spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron’s spin from one state to another is achieved by using a pulse of energy, such as from a laser – let’s say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of states that a quantum computer could work with simultaneously is 2^n, where n is the number of qubits used. A quantum computer composed of 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number – 2^500 is vastly greater than the number of atoms in the known universe (this is true parallel processing – classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
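The half-pulse superposition described above corresponds to what quantum computing conventionally calls a Hadamard gate (our terminology, not the article's). A small numerical sketch of the idea, plus the 2^n state count:

```python
import numpy as np

# A gate that turns |0> into an equal superposition of |0> and |1>
# (the QO1-style operation described in the article) is, in standard
# notation, the Hadamard gate.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # qubit prepared in the definite state |0>
superposed = H @ zero         # equal superposition of |0> and |1>

# Measurement probabilities are squared amplitudes: a 50/50 split.
probs = superposed ** 2
print(probs)                  # [0.5 0.5]

# n qubits in superposition span 2**n basis states at once:
for n in (1, 10, 500):
    print(n, "qubits ->", 2 ** n, "simultaneous basis states")
```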

Entanglement

Particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle – up or down – allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it “spooky action at a distance”), the mechanism of which cannot, as yet, be explained by any theory – it simply must be taken as given. Quantum entanglement allows qubits that are separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. If more qubits are added, the capacity expands exponentially.
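The 2-qubit register described above can be sketched numerically: the joint state of two superposed qubits is their tensor (Kronecker) product, which carries one amplitude for each of the four configurations 00, 01, 10, and 11.

```python
import numpy as np

# One qubit in an equal superposition: (|0> + |1>) / sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Two such qubits form a joint state via the Kronecker product,
# holding amplitudes for all four basis states at once.
register = np.kron(plus, plus)

print(register)        # [0.5 0.5 0.5 0.5] -> amplitudes for 00, 01, 10, 11
print(len(register))   # 4 = 2**2; an n-qubit register needs 2**n amplitudes
```

This exponential growth in amplitudes is exactly why simulating many qubits on a classical machine becomes intractable.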

Perhaps even more intriguing than the sheer power of quantum computing is the ability that it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of “take all the superpositions of all the prior computations” – something which is meaningless with a classical computer – which would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers, one example of which we discuss below.

There have been two notable successes thus far with quantum programming. The first came in 1994 from Peter Shor (now at AT&T Labs), who developed a quantum algorithm that could efficiently factorize large numbers. It centers on a system that uses number theory to estimate the periodicity of a large number sequence. The other major breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires, on average, only roughly √N searches (where N is the total number of elements) to find the desired result, as opposed to a search in classical computing, which on average needs N/2 searches.
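Grover's √N scaling can be seen in a toy state-vector simulation (our own illustration; the article only quotes the scaling). For N = 16 unstructured items, a classical search needs N/2 = 8 checks on average, while Grover's algorithm needs about (π/4)·√N ≈ 3 oracle queries:

```python
import numpy as np

N = 16
target = 11                          # hypothetical "marked" item to find
amps = np.full(N, 1 / np.sqrt(N))    # uniform superposition over all items

# Optimal number of Grover iterations: round((pi/4) * sqrt(N)) = 3
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    amps[target] *= -1               # oracle: flip the marked amplitude's sign
    amps = 2 * amps.mean() - amps    # diffusion: invert all amplitudes about the mean

p_success = amps[target] ** 2
print(f"{iterations} queries, P(find target) = {p_success:.3f}")
```

After just three iterations the marked item is found with probability above 95 percent, versus eight expected classical checks.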

The above sounds promising, but there are tremendous obstacles still to be overcome.

Even though there are many problems to overcome, the breakthroughs of the last 15 years, and especially of the last three, have made some form of practical quantum computing appear feasible, though there is much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex materials-science analysis. It is this potential that is rapidly breaking down the barriers to this technology, but whether all barriers can be broken, and when, is very much an open question.


The Era of Quantum Computing Is Here. Outlook: Cloudy …

A lot of research on the fundamentals of quantum computing has been devoted to error correction. Part of the difficulty stems from another of the key properties of quantum systems: superpositions can only be sustained as long as you don't measure the qubit's value. If you make a measurement, the superposition collapses to a definite value: 1 or 0. So how can you find out if a qubit has an error if you don't know what state it is in?

One ingenious scheme involves looking indirectly, by coupling the qubit to an ancilla qubit that doesn't take part in the calculation but that can be probed without collapsing the state of the main qubit itself. It's complicated to implement, though. Such solutions mean that, to construct a genuine logical qubit on which computation with error correction can be performed, you need many physical qubits.

How many? Quantum theorist Alán Aspuru-Guzik of Harvard University estimates that around 10,000 of today's physical qubits would be needed to make a single logical qubit, a totally impractical number. "If the qubits get much better," he said, "this number could come down to a few thousand or even hundreds." Eisert is less pessimistic, saying that on the order of 800 physical qubits might already be enough, but even so he agrees that the overhead is heavy, and for the moment we need to find ways of coping with error-prone qubits.

An alternative to correcting errors is avoiding them or canceling out their influence: so-called error mitigation. Researchers at IBM, for example, are developing schemes for figuring out mathematically how much error is likely to have been incurred in a computation and then extrapolating the output of a computation to the zero noise limit.

Some researchers think that the problem of error correction will prove intractable and will prevent quantum computers from achieving the grand goals predicted for them. "The task of creating quantum error-correcting codes is harder than the task of demonstrating quantum supremacy," said mathematician Gil Kalai of the Hebrew University of Jerusalem in Israel. And he adds that devices without error correction are "computationally very primitive, and primitive-based supremacy is not possible." In other words, you'll never do better than classical computers while you've still got errors.

Others believe the problem will be cracked eventually. According to Jay Gambetta, a quantum information scientist at IBM's Thomas J. Watson Research Center, "Our recent experiments at IBM have demonstrated the basic elements of quantum error correction on small devices, paving the way towards larger-scale devices where qubits can reliably store quantum information for a long period of time in the presence of noise." Even so, he admits that a universal fault-tolerant quantum computer, which has to use logical qubits, is still a long way off. Such developments make Childs cautiously optimistic. "I'm sure we'll see improved experimental demonstrations of [error correction], but I think it will be quite a while before we see it used for a real computation," he said.

For the time being, quantum computers are going to be error-prone, and the question is how to live with that. At IBM, researchers are talking about "approximate quantum computing" as the way the field will look in the near term: finding ways of accommodating the noise.

This calls for algorithms that tolerate errors, getting the correct result despite them. It's a bit like working out the outcome of an election regardless of a few wrongly counted ballot papers. "A sufficiently large and high-fidelity quantum computation should have some advantage [over a classical computation] even if it is not fully fault-tolerant," said Gambetta.

One of the most immediate error-tolerant applications seems likely to be of more value to scientists than to the world at large: simulating stuff at the atomic level. (This, in fact, was the motivation that led Feynman to propose quantum computing in the first place.) The equations of quantum mechanics prescribe a way to calculate the properties, such as stability and chemical reactivity, of a molecule such as a drug. But they can't be solved classically without making lots of simplifications.

In contrast, the quantum behavior of electrons and atoms, said Childs, is "relatively close to the native behavior of a quantum computer." So one could construct an exact computer model of such a molecule. "Many in the community, including me, believe that quantum chemistry and materials science will be one of the first useful applications of such devices," said Aspuru-Guzik, who has been at the forefront of efforts to push quantum computing in this direction.

Quantum simulations are proving their worth even on the very small quantum computers available so far. A team of researchers including Aspuru-Guzik has developed an algorithm that they call the variational quantum eigensolver (VQE), which can efficiently find the lowest-energy states of molecules even with noisy qubits. So far it can only handle very small molecules with few electrons, which classical computers can already simulate accurately. But the capabilities are getting better, as Gambetta and coworkers showed last September when they used a 6-qubit device at IBM to calculate the electronic structures of molecules, including lithium hydride and beryllium hydride. The work was "a significant leap forward for the quantum regime," according to physical chemist Markus Reiher of the Swiss Federal Institute of Technology in Zurich, Switzerland. "The use of the VQE for the simulation of small molecules is a great example of the possibility of near-term heuristic algorithms," said Gambetta.

But even for this application, Aspuru-Guzik confesses that logical qubits with error correction will probably be needed before quantum computers truly begin to surpass classical devices. "I would be really excited when error-corrected quantum computing begins to become a reality," he said.

"If we had more than 200 logical qubits, we could do things in quantum chemistry beyond standard approaches," Reiher adds. "And if we had about 5,000 such qubits, then the quantum computer would be transformative in this field."

Despite the challenges of reaching those goals, the fast growth of quantum computers from 5 to 50 qubits in barely more than a year has raised hopes. But we shouldn't get too fixated on these numbers, because they tell only part of the story. What matters is not just, or even mainly, how many qubits you have, but how good they are and how efficient your algorithms are.

Any quantum computation has to be completed before decoherence kicks in and scrambles the qubits. Typically, the groups of qubits assembled so far have decoherence times of a few microseconds. The number of logic operations you can carry out during that fleeting moment depends on how quickly the quantum gates can be switched; if this time is too slow, it really doesn't matter how many qubits you have at your disposal. The number of gate operations needed for a calculation is called its depth: low-depth (shallow) algorithms are more feasible than high-depth ones, but the question is whether they can be used to perform useful calculations.
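The trade-off between coherence time and gate speed reduces to simple arithmetic. A back-of-envelope sketch with illustrative numbers (the specific figures below are assumptions, not from the article):

```python
# Rough depth budget: how many sequential gate operations fit inside
# one coherence window before decoherence scrambles the qubits?
coherence_time_us = 100.0   # assumed qubit coherence time, in microseconds
gate_time_ns = 50.0         # assumed duration of one gate operation, in nanoseconds

max_depth = int(coherence_time_us * 1000 / gate_time_ns)
print(f"~{max_depth} sequential gates fit inside one coherence window")
```

Halving the gate time doubles the usable depth, which is why gate speed matters as much as raw qubit count.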

What's more, not all qubits are equally noisy. In theory it should be possible to make very low-noise qubits from so-called topological electronic states of certain materials, in which the shape of the electron states used for encoding binary information confers a kind of protection against random noise. Researchers at Microsoft, most prominently, are seeking such topological states in exotic quantum materials, but there's no guarantee that they'll be found or will be controllable.

Researchers at IBM have suggested that the power of a quantum computation on a given device be expressed as a number called the "quantum volume," which bundles up all the relevant factors: the number and connectivity of qubits, the depth of the algorithm, and other measures of gate quality, such as noisiness. It's really this quantum volume that characterizes the power of a quantum computation, and Gambetta said that the best way forward right now is to develop quantum-computational hardware that increases the available quantum volume.

This is one reason why the much vaunted notion of quantum supremacy is more slippery than it seems. The image of a 50-qubit (or so) quantum computer outperforming a state-of-the-art supercomputer sounds alluring, but it leaves a lot of questions hanging. Outperforming for which problem? How do you know the quantum computer has got the right answer if you can't check it with a tried-and-tested classical device? And how can you be sure that the classical machine wouldn't do better if you could find the right algorithm?

So quantum supremacy is a concept to handle with care. Some researchers prefer now to talk about "quantum advantage," which refers to the speedup that quantum devices offer without making definitive claims about what is best. An aversion to the word "supremacy" has also arisen because of its racial and political implications.

Whatever you choose to call it, a demonstration that quantum computers can do things beyond current classical means would be psychologically significant for the field. "Demonstrating an unambiguous quantum advantage will be an important milestone," said Eisert; it would prove that quantum computers really can extend what is technologically possible.

That might still be more of a symbolic gesture than a transformation in useful computing resources. But such things may matter, because if quantum computing is going to succeed, it won't be simply by the likes of IBM and Google suddenly offering their classy new machines for sale. Rather, it'll happen through an interactive and perhaps messy collaboration between developers and users, and the skill set will evolve in the latter only if they have sufficient faith that the effort is worth it. This is why both IBM and Google are keen to make their devices available as soon as they're ready. As well as a 16-qubit IBM Q experience offered to anyone who registers online, IBM now has a 20-qubit version for corporate clients, including JP Morgan Chase, Daimler, Honda, Samsung and the University of Oxford. Not only will that help clients discover what's in it for them; it should create a quantum-literate community of programmers who will devise resources and solve problems beyond what any individual company could muster.

"For quantum computing to take traction and blossom, we must enable the world to use and to learn it," said Gambetta. "This period is for the world of scientists and industry to focus on getting quantum-ready."


IBM puts its quantum computer to work in relaxing, nerdy ASMR …

Quantum computing is still a long way off delivering any actual tangible benefits, but that doesn't mean we can't appreciate it in other ways. Like, for example, this ASMR-style video made inside IBM's new Q computation center, a research lab where the company is hard at work on its quantum computing hardware.

Like similar experiments run by Google and Microsoft, this means using things called qubits to create mind-blowingly powerful computers. In theory, anyway. While there's plenty of hype about quantum computing, the actual machines we've made to date are too slow and temperamental to be of practical use. Meanwhile, experts say commercial companies are making unjustified claims about their hardware, and there's not even any consensus on whether or not we're building the right type of quantum computer. All of which is to say: don't hold your breath waiting for the Age of Quantum.

IBM is still bullish, though, and published this video last month to promote its new IBM Q Network, a partnership of academic and industry players who will explore how quantum computers could improve various fields in the future. The company says proper quantum computing is just around the corner. We say: maybe, but in the meantime, just listen to those ventilators sing.


Quantum computing is going to change the world. Here’s what …

The science and tech world has been abuzz about quantum computers for years, but the devices are not yet affecting our daily lives. Quantum systems could seamlessly encrypt data, help us make sense of the huge amount of data we've already collected, and solve complex problems that even the most powerful supercomputers cannot, such as medical diagnostics and weather prediction.

That nebulous quantum future became one step closer this November, when top-tier journal Nature published two papers that showed some of the most advanced quantum systems yet.

If you still don't understand what a quantum computer is, what it does, or what it could do for you, never fear. Futurism recently spoke with Mikhail Lukin, a physics professor at Harvard University and the senior author of one of those papers, about the current state of quantum computing, when we might have quantum technology on our phones or our desks, and what it will take for that to happen.

This interview has been slightly edited for clarity and brevity.

Futurism: First, can you give me a simple explanation for how quantum computing works?

Mikhail Lukin: Let's start with how classical computers work. In classical computers, you formulate any problem you want to solve in the form of some input, which is basically a stream of 0s and 1s. When you want to do some calculation, you basically create a certain set of rules for how this stream should move. That's the process of calculation: addition, multiplication, whatever.

But we've known for more than 100 years that our microscopic world is fundamentally quantum mechanical. And in quantum mechanics, a system, your computer, for instance, or your chair, can be placed in two different states at once; that's the idea of quantum superposition. In other words, your computer can be simultaneously both in Boston and in New York. So this quantum superposition, even though it sounds very weird, is allowed by the laws of quantum mechanics. On a large scale, like the example that I gave, it is clearly very strange. But in the microscopic world, like with a single atom, creating this kind of superposition state is actually quite common. In scientific experiments, scientists have proved that a single atom can be in two different states at once.

The idea of quantum computers is basically to make use of these rules of quantum mechanics to process information. It's pretty easy to understand how this can be so powerful. In classical computers, you give me a certain input, I put it in my computer, I give you an output. But if our hardware were quantum mechanical, rather than just sequentially providing some input and reading out the answers, I could prepare the computer register in a quantum superposition of many different kinds of inputs.

This means that if I then take this superposition state and process it using the laws of quantum mechanics, I can process many, many inputs at once. It could potentially be an exponential speedup compared to classical programs.

F: What does a quantum computer look like?

ML: If you were to walk into a room with our quantum machine in it, you would see a vacuum cell or tube and a bunch of lasers that shine into it. Inside we have a very low density of a certain kind of atom. We use lasers to slow the atomic motion down very close to absolute zero, a technique called laser cooling.

F: So how do you program the thing?

ML: To program a quantum computer, we shine a hundred tightly focused laser beams into this vacuum chamber. Each of these laser beams acts as an optical tweezer, grabbing one atom or none. We have these atom traps, each of which is either loaded or empty. We then take a picture of the atoms in the traps and figure out which traps are full and which are empty. Then we rearrange the traps containing single atoms into any pattern that we wish. The desired arrangement of single atoms, each individually held and easily controlled, can be positioned basically at will.

Positioning these atoms is one way that we can program it. To actually control the qubits, we gently, carefully push the atoms from their lowest energy state into a higher energy state. We do this with carefully chosen laser beams tuned to one specific transition; their frequency is very tightly controlled. In this excited state the atom actually becomes very big, and because of this size, the atoms start interacting, or in other words talking, to each other. By choosing the state to which we excite the atoms and choosing their arrangements and positions, we can then program the interaction in a highly controllable way.

F: What kinds of applications would a quantum computer be most useful for?

ML: To be honest, we really don't know the answer. It's generally believed that quantum computers will not necessarily help for all computational tasks. But there are problems that are mathematically hard for even the best classical computers. They usually involve complex optimizations, in which you try to satisfy a number of contradictory constraints.

Suppose you want to give some kind of collective present to a group of people, each of whom has their own niche. Some of the niches might be contradictory. So what happens is, if you solve this problem classically, you have to check each pair or triplet of people to make sure that at least their niches are satisfied. The complexity of this problem grows very, very rapidly with size, because the number of classical combinations you need to check is exponential. There is some belief that for some of these problems, quantum computers can offer some advantage.

Another very well-known example is factoring. If you have a small number, like 15, it's clear that the factors are 3 and 5, but this is the kind of problem that very quickly becomes complicated as the number grows. If you have a large number that is a product of two large factors, classically there is pretty much no better way to find those factors than just trying numbers: two, three, and so on. But it turns out that a quantum algorithm exists, called Shor's algorithm, that can find the factors exponentially faster than the best known classical algorithms. If you can do something exponentially faster than the alternative approach, that's a big gain.
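The brute-force approach Lukin describes is trial division. A minimal sketch (our own illustration, not from the interview): the work grows roughly with the square root of the number for a product of two primes, which is what makes large RSA-style numbers classically intractable while Shor's algorithm would handle them in polynomial time.

```python
# Classical trial division: try candidate divisors starting from 2.

def trial_division(n: int) -> list:
    """Return the prime factors of n, smallest first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(15))    # [3, 5] -- Lukin's small example
print(trial_division(3599))  # a semiprime already needing dozens of trial divisors
```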

F: It sounds like your mission, and that of others in your field, is to help us advance and understand this technology, but the applications are sort of secondary and will come when you have the tools. Does that seem about right?

ML: I will answer your question with an analogy. When classical computers were first developed, they were mostly used to do scientific calculations, numerical experiments to understand how complex physical systems behave. Right now, quantum machines are at this stage of development. They already allow us to study complex quantum physical phenomena. They are useful for scientific purposes, and scientists are already doing it now.

In fact, one significance of our papers [published in Nature] is that we have already built machines which are large enough, complex enough, and quantum enough to do scientific experiments that are very difficult or impossible to do on even the best classical computers, essentially supercomputers. In our work, we have already used our machine to make a scientific discovery that had not been made until now, in part because it's very difficult for classical computers to model these systems. In some ways, we are now crossing the threshold where quantum machines are becoming useful, at least for scientific purposes.

When classical computers were being developed, people had some ideas of which algorithms to run on them. But actually it turned out that when the first computers were built, people were able to start experimenting with them and discovered many more practically efficient, useful algorithms. In other words, that's really when they discovered what these computers can actually be good for.

That's why I'm saying that we really don't know now the tasks for which quantum computers will be particularly useful. The only way to find these tasks is to build large, functional quantum machines and try these things out. That's an important goal, and I should say that we are entering this phase now. We're very, very close to a stage when we can start experimenting with quantum algorithms on large-scale machines.

F: Tell me a little bit about your Nature paper. What actually is the advance here? And how close are we to being able to start discovering the algorithms that could work on quantum computers?

ML: So first let's talk about how one could quantify quantum machines. It can be done along three different axes. One axis is the scale: how many qubits [quantum bits, the units that form the basis of a quantum computer the way bits do in classical computing] the machine has. More is better. Another axis is the degree of quantum-ness, that is, how coherent these systems are. Eventually, the way to quantify it is: if you have a certain number of qubits and you perform some calculation with them, what's the probability that this calculation is error-free?

If you have a single qubit, you have a small chance of making an error. Once you have a lot of them, this probability grows exponentially. The systems described in our paper, and also in the complementary paper, have enough qubits and are coherent enough that we can do the entire series of computations with fairly low error probability. In other words, in a finite number of tries, we can get a result that has no errors.
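The error compounding ML describes can be sketched with a toy model, assuming each operation fails independently with the same probability. The numbers below are illustrative only, not figures from the paper:

```python
# Probability that a quantum circuit finishes with no errors, assuming
# each operation fails independently with the same probability p_error.
# Illustrative numbers only, not drawn from the Nature papers.

def success_probability(p_error: float, n_ops: int) -> float:
    """Chance that all n_ops operations succeed with no error."""
    return (1.0 - p_error) ** n_ops

# With a 1% error rate per operation, short circuits mostly succeed,
# but long circuits almost never finish error-free:
print(success_probability(0.01, 10))    # ~0.904
print(success_probability(0.01, 1000))  # ~0.00004
```

This is why both the number of qubits and the coherence (the per-operation error rate) have to improve together before a long computation can succeed in a reasonable number of tries.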

But this is still not the complete story. The third axis is how well you can program the machine. Basically, if you can make each qubit talk to any other qubit in an arbitrary fashion, you can encode any quantum problem into the machine. Such machines are sometimes called universal quantum computers. Our machine is not fully universal, but we demonstrate a very high degree of programmability: we can actually change the connectivity very quickly. This, in the end, is what allows us to probe, and to make new discoveries about, these complex quantum phenomena.

F: Could a quantum computer be scaled down to the size of a phone, or something vaguely portable at some point?

ML: That is not out of the question. There are ways to package it so that it can actually become portable and potentially can be miniaturized enough maybe not to the point of a mobile phone, but perhaps a desktop computer. But that cannot be done right now.

F: Do you think, like classical computers, quantum computers will make the shift from just scientific discoveries to the average user in about 30 years?

ML: The answer is yes, but why 30 years? It could happen much sooner.

F: What has to happen between now and then? What kind of advances need to be made to get us there?

ML: I think we need to have big enough computers to start really figuring out what they can be used for. We don't know yet what quantum computers are capable of doing, so we don't know their full potential. I think the next challenge is to do exactly that.

The next stage will be engineering: creating machines that could be used to target some specialized applications. People, including [my team], are already working on developing some small-scale quantum devices, which are designed, for example, to aid in medical diagnostics. In some of these applications, quantum systems just measure tiny electric or magnetic fields, which could allow you to do diagnostics more efficiently. I think these things are already coming, and some of these ideas are already being commercialized.

Then maybe some more general applications could be commercialized. In practice, quantum computers and classical computers will likely work hand in hand. In fact, what will most likely happen is that the majority of the work is done by classical computers, while some elements, the most difficult problems, are solved by quantum machines.

There is also another field called quantum communication, where you can basically transfer quantum states between distant stations. If you use quantum states to send information, you can build communication lines that are completely secure. Moreover, through these so-called quantum networks, sometimes called the quantum internet, we should be able to access quantum servers remotely. So I can certainly imagine many directions in which quantum computers can enter everyday life, even though you won't carry one in your own pocket.

F: Whats something that you wish more people knew about quantum computers?

ML: Quantum computing and quantum technology have been in the news for some time. We scientists know that it's an exciting area; it's really the frontier of scientific research across many subfields. Over the last five to 10 years, most people assumed that these developments were very futuristic, and that it would take a long time before we create any useful quantum machines.

I think that this is just not the case. I think we are already entering a new era with tremendous potential for scientific discoveries, which might have wide-ranging applications for materials science, chemistry, really anything that involves complex physical systems. But I also feel that very soon we will start discovering what quantum computers can be useful for in a much broader scope, ranging from optimization to artificial intelligence and machine learning. I think these things are around the corner.

We don't yet know exactly what quantum computers will do, or how they will do it, but we will find out very soon.


Is Quantum Computing an Existential Threat to Blockchain …

Amid steep gains in value and wild headlines, it's easy to forget cryptocurrencies and blockchain aren't yet mainstream. Even so, fans of the technology believe blockchain has too much potential not to have a major, sustained impact in the future.

But as is usually the case when pondering whats ahead, nothing is certain.

When considering existential threats to blockchain and cryptocurrencies, people generally focus on increased regulation. And this makes sense. In the medium term, greater regulation may stand in the way of cryptocurrencies and wider mainstream adoption. However, there might be a bigger threat further out on the horizon.

Much of blockchain's allure arises from its security benefits. The tech allows a ledger of transactions to be distributed among a large network of computers. No single user can break into and change the ledger. This makes it both public and secure.

But combined with another emerging (and much-hyped) technology, quantum computing, blockchain's seemingly immutable ledgers would be under threat.

Like blockchain, quantum computing has been making progress and headlines too.

The number of quantum computing companies and researchers continues to grow. And while there is a lot of focus on hardware, many are looking into the software as well.

Cryptography is a commonly debated topic because quantum computing poses a threat to traditional forms of computer security, most notably public key cryptography, which undergirds most online communications and most current blockchain technology.

But first, how does computer security work today?

Public key cryptography uses a pair of keys to encrypt information: a public key, which can be shared widely, and a private key known only to the key's owner. Anyone can encrypt a message using the intended receiver's public key, but only the receiver can decrypt the message using her private key. The more difficult it is to determine a private key from its corresponding public key, the more secure the system.

The best public key cryptography systems link public and private keys using the factors of a number that is the product of two incredibly large prime numbers. To determine the private key from the public key alone, one would have to figure out the factors of this product of primes. Even if a classical computer tested a trillion keys a second, it would take up to 785 million times longer than the roughly 14 billion years the universe has existed so far due to the size of the prime numbers in question.
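As a hedged illustration of the asymmetry described above, the toy example below factors a tiny modulus by brute-force trial division. The numbers are made up for illustration; the point is that this same search becomes utterly infeasible when the primes have hundreds of digits:

```python
# Toy illustration: a public modulus is the product of two primes, and
# recovering the private key amounts to recovering those primes.
# Real RSA moduli are 2048+ bits; this 7-digit example is trivial.

def factor(n: int) -> tuple[int, int]:
    """Brute-force trial division -- feasible only for tiny n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n is prime")

p, q = 1009, 2003          # "incredibly large primes" in miniature
n = p * q                  # the public value an attacker sees: 2021027
print(factor(n))           # (1009, 2003) -- instant here, ages at scale
```

Doubling the bit length of the primes roughly squares the work this loop has to do, which is why classical trial division (and even the best known classical algorithms) cannot touch real key sizes.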

If processing power were to greatly increase, however, then it might become possible for an entity exercising such computing power to generate a private key from the corresponding public key. If actors could generate private keys from corresponding public keys, then even the strongest forms of traditional public key cryptography would be vulnerable.

This is where quantum computing comes in. Quantum computing relies on quantum physics and has more potential power than any traditional form of computing.

Quantum computing takes advantage of quantum bits, or qubits, which can exist in any superposition of the states 0 and 1 and can therefore encode much more information than just 0 or 1, the limit of classical computing systems.

The capacity to compute using qubits renders quantum computers many orders of magnitude faster than classical computers. Google showed a D-Wave quantum annealing computer could be 100 million times faster than classical computers at certain specialized tasks. And Google and IBM are working on their own quantum computers.

Further, although there are only a handful of quantum computing algorithms, one of the most famous, Shor's algorithm, allows for the quick factoring of large numbers into their prime components. A working quantum computer could therefore, in theory, break today's public key cryptography.

Quantum computers capable of speedy number factoring are not here yet. However, if quantum computing continues to progress, it will get there eventually. And when it does, this advance will pose an existential threat to public key cryptography, and the blockchain technology that relies on it, including Bitcoin, will be vulnerable to hacking.

So, is blockchain security therefore impossible in a post-quantum world? Will the advent of quantum computing render blockchain technology obsolete?

Maybe, but not if we can develop a solution first.

The NSA announced in 2015 that it was moving to implement quantum-resistant cryptographic systems. Cryptographers are working on quantum-resistant cryptography, and there are already blockchain projects implementing quantum-resistant cryptography. The Quantum Resistant Ledger team, for example, is working on building such a blockchain right now.

What makes quantum-resistant, or post-quantum, cryptography quantum resistant? Its public and private keys are linked by mathematical problems that are much harder to invert than traditional prime factorization.

The Quantum Resistant Ledger team is working to implement hash-based cryptography, a form of post-quantum cryptography. In hash-based cryptography, public keys are derived from private keys using hash-based cryptographic structures rather than prime number factorization. The connection between the public and private key pair is therefore much harder to invert than in traditional public key cryptography and would be much less vulnerable to a quantum computer running Shor's algorithm.
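As a rough sketch of the hash-based idea, here is a minimal Lamport one-time signature, the simplest scheme of this family. This is an illustration, not necessarily the QRL team's actual construction, and a real scheme must never reuse a key pair:

```python
import hashlib
import secrets

# Minimal Lamport one-time signature, the simplest hash-based scheme.
# Security rests on hash preimage resistance, which Shor's algorithm
# does not break. Sketch only: each key pair may sign ONE message.

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(s) for s in pair] for pair in sk]
    return sk, pk

def bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret per message bit.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))

sk, pk = keygen()
sig = sign(b"hello", sk)
print(verify(b"hello", sig, pk))   # True
print(verify(b"hacked", sig, pk))  # False
```

Notice the direction of the derivation: the public key is computed from the private key by hashing, and forging a signature would require inverting the hash, a task for which no quantum shortcut comparable to Shor's algorithm is known.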

These post-quantum cryptographic schemes do not need to run on quantum computers. It remains to be seen how successful the Quantum Resistant Ledger and other efforts like it will prove once full-scale quantum computing becomes a practical reality.

To be clear, quantum computing threatens all computer security systems that rely on public key cryptography, not just blockchain. All security systems, including blockchain systems, need to consider post-quantum cryptography to maintain data security for their systems. But the easiest and most efficient route may be to replace traditional systems with blockchain systems that implement quantum-resistant cryptography.

Disclosure: The author owns assorted digital assets. The author is also a principal at Crypto Lotus LLC, a cryptocurrency hedge fund based out of the San Francisco Bay Area, and an advisor at Green Sands Equity, both of which have positions in various digital assets. All opinions in this post are the author's alone and not those of Singularity University, Crypto Lotus, or Green Sands Equity. This post is not an endorsement by Singularity University, Crypto Lotus, or Green Sands Equity of any asset, and you should be aware of the risk of loss before trading or holding any digital asset.

Image Credit: Morrowind /Shutterstock.com


What is Quantum Computing? | SAP News Center

Whether it's astrophysical calculations, weather forecasting, or exploration for oil and gas resources, powerful supercomputers now assist with the computation of the most complex problems.

Yet there are some challenges that even the fastest computing machines in the world have been unable to solve, namely the simulation of molecular structures, which has left many professionals in the medical and chemical industry scratching their heads. The development of effective drugs against illnesses, as well as better quality fertilizer to help fight world hunger, is largely dependent on the ability to perform the relevant calculations.

Another example is optimization. Suppose a rucksack can hold up to 20 kilograms, and we have several objects, each with a specific weight and value. We must select the subset of objects that maximizes the total value without exceeding the rucksack's weight limit. Inventory management frequently encounters these sorts of challenges, yet there is good mathematical evidence that large instances of such problems cannot be solved efficiently using conventional computers.
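The rucksack example is the classic 0/1 knapsack problem. Small instances yield to exact dynamic programming, as sketched below with made-up weights and values; the difficulty the text alludes to is how badly exact methods scale as instances grow:

```python
# The rucksack scenario as a 0/1 knapsack, solved exactly by dynamic
# programming. Weights (kg) and values are invented for illustration.

def knapsack(items: list[tuple[int, int]], capacity: int) -> int:
    """items: (weight, value) pairs; returns the maximum total value
    achievable without exceeding capacity."""
    best = [0] * (capacity + 1)
    for weight, value in items:
        # Iterate capacities downward so each item is used at most once.
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

# Capacity below the total weight, so choices actually matter:
items = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]
print(knapsack(items, 15))  # 15: skip the heavy 12 kg item, take the rest
```

This table-based approach takes time proportional to (number of items) x (capacity), which is fine here but blows up for the large, finely-grained instances that arise in real inventory management.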

This all comes down to how computers are built. The smallest possible storage unit (a bit) can have a value of either 0 or 1. Bits are physically represented by two voltage levels that correspond to the states 0 and 1. For certain tasks, this binary representation of information pushes conventional computers to the limits of their capabilities.

Qubits: Superposition and Entanglement

In 1981, Nobel Prize-winning physicist Richard Feynman claimed that a so-called quantum computer could be used to perform computations. This theoretical concept went on to generate a wealth of interest and has since become a broad field of research and development.

A quantum computer works with quantum bits, or qubits. In contrast to the bits of a traditional computer system, the states of qubits can overlap. In other words, they do not merely represent 0 or 1, but can occupy a mixed state in which they are both 0 and 1 at the same time. This is known as a superposition. When measured, however, qubits behave like classical bits and yield the value 0 or 1.

When several qubits are combined, they need not have individually defined states but can exist as a single joint state. In quantum mechanics this phenomenon is known as entanglement: the measurement outcome of one qubit depends on that of the other. For instance, if two entangled qubits are measured and the first yields 1, the state of the second is already determined.
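The perfect correlation just described can be mimicked with a tiny classical state-vector simulation. The amplitudes below describe the Bell state (|00> + |11>)/sqrt(2), a standard example assumed here for illustration:

```python
import random

# State-vector sketch of two entangled qubits in the Bell state
# (|00> + |11>)/sqrt(2): whenever both are measured, the two outcomes
# always agree, because the amplitudes of |01> and |10> are zero.

amplitudes = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def measure() -> str:
    """Sample one joint measurement outcome with Born-rule weights."""
    outcomes = list(amplitudes)
    probs = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=probs)[0]

samples = [measure() for _ in range(1000)]
print(set(samples))  # only '00' and '11' ever occur
```

So if the first qubit is observed to be 1, the second is guaranteed to be 1 as well, exactly the dependence the text describes.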

Overcoming Quantum Decoherence

Together, superposition and entanglement form the decisive difference from which quantum computers are said to benefit: a register with a given number of qubits can represent a vast number of sequences of conventional bits at once. A computation on such a register is, in a sense, a computation on all those bit sequences simultaneously. For certain problems, this quantum parallelism ensures a decisive speed advantage compared to regular computers.
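A quick way to see the scale of this parallelism is to count amplitudes: describing n qubits classically takes 2^n complex numbers. The memory figures below are a rough sketch, assuming 16 bytes per complex amplitude:

```python
# Why quantum parallelism is hard to imitate classically: a full
# description of n qubits requires 2**n complex amplitudes.
# Assumes 16 bytes per amplitude; figures are order-of-magnitude only.

for n in (10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n} qubits: {amplitudes:,} amplitudes, ~{gigabytes:,.4g} GB")
```

Ten qubits fit in a few kilobytes, thirty already need roughly 17 GB, and fifty would need tens of petabytes, which is one intuition behind the widely cited ~50-qubit threshold mentioned later in this article.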

Decoherence nevertheless remains a challenge for researchers. As soon as closed quantum systems start interacting with their environment, the system and environment state are changed irreversibly and errors can occur if this happens during the calculation process.

To ensure that operations are conducted without mistakes or errors, the qubits of a quantum computer should preferably be decoupled from their environment, which, in turn, maximizes the time before decoherence sets in. This creates a possible conflict of objectives, since it must also be possible to change the state of an individual qubit from the outside.

The number of qubits also plays an important technical role: the higher the number, the greater the expected speed advantage. At the same time, each additional qubit adds to the obstacles that must be overcome to avoid decoherence.

Five Criteria for Quantum Computers

Based on these ideas, in 1996 physicist David DiVincenzo formulated five criteria that a practical quantum computer must fulfill: a scalable physical system with well-characterized qubits; the ability to initialize the qubits to a simple known state; decoherence times much longer than the gate operation time; a universal set of quantum gates; and the ability to measure individual qubits.

So far, no one has succeeded in developing a system that fulfills all these requirements. This is partly because it remains unclear which physical candidates are best suited to implementing qubits. The energy levels of atoms and the angular momentum (spin) of electrons are currently under discussion, although many other possibilities are also being researched.

Applications for Quantum Computing

Further progress continues to be made in the development of quantum computers. To date, none of the prototypes has shown a definitive advantage over traditional supercomputers. This predominantly comes down to the number of qubits used. The widespread view is that 50 or more qubits should show a benefit, a number that has been officially announced but never achieved.

Experts expect the first standard quantum computer to appear at some point in the next 10 years. Yet those expecting to have a device under their desks at home may be disappointed; for the foreseeable future, this technology will most likely be used only to perform tasks on a large scale.

Quantum Cryptography: Already in Use

Beyond the development of quantum computers, other technologies benefiting from quantum mechanical effects have sparked interest. An example of this is quantum cryptography, which has been under development since the 1970s, and is now ready for implementation.

Data is the fuel of the 21st century. The world can benefit hugely from the spread of interconnected devices that generate and analyze data. At the same time, security risks such as data theft and data abuse continue to rise. Experts have estimated that cybercrime cost the economy $454 billion in 2016.

Compared to the solutions already available, quantum cryptographic processes can provide an additional level of safety and security. Discoveries in quantum physics suggest that such encryption is not only difficult to hack, but downright impossible to intercept undetected if it has been implemented correctly.

The aforementioned qualities of quantum systems form the basis for this level of security. Individual light particles transfer a code that is used in message encryption. These particles cannot be intercepted and measured without being disturbed. If someone were to try to intercept them, they would not be able to access the code without being detected.
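The detection mechanism can be sketched with a toy classical simulation in the style of the BB84 protocol (an assumption on my part; the article does not name a specific protocol). An eavesdropper who measures photons in randomly chosen bases corrupts about a quarter of the bits that sender and receiver later compare:

```python
import random

# Toy BB84-style sketch: sender and receiver compare bits sent and
# measured in matching bases. An eavesdropper who measures in random
# bases and re-sends disturbs ~25% of those bits, revealing herself.
# Purely classical simulation for illustration, not real QKD.

def run_bb84(n: int, eavesdrop: bool, rng: random.Random) -> float:
    """Return the error rate on positions where the bases matched."""
    errors = kept = 0
    for _ in range(n):
        bit = rng.randint(0, 1)            # sender's bit
        a_basis = rng.randint(0, 1)        # sender's encoding basis
        photon_bit, photon_basis = bit, a_basis
        if eavesdrop:                      # Eve measures and re-sends
            e_basis = rng.randint(0, 1)
            if e_basis != photon_basis:    # wrong basis randomizes result
                photon_bit = rng.randint(0, 1)
            photon_basis = e_basis
        b_basis = rng.randint(0, 1)        # receiver's measurement basis
        if b_basis == photon_basis:
            result = photon_bit
        else:
            result = rng.randint(0, 1)
        if b_basis == a_basis:             # only these positions are kept
            kept += 1
            errors += result != bit
    return errors / kept

rng = random.Random(0)
print(run_bb84(20000, eavesdrop=False, rng=rng))  # 0.0: no disturbance
print(run_bb84(20000, eavesdrop=True, rng=rng))   # close to 0.25
```

A nonzero error rate on the compared sample is the tell-tale signature of interception, which is exactly the guarantee the article describes.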

Progress in quantum computing development is the main motivation to continue developing quantum cryptography. Current encryption processes, such as RSA, rely on the assumption that no process exists that is fast enough for the prime factorization of large numbers. Yet in 1994, Peter Shor demonstrated that such factoring can be performed efficiently on a quantum computer. The first team to produce an adequately sized standard quantum computer could therefore break all such security systems.

Yet this development is still a long way away from the projected 1,000 qubits that would be needed to hack RSA. In areas where secure communication and data transfers are extremely important, quantum cryptography can already offer solutions to safeguard against current and future attacks.


Quantum Computing Explained | What is Quantum Computing?

In this series, Life’s Little Mysteries explains complex subjects in exactly 200 words.

Ordinary computers manipulate "bits" of information, which, like light switches, can be in one of two states (represented by 1 or 0). Quantum computers manipulate "qubits": units of information stored in subatomic particles, which, by the bizarre laws of quantum mechanics, may be in states |1> or |0>, or any "superposition" (linear combination) of the two. As long as the qubit is left unmeasured, it embodies both states at once; measuring it "collapses" it from the superposition to one of its terms. Now, suppose a quantum computer has two qubits. If they were bits, they could be in only one of four possible states (00, 01, 10, 11). A pair of qubits also has four states (|00>, |01>, |10>, |11>), but it can also exist in any combination of all four. As you increase the number of qubits in the system, you exponentially increase the amount of information they can collectively store. Thus, one can theoretically work with myriad pieces of information simultaneously by performing mathematical operations on a system of unmeasured qubits (instead of probing one bit at a time), potentially reducing computing times for complex problems from years to seconds. The difficult task is to efficiently retrieve information stored in qubits, and physicists aren't there yet.

Follow Natalie Wolchover on Twitter @nattyover.


New silicon structure opens the gate to quantum computers

In a major step toward making a quantum computer using everyday materials, a team led by researchers at Princeton University has constructed a key piece of silicon hardware capable of controlling quantum behavior between two electrons with extremely high precision. The study was published Dec. 7 in the journal Science.

The team constructed a gate that controls interactions between the electrons in a way that allows them to act as the quantum bits of information, or qubits, necessary for quantum computing. The demonstration of this nearly error-free, two-qubit gate is an important early step in building a more complex quantum computing device from silicon, the same material used in conventional computers and smartphones.

“We knew we needed to get this experiment to work if silicon-based technology was going to have a future in terms of scaling up and building a quantum computer,” said Jason Petta, a professor of physics at Princeton University. “The creation of this high-fidelity two-qubit gate opens the door to larger scale experiments.”

Silicon-based devices are likely to be less expensive and easier to manufacture than other technologies for achieving a quantum computer. Although other research groups and companies have announced quantum devices containing 50 or more qubits, those systems require exotic materials such as superconductors or charged atoms held in place by lasers.

Quantum computers can solve problems that are inaccessible with conventional computers. The devices may be able to factor extremely large numbers or find the optimal solutions for complex problems. They could also help researchers understand the physical properties of extremely small particles such as atoms and molecules, leading to advances in areas such as materials science and drug discovery.

The two-qubit silicon-based gate consists of two electrons (blue balls with arrows) in a layer of silicon (Si). By applying voltages through aluminum oxide (Al2O3) wires (red and green), the researchers trapped the electrons and coaxed quantum behaviors that transform their spin properties into quantum bits of information, or qubits. The image on the left shows a scanning electron micrograph of the device, which is about 200 nanometers (nm) across. The image on the right is a diagram of the device from the side.

Image courtesy of Science/AAAS

Building a quantum computer requires researchers to create qubits and couple them to each other with high fidelity. Silicon-based quantum devices use a quantum property of electrons called “spin” to encode information. The spin can point either up or down in a manner analogous to the north and south poles of a magnet. In contrast, conventional computers work by manipulating the electron’s negative charge.

Achieving a high-performance, spin-based quantum device has been hampered by the fragility of spin states: they readily flip from up to down, or vice versa, unless they can be isolated in a very pure environment. By building the silicon quantum devices in Princeton's Quantum Device Nanofabrication Laboratory, the researchers were able to keep the spins coherent, that is, in their quantum states, for relatively long periods of time.

To construct the two-qubit gate, the researchers layered tiny aluminum wires onto a highly ordered silicon crystal. The wires deliver voltages that trap two single electrons, separated by an energy barrier, in a well-like structure called a double quantum dot.

By temporarily lowering the energy barrier, the researchers allow the electrons to share quantum information, creating a special quantum state called entanglement. These trapped and entangled electrons are now ready for use as qubits, which are like conventional computer bits but with superpowers: while a conventional bit can represent a zero or a 1, each qubit can be simultaneously a zero and a 1, greatly expanding the number of possible permutations that can be compared instantaneously.

“The challenge is that it’s very difficult to build artificial structures small enough to trap and control single electrons without destroying their long storage times,” said David Zajac, a graduate student in physics at Princeton and first author on the study. “This is the first demonstration of entanglement between two electron spins in silicon, a material known for providing one of the cleanest environments for electron spin states.”

The researchers demonstrated that they can use the first qubit to control the second qubit, signifying that the structure functioned as a controlled NOT (CNOT) gate, which is the quantum version of a commonly used computer circuit component. The researchers control the behavior of the first qubit by applying a magnetic field. The gate produces a result based on the state of the first qubit: If the first spin is pointed up, then the second qubit’s spin will flip, but if the first spin is down, the second one will not flip.

“The gate is basically saying it is only going to do something to one particle if the other particle is in a certain configuration,” Petta said. “What happens to one particle depends on the other particle.”
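Mapping spin up to 1 and spin down to 0 (a convention assumed here for illustration; the article does not fix one), the conditional behavior Petta describes is exactly the CNOT truth table:

```python
# Truth table of the controlled-NOT (CNOT) gate on basis states:
# the target bit flips only when the control bit is 1.

def cnot(control: int, target: int) -> tuple[int, int]:
    """Apply CNOT to a pair of classical basis states."""
    return control, target ^ control  # XOR flips target iff control is 1

for c in (0, 1):
    for t in (0, 1):
        print((c, t), "->", cnot(c, t))
# (0, 0) -> (0, 0)
# (0, 1) -> (0, 1)
# (1, 0) -> (1, 1)
# (1, 1) -> (1, 0)
```

On basis states this looks like ordinary classical logic; the quantum power comes from applying the same gate to a control qubit in superposition, which is what creates the entangled states discussed earlier in the article.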

The researchers showed that they can maintain the electron spins in their quantum states with a fidelity exceeding 99 percent and that the gate works reliably to flip the spin of the second qubit about 75 percent of the time. The technology has the potential to scale to more qubits with even lower error rates, according to the researchers.

“This work stands out in a worldwide race to demonstrate the CNOT gate, a fundamental building block for quantum computation, in silicon-based qubits,” said HongWen Jiang, a professor of physics and astronomy at the University of California, Los Angeles. “The error rate for the two-qubit operation is unambiguously benchmarked. It is particularly impressive that this extraordinarily difficult experiment, which requires sophisticated device fabrication and exquisite control of quantum states, was done in a university lab consisting of only a few researchers.”

Additional researchers at Princeton are graduate student Felix Borjans and associate research scholar Anthony Sigillito. The team included input on the theory aspects of the work by Jacob Taylor, a professor at the Joint Quantum Institute and Joint Center for Quantum Information and Computer Science at the National Institute of Standards and Technology and the University of Maryland, and Maximilian Russ and Guido Burkard at the University of Konstanz in Germany.

Research was sponsored by U.S. Army Research Office grant W911NF-15-1-0149, the Gordon and Betty Moore Foundation’s EPiQS Initiative through grant GBMF4535, and National Science Foundation grant DMR-1409556. Devices were fabricated in the Princeton University Quantum Device Nanofabrication Laboratory.

The study, “Resonantly driven CNOT gate for electron spins,” by David M. Zajac, Anthony J. Sigillito, Maximilian Russ, Felix Borjans, Jacob M. Taylor, Guido Burkard and Jason R. Petta was published online in the journal Science on Dec. 7, 2017.


Microsoft offers developers a preview of its quantum …

The simulator will allow developers to test programs and debug code with their own computers, which is necessary since there really aren’t any quantum computers for them to test their work on yet. Microsoft is also offering a more powerful simulator — one with over 40 logical qubits of computing power — through its Azure cloud computing service. And because the kit is integrated into Microsoft’s Visual Studio developer tool suite, many aspects of the new kit will be familiar.

“What you’re going to see as a developer is the opportunity to tie into tools that you already know well, services you already know well,” Todd Holmdahl, Microsoft’s VP in charge of its quantum effort, said in a statement. “There will be a twist with quantum computing, but it’s our job to make it as easy as possible for the developers who know and love us to be able to use these new tools that could potentially do some things exponentially faster, which means going from a billion years on a classical computer to a couple of hours on a quantum computer.”


Quantum Computing Is the Next Big Security Risk | WIRED

The 20th century gave birth to the Nuclear Age as the power of the atom was harnessed and unleashed. Today, we are on the cusp of an equally momentous and irrevocable breakthrough: the advent of computers that draw their computational capability from quantum mechanics.

US representative Will Hurd (R-Texas) (@HurdOnTheHill) chairs the Information Technology Subcommittee of the Committee on Oversight and Government Reform and serves on the Committee on Homeland Security and the Permanent Select Committee on Intelligence.

The potential benefits of mastering quantum computing, from advances in cancer research to unlocking the mysteries of the universe, are limitless.

But that same computing power can be used to unlock different kinds of secrets: from your personal financial or health records, to corporate research projects and classified government intelligence.

It's more than just theoretical: an algorithm formulated by mathematician Peter Shor demonstrates that quantum computers are able to factor large numbers more efficiently than classical computers. Large-number factoring is the foundation of today's encryption standards.

The impact of quantum on our national defense will be tremendous. The question is whether the United States and its allies will be ready.

The consequences of mastering quantum computing, while not as visual or visceral as a mushroom cloud, are no less significant than those faced by the scientists who lit up the New Mexico sky with the detonation at the Trinity test site 72 years ago. In the same way that atomic weaponry symbolized power throughout the Cold War, quantum capability is likely to define hegemony in todays increasingly digital, interconnected global economy.

Unlike traditional computers, which process information in binary bits, quantum computers exploit the ability of quantum bits (qubits) to exist in multiple states simultaneously. This allows them to perform incredibly complex calculations at speeds unimaginable today and solve certain classes of problems that are beyond the grasp of todays most advanced super computers.

Today, quantum computers are beginning to move out of research labs in search of broader investment and applications. In October, Google announced that by the end of this year it expects to achieve quantum supremacy: the point at which a quantum computer can outperform a classical computer.

Because nations around the world, including China, are investing heavily in research and development, the world is likely less than a decade away from the day when a nation-state could use quantum computers to render many of today's most sophisticated encryption systems useless.

From academics to the National Security Agency, there is widespread agreement that quantum computers will rock current security protocols that protect global financial markets and the inner workings of government.

Already, intelligence agencies around the world are archiving intercepted communications transmitted with encryption that's currently all but unbreakable, in the hope that future computing advances will turn what's gibberish now into potentially valuable intelligence. Rogue states may also be able to leverage the power of quantum computing to attack the banking and financial systems at the heart of Western capitalism.

Everyone has seen the damage individual hackers can do when they infiltrate a system. Imagine a nation-state intercepting the encrypted financial data that flows across the globe and being able to read it as easily as you are reading this. Quantum computers are so big and expensive that, outside of global technology companies and well-funded research universities, most will be owned and maintained by nation-states. That means the first quantum attacks are likely to be organized by countries hostile to the US and our allies. Rogue states could read military communiqués the way the United States and its allies did after cracking the Nazi Enigma codes.

In short, quantum computing presents both an unprecedented opportunity and a serious threat. The United States must lead this transition, in collaboration with its allies around the world. Whether lawmakers want to think of it as a new Manhattan Project or a race to the moon, the US cannot abdicate leadership in scientific discovery or international security.

The window is closing, fast. It took more than five years and nearly half a trillion dollars for companies and governments to prepare for Y2K, which resulted in a non-event for most people. But the US is not ready for what experts call Y2Q (Years to Quantum), and the time to prepare is now. Even in a pre-quantum era, the need for quantum-safe encryption is real. Banks, government agencies, insurers, hospitals, utilities, and airlines all need to be thinking now about how to implement security and encryption that will withstand a quantum attack.

On complex, large-scale networks, it can take years to roll out even a relatively straightforward update. Quantum-safe encryption relies on mathematical problems that even quantum computers have difficulty solving. The challenge is ensuring that every point through which data flows, and even the data itself, is wrapped in quantum-safe security.
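One family of such quantum-resistant approaches is lattice-based cryptography, built on problems like learning-with-errors (LWE), for which no efficient quantum algorithm is known. The following is a deliberately tiny, insecure toy sketch of single-bit LWE-style encryption (parameters chosen for readability, not security; real schemes use far larger dimensions and carefully chosen noise):

```python
import random

# Toy parameters: far too small for real security, for illustration only.
N, Q = 8, 97  # secret-vector length and modulus

def keygen():
    """Secret key: a random vector over Z_q."""
    return [random.randrange(Q) for _ in range(N)]

def encrypt(bit, s):
    """Hide the bit in the 'high half' of Z_q, masked by a*s plus small noise."""
    a = [random.randrange(Q) for _ in range(N)]
    e = random.randint(-2, 2)  # small error term; this noise is what makes LWE hard
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (Q // 2)) % Q
    return a, b

def decrypt(ct, s):
    """Strip off a*s; the residue sits near 0 (bit 0) or near q/2 (bit 1)."""
    a, b = ct
    d = (b - sum(ai * si for ai, si in zip(a, s))) % Q
    return 0 if min(d, Q - d) < abs(d - Q // 2) else 1

s = keygen()
assert decrypt(encrypt(0, s), s) == 0
assert decrypt(encrypt(1, s), s) == 1
```

The point of the sketch is the asymmetry: anyone can add noise, but recovering the secret from many noisy samples is believed hard even with a quantum computer, which is why lattice problems feature heavily among the candidate algorithms NIST is evaluating.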

Private sector research and development are happening in pockets across North America and among the US’s allies. Google and IBM both have well-publicized programs to build viable quantum computers. At the same time, though, the US and its allies must take practical steps to prepare for the quantum threat. The National Institute of Standards and Technology is working to evaluate quantum-safe cryptographic candidate algorithms. Other organizations like the European Telecommunications Standards Institute and the United Nations International Telecommunications Union are working to ensure our standards for connecting systems continue to evolve to be quantum safe. Companies like ISARA are among a small cadre of cryptographers and programmers building quantum-safe security solutions to help high-risk industries and organizations begin protecting themselves.

It's these kinds of efforts that the US and its allies must collaborate on to align the goals of scientific discovery, technological advancement, and national security. As companies build powerful quantum machines, leaders must simultaneously understand the risks those machines pose and the countermeasures required. Executives in every industry need to understand the implications that quantum computing will have on their legacy systems, and take steps to be ready. At a minimum, that means retrofitting their networks, computers, and applications with encryption that can withstand a quantum attack.

Nowhere is it more vital to begin preparations than with the vast network of governmental systems that do everything from processing Social Security checks to analyzing vast amounts of electronic intelligence.

Whether it was the discovery of fission or the launch of Sputnik, the United States has responded to scientific challenges of the past century with resolve and determination. The US must do the same with quantum computing.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here.

Read the original here:
Quantum Computing Is the Next Big Security Risk | WIRED