What will you actually use quantum computing for?

With a tip of the hat to our Big on Data bro George Anadiotis, this week we're breaking from our usual routine of the here and now to look at what's coming next. Mention the words quantum computing, and your first impression is probably that we're about to spout science fiction.

So what is quantum computing? It harnesses the physics of subatomic particles to provide a different way to store data and solve problems than conventional computers do. Specifically, it totally turns the world of conventional binary computing on its side, because quantum computing bits, or qubits, can represent multiple states at once rather than just 0 or 1. The result is that quantum computers could solve certain HPC-like problems more efficiently.
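To make that contrast concrete, here is a minimal, purely illustrative Python sketch (not how real quantum hardware is programmed): a classical bit holds one value, while a qubit's state is a vector of amplitudes, and an n-qubit register tracks 2^n of them.

```python
import numpy as np

# Illustrative sketch only: a classical bit is either 0 or 1, while a single
# qubit's state is a vector of two complex amplitudes whose squared
# magnitudes give the probability of reading 0 or 1 when measured.

classical_bit = 1  # exactly one value at a time

# An equal superposition: amplitude 1/sqrt(2) on |0> and on |1>.
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

probabilities = np.abs(qubit) ** 2  # [0.5, 0.5]
print("P(0), P(1):", probabilities)

# With n qubits the state vector has 2**n amplitudes, which is what lets a
# quantum computer "hold" an exponentially large space of possibilities.
n = 3
register = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
print("Amplitudes tracked for", n, "qubits:", register.size)
```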

Oh, and by the way, did we mention that quantum computers must run at about 4 Kelvin? That's 4 degrees above absolute zero, roughly as cold as the depths of interstellar space.

It's tempting to dismiss quantum computers as the computing equivalent of Warp Speed out of Star Trek. Then again, it was barely a few months ago when we saw SAS founder James Goodnight talking to Alexa to gin up a SAS analytics run in much the same way that Captain James T. Kirk spoke to his computers.

So why are we having this conversation?

Our interest was piqued by a chain of events over the past month. IBM first convened an analyst call around an upcoming article in the scientific journal Nature showing how a quantum computing model of complex molecular behavior could be documented in a Jupyter notebook. (If you want to get technical, it was about how to derive the lowest energy state of a molecule of beryllium hydride.)
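For a sense of what "lowest energy state" means computationally, here is a purely classical sketch: the ground-state energy is the smallest eigenvalue of the molecule's Hamiltonian matrix. The tiny matrix below is a made-up stand-in, not the actual beryllium hydride Hamiltonian, which is far too large to write down here.

```python
import numpy as np

# Classical analogue of the quantum experiment: the lowest energy state is
# the smallest eigenvalue of the Hamiltonian. The 2x2 matrix is hypothetical.
hamiltonian = np.array([
    [-1.05,  0.39],
    [ 0.39, -0.23],
])

ground_energy = np.linalg.eigvalsh(hamiltonian).min()
print(f"Lowest energy (ground state): {ground_energy:.4f}")

# The quantum approach matters because the matrix for a real molecule grows
# exponentially with its size, which is exactly where classical
# diagonalization like this gives out.
```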

Then Satya Nadella assembled a panel of Microsoft researchers to conclude his Ignite conference keynote with a session on pure theoretical physics that sailed straight over the heads of the business analyst and developer audience. Fortunately, the IBM call was way more plain-spoken, addressing how quantum computers could be applied to common business problems, and where the technology stands today.

Turns out, quantum computers represent advances that would look familiar to veterans of big data analytics, where you could query all of the data, not just a sample. They would also look familiar to those working with graph computing, where you could factor in the complexity of many-to-many relationships that would otherwise require endless joins in relational data models.

Quantum computing lends itself to any optimization problem where the combination of what-ifs, and all the permutations associated with them, would simply overwhelm a conventional binary computer. That covers a large trove of mundane business and operational problems that are surprisingly familiar.

For instance, if you try to optimize a supply chain, chances are you are narrowing down the problem to tackle the dozen most likely scenarios. With the resources of quantum computing, you could widen and deepen the analysis to virtually all possible scenarios. The same goes for tangible business challenges like managing financial risk when you have a complex tangle of interlocking trading systems across the globe. Or imagine, during drug testing, that a clinical research team could model all the potential interactions of a new drug with virtually the entire basket of medications that a specific patient cohort would likely also be taking. And from there, could true personalized medicine be far behind?
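To put rough numbers on that combinatorial explosion, here is a back-of-the-envelope sketch; the decision points and counts are hypothetical, purely to show how quickly the scenario space outgrows a "dozen most likely scenarios" approach.

```python
from math import prod

# Hypothetical supply-chain decision points and how many options each offers.
decision_points = {
    "suppliers": 12,      # candidate suppliers per component
    "routes": 8,          # shipping routes
    "warehouses": 6,      # stocking locations
    "demand_curves": 10,  # demand forecasts to test against
}

# The exhaustive scenario count is the product of the options at every step.
scenarios = prod(decision_points.values())
print(f"Scenarios to evaluate exhaustively: {scenarios:,}")  # 5,760

# Layer on a few more interacting what-ifs and the count explodes well past
# anything a team hand-picking twelve scenarios can cover.
print(f"With 20 binary what-ifs layered on top: {scenarios * 2**20:,}")
```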

But quantum computing development is still embryonic. A small Canadian startup, D-Wave Systems, is selling units on a limited basis today. IBM is offering a half dozen machines of 5 to 17 qubits in the cloud, while Google is developing architectures that could scale up to 49 qubits. So it's not surprising that quantum still hits the wall with classes of problems that require complex, iterative processing (which, by the way, is what Spark excels at).

A good example of the type of problem that, for now, is just out of reach is encryption/decryption. As the algorithms grow more complex, it means factoring numbers built from larger and larger primes. Turns out, the interactions between qubits (a phenomenon called quantum entanglement) could short-cut such problems, cutting the number of steps to roughly the square root of the number of entries. The bottleneck is memory; such computations would require storing state or interim results, much like a Spark or MapReduce problem. The problem is that, while development of compute chips is underway, nobody yet knows what true quantum memory would look like.
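For a feel of what that square-root short-cut means, here is some quick arithmetic; the search-space sizes are illustrative (the speedup described is the one associated with Grover-style quantum search), not a statement about any particular cipher.

```python
import math

# Compare brute-force step counts with the ~sqrt(N) steps of a
# Grover-style quantum search, for a few illustrative search-space sizes.
search_space_sizes = [2**20, 2**40, 2**60]

for n in search_space_sizes:
    classical_steps = n            # brute force checks every entry
    quantum_steps = math.isqrt(n)  # roughly sqrt(N) queries
    print(f"N = 2^{int(math.log2(n))}: "
          f"classical ~{classical_steps:,} steps, "
          f"quantum ~{quantum_steps:,} steps")

# The catch noted above: each of those sqrt(N) steps still needs interim
# results stored somewhere, and true quantum memory for that does not yet
# exist.
```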

That would imply that for some problems, a division of labor, where quantum computers factor the permutations while conventional scale-out systems handle the iterative processing, might be an interim (or long-term) step.

There is a surprisingly sizable number of organizations currently pursuing quantum computing. Right now, most of the action is basic, government-funded R&D, although some reports estimate VC investment over the past three years at roughly $150 million. On one hand, it would be easy to get overly optimistic about near-term prospects for development, given the rate at which technologies as varied as smart mobile devices, the Internet of Things, big data analytics, and cloud computing have blossomed from practically nothing a decade ago.

But the barriers to adoption of quantum are both physical and intellectual.

There is the physical need to super-cool machines, which in eras past would have posed huge obstacles. But the cloud will likely do for quantum machines what it is already starting to do for GPUs: provide the economics for scale-out.

That leaves several more formidable hurdles. The physics of scale-out still requires basic rather than applied research: we still need to figure out how to scale such a large, fragile system. But the toughest challenge is likely to be intellectual, as conceptualizing a quantum computing problem will require a different way of thinking. That suggests the onramp to quantum will prove more gradual than it was for the breakout technologies of the last decade.
