Category Archives: Quantum Computing

Intel Takes First Steps To Universal Quantum Computing

October 11, 2017 | Timothy Prickett Morgan

Someone is going to commercialize a general purpose, universal quantum computer first, and Intel wants to be the first. So does Google. So does IBM. And D-Wave is pretty sure it already has done this, even if many academics and a slew of upstart competitors don't agree. What we can all agree on is that there is a very long road ahead in the development of quantum computing, and it will be a costly endeavor that could nonetheless help solve some intractable problems.

This week, Intel showed off the handiwork of its engineers and those of partner QuTech, a quantum computing spinoff from the Technical University of Delft and Toegepast Natuurwetenschappelijk Onderzoek (TNO), which as the name suggests is an applied science research firm that, among other things, is working with Intel on quantum computing technology.

TNO, which was established in 1932, has a roughly €500 million annual budget and does all kinds of primary research. The Netherlands has become a hotbed of quantum computing technology, along with the United States and Japan, and its government wants to keep it that way; hence the partnership struck in late 2015 with Intel, which invested $50 million in the QuTech partnership between TU Delft and TNO so it could jumpstart its own quantum computing program after sitting on the sidelines.

With this partnership, Intel is bringing its expertise in materials science, semiconductor manufacturing, interconnects, and digital systems to bear to help develop two types of quantum bits, or qubits, which are the basic element of processing in a quantum computer. The QuTech partnership involves the manufacturing of superconducting qubits, but Intel also is working on another technology, called spin qubits, that makes use of more traditional semiconductor technologies to create what is, in essence, the quantum transistor for this very funky and very parallel style of computing.

The big news this week is that Intel has been able to take a qubit design that its engineers created alongside those working at QuTech and scale it up to 17 qubits on a single package. A year ago, the Intel-QuTech partnership had only a few qubits on its initial devices, Jim Clarke, director of quantum hardware at Intel, tells The Next Platform, and two years ago it had none. So that is a pretty impressive roadmap in a world where Google is testing a 20 qubit chip and hopes to have one running at 49 qubits before the year is out. Google also has quantum annealing systems from D-Wave, which have much more scale in terms of qubits (1,000 today and 2,000 on the horizon) but which, according to Intel, are not generic enough to be properly commercialized. And if Intel knows anything, it knows how to create a universal computing substrate and scale its manufacturing and its deployment in the datacenters of the world.

Production and cleanroom facilities for the quantum chip made at Intel's D1D/D1X plant in Hillsboro, Oregon, in April 2017.

"We are trying to build a general purpose, universal quantum computer," says Clarke. "This is not a quantum annealer, like the D-Wave machine. There are many different types of qubits, which are the devices for quantum computing, and one of the things that sets Intel apart from the other players is that we are focused on multiple qubit types. The first is a superconducting qubit, which is similar to what Google, IBM, and a startup named Rigetti Computing are working on. But Intel is also working on spin qubits in silicon, which are very similar to our transistor technologies, and you can expect to hear about that in the next couple of months. These spin qubits build on our expertise in ordinary chip fabrication, and what really sets us apart here is our use of advanced packaging at very low temperatures to improve the performance of the qubit, and with an eye towards scalability."

Just as people are obsessed with the number of transistors or cores on a standard digital processor, people are becoming a bit obsessed with the number of qubits on a quantum chip, and Jim Held, director of emerging technology research at Intel Labs, says that this focus is a bit misplaced. And for those of us who look at systems for a living, this makes perfect sense. Intel is focused on getting the system design right, and then scaling it up on all vectors to build a very powerful quantum machine.

Here is the situation as Held sees it, and breathe in deeply here:

"People focus on the number of qubits, but that is just one piece of what is needed. We are really approaching this as engineers, and everything is different about this kind of computer. It is not just the devices, but the control electronics and how the qubits are manipulated with microwave pulses and measured with very sensitive DC instrumentation, and it is more like an analog computer in some respects. Then it has digital electronics that do error correction because quantum devices are very fragile, and they are prone to errors, and to the degree that we can correct the errors, we can compute better and longer with them. It also means a new kind of compiler in order to get the potential parallelism in an array of these qubits, and even the programs, the algorithms, written for these devices are an entirely different kind of thing from conventional digital programming. Every aspect of the stack is different. While there is research going on in the academic world at all levels, as an engineering organization we are coming at them all together because we know we have to deliver them all at once as a computer. Moreover, our experience tells us that we want to understand at any given point what our choices at one level are going to mean for the rest of the computer. What we know is that if you have a plate full of these qubits, you do not have a quantum computer, and some of the toughest problems with scaling are in the rest of the stack. Focusing on the number of qubits or the coherence time really does a disservice to the process of getting to something useful."

This is analogous to massively parallel machines that don't have enough bandwidth or low enough latency to talk across cores, sockets, or nodes efficiently and to share work. You can cram as many cores as you want into them, but the jobs won't finish faster.

And thus, Intel is focusing its research on the interconnects that will link qubits together on a device and across multiple devices.

"The interconnects are one of the things that concern us most with quantum computing," says Clarke. "From the outset, we have not been focused on a near-term milestone, but rather on what it would take from the interconnect perspective, from the point of view of the design and the control, to deliver a large scale, universal quantum computer."

Interestingly, Clarke says that the on-chip interconnect on commercial quantum chips will be similar to that used on a conventional digital CPU, though it may be made not of copper wires but rather of superconducting materials.

The one used in the superconducting qubit chip that Intel just fabbed in its Oregon factory and packaged in its Arizona packaging facility is a bit ridiculous looking.

Quantum computing presents a few physical challenges, and superconducting qubits are especially tricky. Preserving the quantum states that allow superposition (a kind of multiple, concurrent state of the bits that allows for parallel processing at the bit level, to oversimplify hugely) requires these analog devices to be kept at extremely cold temperatures, and yet they still have to interface with the control electronics in the outside world, crammed into a rack.

"We are putting these chips in an extremely cold environment, 20 millikelvins, and that is much colder than outer space," says Clarke. "And first of all, we have to make sure that the chip doesn't fall apart at these temperatures. You have to worry about the thermal coefficient of expansion. Then you need to worry about package yield and then about the individual qubit yield. Then we worry about wiring them up in a more extensible fashion. These are very high quality radio or microwave frequency chips and we have to make sure we maintain that quality at low temperature once the device is packaged. A lot of the performance and yield that we are getting comes from the packaging."

So for this chip, Intel has wallpapered one side of the chip with standard coaxial ports, like the ones on the back of your home router. Each qubit has two or more coax ports going into it to control its state and to monitor that state. How retro:

"We are focused on a commercial machine, so we are much more interested in scaling issues," Held continues along this line of thinking. "You have to be careful to not end up in a dead end that only gets you so far." This quantum chip interconnect is not sophisticated like Omni-Path, and it does not scale well, Held adds with a laugh. "What we are interested in is improving on that to reduce the massive number of connections. A million qubits turning into millions of coax cables is obviously not going to work. Even at hundreds of qubits, this is not going to work. One way we are going to do this is to move the electronics that control this quantum machine into this very cold environment, not down at the millikelvin level, but a layer or two up at the 4 kelvin temperature of liquid helium. Our partners at QuTech are experts at cryo-CMOS, which means making chips work in this 4 kelvin range. By moving this control circuitry from a rack outside of the quantum computer into the refrigeration unit, it cuts the length of the connections to the qubits."

With qubits, superposition allows a single qubit to represent two different states, and quantum entanglement (what Einstein called "spooky action at a distance") allows the number of states to scale exponentially as the qubit counts go up: technically, n quantum bits yield 2^n states. The interconnect is not used to maintain the quantum states across the qubits (that happens because of physics) but to monitor the qubit states and maintain those states and, importantly, to do error correction. Qubits can't be shaken or stirred or they lose their state, and they are extremely fussy. As Google pointed out two years ago at the International Supercomputing Conference in Germany, a quantum computer could end up being an accelerator for a traditional parallel supercomputer, which is used to do error correction and monitoring of qubits. Intel is also thinking this might happen.
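To make that 2^n scaling concrete, here is a minimal NumPy sketch (our illustration, not Intel's or QuTech's code) that counts the amplitudes a classical simulator must track and builds the canonical two-qubit entangled state:

```python
import numpy as np

# Single-qubit basis states |0> and |1>.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An n-qubit register carries 2**n amplitudes: each added qubit
# doubles the size of the state vector a classical simulator tracks.
for n in (1, 2, 10, 20):
    print(f"{n} qubits -> {2**n} basis states")

# A Bell pair, (|00> + |11>)/sqrt(2): the canonical entangled state,
# built with the Kronecker (tensor) product of single-qubit states.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(bell)  # four amplitudes for two qubits
```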

The fussiness of superconducting qubits is probably one of the reasons why Intel is looking to spin qubits and a more standard semiconductor process to create a quantum computer chip whose state is easier to maintain. The other is that Intel is obviously an expert at manufacturing semiconductor devices. So, we think, the work with QuTech is as much about creating a testbed system and a software stack that might be portable as it is about investing in this particular superconducting approach. Time will tell.

And time, indeed, it will take. Both Held and Clarke independently think it will take maybe eight to ten years to get a general purpose, universal quantum computer commercialized and operating at a useful scale.

"It is research, so we are only coming to timing based on how we think we are going to solve a number of problems," says Held. "There will be a milestone where a machine will be able to tackle interesting but small problems, and then there will be a full scale machine that is mature enough to be a general purpose, widely useful accelerator in the supercomputer environment or in the cloud. These will not be free-standing computers because they don't do a lot of things that a classical digital computer does really well. They could do them, because in theory any quantum computer can do anything a digital computer can do, but they don't do it well. It is going to take on the order of eight to ten years to solve these problems we are solving now. They are all engineering problems; the physicists have done an excellent job of finding feasible solutions, and the task now is getting them out of the lab and scaling them up."

Clarke adds a note of caution, pointing out that there are a lot of physics problems that need to be solved for the packaging aspects of a quantum computer. "But I think to solve the next level of physics problems, we need a healthy dose of engineering and process control," Clarke says. "I think eight to ten years is probably fair. We are currently at mile one of a marathon. Intel is already in the lead pack. But when we think of a commercially relevant quantum computer, we think of one that is relevant to the general population, and moreover, one that would show up on Intel's bottom line. The key is that we are building a system, and at first, that system is going to be pretty small, but it is going to educate us about all aspects of the quantum computing stack. At the same time, we are designing that system for extensibility, both at the hardware level and at the architecture control level, to get to many more qubits. We want to make the system better, and larger, and it is probably a bit premature to start assigning numbers to that other than to say that we are thinking about the longer term."

It seems we might need a quantum computer to figure out when we might get a quantum computer.

Categories: Cloud, Compute, HPC, Hyperscale

Tags: Delft, Intel, quantum, qubit, QuTech, spin qubit, Superconducting, TNO

See the rest here:
Intel Takes First Steps To Universal Quantum Computing

quantum computing – engadget.com

According to Intel, the building blocks of quantum computing, qubits, are very fragile. They can only operate at extremely low temperatures (250 times colder than deep space) and must be packaged carefully to prevent data loss. Intel's research groups in Oregon and Arizona have found a way to manufacture 17-qubit chips with an architecture that makes them more reliable at higher temperatures and reduces RF interference between qubits. The chip can send and receive 10 to 100 times more signal than comparable wire-bonded chips and has an advanced design that allows the techniques to be applied to larger quantum integrated circuits, which are much bigger than typical silicon chips.

“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Intel Labs’ Dr. Michael Mayberry. “Intel’s expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing.”

Go here to read the rest:
quantum computing – engadget.com

Qudits: The Real Future of Quantum Computing? – IEEE Spectrum

Instead of creating quantum computers based on qubits that can each adopt only two possible options, scientists have now developed a microchip that can generate qudits that can each assume 10 or more states, potentially opening up a new way of creating incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, that, because of the bizarre nature of quantum physics, can be in a state of superposition where they simultaneously act as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
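That 300-qubit claim is easy to sanity-check with Python's arbitrary-precision integers; this small check (ours, using the common ~10^80 estimate for atoms in the observable universe) confirms it:

```python
# A 300-qubit register spans 2**300 basis states; atoms in the
# observable universe are commonly estimated at about 10**80.
states = 2**300
atoms_estimate = 10**80

print(len(str(states)) - 1)     # 2**300 is roughly 10**90
print(states > atoms_estimate)  # True, with ~10 orders of magnitude to spare
```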

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle use qudits with more than two states. In principle, a quantum computer with two 32-state qudits, for example, would be able to perform as many operations as 10 qubits while skipping the challenges inherent in working with 10 qubits together.
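The 32-state example follows from simple dimension counting; a quick sketch (ours) verifying the equivalence:

```python
import math

# Two d-state qudits span d**2 dimensions; k qubits span 2**k.
# For the 32-state example above: 32**2 == 2**10 == 1024, so a pair
# of 32-state qudits matches the state space of 10 qubits.
assert 32**2 == 2**10 == 1024

# In general, one d-state qudit carries log2(d) qubits' worth of
# state space, so two 32-state qudits carry 2 * 5 = 10 qubits' worth.
print(2 * math.log2(32))  # 10.0
```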

Researchers used the setup pictured above to create, manipulate, and detect qudits. The experiment starts when a laser fires pulses of light into a micro-ring resonator, which in turn emits entangled pairs of photons. Because the ring has multiple resonances, the photons have optical spectra with a set of evenly spaced frequencies (red and blue peaks), a process known as spontaneous four-wave mixing (SFWM). The researchers were able to use each of the frequencies to encode information, which means the photons act as qudits. Each qudit is in a superposition of 10 possible states, extending the usual binary alphabet (0 and 1) of quantum bits. The researchers also showed they could perform basic gate operations on the qudits using optical filters and modulators, and then detect the results using single-photon counters.

Now scientists have for the first time created a microchip that can generate two entangled qudits, each with 10 states, for 100 dimensions total, more than what six entangled qubits could generate. "We have now achieved the compact and easy generation of high-dimensional quantum states," says study co-lead author Michael Kues, a quantum optics researcher at Canada's National Institute of Scientific Research (INRS, its French acronym) in Varennes, Quebec.

The researchers developed a photonic chip fabricated using techniques similar to ones used for integrated circuits. A laser fires pulses of light into a micro-ring resonator, a 270-micrometer-diameter circle etched onto silica glass, which in turn emits entangled pairs of photons. Each photon is in a superposition of 10 possible wavelengths or colors.

"For example, a high-dimensional photon can be red and yellow and green and blue, although the photons used here were in the infrared wavelength range," Kues says. Specifically, one photon from each pair spanned wavelengths from 1534 to 1550 nanometers, while the other spanned from 1550 to 1566 nanometers.

Using commercial off-the-shelf telecommunications components, the researchers showed they could manipulate these entangled photons. "The basic capabilities they show are really what you need to do universal quantum computation," says quantum optics researcher Joseph Lukens at Oak Ridge National Laboratory, in Tennessee, who did not take part in this research. "It's pretty exciting stuff."

In addition, by sending the entangled photons through a 24.2-kilometer-long optical fiber telecommunications system, the researchers showed that entanglement was preserved over large distances. This could prove useful for nigh-unhackable quantum communications applications, the researchers say.

"What I think is amazing about our system is that it can be created using components that are out on the market, whereas other quantum computer technologies need state-of-the-art cryogenics, state-of-the-art superconductors, state-of-the-art magnets," says study co-senior author Roberto Morandotti, a physicist at INRS in Varennes. "The fact that we use basic telecommunications components to access and control these states means that a lot of researchers could explore this area as well."

The scientists noted that current state-of-the-art components could conceivably generate entangled pairs of 96-state qudits, corresponding to more dimensions than 13 qubits. "Conceptually, in principle, I don't see a limit to the number of states of qudits right now," Lukens, from Oak Ridge, says. "I do think a 96-by-96-dimensional system is fairly reasonable, and achievable in the near future."
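Both dimension claims in the article check out with the same arithmetic; a brief verification (ours):

```python
# Two entangled 10-state qudits span 10 * 10 = 100 dimensions,
# more than six entangled qubits can offer (2**6 = 64).
print(10 * 10 > 2**6)  # True

# Two 96-state qudits would span 96**2 = 9216 dimensions,
# more than thirteen qubits (2**13 = 8192).
print(96**2 > 2**13)   # True: 9216 > 8192
```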

But he adds that several components of the experiment were not on the microchips, such as the programmable filters and phase modulators, which led to photon loss. Kues says that integrating such components with the rest of the chips and optimizing their micro-ring resonator would help reduce such losses, making their system more practical for use.

"The next big challenge we will have to solve is to use our system for quantum computation and quantum communications applications," Kues says. "While this will take some additional years, it is the final step required to achieve systems that can outperform classical computers and communications."

The scientists detailed their findings in the latest issue of the journal Nature.

Originally posted here:
Qudits: The Real Future of Quantum Computing? – IEEE Spectrum

Quantum Computing | Intel Newsroom

Quantum computing is an exciting new computing paradigm with unique problems to be solved and new physics to be discovered. Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can't handle. For example, quantum computers may simulate nature to advance research in chemistry, materials science and molecular modeling.

In 2015, Intel established a collaborative relationship with QuTech to accelerate advancements in quantum computing. The collaboration spans the entire quantum system, or stack, from qubit devices to the hardware and software architecture required to control these devices, as well as quantum applications. All of these elements are essential to advancing quantum computing from research to reality.

Download A Quantum Computing Primer

Intel's director of quantum hardware, Jim Clarke, holds the new 17-qubit superconducting test chip. (Credit: Intel Corporation)

Intel's 17-qubit superconducting test chip for quantum computing has unique features for improved connectivity and better electrical and thermo-mechanical performance. (Credit: Intel Corporation)

Researchers work in the quantum computing lab at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: QuTech)

Professor Leo DiCarlo poses in the quantum computing lab at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: QuTech)

Intel is collaborating with QuTech in the Netherlands to advance quantum computing research. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: Intel Corporation)

Intel's new 17-qubit superconducting test chip packaged for delivery to research partners at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech with the 17-qubit superconducting test chip for quantum computing. (Credit: Intel Corporation)

Read more here:
Quantum Computing | Intel Newsroom

What will you actually use quantum computing for? | ZDNet

With a tip of the hat to our Big on Data bro George Anadiotis, this week, we’re breaking from our usual routine of the here and now to look at what’s coming next. Mention the words quantum computing, and your first impression is that we’re probably going to be spouting science fiction.

So what is quantum computing? It harnesses the physics of subatomic particles to provide a different way to store data and solve problems compared to conventional computers. Specifically, it totally turns the world of conventional binary computing on its side because quantum computing bits, or qubits, can represent multiple states at once, rather than just 0 or 1. The result is that quantum computers could solve certain HPC-like problems more efficiently.

Oh and by the way, did we mention that quantum computers must run at 4 kelvin? That's 4 degrees above absolute zero, far colder than interstellar space.

It's tempting to dismiss quantum computers as the computing equivalent of Warp Speed out of Star Trek. Then again, it was barely a few months ago that we saw SAS founder James Goodnight talking to Alexa to gin up a SAS analytics run in much the same way that Captain James T. Kirk spoke to his computers.

So why are we having this conversation?

Our attention was piqued by a chain of events over the past month. IBM first convened an analyst call around an upcoming article in the scientific journal Nature showing how a quantum computing modeling problem for complex molecular behavior would be documented in a Jupyter notebook. (If you want to get technical, it was about how to derive the lowest energy state of a molecule of beryllium hydride.)

Then Satya Nadella assembled a panel of Microsoft researchers to conclude his Ignite conference keynote with a session on pure theoretical physics that sailed straight over the heads of the business analyst and developer audience. Fortunately, the IBM call was way more plain spoken, addressing how quantum computers could be applied to common business problems, and where the technology stands today.

Turns out, quantum computers represent advances that would look familiar to veterans of big data analytics where you could query all of the data, not just a sample. It would also look familiar to those working with graph computing where you could factor the complexity of many-to-many relationships that would otherwise require endless joins with relational data models.

Quantum computing lends itself to any optimization problem where the combination of what-ifs, and all the permutations associated with them, would simply overwhelm a conventional binary computer. That lends itself to a large trove of mundane business and operational problems that are surprisingly familiar.

For instance, if you try to optimize a supply chain, chances are, you are narrowing down the problem to tackle the dozen most likely scenarios. With the resources of quantum computing, you could widen and deepen the analysis to virtually all possible scenarios. The same goes for tangible business challenges like managing financial risk when you have a complex tangle of interlocking trading systems across the globe. Or imagine, during drug testing, that a clinical research team could model all the potential interactions of a new drug with virtually the entire basket of medications that a specific patient cohort would likely also be taking. And from there, could true personalized medicine be far behind?
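The reason widening the what-if analysis overwhelms classical enumeration is plain combinatorics; a tiny sketch (ours, with made-up depot counts) shows the blow-up:

```python
import math

# Routing a single shipment through n depots has n! possible
# orderings; the search space explodes long before "all scenarios".
for n in (5, 10, 15, 20):
    print(f"{n} depots -> {math.factorial(n):.3e} orderings")
```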

But quantum computing development is still embryonic. A small Canadian startup, D-Wave Systems, is selling units on a limited basis today. IBM is offering machines ranging from a half dozen to 17 qubits in the cloud, while Google is developing architectures that could scale up to 49 qubits. So it's not surprising that quantum still hits the wall with classes of problems that require complex, iterative processing (which, by the way, is what Spark excels at).

A good example of the type of problem that for now is just out of reach is encryption/decryption. As the algorithms grow more complex, it means factoring larger and larger numbers into their prime components. Turns out, the interactions between qubits (called quantum entanglement) could short-cut such problems by taking the square root of the number of entries, and reducing the number of steps accordingly. The bottleneck is memory; such computations would require storing state or interim results, much like a Spark or MapReduce problem. The problem is that, while development of compute chips is underway, nobody yet knows what true quantum memory would look like.
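The square-root shortcut mentioned here is the Grover-style speedup for unstructured search; a back-of-the-envelope comparison (ours, using the standard ~(pi/4)*sqrt(N) iteration count) shows what it buys:

```python
import math

# Unstructured search over N items: ~N classical lookups versus
# roughly (pi/4) * sqrt(N) Grover iterations on a quantum machine.
for bits in (20, 40, 64):
    N = 2**bits
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"N = 2**{bits}: {N:.2e} classical vs {grover:.2e} quantum steps")
```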

That would imply that for some problems, a division of labor where quantum factors the permutations while conventional scale-out systems handle the iterative processing might be an interim (or long-term) step.

There are a surprisingly sizable number of organizations currently pursuing quantum computing. Right now, most of the action is basic government-funded R&D, although some reports estimate VC investment over the past three years amounting to roughly $150 million. On one hand, it would be easy to get overly optimistic on near-term prospects for development given the rate at which technologies as varied as smart mobile devices, Internet of things, big data analytics, and cloud computing have blossomed from practically nothing a decade ago.

But the barriers to adoption of quantum are both physical and intellectual.

There is the physical need to super-cool machines that, in eras past, would have posed huge obstacles. But the cloud will likely do for quantum machines what it is already starting to do for GPUs: provide the economics for scale-out.

That leaves several more formidable hurdles. The physics of scale out still require basic rather than applied research – we still need to figure out how to scale such a large, fragile system. But the toughest challenge is likely to be intellectual, as it will likely require a different way of thinking to conceptualize a quantum computing problem. That suggests that the onramp to quantum will likely prove more gradual compared to the breakout technologies of the last decade.

See the article here:
What will you actually use quantum computing for? | ZDNet

Here’s what quantum computing is and why it matters

Researchers for IBM, Google, Intel, and others are in a fantastic scientific arms race to build a commercially viable quantum computer. They already exist in laboratories, and we're only a few years away from the beginning of what may turn out to be an entire shift in how we think about computing.

A typical computer, like the one inside the phone or laptop you're reading this on, is a binary system, basically a yes/no device. The most amazing thing about computer programmers is how they can take something as basic and simple as a computer chip and spit out something like Microsoft Office by creating a series of "if this, then that" scenarios. This showcases how useful the computer is as a tool for humans to accomplish tasks.

The quantum computer, however, is an entirely different concept; the reason it's quantum is that it doesn't use binary logic. By its nature a quantum computer is a yes/no/both device. When a developer makes a logic choice they aren't limited by "if this, then that"; they can also ask "if this, then that, or both", and that makes all the difference in the world.

There are several instances where a binary computer can't feasibly solve a problem the way we'd like to. When asked to solve a problem where every answer is equally likely, a binary computer has to take the time to individually assess each possibility. Quantum computers can assess more than one probability at a time, through something called quantum entanglement.

When two particles become entangled, a phenomenon occurs whereby anything that happens to one of these particles happens to the other. Einstein called this "spooky action at a distance," and he was spot-on. The lion's share of the research that's been done in quantum computing since the 1980s has been focused on figuring out how to use quantum entanglement to our advantage.

The quantum internet of the future is also being built right now, with Chinese researchers making amazing strides in quantum communications.

A quantum internet would be unhackable as there's no transmission of data. Of course storage vulnerabilities will still exist, but by then our security will be handled by AI anyway. The weird and wonderful phenomenon of entanglement means you can stick data in one side and it pops out the other, like teleportation. There's nothing swirling through the ether; whatever happens to one entangled particle instantly happens to another.

The technology is here already, but there are numerous challenges to overcome on the way to full-scale implementation. First, the quantum computing we're capable of is still a bit behind the binary computing we've mastered. We also need to overcome physical concerns such as the fact that, in the IBM lab for example, the processors need to be kept within hundredths of a degree of absolute zero.

Despite several incredible problems, the outlook is very bright. Recent breakthroughs include the first ever space-based video call secured by quantum encryption.

The video call connected a Chinese scientist in Beijing with an Austrian scientist in Vienna. The distance between the two was over 4,000 miles. The communication was sent to a satellite in space then beamed back down to earth. Scientists have chosen to investigate the quantum network this way due to issues of signal loss through traditional methods of sending photons like fiber-optic cables.

These quantum encrypted communications would be impossible to hack using a binary computer. On the flip side, the successful completion of a commercially viable quantum computer may signal the end of binary-based encryption systems. Theoretically, a quantum computer could crack 128-bit encryption almost instantly, given the same resources for computing power as any binary system, for example.

Perhaps the best way to look at the change that quantum computing represents is to compare it to binary computing in the exact same way you would compare the iPhone X's capabilities with those of a Timex calculator watch from the 1980s.


View post:
Here’s what quantum computing is and why it matters

Microsoft just upped its multi-million bet on quantum computing – ZDNet

The Nordic outpost of Microsoft’s US-based quantum research lab, Station Q, will be headed by professor Charles Marcus, one of four scientists Microsoft hired last year.

Microsoft has tipped several million dollars into a new quantum computing R&D lab at Copenhagen University, Denmark.

Microsoft has signed a multi-year deal with the university to collaborate on the development of a general-purpose quantum computer. Microsoft’s staff will be working with the university’s Niels Bohr Institute.

The institute is headed up by professor Charles Marcus, one of four scientists Microsoft hired last year to accelerate its bet that it can create a scalable quantum computer.

Marcus runs the institute’s Center for Quantum Devices (QDev) and the partnership establishes the university as a Nordic outpost of Microsoft’s US-based quantum research lab, Station Q. QDev will be home to Station Q Copenhagen, alongside Station Q labs at the University of Sydney, Purdue University, and Delft University.

Instead of a conventional transistor's on or off state, represented by 1 and 0, a quantum computer's bits, called qubits, are based on quantum particles and can be both on and off at the same time. That characteristic offers the potential for far more powerful computers.

Microsoft is betting that topological quantum computing holds the key to creating a stable qubit. Topology, or the mathematical study of shapes and space, is gaining more attention among quantum computing researchers.

As noted in Nature, Microsoft's approach aims to encode qubits in quasiparticles called 'non-abelian anyons' that emerge from interactions inside matter. It hopes to use their topological properties, which make qubits more stable, to create its general-purpose quantum computer.

According to Copenhagen University, Microsoft now has over a dozen employees located there and expects the team to grow as they work toward developing a topological quantum computer.

On top of the multi-million dollar investment, Microsoft has also agreed to “significant” quantum research funding at the university.

“The critical pillars for successful and productive quantum research already exist at the University of Copenhagen – an aligned vision between Microsoft and the university, an exceptional team of top quantum researchers, a broad and deep pool of post doctorate and student talent, and a solid baseline of facilities and equipment dedicated to quantum research,” said David Pritchard, chief of staff for the Artificial Intelligence and Research division at Microsoft.

Report: Google takes steps to commercialize quantum computing

Google is reportedly giving researchers access to its quantum computers with the ultimate aim of using quantum computing for cloud services.

Microsoft’s next big bet? Clue: it’s just hired four top quantum computing scientists

Microsoft says it’s doubling down on quantum computing after nabbing four top scientists who will work with a Microsoft hardware veteran to turn research into reality.

Go here to read the rest:
Microsoft just upped its multi-million bet on quantum computing – ZDNet

Microsoft’s Aussie quantum computing lab set to scale up next-gen … – ARNnet

University of Sydney and Microsoft collaborators in front of Station Q’s dilution fridge (University of Sydney)

Microsoft's Station Q quantum computing lab at the University of Sydney is set to embark on a new chapter in its research, moving to scale up its next generation of quantum-engineered devices.

The devices in question will form the heart of what the Microsoft-backed lab claims will be the first practical topological quantum computer.

By now, the idea behind quantum computing is fairly well established. Unlike classical computing, which uses digital bits as binary switches to carry out calculations, quantum computing makes use of the unusual properties of subatomic quantum bits, or qubits, to perform calculations.

A topological quantum computer employs qubits based on subatomic particles called Majorana fermions (a particle that is also its own antiparticle), with information encoded through their topology, or geometry.

The first generation of quantum bits suffers from interference from electromagnetic noise. This means they lack robustness and are proving very difficult to scale up to a fault-tolerant, universal quantum computer.

It has long been theorised that Majorana fermions could help scientists to build more robust quantum computers. Indeed, Station Q researchers suggest that by braiding the Majoranas, quantum information is encoded in a way that is highly resistant to interference.

As it turns out, a new study by Dr Maja Cassidy, who is based at the University of Sydney's Station Q lab, has confirmed one of the prerequisites for building these devices.

Now, researchers at Sydney's Station Q lab are set to build the next generation of devices that will use Majorana fermions as the basis for quantum computers.

In preparation, Station Q will move scientific equipment into the university's Nanoscience Hub clean rooms over the next few months as it increases capacity to develop quantum machines.

Cassidy said that building such quantum devices is a bit like going on a detective hunt.

"When Majorana fermions were first shown to exist in 2012, there were many who said there could be other explanations for the findings," she said.

The challenge to show that the latest findings were caused by Majoranas was put to a research team led by Professor Leo Kouwenhoven, who now leads Microsoft's Station Q lab in the Netherlands.

The new research paper, published on 7 September, meets an essential part of that challenge.

In essence, the research aims to prove that electrons on a one-dimensional semiconducting nanowire will have a quantum spin opposite to their momentum in a finite magnetic field.

"This information is consistent with previous reports observing Majorana fermions in these nanowires," Cassidy said.

Cassidy conducted the research while at the Technical University Delft in the Netherlands, where she held a post-doctorate position.

She has since returned to Australia and is based at the University of Sydney's Station Q partnership with Microsoft.

For University of Sydney professor and Station Q Sydney director David Reilly, the Majorana fermion work being undertaken by Cassidy and the Australian lab is practical science at the cutting edge.

"We have hired Dr Cassidy because her ability to fabricate next-generation quantum devices is second to none," Reilly said.

The new research comes just over a month after Microsoft revealed it had gone all in on its quantum computing research partnership with the University of Sydney, striking a multi-year global agreement with the institution.

The deal sees Microsoft commit to a new, long-term phase of its investment at the university, with the funding expected to result in state-of-the-art equipment, see the recruitment of new staff, help build out the nation's scientific and engineering talent, and focus research project funding into the university.

In April, Microsoft revealed it would double the size of the lab, in a move expected to see at least 20 additional researchers come on board.

Quantum computing has largely been relegated to the realm of research by the likes of Station Q and other such university-affiliated labs.

However, in August, the University of NSW (UNSW) made a move to commercialise its quantum computing technology with the launch of what is being touted as Australia's first quantum computing company.

The $83 million venture, from which the new company, Silicon Quantum Computing Pty Ltd, emerged, has received backing from UNSW itself, which has contributed $25 million, as well as the Commonwealth Bank of Australia and Telstra, which are contributing $14 million and $10 million, respectively.

The creation of the company is intended to help drive the development and commercialisation of a 10-qubit quantum integrated circuit prototype in silicon by 2022, as the forerunner to a silicon-based quantum computer.

The company will work alongside the Australian Research Council (ARC) Centre of Excellence for Quantum Computation and Communication Technology (CQC2T), operating from new laboratories within the Centre's UNSW headquarters.


Tags: Station Q, Sydney, research, Microsoft, quantum computing

View original post here:
Microsoft’s Aussie quantum computing lab set to scale up next-gen … – ARNnet

An Entirely New Type of Quantum Computing Has Just Been Invented – Futurism

A New Type of Qubit

Australian researchers have designed a new type of qubit (the building block of quantum computers) that they say will finally make it possible to manufacture a true, large-scale quantum computer.

Broadly speaking, there are currently a number of ways to make a quantum computer. Some take up less space, but tend to be incredibly complex. Others are simpler, but if you want them to scale up you're going to need to knock down a few walls.

Some tried and true ways to capture a qubit are to use standard atom-taming technology such as ion traps and optical tweezers that can hold onto particles long enough for their quantum states to be analysed.

Others use circuits made of superconducting materials to detect quantum superpositions within the insanely slippery electrical currents.

The advantage of these kinds of systems is their basis in existing techniques and equipment, making them relatively affordable and easy to put together.

The cost is space: the technology might do for a relatively small number of qubits, but when you're looking at hundreds or thousands of them linked into a computer, the scale quickly becomes unfeasible.

Thanks to coding information in both the nucleus and electron of an atom, the new silicon qubit, which is being called a flip-flop qubit, can be controlled by electric signals, instead of magnetic ones. That means it can maintain quantum entanglement across a larger distance than ever before, making it cheaper and easier to build into a scalable computer.

"If they're too close, or too far apart, the entanglement between quantum bits, which is what makes quantum computers so special, doesn't occur," says the researcher who came up with the new qubit, Guilherme Tosi, from the University of New South Wales in Australia.

The flip-flop qubit will sit in the sweet spot between those two extremes, offering true quantum entanglement across a distance of hundreds of nanometres.

In other words, this might be just what we've been waiting for to make silicon-based quantum computers scalable.

To be clear, so far we only have a blueprint of the device; it hasn't been built yet. But according to team leader Andrea Morello, the development is as important for the field as the seminal 1998 paper in Nature by Bruce Kane, which kicked off the silicon quantum computing movement.

"Like Kane's paper, this is a theory, a proposal. The qubit has yet to be built," says Morello. "We have some preliminary experimental data that suggests it's entirely feasible, so we're working to fully demonstrate this. But I think this is as visionary as Kane's original paper."

The flip-flop qubit works by coding information on both the electron AND nucleus of a phosphorus atom implanted inside a silicon chip, and connected with a pattern of electrodes. The whole thing is then chilled to near absolute zero and bathed in a magnetic field.

The qubit's value is then determined by combinations of a binary property called spin: if the spin is up for the electron while down for the nucleus, the qubit represents an overall value of 1. Reversed, and it's a 0.

That leaves the superposition of the spin-states to be used in quantum operations.
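A minimal sketch of that encoding (ours; the labels are illustrative, not taken from the UNSW paper) using tensor products of electron and nuclear spin states:

```python
import numpy as np

# Spin-up and spin-down basis vectors.
up, down = np.array([1, 0]), np.array([0, 1])

# Electron spin (x) nucleus spin, combined with the tensor product.
logical_one = np.kron(up, down)   # electron up, nucleus down -> 1
logical_zero = np.kron(down, up)  # electron down, nucleus up -> 0

# The flip-flop qubit lives in superpositions of these two
# antiparallel spin states, ready for quantum operations.
plus = (logical_zero + logical_one) / np.sqrt(2)
print(plus)
```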

In the flip-flop qubit, researchers are able to control the qubit using an electric field instead of magnetic signals, which gives two advantages: it is easier to integrate with normal electronic circuits, and, most importantly, it also means qubits can communicate over larger distances.

"To operate this qubit, you need to pull the electron a little bit away from the nucleus, using the electrodes at the top. By doing so, you also create an electric dipole," says Tosi.

"This is the crucial point," adds Morello. "These electric dipoles interact with each other over fairly large distances, a good fraction of a micron, or 1,000 nanometres."

"This means we can now place the single-atom qubits much further apart than previously thought possible. So there is plenty of space to intersperse the key classical components such as interconnects, control electrodes and readout devices, while retaining the precise atom-like nature of the quantum bit."

"It's easier to fabricate than atomic-scale devices, but still allows us to place a million qubits on a square millimetre."

What this new flip-flop qubit means is a balance that could make future quantum computers small and potentially affordable.

"It's a brilliant design, and like many such conceptual leaps, it's amazing no-one had thought of it before," says Morello.

The research has been published in Nature Communications.

Continue reading here:
An Entirely New Type of Quantum Computing Has Just Been Invented – Futurism

Quantum computing event explores the implications for business – Cambridge Network

A free, one-day ‘Executive Track’ on the issue – part of an international workshop on quantum-safe cryptography – takes place on Wednesday 13 September at the Westminster Conference Centre, London. It focuses on the implications for businesses and highlights developments underway to address them.

Government cyber-security agencies (UK, US, Canada) and experts from universities and industry (including Amazon, BT, Cisco and Microsoft) will present and discuss the issues and potential solutions to this fundamental technological development that threatens catastrophic damage to Government, industry and commerce alike.

Find out more and book your place at this free event here

The Executive Track on 13 September is designed for business leaders and will outline the state of the quantum threat and its mitigation for a C-level audience including CEOs, CTOs and CISOs.

Attendees will learn how quantum computers are poised to disrupt the current security landscape, how government and industry organisations are approaching this threat, and the emerging solutions to help organisations protect their cyber systems and assets, now and into the future of quantum computing.

___________________________________________________

See the article here:
Quantum computing event explores the implications for business – Cambridge Network