
Taiwan’s first home-grown quantum computer is now connected to the internet – TweakTown

A Taiwanese research institute has connected to the internet what is being reported as the nation's first home-grown quantum computer.


The announcement of the successful connection came on January 19 and described Taiwan's first domestically built quantum computer, which will be used as a test bed for researchers both on-site and elsewhere around the world. Collaborators on the project include the University of California, Santa Barbara, the University of Wisconsin-Madison, the Industrial Technology Research Institute, the National Changhua Normal University, the National Central University, and the National Chung Hsing University.

Academia Sinica's website states the quantum computer was completed back in October last year but was only connected to the internet in January this year, marking a milestone in Taiwan's rapid push into quantum computing. The intent at this stage is to demonstrate the capabilities of the new technology by solving basic problems. Once that has been demonstrated, researchers can move on to the next breakthrough application.

"The success of this project at this stage should prove the characteristics of technological research and development," said James Liao, president of Academia Sinica. "Only after a period of patiently solving basic problems can the next application breakthrough be achieved."

"It is hoped that Academia Sinica's small step will drive the development of quantum research and related industries in Taiwan, attract more domestic and foreign talents to participate in the event, and seize opportunities for Taiwan in the field of quantum technology," added Liao

Excerpt from:
Taiwan's first home-grown quantum computer is now connected to the internet - TweakTown


TQI Exclusive: Photonics Illuminating Quantum Technology: Trends, Challenges and Opportunities – The Quantum Insider

Klea Dhimitri

Applications Engineer, Hamamatsu Corporation

What image pops into your head when you hear the words quantum technology?

What about quantum computer?

There is a very good chance you are thinking of the gold chandelier often found in a large dilution refrigerator, which quantum computing players like IBM and Google use to cool and operate superconducting qubits. However, there is one core technology that is often overlooked when we talk about quantum computers. And that's photonics. Dr. Bob Sutor, who spent more than two decades at IBM Research in New York leading IBM's efforts in quantum computing, knows the large cryostats very well. Currently, Dr. Sutor is the vice president and chief quantum advocate at Infleqtion, where he stated at an event:

Photonics is huge. Photonics is at the core of the future quantum infrastructure, and without good growth of the photonics industry driving down cost and size into, for example, photonic integrated circuits, none of this stuff is going to work beyond toy-size machines that are disconnected from each other [1].

This article aims to shed light on how photonics currently enables multiple areas of the emerging quantum fields, quantum computing, quantum communication & networking, and quantum sensing, as well as to touch on the challenges and opportunities that lie ahead.

The quantum technology landscape can be partitioned in a variety of ways, but in this article we will split the field into four pillars: 1) Quantum Computation & Simulation, 2) Quantum Communication & Networking, 3) Quantum Metrology & Sensing, and 4) Fundamental Research. The subsections within each pillar that utilize photonics are highlighted in yellow in Figure 1. The figure illustrates that over two-thirds of the field utilizes photonics, underscoring the major role photonics plays in the quantum technology landscape.

Figure 1: Overview of quantum technology pillars where the use of photonics is highlighted in yellow.

Photonics' role in quantum computation & simulation

Scientists are investigating a variety of qubit modalities to realize a universal fault-tolerant quantum computer, and for several of those modalities photonics is at the core of the toolbox. Photonics offers a wide range of capabilities, such as applying gate operations, confining atoms in an array, and detecting qubit states (either 0 or 1) through low-light fluorescence, or the lack thereof, from trapped ions or neutral atoms. Qubit modalities such as photonic quantum computing take the photon furthest by using one of its properties to construct the qubit itself. The vision for photonic quantum computing is the entire optical table, from light sources to optics to photon detectors, on a chip [2].
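The fluorescence-based state detection described here comes down to a photon-counting threshold decision. Below is a minimal toy sketch assuming made-up Poisson count statistics for a "bright" and a "dark" qubit state; real readout parameters vary by ion or atom species:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean photon counts per readout window: a "bright" qubit
# state scatters many photons, a "dark" state almost none.
MEAN_BRIGHT, MEAN_DARK = 30.0, 1.0
THRESHOLD = 8  # counts above this are called "1"

def read_qubit(true_state, shots):
    """Simulate Poisson-distributed photon counts for one qubit state."""
    mean = MEAN_BRIGHT if true_state == 1 else MEAN_DARK
    return rng.poisson(mean, shots)

counts = np.concatenate([read_qubit(1, 5000), read_qubit(0, 5000)])
labels = np.concatenate([np.ones(5000), np.zeros(5000)])
guesses = (counts > THRESHOLD).astype(float)
print(f"readout fidelity ~ {np.mean(guesses == labels):.2%}")
```

With well-separated bright and dark count rates, a simple threshold already discriminates the two states with near-perfect fidelity, which is why detector performance matters so much here.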

Scaling and fidelity are the main drivers for photonic component development in qubit modalities, such as trapped ions, neutral atoms and photonic qubits.

For trapped ions, the infrastructure currently needed to operate a small chain of tens of ions typically takes up two optical tables [4]. Developing photonic components such as photonic integrated circuits (PICs) is therefore of interest to trapped-ion developers, because it could help make the modality more scalable. Bringing together more ions, with each ion controlled precisely, could help scale the processor without a large, cumbersome infrastructure scaling along with it [7].

Figure 2: Trapped ion setup from the Blatt Lab

Photonics' role in quantum communication & networking

The photon is a tried and tested carrier for sending information over long distances, as seen in classical optical communication networks. The attraction of the photon in quantum communication and quantum networking is that a qubit can travel over long distances [6] and can notify users when an eavesdropper is listening in. Devices such as quantum random number generators (QRNGs), which produce the truly random keys used in quantum communication protocols, could be realized using light sources and detectors as well [5].
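The eavesdropper-detection property has a classic illustration: in a BB84-style protocol, an intercept-resend attacker unavoidably disturbs the sifted key. A toy simulation in plain numpy, a pedagogical sketch rather than a model of any deployed QKD product:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

def measure(bits, prep_bases, meas_bases):
    # If bases match, the bit is recovered; otherwise the outcome is random.
    match = prep_bases == meas_bases
    return np.where(match, bits, rng.integers(0, 2, len(bits)))

for eve_present in (False, True):
    bits, bases = alice_bits, alice_bases
    if eve_present:
        eve_bases = rng.integers(0, 2, n)
        bits = measure(bits, bases, eve_bases)  # Eve measures...
        bases = eve_bases                       # ...and resends in her basis.
    bob_bases = rng.integers(0, 2, n)
    bob_bits = measure(bits, bases, bob_bases)
    keep = alice_bases == bob_bases             # sifting step
    qber = np.mean(alice_bits[keep] != bob_bits[keep])
    print(f"eavesdropper={eve_present}: QBER ~ {qber:.1%}")
```

Without the attacker the sifted-key error rate is essentially zero; with intercept-resend it jumps to the textbook ~25%, which is the signature that tells the legitimate users someone was listening.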

Quantum key distribution and quantum network hardware also rely on photonics: excitation or pump lasers for photon sources that emit photons into optical fibers, for example, and detectors, from single-photon detectors to photodiodes, to detect them on the receiving end of the fiber.

Terrestrial quantum networks are distance-limited due to optical fiber losses and the lack of quantum repeaters. Space and satellite networks are limited by aperture and diffraction losses. The main driver for photonic component development in quantum communication is preserving photons over long distances.

Figure 3: Different types of quantum networks

Photonics' role in quantum metrology & sensing

The field of quantum metrology and sensing is focused on accurately probing the environment, measuring quantities such as electric, magnetic, and gravitational fields, as well as timing and positioning. The information a quantum sensor measures can be communicated via fluorescence: in the case of a nitrogen-vacancy (NV) based magnetometer, different fluorescence intensities correspond to the strength of the magnetic field present. The main drivers for photonic component development in quantum sensing are size, weight, power & cost (SWaP-C) for field-deployable applications.
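To make the fluorescence-to-field mapping concrete, here is a minimal sketch of the standard NV magnetometry arithmetic: an optically detected magnetic resonance (ODMR) sweep shows two fluorescence dips whose splitting is proportional to the field via the NV gyromagnetic ratio of roughly 28 GHz/T. The dip frequencies below are invented example values:

```python
# NV gyromagnetic ratio, ~28 GHz per tesla; the two ODMR fluorescence
# dips split by 2 * GAMMA * B around the 2.87 GHz zero-field resonance.
GAMMA = 28.0e9  # Hz/T

def field_from_odmr(f_minus, f_plus):
    """Estimate the magnetic field from the splitting of the two dips."""
    return (f_plus - f_minus) / (2 * GAMMA)

# Invented example: dips observed at 2.842 GHz and 2.898 GHz.
b_tesla = field_from_odmr(2.842e9, 2.898e9)
print(f"B ~ {b_tesla * 1e6:.0f} microtesla")  # ~1000 uT = 1 mT
```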

Market opportunity for photonics

The market for quantum systems themselves is currently small, but the market for photonic components is sizable and promising. Over half of the bill of materials (BOM) cost for quantum systems goes to lasers, while the rest is split among detectors, modulators, and other components. Currently, the largest market for photonic components is in research, at an estimated $171 million for lasers and $33 million for other photonic components, including detectors and modulators. It is predicted that from 2025 onward, photonic components for quantum products manufactured by OEMs will outgrow photonic components used in research [8].

Challenges of photonics in quantum technologies

One of the photonic challenges for quantum technologies is the construction of single-photon sources that combine all the features desired for applications like quantum key distribution and some forms of photonic quantum computing [9]. Photonic integrated circuits (PICs) are seen as the holy grail for quantum technology. However, PICs still present challenges, such as bringing all optical components, including lasers and detectors, on chip, as well as the cost of a PIC production line. These production lines need high volume to keep costs manageable, and it is unclear whether quantum applications will scale, and if so, when [10].

Ongoing advancements of photonic components will be instrumental in realizing quantum systems and building them for the quantum 2.0 era.

References

[1] YouTube video of Bob Sutor, https://www.youtube.com/watch?v=sRAZ8PJzzLY, 6:28 to 6:56.

[2] Masuda, A. (2019). https://www.news.ucsb.edu/2019/019679/pushing-quantum-photonics

[3] Choi, C.Q. (2021). https://spectrum.ieee.org/race-to-hundreds-of-photonic-qubits-xanadu-scalable-photon

[4] Jurcevic, P., Mandelbaum, R. (2021). Here's How Ion Trap Quantum Computers Work. The Quantum Aviary. https://thequantumaviary.blogspot.com/2021/03/heres-how-ion-trap-quantum-computers.html

[5] Aldama, J., Sarmiento, S., López Grande, I. H., Signorini, S., Trigo Vidarte, L., and Pruneri, V. Integrated QKD and QRNG Photonic Technologies. J. Lightwave Technol. 40, 7498-7517 (2022).

[6] Awschalom, D. D., et al. https://doi.org/10.2172/1900586

[7] Niffenegger, R. J., Stuart, J., Sorace-Agaskar, C., et al. Integrated multi-wavelength control of an ion qubit. Nature 586, 538-542 (2020). https://doi.org/10.1038/s41586-020-2811-x

[8] Tematys Photonics@Quantum: Technologies for Quantum Systems Report (April 2022)

[9] OIDA, OIDA Quantum Photonics Roadmap: Every Photon Counts, Optica Industry Report, 3 (2020)

[10] Davis, S., et al. Piercing the Fog of Quantum-Enabling Laser Technology (QELT): A report based on a QED-C Enabling Technology Workshop. QED-C (2018).

Follow this link:
TQI Exclusive: Photonics Illuminating Quantum Technology: Trends, Challenges and Opportunities - The Quantum Insider


Alice & Bob-led Research Shows Novel Approach to Error Correction Could Reduce Number of Qubits For Useful … – The Quantum Insider

Insider Brief

PRESS RELEASE Alice & Bob, a leading hardware developer in the race to fault-tolerant quantum computers, in collaboration with the research institute Inria, today announced a new quantum error correction architecture, low-density parity-check (LDPC) codes on cat qubits, that reduces the hardware requirements for useful quantum computers.

The theoretical work, available on arXiv, advances previous research on LDPC codes by enabling the implementation of gates as well as the use of short-range connectivity on quantum chips.

The resulting reduction in the overhead required for quantum error correction would allow the operation of 100 high-fidelity logical qubits (with an error rate of 10⁻⁸) with as few as 1,500 physical cat qubits.

"Over 90% of quantum computing value depends on strong error correction, which is currently many years away from meaningful computations," said Jean-François Bobier, Partner and Director at the Boston Consulting Group. "By improving correction by an order of magnitude, Alice & Bob's combined innovations could deliver industry-relevant logical qubits on hardware technology that is mature today."

"This new architecture using LDPC codes and cat qubits could run Shor's algorithm with fewer than 100,000 physical qubits, a 200-fold improvement over competing approaches' 20-million-qubit requirement," said Théau Peronnin, CEO of Alice & Bob. "Our approach makes quantum computers more realistic in terms of time, cost and energy consumption, demonstrating our continued commitment to advancing the path to impactful quantum computing with error-corrected, logical qubits."

Cat qubits alone already enable logical qubit designs that require significantly fewer qubits, thanks to their inherent protection from bit-flip errors. In a previous paper by Alice & Bob and CEA, researchers demonstrated how it would be possible to run Shor's algorithm with 350,000 cat qubits, a 60-fold improvement over the state of the art.
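As a quick sanity check on the figures quoted in this release and the paragraph above, the overhead arithmetic works out as follows; this is just back-of-the-envelope math on the press release's own numbers:

```python
# Physical-to-logical overhead claimed for the LDPC-on-cat architecture.
logical, physical = 100, 1_500
print(f"LDPC-on-cat overhead: {physical // logical} physical qubits per logical qubit")

# Claimed Shor's algorithm requirement vs. the cited 20M-qubit baseline.
shor_ldpc, shor_baseline = 100_000, 20_000_000
print(f"Shor's algorithm: {shor_baseline // shor_ldpc}x fewer qubits than the baseline")
```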

LDPC codes are a class of efficient error correction codes that reduce the hardware required to correct errors occurring in information transfer and storage. By using LDPC codes on a cat-qubit architecture, this latest work not only shows how the qubit footprint of a fault-tolerant quantum computer could be further reduced but also overcomes two key challenges for the implementation of quantum LDPC (qLDPC) codes.

Alice & Bob recently announced the tape-out of a chip that would encode their first logical qubit prototype, known as Helium 1. Once logical qubits with a sufficiently low error rate are implemented, the cat-qubit LDPC code technique would let Alice & Bob harness the computing power of 100 logical qubits with as few as 1,500 physical qubits to run fault-tolerant algorithms.

As leading superconducting quantum computing manufacturers like IBM offer up to 1,121 physical qubits, outperforming classical computers in the simulation of quantum systems (quantum supremacy) is a milestone that would become attainable within current hardware capabilities using Alice & Bob's new architecture.

Read the original here:
Alice & Bob-led Research Shows Novel Approach to Error Correction Could Reduce Number of Qubits For Useful ... - The Quantum Insider


IonQ, Rescale team to put quantum, cloud-based HPC to work on AI – FierceElectronics

IonQ, one of the few companies that can lay claim to having sold a quantum computer, has partnered with Rescale, maker of accelerated computing software for cloud-based high-performance computing, in a match that could have big implications for users tackling large and complex AI projects.

The new partners said in a statement that they aim to merge the raw processing power of accelerated cloud computing with the unique problem-solving potential of quantum computing to tackle problems and applications in the realms of product development, healthcare, life sciences, financial services, materials research, logistics optimization, and national research labs.

IonQ already has sold two of its quantum computers to a European research lab, and Rescale's platform would serve as a springboard for users of IonQ's machines, such as its 29-algorithmic-qubit Forte, to apply them to AI projects in a hybrid quantum computing/HPC environment, the companies said.

IonQ also unveiled a more advanced 35-algorithmic-qubit machine just this week, and as quantum computers continue to advance, they are already making a case for their ability to handle large AI and machine learning tasks.

At the recent Needham Growth Conference, IonQ officials spoke to quantum's advantage, with IonQ CFO Thomas Kramer saying, "AI runs on top of very large machine learning models, and what we've seen when they run on our quantum computers is that they're able to capture predicted outcomes [as well as] more outlying possible events, Black Swans, if you will, and in many fewer iterations."

He added that running some problems on classical computing infrastructure may require running the same query thousands or even millions of times. "We have seen that when you do that on a quantum computer, our quantum computers, you can get to the same predictive output or even better output by going through 1/1000th the number of iterations. We've also seen that in very large models we can get to the same predictive output with 1/1000th the number of input variables, which translates to not only a time savings, but also reduced cost and greater power efficiency."

"Because of that, we will be able to run machine learning problem sets [on quantum computers] that we can't do today [on classical computers] and will not be able to, because the training set is too large to do it in an economic fashion," Kramer said.

Jordan Shapiro, vice president of financial planning and analysis and head of investor relations, also in attendance at the Needham conference, concluded, "So imagine if you go to an OpenAI or any one of the thousands of companies now using these large language models and tell them that you can run and train their model cheaper, in less time, using less power, and with better accuracy and fewer input variables: that is a compelling use case."

The new partnership may open a new channel for Rescale, as it has not previously partnered with any quantum computing companies. To date, it has worked with many cloud providers and semiconductor firms, including Nvidia and AMD, on combining its software with their hardware for AI applications.

"Through seamlessly blending the largest full-stack integrated R&D capabilities and AI-powered computational workflows on Rescale with IonQ's cutting-edge quantum technology, we are embarking on a journey to accelerate engineering innovations and discover new science," said Joris Poort, CEO of Rescale. "This partnership not only accelerates R&D in fields such as engineering product development and life sciences exploration, but it creates a collaborative ecosystem where the boundaries of innovation are productively explored by the world's leading scientists."

"In our partnership with Rescale, we are exploring new ground in the realm of hybrid quantum computing," said Rima Alameddine, Chief Revenue Officer at IonQ. "IonQ's cutting-edge quantum computers, coupled with Rescale's R&D platform, form a dynamic duo poised to revolutionize how we approach healthcare, life sciences, financial services, and address national security challenges. This collaboration is about more than accelerating computer power; it's also about bringing the best of high performance computing, AI, and quantum computing to solve complex intractable problems and unlock unprecedented possibilities."

The companies also said the new partnership goes beyond hardware and software integration, fostering a collaborative environment where scientific expertise, computational power, and quantum know-how converge.

Originally posted here:
IonQ, Rescale team to put quantum, cloud-based HPC to work on AI - FierceElectronics


Quantum computing’s ChatGPT moment could be right around the corner – Cointelegraph

Tech experts across government, academia, and the private sector are methodically working to ensure the world's data is safe from the impending threat of quantum decryption. While this may represent the greatest technological threat this side of AI-wrought extinction, there may be some silver linings along the way.

At some point, possibly in the near future, researchers believe a quantum computing system will emerge that is capable of breaking RSA encryption, the standard that protects banks, military bases, and countless other institutions from hackers and spies.

Related: WEF identifies AI and quantum computing as emerging global threats

Before that happens, however, several other quantum technology solutions will likely need to come into focus. Chief among them may very well be quantum sensing.

Jack Hidary, the CEO of SandboxAQ, a Google sibling company focused on quantum technologies, is certain we'll see scaled, fault-tolerant quantum computers by the end of the decade.

In a talk entitled "Quantum's Black Swan," given at the World Economic Forum, the CEO discussed the threat of quantum decryption as well as some of the potential breakthroughs we could see ahead of it.

Hidary predicts that "certainly by 2029-30, we're going to see scaled, fault-tolerant quantum computers," which could be capable of breaking encryption.

He's not the only one making predictions that would have seemed bold just a few years ago. IBM, currently considered the industry leader by many, says it'll hit an inflection point in quantum computing by 2029. And MIT/Harvard spinout QuEra claims it'll have a 10,000-qubit error-corrected quantum computer by 2026.

Theoretically speaking, any quantum computer capable of quantum advantage, outperforming classical binary computers at useful tasks, could break RSA encryption.

Luckily, as Hidary pointed out, groups around the world, including the U.S. government and IBM, have identified algorithms and policies that should be able to protect our data if they can be implemented in time.

It's likely we'll see a swell of related quantum technologies before the threat of quantum-based encryption breaking is realized. This could manifest in less powerful quantum computing systems capable of pushing beyond the limitations of today's binary supercomputers.

However, a more immediate quantum technology might be quantum sensing. According to Hidary, quantum sensors could fill in the gaps in our GPS system, perhaps even thwarting active attempts at obfuscating satellite signals.

There could be myriad uses for quantum sensors, ranging from medical applications involving deeper, more accurate, real-time body and brain scanning to robotics capable of full, untethered autonomy.

Much as most AI experts and pundits couldn't have predicted the impact ChatGPT would have less than a decade after the seminal Generative Adversarial Networks paper was published, it is difficult to determine just how quantum computing will break through from the lab to the mainstream.

Read the original post:
Quantum computing's ChatGPT moment could be right around the corner - Cointelegraph


Quantum Algorithm Can be Used to Optimize Telecommunication Networks – The Quantum Insider

Insider Brief

PRESS RELEASE The project, promoted by Cinfo and Kipu Quantum on the R infrastructure, applies the computing capacity offered today by quantum computing to the Galician operator's optical fiber backbone network, examining its robustness and resilience to potential outages and/or critical situations.

The newly designed quantum algorithm identifies the most sensitive nodes, the ones that could have the greatest impact on the service in the event of a disconnection or breakdown. This information makes it possible to focus on the detected points with current quantum technology and prepare counteractions in advance, achieving a maximum index of network availability and service excellence.
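The release does not publish the algorithm itself, but the "most sensitive nodes" task has a well-known classical baseline against which a quantum formulation would be compared. A minimal sketch with networkx on an invented stand-in topology, not R's actual network:

```python
import networkx as nx

# Invented stand-in topology: a resilient ring feeding a fragile chain.
g = nx.Graph([
    ("A", "B"), ("B", "C"), ("C", "A"),   # ring: any one link can fail
    ("C", "D"), ("D", "E"), ("E", "F"),   # chain: interior nodes are critical
])

# Articulation points are nodes whose failure disconnects the network.
critical = sorted(nx.articulation_points(g))
# Betweenness centrality ranks nodes by how much traffic transits them.
ranking = nx.betweenness_centrality(g)

print("single points of failure:", critical)
print("most central node:", max(ranking, key=ranking.get))
```

On larger, meshier topologies this ranking problem grows combinatorial quickly, which is the opening the project's annealer- and neutral-atom-based formulation targets.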

Cinfo, which has prepared the network model adapted to the capabilities of accessible quantum computers, has been supported by Kipu Quantum, which was in charge of preparing the model of the quantum algorithm to analyze the R backbone network.

Norberto Ojinaga, Director of Technology Solutions at R and the MASMOVIL Group, believes that "the considered use case is a realistic instance that allows a significant improvement in the quality and guarantee of the service that, as a telecommunications operator, we want to offer our customers."

In addition, Isidro Fernández de la Calle, Director of Business at R and the MASMOVIL Group, explains that "we manifest our strong engagement to provide our customers with a robust and resilient network in case of most adverse circumstances; therefore, we cannot neglect the advances quantum computing offers now to simulate and prepare our environments to be managed with a maximal guarantee."

Analysis in two phases: quantum annealers and neutral atoms

Unlike classical computing, and thanks to the large number of qubits in neutral-atom quantum computers (256 today, with about 1,000 expected in about a year), the piloted quantum algorithm takes the same amount of time regardless of the number of network nodes.

In the first phase of the R-Cinfo-Kipu project, an analysis was carried out for each node with currently available quantum computers. Specifically, an initial classification of the network topology was performed with a quantum annealer on 180 of the 5,627 qubits available in D-Wave's quantum computers, allowing a sensible network segmentation.

In the second phase, devoted to the examination of sublattices, QuEra's quantum computer based on neutral-atom technology was used, with 20 qubits for the solution of the main lattice structure and 46 qubits for the combined structure of sublattices.

This pioneering hybrid solution architecture, employing various quantum computing technologies, has been made possible by the commercial access that providers offer through cloud services. Access to the QuEra platform is enabled by the AWS Braket service, while D-Wave provides its own. Cloud quantum computing capabilities made these combined solutions possible by extracting the best from each of them.

The CEO of Cinfo, Antonio Rodríguez del Corral, points out that "at Cinfo, we have accepted the challenge of creating valuable use cases in quantum computing for industry. To this end, we are developing a team of professionals, graduates in Physics from the University of Santiago de Compostela, and selecting technology partners, such as Kipu Quantum, that will introduce us to the design of quantum algorithms and to the understanding of the different capabilities of existing quantum computers."

For Rodríguez del Corral, technology companies are needed to identify business problems where quantum computing can help, because the customer does not have to know the details of a complex field that evolves at great speed. "We try to find unsolved business problems in large companies and see whether quantum computing can provide better solutions. Once the challenge has been identified, the most appropriate algorithm and quantum hardware have to be chosen, which may change over time as quantum computers develop."

Regarding the analysis of the R network, the CEO of Cinfo explains that "the optimization of network traffic is always a key issue and, in complex networks such as the R network, it may require quantum computing. We proposed it, confirmed the fit, and this project was born."

On another note, Kipu Quantum's Chief Visionary Officer, Enrique Solano, comments that "quantum computers with digital, analog, and digital-analog encoding will move closer to quantum advantage for industrial use cases this year. Projects such as the one developed with R and Cinfo are a step forward towards the practical use of quantum processors with hundreds of qubits. Kipu Quantum is proud to contribute to the leaders and pioneers of the Galician quantum pole in the use of quantum technologies."

The CEO of Kipu Quantum, Daniel Volz, believes that "this project should lay the foundations for a long and fruitful collaboration with Cinfo, with the Galician industry and business community, as well as with the Spanish technological ecosystem in our joint path towards the usefulness of quantum computers in Europe."

Both quantum hardware companies and algorithm vendors have embraced the challenge of achieving quantum advantage in the short term, possibly within a couple of years. The startup Kipu Quantum aims to achieve this as soon as possible with its application- and hardware-specific algorithms, which are adapted to existing hardware. In addition, Kipu Quantum has the highest compression, i.e., reduction, of algorithms in quantum devices with digital, analog, and digital-analog encoding in optimization, logistics, finance, and artificial intelligence, as well as in the design of chemical molecules and materials. In this way, the best solutions can be extracted from quantum processors with qubits encoded in trapped ions, neutral atoms, or superconducting circuits.

Points for improvement on a realistic case

The topology of an advanced fiber network infrastructure such as R's requires a detailed study of the connectivity of its nodes and of the strategic points that permit an evaluation of its strength. For this reason, R's network has become a use case for Cinfo. By running algorithms and models on quantum computers, it is possible to detect points of improvement in the network and draw conclusions on which to act.

As quantum computers become more powerful, a larger number of more complex variables can be incorporated into the algorithmic study. In this way, the path established by this Cinfo project, in collaboration with its expert partner Kipu Quantum, will continue and exploit the more than predictable quantum hardware enhancements. In fact, it is expected that by 2025 these quantum computers may already be able to process the algorithm for a network as complex as the backbone of R and the entire MASMOVIL Group; basically, as the infrastructure grows, it is optimized and perfected.

All this shows that the use of quantum computing for the analytical and complete resolution of common problems in industry is imminent. This will allow current approaches based on brute force and experience, which are not always effective in anticipating all scenarios, to be set aside.

Link:
Quantum Algorithm Can be Used to Optimize Telecommunication Networks - The Quantum Insider


Quantum technology: the black swans are gathering, claims start-up CEO – diginomica

(Image by Holger Detje from Pixabay)

Much has been written about whether there will be a tipping point for quantum technologies, a critical moment at which they suddenly become adopted at scale. The reality is that quantum and classical computing will co-exist long into the future, joining forces in a hybrid environment that plays to both of their strengths.

There is more good news. Quantum computers will probably become sufficiently powerful, fault-tolerant, and reliable to run some enterprise tasks this decade. But industry consensus suggests that use cases that are ideally suited to such devices will emerge more slowly. These might include applications that model the natural world and chaotic processes, crunch huge numbers, reveal hidden correlations in specialized data, or help researchers develop new materials and drugs alongside AI on classical devices.

But in the absence of a tipping point, might there be a so-called black swan moment for quantum instead? A sudden event that has unforeseen, perhaps negative, consequences? The answer is that such a crisis is approaching. We don't know precisely when it will hit, but we do know what it is and what will happen if we fail to prepare for it.

It is the threat to the global economy, to banking, ecommerce, supply chains, government systems, and everyday communications, that will arrive when quantum computing, or an emulation of it, can reliably and swiftly break the public-key encryption that underpins our secure transactions and communications.

In a forceful session entitled "Quantum's Black Swan" at the World Economic Forum in Davos last week, Jack Hidary, CEO of quantum sensor provider SandboxAQ, urged far greater urgency in building quantum-safe systems and post-quantum security than is typical at industry events.

He said:

Let's say we want to build a tunnel under a river. We don't just start from one side and keep going; we start from both sides and meet in the middle. So, by analogy, the hardware folks (IBM is doing an incredible job of advancing the superconducting quantum computing methodology, for example) are digging from one side. But the algo [algorithm] people are digging from the other.

What has happened recently is that, in paper after paper, we've seen that the number of qubits we need to crack RSA is coming down. So, the two sides will meet faster and faster under the river to make this tunnel that breaks the banking system, that breaks the telco system, that breaks the energy system, and breaks government secrets.

So, is there any good news? Yes and no, he said. On the one hand, the US National Institute of Standards and Technology (NIST) and others have come together to create new post-quantum cryptography protocols. These are still being sought, tested, and finalized.

But on the other, he explained:

[The bad news is] it takes seven or eight years for a bank or government to transition to a new protocol. So, what's very important right now is that we understand that this [the development of quantum technologies] does not work in a linear fashion.

On its own page on quantum-resistant cryptography, NIST says:

Historically, it has taken almost two decades to deploy our modern public key cryptography infrastructure. Therefore, regardless of whether we can estimate the exact time of the arrival of the quantum computing era, we must begin now to prepare our information security systems to be able to resist quantum computing.

In Davos, Hidary added:

People got surprised by Gen-AI, and what's going to happen here is the same thing. At some point, people are going to say, 'Wow, what a surprise, what a shock, that our cryptography is broken!'

Certainly by 2029-30, we're going to see scaled, fault-tolerant quantum computers. But you might say, 'I need a certain number of qubits [to break encryption] today.' My prediction is that the number of qubits is going to come down.

Think about the brass prize of being able to decrypt everything in the world! This is a major issue. [...] We have to act now.

So, there may be a global crisis conceivably as early as this decade unless organizations treat this foreseeable event as an urgent, real-world problem, and not as a long-term theoretical one. Less of a serene black swan, in fact, and more of a rampaging bear.

But was Hidary just trying to raise his own profile and using the World Economic Forum to do it? Perhaps, and he would not be the first. But it is equally possible that he has detected troubling signals amidst the industry noise.

The key issue (in every sense) is this: it is not as if the method for cracking RSA encryption, for example, is a secret. It just comes down to maths.

Peter Shor, a Professor of Applied Mathematics at MIT, proposed what became known as Shor's algorithm 30 years ago. This is a method for factoring semi-prime numbers on a quantum computer (theoretical when he proposed it) exponentially faster than on a classical device. (This is due to a qubit's ability to superimpose multiple states, compared with the binary on or off of a classical bit.)

In this way, such an algorithm would, if run successfully, negate the security assumptions that underpin asymmetric cryptography: namely, that the timescales for running the required calculations on a classical computer (billions of years to crack the minimal standard for secure encryption, RSA-2048) make it practically impossible. (The computation required grows exponentially larger with each digit in a sequence.)
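The scale of that asymmetry can be sketched numerically. The classical curve below is the standard heuristic cost of the general number field sieve, the best known classical factoring algorithm; the quantum curve is a generic cubic-in-bits placeholder for Shor's algorithm. Constants are ignored on both sides, so treat the output as shape-only:

```python
import math

def gnfs_ops(bits):
    """Heuristic GNFS cost: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = bits * math.log(2)  # ln N for a `bits`-bit modulus
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_ops(bits):
    """Order-of-magnitude polynomial placeholder: ~bits**3 operations."""
    return bits ** 3

for bits in (512, 1024, 2048):
    print(f"RSA-{bits}: classical ~10^{math.log10(gnfs_ops(bits)):.0f} ops, "
          f"quantum ~10^{math.log10(shor_ops(bits)):.0f} ops")
```

The classical cost climbs sub-exponentially toward roughly 10^35 operations at 2048 bits, while the polynomial quantum cost stays around 10^10, which is the whole point of the tunnel metaphor above.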

By contrast, the only obstacles to using a quantum computer to run Shor's algorithm, or some evolution of it, are the number of qubits (estimates range from one per bit all the way up to 20 million for cracking 2048-bit encryption) and the fact that their subatomic nature makes them noisy and prone to error.

So, a quantum computer simply needs to be both powerful enough and fault tolerant, or self-correcting.

Most researchers believe that such an algorithm can't run at present, and certainly not for keys that have hundreds or thousands of digits. But it is purely a matter of time, though opinions differ as to whether that might be within a decade, a lifetime, or something closer to geological time.

There is a problem, however: what if encryption is much closer to being cracked than most researchers believe?

Unsurprisingly, this is a matter of claim and counterclaim for anyone keen to make a name for themselves, or to spook rival governments. For example, a year ago, a group of Chinese researchers claimed that a 2048-bit RSA key could, theoretically, be broken by running the similarly named Schnorr's algorithm on a quantum device of only a few hundred qubits.

That is troubling, given that the latest quantum hardware is up to the 1,000-qubit mark already, while smaller, more fault-tolerant devices exist too. However, others have claimed that this algorithm works well enough to crack, say, a 48-bit key, but cannot scale to much larger numbers. As a result, the computation would fail.

Meanwhile, in November 2023, veteran researcher and physics PhD Ed Gerck made a truly astonishing claim: that he had broken RSA-2048 encryption in seconds using quantum emulation on a cellphone, via an "all states at once" technique called simultaneous multifactor logic.

In the absence of formal publication of his research, or any peer-reviewed data, the security community remains deeply sceptical. Even so, the problem facing the industry is that even the most sensational or unlikely claims can't simply be dismissed, despite astronomer Carl Sagan's aphorism that extraordinary claims require extraordinary evidence. The stakes are simply too high.

One reason is the possibility, however remote, that a researcher might have made a giant leap forward, or spotted a flaw in orthodox thinking; consider how Einstein's thought experiments a century ago transformed our picture of time, space, and gravity.

Another is the phenomenon known as Store Now, Decrypt Later (SNDL): the awareness that any number of organizations, hackers, or hostile states have been hoarding others' encrypted data for decades, and are just waiting for the breakthrough that enables them to read it.

For this reason, Gerck urged authorities to retire RSA and implement quantum-safe standards as soon as possible. Even if his own claim proves to be bunkum, that sounds like sensible advice.

But the risk of a quantum computer breaking strong encryption is not the only black swan that might arise from the technology, or demand its urgent adoption. According to Hidary, who was on a panel with Ana Paula Assis, EMEA Chair of the IBM Corporation, and Joël Mesot, President of ETH Zurich, another black swan is already with us.

Quantum sensors are essential today, he explained, because of problems with the satellite-based GPS systems that we all use to navigate, plus the inaccuracy of other sensors in the medical profession.

He said:

What if GPS is not available? Over huge swathes of the ocean right now in the Pacific Rim area, particularly near Taiwan, there is no GPS. And over huge swathes of the Middle East, GPS is not only being jammed, but being spoofed. Four planes went into Iranian airspace in the last four months, unintentionally. So, this is a major issue.

But we can use quantum sensors to detect the unique magnetic footprint of every square meter on earth, in the same way that birds and whales navigate.

Boeing and Airbus are among the aerospace companies that have been investing in quantum navigation and timing research, in Boeing's case as far back as 2018.

Quantum sensors are also 24 months away from being approved for use in hospitals to monitor patients' hearts more accurately, claimed Hidary, thus avoiding the problem of traditional sensors missing a defect. However, such devices demand AI running on a classical computer to pull the signal from the noise of the many other sources of electromagnetic radiation.

He explained:

The [magnetic] signal from a heart is very, very faint, so faint that you need a quantum sensor to pick it up. But there is so much other noise, so much other information. If you have an iPhone, if you have a smartwatch, or any of the other magnetic signals in this room, it can confound that sensor. So, we have to pass it through a GPU into an AI model, trained on the data of what a heartbeat looks like.

This convergence of AI and quantum is what's happening now. We need to move into the quantum realm to understand our own bodies. First the heart. And then, of course, the brain.
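The "signal from the noise" step he describes is, at its core, a filtering and pattern-matching problem. A toy illustration with invented numbers, not SandboxAQ's pipeline: a faint periodic "cardiac" tone buried in far stronger broadband noise can be recovered from the spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, seconds = 250.0, 120.0                    # sample rate (Hz), record length
t = np.arange(int(fs * seconds)) / fs

faint_heartbeat = 0.05 * np.sin(2 * np.pi * 1.2 * t)  # ~72 bpm tone
background = rng.normal(0.0, 1.0, t.size)             # 20x stronger noise
x = faint_heartbeat + background

spectrum = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.5) & (freqs < 3.0)          # plausible heart-rate band
peak = freqs[band][np.argmax(spectrum[band])]
print(f"recovered heart-rate peak at {peak:.2f} Hz")
```

A long integration window concentrates the periodic signal into one frequency bin while the noise stays spread out, the same reason the real systems lean on trained models and long averages rather than raw samples.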

While little of what Hidary said is new (these issues have been known about, conceptually, for decades), the force of his argument, and its delivery, was unusual. As a result, the possibility that these challenges might be more urgent than most researchers believe cannot be ignored.

Watch out for those black swans!

Go here to read the rest:
Quantum technology the black swans are gathering, claims start-up CEO - diginomica


The secret tech investor: Prepare for the quantum leap – Citywire

The rapid advance of artificial intelligence (AI) has led to the proposition that we are now in the era of AI. This raises the question of what comes next.

According to Michio Kaku's illuminating book Quantum Supremacy, the next era after AI will be quantum computing (QC). Quantum physics, once considered an abstract concept, now permeates numerous everyday activities, from photosynthesis to MRI scans and the behaviour of electrons on the nanoscale within semiconductors.

Is it too early to invest in this forthcoming wave now? Almost certainly yes, but it is time to start preparing. Here's a briefing note to help you learn more about QC, with one ETF investment opportunity you can consider now.

The key differentiator in QC is the use of qubits instead of bits. Qubits can exist in multiple states simultaneously, unlike classic computer bits, which can only be in one of two exclusive states: 1 or 0. This fundamental distinction means that while a classic computer's power increases linearly with the number of transistors, a quantum computer's power grows exponentially with the number of qubits linked together.

Furthermore, quantum computers can leverage quantum algorithms that make use of quantum superposition or entanglement, reducing the time complexity of the algorithm and fundamentally accelerating problem-solving capabilities. Given that much of the interaction of real-world particles is quantum by nature, it is intuitive that using quantum technologies to simulate and predict their behaviour will offer more authentic results.
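The exponential claim is easy to make concrete: merely writing down the state of n qubits on a classical machine takes 2^n complex amplitudes. A small sketch:

```python
# Memory needed just to store an n-qubit state vector classically,
# assuming 16 bytes per complex amplitude (complex128).
for n in (10, 30, 50, 300):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n:>3} qubits -> 2^{n} amplitudes, ~{gib:.3g} GiB")
```

Thirty qubits already demand 16 GiB; 50 qubits exceed any single machine; 300 qubits require more amplitudes than there are atoms in the observable universe, which is the state space a quantum processor manipulates natively.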

Advanced QC has the potential to revolutionise and solve complex real-life problems that are currently intractable for classic computers, even when using AI. It is worth noting that some of the great advances we are likely to see during this era may be the product of a collaboration between AI and QC.

Many computer scientists have proposed that artificial general intelligence (AGI) will only be reached once AI is working in full collaboration with QC. AGI is when computer programs exhibit the ability to understand, learn and apply knowledge across a wide range of tasks, essentially mirroring generalised human cognitive abilities.

The potential of QC for AI is immense. Quantum machine learning could classify larger datasets in less time, and quantum neural networks could process information in methodologies that classic neural networks cannot. While existing AI tools are powerful and practical for many applications today, QC represents a new frontier with the potential to significantly advance the field.

Some of the other areas where QC could have a significant real-world and equity market impact include:

Cybersecurity: Quantum computers could break widely used encryption methods, which rely on the difficulty of factoring large numbers. Conversely, they could enable the development of disruptive quantum-resistant encryption algorithms to ensure data security in a post-quantum era. There is no reason why the current crop of cybersecurity players could not be at the forefront of quantum encryption and cybersecurity, but it is quite possible that we will see new players evolve, leading to a cast change in the major sector players. Hence, I would suggest there is medium-level QC disruption risk in this sector.

Drug discovery and materials science: Quantum computers can simulate the behaviour of molecules and materials at the quantum level with high precision. The complexity of molecular interactions (which are quantum events) means that classical computing is highly limited in its ability to process or simulate complex molecular problems.

QC will improve the discovery of new drugs and materials, and potentially revolutionise the pharmaceutical and materials science industries. Drug discovery is likely to become less expensive, more predictable and quicker to market through better simulation.

In turn, this should reduce costs throughout the industry and may lead to efficacy cliffs for existing blockbusting drugs. I see the QC disruption risk to the pharmaceutical sector as high.

Materials: We could see a scenario where raw elemental materials are inferior for their respective end tasks compared with QC-discovered synthetic composites, which may be formulated more efficiently from more common elements or compounds.

Supply chain optimisation: The current Red Sea flare-ups and the Covid supply chain issues show how vital logistics and supply chains are to a functioning global economy. QC can optimise supply chains by efficiently solving large-scale logistical and transportation problems, reducing costs and improving overall efficiency. Improved supply chains should reduce working capital levels and hence increase the return on invested capital for companies, which should theoretically drive general equities higher.

Energy production and storage: QC may be successful in finding efficient methodologies for nuclear fusion (which is a quantum phenomenon), resulting in a boundless energy supply, discovering new materials for energy production and storage, and potentially accelerating the development of renewable energy technologies and improving energy efficiency. The oil & gas sector will certainly go the way of the stagecoach, and I would guess the electricity utility companies will suffer demand destruction as customers shift to at-home generation.

Climate modelling: Weather is a quantum phenomenon, which is why classic computing prediction models are of low efficacy. Simulating the behaviour of molecules, particles, and climate systems at a quantum level could improve the accuracy and speed of climate models. This can aid in understanding climate change and developing strategies to mitigate its effects. It will also allow insurance companies to price premiums with more certainty, lowering this cost for many businesses.

As ever with advances in technology, QC will have a transformative effect on the technology & software sectors and, as with AI, I would suspect that the software sector will again bear the brunt of disruption risk. One company that will have to respond and adapt to QC is simulation software specialist Ansys which we discussed in our last article.

It's important to note that while QC holds immense promise, it is still in its early stages of development, and practical, large-scale quantum computers are not yet widely available. Additionally, the full extent of their capabilities and limitations is still being explored by a very limited number of companies.

In the rapidly evolving landscape of quantum computing, several companies are at the forefront of driving the development and application of QC technologies. These companies are making strides in advancing the capabilities of quantum computing, with each taking a unique approach to this transformative technology.

Microsoft is actively working on delivering quantum at scale by engineering a unique, stable qubit and bringing a full-stack, fault-tolerant quantum machine to its Azure cloud platform. CEO Satya Nadella has emphasised the importance of QC in the company's cloud computing strategy, positioning it as the next-generation cloud. Additionally, Microsoft has launched a programming language called Q#, which can be used for simulating quantum algorithms and developing quantum software.

Google Quantum AI achieved a significant milestone in 2019 by demonstrating quantum supremacy, with its 53-qubit Sycamore quantum computer performing a calculation in 200 seconds that would have taken the world's fastest supercomputer 10,000 years. The company's Quantum AI research team has continued to push the boundaries of QC by reducing errors as qubit counts increase, boasting a 3-5x performance enhancement over previous models. Google Quantum AI has also been expediting the prototyping of hybrid quantum-classical machine learning models.

Intel is actively involved in the development of QC chips, such as the 49-qubit Tangle Lake quantum chip. The company's focus on advancing the state of the art in QC reflects its commitment to driving innovation in this transformative technology.

IBM is a leader in QC and has launched advanced quantum computer systems. Last month it introduced the Heron processor, featuring 133 fixed-frequency qubits, which marks a significant leap in QC performance and error reduction. IBM believes that practical, effective QC will be available in 2033 and has set a development roadmap to that date.

Alpine Quantum Technologies (AQT) has taken a distinctive approach to QC by focusing on trapped-ion quantum technology. The company has developed a 20-qubit ion-trap quantum computer using complex laser systems to control the trapped ions. This approach sets AQT apart from companies pursuing other types of QC technologies, such as superconducting qubits or silicon-based approaches.

Rigetti Computing's approach to QC is distinctive for its focus on a multichip strategy for building quantum processors. This approach involves connecting several identical processors into a large-scale quantum processor, which, the company claims, reduces manufacturing complexity and allows for accelerated, predictable scaling.

While it is still early to predict which companies will emerge as the winners in the QC space, investing in a sector or thematic-based ETF or equity basket may be a prudent option in such uncertainty.

The Defiance Quantum ETF (QTUM) is a multi-cap global fund using the BlueStar Quantum Computing and Machine Learning index as a benchmark.

The ETF primarily includes semiconductor and software companies (AI star Nvidia is the fifth-largest holding) involved in the research, development, and commercialisation of QC systems and materials. The fund has assets of about $200m and rose almost 40% in 2023. I would be surprised if the larger US investment banks don't start to launch similar quantum-tracking equity baskets.

In conclusion, the era of QC holds great promise for reshaping our technological landscape and solving complex problems that are currently beyond the capabilities of classic computing. It should have far-reaching implications for the valuation of various sectors of the equity market.

The economic value unlocked by QC combined with its potential to reshape industries underscores the importance of understanding and preparing for the implications of this transformative technology on the equity market. The integration of AI and QC is expected to drive significant advancements, potentially leading to the realisation of artificial general intelligence.

So keep an eye on things QC-related and be prepared to invest should events dictate.

What might trigger such a change? Well, Elon Musk has been remarkably silent on the subject, so maybe we should use any change in that state as a signal.

Be warned, though, that from the bits of classic computing we have suffered Bitcoin, so it will not be long before the perma-perplexed fellow down the pub is boring you about QuBitcoins.

The Secret Tech Investor is an experienced professional who has been running tech assets for more than 20 years. The author may have long or short positions in the stocks mentioned.

See the original post here:
The secret tech investor: Prepare for the quantum leap - Citywire


Preparing for Post-Quantum Cryptography: Trust is the Key – Embedded Computing Design

January 23, 2024

Blog

The era of quantum computing is on its way as governments and private sectors have been taking steps to standardize quantum cryptography. With the advent of the new era, we are faced with new opportunities and challenges. This article will outline the potential impact of quantum computing and discuss strategies for preparing ourselves amid these anticipated changes.

In 1980, Paul Benioff first introduced quantum computing (QC) by describing a quantum mechanical model of computation. In classical computing, data is processed using binary bits, which can be either 0 or 1, whereas quantum computing uses quantum particles called qubits. Qubits can occupy multiple states beyond 0 or 1, making them much faster and more powerful at performing certain calculations than normal bits. To be more specific, with a quantum computer we can finish, in hundreds of seconds, a series of operations that would take a classical computer thousands of years. In fact, IBM launched the first quantum computer with more than 1,000 qubits in 2023.
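A minimal simulation of the qubit idea described above, in plain numpy rather than any quantum SDK: a Hadamard gate puts a qubit into an equal superposition of 0 and 1, and measurement then returns each outcome about half the time.

```python
import numpy as np

rng = np.random.default_rng(42)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ np.array([1.0, 0.0])               # |0> -> (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2                     # Born rule: |amplitude|^2
samples = rng.choice([0, 1], size=10_000, p=probs)
print(f"amplitudes {np.round(state, 3)}, measured 1 on {samples.mean():.1%} of shots")
```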

Nevertheless, the speed boost of quantum computing can have double-edged consequences. Modern cryptographers have been concerned about the potential impact on the security of public-key crypto algorithms. Those regarded as unbreakable are now at risk, as a cryptographically relevant quantum computer (CRQC) can make short work of decryption. For instance, the most popular public-key cryptosystem, Rivest-Shamir-Adleman (RSA), was previously considered very challenging to break because of its complex inverse computation. However, with Shor's algorithm, where quantum speedup is particularly evident, the once-reliable computation time becomes CRQC-vulnerable. As such, the US National Institute of Standards and Technology (NIST) has been promoting the standardization of post-quantum cryptography (PQC). In addition, the National Security Memorandum (NSM-10) was issued in 2022 in response to the threat posed by CRQCs.
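To see why Shor's algorithm threatens RSA specifically, here is a toy, textbook-sized example: RSA's private key falls out immediately once the public modulus is factored, and factoring is precisely what Shor's algorithm accelerates. Real keys use primes hundreds of digits long; these values are purely illustrative.

```python
p, q = 61, 53                      # secret primes (absurdly small here)
n, e = p * q, 17                   # public key: n = 3233, exponent e
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                # private exponent (Python 3.8+)

m = 65                             # a message
c = pow(m, e, n)                   # encrypt with the public key
assert pow(c, d, n) == m           # decrypt with the private key

# Anyone who factors n = 3233 back into 61 * 53 rederives d at once:
d_attacker = pow(e, -1, (61 - 1) * (53 - 1))
assert d_attacker == d
print(f"ciphertext {c} decrypts back to {pow(c, d, n)}")
```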

In fact, when it comes to quantum computing, there are still many issues on which researchers cannot agree. In the current noisy intermediate-scale quantum (NISQ) era, it is still unclear what the ideal architecture of a quantum computer is, when we can expect the first CRQC, and how many qubits such a machine will need. Take, as an example, the minimum number of qubits that would qualify a quantum computer as a CRQC. Google estimated that it may be 20 million qubits. But with a different quantum algorithm, Chinese researchers in 2022 proposed their own integer factoring approach, claiming that only 372 qubits are needed to break a 2048-bit RSA key.

Despite the various open questions in quantum computing, researchers agree on the necessity and urgency of the PQC transition. Based on the guidelines proposed by both public and private sectors, several key points emerge for a smooth PQC transition.

These preparations do not depend on the final PQC standards and can start now. It is important to keep in mind that overall system security remains the top priority in both the classical computing and the PQC era. The scope of the transition will not affect all the classical cryptographic algorithms we are familiar with: the currently NIST-recommended AES-256 cipher and SHA-384 hash algorithms are still acceptable (if not fully satisfying) in the post-quantum world.

The full transition to PQC may span many years, giving us time to examine PQC readiness and stay crypto-agile. According to the National Security Memorandum (NSM-10), the winners of the final round of NIST's PQC Standardization are expected to be announced in 2024, so organizations are advised to start the clock then. Table 1 compares the algorithms that have already been selected for NIST standards with their classical counterparts in terms of public key and ciphertext/signature size (in bytes). More importantly, any system built today should stay flexible enough to account for possible future modifications, understanding that what appears quantum-safe today may not remain so for long.

Table 1: Candidates of NIST's PQC Standardization
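For a sense of the size trade-offs Table 1 captures, here is a small sketch with approximate public-key and ciphertext/signature sizes (in bytes) as reported in the published NIST submissions; treat these figures as illustrative and verify against the final standards:

```python
# scheme: (public key bytes, ciphertext or signature bytes), approximate
sizes = {
    "RSA-2048 signature":       (256,  256),
    "ECDSA P-256 signature":    (64,   64),
    "CRYSTALS-Dilithium2 sig":  (1312, 2420),
    "X25519 key agreement":     (32,   32),
    "CRYSTALS-Kyber-768 KEM":   (1184, 1088),
}
for scheme, (pk, ct) in sizes.items():
    print(f"{scheme:<26} pk={pk:>5} B  ct/sig={ct:>5} B")
```

The lattice-based candidates carry keys and signatures an order of magnitude larger than their classical counterparts, one reason the transition touches protocols and storage formats, not just algorithm choices.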

Security concerns and levels will continue to evolve as quantum computing advances. This makes a more robust safe-storage system, such as NeoPUF, necessary. When all is said and done, security is all about trust. Without a foundation of trust, the classical RSA public-key algorithm or a lattice-based PQC algorithm becomes ineffective. Since important system keys should be highly random and impossible to guess, secure methods for creating trust in a system will become increasingly important in the post-quantum world. An even stronger base of trust, a hardware root of trust (HRoT), must be implemented in the hardware, as a software root of trust alone is no longer considered sufficient. The most robust form of such internal provisioning is PUF-based. Having delivered trust on multiple foundry platforms, eMemory and its subsidiary PUFsecurity are highly credible. Experienced solution providers such as eMemory and PUFsecurity will remain the best choice now and into the post-quantum world.

To learn more about post-quantum cryptography, please read the full article on the PUFsecurity website.

Lawrence Liu is a leading member of the R&D team at PUFsecurity, bringing over 25 years of experience working with NAND Flash, NOR Flash, and DRAM. Before specializing in memory design, he graduated from the mid-peninsula university affectionately known as The Farm with BSEE/MSEE degrees, specializing in computer architecture.


View original post here:
Preparing for Post-Quantum Cryptography: Trust is the Key - Embedded Computing Design


Quantum Reinforcement Learning's Impact on AI Evolution | by The Tech Robot | Jan, 2024 – Medium

Quantum Reinforcement Learning's Impact on AI Evolution

The quickly developing field of quantum computing holds great promise for enhancing machine learning beyond what conventional computers offer. Quantum computers are more effective than conventional computers because they can manage complex connections between inputs, and they promise ten times greater data processing and storage capacity than modern supercomputers.

Quantum computing is an area of study that integrates computer science, physics, and mathematics to tackle complicated problems more quickly than ordinary computers can. To solve specific problems, it employs quantum mechanical processes such as coherence and quantum interference. Future uses include machine learning, optimization, modeling physical systems, portfolio optimization, and chemical simulation.

Qubits store data using what is known as the superposition principle of quantum computing. This enables qubits to be in many states at the same time. Quantum machine learning (QML) augments regular machine learning software with quantum devices. Quantum computers offer substantially more storage and processing power than ordinary computers, enabling them to analyze massive volumes of data that older technologies would take far longer to handle. With this extraordinary processing power, QML can speed up and improve the development of machine learning models, neural networks, and other kinds of quantum artificial intelligence (AI).

Blending quantum computing with machine learning yields four major combinations, based on whether the data is quantum (Q) or classical (C) and whether the computation runs on a quantum or classical computer.

1. CC: Classical datasets analyzed on classical computers. This is classical machine learning (ML), which has no direct quantum basis but can draw principles from quantum machine learning theory.

2. QC: Quantum datasets analyzed on classical computers. Classical machine learning learns from quantum states; this approach addresses problems such as classifying quantum states produced by physical experiments.

3. CQ: Classical datasets processed on quantum computers. In short, quantum computers are employed to find faster solutions to problems previously solved with ML; traditional tasks, such as image classification, are fed into quantum machines to discover the best algorithm parameters.

4. QQ: Quantum datasets processed on quantum computers, the purest combination: the outcome of a quantum simulation is fed directly into a quantum machine learning system.
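The four quadrants can be restated compactly as a lookup structure; this is purely organizational, using only the examples given above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quadrant:
    data: str      # "classical" or "quantum"
    device: str    # "classical" or "quantum"
    example: str

QUADRANTS = {
    "CC": Quadrant("classical", "classical", "ordinary ML, quantum-inspired methods"),
    "QC": Quadrant("quantum", "classical", "classifying states from physics experiments"),
    "CQ": Quadrant("classical", "quantum", "image classification on quantum hardware"),
    "QQ": Quadrant("quantum", "quantum", "learning from quantum simulation output"),
}
for name, quad in QUADRANTS.items():
    print(f"{name}: {quad.data} data on {quad.device} hardware, e.g. {quad.example}")
```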

Quantum approaches promise several benefits for machine learning:

1. Positive and negative interference are used in quantum neural network training.

2. Multi-state exploration and convergence are accelerated by quantum reinforcement learning.

3. Run-time optimization: providing speedier outcomes.

4. Enhancements to learning capacity: increasing the capacity of associative (content-addressable) memory.

5. Advances in learning efficiency: Depending on the degree of training knowledge required, the same data may be used to learn more complex relations or simpler models.

Significant challenges remain, however:

1. Limited quantum hardware: In the current Noisy Intermediate-Scale Quantum (NISQ) era, only limited numbers of usable qubits are available for modeling. Millions of qubits are expected to be required for practical usefulness.

2. Creating quantum-ready data: It is difficult to encode standard data in quantum state representations, and the bulk of today's data lacks underlying quantum structure.

3. Algorithm design: To reap the benefits of QML, new quantum-optimized machine learning frameworks and approaches, such as deep learning, are required.

4. Software infrastructure: Because quantum development frameworks are presently in their infancy, integrating them with regular machine learning tools and workflows is difficult.

5. Limited training datasets: There is insufficient labeled quantum data available. Although synthetic dataset generation helps, it has limitations.

6. Inadequate skills: Only a handful of researchers currently work on QML, at the intersection of quantum research and AI.

Quantum machine learning (QML) is a young field at the junction of AI and quantum computing, with the potential for spectacular outcomes thanks to developments in quantum hardware, algorithms, and collaboration between academics and engineers. Take classes, join communities, or experiment with cloud-based quantum tools to participate in this exciting future.


The rest is here:
Quantum Reinforcement Learning's Impact on AI Evolution | by The Tech Robot | Jan, 2024 - Medium
