Category Archives: Quantum Computer

Exploring the potential of quantum reservoir computing in forecasting the intensity of tropical cyclones – Moody’s Analytics

What is the problem?

Accurately predicting the intensity of tropical cyclones, defined as the maximum sustained windspeed over a period of time, is a critical yet challenging task. Rapid intensification (RI) events are still a daunting problem for operational intensity forecasting.

Better forecasts and simulations of tropical cyclone (TC) intensities and their tracks can significantly improve the quality of the Moody's RMS tropical cyclone modeling suite. RMS has helped clients manage their risk during TC events in the North Atlantic for almost 20 years. Real-time TCs can significantly impact a company's financial position, operations, and overall solvency. The Moody's RMS HWind product helps (re)insurers, brokers, and capital markets understand the range of potential losses across multiple forecast scenarios, capturing the uncertainty in how track and intensity will evolve.

With the advances in Numerical Weather Prediction (NWP) and new meteorological observations, forecasts of TC movement have progressively improved in global and regional models. However, the model accuracy in forecasting the intensities of TCs remains challenging for operational weather forecasting and consequential assessment of weather impacts such as high winds, storm surges, and heavy rainfall.

Since the current spatial resolution of the NWP model is insufficient for resolving convective scale processes and inner core dynamics of the cyclone, forecast intensities of TCs from operational models are mostly underestimated or low biased. Yet, accurate TC intensity guidance is crucial not only for assessing the impact of the TC, but also for generating realistic projections of storms and their associated hazards. This is essential for effective risk evaluation. Conventional TC intensity forecasting mainly relies on three approaches: statistical, dynamical, and statistical-dynamical methods.

Dynamical models, also known as numerical models, are the most complex and use high performance computing (HPC) to solve the physical equations of motion governing the atmosphere. While statistical models do not explicitly consider the physics of the atmosphere, they are based on historical relationships between storm behavior and storm-specific details such as location and intensity.
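
To make the statistical approach concrete, here is a minimal sketch with hypothetical predictors (sea surface temperature, wind shear, latitude) and entirely synthetic data, invented purely for illustration; operational statistical-dynamical schemes use many more predictors and far more careful model selection:

```python
import numpy as np

# Regress 24-hour intensity change on storm-environment predictors.
# Predictors and data below are synthetic, for illustration only.
rng = np.random.default_rng(0)
n = 200
sst = rng.uniform(26.0, 30.0, n)     # sea surface temperature (deg C)
shear = rng.uniform(2.0, 20.0, n)    # vertical wind shear (m/s)
lat = rng.uniform(10.0, 35.0, n)     # storm latitude (deg N)

# Synthetic "truth": warm water intensifies, shear weakens, plus noise.
dv24 = 4.0 * (sst - 27.0) - 1.5 * (shear - 10.0) - 0.2 * lat + rng.normal(0, 3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), sst, shear, lat])
coef, *_ = np.linalg.lstsq(X, dv24, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - dv24) ** 2))
```

The fitted coefficients recover the historical relationships (positive weight on warm water, negative on shear), which is exactly the kind of structure such models exploit.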

The rise of Machine Learning (ML) and Deep Learning (DL) has led to attempts to create breakthroughs in climate modeling and weather forecasting. Recent advances in computational capabilities and the availability of extensive reanalysis of observational or numerical datasets have reignited interest in developing various ML methods for predicting and understanding the dynamics of complex systems.

One of our key objectives is to build a quantum reservoir computing-based model, capable of processing climate model outputs and storm environment parameters, that provides more accurate forecasts and improves short-term and real-time TC risk analysis.

Official modeling centers use consensus or ensemble-based dynamical models and represent the state of the art in tropical cyclone forecasting. However, these physics-based models may be subject to bias derived from high wind shear, low sea surface temperatures, or the storm's location in the basin. By learning from past forecasting errors, we may be able to identify and correct past model biases, thereby greatly enhancing the quality of future forecasting and risk modeling products. The long-term aim is to integrate ML-based elements into coarse global climate models to improve their resolution and include natural dynamical processes currently absent in these models.

Reservoir Computing (RC) is a novel machine-learning algorithm particularly suited to quantum computers and has shown promising results in early non-linear time series prediction tests. In a classical setting, RC is stable and computationally simple. It works by mapping input time series signals into a higher dimensional computational space through the dynamics of a fixed, non-linear system known as a reservoir. This method is efficient, trainable, and has a low computational cost, making it a valuable tool for large-scale climate modeling.
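
The classical version of this idea can be sketched as an echo state network: a fixed random recurrent "reservoir" expands the input time series into a high-dimensional state, and only a linear readout is trained. Reservoir size, the toy signal, and the hyperparameters below are illustrative choices, not from the article:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                                   # reservoir size
w_in = rng.uniform(-0.5, 0.5, N)          # fixed random input weights
W = rng.normal(0.0, 1.0, (N, N))          # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9 (echo-state property)

t = np.arange(3000)
u = np.sin(0.2 * t) * np.cos(0.031 * t)   # toy signal to forecast one step ahead

states = np.zeros((len(u), N))
x = np.zeros(N)
for k in range(len(u) - 1):
    x = np.tanh(W @ x + w_in * u[k])      # reservoir dynamics: never trained
    states[k + 1] = x

washout, lam = 100, 1e-6                  # discard transient; ridge regularizer
S, y = states[washout:], u[washout:]
W_out = np.linalg.solve(S.T @ S + lam * np.eye(N), S.T @ y)  # train linear readout only
rmse = np.sqrt(np.mean((S @ W_out - y) ** 2))
```

The low training cost comes from the fact that only `W_out` is fit, by a single linear solve, while the nonlinear reservoir itself stays fixed.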

While quantum machine learning has been considered a promising application for near-term quantum computers, current quantum machine learning methods require large quantum resources and suffer from gradient vanishing issues. Quantum Reservoir Computing (QRC) has the potential to combine the efficient machine learning of classical RC with the computing power of complex and high-dimensional quantum dynamics. QRC takes RC a step further by leveraging the unique capabilities of quantum processing units (QPUs) and their exponentially large state space, resulting in rich dynamics that cannot be simulated on a conventional computer. In particular, the flexible atom arrangements and tunability of optical controls within QuEra's neutral atom QPU enable the realization of a rich class of Hamiltonians acting as the reservoir.
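
As a toy illustration of the QRC idea (an assumption-laden sketch, not QuEra's actual hardware or Hamiltonian), one can simulate a small Ising-type spin chain as the reservoir: each input sample modulates a field term, the quantum state evolves unitarily, and single-spin expectation values become the features fed to a trained linear readout:

```python
import numpy as np

n = 4                                  # spins in the toy reservoir
dim = 2 ** n
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, site):
    """Embed a single-qubit operator at `site` in the n-qubit space."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, single if i == site else I2)
    return out

Zs = [op(Z, i) for i in range(n)]
Xs = [op(X, i) for i in range(n)]
H0 = sum(Zs[i] @ Zs[i + 1] for i in range(n - 1)) + sum(Xs)  # fixed couplings + transverse field
Hin = sum(Zs)                                                # input enters as a longitudinal field

rng = np.random.default_rng(2)
u = rng.uniform(-1, 1, 300)            # random input sequence
psi = np.zeros(dim, complex)
psi[0] = 1.0
feats = np.zeros((len(u), n))
for k, uk in enumerate(u):
    evals, evecs = np.linalg.eigh(H0 + uk * Hin)             # diagonalize H(u_k)
    U = evecs @ np.diag(np.exp(-0.5j * evals)) @ evecs.conj().T  # exp(-i H dt), dt = 0.5
    psi = U @ psi                                            # unitary reservoir step
    feats[k] = [float(np.real(psi.conj() @ Zi @ psi)) for Zi in Zs]

# Train only a linear readout on the measured <Z_i> features.
target = np.sin(np.pi * u)
A = np.column_stack([feats, np.ones(len(u))])
w, *_ = np.linalg.lstsq(A, target, rcond=None)
rmse = np.sqrt(np.mean((A @ w - target) ** 2))
```

As with the classical case, the quantum dynamics are never trained; only the readout weights are, which is what makes the approach resource-efficient.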

Recent studies on quantum computing simulators and hardware suggest that certain quantum model architectures used for learning on classical data can achieve results similar to that of classical machine learning models while using significantly fewer parameters. Overall, QRC offers a promising approach to resource-efficient, noise-resilient, and scalable quantum machine learning.

In this project, we are collaborating with QuEra Computing, the leading provider of quantum computers based on neutral atoms, to explore the benefits of using quantum reservoir computing in climate science and to investigate the potential advantages that the quantum layer from QuEra can bring. QuEra's neutral atom QPU and the types of quantum simulations it can perform give rise to different quantum reservoirs. This unique capability can potentially enhance the modeling of tropical cyclone intensity forecasts and data.

This collaboration involves multiple stakeholders and partners, including QuEra Computing Inc., the Moody's RMS technical team, and the Moody's Quantum Taskforce. The work is supported by a DARPA grant award, underscoring its significance and potential impact in tropical cyclone modeling and forecasting.

In summary, combining quantum machine learning methods, reservoir computing, and the quantum capabilities of QuEra's technology offers a promising approach to addressing the challenges in predicting tropical cyclone intensity. This collaboration aims to enhance the quality and efficiency of tropical cyclone modeling, ultimately aiding in better risk assessment and decision making in the face of these natural disasters.

Read more here:
Exploring the potential of quantum reservoir computing in forecasting the intensity of tropical cyclones - Moody's Analytics

3 Quantum Computing Stocks to Buy for the Next Bull Run: March 2024 – InvestorPlace

There are some quantum computing stocks to buy for March that I think could lift off to new heights.

Quantum computing is an emerging and potentially revolutionary technology that could have a profound impact on various industries and fields. The market potential for quantum computing is immense. It is widely regarded as one of the most promising technological advancements of the 21st century.

The great thing about these companies is that many of them are speculative investments and therefore trade at attractive valuations. I think that these companies are primed for the next bull run. As the Nasdaq moves higher, so too should these stocks.

So, here are three quantum computing stocks for investors to consider for March this year.

Source: Amin Van / Shutterstock.com

IonQ (NYSE:IONQ) distinguishes itself as a pure-play quantum computing company, with a strong focus on developing trapped-ion quantum computers.

IonQ projects its full-year 2024 revenue to range between $37 million and $41 million, with bookings expected between $70 million and $90 million. For the first quarter of 2024, revenue is forecast to be between $6.5 million and $7.5 million. Despite these projections, IONQ anticipates an adjusted EBITDA loss of $110.5 million for 2024.

The company's performance in 2023 set a strong foundation for these forecasts, with significant achievements including $65.1 million in bookings for the year, exceeding the upper end of its guidance and representing 166% growth over the previous year. Revenue for 2023 was reported at $22.042 million, a substantial increase from $11.131 million in 2022.

I see the projected loss as potentially being a good thing for IONQ investors, as it could keep the valuation at acceptable levels. Due to its small market cap of $1.9 billion, the stock could rise significantly along with the broader market amid a bull run.

Source: Bartlomiej K. Wroblewski / Shutterstock.com

Rigetti Computing (NASDAQ:RGTI) is known for developing quantum integrated circuits. It also offers a cloud platform that supports quantum algorithm development.

In my opinion, RGTI is one of the more underestimated companies on this list, because it offers more of the picks and shovels to the quantum industry rather than being a pure-play option like IONQ. Investing in RGTI could therefore give one more indirect than direct exposure to the industry, which could make it a strong diversifier.

In terms of outlook and developments, RGTI made significant progress in 2023, including the deployment of the 84-qubit Ankaa-2 quantum computer, which achieved a 98% median 2-qubit fidelity and a 2.5x improvement in error performance compared to its previous quantum processing units (QPUs).

Underscoring why I believe that it could be a strong contender, analysts have given RGTI a Moderate Buy rating, with a consensus price target of $2.75, indicating a potential upside of 71.34% over the next twelve months.

Source: JHVEPhoto / Shutterstock.com

IBM (NYSE:IBM) extends its influence in quantum computing beyond hardware.

I chose IBM for investors who want a well-diversified blue-chip investment rather than the more speculative companies on this list. Although its potential for capital growth may be lower, I feel that its dividend yield of 3.52% at the time of writing makes it a solid and safer choice.

IBM is also expanding its global footprint in quantum computing with the establishment of its first European quantum data center in Germany, set to open in 2024. This facility will enable users in Europe to access IBM's quantum computing systems and services.

Hardware-wise, IBM has introduced advanced processors like the 133-qubit Heron and the 433-qubit Osprey. Meanwhile, on the software front, IBM is evolving its Qiskit platform with updates that promise to increase the ease of quantum software programming.

IBM thus has many irons in the fire to take advantage of the rise of quantum computing, which, along with its stability and dividend yield, makes it one of those stocks that could rise in a bull run. If you are looking for quantum computing stocks to buy, you can't go wrong with these.

On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.

Read more:
3 Quantum Computing Stocks to Buy for the Next Bull Run: March 2024 - InvestorPlace

Chattanooga State Takes A Quantum Leap With Expert Insights Into The Future Of Computing – Chattanooga Pulse

Chattanooga State Community College will take a deep dive into the world of quantum computing alongside physics and computer science expert Dr. Shohini Ghose.

On April 3, Dr. Ghose will join ChattState students, faculty, and members of the Chattanooga community virtually to discuss the emerging field of quantum computing. The event will kick off at 9:30 a.m. with coffee and pastries followed by Dr. Ghose's presentation at 10 a.m. The lecture is titled "Preparing for Quantum 2.0."

The reception and lecture will be held in BRANCH 30 on the ChattState main campus. A live stream of the lecture will also be accessible on YouTube.

Dr. Ghose is a professor of Physics and Computer Science at Wilfrid Laurier University. Her areas of expertise include quantum computing, quantum communication, and classical and quantum chaos.

The event is hosted by ChattState's Honors Program, said organizer Dr. Angie Wood, professor of social and behavioral sciences and director of ChattState Honors.

Dr. Wood said she challenged honors students last year to research the field of quantum computing and find an expert to speak on campus. They ultimately chose to invite Dr. Ghose after viewing her TED Talk titled "A Beginner's Guide to Quantum Computing."

Dr. Wood said offering educational opportunities outside of the classroom is one way ChattState gives students opportunities to further their futures and broaden their horizons.

"College is about more than just going to class. It's also about networking and the contacts that you gain outside of the classroom," she said. "You never know when you will walk into an event like this and realize it's what you're passionate about."

The event is presented alongside the Chattanooga Quantum Collaborative, a local organization that aims to use quantum technology to improve the local workforce, economy, and infrastructure.

The organization is supported by several founding members: EPB, the City of Chattanooga, TVA, UTC, Oak Ridge National Lab, Hamilton County, ChattState, Chattanooga Area Chamber of Commerce, Company Lab, Hamilton County Schools, and Qubitekk.

Originally posted here:
Chattanooga State Takes A Quantum Leap With Expert Insights Into The Future Of Computing - Chattanooga Pulse

NSA fears quantum computing surprise: ‘If this black swan event happens, then we’re really screwed’ – Washington Times

The National Security Agency fears a quantum computing breakthrough by America's adversaries would jeopardize the security of the global economy and allow foes to peer inside top-secret communications systems.

The agency's concern is that an unforeseen advance in quantum technology would crack encryption systems used to protect everything from financial transactions to sensitive communications involving nuclear weapons, according to NSA Director of Research Gil Herrera.

Speaking at an Intelligence and National Security Alliance event last week, Mr. Herrera said no country has a quantum computer that he would consider useful yet.

He said there are a lot of teams around the world building with different technologies and someone could achieve a development representing a black swan event, an extremely unexpected occurrence with profound and dangerous consequences for U.S. national security.

"If this black swan event happens, then we're really screwed," Mr. Herrera said.

Americans could suffer consequences from such a quantum leap in several ways. Mr. Herrera said the world economy, and the U.S. market in particular, are vulnerable because most financial transactions are secured by encryption systems that can't be cracked by non-quantum means.

If quantum tech weakens or eliminates such encryption walls, then financial institutions may have to resort to older transaction methods and banks would look for other means to protect their dealings with other banks, according to Mr. Herrera.

And, he warned, other industries may be even less resilient in the face of the threat. Mr. Herrera said the threat of a quantum computer is not limited to its immediate potential damage; it also extends to the fallout from decrypting information that was recorded while still protected.

Drawing on his decades of experience at Sandia National Laboratories, Mr. Herrera said a quantum advance may be able to help people find information on weapons systems that have been in the U.S. arsenal for a significant period of time.

"There are ways that we can communicate with our various partners in nuclear weapon production where public key encryption is utilized to share keys," Mr. Herrera said. "And now, what if somebody's recorded that information and they crack it?"

Details on foreign adversaries' advanced computing capabilities are closely guarded. Federal policymakers are worried in particular about China's efforts to achieve computing breakthroughs.

Reflecting on supercomputers at a House Armed Services Committee hearing last year, Rep. Morgan Luttrell said he worried Beijing may have already surpassed the U.S. in its supercomputing prowess.

"China should have on board or online another computer that would have trumped us and pushed us back some," the Texas Republican said at the March 2023 hearing. "So the amount of money they're spending in that space as compared to us would make me think that they're ahead of us."

Retired Gen. Paul Nakasone, then in charge of U.S. Cyber Command, cautioned Mr. Luttrell against assuming that outspending America would guarantee an adversarys technological success.

"Spending money doesn't necessarily mean that you're the best in what you do, and being able to integrate that kind of capability is what really matters," Gen. Nakasone said at the hearing. "So being able to take the intelligence, integrate it within maneuver force to have an outcome is where I clearly see the United States has the lead."

But experts agree that quantum computing breakthroughs would dramatically outdo existing supercomputers. The NSA is not waiting to find out.

Mr. Herrera said the NSA believes the algorithms it is deploying will withstand a quantum attack.

"One thing NSA has done about it is we actually started research in quantum-resistant algorithms not too long after we started funding academic programs to come up with what a quantum computer would look like," Mr. Herrera said. "So we have a lot of maturity within the NSA, we have been deploying quantum-resistant encryption in certain key national security applications for a while now."

Efforts to better understand the quantum capabilities of America's adversaries are underway as well. The congressionally chartered U.S.-China Economic and Security Review Commission is scrutinizing the communist country's push to transform its military through the application of quantum and emerging technologies to its weapons systems and logistics.

Last month, the commission conducted a hearing that included an examination of China's quest for teleportation technology.

Continued here:
NSA fears quantum computing surprise: 'If this black swan event happens, then we're really screwed' - Washington Times

Secure quantum communication is one step closer to reality – Earth.com

At the University of Waterloo's Institute for Quantum Computing (IQC), researchers have made a tremendous advancement in the realm of quantum communication by melding two Nobel Prize-winning innovations.

This new development hinges on the efficient production of nearly perfect entangled photon pairs, leveraging quantum dot sources. Entangled photons, a concept awarded the 2022 Nobel Prize in Physics, are light particles that remain interconnected over vast distances.

The integration of this principle with quantum dots, celebrated with the 2023 Nobel Prize in Chemistry, aims to refine the generation of these entangled photons, a cornerstone for applications like secure communications.

Dr. Michael Reimer, a professor at IQC and the Department of Electrical and Computer Engineering at Waterloo, highlighted the significance of their work.

"The combination of a high degree of entanglement and high efficiency is needed for exciting applications such as quantum key distribution or quantum repeaters, which are envisioned to extend the distance of secure quantum communication to a global scale or link remote quantum computers," Reimer explained.

He emphasized the novelty of their achievement in simultaneously meeting the dual criteria of near-perfect entanglement and high efficiency using a quantum dot, a feat not accomplished in prior experiments.

The team's success involved embedding semiconductor quantum dots within a nanowire, creating a photon source that surpasses previous methods in efficiency by 65 times.

Developed in collaboration with the National Research Council of Canada in Ottawa, this innovative source can be stimulated with lasers to generate entangled photon pairs on demand.

To enhance the entanglement's quality, the researchers utilized high-resolution single-photon detectors from Single Quantum in the Netherlands.

Matteo Pennacchietti, a PhD student at IQC and the Department of Electrical and Computer Engineering, discussed overcoming the challenge of fine structure splitting.

This phenomenon, which leads to oscillation in an entangled state over time, previously hindered accurate entanglement measurement with slow detection systems.

"We overcame this by combining our quantum dots with a very fast and precise detection system. We can basically take a timestamp of what the entangled state looks like at each point during the oscillations, and that's where we have the perfect entanglement," Pennacchietti explained.

The team's collaboration extended to Dr. Norbert Lütkenhaus and Dr. Thomas Jennewein, both IQC faculty members and professors in the Department of Physics and Astronomy at Waterloo.

Together, they demonstrated the potential of their quantum dot entanglement source in simulating quantum key distribution, a secure communications method.

This experiment underscored the significant promise the quantum dot source holds for the future of secure quantum communications.

In summary, University of Waterloo scientists have set a new standard in quantum communication by successfully merging two Nobel Prize-winning technologies to produce nearly perfect entangled photon pairs with unprecedented efficiency.

This breakthrough overcomes long-standing challenges in the field while opening new avenues for secure global communication and the interconnection of remote quantum computers.

By pushing the boundaries of quantum dot technology and entanglement, the researchers have laid a robust foundation for the next generation of quantum communication systems, marking a significant leap forward in our quest for ultra-secure, worldwide connectivity.

The full study was published in the journal Communications Physics.


See original here:
Secure quantum communication is one step closer to reality - Earth.com

Securing the Future: The Quest for Quantum-Safe Encryption – yTech

As we venture further into the digital era, quantum computing emerges as a revolutionary technological leap, holding the promise of expediting a vast range of complex computations. However, it also presents a formidable challenge to cybersecurity. The advance of quantum computers threatens to break conventional encryption algorithms, potentially exposing sensitive data to cyber breaches. Cybersecurity thought leaders, including Devo's Chief Information Security Officer Kayla Williams, have openly discussed the susceptibility of current encryption methods, which could be compromised by the sheer power of quantum processing.

Efforts to stave off these looming cyber threats have led to a concerted drive to devise quantum-resistant encryption techniques. Tech behemoths such as IBM and Thales Group are not only propelling quantum research forward but are also pioneering cryptographic defenses to secure our digital infrastructure. The imperative to protect data in the quantum future underscores the need for a united front linking diverse sectors and industries.

Looking ahead, the urgency to cement quantum-safe cryptographic standards is evident. Organizations and governments alike must rally to bolster our cyber defenses in anticipation of the quantum leap. As conversations on this topic burgeon, resources from leading-edge companies and scholarship shine a light on the methodologies to fortify our data against the quantum computing tide, urging a proactive stance in this new era of cybersecurity.

Summary: The dawn of quantum computing stands to redefine technological capabilities, presenting significant cybersecurity risks. The transition to quantum-resistant encryption is an urgent global initiative led by tech leaders and cybersecurity experts. Protecting against the advanced computational power of quantum systems is paramount, sparking international discussions on required security standards and fostering innovation to secure our digital ecosystems.

Sources: [Source 1](https://www.example-domain.com) [Source 2](https://www.example-domain.com)

Marcin Frąckiewicz's expertise in satellite communication and artificial intelligence is a testament to the caliber of analysis and insight required in addressing the complexities of quantum computing and its impact on cybersecurity.

Industry Overview and Market Forecasts

Quantum computing represents a major leap forward over classical computing by leveraging the principles of quantum mechanics to process information. Unlike traditional bits, which represent either a 0 or a 1, quantum bits, or qubits, can exist in superpositions of both states, allowing certain computations to be carried out far more efficiently. This dramatically increases the speed and power of data processing for suitable problems, underscoring its potential application in fields such as pharmaceuticals, finance, materials science, and logistics.
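
The bit-versus-qubit contrast can be made concrete with a small state-vector sketch (a simplified simulation, not tied to any particular hardware): three classical bits hold exactly one of 8 values, while a three-qubit register is described by 8 complex amplitudes, here driven into an equal superposition by Hadamard gates:

```python
import numpy as np

n = 3
zero = np.zeros(2 ** n)
zero[0] = 1.0                               # |000>: one definite value, like classical bits

# Apply a Hadamard gate to every qubit to build the uniform superposition.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)                     # tensor product: H on each of the n qubits
psi = Hn @ zero                             # equal amplitude on all 8 basis states

probs = np.abs(psi) ** 2                    # measurement probabilities, summing to 1
```

A measurement still yields only one of the 8 outcomes; the power comes from how algorithms manipulate all the amplitudes before measuring.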

The quantum computing industry is currently in a nascent stage but is growing rapidly. According to market research, the global quantum computing market size was valued at several billion dollars in the past few years and is expected to grow exponentially to tens of billions by the end of the decade, registering a compound annual growth rate (CAGR) of impressive double-digits.

Issues Related to Quantum Computing and Cybersecurity

The principal issue with the rise of quantum computing is its potential to break currently employed encryption algorithms. Most digital security today relies on encryption methods such as RSA and ECC, which a sufficiently powerful quantum computer could in theory break in practical timeframes, jeopardizing everything from internet communications to banking systems.
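
Why RSA falls if factoring becomes easy can be seen in a toy example with the classic textbook parameters (p = 61, q = 53, deliberately tiny): recovering the two primes from the public modulus immediately yields the private key. Real keys use primes of over a thousand bits, far beyond brute-force factoring but within reach of a large fault-tolerant quantum computer running Shor's algorithm:

```python
# Toy RSA key generation (textbook example, insecure sizes).
p, q = 61, 53
n = p * q                      # public modulus: 3233
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse of e)

m = 65                         # a message, as an integer < n
c = pow(m, e, n)               # encrypt with the public key only

# An attacker who can factor n rebuilds the private key from public data alone.
fp = next(k for k in range(2, n) if n % k == 0)   # trivial here; hard for real key sizes
fq = n // fp
d_attacker = pow(e, -1, (fp - 1) * (fq - 1))
recovered = pow(c, d_attacker, n)                 # decrypts without ever seeing d
```

The entire security of the scheme rests on the factoring step being infeasible, which is precisely the assumption quantum computing threatens.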

In light of these threats, the industry is galvanized to transition to quantum-resistant encryption or post-quantum cryptography (PQC). PQC refers to cryptographic algorithms that are thought to be secure against an attack by a quantum computer. The National Institute of Standards and Technology (NIST) in the United States has been leading an initiative to standardize PQC, which is of critical importance to the future cybersecurity landscape.

One notable effort within the industry is the development of quantum key distribution (QKD), a method for secure communication that uses quantum states of particles to form a communication system in which eavesdropping is detectable. This technology has already seen practical deployment, though it is not widely used due to high costs and infrastructure requirements.
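
The core of the best-known QKD protocol, BB84, can be sketched classically (this simulation models only the basis-matching logic, not real photons or an eavesdropper): Alice encodes random bits in randomly chosen bases, Bob measures in random bases, and they publicly keep only the rounds where the bases matched. An eavesdropper measuring in the wrong basis would disturb the states and show up as errors in a sampled subset of the kept bits:

```python
import random

def bb84(n_rounds=2000, seed=7):
    """Noise-free, eavesdropper-free BB84 sifting sketch."""
    rng = random.Random(seed)
    alice_bits = [rng.randrange(2) for _ in range(n_rounds)]
    alice_bases = [rng.randrange(2) for _ in range(n_rounds)]  # 0 = Z basis, 1 = X basis
    bob_bases = [rng.randrange(2) for _ in range(n_rounds)]

    bob_bits = []
    for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases):
        if ab == bb:
            bob_bits.append(bit)             # matching basis: deterministic outcome
        else:
            bob_bits.append(rng.randrange(2))  # wrong basis: outcome is random

    # Public sifting: keep only rounds where the bases agreed.
    key_a = [a for a, x, y in zip(alice_bits, alice_bases, bob_bases) if x == y]
    key_b = [b for b, x, y in zip(bob_bits, alice_bases, bob_bases) if x == y]
    return key_a, key_b

key_a, key_b = bb84()
```

On average half the rounds survive sifting, which is why practical QKD links need high photon rates to produce usable key material.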

Conclusion

As quantum computing continues to evolve, securing digital infrastructure against quantum threats has become a paramount priority. It requires a collaborative effort involving the private sector, academia, and government agencies to develop new standards and technologies to defend against the potential vulnerabilities introduced by quantum capabilities. Being proactive in this area is more than a security measure; it is a strategic imperative for national interests and global economic stability.

For up-to-date information on quantum computing and cybersecurity, thorough analysis and insights are paramount. Notable companies to follow for further insights include industry leaders like IBM and Thales Group. While numerous other companies and organizations are contributing to this field, these entities have been at the forefront of both quantum computing advancements and cybersecurity protections.

Jerzy Lewandowski, a visionary in the realm of virtual reality and augmented reality technologies, has made significant contributions to the field with his pioneering research and innovative designs. His work primarily focuses on enhancing user experience and interaction within virtual environments, pushing the boundaries of immersive technology. Lewandowski's groundbreaking projects have gained recognition for their ability to merge the digital and physical worlds, offering new possibilities in gaming, education, and professional training. His expertise and forward-thinking approach mark him as a key influencer in shaping the future of virtual and augmented reality applications.

More here:
Securing the Future: The Quest for Quantum-Safe Encryption - yTech

Verifying the Work of Quantum Computers – Caltech

Quantum computers of the future may ultimately outperform their classical counterparts to solve intractable problems in computer science, medicine, business, chemistry, physics, and other fields. But the machines are not there yet: They are riddled with inherent errors, which researchers are actively working to reduce. One way to study these errors is to use classical computers to simulate the quantum systems and verify their accuracy. The only catch is that as quantum machines become increasingly complex, running simulations of them on traditional computers would take years or longer.

Now, Caltech researchers have invented a new method by which classical computers can measure the error rates of quantum machines without having to fully simulate them. The team describes the method in a paper in the journal Nature.

"In a perfect world, we want to reduce these errors. That's the dream of our field," says Adam Shaw, lead author of the study and a graduate student who works in the laboratory of Manuel Endres, professor of physics at Caltech. "But in the meantime, we need to better understand the errors facing our system, so we can work to mitigate them. That motivated us to come up with a new approach for estimating the success of our system."

In the new study, the team performed experiments using a type of simple quantum computer known as a quantum simulator. Quantum simulators are more limited in scope than current rudimentary quantum computers and are tailored for specific tasks. The group's simulator is made up of individually controlled Rydberg atoms (atoms in highly excited states), which they manipulate using lasers.

One key feature of the simulator, and of all quantum computers, is entanglement, a phenomenon in which certain atoms become connected to each other without actually touching. When quantum computers work on a problem, entanglement is naturally built up in the system, invisibly connecting the atoms. Last year, Endres, Shaw, and colleagues revealed that as entanglement grows, those connections spread out in a chaotic or random fashion, meaning that small perturbations lead to big changes in the same way that a butterfly's flapping wings could theoretically affect global weather patterns.

This increasing complexity is believed to be what gives quantum computers the power to solve certain types of problems much faster than classical computers, such as those in cryptography in which large numbers must be quickly factored.

But once the machines reach a certain number of connected atoms, or qubits, they can no longer be simulated using classical computers. "When you get past 30 qubits, things get crazy," Shaw says. "The more qubits and entanglement you have, the more complex the calculations are."

The quantum simulator in the new study has 60 qubits, which Shaw says puts it in a regime that is impossible to simulate exactly. "It becomes a catch-22. We want to study a regime that is hard for classical computers to work in, but still rely on those classical computers to tell if our quantum simulator is correct." To meet the challenge, Shaw and colleagues took a new approach, running classical computer simulations that allow for different amounts of entanglement. Shaw likens this to painting with brushes of different sizes.

"Let's say our quantum computer is painting the Mona Lisa as an analogy," he says. "The quantum computer can paint very efficiently and, in theory, perfectly, but it makes errors that smear out the paint in parts of the painting. It's like the quantum computer has shaky hands. To quantify these errors, we want our classical computer to simulate what the quantum computer has done, but our Mona Lisa would be too complex for it. It's as if the classical computers only have giant brushes or rollers and can't capture the finer details.

"Instead, we have many classical computers paint the same thing with progressively finer and finer brushes, and then we squint our eyes and estimate what it would have looked like if they were perfect. Then we use that to compare against the quantum computer and estimate its errors. With many cross-checks, we were able to show this 'squinting' is mathematically sound and gives the answer quite accurately."
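The "finer and finer brushes" idea can be caricatured in a few lines of code: run the classical simulation at several levels of accuracy (in tensor-network methods, the bond dimension plays the role of brush size), then extrapolate the resulting estimates to the infinite-accuracy limit. The numbers below are made up for illustration and are not data from the study:

```python
# Sketch of extrapolating a simulated quantity (e.g. fidelity) from runs at
# increasing bond dimension chi toward the chi -> infinity limit.
# The chis and estimates below are hypothetical, chosen only to illustrate.

def extrapolate(chis, estimates):
    """Least-squares fit of estimate vs 1/chi; return the intercept,
    i.e. the extrapolated value at 1/chi = 0 (infinitely fine brush)."""
    xs = [1.0 / c for c in chis]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(estimates) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, estimates)) \
        / sum((x - mean_x) ** 2 for x in xs)
    return mean_y - slope * mean_x

chis = [64, 128, 256, 512]                   # progressively finer simulations
estimates = [0.131, 0.110, 0.0995, 0.0943]   # made-up fidelity estimates
print(round(extrapolate(chis, estimates), 3))
```

The real analysis involves many cross-checks, as the article notes; this only shows the shape of the extrapolation step.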

The researchers estimated that their 60-qubit quantum simulator operates with an error rate of 91 percent (or an accuracy rate of 9 percent). That may sound low, but it is, in fact, relatively high for the state of the field. For reference, the 2019 Google experiment, in which the team claimed their quantum computer outperformed classical computers, had an accuracy of 0.3 percent (though it was a different type of system than the one in this study).
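As a rough sanity check on why 9 percent global accuracy can be respectable at 60 qubits: if errors accumulated independently across qubits, the implied per-qubit fidelity would be the 60th root of the global figure. This independence assumption is ours, for illustration only, not the paper's error model:

```python
# Back-of-the-envelope: what per-qubit fidelity would yield a 9% global
# fidelity over 60 qubits, assuming errors compound independently?
global_fidelity = 0.09
n_qubits = 60
per_qubit = global_fidelity ** (1 / n_qubits)
print(f"implied per-qubit fidelity: {per_qubit:.3f}")  # roughly 0.96
```

In other words, each individual qubit would need to be right about 96 percent of the time, which is why a small global figure still reflects high-quality components.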

Shaw says: "We now have a benchmark for analyzing the errors in quantum computing systems. That means that as we make improvements to the hardware, we can measure how well the improvements worked. Plus, with this new benchmark, we can also measure how much entanglement is involved in a quantum simulation, another metric of its success."

The Nature paper titled "Benchmarking highly entangled states on a 60-atom analog quantum simulator" was funded by the National Science Foundation (partially via Caltech's Institute for Quantum Information and Matter, or IQIM), the Defense Advanced Research Projects Agency (DARPA), the Army Research Office, the U.S. Department of Energy's Quantum Systems Accelerator, the Troesh postdoctoral fellowship, the German National Academy of Sciences Leopoldina, and Caltech's Walter Burke Institute for Theoretical Physics. Other Caltech authors include former postdocs Joonhee Choi and Pascal Scholl; Ran Finkelstein, Troesh Postdoctoral Scholar Research Associate in Physics; and Andreas Elben, Sherman Fairchild Postdoctoral Scholar Research Associate in Theoretical Physics. Zhuo Chen, Daniel Mark, and Soonwon Choi (BS '12) of MIT are also authors.

Read more:
Verifying the Work of Quantum Computers - Caltech

NVIDIA Amplifies Quantum Computing Ecosystem with New CUDA-Q Integrations and Partnerships at GTC – HPCwire

March 20, 2024: The latest advances in quantum computing include investigating molecules, deploying giant supercomputers and building the quantum workforce with a new academic program. Researchers in Canada and the U.S. used a large language model to simplify quantum simulations that help scientists explore molecules.

"This new quantum algorithm opens the avenue to a new way of combining quantum algorithms with machine learning," said Alán Aspuru-Guzik, a professor of chemistry and computer science at the University of Toronto, who led the team.

The effort used CUDA-Q, a hybrid programming model for GPUs, CPUs and the QPUs that quantum systems use. The team ran its research on Eos, NVIDIA's H100 GPU supercomputer. Software from the effort will be made available for researchers in fields like healthcare and chemistry. Aspuru-Guzik detailed the work in a talk at GTC.

Quantum Scales for Fraud Detection

At HSBC, one of the world's largest banks, researchers designed a quantum machine learning application that can detect fraud in digital payments. The bank's quantum machine learning algorithm simulated a whopping 165 qubits on NVIDIA GPUs. Research papers typically don't extend beyond 40 of these fundamental calculating units that quantum systems use.

HSBC used machine learning techniques implemented with CUDA-Q and cuTensorNet software on NVIDIA GPUs to overcome challenges simulating quantum circuits at scale. Mekena Metcalf, a quantum computing research scientist at HSBC, will present her work in a session at GTC.

Raising a Quantum Generation

In education, NVIDIA is working with nearly two dozen universities to prepare the next generation of computer scientists for the quantum era. The collaboration will design curricula and teaching materials around CUDA-Q.

"Bridging the divide between traditional computers and quantum systems is essential to the future of computing," said Theresa Mayer, vice president for research at Carnegie Mellon University. "NVIDIA is partnering with institutions of higher education, Carnegie Mellon included, to help students and researchers navigate and excel in this emerging hybrid environment."

To help working developers get hands-on with the latest tools, NVIDIA co-sponsored QHack, a quantum hackathon in February. The winning project, developed by Gopesh Dahale of Qkrishi, a quantum company in Gurgaon, India, used CUDA-Q to develop an algorithm to simulate a material critical in designing better batteries.

A Trio of New Systems

Two new systems being deployed further expand the ecosystem for hybrid quantum-classical computing.

The larger of the two, ABCI-Q at Japan's National Institute of Advanced Industrial Science and Technology, will be one of the largest supercomputers dedicated to research in quantum computing. It will use CUDA-Q on NVIDIA H100 GPUs to advance the nation's efforts in the field.

In Denmark, the Novo Nordisk Foundation will lead on the deployment of an NVIDIA DGX SuperPOD, a significant part of which will be dedicated to research in quantum computing in alignment with the country's national plan to advance the technology.

The new systems join Australia's Pawsey Supercomputing Research Centre, which announced in February it will run CUDA-Q on NVIDIA Grace Hopper Superchips at its National Supercomputing and Quantum Computing Innovation Hub.

Partners Drive CUDA-Q Forward

In other news, Israeli startup Classiq released at GTC a new integration with CUDA-Q. Classiq's quantum circuit synthesis lets high-level functional models automatically generate optimized quantum programs, so researchers can get the most out of today's quantum hardware and expand the scale of their work on future algorithms.

Software and service provider QC Ware is integrating its Promethium quantum chemistry package with the just-announced NVIDIA Quantum Cloud.

ORCA Computing, a quantum systems developer headquartered in London, released results running quantum machine learning on its photonics processor with CUDA-Q. In addition, ORCA was selected to build and supply a quantum computing testbed for the UK's National Quantum Computing Centre, which will include an NVIDIA GPU cluster using CUDA-Q.

NVIDIA and Infleqtion, a quantum technology leader, partnered to bring cutting-edge quantum-enabled solutions to Europe's largest cyber-defense exercise with NVIDIA-enabled Superstaq software.

A cloud-based platform for quantum computing, qBraid, is integrating CUDA-Q into its developer environment. And California-based BlueQubit described in a blog how NVIDIA's quantum technology, used in its research and GPU service, provides the fastest and largest quantum emulations possible on GPUs.

Get the Big Picture at GTC

To learn more, watch a session about how NVIDIA is advancing quantum computing and attend an expert panel on the topic, both at NVIDIA GTC, a global AI conference, running March 18-21 at the San Jose Convention Center.

Source: Elica Kyoseva, Nvidia

Follow this link:
NVIDIA Amplifies Quantum Computing Ecosystem with New CUDA-Q Integrations and Partnerships at GTC - HPCwire

Quantum Computing Breakthrough: Scientists Develop New Photonic Approach That Works at Room Temperature – SciTechDaily

Quantum computing is advancing, with giants like Google and IBM providing services, yet challenges remain due to insufficient qubits and their susceptibility to external influences, requiring complex entanglement for reliable results. Photonic approaches offer room temperature operation and faster speeds, but face loss issues; however, a novel method demonstrated by researchers uses laser pulses to create inherently error-correcting logical qubits, simplifying quantum computing but still needing improvements in error tolerance.

Significant advancements have been made in quantum computing, with major international companies like Google and IBM now providing quantum computing services via the cloud. Nevertheless, quantum computers are not yet capable of addressing the issues that arise when conventional computers hit their performance ceilings. This is primarily because the availability of qubits, or quantum bits, i.e., the basic units of quantum information, is still insufficient.

One of the reasons for this is that bare qubits are not of immediate use for running a quantum algorithm. While the binary bits of conventional computers store information as fixed values of either 0 or 1, qubits can represent 0 and 1 at the same time, making their value probabilistic. This is known as quantum superposition.
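Superposition can be made concrete with the standard state-vector picture: a qubit is described by a pair of complex amplitudes, and the squares of their magnitudes give the probabilities of measuring 0 or 1. A minimal sketch of this textbook model (our own illustration, not code from the research):

```python
import math

def probabilities(a, b):
    """Measurement probabilities for a qubit in the state a|0> + b|1>.
    The amplitudes must satisfy the normalization |a|^2 + |b|^2 = 1."""
    p0, p1 = abs(a) ** 2, abs(b) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1

# Equal superposition: measuring yields 0 or 1 with 50% probability each.
amp = 1 / math.sqrt(2)
p0, p1 = probabilities(amp, amp)
print(p0, p1)
```

The probabilistic nature of the measured value, rather than a fixed 0 or 1, is exactly what the paragraph above describes.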

This makes them very susceptible to external influences, which means that the information they store can readily be lost. In order to ensure that quantum computers supply reliable results, it is necessary to generate a genuine entanglement to join together several physical qubits to form a logical qubit. Should one of these physical qubits fail, the other qubits will retain the information. However, one of the main difficulties preventing the development of functional quantum computers is the large number of physical qubits required.
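The logical-qubit idea, several redundant physical carriers protecting one unit of information, is easiest to see in the classical repetition code. The toy below is our own illustration of that redundancy principle, not the entanglement-based encoding the article describes:

```python
# Toy classical analogue of a logical qubit: store one bit in three
# physical copies; a majority vote recovers it even if one copy fails.

def encode(bit):
    """Encode one logical bit into three physical copies."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote over the three copies."""
    return 1 if sum(copies) >= 2 else 0

word = encode(1)
word[0] ^= 1          # one physical copy "fails" (bit flip)
print(decode(word))   # majority vote still recovers the stored 1
```

Genuine quantum error correction is far subtler, since qubits cannot simply be copied, which is why entanglement across many physical qubits is needed; but the cost in extra physical qubits is the same difficulty the paragraph above highlights.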

Many different concepts are being employed to make quantum computing viable. Large corporations currently rely on superconducting solid-state systems, for example, but these have the disadvantage that they only function at temperatures close to absolute zero. Photonic concepts, on the other hand, work at room temperature.

The creation of a photonic Schrödinger cat state, in other words the quantum superposition of states of the laser pulse amplitude that can be distinguished on a macroscopic scale (white or black cat), can only be achieved using the most advanced quantum optical techniques and has already been demonstrated to be possible. In the experiment that is the subject of the research paper, it proved to be feasible to extend this to three states (white, gray, and black cats). This light state thus approaches a logical quantum state in which errors can, in principle, be universally corrected. Credit: Peter van Loock

Single photons usually serve as physical qubits here. These photons, which are, in a sense, tiny particles of light, inherently operate more rapidly than solid-state qubits but, at the same time, are more easily lost. To avoid qubit losses and other errors, it is necessary to couple several single-photon light pulses together to construct a logical qubit, as in the case of the superconductor-based approach.

Researchers of the University of Tokyo together with colleagues from Johannes Gutenberg University Mainz (JGU) in Germany and Palacký University Olomouc in the Czech Republic have recently demonstrated a new means of constructing a photonic quantum computer. Rather than using a single photon, the team employed a laser-generated light pulse that can consist of several photons.

"Our laser pulse was converted to a quantum optical state that gives us an inherent capacity to correct errors," stated Professor Peter van Loock of Mainz University. "Although the system consists only of a laser pulse and is thus very small, it can in principle eradicate errors immediately."

"Thus, there is no need to generate individual photons as qubits via numerous light pulses and then have them interact as logical qubits. We need just a single light pulse to obtain a robust logical qubit," added van Loock.

To put it in other words, a physical qubit is already equivalent to a logical qubit in this system, a remarkable and unique concept. However, the logical qubit experimentally produced at the University of Tokyo was not yet of a sufficient quality to provide the necessary level of error tolerance. Nonetheless, the researchers have clearly demonstrated that it is possible to transform non-universally correctable qubits into correctable qubits using the most innovative quantum optical methods.

Reference: Logical states for fault-tolerant quantum computation with propagating light by Shunya Konno, Warit Asavanant, Fumiya Hanamura, Hironari Nagayoshi, Kosuke Fukui, Atsushi Sakaguchi, Ryuhoh Ide, Fumihiro China, Masahiro Yabuno, Shigehito Miki, Hirotaka Terai, Kan Takase, Mamoru Endo, Petr Marek, Radim Filip, Peter van Loock and Akira Furusawa, 18 January 2024, Science. DOI: 10.1126/science.adk7560

See the rest here:
Quantum Computing Breakthrough: Scientists Develop New Photonic Approach That Works at Room Temperature - SciTechDaily