Category Archives: Quantum Computing
planqc Secures €50M Series A Funding to Advance Atom-Based Quantum Computing – HPCwire
July 8, 2024 - planqc, a European leader in digital atom-based quantum computing, today announced securing €50 million (~US$54.17 million) in Series A funding. This significant investment round is led by the European Family Office CATRON Holding and the DeepTech & Climate Fonds (DTCF). Additional support comes from Bayern Kapital, the Max-Planck Foundation, and private investors, including UVC and Speedinvest. The round also includes a non-dilutive grant from Germany's Federal Ministry of Education and Research (BMBF).
Alexander Glätzle, CEO and co-founder of planqc, stated, "This latest investment round is a strong endorsement of our technology as a leading platform for quantum applications. The substantial backing places us in a perfect position to take on global competitors with our 'Made in Germany' quantum computers, targeting an emerging market valued at billions of euros."
planqc's unique technology, developed from award-winning research at the Max-Planck-Institute for Quantum Optics (MPQ), aims to rapidly advance industry-relevant quantum computers. The new financing will be used to establish a quantum computing cloud service and develop quantum software for applications in industries such as chemistry, healthcare, climate-tech, automotive, and finance. Currently, planqc is utilizing quantum machine learning to work on climate simulations and more efficient batteries for electric vehicles.
Dr. Sebastian Blatt, CTO of planqc, explained the core of planqc's technology: "Unlike most other companies, including Big Tech, we use individual atoms confined in crystals of light as qubits. This approach is the fast track to scaling the number of qubits and improving their quality, the prerequisites for being the first to deliver fault-tolerant quantum computers."
Founded in 2022 by scientists from MPQ and Ludwig-Maximilians-University Munich (LMU), planqc is located in the Munich Quantum Valley (MQV). The company has been commissioned by the German government to deploy a 1,000-qubit quantum computer at the Leibniz Supercomputing Centre. Additionally, planqc has secured a European tender to develop a quantum computer for the German Aerospace Center (DLR).
Dr. Torsten Löffler, Investment Director at DTCF, commented on the occasion, "We are thrilled to invest in a startup that not only leads in high-impact technology but also enables further breakthroughs in the most pressing global computational challenges across various industries by offering access to the technology in the form of a quantum cloud computing service. planqc's impressive track record in securing contracts (in particular the DLR tender) and public grants within just 18 months of operations underscores the company's role as a frontrunner in the quantum computing sector, both in Europe and globally."
Prof. Immanuel Bloch, director at MPQ, added, "At MPQ, we have a strong tradition of supporting spin-offs from our institute and translating fundamental science to industry. planqc is the latest example and is based on our expertise in trapping, cooling and manipulating cold atoms and molecules. In the future, we look forward to extending this collaboration."
The MPQ, in collaboration with planqc, has already showcased scaling the number of neutral atoms used as qubits to 1,200. Reaching this milestone paves the way for fault-tolerant quantum computers. Further scaling to 10,000 or even 100,000 qubits is expected in the coming years. These systems will be capable of tackling previously unsolvable problems.
Quantum computers are poised to revolutionize the discovery of new materials and pharmaceuticals, address optimization challenges in climate research, industry, and transportation planning, and usher in a new era of cryptography. Quantum machine learning will unlock unprecedented applications for artificial intelligence, providing the scientific community with a new understanding of the world.
Dr. Anna Christmann, Coordinator of the Federal Government for German Aerospace and Commissioner for the Digital Economy and Start-ups at the Federal Ministry for Economic Affairs and Climate Action: "The success story of planqc demonstrates that innovative research today can become the forward-looking companies of tomorrow, strengthening our long-term competitiveness. We are proud that our ongoing commitment to innovation-friendly frameworks and easier access to venture capital is bearing fruit, and we continue to work every day to improve the start-up ecosystem in Germany and Europe."
"Neutral atoms are currently on the fast track to achieving fault-tolerant quantum computing," adds Hermann Hauser, representing the APEX Amadeus Technology fund, one of planqc's seed investors. "I am deeply impressed by planqc's rapid progress in this area. Securing over 50 million euros in contracts within 18 months and achieving Europe's first 1,200-atom qubit array are remarkable milestones for planqc and Europe's tech sovereignty. I eagerly anticipate our continued collaboration in this pioneering field."
"Quantum computers are one of the technologies that can offer unforeseeable added value by facilitating or even enabling the discovery, research and development of other future technologies. Examples include new medicines, sustainable battery technology and climate simulations. planqc's technologically promising approach coupled with the existing technological maturity convinced us," says Monika Steger, Managing Director of Bayern Kapital.
Benjamin Erhart, General Partner at UVC Partners, concludes, "The rapid progress of planqc since our initial investment is rooted both in planqc's scientific excellence as well as its capability to attract world-class talent. In line with our strategy of building sustainable category leaders, it was a clear decision to double down on planqc significantly."
Source: planqc
Riverlane Releases Three-year Quantum Error Correction Roadmap – The Quantum Insider
Insider Brief
PRESS RELEASE: Riverlane, the global leader in quantum error correction technology, has released its groundbreaking three-year roadmap for its Quantum Error Correction Stack, Deltaflow, starting in 2024 and culminating in a system where quantum computers can run one million (Mega) error-free Quantum Operations (QuOps) by 2026.
Today, the world's best quantum computers can perform at most a few hundred quantum operations before failure. This must scale to a million and, ultimately, trillions of operations to unlock the transformative real-world applications that can open a new age of human progress.
Every reliable quantum computer will need quantum error correction (QEC) to reach this point. Deltaflow provides commercial-grade quantum computers with a comprehensive, modular QEC solution, scaling as quantum computers scale and across every qubit type.
As an important near-term milestone, crossing into the MegaQuOp regime will present a pivotal moment when the power of quantum computers will surpass the reach of any classical supercomputer.
Maria Maragkou, VP of product and partnerships at Riverlane, said: "Our three-year roadmap promises a series of major milestones on the journey to fault tolerance, culminating in enabling one million error-free quantum operations by the end of 2026. The MegaQuOp is a landmark goal as it puts us outside the regime any supercomputer can simulate."
The Riverlane roadmap introduces a standard industry measure of error-free quantum operations, or QuOps. The maximum QuOp capacity is a useful measure to assess the power of a quantum computer. It will play a similar role to floating-point operations per second, or FLOPS, commonly used to rank supercomputers.
Earl Campbell, VP of quantum science at Riverlane, said: "The MegaQuOp is a critical milestone, but it's just the first step on the journey to full fault-tolerant quantum computing. We'll need to realise a trillion error-free operations to begin fully unlocking the higher value applications of quantum computing, but this is a critical set of milestones in that journey.
"It's hard to predict, with certainty, what applications we'll unlock at the MegaQuOp until such a system is in the hands of innovators. But what's undeniable is that the industry can't push toward a billion and a trillion error-free operations without first reaching the MegaQuOp threshold."
A series of fast-paced and impressive demonstrations of quantum error correction from Quantinuum, ETH Zürich, Google, Harvard University, Yale University, IBM, Microsoft, Alice & Bob, and Riverlane (to name a few) have pushed the quantum error correction field further forward than anyone anticipated. But none of these demonstrations have integrated the fast, scalable real-time decoding processes needed for error-corrected quantum computation, which is exactly the problem Deltaflow will solve.
Maragkou added: "Reaching the MegaQuOp relies on the entire quantum community pulling together to reach this goal. We are now partnering with the world's leading quantum hardware companies to make this happen sooner than many anticipate."
Riverlane's quantum error correction roadmap is also aligned with the plans of government bodies around the world that have pivoted their focus to fault-tolerant systems, requiring error correction. The UK government, for instance, has committed to build an accessible, UK-based quantum computer capable of running one trillion operations by 2035.
Full details of Riverlane's roadmap are available here.
EDF, Alice & Bob, Quandela, and CNRS team up to enhance quantum computing efficiency – Research & Development World
French electric utility outfit EDF is partnering with quantum computing firms Quandela and the quizzically named Alice & Bob, alongside the French National Centre for Scientific Research (CNRS), to optimize energy consumption in quantum computing.
With the support of grant money from the public investment bank Bpifrance, the €6.1M initiative will also compare the energy requirements of quantum and classical high-performance computing systems.
The Energetic Optimisation of Quantum Circuits (OECQ) project will involve two phases. The first will compare the energy requirements of high-performance computing (HPC) systems with those of quantum computers.
The second will set its sights on curbing quantum computers' energy consumption. Although quantum computers today consume significantly less energy than traditional supercomputers, they still need significant power, ranging from about 25 kW to 600 kWh daily, according to one estimate. Optimizing the energy use of these systems involves not only the quantum processing unit (QPU) itself but also ensuring that the auxiliary technologies powering it are efficient.
The OECQ project arrives at an important time. As interest in AI builds, some Big Tech firms are eyeing nuclear energy to support burgeoning demand. Increasing demand for AI services is driving a substantial increase in power requirements for data centers, potentially surpassing the annual energy consumption of many small nations. By 2030, data centers could consume up to 9% of electricity in the U.S., more than double what is being used now, as Quartz noted.
The eventual commercialization of quantum computing could reshape the computational landscape, but it presents a double-edged sword for energy consumption. On one hand, quantum computers promise to solve some problems exponentially faster than classical computers (like factoring large numbers), potentially curbing overall energy use for some applications. On the other, they could enable entirely new classes of computationally intensive tasks, potentially driving overall energy demand even higher.
OECQ aims to optimize quantum computers' energy use through a partnership-based approach.
The project's second phase will focus on concrete optimization strategies. This stage will include not only improving the efficiency of the quantum processing unit (QPU) itself but also minimizing the energy consumed by the cooling and control systems that support it.
Quantum Computing Industry to Witness Massive Growth – openPR
DataM Intelligence has published a new research report, "Quantum Computing Market Size 2024". The report explores comprehensive and insightful information about various key factors such as regional growth, segmentation, CAGR, the business revenue status of top key players, and market drivers. The purpose of this report is to provide a telescopic view of the current market size by value and volume, opportunities, and development status.
Get a Free Sample Research PDF - https://datamintelligence.com/download-sample/quantum-computing-market
The Quantum Computing market report majorly focuses on market trends, historical growth rates, technologies, and the changing investment structure. Additionally, the report shows the latest market insights, increasing growth opportunities, business strategies, and growth plans adopted by major players. Moreover, it contains an analysis of current market dynamics, future developments, and Porter's Five Forces Analysis.
Quantum computing is a field of computing that utilizes principles of quantum mechanics, such as superposition and entanglement, to process information. Unlike classical computers that use bits as binary units (0s and 1s), quantum computers use quantum bits or qubits. Qubits can exist in a superposition of states, allowing quantum computers to perform calculations on multiple possible states simultaneously. This parallelism enables quantum computers to potentially solve complex problems much faster than classical computers, especially in fields such as cryptography, optimization, and material science. Quantum computing is still in its early stages of development but holds promise for revolutionizing various industries by tackling computationally intensive tasks that are currently infeasible for classical computers.
Forecast Growth Projected:
The Global Quantum Computing Market is anticipated to rise at a considerable rate during the forecast period, between 2024 and 2031. The market grew at a steady rate in 2023, and with the rising adoption of strategies by key players, it is expected to rise over the projected horizon.
List of the Key Players in the Quantum Computing Market:
Telstra Corporation Limited, IonQ Inc., Silicon Quantum Computing, Huawei Technologies Co. Ltd., Alphabet Inc., Rigetti & Co Inc., Microsoft Corporation, D-Wave Systems Inc., Zapata Computing Inc
Segment Covered in the Quantum Computing Market:
By Offering: Hardware, Software, Service
By Deployment Type: On-premises, Cloud-based
By Technology: Quantum Dots, Trapped Ions, Quantum Annealing
By Application: Optimization, Simulation and Data Problems, Sampling, Machine Learning, Others
By End-User: Banking, Financial Services and Insurance, Aerospace & Defense, Manufacturing, Healthcare, IT & Telecom, Energy & Utilities, Others
Get Customization in the report as per your requirements: https://datamintelligence.com/customize/quantum-computing-market
Regional Analysis:
The global Quantum Computing Market report focuses on six major regions: North America, Latin America, Europe, Asia Pacific, the Middle East, and Africa. The report offers detailed insight into new product launches, new technology evolutions, innovative services, and ongoing R&D. The report discusses a qualitative and quantitative market analysis, including PEST analysis, SWOT analysis, and Porter's five force analysis. The Quantum Computing Market report also provides fundamental details such as raw material sources, distribution networks, methodologies, production capacities, industry supply chain, and product specifications.
Chapter Outline:
Chapter 1: Introduces the scope of the report and gives an executive summary of the different market segments (by region, product type, application, etc.), including the market size of each segment, future development potential, and so on. It offers a high-level view of the current state of the market and its likely evolution in the short to mid-term and long term.
Chapter 2: Key insights, key emerging trends, etc.
Chapter 3: Manufacturers' competitive analysis, with a detailed analysis of the Quantum Computing manufacturers' competitive landscape, revenue market share, latest development plans, merger and acquisition information, etc.
Chapter 4: Provides profiles of key players, introducing the basic situation of the main companies in the market in detail, including product revenue, gross margin, product introduction, recent development, etc.
Chapters 5 & 6: Revenue of Quantum Computing at the regional and country level. Provides a quantitative analysis of the market size and development potential of each region and its main countries, and introduces the market development, future development prospects, market space, and market size of each country in the world.
Chapter 7: Provides the analysis of various market segments by Type, covering the market size and development potential of each market segment, to help readers find the blue ocean market in different market segments.
Chapter 8: Provides the analysis of various market segments by Application, covering the market size and development potential of each market segment, to help readers find the blue ocean market in different downstream markets.
Chapter 9: Analysis of industrial chain, including the upstream and downstream of the industry.
Chapter 10: The main points and conclusions of the report.
Get a Free Sample PDF copy of the report @ https://datamintelligence.com/download-sample/quantum-computing-market
FAQs
How big is the Quantum Computing Market?
The Quantum Computing Market size was USD 650.1 million in 2023. It is expected to reach USD 8,788.8 million by 2031.
How fast is the Quantum Computing Market growing?
The Quantum Computing Market will exhibit a CAGR of 38.9% during the forecast period, 2024-2031.
Read Latest Blog: https://www.datamintelligence.com/blogs/top-10-process-automation-companies-worldwide
Company Name: DataM Intelligence Contact Person: Sai Kiran Email: Sai.k@datamintelligence.com Phone: +1 877 441 4866 Website: https://www.datamintelligence.com
DataM Intelligence is a Market Research and Consulting firm that provides end-to-end business solutions to organizations, from research to consulting. We at DataM Intelligence leverage our top trademark trends, insights and developments to deliver swift and astute solutions to clients like you. We encompass a multitude of syndicate reports and customized reports with a robust methodology. Our research database features countless statistics and in-depth analyses across a wide range of 6,300+ reports in 40+ domains, creating business solutions for 200+ companies across 50+ countries and catering to the key business research needs that influence the growth trajectory of our vast clientele.
This release was published on openPR.
CMA CGM Group Invests in Pasqal, Companies to Explore Quantum Computing For Optimizing Maritime And Logistics Operations – The Quantum Insider
Insider Brief
PRESS RELEASE: The CMA CGM Group, a global player in sea, land, air and logistics solutions, announces a strategic partnership with Pasqal, a world leader in neutral atom quantum computing.
As part of this collaboration, which aims to introduce cutting-edge quantum computing technologies into the Group's operations, CMA CGM also announces an investment in Pasqal.
A Quantum Computing Center of Excellence at TANGRAM to optimise transport and logistics
Among the objectives of this partnership, CMA CGM aims to leverage the power of quantum computing to enhance the efficiency, responsiveness, and adaptability of transport and logistics to market fluctuations. In particular, CMA CGM will seek to optimise container management, including their loading on ships.
Together, CMA CGM and Pasqal will establish a Quantum Computing Center of Excellence at TANGRAM, the Group's excellence center dedicated to training and innovation, with access to a quantum processor developed by Pasqal.
Training and events for CMA CGM Group staff members
The e-learning platform developed by Pasqal will be made available to CMA CGM staff members, who will be trained to improve understanding of quantum computing within the Group.
TANGRAM and Pasqal will jointly organize events dedicated to quantum computing, including use case workshops, technical presentations, and master classes. These events are intended to promote innovation and collaboration within CMA CGM and with its partners, clients, and suppliers.
Towards sustainable and innovative excellence
This initiative is part of the CMA CGM Group's strategy to transform its activities through innovation. It follows investments in technology companies and artificial intelligence initiatives such as Kyutai and Mistral AI. The partnership with Pasqal will enable CMA CGM to strengthen its position at the forefront of digitalisation in the transport and logistics sector.
Hadi Zablit, Executive Vice President for Information & Technology at the CMA CGM Group, states: "This partnership with Pasqal will allow CMA CGM to apply quantum computing technologies to maritime transport and logistics, reinforcing our Group's position as a leader in the digital transformation of our industry. With Pasqal, we aim to unlock the full potential of quantum computing for greater operational efficiency, serving our customers."
"All of Pasqal's priorities are reflected in this partnership with CMA CGM: developing concrete use cases with leading industrial players, accelerating the understanding of the potential of quantum computing, and continuing cutting-edge research to serve the quantum ecosystem. Everyone at Pasqal is eager to start collaborating with the teams of this historic French company," says Georges-Olivier Reymond, CEO of Pasqal.
Simulating the universe's most extreme environments with utility-scale quantum computation – IBM
The Standard Model of Particle Physics encapsulates nearly everything we know about the tiny quantum-scale particles that make up our everyday world. It is a remarkable achievement, but it's also incomplete, rife with unanswered questions. To fill the gaps in our knowledge, and discover new laws of physics beyond the Standard Model, we must study the exotic phenomena and states of matter that don't exist in our everyday world. These include the high-energy collisions of particles and nuclei that take place in the fiery heart of stars, in cosmic ray events occurring all across Earth's upper atmosphere, and in particle accelerators like the Large Hadron Collider (LHC) at CERN or the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.
Computer simulations of fundamental physics processes play an essential role in this research, but many important questions require simulations that are much too complex for even the most powerful classical supercomputers. Now that utility-scale quantum computers have demonstrated the ability to simulate quantum systems at a scale beyond exact or brute force classical methods, researchers are exploring how these devices might help us run simulations and answer scientific questions that are inaccessible to classical computation. In two recent papers, published in PRX Quantum (PRX) [1] and Physical Review D (PRD) [2], our research group did just that, developing scalable techniques for simulating the real-time dynamics of quantum-scale particles using the IBM fleet of utility-scale, superconducting quantum computers.
The techniques we've developed could very well serve as the building blocks for future quantum computer simulations that are completely inaccessible to both exact and even approximate classical methods: simulations that would demonstrate what we call quantum advantage over all known classical techniques. Our results provide clear evidence that such simulations are potentially within reach of the quantum hardware we have today.
We are a team of researchers from the University of Washington and Lawrence Berkeley National Laboratory who have spent years investigating the use of quantum hardware for simulations of quantum chromodynamics (QCD).
This work was supported, in part, by the U.S. Department of Energy grant DE-FG02-97ER-41014 (Farrell), by U.S. Department of Energy, Office of Science, Office of Nuclear Physics, InQubator for Quantum Simulation (IQuS) under Award Number DOE (NP) Award DE-SC0020970 via the program on Quantum Horizons: QIS Research and Innovation for Nuclear Science (Anthony Ciavarella, Roland Farrell, Martin Savage), the Quantum Science Center (QSC) which is a National Quantum Information Science Research Center of the U.S. Department of Energy (DOE) (Marc Illa), and by the U.S. Department of Energy (DOE), Office of Science under contract DE-AC02-05CH11231, through Quantum Information Science Enabled Discovery (QuantISED) for High Energy Physics (KA2401032) (Anthony Ciavarella).
This work is also supported, in part, through the Department of Physics and the College of Arts and Sciences at the University of Washington.
This research used resources of the Oak Ridge Leadership Computing Facility (OLCF), which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.
We acknowledge the use of IBM Quantum services for this work.
This work was enabled, in part, by the use of advanced computational, storage and networking infrastructure provided by the Hyak supercomputer system at the University of Washington.
This research was done using services provided by the OSG Consortium, which is supported by the National Science Foundation awards #2030508 and #1836650.
One prominent example of these challenges comes from the field of collider physics. Physicists use colliders like the LHC to smash beams of particles and atomic nuclei into each other at extraordinarily high energies, recreating the kinds of collisions that take place in stars and cosmic ray events. Collider experiments give physicists the ability to observe how matter behaves in the universes most extreme environments. The data we collect from these experiments help us tighten the constraints of the Standard Model and can also help us discover new physics beyond the Standard Model.
Let's say we want to use the data from collider experiments to identify new physics theories. To do this, we must be able to accurately predict the way known physics theories like QCD contribute to the exotic physics processes that occur in collider runs, and we must be able to quantify the uncertainties of the corresponding theoretical calculations. Performing these tasks requires detailed simulations of systems of fundamental particles. These simulations are impossible to achieve with classical computation alone, but should be well within reach for a sufficiently capable quantum computer.
Quantum computing hardware is making rapid progress toward the day when it will be capable of simulating complex systems of fundamental particles, but we can't just sit back and wait for quantum technology to reach maturity. When that day comes, we'll need to be ready with scalable techniques for executing each step of the simulation process.
The research community is already beginning to make significant progress in this field, with most efforts today focused on simulations of simplified, low-dimensional models of QCD and other fundamental physics theories. This is exactly what our research group has been working on, with our experiments primarily centering on simulations of the widely used Schwinger model, a one-dimensional model of QCD that describes how electrons and positrons behave and interact through the exchange of photons.
In a paper submitted to arXiv in 2023, and published in PRX Quantum this past April, we used the Schwinger model to demonstrate the first essential step in building future simulations of high-energy collisions of matter: preparing a simulation of the quantum vacuum state in which particle collisions would occur. Our follow-up to that paper, published in PRD in June, shows techniques for performing the next step in this process preparing a beam of particles in the quantum vacuum.
More specifically, that follow-up paper shows how to prepare hadron wavepackets in a 1-dimensional quantum simulation and evolve them forward in time. In this context, you can think of a hadron as a composite particle made up of a positron and an electron, bound together by something analogous to the strong force that binds neutrons and protons together in nuclei.
Due to the uncertainty principle, it is impossible to precisely know both the position and momentum of a particle. The best you can do is to create a wavepacket, a region of space over which a particle will appear with some probability and with a range of different momenta. The uncertainty in momentum causes the wavepacket to spread out or propagate across some area of space.
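For readers who want the quantitative version of that statement (a standard textbook relation, not something specific to this work), the spreads in position and momentum of any quantum state are constrained by the Heisenberg uncertainty relation:

```latex
% Heisenberg uncertainty relation: the product of the position spread (\Delta x)
% and the momentum spread (\Delta p) can never be smaller than hbar over two.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```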
By evolving our hadron wavepacket forward in time, we effectively create a simulation of pulses or beams of hadrons moving in this 1-dimensional system, just like the beams of particles we smash into each other in particle colliders. The wavepacket we create has an equal probability of propagating in any direction. However, since we're working in 1-dimensional space, essentially a straight line, it's more accurate to say the particle is equally likely to propagate to the left or to the right.
We've established that our primary goal is to simulate the dynamics of a composite hadron particle moving through the quantum vacuum in one-dimensional space. To achieve this, we'll need to prepare an initial state with the hadron situated on a simplified model of space made up of discrete points, also known as a lattice. Then, we'll have to perform what we call time evolution so we can see the hadron move around and study its dynamics.
Our first step is to determine the quantum circuits we'll need to run on the quantum computer to prepare this initial state. To do this, we developed a new state preparation algorithm, Scalable Circuits ADAPT-VQE. This algorithm uses the popular ADAPT-VQE algorithm as a subroutine, and is able to find circuits for preparing the state with the lowest energy (i.e., the ground state) as well as a hadron wavepacket state. A key feature of this technique is the use of classical computers to determine circuit blocks for preparing a desired state on a small lattice that can be systematically scaled up to prepare the desired state on a much larger lattice. These scaled circuits cannot be executed exactly on a classical computer and are instead executed on a quantum computer.
Once we have the initial state, our next step is to apply the time evolution operator. This is a mathematical tool that allows us to take a quantum state as it exists at one point in time and evolve it into the state that corresponds to some future point in time. In our experiment, we use conventional Trotterized time evolution, where you split the Hamiltonian, the mathematical expression for the energy of the quantum system, into its individual terms and convert each term into quantum gates in your circuit.
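To make the idea concrete, here is a generic sketch of first-order Trotterized time evolution in Qiskit. The Hamiltonian below is a toy nearest-neighbor spin model chosen only for illustration, not the Schwinger-model Hamiltonian used in the paper, and the step size and step count are arbitrary example values:

```python
# Sketch of first-order Trotterized time evolution for a toy Hamiltonian.
# Each Trotter step applies a short gate sequence for every Hamiltonian term.
from qiskit import QuantumCircuit
from qiskit.circuit.library import PauliEvolutionGate
from qiskit.quantum_info import SparsePauliOp
from qiskit.synthesis import LieTrotter

n = 4  # toy register size
hamiltonian = SparsePauliOp.from_sparse_list(
    [("ZZ", [i, i + 1], 1.0) for i in range(n - 1)]   # nearest-neighbor coupling terms
    + [("X", [i], 0.5) for i in range(n)],            # on-site terms
    num_qubits=n,
)

dt, steps = 0.1, 10                                    # example time step and step count
step = PauliEvolutionGate(hamiltonian, time=dt, synthesis=LieTrotter(reps=1))

circuit = QuantumCircuit(n)
for _ in range(steps):
    circuit.append(step, range(n))                     # one Trotter step per append
circuit.measure_all()
```

Each appended evolution block expands into a layer of gates per Hamiltonian term; transpiling the result to a backend's native gate set is what ultimately fixes the two-qubit gate count and depth discussed later in this article.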
This, however, is where we run into a problem. Even the simplified Schwinger model states that interactions between individual matter particles in our system are all-to-all. In other words, every matter particle in the system must interact with every other particle in the system, meaning every qubit in our circuit needs to interact with every other qubit.
This poses a few challenges. For one thing, an all-to-all interaction causes the number of quantum gates required for time evolution to scale quadratically with the simulation volume, making these circuits much too large to run on current quantum hardware. Another key challenge is that, as of today, even the most advanced IBM Quantum processor allows only for native interactions between neighboring qubits, so, for example, the fifth qubit in an IBM Quantum Heron processor can technically interact only with qubits 4 and 6. While there are special techniques that let us get around this linear connectivity and simulate longer range interactions, doing this in an all-to-all setting would make it so the required two-qubit gate depth also scales quadratically in the simulation volume.
To get around this problem, we used the emergent phenomenon of confinement, one of the features that the Schwinger model also shares with QCD. Confinement tells us that interactions are significant only over distances around the size of the hadron. This motivated our use of approximate interactions, where the qubits need to interact only with at most next-to-next-to-nearest neighbor qubits, e.g., qubit 5 needs to interact only with qubits 2, 3, 4, 6, and 7. We established a formalism for constructing a systematically improvable interaction and turned that interaction into a sequence of gates that allowed us to perform the time evolution.
Once the time evolution is complete, all we need to do is measure some observable in our final state. In particular, we wanted to see the way our simulated hadron particle propagates on the lattice, so we measured the particle density. At the beginning of the simulation (t=0), the hadron is localized in a specific area. As it evolves forward in time it propagates with a spread that is bounded by the speed of light (a 45° angle).
This figure depicts the results of our simulation of hadron dynamics. The time direction is charted on the left-hand-side Y-axis, and the points on the lattice, qubits 0 to 111, are charted on the X-axis. The colors correspond to the particle density, with higher values (lighter colors) corresponding to a higher probability of finding a particle at that location. The left half of this figure shows the results of error-free approximate classical simulation methods, while the right half shows the results obtained from performing simulations on real quantum hardware (specifically, the `ibm_torino` system). In an error-free simulation, the left and right halves would be mirror images of each other. Deviations from this are due to device errors.
Keeping in mind that this is a simplified simulation in one spatial dimension, we can say this behavior mimics what we would expect to see from a hadron propagating through the vacuum, such as the hadrons produced by a device like the Large Hadron Collider.
Utility-scale IBM quantum hardware played an essential role in enabling our research. Our experiment used 112 qubits on the IBM Quantum Heron processor ibm_torino to run circuits that are impossible to simulate with brute force classical methods. However, equally important was the Qiskit software stack, which provided a number of convenient and powerful tools that were absolutely critical in our simulation experiments.
Quantum hardware is extremely susceptible to errors caused by noise in the surrounding environment. In the future, IBM hopes to develop quantum error correction, a capability that allows quantum computers to correct errors as they appear during quantum computations. For now, however, that capability remains out of reach.
Instead, we rely on quantum error suppression methods to anticipate and avoid the effects of noise, and we use quantum error mitigation post-processing techniques to analyze the quantum computers noisy outputs and deduce estimates of the noise-free results.
In the past, leveraging these techniques for quantum computation could be enormously difficult, often requiring researchers to hand-code error suppression and error mitigation solutions specifically tailored to both the experiments they wanted to run, and the device they wanted to use. Fortunately, the recent advent of software tools like the Qiskit Runtime primitives have made it much easier to get meaningful results out of quantum hardware while taking advantage of built-in error handling capabilities.
In particular, we relied heavily on the Qiskit Runtime Sampler primitive, which calculates the probabilities or quasi-probabilities of bitstrings being output by quantum circuits, and makes it easy to compute physical observables like the particle density.
Sampler not only simplified the process of collecting these outputs, but also improved their fidelity by automatically inserting an error suppression technique known as dynamical decoupling into our circuits and by automatically applying quantum readout error mitigation to our results.
Obtaining accurate, error-mitigated results required running many variants of our circuits. In total, our experiment involved roughly 154 million "shots" on quantum hardware, and we couldn't have achieved this by running our circuits one by one. Instead, we used Qiskit execution modes, particularly Session mode, to submit circuits to quantum hardware in efficient multi-job workloads. The sequential execution of many circuits meant that the calibration and noise on the device were correlated between runs, facilitating our error mitigation methods.
Sending circuits to IBM Quantum hardware while taking advantage of the Sampler primitive and Session mode required just a few lines of code.
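The snippet below is an illustrative reconstruction of that pattern rather than the authors' actual code: the `circuits` list, the option values, and the shot count are placeholders, and the exact interface depends on the installed qiskit-ibm-runtime version (the V1-style Sampler is shown here):

```python
# Illustrative sketch only: "circuits" stands for already-prepared time-evolution
# circuits, and the option values are example settings, not those used in the paper.
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Sampler, Options

service = QiskitRuntimeService()             # loads saved IBM Quantum credentials
options = Options()
options.optimization_level = 3               # transpilation effort applied by the service
options.resilience_level = 1                 # built-in readout error mitigation
options.execution.shots = 8192               # example shot count per circuit

with Session(service=service, backend="ibm_torino") as session:
    sampler = Sampler(session=session, options=options)
    job = sampler.run(circuits)              # submit a batch of circuits in one job
    quasi_dists = job.result().quasi_dists   # quasi-probability distributions over bitstrings
```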
Our team did several runs both with and without Qiskit Runtime's built-in error mitigation, and found that the methods offered natively via the Sampler primitive significantly improved the quality and accuracy of our results. In addition, the flexibility of Session and Sampler allowed us to add additional, custom layers of error mitigation like Pauli twirling and operator decoherence renormalization. The combination of all these error mitigation techniques enabled us to successfully perform a quantum simulation with 13,858 CNOTs and a CNOT depth of 370!
What is CNOT depth? CNOT depth is an important measure of the complexity of quantum circuits. A CNOT gate, or controlled-NOT gate, is a quantum logic gate that takes two qubits as input and performs a NOT operation that flips the value of the second (target) qubit depending on the value of the first (control) qubit. CNOT gates are an important building block in many quantum algorithms and are the noisiest gates on current quantum computers. The CNOT depth of a quantum simulation refers to the number of layers of CNOT gates across the whole device that have to be executed (each layer can have multiple CNOT gates acting on different qubits, but they can be applied at the same time, i.e., in parallel). Without the use of quantum error handling techniques like those offered by the Qiskit software stack, reaching a CNOT depth of 370 would be impossible.
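As a small, hypothetical illustration of how these quantities can be inspected for any circuit (assuming a recent Qiskit version), `QuantumCircuit.depth` accepts a filter function, so the CNOT-only depth can be computed alongside the raw CNOT count:

```python
# Illustrative example (not from the paper): count CNOTs and compute the CNOT-only
# depth of a circuit after transpiling it to a CNOT-based basis gate set.
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.cx(0, 1)

compiled = transpile(qc, basis_gates=["cx", "rz", "sx", "x"], optimization_level=1)
cnot_count = compiled.count_ops().get("cx", 0)
cnot_depth = compiled.depth(lambda instr: instr.operation.name == "cx")
print(f"CNOT count: {cnot_count}, CNOT depth: {cnot_depth}")
```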
Over the course of two research papers, we have demonstrated techniques for using utility-scale quantum hardware to simulate the quantum vacuum, and to simulate the dynamics of a beam of particles on top of that vacuum. Our research group is already hard at work on the logical next step in this progression: simulating collisions between two particle beams.
If we can simulate these collisions at high enough energy, we believe we can demonstrate the long-sought goal of quantum computational advantage. Today, no classical computing method is capable of accurately simulating the collision of two particles at the energies we've set our sights on, even using simplified physics theories like the Schwinger model. However, our research so far indicates that this task could be within reach for near-term utility-scale quantum hardware. This means that, even without achieving full quantum error correction, we may soon be able to use quantum hardware to build simulations of systems of fundamental particles that were previously impossible, and use those simulations to seek answers to some of the most enduring mysteries in all of physics.
At the same time, IBM hasn't given up hope for quantum error correction, and neither have we. Indeed, we've poured tremendous effort into ensuring that the techniques we've developed in our research are scalable, such that we can transition them from the noisy, utility-scale processors we have today to the hypothetical error-corrected processors of the future. If achieved, the ability to perform error correction in quantum computations will make quantum computers considerably more powerful, and open the door to rich, three-dimensional simulations of incredibly complex physics processes. With those capabilities at our fingertips, who knows what we'll discover?
How strategic visionary Atul Gupta is charting new frontiers in quantum computing and AI and setting global benchmarks – Business Insider Africa
Atul Gupta has significantly contributed to the fields of quantum computing and artificial intelligence (AI), influencing both organizational growth and industry standards. His work demonstrates the practical applications of these advanced technologies, showing how they can solve complex problems and optimize operations across various sectors.
In quantum computing, Gupta's initiatives have led to advancements that enable organizations to tackle problems previously unsolvable by classical computers. This has resulted in tangible benefits such as accelerated drug discovery processes in the pharmaceutical industry and enhanced design optimizations in CRM systems. These contributions have not only reduced processing times for complex data but also improved overall computational capabilities, demonstrating Gupta's role in pushing the boundaries of technological applications. "Our vision is to make the impossible possible by leveraging quantum computing to solve real-world challenges," Gupta states, emphasizing the transformative potential of this technology.
Gupta's expertise in AI has been crucial in transforming business operations from healthcare to cybersecurity. By integrating AI into key processes, he has improved efficiency and facilitated real-time decision-making. Applications like natural language processing for virtual assistants and data analysis for detecting financial fraud have been implemented under his guidance, streamlining operations and enabling businesses to adapt to the dynamic market landscape.
A significant aspect of Gupta's work is his impact on the Salesforce ecosystem. His proficiency with Salesforce Data Cloud AI has enabled businesses to analyze extensive customer data from multiple sources, providing actionable insights that enhance personalization and predict future behaviors. This has led to improved customer satisfaction and loyalty. Furthermore, his use of Salesforce's Einstein platform has helped businesses automate responses and anticipate customer needs, thereby refining marketing efforts and boosting sales performance. "Our goal is to empower businesses to understand and serve their customers better through intelligent data utilization," Gupta notes, highlighting the vision behind his work with Salesforce technologies.
In addressing the challenges of technological integration and security, Gupta has led efforts to develop robust hardware and sophisticated software solutions. Being an expert in the field, he has been particularly focused on mitigating the security risks posed by quantum computing, promoting the adoption of quantum-safe cryptographic solutions and stringent governance standards. This approach has set new benchmarks in digital security, ensuring organizations remain protected against evolving cyber threats.
He has played a crucial role in the integration of the Dynatrace platform, which tremendously helped in enhancing system observability and security for customer operations. By leveraging Dynatrace's analytics and automation capabilities, businesses now have comprehensive visibility across their digital ecosystems, enabling real-time threat detection and neutralization. This optimization of system performance has ensured high levels of security and reliability, showcasing the practical benefits of advanced technological solutions.
Gupta has shown consistent efforts in leveraging his skills beyond a particular industry or organization. His commitment to fostering investment and collaboration is evident in his efforts and vision for partnerships between governments and industries. These collaborations are crucial for overcoming existing limitations and unlocking the full potential of quantum computing and AI. Furthermore, his dedication to ensuring equitable access and ethical deployment of these technologies plays a critical role in preventing disparities and promoting responsible innovation.
Atul Gupta's work in integrating quantum computing and artificial intelligence is transformative, setting new standards for computational power and cognitive capabilities across industries. By managing the challenges and harnessing the opportunities these technologies present, Mr. Gupta shows his leadership and commitment to a technologically advanced and ethically guided future. His contributions enhance competitive edges and operational efficiencies, ensuring the sustainable and equitable expansion of these advanced technologies.
Qubit Pharmaceuticals And Sorbonne University Reduce The Number of Qubits Needed to Simulate Molecules – The Quantum Insider
Insider Brief
PRESS RELEASE: Qubit Pharmaceuticals, a deeptech company specializing in the discovery of new drug candidates through molecular simulation and modeling accelerated by hybrid HPC and quantum computing, announces that it has drastically reduced the number of qubits needed to compute the properties of small molecules with its Hyperion-1 emulator, developed in partnership with Sorbonne University. This world first raises hopes of a near-term practical application of hybrid HPC and quantum computing to drug discovery.
As a result of these advances, Qubit Pharmaceuticals and Sorbonne Université are announcing that they have been awarded €8 million in funding under the France 2030 national plan for the further development of Hyperion-1.
A world first that saves years in research
By developing new hybrid HPC and quantum algorithms to leverage the computing power of quantum computers in the field of chemistry and drug discovery, Sorbonne Université and Qubit Pharmaceuticals have succeeded, with just 32 logical qubits, in predicting the physico-chemical properties of nitrogen (N2), hydrogen fluoride (HF), lithium hydride and water molecules that would normally require more than 250 perfect qubits. The Hyperion-1 emulator uses Genci supercomputers, Nvidia's SuperPod EOS, and one of Scaleway's many GPU clusters.
With this first proof of concept, the teams have demonstrated that the routine use of quantum computers coupled with high-performance computing platforms for chemistry and drug discovery is much closer than previously thought. Nearly 5 years could be gained, bringing us significantly closer to the era when quantum computers (noisy or perfect) could be used in production within hybrid supercomputers combining HPC, AI and quantum. The use of these new computing powers will improve the precision, speed and carbon footprint of calculations.
Soon to be deployed on today's noisy machines
To achieve this breakthrough, teams from Qubit Pharmaceuticals and Sorbonne University have developed new algorithms that break down a quantum calculation into its various components, some of which can be calculated precisely on conventional hardware. This strategy enables calculations to be distributed using the best hardware (quantum or classical), while automatically improving the complexity of the algorithms needed to calculate the molecules' properties.
In this way, all calculations not enhanced by quantum computers are performed on classical GPUs. Because the physics used reduces the number of qubits required for the calculations, the team, by optimizing the approach to the extreme, has even managed to limit GPU requirements to a single card in some cases. As this hybrid classical/quantum approach is generalist, it can be applied to any type of quantum chemistry calculation; it is not restricted to molecules of pharmaceutical interest but extends to catalysts (chemistry, energy) and materials.
Next steps include deploying these algorithms on existing noisy machines to quantify the impact of noise and compare performance with recent calculations by IBM and Google, and predicting the properties of molecules of pharmaceutical interest. To achieve this, the teams will deploy new software acceleration methods to reach regimes that would require more than 400 qubits with purely quantum approaches. In the short term, this hybrid approach will reduce the need for physical qubits on quantum machines.
Robert Marino, CEO of Qubit Pharmaceuticals, declares: "At the end of 2023, we announced quantum chemistry calculations using 40 qubits. A few months later, we've managed to solve equations that would require 250 logical qubits. An extremely rapid development that confirms the near-term potential of hybrid HPC and quantum algorithms in the service of drug discovery."
Jean-Philip Piquemal, Professor at Sorbonne University and Director of the Theoretical Chemistry Laboratory (Sorbonne University/CNRS), co-founder and Chief Scientific Officer of Qubit Pharmaceuticals, states: "This work clearly demonstrates the need to progress simultaneously on hardware and software development. It is by making breakthroughs on both fronts that we will be able to enter the era of quantum utility for drug discovery in the very short term."
Élisabeth Angel-Perez, Vice-President Research and Innovation at Sorbonne Université: "These innovative approaches developed by Qubit Pharmaceuticals are an illustration of Sorbonne Université's commitment to serving society. The precision and power of quantum computers offer major performance gains. With Qubit Pharmaceuticals, we measure the enormous potential of theoretical computing for quantum chemistry."
Sébastien Luttringer, Head of R&D at Scaleway: "We are proud to have participated in Qubit Pharmaceuticals' major algorithmic breakthrough with the support of our GPU computing power, the largest in the European cloud. Quantum computing is not just a hardware challenge, it's also a software one that we need to develop in order to solve real-world problems. Scaleway's pragmatic strategy, with the introduction of its QaaS (Quantum as a Service) offering, is to simplify access to the best resources to help the algorithms of tomorrow emerge."
New Method Could Pave The Way to Fast, Cross-Country Quantum Networks – The Quantum Insider
Insider Brief
PRESS RELEASE: Quantum computers offer powerful ways to improve cybersecurity, communications, and data processing, among other fields. To realize these full benefits, however, multiple quantum computers need to be connected to build quantum networks or a quantum internet. Scientists have struggled to come up with practical methods of building such networks, which must transmit quantum information over long distances.
Now, researchers at the University of Chicago Pritzker School of Molecular Engineering (PME) have proposed a new approach: building long quantum channels using vacuum-sealed tubes with an array of spaced-out lenses. These vacuum beam guides, about 20 centimeters in diameter, would have ranges of thousands of kilometers and capacities of more than 10 trillion qubits per second, better than any existing quantum communication approach. Photons of light encoding quantum data would move through the vacuum tubes and remain focused thanks to the lenses.
"We believe this kind of network is feasible and has a lot of potential," said Liang Jiang, professor of molecular engineering and senior author of the new work. "It could not only be used for secure communication, but also for building distributed quantum computing networks, distributed quantum sensing technologies, new kinds of telescopes, and synchronized clocks."
Jiang collaborated with scientists at Stanford University and the California Institute of Technology on the new work, which is published in Physical Review Letters.
Sending qubits
While classical computers encode data in conventional bits, represented as a 0 or 1, quantum computers rely on qubits, which can exhibit quantum phenomena. These phenomena include superposition, a kind of ambiguous combination of states, as well as entanglement, which allows two quantum particles to be correlated with each other even across vast distances.
These properties give quantum computers the ability to analyze new types of data and store and pass along information in new, secure ways. Connecting multiple quantum computers can make them even more powerful, as their data processing abilities can be pooled. However, the networks typically used to connect computers are not ideal because they cannot maintain the quantum properties of qubits.
"You can't send a quantum state over a classical network," explained Jiang. "You might send a piece of data classically, a quantum computer can process it, but the result is then sent back classically again."
Some researchers have tested ways of using fiber-optic cables and satellites to transmit optical photons, which can act as qubits. Photons can travel a short distance through existing fiber-optic cables but generally lose their information quickly as photons are absorbed. Photons bounced to satellites and back to the ground in a new location are absorbed less because of the vacuum of space, but their transmission is limited by atmospheric absorption and the availability of the satellites.
"What we wanted to do was to combine the advantages of each of those previous approaches," said PME graduate student Yuexun Huang, the first author of the new work. "In a vacuum, you can send a lot of information without attenuation. But being able to do that on the ground would be ideal."
Learning from LIGO
Scientists working at the Laser Interferometer Gravitational-Wave Observatory (LIGO) at the California Institute of Technology have built huge ground-based vacuum tubes to contain moving photons of light that can detect gravitational waves. Experiments at LIGO have shown that inside a nearly molecule-free vacuum, photons can travel for thousands of kilometers.
Inspired by this technology, Jiang, Huang, and their colleagues began to sketch out how smaller vacuum tubes could be used to transport photons between quantum computers. In their new theoretical work, they showed that these tubes, if designed and arranged properly, could carry photons across the country. Moreover, they would only need medium vacuum (10^-4 atmosphere pressure), which is much easier to maintain than the ultra-high vacuum (10^-11 atmosphere pressure) required for LIGO.
"The main challenge is that as a photon moves through a vacuum, it spreads out a bit," explained Jiang. "To overcome that, we propose putting lenses every few kilometers that can focus the beam over long distances without diffraction loss."
In collaboration with researchers at Caltech, the group is planning tabletop experiments to test the practicality of the idea, and then plans to use larger vacuum tubes such as those at LIGO to work on how to align the lenses and stabilize the photon beams over long distances.
"To implement this technology on a larger scale certainly poses some civil engineering challenges that we need to figure out as well," said Jiang. "But the ultimate benefit is that we have large quantum networks that can communicate tens of terabytes of data per second."
Citation: "Vacuum Beam Guide for Large Scale Quantum Networks," Huang et al., Physical Review Letters, July 9, 2024. DOI: 10.1103/PhysRevLett.133.020801
Funding: This work was supported by the Army Research Laboratory, Air Force Research Laboratory, National Science Foundation, NTT Research, Packard Foundation, the Marshall and Arlene Bennett Family Research Program, and the U.S. Department of Energy.
Register to host an event at Qiskit Fall Fest 2024! – IBM
Key dates for prospective event hosts:
August 7: Deadline to sign up for event host informational sessions and Qiskit Fall Fest mailing list
August 15: Informational session
August 16: Informational session
August 22: Deadline for event host applications
August 27: Application decisions to be announced
September 3: Qiskit Fall Fest 2024 event lineup to be announced to the public
October-November: Qiskit Fall Fest events take place
Since 2021, the Qiskit Fall Fest has brought together quantum enthusiasts of all backgrounds and experience levels for a worldwide celebration of quantum technology, research, and collaboration. Spearheaded primarily by student leaders and taking place on university campuses all around the globe, Qiskit Fall Fest gives participants a unique opportunity to engage with the Qiskit community and even get hands-on experience with real quantum computers. Now, the event series is gearing up to return for its fourth annual installment, which will kick off in October.
Qiskit Fall Fest is a collection of quantum computing events that invites students, researchers and industry professionals around the world to participate in a wide array of quantum-themed activities, ranging from quantum challenges, hackathons, and coding competitions to workshops, social events, and more. With each Qiskit Fall Fest, we partner with a select group of university students and other volunteer hosts to help them plan and run the global roster of Fall Fest events. This year's event theme, World of Quantum, celebrates the international scope of the event series and the rapid growth of the global quantum community.
Last year's Qiskit Fall Fest engaged over 4,000 participants with the help of 95 event hosts, all working alongside IBM Quantum to grow their local quantum communities. We hope to see even more participants in 2024!
We're looking for volunteers located all around the world to host their very own events as part of the Qiskit Fall Fest lineup. Anyone who has a passion for quantum computing is eligible to host a Fall Fest event. (See the next section of this post for more details on host eligibility.)
Interested in joining the fun? Click this link to register for one of the Qiskit Fall Fest informational sessions we'll be holding this summer for prospective event hosts.
The informational sessions will take place on Thursday, August 15 and Friday, August 16, and will give prospective event hosts valuable insights into the requirements and time commitment involved with running a Qiskit Fall Fest event.
If you'd like to participate in Qiskit Fall Fest but don't plan on hosting an event, you can also use the same registration link to sign up for the Qiskit Fall Fest mailing list, which will keep you up to date with all the latest details on this year's events.
Please submit all registrations for the Qiskit Fall Fest informational sessions and/or mailing list by Wednesday, August 7.
After the informational sessions, prospective event hosts will submit applications detailing their background and expertise in quantum computing. Applications will be due the week after the information sessions, and decisions will be announced the week after that. Be sure to check the sidebar at the top of this page for all key dates.
The full roster of Qiskit Fall Fest 2024 events will be announced to the public in early September, and the events themselves will take place in October and November.
Most Qiskit Fall Fest events take place on university campuses and are led by university students, though there are certainly some exceptions. We've intentionally put students at the forefront of the Qiskit Fall Fest event series since its initial launch in 2021. That's because we believe the student leaders of today will be the quantum industry leaders of tomorrow. With the Qiskit Fall Fest, we aim to give students an opportunity to put their leadership skills to the test and help grow the quantum community using resources and guidance from IBM.
At the same time, anyone can participate in and even host a Qiskit Fall Fest event. Don't have access to a university campus? No problem! In the past, we've had high school students, recent graduates, and even industry professionals host events that take place virtually and in other appropriate settings. Just be sure to register for the informational sessions by August 7 and submit your idea for an event by August 22. If it's a fit, we'll work with you to bring it to life. (Please note: Only those who attend one of the informational sessions will receive access to the event host application.)
Click here to register for the mailing list and informational sessions.