
Qubit Pharmaceuticals and Sorbonne University Reduce Number of Qubits Needed to Simulate Molecules – HPCwire

PARIS, July 11, 2024 – Qubit Pharmaceuticals, a deeptech company specializing in the discovery of new drug candidates through molecular simulation and modeling accelerated by hybrid HPC and quantum computing, announced that it has drastically reduced the number of qubits needed to compute the properties of small molecules with its Hyperion-1 emulator, developed in partnership with Sorbonne Université. This world first raises hopes of a near-term practical application of hybrid HPC-quantum computing to drug discovery.

As a result of these advances, Qubit Pharmaceuticals and Sorbonne Université are announcing that they have been awarded €8 million in funding under the France 2030 national plan for the further development of Hyperion-1.

"At the end of 2023, we announced quantum chemistry calculations using 40 qubits," said Robert Marino, CEO of Qubit Pharmaceuticals. "A few months later, we've managed to solve equations that would require 250 logical qubits. An extremely rapid development that confirms the near-term potential of hybrid HPC and quantum algorithms in the service of drug discovery."

By developing new hybrid HPC and quantum algorithms to leverage the computing power of quantum computers in the field of chemistry and drug discovery, Sorbonne Université and Qubit Pharmaceuticals have succeeded, with just 32 logical qubits, in predicting the physico-chemical properties of nitrogen (N2), hydrogen fluoride (HF), lithium hydride and water, molecules that would normally require more than 250 perfect qubits. The Hyperion-1 emulator uses Genci supercomputers, Nvidia's EOS SuperPod, and one of Scaleway's many GPU clusters.

With this first proof of concept, the teams have demonstrated that the routine use of quantum computers coupled with high-performance computing platforms for chemistry and drug discovery is much closer than previously thought. Nearly 5 years could be gained, bringing us significantly closer to the era when quantum computers (noisy or perfect) could be used in production within hybrid supercomputers combining HPC, AI and quantum. The use of these new computing powers will improve the precision, speed and carbon footprint of calculations.

To achieve this breakthrough, teams from Qubit Pharmaceuticals and Sorbonne University have developed new algorithms that break down a quantum calculation into its various components, some of which can be calculated precisely on conventional hardware. This strategy enables calculations to be distributed across the best-suited hardware (quantum or classical), while automatically reducing the complexity of the algorithms needed to calculate the molecules' properties.

In this way, all calculations not enhanced by quantum computers are performed on classical GPUs. Because the physics used reduces the number of qubits required for the calculations, the team, by optimizing the approach to the extreme, has even managed to limit GPU requirements to a single card in some cases. Since this hybrid classical/quantum approach is generalist, it can be applied to any type of quantum chemistry calculation: it is not restricted to molecules of pharmaceutical interest, but extends to catalysts (chemistry, energy) and materials as well.
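The divide-and-dispatch strategy described above can be sketched in a few lines. This is a hypothetical illustration of the general idea — route each component of a calculation to the cheapest backend that can evaluate it exactly — and not Qubit Pharmaceuticals' actual algorithm; all names and numbers below are invented.

```python
# Toy sketch of a hybrid classical/quantum dispatch: a calculation is split
# into components, and each component is routed to the backend that can
# evaluate it exactly at the lowest cost. Purely illustrative.

def classical_eval(term):
    # Components with known closed forms run on CPUs/GPUs.
    return term["coefficient"] * term["classical_value"]

def quantum_eval(term):
    # Only the hard remainder is sent to a quantum device or emulator;
    # stubbed here as a precomputed number.
    return term["coefficient"] * term["quantum_value"]

def hybrid_energy(terms):
    """Sum contributions, dispatching each term to the right backend."""
    total = 0.0
    for term in terms:
        backend = classical_eval if term["tractable"] else quantum_eval
        total += backend(term)
    return total

# Example: two classically tractable terms plus one "quantum" term.
terms = [
    {"coefficient": 1.0, "tractable": True,  "classical_value": -1.10, "quantum_value": 0.0},
    {"coefficient": 0.5, "tractable": True,  "classical_value": -0.20, "quantum_value": 0.0},
    {"coefficient": 1.0, "tractable": False, "classical_value": 0.0,   "quantum_value": -0.05},
]
print(hybrid_energy(terms))  # ≈ -1.25
```

The point of the sketch is that only the last term would ever touch quantum hardware; everything else stays on classical machines, which is how the qubit count stays small.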

Next steps include deploying these algorithms on existing noisy machines to quantify the impact of noise, comparing performance with recent calculations by IBM and Google, and predicting the properties of molecules of pharmaceutical interest. To achieve this, the teams will deploy new software acceleration methods to reach regimes that would require more than 400 qubits with purely quantum approaches. In the short term, this hybrid approach will reduce the need for physical qubits on quantum machines.

"These innovative approaches developed by Qubit Pharmaceuticals are an illustration of Sorbonne Université's commitment to serving society," said Élisabeth Angel-Perez, Vice-President for Research and Innovation at Sorbonne Université. "The precision and power of quantum computers offer major performance gains. With Qubit Pharmaceuticals, we measure the enormous potential of theoretical computing for quantum chemistry."

About Qubit Pharmaceuticals

Qubit Pharmaceuticals was founded in 2020 with the vision of co-developing new, more effective and safer drugs with pharmaceutical and biotech companies. A spin-off from the research work of five internationally renowned scientists – Louis Lagardère (Sorbonne University and CNRS), Matthieu Montes (CNAM), Jean-Philip Piquemal (Sorbonne University and CNRS), Jay Ponder (Washington University in St. Louis) and Pengyu Ren (University of Texas at Austin) – Qubit Pharmaceuticals leverages its Atlas platform to discover new drugs through simulation and molecular modeling accelerated by hybrid HPC and quantum computing. The multidisciplinary team, led by CEO Robert Marino, and the founders are based in France at the Paris Santé Cochin incubator and in the USA in Boston. Qubit Pharmaceuticals is supported and co-funded by the European Innovation Council (EIC) through the European Innovation Council and SMEs Executive Agency (EISMEA).

About Sorbonne Université

Sorbonne Université is a world-class, multidisciplinary, research-intensive university covering the humanities, health, science and engineering. Anchored in the heart of Paris and with a regional presence, Sorbonne Université has 55,000 students, 7,300 teaching and research staff, and over a hundred laboratories. Alongside its partners in the Sorbonne University Alliance, and via its institutes and multidisciplinary initiatives, it conducts and programs research and training activities to strengthen its collective contribution to the challenges of three major transitions: a global approach to health (One Health), resources for a sustainable planet (One Earth), and changing societies, languages and cultures (One Humanity). Sorbonne Université is committed to innovation and deeptech with the Cité de l'innovation Sorbonne Université, over 15,000 m² dedicated to innovation, incubation and the link between research and entrepreneurship, as well as the Sorbonne Center for Artificial Intelligence (SCAI), a house of AI in the heart of Paris created to organize and make visible multidisciplinary AI research. Sorbonne Université is also a member of Alliance 4EU+, an innovative model for European universities that develops strategic international partnerships and promotes the openness of its community to the rest of the world.

Source: Qubit Pharmaceuticals


World's highest-performing quantum chip unveiled by Oxford Ionics – Interesting Engineering

A new high-performance quantum chip built by Oxford Ionics, a spinoff from the University of Oxford, has broken previous records in the quantum computing domain. The achievement is commendable since error correction was not used during the process, and the chip can also be manufactured at existing semiconductor fabs. The company expects a useful quantum computer to be available to the world in the next three years.

Quantum computing is the next frontier of computing, where computers will be able to rapidly compute results by processing information that would take today's fastest supercomputers years to handle.

Research institutes and private enterprises are now locked in a race to build the world's first usable quantum computer. However, the basic data storage unit, the quantum bit (qubit), can only be worked with under highly specialized conditions. Researchers need to find simpler ways to process qubits to make the technology more mainstream.

Founded in 2019 by eminent Oxford scientists, Oxford Ionics uses a trapped ion approach to quantum computing. Compared to other approaches, trapped ions can help in precise measurements while staying in superposition for longer durations.

Controlling trapped ions for computation is typically achieved with lasers. However, Oxford Ionics has eliminated the use of lasers and developed an electronic way to achieve the same effect. They call it Electronic Qubit Control.

The team at Oxford Ionics has integrated everything needed to control the trapped ions onto a silicon chip. This chip can be manufactured at any existing semiconductor fabrication facility, making it possible to scale trapped-ion-based quantum computers.

In a press release sent to Interesting Engineering, Oxford Ionics confirmed that it achieved industry records in two-qubit and single-qubit gate performance.

"The industry's biggest players have taken different paths towards the goal of making quantum computing a reality," said Chris Ballance, co-founder and CEO of Oxford Ionics, in the statement.

"From the outset, we have taken a rocket ship approach, focusing on building robust technology by solving the really difficult challenges first. This has meant using novel physics and smart engineering to develop scalable, high-performance qubit chips that do not need error correction to get to useful applications and can be controlled on a classic semiconductor chip," Ballance added.

A major challenge in adopting quantum computers is how easily the system accumulates errors, given its fast computing rates. Researchers therefore combine large numbers of physical qubits into logical qubits that give more reliable answers, and apply error correction to the computations.
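The idea of building one reliable logical unit out of many noisy physical ones can be illustrated with the simplest possible scheme, a three-bit repetition code decoded by majority vote. This toy sketch is purely illustrative: real quantum error correction protects quantum states rather than classical bits and is far more involved, but the redundancy principle is the same.

```python
import random

# Store one logical bit in three noisy physical bits; recover it by
# majority vote. Compare the raw error rate with the protected one.

def encode(bit):
    return [bit, bit, bit]

def apply_noise(physical, flip_prob, rng):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ 1 if rng.random() < flip_prob else b for b in physical]

def decode(physical):
    # Majority vote across the three copies.
    return 1 if sum(physical) >= 2 else 0

rng = random.Random(0)
trials = 100_000
flip_prob = 0.05  # illustrative per-physical-bit error rate

raw_errors = sum(rng.random() < flip_prob for _ in range(trials))
encoded_errors = sum(
    decode(apply_noise(encode(0), flip_prob, rng)) != 0 for _ in range(trials)
)
print(raw_errors / trials)      # ~0.05 unprotected
print(encoded_errors / trials)  # ~0.007 with majority vote (≈ 3p² - 2p³)
```

A logical error now requires at least two of the three bits to fail at once, which is why the protected error rate drops from p to roughly 3p².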

Oxford Ionics says its high-performance qubits eliminate the need for error correction, allowing commercial applications without the associated costs of error correction. The company is confident that, thanks to the scalability of its Electronic Qubit Control system, it can build a 256-qubit chip in the next few years.

"When you build a quantum computer, performance is as important as size: increasing the number of qubits means nothing if they do not produce accurate results," said Tom Harty, CTO at Oxford Ionics. "We have now proven that our approach has delivered the highest level of performance in quantum computing to date, and is now at the level required to start unlocking the commercial impact of quantum computing."

"This is an incredibly exciting moment for our team, and for the positive impact that quantum computing will have on society at large," Harty concluded.


Ameya Paleja is a science writer based in Hyderabad, India. A molecular biologist at heart, he traded the micropipette to write about science during the pandemic and does not want to go back. He likes to write about genetics, microbes, technology, and public policy.


Colorado leads the world in quantum tech. Now its potential is growing. – Denver 7 Colorado News

BROOMFIELD, Colo. – In an unassuming brick building in northern Colorado's city of Broomfield, a new technology is harnessing the power of the very smallest particles and waves to solve our biggest problems.

On small golden chips, tiny particles known as quantum bits or qubits are trapped above the surface. Lasers and voltages move those qubits around, powering the computer of the future.

What will it mean?

"Faster discovery," said Dr. Jenni Strabley, who works with Quantinuum, the world's biggest integrated quantum computing company.


Colorado already leads the world when it comes to quantum technology. Now, new investments in the Mountain West's quantum tech hub are expanding its potential to transform everything from our health to our national security.

"If we want to live in the Jetsons age, we've got to get this right," said Zachary Yerushalmi, chief executive and regional innovation officer for Elevate Quantum.

"The quantum community in Colorado has a gravity around it really unlike anywhere else on the planet," Yerushalmi said. "We have more organizations, more jobs, more Nobel Prizes than any other."

But to take these innovations to the next level, Yerushalmi said collaboration is key.


Elevate Quantum is the largest regional consortium of researchers and companies working together on quantum technology in the United States. And it's set to keep growing. The Biden administration recognized Elevate Quantum as a designated Tech Hub late last year, and the federal government will soon invest $40.5 million in funding through the Economic Development Administration.

On top of that, Colorado is investing $74 million in state support, including $44 million in refundable tax credits to help pay for a shared quantum research facility and $30 million to help smaller Colorado quantum companies access capital through a loan loss reserve.

Colorado Gov. Jared Polis expects these investments in quantum technology to create more than 10,000 jobs and $1 billion in economic impact statewide.


But what does quantum tech mean for the average person?

"Quantum has been shaping our lives for a while, on an everyday basis," Yerushalmi said.

The atomic clock makes our internet connection and GPS systems possible.

"With emerging quantum tech, literally any scientific promise that people think of is within reach," he said.

Quantum has the potential to help us cure diseases like cancer and Alzheimer's, to create batteries that last an incredibly long time without needing to charge, to take artificial intelligence to the next level, and even to upend our cybersecurity abilities.

"Us having these capabilities first is fundamental to national security," Yerushalmi said.

The job is to stay on top, he said, by staying together.

"Looking at history, that's how it's been done. Whether it be the Manhattan Project, whether it be the Apollo Project. It was seen as, 'Look, guys, we just got to make it happen.' So that's what we're doing," Yerushalmi said.


Quantinuum is one of the companies joining this collaboration.

"This is a difficult technology to develop," said Quantinuum's Dr. Strabley.

Quantum computers are expensive to fabricate and take skilled workers to operate, which is why the new investments will focus on making tech many researchers can share and training Coloradans who can fill these jobs.

Dr. Strabley said Quantinuum already shares its quantum computers' capabilities by solving complex problems for clients.

You can think of scientific questions as a maze. A classical computer operates a lot like us: it will look at the maze and take one path, then another, trying out every possible path until it solves the problem. But a quantum computer is so powerful that it can, in effect, tackle the whole maze at once.
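The classical side of the maze analogy can be made concrete: an exhaustive search literally tries one path after another, backtracking at dead ends, which is exactly the one-at-a-time behaviour being contrasted with quantum machines. A minimal sketch, with an invented toy maze:

```python
# Depth-first search through a small grid maze: 0 = open cell, 1 = wall.
# The search follows one path at a time and backtracks at dead ends --
# the classical behaviour described in the maze analogy above.

def solve_maze(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    stack = [(start, [start])]  # each entry: (current cell, path so far)
    visited = {start}
    while stack:
        (r, c), path = stack.pop()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in visited):
                visited.add((nr, nc))
                stack.append(((nr, nc), path + [(nr, nc)]))
    return None  # no path exists

maze = [
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(solve_maze(maze, (0, 0), (0, 2)))
```

On this 3×3 maze the search must walk all the way around the wall, visiting seven cells to reach an exit two cells away, which is what makes exhaustive classical search expensive as mazes grow.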

As the technology advances, Dr. Strabley said, it will become faster and less energy-intensive to solve problems, including ones a classical computer can't handle at all.

She said almost all discoveries start with simulations or models. Since quantum computing will eventually be able to model things very quickly, new discoveries will come faster and faster.

"Maybe it's chemistry, maybe it's machine learning, maybe it's AI," she said.


Dr. Brian Neyenhuis, another scientist at Quantinuum, leads the team that operates its commercial quantum computers. Day to day, he said, "there's a lot of things going on here in Colorado."

In the control room, an array of screens helps Dr. Neyenhuis and his team constantly monitor how their quantum computers are doing, so they can regulate everything from the temperature in the room to the lasers shining on the chips to manipulate the qubits.

For now, the computers fill a large room next door. The quantum chips are placed at the center of tables, surrounded by wires and hidden behind thick black curtains meant to protect scientists' eyes from blinding laser beams. Someday, the technology might shrink in size and grow in capability, a lot like the computers we now hold in the palm of our hands.

"I've been dreaming about this stuff since I took my first physics classes in college: to be able to just take an individual atom and manipulate it. It's such a clean system," Dr. Neyenhuis said. "That's quite beautiful."




Treasure Hunt: 3 Quantum Computing Stocks Wall Street Hasn't Discovered Yet – InvestorPlace

Quantum computing is making notable strides, positioning itself as a compelling sector for investment. These advancements are expected to drive interest in quantum computing stocks, as the industry progresses from research and development.

For quantum computing stocks, last year was marked by some significant developments. In 2023, public funding for quantum technologies increased by more than 50%, with global investments reaching $42 billion. Private investment in quantum technology startups totaled $1.71 billion, despite a decrease from previous years.

Furthermore, there were 367,000 graduates in quantum technology-relevant fields in 2023, with a 10% increase in universities offering master's degrees in quantum technology.

So quantum computing stocks, and the industry as a whole, are rapidly making progress toward commercialization. Here are three companies that I feel will be in the best position moving forward to take advantage of these trends and emerge as long-term winners. Don't miss out on these opportunities.


D-Wave Quantum (NYSE:QBTS) specializes in quantum annealing technology, which is used for optimization problems and has various industrial applications.

Despite its small size (~$200M market cap), D-Wave has established itself as an early leader in commercial quantum computing. The company's Q1 results show strong traction, with revenue up 56% year-over-year to $2.5 million and bookings up 54% to $4.5 million. D-Wave has also been steadily growing its customer base, especially among large enterprises. It currently has 128 total customers, including 25 Forbes Global 2000 companies.

Encouragingly, D-Wave has also been improving its financial profile. Gross margins expanded from 27% to 67% year-over-year in Q1 as revenue grew and operating efficiencies kicked in. Adjusted EBITDA loss narrowed by 24% to $12.9 million. While still unprofitable, D-Wave's losses are moving in the right direction.

Analysts are quite bullish on the stock, with a consensus strong buy rating and an average price target of $2.80, representing 150% upside from current levels. The company's modest ~$200M enterprise value leaves ample room for appreciation if D-Wave can execute on its growth initiatives and maintain its early leadership among quantum computing stocks.


IonQ (NYSE:IONQ) focuses on trapped-ion quantum computers, offering systems across major public cloud services. I think it's definitely one of the quantum computing stocks for investors to consider.

IonQ delivered strong Q1 results, with revenue of $7.6 million coming in above the high end of guidance and representing 77% year-over-year growth. The company's systems are gaining traction with customers, as evidenced by recent announcements like DESY using IonQ Aria to optimize airport gate assignments and Oak Ridge National Lab leveraging IonQ to explore power grid optimization. IonQ's technology is helping solve real-world problems today.

The company raised its full-year 2024 bookings guidance range to $75–$95 million, suggesting accelerating commercial momentum. IonQ also noted its sales pipeline is expanding significantly in deal size, volume and visibility. Analysts currently expect IonQ's revenue to reach $321 million by 2027, representing a 106% CAGR from 2023 levels.
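For readers who want to check figures like the one above, the compound annual growth rate (CAGR) arithmetic is straightforward. The 2023 base below is back-calculated from the quoted target and rate for illustration, not taken from IonQ's filings.

```python
# CAGR: the constant annual growth rate that turns start_value into
# end_value over the given number of years.

def cagr(start_value, end_value, years):
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative: reaching $321M in 2027 at a 106% CAGR over four years
# implies a 2023 base of roughly 321 / (1 + 1.06)**4 ≈ $17.8M.
base_2023 = 321 / (2.06 ** 4)
print(round(base_2023, 1))                 # implied starting revenue, $M
print(round(cagr(base_2023, 321, 4), 2))   # recovers 1.06, i.e. 106%
```

The same helper works in reverse: plug in any reported base-year revenue and the 2027 estimate to see what annual growth rate the forecast assumes.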

From a valuation perspective, IonQ trades at a more reasonable 50x forward sales multiple compared to some other early-stage, high-growth technology stocks. If IonQ can sustain its triple-digit revenue growth and make continued technical progress, the company's valuation could expand considerably in the coming years. Analysts' average price target of $16.50 implies the stock could more than double from current levels over the next year.


Quantum Computing (NASDAQ:QUBT) develops hardware-agnostic quantum software solutions, enabling its software to run on various quantum machines.

QUBT reported a few encouraging data points in Q1. Operating expenses decreased 18% year-over-year to $6.4 million, driven by a 25% reduction in selling, general and administrative expenses. It also reported an increase in total assets and a decrease in total liabilities compared to year-end 2023. The company ended Q1 with $6.1 million in cash, up from $2.1 million at the end of 2023.

As for valuation, QUBT's current market cap of around $52 million looks quite steep for a pre-revenue company with consistent heavy losses. The stock trades at nearly 142x sales based on Q1's $27,000 revenue run-rate. Of course, this is not a meaningful valuation indicator given the tiny revenue base. But it does highlight how much future growth is baked into the stock price already.

The one analyst covering QUBT has an $8.75 price target, representing a staggering 1,426% upside from current levels. This price target is stale (from November 2023) and may not fully reflect market realities. Still, it could be one of those undiscovered quantum computing stocks that investors should look into further.

On Penny Stocks and Low-Volume Stocks: With only the rarest exceptions, InvestorPlace does not publish commentary about companies that have a market cap of less than $100 million or trade less than 100,000 shares each day. That's because these penny stocks are frequently the playground for scam artists and market manipulators. If we ever do publish commentary on a low-volume stock that may be affected by our commentary, we demand that InvestorPlace.com's writers disclose this fact and warn readers of the risks.


On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

On the date of publication, the responsible editor did not have (either directly or indirectly) any positions in the securities mentioned in this article.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.


planqc Secures €50M Series A Funding to Advance Atom-Based Quantum Computing – HPCwire

July 8, 2024 – planqc, a European leader in digital atom-based quantum computing, today announced securing €50 million (~US$54.17 million) in Series A funding. This significant investment round is led by the European family office CATRON Holding and the DeepTech & Climate Fonds (DTCF). Additional support comes from Bayern Kapital, the Max-Planck Foundation, and private investors, including UVC and Speedinvest. The round also includes a non-dilutive grant from Germany's Federal Ministry of Education and Research (BMBF).

Alexander Glätzle, CEO and co-founder of planqc, stated, "This latest investment round is a strong endorsement of our technology as a leading platform for quantum applications. The substantial backing places us in a perfect position to take on global competitors with our 'Made in Germany' quantum computers, targeting an emerging market valued at billions of euros."

planqc's unique technology, developed from award-winning research at the Max Planck Institute of Quantum Optics (MPQ), aims to rapidly advance industry-relevant quantum computers. The new financing will be used to establish a quantum computing cloud service and develop quantum software for applications in industries such as chemistry, healthcare, climate tech, automotive, and finance. Currently, planqc is using quantum machine learning to work on climate simulations and more efficient batteries for electric vehicles.

Dr. Sebastian Blatt, CTO of planqc, explained the core of planqc's technology: "Unlike most other companies, including Big Tech, we use individual atoms confined in crystals of light as qubits. This approach is the fast track to scaling the number of qubits and improving their quality, the prerequisites for being the first to deliver fault-tolerant quantum computers."

Founded in 2022 by scientists from MPQ and Ludwig-Maximilians-University Munich (LMU), planqc is located in the Munich Quantum Valley (MQV). The company has been commissioned by the German government to deploy a 1,000-qubit quantum computer at the Leibniz Supercomputing Centre. Additionally, planqc has secured a European tender to develop a quantum computer for the German Aerospace Center (DLR).

Dr. Torsten Löffler, Investment Director at DTCF, said on the occasion, "We are thrilled to invest in a startup that not only leads in high-impact technology but also enables further breakthroughs in the most pressing global computational challenges across various industries by offering access to the technology in the form of a quantum cloud computing service. planqc's impressive track record in securing contracts – in particular the DLR tender – and public grants within just 18 months of operations underscores the company's role as a frontrunner in the quantum computing sector, both in Europe and globally."

Prof. Immanuel Bloch, director at MPQ, added, "At MPQ, we have a strong tradition of supporting spin-offs from our institute and translating fundamental science to industry. planqc is the latest example and is based on our expertise in trapping, cooling and manipulating cold atoms and molecules. In the future, we look forward to extending this collaboration."

The MPQ, in collaboration with planqc, has already showcased scaling the number of neutral atoms used as qubits to 1,200. Reaching this milestone paves the way for fault-tolerant quantum computers. Further scaling to 10,000 or even 100,000 qubits is expected in the coming years. These systems will be capable of tackling previously unsolvable problems.

Quantum computers are poised to revolutionize the discovery of new materials and pharmaceuticals, address optimization challenges in climate research, industry, and transportation planning, and usher in a new era of cryptography. Quantum machine learning will unlock unprecedented applications for artificial intelligence, providing the scientific community with a new understanding of the world.

Dr. Anna Christmann, Coordinator of the Federal Government for German Aerospace and Commissioner for the Digital Economy and Start-ups at the Federal Ministry for Economic Affairs and Climate Action, said: "The success story of planqc demonstrates that innovative research today can become the forward-looking companies of tomorrow, strengthening our long-term competitiveness. We are proud that our ongoing commitment to innovation-friendly frameworks and easier access to venture capital is bearing fruit, and we continue to work every day to improve the start-up ecosystem in Germany and Europe."

"Neutral atoms are currently on the fast track to achieving fault-tolerant quantum computing," adds Hermann Hauser, representing the APEX Amadeus Technology fund, one of planqc's seed investors. "I am deeply impressed by planqc's rapid progress in this area. Securing over 50 million euros in contracts within 18 months and achieving Europe's first 1,200-atom qubit array are remarkable milestones for planqc and Europe's tech sovereignty. I eagerly anticipate our continued collaboration in this pioneering field."

"Quantum computers are one of the technologies that can offer unforeseeable added value by facilitating or even enabling the discovery, research and development of other future technologies. Examples include new medicines, sustainable battery technology and climate simulations. planqc's technologically promising approach, coupled with its existing technological maturity, convinced us," says Monika Steger, Managing Director of Bayern Kapital.

Benjamin Erhart, General Partner at UVC Partners, concludes, "The rapid progress of planqc since our initial investment is rooted both in planqc's scientific excellence and in its capability to attract world-class talent. In line with our strategy of building sustainable category leaders, it was a clear decision to double down on planqc significantly."

Source: planqc


EDF, Alice & Bob, Quandela, and CNRS team up to enhance quantum computing efficiency – Research & Development World

French electric utility EDF is partnering with quantum computing firms Quandela and the quizzically named Alice & Bob, alongside the French National Centre for Scientific Research (CNRS), to optimize energy consumption in quantum computing.

With the support of grant money from the public investment bank Bpifrance, the €6.1M initiative will also compare the energy requirements of quantum and classical high-performance computing systems.

The Energetic Optimisation of Quantum Circuits (OECQ) project will involve two phases. The first will compare the energy requirements of high-performance computing (HPC) systems with those of quantum computers.

The second will set its sights on curbing quantum computers' energy consumption. Although quantum computers today consume significantly less energy than traditional supercomputers, they still need significant power: roughly 25 kW of continuous draw, or about 600 kWh daily, according to one estimate. Optimizing the systems' energy use covers not only the quantum processing unit (QPU) itself but also the auxiliary technologies that power it.
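The two figures cited above — 25 kW and 600 kWh daily — describe the same machine in power and daily-energy terms respectively, as a one-line check shows:

```python
# Unit check on the quoted estimate: a machine drawing 25 kW continuously
# uses 25 kW x 24 h = 600 kWh of energy per day.

power_kw = 25
hours_per_day = 24
daily_energy_kwh = power_kw * hours_per_day
print(daily_energy_kwh)  # 600
```

For comparison, 600 kWh per day is on the order of the consumption of a few dozen households, whereas the supercomputers mentioned in the article draw megawatts.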

The OECQ project arrives at an important time. As interest in AI builds, some Big Tech firms are eyeing nuclear energy to support burgeoning demand. Increasing demand for AI services is driving a substantial increase in power requirements for data centers, potentially surpassing the annual energy consumption of many small nations. By 2030, data centers could consume up to 9% of electricity in the U.S., more than double what is being used now, as Quartz noted.

The eventual commercialization of quantum computing could reshape the computational landscape, but it presents a double-edged sword for energy consumption. On one hand, quantum computers promise to solve some problems (like factoring large numbers) exponentially faster than classical computers, potentially curbing overall energy use for some applications. On the other, they could enable entirely new classes of computationally intensive tasks, potentially driving overall energy demand even higher.

OECQ aims to optimize quantum computers' energy use through a partnership-based approach.

The project's second phase will focus on concrete optimization strategies. This stage will cover not only improving the efficiency of the quantum processing unit (QPU) itself but also minimizing the energy consumed by the cooling and control systems that support it.


Riverlane Releases Three-year Quantum Error Correction Roadmap – The Quantum Insider

Insider Brief

PRESS RELEASE Riverlane, the global leader in quantum error correction technology, has released its groundbreaking three-year roadmap for its Quantum Error Correction Stack, Deltaflow, starting in 2024 and culminating in a system where quantum computers can run one million (Mega) error-free Quantum Operations (QuOps) by 2026.

Today, the world's best quantum computers can perform at most a few hundred quantum operations before failure. This must scale to a million and, ultimately, trillions of operations to unlock the transformative real-world applications that can open a new age of human progress.

Every reliable quantum computer will need quantum error correction (QEC) to reach this point. Deltaflow provides commercial-grade quantum computers with a comprehensive, modular QEC solution, scaling as quantum computers scale and across every qubit type.

As an important near-term milestone, crossing into the MegaQuOp regime will present a pivotal moment when the power of quantum computers will surpass the reach of any classical supercomputer.

Maria Maragkou, VP of product and partnerships at Riverlane, said: Our three-year roadmap promises a series of major milestones on the journey to fault tolerance, culminating in enabling one million error-free quantum operations by the end of 2026. The MegaQuOp is a landmark goal as it puts us outside the regime any supercomputer can simulate.

The Riverlane roadmap introduces a standard industry measure of error-free quantum operations, or QuOps. The maximum QuOp capacity is a useful measure to assess the power of a quantum computer. It will play a similar role to floating-point operations per second, or FLOPs, commonly used to rank supercomputers.

Earl Campbell, VP of quantum science at Riverlane, said: The MegaQuOp is a critical milestone, but it's just the first step on the journey to full fault-tolerant quantum computing. We'll need to realise a trillion error-free operations to begin fully unlocking the higher value applications of quantum computing, but this is a critical set of milestones in that journey.

It's hard to predict, with certainty, what applications we'll unlock at the MegaQuOp until such a system is in the hands of innovators. But what's undeniable is that the industry can't push toward a billion and a trillion error-free operations without first reaching the MegaQuOp threshold.

A series of fast-paced and impressive demonstrations of quantum error correction from Quantinuum, ETH Zürich, Google, Harvard University, Yale University, IBM, Microsoft, Alice & Bob, and Riverlane (to name a few) have pushed the quantum error correction field further forward than anyone anticipated. But none of these demonstrations have integrated the fast, scalable real-time decoding processes needed for error-corrected quantum computation, which is exactly the problem Deltaflow will solve.

Maragkou added: Reaching the MegaQuOp relies on the entire quantum community pulling together to reach this goal. We are now partnering with the world's leading quantum hardware companies to make this happen sooner than many anticipate.

Riverlane's quantum error correction roadmap is also aligned with the plans of government bodies around the world that have pivoted their focus to fault-tolerant systems, requiring error correction. The UK government, for instance, has committed to build an accessible, UK-based quantum computer capable of running one trillion operations by 2035.

Full details of Riverlane's roadmap are available here.

Follow this link:
Riverlane Releases Three-year Quantum Error Correction Roadmap - The Quantum Insider

Read More..

Quantum Computing Industry to Witness Massive Growth – openPR

DataM Intelligence has published a new research report on "Quantum Computing Market Size 2024". The report explores comprehensive and insightful information about key factors such as regional growth, segmentation, CAGR, the business revenue status of top key players, and drivers. The purpose of this report is to provide a telescopic view of the current market size by value and volume, opportunities, and development status.

Get a Free Sample Research PDF - https://datamintelligence.com/download-sample/quantum-computing-market

The Quantum Computing market report majorly focuses on market trends, historical growth rates, technologies, and the changing investment structure. Additionally, the report shows the latest market insights, increasing growth opportunities, business strategies, and growth plans adopted by major players. Moreover, it contains an analysis of current market dynamics, future developments, and Porter's Five Forces Analysis.

Quantum computing is a field of computing that utilizes principles of quantum mechanics, such as superposition and entanglement, to process information. Unlike classical computers that use bits as binary units (0s and 1s), quantum computers use quantum bits or qubits. Qubits can exist in a superposition of states, allowing quantum computers to perform calculations on multiple possible states simultaneously. This parallelism enables quantum computers to potentially solve complex problems much faster than classical computers, especially in fields such as cryptography, optimization, and material science. Quantum computing is still in its early stages of development but holds promise for revolutionizing various industries by tackling computationally intensive tasks that are currently infeasible for classical computers.
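The amplitude picture described above can be sketched in a few lines of Python (an illustrative toy using plain numpy, not a full quantum simulator):

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Born rule: measurement probabilities are the squared amplitudes
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]

# An n-qubit register holds 2**n amplitudes -- the state space a
# classical simulator must track grows exponentially with n
n = 20
print(2 ** n)  # 1048576 amplitudes for just 20 qubits
```

The exponential growth in the second half is the "parallelism" the paragraph refers to: a classical machine must track every amplitude explicitly, while a quantum computer carries them natively.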

Forecast Growth Projected:

The Global Quantum Computing Market is anticipated to rise at a considerable rate during the forecast period, between 2024 and 2031. In 2023, the market grew at a steady rate, and with the rising adoption of strategies by key players, it is expected to continue rising over the projected horizon.

List of the Key Players in the Quantum Computing Market:

Telstra Corporation Limited, IonQ Inc., Silicon Quantum Computing, Huawei Technologies Co. Ltd., Alphabet Inc., Rigetti & Co Inc., Microsoft Corporation, D-Wave Systems Inc., Zapata Computing Inc

Segment Covered in the Quantum Computing Market:

By Offering: Hardware, Software, Service

By Deployment Type: On-premises, Cloud-based

By Technology: Quantum Dots, Trapped Ions, Quantum Annealing

By Application: Optimization, Simulation and Data Problems, Sampling, Machine Learning, Others

By End-User: Banking, Financial Services and Insurance, Aerospace & Defense, Manufacturing, Healthcare, IT & Telecom, Energy & Utilities, Others


Get customization in the report as per your requirements: https://datamintelligence.com/customize/quantum-computing-market

Regional Analysis:

The global Quantum Computing Market report focuses on six major regions: North America, Latin America, Europe, Asia Pacific, the Middle East, and Africa. The report offers detailed insight into new product launches, new technology evolutions, innovative services, and ongoing R&D. The report discusses a qualitative and quantitative market analysis, including PEST analysis, SWOT analysis, and Porter's five force analysis. The Quantum Computing Market report also provides fundamental details such as raw material sources, distribution networks, methodologies, production capacities, industry supply chain, and product specifications.

Chapter Outline:

Chapter 1: Introduces the scope of the report, executive summary of different market segments (by region, product type, application, etc.), including the market size of each market segment, future development potential, and so on. It offers a high-level view of the current state of the market and its likely evolution in the short-to-mid term and long term.

Chapter 2: Presents key insights, emerging trends, etc.

Chapter 3: Manufacturers' competitive analysis; a detailed analysis of the Quantum Computing manufacturers' competitive landscape, revenue market share, latest development plans, merger and acquisition information, etc.

Chapter 4: Provides profiles of key players, introducing the basic situation of the main companies in the market in detail, including product revenue, gross margin, product introduction, recent development, etc.

Chapter 5 & 6: Revenue of Quantum Computing at the regional and country level. This section provides a quantitative analysis of the market size and development potential of each region and its main countries, and introduces the market development, future development prospects, market space, and market size of each country in the world.

Chapter 7: Provides the analysis of various market segments by Type, covering the market size and development potential of each market segment, to help readers find the blue ocean market in different market segments.

Chapter 8: Provides the analysis of various market segments by Application, covering the market size and development potential of each market segment, to help readers find the blue ocean market in different downstream markets.

Chapter 9: Analysis of industrial chain, including the upstream and downstream of the industry.

Chapter 10: The main points and conclusions of the report.

Get a Free Sample PDF copy of the report @ https://datamintelligence.com/download-sample/quantum-computing-market

FAQs

How big is the Quantum Computing Market?

The Quantum Computing Market size was USD 650.1 million in 2023. It is expected to reach USD 8,788.8 million by 2031.

How fast is the Quantum Computing Market growing?

The Quantum Computing Market will exhibit a CAGR of 38.9% during the forecast period, 2024-2031.
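As a quick sanity check, the two FAQ figures are roughly consistent with the quoted CAGR, assuming the 2023 base value compounds over the eight years to 2031 (the exact horizon is our assumption):

```python
# Back-of-the-envelope check of the report's figures, in USD millions
start, end, years = 650.1, 8788.8, 8
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~38.5%, close to the reported 38.9%
```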

Read Latest Blog: https://www.datamintelligence.com/blogs/top-10-process-automation-companies-worldwide

Company Name: DataM Intelligence Contact Person: Sai Kiran Email: Sai.k@datamintelligence.com Phone: +1 877 441 4866 Website: https://www.datamintelligence.com

DataM Intelligence is a Market Research and Consulting firm that provides end-to-end business solutions to organizations, from research to consulting. We, at DataM Intelligence, leverage our top trademark trends, insights and developments to deliver swift and astute solutions to clients like you. We encompass a multitude of syndicated reports and customized reports with a robust methodology. Our research database features countless statistics and in-depth analyses across 6,300+ reports in 40+ domains, creating business solutions for 200+ companies across 50+ countries and catering to the key business research needs that influence the growth trajectory of our vast clientele.

This release was published on openPR.

Go here to see the original:
Quantum Computing Industry to Witness Massive Growth - openPR

Read More..

CMA CGM Group Invests in Pasqal, Companies to Explore Quantum Computing For Optimizing Maritime And Logistics Operations – The Quantum Insider

Insider Brief

PRESS RELEASE The CMA CGM Group, a global player in sea, land, air and logistics solutions, announces a strategic partnership with Pasqal, a world leader in neutral atom quantum computing.

As part of this collaboration, which aims to introduce cutting-edge quantum computing technologies into the Group's operations, CMA CGM also announces an investment in Pasqal.

A Quantum Computing Center of Excellence at TANGRAM to optimise transport and logistics

Among the objectives of this partnership, CMA CGM aims to leverage the power of quantum computing to enhance the efficiency, responsiveness, and adaptability of transport and logistics to market fluctuations. In particular, CMA CGM will seek to optimise container management, including their loading on ships.

Together, CMA CGM and Pasqal will establish a Quantum Computing Center of Excellence at TANGRAM, the Group's excellence center dedicated to training and innovation, with access to a quantum processor developed by Pasqal.

Training and events for CMA CGM Group staff members

The e-learning platform developed by Pasqal will be made available to CMA CGM staff members, who will be trained to improve understanding of quantum computing within the Group.

TANGRAM and Pasqal will jointly organize events dedicated to quantum computing, including use case workshops, technical presentations, and master classes. These events are intended to promote innovation and collaboration within CMA CGM and with its partners, clients, and suppliers.

Towards sustainable and innovative excellence

This initiative is part of the CMA CGM Group's strategy to transform its activities through innovation. It follows investments in technology companies and artificial intelligence initiatives such as Kyutai and Mistral AI. The partnership with Pasqal will enable CMA CGM to strengthen its position at the forefront of digitalisation in the transport and logistics sector.

Hadi Zablit, Executive Vice President for Information & Technology at the CMA CGM Group, states: This partnership with Pasqal will allow CMA CGM to apply quantum computing technologies to maritime transport and logistics, reinforcing our Group's position as a leader in the digital transformation of our industry. With Pasqal, we aim to unlock the full potential of quantum computing for greater operational efficiency, serving our customers.

All of Pasqal's priorities are reflected in this partnership with CMA CGM: developing concrete use cases with leading industrial players, accelerating the understanding of the potential of quantum computing, and continuing cutting-edge research to serve the quantum ecosystem. Everyone at Pasqal is eager to start collaborating with the teams of this historic French company, says Georges-Olivier Reymond, CEO of Pasqal.

More here:
CMA CGM Group Invests in Pasqal, Companies to Explore Quantum Computing For Optimizing Maritime And Logistics Operations - The Quantum Insider

Read More..

Simulating the universes most extreme environments with utility-scale quantum computation – IBM

The Standard Model of Particle Physics encapsulates nearly everything we know about the tiny quantum-scale particles that make up our everyday world. It is a remarkable achievement, but it's also incomplete, rife with unanswered questions. To fill the gaps in our knowledge, and discover new laws of physics beyond the Standard Model, we must study the exotic phenomena and states of matter that don't exist in our everyday world. These include the high-energy collisions of particles and nuclei that take place in the fiery heart of stars, in cosmic ray events occurring all across Earth's upper atmosphere, and in particle accelerators like the Large Hadron Collider (LHC) at CERN or the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.

Computer simulations of fundamental physics processes play an essential role in this research, but many important questions require simulations that are much too complex for even the most powerful classical supercomputers. Now that utility-scale quantum computers have demonstrated the ability to simulate quantum systems at a scale beyond exact or brute force classical methods, researchers are exploring how these devices might help us run simulations and answer scientific questions that are inaccessible to classical computation. In two recent papers published in PRX Quantum (PRX) and Physical Review D (PRD), our research group did just that, developing scalable techniques for simulating the real-time dynamics of quantum-scale particles using the IBM fleet of utility-scale, superconducting quantum computers.

The techniques we've developed could very well serve as the building blocks for future quantum computer simulations that are completely inaccessible to both exact and even approximate classical methods, simulations that would demonstrate what we call quantum advantage over all known classical techniques. Our results provide clear evidence that such simulations are potentially within reach of the quantum hardware we have today.

We are a team of researchers from the University of Washington and Lawrence Berkeley National Laboratory who have spent years investigating the use of quantum hardware for simulations of quantum chromodynamics (QCD).

This work was supported, in part, by the U.S. Department of Energy grant DE-FG02-97ER-41014 (Farrell), by U.S. Department of Energy, Office of Science, Office of Nuclear Physics, InQubator for Quantum Simulation (IQuS) under Award Number DOE (NP) Award DE-SC0020970 via the program on Quantum Horizons: QIS Research and Innovation for Nuclear Science (Anthony Ciavarella, Roland Farrell, Martin Savage), the Quantum Science Center (QSC) which is a National Quantum Information Science Research Center of the U.S. Department of Energy (DOE) (Marc Illa), and by the U.S. Department of Energy (DOE), Office of Science under contract DE-AC02-05CH11231, through Quantum Information Science Enabled Discovery (QuantISED) for High Energy Physics (KA2401032) (Anthony Ciavarella).

This work is also supported, in part, through the Department of Physics and the College of Arts and Sciences at the University of Washington.

This research used resources of the Oak Ridge Leadership Computing Facility (OLCF), which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.

We acknowledge the use of IBM Quantum services for this work.

This work was enabled, in part, by the use of advanced computational, storage and networking infrastructure provided by the Hyak supercomputer system at the University of Washington.

This research was done using services provided by the OSG Consortium, which is supported by the National Science Foundation awards #2030508 and #1836650.

One prominent example of these challenges comes from the field of collider physics. Physicists use colliders like the LHC to smash beams of particles and atomic nuclei into each other at extraordinarily high energies, recreating the kinds of collisions that take place in stars and cosmic ray events. Collider experiments give physicists the ability to observe how matter behaves in the universes most extreme environments. The data we collect from these experiments help us tighten the constraints of the Standard Model and can also help us discover new physics beyond the Standard Model.

Let's say we want to use the data from collider experiments to identify new physics theories. To do this, we must be able to accurately predict the way known physics theories like QCD contribute to the exotic physics processes that occur in collider runs, and we must be able to quantify the uncertainties of the corresponding theoretical calculations. Performing these tasks requires detailed simulations of systems of fundamental particles. These simulations are impossible to achieve with classical computation alone, but should be well within reach for a sufficiently capable quantum computer.

Quantum computing hardware is making rapid progress toward the day when it will be capable of simulating complex systems of fundamental particles, but we can't just sit back and wait for quantum technology to reach maturity. When that day comes, we'll need to be ready with scalable techniques for executing each step of the simulation process.

The research community is already beginning to make significant progress in this field, with most efforts today focused on simulations of simplified, low-dimensional models of QCD and other fundamental physics theories. This is exactly what our research group has been working on, with our experiments primarily centering on simulations of the widely used Schwinger model, a one-dimensional model of QCD that describes how electrons and positrons behave and interact through the exchange of photons.

In a paper submitted to arXiv in 2023, and published in PRX Quantum this past April, we used the Schwinger model to demonstrate the first essential step in building future simulations of high-energy collisions of matter: preparing a simulation of the quantum vacuum state in which particle collisions would occur. Our follow-up to that paper, published in PRD in June, shows techniques for performing the next step in this process: preparing a beam of particles in the quantum vacuum.

More specifically, that follow-up paper shows how to prepare hadron wavepackets in a 1-dimensional quantum simulation and evolve them forward in time. In this context, you can think of a hadron as a composite particle made up of a positron and an electron, bound together by something analogous to the strong force that binds neutrons and protons together in nuclei.

Due to the uncertainty principle, it is impossible to precisely know both the position and momentum of a particle. The best you can do is to create a wavepacket, a region of space over which a particle will appear with some probability and with a range of different momenta. The uncertainty in momentum causes the wavepacket to spread out or propagate across some area of space.

By evolving our hadron wavepacket forward in time, we effectively create a simulation of pulses or beams of hadrons moving in this 1-dimensional system, just like the beams of particles we smash into each other in particle colliders. The wavepacket we create has an equal probability of propagating in any direction. However, since we're working in 1-dimensional space, essentially a straight line, it's more accurate to say the particle is equally likely to propagate to the left or to the right.
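The spreading described above can be illustrated with the textbook formula for the width of a free Gaussian wavepacket (a generic quantum-mechanics example in natural units, not the paper's lattice model):

```python
import numpy as np

hbar = m = 1.0   # natural units
sigma0 = 1.0     # initial position uncertainty

def width(t):
    # The momentum uncertainty ~ hbar/(2*sigma0) makes the packet
    # spread out over time, exactly as the uncertainty principle demands
    return sigma0 * np.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0**2)) ** 2)

for t in (0.0, 2.0, 4.0):
    print(t, round(width(t), 3))
```

The width grows roughly linearly at late times, which is the 1-D analogue of the light-cone-bounded spread visible in the paper's particle-density plots.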

We've established that our primary goal is to simulate the dynamics of a composite hadron particle moving through the quantum vacuum in one-dimensional space. To achieve this, we'll need to prepare an initial state with the hadron situated on a simplified model of space made up of discrete points, also known as a lattice. Then, we'll have to perform what we call time evolution so we can see the hadron move around and study its dynamics.

Our first step is to determine the quantum circuits we'll need to run on the quantum computer to prepare this initial state. To do this, we developed a new state preparation algorithm, Scalable Circuits ADAPT-VQE. This algorithm uses the popular ADAPT-VQE algorithm as a subroutine, and is able to find circuits for preparing the state with the lowest energy (i.e., the ground state) as well as a hadron wavepacket state. A key feature of this technique is the use of classical computers to determine circuit blocks for preparing a desired state on a small lattice that can be systematically scaled up to prepare the desired state on a much larger lattice. These scaled circuits cannot be executed exactly on a classical computer and are instead executed on a quantum computer.
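For intuition, here is a toy numpy sketch of the plain ADAPT-VQE subroutine the authors build on. The 2-qubit transverse-field Ising Hamiltonian and the operator pool are our own illustrative assumptions, not the paper's model or its Scalable Circuits construction: the loop grows the ansatz one pool operator at a time, always picking the operator with the largest energy gradient.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)
I2 = np.eye(2, dtype=complex)

# Invented 2-qubit transverse-field Ising Hamiltonian (NOT the Schwinger model)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))

# Operator pool: anti-Hermitian generators, so expm(theta * A) is unitary
pool = [1j * np.kron(Y, I2), 1j * np.kron(I2, Y),
        1j * np.kron(X, Y), 1j * np.kron(Y, X)]

ref = np.zeros(4, dtype=complex)
ref[0] = 1.0  # reference state |00>

def state(thetas, ops):
    psi = ref
    for t, A in zip(thetas, ops):
        psi = expm(t * A) @ psi
    return psi

def energy(thetas, ops):
    psi = state(thetas, ops)
    return float(np.real(psi.conj() @ H @ psi))

ops, thetas = [], []
for _ in range(4):  # ADAPT outer loop: grow the ansatz one operator at a time
    psi = state(thetas, ops)
    # Energy gradient w.r.t. a fresh parameter at zero is <psi|[H, A]|psi>
    grads = [abs(psi.conj() @ (H @ A - A @ H) @ psi) for A in pool]
    if max(grads) < 1e-8:
        break  # no pool operator can lower the energy further
    ops.append(pool[int(np.argmax(grads))])
    thetas = list(minimize(energy, thetas + [0.0], args=(ops,)).x)

print(energy(thetas, ops), np.linalg.eigvalsh(H)[0])  # variational vs exact
```

The "Scalable Circuits" step in the paper then reuses circuit blocks found this way on small lattices to build state-preparation circuits for much larger lattices; that scaling step is not shown here.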

Once we have the initial state, our next step is to apply the time evolution operator. This is a mathematical tool that allows us to take a quantum state as it exists at one point in time and evolve it into the state that corresponds to some future point in time. In our experiment, we use the conventional Trotterized time evolution, where you split up the different mathematical terms representing the Hamiltonian energy equation that describes the quantum system and convert each term into quantum gates in your circuit.
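Trotterized time evolution can be sketched with plain linear algebra (using an invented pair of non-commuting terms, not the Schwinger Hamiltonian): exponentiate each term separately, repeat the product, and the approximation error shrinks as the number of steps grows.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
I2 = np.eye(2, dtype=complex)

# Two non-commuting terms of an illustrative 2-qubit Hamiltonian
H1 = np.kron(Z, Z)                            # interaction term
H2 = 0.5 * (np.kron(X, I2) + np.kron(I2, X))  # transverse-field term
H = H1 + H2

t, n_steps = 1.0, 100
dt = t / n_steps

exact = expm(-1j * H * t)
step = expm(-1j * H1 * dt) @ expm(-1j * H2 * dt)  # one first-order Trotter step
trotter = np.linalg.matrix_power(step, n_steps)

# First-order Trotter error scales like (t**2 / n_steps) * ||[H1, H2]||
err = np.linalg.norm(trotter - exact, 2)
print(err)
```

On hardware, each exponential factor is what gets compiled into a sequence of native quantum gates.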

This, however, is where we run into a problem. Even the simplified Schwinger model states that interactions between individual matter particles in our system are all-to-all. In other words, every matter particle in the system must interact with every other particle in the system, meaning every qubit in our circuit needs to interact with every other qubit.

This poses a few challenges. For one thing, an all-to-all interaction causes the number of quantum gates required for time evolution to scale quadratically with the simulation volume, making these circuits much too large to run on current quantum hardware. Another key challenge is that, as of today, even the most advanced IBM Quantum processor allows only for native interactions between neighboring qubits; so, for example, the fifth qubit in an IBM Quantum Heron processor can technically interact only with qubits 4 and 6. While there are special techniques that let us get around this linear connectivity and simulate longer range interactions, doing this in an all-to-all setting would make it so the required two-qubit gate depth also scales quadratically in the simulation volume.

To get around this problem, we used the emergent phenomenon of confinement, one of the features that the Schwinger model shares with QCD. Confinement tells us that interactions are significant only over distances around the size of the hadron. This motivated our use of approximate interactions, where the qubits need to interact only with at most next-to-next-to-nearest neighbor qubits, e.g., qubit 5 needs to interact only with qubits 2, 3, 4, 6, and 7. We established a formalism for constructing a systematically improvable interaction and turned that interaction into a sequence of gates that allowed us to perform the time evolution.
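A quick count of interaction pairs makes the savings concrete (a back-of-the-envelope sketch: the range-3 cutoff mirrors the next-to-next-to-nearest-neighbor truncation, and the 112-qubit size matches the experiment):

```python
# All-to-all interactions scale quadratically with lattice size N,
# while a truncated (range-3) interaction scales linearly.
def pairs_all_to_all(n):
    return n * (n - 1) // 2

def pairs_banded(n, k=3):  # qubit i interacts only with qubits i+1 .. i+k
    return sum(min(k, n - 1 - i) for i in range(n))

for n in (16, 112):
    print(n, pairs_all_to_all(n), pairs_banded(n))
```

At 112 qubits the truncation cuts the pair count from 6,216 to 330, which is the kind of reduction that brings the circuit depth within reach of current hardware.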

Once the time evolution is complete, all we need to do is measure some observable in our final state. In particular, we wanted to see the way our simulated hadron particle propagates on the lattice, so we measured the particle density. At the beginning of the simulation (t=0), the hadron is localized in a specific area. As it evolves forward in time it propagates with a spread that is bounded by the speed of light (a 45° angle).

This figure depicts the results of our simulation of hadron dynamics. The time direction is charted on the left-hand Y-axis, and the points on the lattice, qubits 0 to 111, are charted on the X-axis. The colors correspond to the particle density, with higher values (lighter colors) corresponding to a higher probability of finding a particle at that location. The left half of this figure shows the results of error-free approximate classical simulation methods, while the right half shows the results obtained from performing simulations on real quantum hardware (specifically, the `ibm_torino` system). In an error-free simulation, the left and right halves would be mirror images of each other. Deviations from this are due to device errors.

Keeping in mind that this is a simplified simulation in one spatial dimension, we can say this behavior mimics what we would expect to see from a hadron propagating through the vacuum, such as the hadrons produced by a device like the Large Hadron Collider.

Utility-scale IBM quantum hardware played an essential role in enabling our research. Our experiment used 112 qubits on the IBM Quantum Heron processor ibm_torino to run circuits that are impossible to simulate with brute force classical methods. However, equally important was the Qiskit software stack, which provided a number of convenient and powerful tools that were absolutely critical in our simulation experiments.

Quantum hardware is extremely susceptible to errors caused by noise in the surrounding environment. In the future, IBM hopes to develop quantum error correction, a capability that allows quantum computers to correct errors as they appear during quantum computations. For now, however, that capability remains out of reach.

Instead, we rely on quantum error suppression methods to anticipate and avoid the effects of noise, and we use quantum error mitigation post-processing techniques to analyze the quantum computers noisy outputs and deduce estimates of the noise-free results.

In the past, leveraging these techniques for quantum computation could be enormously difficult, often requiring researchers to hand-code error suppression and error mitigation solutions specifically tailored to both the experiments they wanted to run, and the device they wanted to use. Fortunately, the recent advent of software tools like the Qiskit Runtime primitives have made it much easier to get meaningful results out of quantum hardware while taking advantage of built-in error handling capabilities.

In particular, we relied heavily on the Qiskit Runtime Sampler primitive, which calculates the probabilities or quasi-probabilities of bitstrings being output by quantum circuits, and makes it easy to compute physical observables like the particle density.

Sampler not only simplified the process of collecting these outputs, but also improved their fidelity by automatically inserting an error suppression technique known as dynamical decoupling into our circuits and by automatically applying quantum readout error mitigation to our results.

Obtaining accurate, error-mitigated results required running many variants of our circuits. In total, our experiment involved roughly 154 million "shots" on quantum hardware, and we couldn't have achieved this by running our circuits one by one. Instead, we used Qiskit execution modes, particularly Session mode, to submit circuits to quantum hardware in efficient multi-job workloads. The sequential execution of many circuits meant that the calibration and noise on the device was correlated between runs, facilitating our error mitigation methods.

Sending circuits to IBM Quantum hardware while taking advantage of the Sampler primitive and Session mode required just a few lines of code, truly as simple as:
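A representative sketch of that pattern, assuming the qiskit-ibm-runtime package (exact class names and signatures vary between versions, and `circuits` stands in for the team's list of transpiled circuits):

```python
from qiskit_ibm_runtime import QiskitRuntimeService, Sampler, Session

service = QiskitRuntimeService()
backend = service.backend("ibm_torino")

# One Session keeps the jobs together on the device, so calibration
# and noise stay correlated across the runs
with Session(backend=backend) as session:
    sampler = Sampler(session=session)
    result = sampler.run(circuits).result()
```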

Our team did several runs both with and without Qiskit Runtime's built-in error mitigation, and found that methods offered natively via the Sampler primitive significantly improved the quality and accuracy of our results. In addition, the flexibility of Session and Sampler allowed us to add additional, custom layers of error mitigation like Pauli twirling and operator decoherence renormalization. The combination of all these error mitigation techniques enabled us to successfully perform a quantum simulation with 13,858 CNOTs and a CNOT depth of 370!

What is CNOT depth? CNOT depth is an important measure of the complexity of quantum circuits. A CNOT gate, or controlled NOT gate, is a quantum logic gate that takes two qubits as input, and performs a NOT operation that flips the value of the second (target) qubit depending on the value of the first (control) qubit. CNOT gates are an important building block in many quantum algorithms and are the noisiest gate on current quantum computers. CNOT depth of a quantum simulation refers to the number of layers of CNOT gates across the whole device that have to be executed (each layer can have multiple CNOT gates acting on different qubits, but they can be applied at the same time, i.e., in parallel). Without the use of quantum error handling techniques like those offered by the Qiskit software stack, reaching a CNOT depth of 370 would be impossible.
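The layering idea can be sketched as a tiny scheduling function (a generic illustration of depth counting, not Qiskit's own depth calculation):

```python
# ASAP layering: each CNOT goes in the first layer after the last layer
# that used either of its qubits; the depth is the deepest layer reached.
def cnot_depth(cnots):
    last_layer = {}  # qubit -> index (1-based) of the last layer using it
    depth = 0
    for ctrl, tgt in cnots:
        layer = max(last_layer.get(ctrl, 0), last_layer.get(tgt, 0)) + 1
        last_layer[ctrl] = last_layer[tgt] = layer
        depth = max(depth, layer)
    return depth

print(cnot_depth([(0, 1), (2, 3)]))          # disjoint pairs run in parallel: 1
print(cnot_depth([(0, 1), (1, 2), (2, 3)]))  # a chain must run sequentially: 3
```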

Over the course of two research papers, we have demonstrated techniques for using utility-scale quantum hardware to simulate the quantum vacuum, and to simulate the dynamics of a beam of particles on top of that vacuum. Our research group is already hard at work on the logical next step in this progression: simulating collisions between two particle beams.

If we can simulate these collisions at high enough energy, we believe we can demonstrate the long-sought goal of quantum computational advantage. Today, no classical computing method is capable of accurately simulating the collision of two particles at the energies we've set our sights on, even using simplified physics theories like the Schwinger model. However, our research so far indicates that this task could be within reach for near-term utility-scale quantum hardware. This means that, even without achieving full quantum error correction, we may soon be able to use quantum hardware to build simulations of systems of fundamental particles that were previously impossible, and use those simulations to seek answers to some of the most enduring mysteries in all of physics.

At the same time, IBM hasn't given up hope for quantum error correction, and neither have we. Indeed, we've poured tremendous effort into ensuring that the techniques we've developed in our research are scalable, such that we can transition them from the noisy, utility-scale processors we have today to the hypothetical error-corrected processors of the future. If achieved, the ability to perform error correction in quantum computations will make quantum computers considerably more powerful, and open the door to rich, three-dimensional simulations of incredibly complex physics processes. With those capabilities at our fingertips, who knows what we'll discover?

See the rest here:
Simulating the universes most extreme environments with utility-scale quantum computation - IBM

Read More..