Category Archives: Quantum Computing

D-Wave to deploy on-premise quantum computer at Davidson Technologies’ HQ in Alabama – DatacenterDynamics

Quantum computing firm D-Wave is to deploy an on-premise computer at a customer's facility in Alabama.

Announced this week, the deployment will place an Advantage system on-premise at a new site operated by defense company Davidson Technologies.

Building on the companies' existing relationship, the QPU will be located at Davidson's new global headquarters in Huntsville and placed in a secure facility developed to run sensitive applications. Further details, including timelines for deployment, weren't shared.

"Davidson has a track record of embracing emerging and advanced technologies to address unique and critical national defense challenges and protect our nation's interests," said Dr. Alan Baratz, CEO of D-Wave. "By placing an Advantage quantum computing system onsite at Davidson's headquarters and creating a unique environment for operation, we're opening up opportunities to tackle the US government's most pressing computational problems."

Founded in 1996, Davidson Technologies provides software services to aerospace and missile defense customers, including weapon systems cybersecurity, software development, modeling and simulation, and AI offerings.

"By housing the second US-based Advantage quantum computer at our facility in Huntsville, we will provide our government customers with unprecedented access to quantum computing technology in our facility," said Dale Moore, president of Davidson Technologies. "We're honored to host a D-Wave Advantage computer and believe this will greatly advance quantum's role in national security, as we support the critical mission of defending the US and its allies, both at home and abroad."

D-Wave said the system, initially accessible to all D-Wave customers located in select countries via the company's cloud service, may be exclusively dedicated to sensitive application development and operations once Davidson's secure facility is established.

The system will be the second US-based D-Wave Advantage quantum computer to be deployed. It is expected to become the first in the US certified for sensitive applications using annealing quantum computing.

D-Wave has a multi-year partnership with the University of Southern California (USC). Under the recently renewed agreement, the USC Viterbi School of Engineering will continue to house a D-Wave Advantage quantum computer. The center has housed several generations of D-Wave's quantum systems.

Like many other quantum companies, D-Wave offers access to its systems through its own cloud and public cloud providers, and sells hardware for on-premise deployments.

While cloud access is still the most common method of accessing quantum hardware, on-premise deployments are becoming more widespread, especially at supercomputing centers and universities. A small number have been deployed at colocation data centers and enterprise facilities.


Quantinuum’s Genon Braiding Technique Adds Another Stitch in Company’s Fault-Tolerant Research Tapestry – The Quantum Insider

Insider Brief

Quantinuum, which has been on a quantum AI and fault-tolerance research roll this month, reports on yet another research development, this time detailing an advanced quantum error correction technique that the team says is pushing the industry ever closer toward practical quantum computing. The work also hints at the way the company is tying together its multiple research approaches, from topological quantum computing, to record-breaking fidelity that takes advantage of mid-circuit measurement, to quantum error correction, to build quantum computers that can make calculations in spite of environmental noise.

This latest work, detailed in a company blog post and fully covered in a complementary research paper posted on arXiv, focuses on the innovative technique of genon braiding to execute fault-tolerant gates using efficient codes.

The work aims right at quantum computing's Achilles' heel: errors. In classical computing, hardware robustness and error correction methods like bit duplication make achieving fault tolerance relatively straightforward. However, quantum computing faces unique challenges. "Quantum hardware is far more delicate, requiring precise control of quantum states, and the no-cloning theorem prohibits direct copying of qubits," the team explains.

Ilyas Khan, Quantinuum founder and Chief Product Officer, writes in an email interview: "One of the more interesting developments that I would highlight here is the emerging and very obvious point that theoretical ideas can only really be instantiated with full access to the metal and that real acceleration will benefit from an early push into genuine co-design. The Genon code is astonishingly impactful and we will be hearing a lot more about its ability to lift performances across the board, and this will be highlighted alongside all our other work in this field ranging from error mitigation all the way through to examining natively fault tolerant qubits that we create through the exploitation of our hardware and generate non-abelian anyons."

Genon Braiding: A New Approach

Quantinuum's advance focuses on genon braiding, a method that leverages the unique properties of topological order to perform robust quantum operations.

In this braiding technique, researchers manipulate genons, which are twists or defects in topological codes. By braiding these genons around each other, logical quantum information can be encoded and manipulated fault-tolerantly, which makes it easier to implement high-rate error-correcting codes and, eventually, means fewer physical qubits per logical qubit.

This advance can significantly impact scaling, making quantum computers more practical and efficient, and demonstrates low overheads compared to quantum error correction approaches in the current literature, according to the team at Quantinuum, a full-stack quantum computing company formed from Honeywell and Cambridge Quantum in 2021.

The Theory Behind Genon Braiding

To take a step back: the research paper delves into the theoretical foundations of genon braiding. It explains that one of the main challenges in quantum error correction is balancing the protection of quantum information from errors with the ability to manipulate this protected information. Achieving this balance is essential for performing computational tasks, the paper notes.

The concept of genons, or twists in topological codes, plays a central role in this technique. It's also a point where the beauty of scientific discovery meets the efficiency of practical computational power, according to the team.

The scientists write in the post: "What exactly genons are, and how they are braided, is beautiful and complex mathematics but the implementation is surprisingly simple. Inter-block logical gates can be realized through simple relabeling and physical operations. Relabeling, i.e. renaming qubit 1 to qubit 2, is very easy in Quantinuum's QCCD architecture, meaning that this approach to gates will be less noisy, faster, and have less overhead. This is all due to our architecture's native ability to move qubits around in space, which most other architectures can't do."
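To make the relabeling point concrete, here is a minimal illustrative sketch, not Quantinuum's actual tooling: it treats a braid-induced logical gate as a pure permutation of the compiler's qubit map, so no physical two-qubit gates (and therefore no extra gate noise) are needed. The function and variable names are hypothetical.

```python
# Illustrative sketch (not Quantinuum's code): in a QCCD trapped-ion machine,
# "relabeling" qubit i as qubit j is a bookkeeping update; the compiler
# permutes its qubit map instead of applying physical SWAP gates.

def apply_relabeling(qubit_map: dict[str, int], permutation: dict[int, int]) -> dict[str, int]:
    """Apply a logical-gate-inducing permutation by renaming qubits.

    qubit_map:   logical name -> physical ion index
    permutation: old index -> new index (e.g., from a braid schedule)
    """
    return {name: permutation.get(idx, idx) for name, idx in qubit_map.items()}

qubit_map = {"q0": 0, "q1": 1, "q2": 2}
braid_permutation = {0: 1, 1: 0}  # hypothetical braid: exchange ions 0 and 1

print(apply_relabeling(qubit_map, braid_permutation))
# {'q0': 1, 'q1': 0, 'q2': 2}: zero physical two-qubit gates applied,
# which is why the approach is faster and less noisy on this architecture.
```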

To dive deeper, the genons are associated with 3-valent vertices, or three-pointed connections, in topological codes and can be manipulated through braiding to perform logical Clifford gates. These gates are a set of operations that can be performed on encoded qubits within a quantum error-correcting code. They are important because they can be implemented fault-tolerantly; in other words, they can function correctly even when there are small errors.

Genons are not arbitrary constructs but are derived from specific symmetrical properties in the quantum system, according to the paper. The team explains that genons arise from the properties of domain walls in topological order, which exhibit symmetry that can be exploited for fault-tolerant operations.

Practical Implementation on Quantinuum's H1-1 Quantum Computer

Quantinuum has successfully demonstrated genon braiding on their H1-1 quantum computer, a trapped-ion device that allows for high connectivity through ion transport. This capability is essential for efficiently realizing the permutations required for implementing fault-tolerant gates without significant overhead, according to the blog post and the research paper.

The H1-1 device uses 20 ytterbium ions for physical qubits and 20 barium ions for sympathetic cooling, executing gates via stimulated Raman transitions with high fidelity. This setup enables the realization of genon protocols with minimal noise and high efficiency, showcasing the potential of genon braiding for practical quantum error correction, the team writes in their arXiv paper.

Experimental Results and Proof-of-Concept

Quantinuum's team conducted a series of proof-of-concept experiments on the H1-1 system. They demonstrated all single-qubit Clifford operations using genon braiding and performed two types of two-qubit logical gates equivalent to CNOTs. These experiments demonstrate that genon braiding works in practice and is complementary to, and therefore enhances, well-understood codes such as the Steane code, according to the blog post.

The research also includes the construction of a symplectic double, which effectively doubles the number of qubits involved, allowing logical Clifford operations on the base code to be lifted to logical operations on the total code. This method enhances fault tolerance and demonstrates the versatility of the genon braiding technique, the paper states.

Implications for Quantum Computing

The work is scientifically beautiful and practically important, and there's still more.

The work demonstrates the importance of co-design, where error correction codes are tailored to specific hardware capabilities. Quantinuums approach leverages their unique hardware architecture to implement these advanced error correction techniques efficiently.

This is part of a larger effort to find fault-tolerant architectures tailored to Quantinuum's hardware. Quantinuum scientist and pioneer of this work, Simon Burton, put it quite succinctly: "Braiding genons is very powerful. Applying these techniques might prove very useful for realizing high-rate codes, translating to a huge impact on how our computers will scale," the team explains in their blog post.

The Quantinuum Connection

There is some sense that the 370+ scientists and engineers at Quantinuum are engaged in a little scientific braiding of their own, tying multiple research strands from across the organization together to knit a multilayered approach to fault tolerance.

For example, this work is directly related to and inspired by Quantinuum's prior research on non-Abelian anyons, which are particles that exhibit unique quantum statistics and are essential for certain topological quantum computing approaches. By leveraging insights from these previous studies, Quantinuum has been able to advance the understanding of genon braiding.

This new research also fits into Quantinuum's broader effort in error mitigation and correction, showcasing their ongoing commitment to developing robust and scalable quantum computing solutions that can withstand errors and enhance computational reliability.

Quantinuum researchers involved in the work include Simon Burton, Elijah Durso-Sabina and Natalie C. Brown, who report to Khan.

For a fuller and more technical explanation of the work than this summary can provide, please review the paper in its entirety.


Zapata AI Publishes Novel Research in PRX Quantum on the Future Potential of Quantum Computing – StockTitan

Zapata AI has published a groundbreaking paper titled 'Early Fault-Tolerant Quantum Computing' in the esteemed journal PRX Quantum, offering a framework for utilizing today's imperfect quantum computers to solve future complex problems. The research, led by CTO Yudong Cao, bridges the gap between current noisy quantum devices and future fault-tolerant systems, introducing the concept of Early Fault Tolerant Quantum Computing (EFTQC). This publication marks a significant milestone, as it is the second time this year that Zapata AI's research has been featured in a prestigious journal, with a previous paper published in Nature Communications. The company also highlighted its leadership in quantum computing at the Qubits conference, discussing the integration of generative AI with quantum computing for industrial applications like drug discovery and financial services.


The peer-reviewed research provides a robust framework for reasoning about how today's imperfect quantum computers can be scaled up to solve impactful problems in the future.

BOSTON, June 18, 2024 (GLOBE NEWSWIRE) -- Zapata Computing, Inc. (Zapata AI) (Nasdaq: ZPTA), the Industrial Generative AI company, today announced that its paper on early fault-tolerant quantum algorithms has been published in PRX Quantum, a highly selective journal that publishes research with an emphasis on outstanding and lasting impact. The paper, titled "Early Fault-Tolerant Quantum Computing," provides a unique, quantitative perspective that bridges the theoretical ideal of fault-tolerant quantum computation and the present reality of noisy, imperfect quantum computers. The team argues that the path to scalable fault-tolerant quantum computers of the future will likely go through a phase called Early Fault-Tolerant Quantum Computing (EFTQC).

The paper was published online on June 17, 2024.

"Quantum devices today are noisy and error prone while being on the verge of error correction. Quantum computers of the (distant) future can carry out any amount of error correction needed to keep the computation running. But how we get from here to there while keeping an eye on the usefulness of the quantum devices is still not entirely mapped out," said Yudong Cao, Chief Technology Officer and co-founder of Zapata AI. "This research charts a path forward beyond the current NISQ (noisy intermediate-scale quantum) era and considers how we can design algorithms that leverage the next generation of quantum devices with some degree of error correction. We believe this new class of EFTQC algorithms will bring us closer to a practical quantum advantage for industrial applications across industries."

Earlier this week, Zapata AI presented its continued leadership in quantum computing at the Qubits conference, hosted by Zapata AI hardware partner D-Wave Quantum Inc. (D-Wave) (NYSE: QBTS). In a fireside chat hosted by The Boston Globe's Aaron Pressman, Cao and Chief Revenue Officer Jon Zorio shared how generative AI can be enhanced by quantum computing and quantum-inspired techniques leveraging GPUs, as well as the implications for industrial applications such as drug discovery and other use cases in industries ranging from telecom to financial services.

The published research in PRX Quantum marks the second time this year Zapata AI's innovative research was published in a prestigious academic journal. The Company also announced that its foundational research on generator-enhanced optimization (GEO) was published in the esteemed Nature Communications.

Cao concluded, "Having research published in premier and highly esteemed research journals like PRX Quantum and Nature Communications demonstrates the quality of our research team, the capabilities of our platform, and the role Zapata AI will play in advancing the cutting edge at the intersection of AI and quantum in a scientifically rigorous manner. Our mission is to solve the most complex problems industries face, and we will not stop until we do so."

About Zapata AI

Zapata AI is an Industrial Generative AI company, revolutionizing how enterprises solve complex operational challenges with its powerful suite of generative AI software applications and cutting-edge reference architecture. By combining numerical and text-based generative AI models and custom software applications to power industrial-scale solutions, Zapata AI enables enterprises and government entities to drive growth, cost savings through operational efficiencies, and critical operational insights. With its proprietary data science and engineering techniques, and the Orquestra platform, Zapata AI is accelerating Generative AI's impact across industries by delivering solutions which are higher performing, less costly, and more accurate and expressive than current, classical approaches to AI. The Company was founded in 2017 and is headquartered in Boston, Massachusetts.

Forward Looking Statements Certain statements made herein are not historical facts but are forward-looking statements for purposes of the safe harbor provisions under The Private Securities Litigation Reform Act of 1995. Forward-looking statements generally are accompanied by words such as "believe," "may," "will," "intend," "expect," "should," "would," "plan," "predict," "potential," "seem," "seek," "future," "outlook," and similar expressions that predict or indicate future events or trends or that are not statements of historical matters. These forward-looking statements include, but are not limited to, statements regarding future events and other statements that are not historical facts. These statements are based on the current expectations of Zapata AI's management and are not predictions of actual performance. These forward-looking statements are provided for illustrative purposes only and are not intended to serve as, and must not be relied on, by any investor as a guarantee, an assurance, a prediction, or a definitive statement of fact or probability. These statements are subject to a number of risks and uncertainties regarding Zapata AI's business, and actual results may differ materially. These risks and uncertainties include, but are not limited to, Zapata AI's ability to attract new customers, retain existing customers, and grow; competition in the generative AI industry; Zapata AI's ability to raise additional capital on non-dilutive terms or at all; Zapata AI's failure to maintain and enhance awareness of its brand; and the risks and uncertainties discussed in the Company's filings with the Securities and Exchange Commission (including those described in the Risk Factors section in the Company's Annual Reports on Form 10-K and Quarterly Reports on Form 10-Q).

While Zapata AI may elect to update these forward-looking statements at some point in the future, Zapata AI specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing Zapata AI's assessments as of any date subsequent to the date of this press release. Accordingly, undue reliance should not be placed upon the forward-looking statements.

Contacts: Media: press@zapata.ai Investors: investors@zapata.ai

The publication introduces Early Fault Tolerant Quantum Computing (EFTQC), bridging the gap between today's noisy quantum computers and future fault-tolerant systems, highlighting Zapata AI's leadership in the field.

The research was published online on June 17, 2024, in the PRX Quantum journal.

The research outlines a pathway from current noisy quantum devices to scalable fault-tolerant systems, enhancing practical quantum computing applications for industries.

Potential applications include drug discovery, financial services, and various other industrial uses leveraging the next generation of quantum devices with error correction.

The publication in esteemed journals like PRX Quantum and Nature Communications underscores the quality of Zapata AI's research and its role in advancing AI and quantum computing.


Estonia’s Roadmap for Encryption in the Age of Quantum Computing – The Quantum Insider

The emergence of powerful quantum computers poses an existential threat to today's encryption systems. At the Future Cryptography Conference in Tallinn, Estonia, cryptography expert Jan Willemson provided insights into when and why we need to transition to post-quantum cryptography (PQC) to maintain data security.

Willemson began by explaining the rationale for cryptography: "The state is needed so that citizens could be provided with services. We want these services to be available to those who need them." He stressed properties like fairness, accountability and privacy that citizens expect from state services, which cryptography helps enable.

On the quantum computing threat, Willemson cited research estimating that breaking 2048-bit RSA keys could take about 100 days under ideal conditions, or years, perhaps even decades, under more realistic conditions with a large quantum computer. While much faster than classical computing, he noted, "there's still some significant amount of time involved so it's not like you will break it in a blink of an eye."

Based on risk analysis, Willemson outlined three areas where pre-quantum cryptography may suffice even after large quantum computers emerge.

"If your confidentiality horizon is less than the time that it would take to break the encryption key, then it might actually be okay to use pre-quantum encryption," he said.

He mentioned that if the value of a signature is less than the cost of breaking a key, then it is actually acceptable to use pre-quantum signatures. He also noted that authentication typically occurs for one session and for a limited time, implying that in many scenarios, using pre-quantum authentication may be quite adequate.

However, he cautioned that "you don't always know the future value of all your signatures," which could retroactively incentivize attacks, suggesting it may be justified to convert to post-quantum crypto just in case.
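Willemson's rules amount to a simple risk calculation. The sketch below, a minimal illustration with hypothetical names and example numbers rather than anything from the talk, encodes the reasoning: pre-quantum crypto may suffice only if the protected secret expires before a quantum attacker could recover the key, and breaking the key would cost more than the asset is worth.

```python
# Minimal sketch of the risk rules described in the talk; all names and
# example numbers are illustrative assumptions.

def pre_quantum_acceptable(confidentiality_horizon_days: float,
                           est_break_time_days: float,
                           asset_value: float,
                           est_break_cost: float) -> bool:
    """Pre-quantum crypto may suffice if the secret expires before the key
    can be broken AND breaking the key costs more than the asset is worth."""
    secret_expires_first = est_break_time_days > confidentiality_horizon_days
    attack_uneconomical = est_break_cost > asset_value
    return secret_expires_first and attack_uneconomical

# Example: a one-day authentication session against an estimated 100-day
# break time (the ideal-conditions figure cited above).
print(pre_quantum_acceptable(confidentiality_horizon_days=1,
                             est_break_time_days=100,
                             asset_value=1e4,
                             est_break_cost=1e6))   # True
```

The caveat about unknown future signature value maps onto the same rule: if the asset value cannot be bounded in advance, the safe default is post-quantum crypto.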

Willemson described Estonia's progress: "The encryption part of the internet voting system is completely under our control, so we define what crypto system we use ... this part is going to be much easier to upgrade."

As nations prepare for the quantum era, an open, transparent process is crucial, according to Willemson.

"NIST realizes this very well and this is a reason why for a few decades they already now are running very open competitions," he said.

With pragmatic risk analysis and strategic implementation across vital systems, Estonia is pioneering the quantum leap to quantum-resistant cryptography.



D-Wave Introduces New Hybrid Quantum Solver For Workforce, Manufacturing And Logistics Optimization Problems – The Quantum Insider

Insider Brief

PRESS RELEASE: D-Wave Quantum Inc., a leader in quantum computing systems, software, and services and the world's first commercial supplier of quantum computers, announced today at its global Qubits 2024 user conference the launch of a new hybrid quantum solver for nonlinear programs, enabling customers to confront real-world problems of growing complexity. Available now through D-Wave's Leap quantum cloud service, D-Wave believes the new solver will help customers solve complex optimization problems of increased scale, pushing past limits of previously available technologies.

The solver supports up to two million variables and constraints, with a tenfold increase in problem size capacity over other D-Wave solvers for certain applications, according to preliminary benchmarking studies. It is part of D-Wave's expanding set of commercial quantum optimization offerings, supporting the company's aggressive go-to-market (GTM) growth strategy announced earlier this year. Comprising a combination of hardware, software and professional services, D-Wave's solutions are designed to dramatically improve time-to-solution for organizations looking to optimize operational processes and performance.

Ready-to-Use Solutions for Workforce, Manufacturing and Logistics Problems

Real-world problems such as production scheduling have complex interactions between variables. D-Wave's new solver excels at handling nonlinear relationships, where the effect of changes to variables on solution quality is complex, giving it an edge over solvers limited to linear relationships. Its user-friendly interface simplifies the translation of real-world problems into hybrid quantum problem-solving methods, and it is exceptionally flexible, supporting a wide range of problems to deliver more accurate results.

These problems include workforce scheduling, manufacturing processes and logistics operations.
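As a rough illustration of how such a problem reaches the solver, here is a minimal sketch assuming the dwave-optimization Model interface and the LeapHybridNLSampler exposed through D-Wave's Ocean SDK; a Leap account and API token are required, and exact method names should be checked against the current Ocean documentation.

```python
# Sketch: pose a tiny workforce-style selection problem to the hybrid
# nonlinear solver. Assumes the dwave-optimization Model API and the
# LeapHybridNLSampler from Ocean; verify signatures against current docs.
from dwave.optimization import Model
from dwave.system import LeapHybridNLSampler  # requires a Leap API token

model = Model()
x = model.binary(3)                                   # take job 0..2 or not
payoff = model.constant([10.0, 14.0, 21.0])           # value of each job
hours = model.constant([3.0, 5.0, 8.0])               # labour each job needs

model.add_constraint((hours * x).sum() <= model.constant(10.0))  # shift limit
model.minimize(-(payoff * x).sum())                   # maximize total payoff

sampler = LeapHybridNLSampler()
sampler.sample(model, label="workforce-sketch")       # hybrid quantum-classical run
```

Real workloads would replace the three toy variables with up to millions of variables and constraints, which is where the solver's capacity claims apply.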

Customer Success with Quantum Optimization

Hybrid solvers have been shown to improve solutions to complex optimization problems by bringing together quantum and classical computing resources to explore vast solution spaces more adeptly and pinpoint answers that are more difficult to calculate using classical computing methods alone. Over the past year, D-Wave has seen customer use of its hybrid solver portfolio nearly double, which the company believes highlights the growing market demand for quantum optimization technology.

"We are confident that this solver will simplify and accelerate customers' journey to successful quantum technology adoption, helping them more quickly drive return-on-investment, and gain a competitive edge," said Dr. Alan Baratz, CEO of D-Wave. "Many organizations are recognizing that their most complex computational problems go well beyond the capabilities of existing solutions. They're adopting hybrid quantum solutions to find better answers to transform operations faster and improve the bottom line."

"From a logistics standpoint, so many elements go into making our experiences successful. We partnered with D-Wave to tackle the logistics for our large-scale tours and events in a whole new way," said Jason Snyder, global chief technology officer at Momentum Worldwide. "It's not just about doing things faster or cheaper, but also about being smarter and more sustainable in our approach. For our work, it has helped us make significant progress toward more sophisticated, efficient, and eco-friendly operation models."


How to pivot for success in deeptech? | Ed Wood (Nu Quantum) – The Quantum Insider

Welcome back to DeepTech Product, where we delve into the world of deep tech product management. Today, we bring you insights from a riveting conversation with Ed Wood, VP of Product at Nu Quantum. Ed's journey through the realms of quantum computing and hardware cryptography offers invaluable lessons for navigating the complex landscape of deep tech product development.

Prefer listening? Audio versions are on Apple Podcasts and Spotify.

Subscribe to the newsletter and join the community for free: https://deeptechsyndicate.substack.com/

Don't forget to like and subscribe for more insightful conversations on deep tech product management!

Ed emphasizes the importance of understanding the market before diving into product development. Nu Quantum's journey began with a thorough analysis of potential markets, examining their dynamics, sizes, and existing problems that the company's technology could solve. This foundational step ensured that the team had a clear vision of the market needs, enabling them to tailor their product accordingly.

Initially, Nu Quantum aimed to produce components for quantum security markets, including random number generators and quantum key distribution systems. However, extensive market research and customer feedback revealed a more pressing need in the field of quantum computing. The company pivoted to focus on developing solutions that interconnect quantum computers, addressing the scalability limits of individual modalities.

Ed describes this pivot as a challenging yet necessary course correction. The decision was backed by relentless conversations with industry players and validation from larger players, who acknowledged the need for quantum networking.

One of the key insights Ed shares is the importance of finding a market with a sense of urgency. "You can tick all the boxes: there's a need, people are willing to pay, you can make a profit. But without a sense of urgency, you're not going to make any money," he explains. This urgency is crucial in deep tech, where the development cycles are long and the investment required is substantial.

To find this urgency, Ed and his team conducted relentless market research, involving numerous calls and meetings with potential customers and industry players. This exhaustive process helped them triangulate their understanding of the market and validate their new direction.

Ed also emphasizes the danger of getting hypnotized by your own technology. "Deep tech is fascinating and magical, but just because it's cutting-edge doesn't mean it automatically translates to a product people will pay for," he warns. Staying sober and focused on the business value and real-world applications of the technology is essential.

This approach helped Nu Quantum pivot from focusing on quantum security components to addressing the broader and more urgent need for quantum networking. By constantly questioning the business benefit and ensuring their technology solves a real, urgent problem, Ed and his team have been able to stay on course.

After identifying the new market direction, gaining buy-in from the board and the team was crucial. Initially, there was skepticism, but validation from major industry players helped solidify the new strategy. The transition involved leveraging existing technology and securing grants to mature additional components needed for their new product focus. This strategic alignment has allowed Nu Quantum to maintain a clear roadmap while being adaptable to market changes.

Ed's advice for other deep tech product managers is grounded in experience.

For inquiries about sponsoring/supporting the podcast, email: deeptech[emailprotected]

(00:00) Teaser

(00:52) Podcast Intro

(01:26) Nadia's Intro

(02:31) Role at Sandbox and the diverse product portfolio

(05:52) Product management strategies for such a deeptech

(08:40) How to approach customers when product is not ready

(12:00) How to provide incremental value?

(14:50) When does it make sense to buy vs make?

(16:55) Multiple reasons to do acquisitions

(18:03) Tactics for identifying acquisition targets

(20:57) Post acquisitions actions and strategy

(26:09) Internal and external acceleration of strategy post acquisition

(27:00) Activities to integrate teams post acquisition

(30:14) Examples of successful acquisitions

(30:25) Advice to your past self?

(30:35) Book recommendation and deep tech trend

(42:05) Outro and links to join the deep tech product community



Riverlane Joins the Quantum Energy Initiative – The Quantum Insider

Insider Brief

PRESS RELEASE: Riverlane, the global leader in quantum error correction technology, is pleased to announce its acceptance into the Quantum Energy Initiative, a global community of quantum technology companies and research organisations committed to better understanding the physical resource cost of quantum technologies.

The rise of new technologies, such as artificial intelligence, has led to exponential growth in computational power in recent years, causing data centres to become increasingly power-hungry (IEA, 2024). Data centres were originally designed to prioritise performance over energy efficiency, and while new techniques are being explored to curb their carbon footprint, the electricity usage of the world's data centres is still forecast to double by 2026 (IEA, 2024).

With quantum computers expected to solve specific problems much more efficiently than classical supercomputers, there's growing hope that their integration with supercomputers in data centres could lead to significant energy savings and lower the amount of carbon emissions produced.

Early breakthrough research led by Google (with NASA and Oak Ridge National Labs) hinted that existing prototype quantum computers could already exhibit reduced energy consumption. However, it is now generally agreed that carefully chosen metrics of energy consumption are required to allow a fair comparison of quantum and classical computing. At the same time, if quantum computers are to do better than classical computers, then every major hardware and software component of that quantum computer must be as energy efficient as possible.

Riverlane has joined the Quantum Energy Initiative (QEI), a community focused on researching the environmental footprint of quantum computing, to help with this effort. Alongside existing members of the QEI including Microsoft Azure Quantum, IBM Quantum and Alice & Bob, Riverlane aims to contribute to a number of goals including defining energy-based metrics for quantum technologies; deriving fundamental bounds for energy consumption; using energetic efficiencies as optimisation tools; and understanding the impact of hardware and software on energy consumption.

"We enthusiastically welcome Riverlane into the Quantum Energy Initiative as an industrial partner. Their leading-edge expertise in quantum error correction will be crucial in shaping the future of large-scale quantum computing; we are delighted that they see the importance of making this future an energy-efficient one," said Robert Whitney, a co-founder of the Quantum Energy Initiative.

While research into these areas is still in its infancy, the Quantum Energy Initiative recommends that companies make conscious design choices during this current phase of quantum computing development to help pave the way towards energy-efficient quantum computing.

The energy consumption of a quantum computer is likely to be influenced by several factors, including how we implement a necessary process called quantum error correction (QEC). The building blocks of every quantum computer (qubits) are affected by errors to the extent that, unprotected, they are not usable for any valuable computation. QEC can correct these errors by mapping many physical qubits into one error-free logical qubit.
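To make the many-to-one idea concrete, the toy sketch below uses a classical three-bit repetition code with majority-vote decoding; it is purely illustrative and far simpler than the quantum codes Riverlane's decoders target, but it shows how redundancy pushes the logical error rate well below the physical one.

```python
# Toy illustration of QEC's core idea: one logical bit stored redundantly
# across several physical bits survives any single physical error.
from collections import Counter
import random

def encode(logical_bit: int, n: int = 3) -> list[int]:
    return [logical_bit] * n                        # 1 logical -> n physical

def add_noise(physical: list[int], p_flip: float) -> list[int]:
    return [b ^ (random.random() < p_flip) for b in physical]

def decode(physical: list[int]) -> int:
    return Counter(physical).most_common(1)[0][0]   # majority vote

random.seed(1)
trials, p = 100_000, 0.05                           # 5% physical error rate
failures = sum(decode(add_noise(encode(0), p)) != 0 for _ in range(trials))
print(failures / trials)   # ~0.007: roughly 3*p^2, far below the 5% raw rate
```

Quantum codes must additionally handle phase errors without directly reading the data qubits, which is what makes real-time decoding, the job of chips like Deltaflow.Decode, computationally demanding.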

The QEC process can be designed with overall power consumption in mind. By doing so, we can reduce the number of qubits required to implement a logical qubit, translating into less power needed to keep them in a usable state and less energy required to manipulate them. At the same time, multiple measurements and operations are required to implement logical qubits which can impact the overall execution time and power consumption of the system. In line with guidance from the QEI, Riverlane believes deliberate design choices must be made to ensure the integration of QEC technologies produces a significant net benefit.

With this in mind, Riverlane is developing its core product, the Quantum Error Correction Stack (Deltaflow), with energy efficiency as a key design principle. Riverlane's Deltaflow.Decode ASIC chip uses less than 10 mW and is the first decoder to concretely demonstrate high performance at a small hardware resource cost. It's been designed to accurately balance speed and hardware resource utilisation, meaning it decodes fast enough to keep up with a quantum computer while also producing a tiny resource footprint.

"We are excited to be a supporting member of the Quantum Energy Initiative," said Marco Ghibauldi, VP of Engineering at Riverlane. "For quantum computers to deliver energy savings over today's classical supercomputers, we first need them to be useful. This is our main priority at Riverlane, through our specialist focus on quantum error correction, but we recognise there's even more we can do at this stage. By joining the Quantum Energy Initiative, we hope to contribute to this exciting new field of research and continue to build our understanding of how our QEC Stack can be developed and integrated to maximise energy efficiencies across the whole quantum computing stack."


Japan to help IBM build a massive 10000-qubit quantum computer – TechSpot

Forward-looking: After exiting the PC market and attempting to position the Watson AI supercomputer as the future of healthcare, IBM is focusing on making quantum computers practical and useful. Now, the company aims to usher in a new era in the quantum business with Japan's assistance.

IBM will partner with Japan's National Institute of Advanced Industrial Science and Technology (AIST) to develop a quantum system equipped with 10,000 qubits, which is 75 times more than the current quantum computers have.

This unprecedented super quantum machine is expected to be ready by 2029 and will significantly outclass today's quantum systems, which are equipped with no more than 133 qubits. According to Nikkei Asia, the IBM-AIST partnership will be announced publicly in the coming days and will leverage Japan's strong IT manufacturing capabilities.

IBM and AIST will develop new semiconductor parts and circuits designed to operate at temperatures near absolute zero, as a quantum computer cannot perform its complex (and error-prone) calculations at room temperature. Japanese component manufacturers will be part of the project, and Japan's technology companies will have privileged access to the new quantum systems.

AIST will lobby for quantum computer adoption and offer training programs to Japanese companies, and IBM will likely be very happy to help with that. The tech giant is serious about quantum computing development, with a defined roadmap that includes a 2,000-qubit commercial system by 2033.

The expected 10,000-qubit quantum computer clearly goes way beyond IBM's rosiest expectations, all the more so as the plan calls for a system that doesn't need an ancillary supercomputer working to fix the errors made by the quantum chips. Today's quantum computers cannot be used as traditional computing machines yet, and they usually need a complex, multi-room setup to try to keep errors and issues to a minimum.

IBM is betting big on quantum computers, a novel class of computing machines which have seemingly started to show the ability to tackle problems that traditional "brute force" simulations cannot. This "quantum utility" makes quantum computers a useful tool for scientific research, IBM said, although Big Blue could be wrong on this as well.

The recent Qommodore 64 project was seemingly able to effectively simulate IBM's quantum utility experiments on the puny 1MHz CPU of a Commodore 64, no qubits required.


Unlock Generous Growth With These 3 Top Quantum Computing Stocks – InvestorPlace

While the technology offers myriad innovations, investors ought to earmark the top quantum computing stocks for the speculative long-term section of their portfolio. Fundamentally, it all comes down to the projected relevance.

According to Grand View Research, the global quantum computing market size reached a valuation of $1.05 billion in 2022. Experts project that the sector could expand at a compound annual growth rate (CAGR) of 19.6% from 2023 to 2030. At the culmination of the forecast period, the segment could print revenue of $4.24 billion.
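A quick arithmetic check shows those figures hang together: growing $1.05 billion to $4.24 billion over the eight years from 2022 to 2030 implies a CAGR of roughly 19%, consistent with the quoted 19.6% once report rounding and baseline-year choices are allowed for.

```python
# Implied CAGR from the cited endpoints: (end/start)**(1/years) - 1
start, end, years = 1.05, 4.24, 8      # $B in 2022 -> $B in 2030
implied_cagr = (end / start) ** (1 / years) - 1
print(f"{implied_cagr:.1%}")           # 19.1%, in line with the cited ~19.6%
```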

Better yet, we might be in the early stages. Per McKinsey & Company, quantum technology itself could lead to value creation worth trillions of dollars. Essentially, quantum computers represent a paradigm shift from the classical approach. These devices can perform myriad calculations simultaneously, leading to explosive growth in productivity.

Granted, with every pioneering space comes high risks. If you're willing to accept the heat, these are the top quantum computing stocks to consider.


To be sure, Honeywell (NASDAQ:HON) isn't exactly what you would call a direct player among top quantum computing stocks. Rather, the company is an industrial and applied sciences conglomerate, featuring acumen across myriad disciplines. However, Honeywell is very much relevant to the advanced computing world thanks to its investment in Quantinuum.

Earlier this year, Honeywell's quantum computing enterprise reached a valuation of $5 billion following a $300 million equity funding round, per Reuters. Notably, JPMorgan Chase (NYSE:JPM) helped anchor the investment. According to the news agency, "[c]ompanies are exploring ways to develop and scale quantum capabilities to solve complex problems such as designing and manufacturing hydrogen cell batteries for transportation."

Honeywell could play a big role in the applied capabilities of quantum computing, making it a worthwhile long-term investment. To be fair, it's not the most exciting play in the world. Analysts rate shares a consensus moderate buy but with an average price target of $229.21. That implies about 10% upside.

Still, Honeywell isn't likely to implode either. As you build your portfolio of top quantum computing stocks, it may pay to have a reliable anchor like HON.


Getting into the more exciting plays among top quantum computing stocks, we have IonQ (NYSE:IONQ). Based in College Park, Maryland, IonQ mainly falls under the computer hardware space. Per its public profile, the company engages in the development of general-purpose quantum computing systems. Business-wise, IonQ sells access to quantum computers of various qubit capacities.

Analysts are quite optimistic about IONQ stock, rating shares a consensus strong buy. Further, the average price target comes in at $16.63, implying over 109% upside potential. That's not all: the most optimistic target calls for a price per share of $21. If so, we would be talking about a return of over 164%. Of course, with a relatively modest market capitalization of $1.68 billion, IONQ is a high-risk entity.

Even with the concerns, including an expansion of red ink for fiscal 2024, covering experts believe the growth narrative could overcome the anxieties. In particular, they're targeting revenue of $39.47 million, implying 79.1% upside from last year's print of $22.04 million. What's more, fiscal 2025 sales could see a gargantuan leap to $82.38 million. It's one of the top quantum computing stocks to keep on your radar.


Headquartered in Berkeley, California, Rigetti Computing (NASDAQ:RGTI), through its subsidiaries, builds quantum computers and superconducting quantum processors. In particular, Rigetti offers a cloud-based solution under a quantum processing umbrella. It also sells access to its groundbreaking computers through a business model called Quantum Computing as a Service.

While intriguing, RGTI stock is high risk. The reality is that the enterprise features a market cap of a little over $175 million. That translates to a per-share price of two pennies over a buck. With such a diminutive profile, anything can happen. Still, it's tempting because analysts rate shares a unanimous strong buy. Also, the average price target lands at $3, implying over 194% upside potential.

What's even more enticing are the financial projections. Covering experts believe that Rigetti will post a loss per share of 41 cents. That's an improvement over last year's loss of 57 cents. Further, revenue could hit $15.3 million, up 27.4% from the prior year. And in fiscal 2025, sales could soar to $28.89 million, up nearly 89% from projected 2024 revenue.

If you can handle the heat, RGTI is one of the top quantum computing stocks to consider.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare. Tweet him at @EnomotoMedia.


CERN welcomes International Year of Quantum Science and Technology – web.cern.ch

On the centenary of quantum mechanics -- the bedrock of particle physics and enabler of numerous technologies -- CERN is contributing to the development of a new generation of quantum technologies for fundamental research and beyond.

100 years ago, a handful of visionary physicists upturned notions about nature that had guided scientists for centuries. Particles can be point- or wave-like, depending on how you look at them. Their behaviour is probabilistic and can momentarily appear to violate cherished laws such as the conservation of energy. Particles can be entangled such that one feels the change of state of the other instantaneously no matter the distance between them, and, as befalls Schrödinger's famous cat, they can be in opposite states at the same time.

Today, thanks to pioneering theoretical and experimental efforts to understand this complex realm, physicists can confidently navigate through such apparently irrational concepts. Quantum theory has not only become foundational to physics, chemistry, engineering and biology, but underpins the transistors, lasers and LEDs that drive modern electronics and telecommunications -- not to mention solar cells, medical scanners and global positioning systems. But this is only the beginning.

On 7 June the United Nations declared 2025 the International Year of Quantum Science and Technology to celebrate the contributions of quantum science to technological progress, raise awareness of its importance to sustainable development, and ensure that all nations have access to quantum education and opportunities. As the world's largest particle physics lab, CERN has been interrogating the quantum theories that govern the microscopic world for the past 70 years. Most recently, it has entered the rapidly growing domain of quantum technologies, which aims to harness the strangest aspects of quantum mechanics to build a new generation of quantum devices for fundamental research and beyond.

"In recent years, we have learned not just to use the properties of the quantum world but, also, to control them," explains Sofia Vallecorsa, coordinator of the CERN Quantum Technology Initiative (QTI). "Today, the revolution is all about controlling individual quantum systems, such as single atoms or ions, enabling even more powerful applications."

At CERN, quantum technologies are studied and developed through two initiatives: the QTI, whose aim is to enable technologies such as quantum computing, quantum state sensors, time synchronisation protocols, and many more for high-energy physics activities; and the recently established Open Quantum Institute (OQI), whose aim is to identify, support and accelerate the development of future societal applications benefiting from quantum computing algorithms.

One of the most promising fields is quantum computing. Unlike conventional computers, which use bits that can be in one of just two states, quantum computers use qubits, which can exist in superpositions of states. This enables a vast number of computations to be processed simultaneously, offering important applications in fields such as cryptography, logistics and process optimisation, and drug discovery. Quantum communication, which exploits the principles of quantum mechanics to make it impossible to intercept information without detection, is another significant area of development. A third pillar of CERN's quantum-technologies programme is sensing, which allows ultra-precise measurements of physical quantities, with potential applications in fields including medicine, navigation and climate science.
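A small numerical sketch, purely illustrative and not CERN code, shows where that parallelism comes from: an n-qubit register is described by 2^n complex amplitudes, so even a modest register spans a huge state space.

```python
# Illustrative sketch: an n-qubit register holds 2**n complex amplitudes,
# which is why quantum states can represent a vast space of outcomes at once.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State after applying a Hadamard to each of n qubits starting from |0...0>."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)
probabilities = np.abs(state) ** 2
print(len(state), probabilities.sum())   # 8 amplitudes, probabilities sum to 1.0
# A measurement collapses the register to one of the 8 basis states; quantum
# algorithms work by biasing these amplitudes toward the desired answers.
```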

"What started 100 years ago as a purely theoretical physics investigation is now beginning to unleash its full potential," says OQI coordinator Tim Smith of CERN. "The International Year of Quantum Science and Technology will be a wonderful opportunity to celebrate the past, the present and the future of our understanding of the quantum world."
