
Quantum control’s role in scaling quantum computing – McKinsey

June 14, 2024, by Henning Soller and Niko Mohr, with Elisa Becker-Foss, Kamalika Dutta, Martina Gschwendtner, Mena Issler, and Ming Xu

Quantum computing can leverage the states of entangled qubits1 to solve problems that classical computing cannot currently solve and to substantially improve existing solutions. These qubits, which are typically constructed from photons, atoms, or ions, can only be manipulated using specially engineered signals with precisely controlled energy that is barely above that of a vacuum and that changes within nanoseconds. This control system for qubits, referred to as quantum control, is a critical enabler of quantum computing because it ensures quantum algorithms perform with optimal efficiency and effectiveness.

While the performance and scaling limitations of current quantum control systems preclude large-scale quantum computing, several promising technological innovations may soon offer scalable control solutions.

A modern quantum computer comprises various hardware and software components, including quantum control components that can span several meters. In quantum systems, qubits interact with their environment, causing decoherence and decay of the encoded quantum information. Quantum gates (the building blocks of quantum circuits) cannot be implemented perfectly at the physical level, so noise accumulates. Noise leads to decoherence, which degrades qubits' superposition and entanglement properties. Quantum control minimizes quantum noise (for example, thermal fluctuations and electromagnetic interference) caused by the interaction between the quantum hardware and its surroundings. Quantum control also addresses noise by improving the physical isolation of qubits, using precise control techniques, and implementing quantum error correction codes. Control electronics use signals from the classical world to provide instructions for qubits, while readout electronics measure qubit states and transmit that information back to the classical world. The control layer in a quantum technology stack is therefore often referred to as the interface between the quantum and classical worlds.

Components of the control layer include the following:

A superconducting- or spin-qubit-based computer, for example, includes physical components such as quantum chips, cryogenics (cooling electronics), and control and readout electronics.

Quantum computing requires precise control of qubits and manipulation of physical systems. This control is achieved via signals generated by microwaves, lasers, and optical fields or other techniques that support the underlying qubit type. A tailored quantum control system is needed to achieve optimal algorithm performance.

In the context of a quantum computing stack, control typically refers to the hardware and software system that connects to the qubits, which the application software uses to solve real-world problems such as optimization and simulation (Exhibit 1).

At the top of the stack, software layers translate real-world problems into executable instructions for manipulating qubits. The software layer typically includes middleware (such as a quantum transpiler2) and control software comprising low-level system software that provides compilation, instrument control, signal generation, qubit calibration, and dynamical error suppression.3 Below the software layer is the hardware layer, where high-speed electronics and physical components work together to send signals to and read signals from qubits and to protect qubits from noise. This is the layer where quantum control instructions are executed.
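
To make the signal-generation step concrete, the sketch below shows the kind of shaped pulse such control software might compute. It is a minimal illustration in Python with NumPy, not any vendor's control stack, and the sample rate, pulse length, and carrier frequency are assumed values.

```python
import numpy as np

# Illustrative values only: a 40 ns gate pulse synthesized at 1 GS/s,
# mixed onto a 100 MHz intermediate-frequency carrier.
sample_rate = 1e9
duration = 40e-9
carrier_freq = 100e6

t = np.arange(0, duration, 1 / sample_rate)
# Gaussian envelope: smooth edges reduce leakage to unwanted qubit states.
envelope = np.exp(-0.5 * ((t - duration / 2) / (duration / 8)) ** 2)
pulse = envelope * np.cos(2 * np.pi * carrier_freq * t)

# A calibration loop would iteratively adjust amplitude and frequency
# against qubit readout results (the feedback calibration described below).
print(f"{len(t)} samples, peak amplitude {pulse.max():.3f}")
```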

Quantum control hardware systems are highly specialized to accommodate the intricacies of qubits. Control hardware interfaces directly with qubits, generating and reading out extremely weak and rapidly changing electromagnetic signals that interact with qubits. To keep qubits functioning for as long as possible, control hardware systems must be capable of adapting in real time to stabilize the qubit state (feedback calibration) and correct qubits from decaying to a completely decoherent state4 (quantum error correction).

Although all quantum control hardware is based on similar fundamental principles, it can differ widely depending on the qubit technology with which it is designed to be used (Exhibit 2).

For example, photonic qubits operate at optical frequencies (similar to fiber internet), while superconducting qubits operate at microwave frequencies (similar to a fifth-generation network). Different types of hardware using laser technology or electronic circuits are needed to generate, manipulate, and transmit signals to and from these different qubit types. Additional hardware may be needed to provide environmental control. Cryostats, for example, cool superconducting qubits to keep them in a working state, and ion trap devices are used in trapped-ion qubit systems to confine ions using electromagnetic fields.

Quantum control is critical to enable fault-tolerant quantum computing, that is, quantum computing in which as many errors as possible are prevented or suppressed. But realizing this capability on a large scale will require substantial innovation. Existing control systems are designed for a small number of qubits (1 to 1,000) and rely on customized calibration and dedicated resources for each qubit. A fault-tolerant quantum computer, on the other hand, needs to control 100,000 to 1,000,000 qubits simultaneously. Consequently, a transformative approach to quantum control design is essential.

Specifically, to achieve fault-tolerant quantum computing on a large scale, there must be advances to address issues with current state-of-the-art quantum control system performance and scalability, as detailed below.

Equipping quantum systems to perform at large scales will require the following:

The physical space requirements and power costs of current quantum computing systems restrict the number of qubits that can be controlled with existing architecture, thus hindering large-scale computing.

Challenges to overcoming these restrictions include the following:

Several technologies show promise for scaling quantum control, although many are still in early-research or prototyping stages (Exhibit 3).

Multiplexing could help reduce costs and prevent overheating. The cryogenic complementary metal-oxide-semiconductor (cryo-CMOS) approach also helps mitigate overheating; it is the most widely used approach across industries because it is currently the most straightforward way to add control lines, and it works well in a small-scale R&D setup. However, cryo-CMOS is close to reaching the maximum number of control lines, creating form factor and efficiency challenges to scaling. Even with improvements, the number of control lines would only be reduced by a few orders of magnitude, which is not sufficient for scaling to millions of qubits. Another option to address overheating is single-flux quantum technology, while optical links for microwave qubits can increase efficiency in interconnections as well as connect qubits between cryostats.

Whether weighing options to supply quantum control solutions or to invest in or integrate quantum technologies into companies in other sectors, leaders can better position their organizations for success by starting with a well-informed and strategically focused plan.

The first strategic decision leaders in the quantum control sector must make is whether to buy or build their solutions. While various levels of quantum control solutions can be sourced from vendors, few companies specialize in control, and full-stack solutions for quantum computing are largely unavailable. The prevailing view is that vendors can offer considerable advantages in jump-starting quantum computing operations, especially for complex and large-scale systems. Nevertheless, a lack of industrial standardization means that switching between quantum control vendors could result in additional costs down the road. Consequently, many leading quantum computing players opt to build their own quantum control.

Ideally, business leaders also determine early on which parts of the quantum tech stack to focus their research capacities on and how to benchmark their technology. To develop capabilities and excel in quantum control, it is important to establish KPIs that are tailored to measure how effectively quantum control systems perform to achieve specific goals, such as improved qubit fidelity.5 This allows for the continuous optimization and refinement of quantum control techniques to improve overall system performance and scalability.

Quantum control is key to creating business value. Thus, the maturity and scalability of control solutions are the chief considerations for leaders exploring business development related to quantum computing, quantum solutions integration, and quantum technologies investment. In addition to scalability (the key criterion for control solutions), leaders will need to consider and address the other control technology challenges noted previously. And as control technologies mature from innovations to large-scale solutions, establishing metrics for benchmarking them will be essential to assess, for example, ease of integration, cost effectiveness, error-suppression effectiveness, software offerings, and the possibility of standardizing across qubit technologies.

Finally, given the shortage of quantum talent, recruiting and developing the highly specialized capabilities needed for each layer of the quantum stack is a top priority to ensure quantum control systems are properly developed and maintained.

Henning Soller is a partner in McKinsey's Frankfurt office, and Niko Mohr is a partner in the Düsseldorf office. Elisa Becker-Foss is a consultant in the New York office, Kamalika Dutta is a consultant in the Berlin office, Martina Gschwendtner is a consultant in the Munich office, Mena Issler is an associate partner in the Bay Area office, and Ming Xu is a consultant in the Stamford office.

1 Entangled qubits are qubits that remain in a correlated state in which changes to one affect the other, even if they are separated by long distances. This property can enable massive performance boosts in information processing.
2 A quantum transpiler converts code from one quantum language to another while preserving and optimizing functionality to make algorithms and circuits portable between systems and devices.
3 Dynamical error suppression is one approach to suppressing quantum error and involves the periodic application of control pulse sequences to negate noise.
4 A qubit in a decoherent state is losing encoded quantum information (superposition and entanglement properties).
5 Qubit fidelity is a measure of the accuracy of a qubit's state, or the difference between its current state and the desired state.


Riverlane, the company making quantum computing useful far sooner than anticipated – Maddyness

You have recently been selected for Tech Nation's Future Fifty programme. What are your expectations, and how does it feel to be identified as a future unicorn?

We're delighted to have been selected as the sole representative of a rich and diverse UK quantum tech industry. The quantum computing market is expected to grow to $28-72B over the next decade, so I expect many unicorns to emerge, and we certainly hope to be one of them. Tech Nation has an excellent track record of picking and supporting high-growth leaders. We're excited to make the most of the opportunities the programme offers.

Quantum computing is an amazing idea: the ability to harness the power of the atom to perform computation will transform many industries. Back in 2016, I was a research fellow at the University of Cambridge, and at that time, the majority view was that building a useful quantum computer wouldn't be possible in our lifetime - it was simply too big and too hard a problem. I disagreed but needed to validate this. By meeting with teams building quantum computers, I saw an amazing rate of progress - a 'Moore's Law' of quantum computing, with a doubling in power every two years, just like classical computers have done. That was the catalyst moment for me, and it became clear that if that trend continued, the next big problem would be quantum error correction. I founded Riverlane to make useful quantum computers a reality sooner!

We're building a technology called the quantum error correction stack, which corrects errors in quantum computers. Today's quantum computers can only perform a thousand or so operations before they fail under the weight of these errors. Quantum error correction technology will ultimately enable trillions of error-free operations, unlocking their full and transformative potential.

Implementing quantum error correction to achieve this milestone requires specialised knowledge of quantum science, engineering, software development and chip manufacturing. That makes quantum error correction systems difficult for each quantum computer maker to develop independently. Our strategy is not dissimilar to NVIDIA's: providing a core enabling technology for an entirely new computing category.

When Riverlane was founded in 2016, there was a lot of focus on developing software applications to solve novel problems on small-scale quantum computers, a phase known as the noisy intermediate-scale quantum (NISQ) era. However, after the limits of NISQ became apparent due to considerable error rates hindering calculations, the industry shifted focus to building large and reliable quantum computers that could overcome the error problem.

This is something we've been working on from the start through the invention of our quantum error correction stack, but we're now doubling down on its development to meet this growing demand from the industry. An important part of this has been scaling our team to nearly 100 people across our two offices in Cambridge (UK) and Boston (US) - two world-leading centres for quantum computing research and development.

It's a common misconception that you need a PhD in quantum physics or computer science to work in our field. The reality is we need people with a wide range of skills and from the broadest possible mix of backgrounds and demographics. Collectively, we're a group that loves tackling hard and complex problems - if not the hardest! This requires a culture that blends extremes of creativity, curiosity, problem-solving and analytical skills, plus an alchemy of driving urgency and zen-like patience. I'm also proud of the extraordinary openness and diversity of our team, including a healthy gender mix in a field where this is the exception, not the norm.

I've been fascinated with quantum physics since I was a student. Back then, the idea of building a computer that applied the unique properties of subatomic particles to transform our understanding of nature and the universe was pure science fiction. Building a company that is now achieving this feels almost miraculous. Building a company with the right mix of skills and shared focus to do this far faster than previously imaginable is brutally tricky and joyously rewarding in equal parts.

Last September, we launched the world's first quantum error correction chip. As the quantum computing industry develops, these chips will get better and better, faster and faster. They'll ultimately enable the quantum industry to scale beyond its current limitations to achieve its full potential to solve currently impossible problems in areas like healthcare, climate science and chemistry. At a recent quantum conference, someone stood up and said quantum computing will be bigger than fire. I wouldn't go quite that far! But they'll unlock a fundamental new era of human knowledge, and that's super exciting.

Have a bold and ambitious vision that's underpinned by proven insight and data. In my case, it was that the presumption that a quantum computer was simply too hard to ever build could be disproven and overcome. Once you have this, be ready to learn fast and pivot fast in your tactics, but never lose sight of your goal.

I spend at least a third of my time travelling. Meeting global leaders in our field face to face to hear their ideas, track their progress and build partnerships is priceless. When I'm home, I'm lucky enough to live about a mile from our office in Cambridge. No matter the weather, I walk to and from work every day. Cambridge is a beautiful place - the thinking time and fresh air give me energy and a calm headspace.

Steve Brierley is the CEO of Riverlane.

Tech Nation's Future Fifty Programme is designed to support late-stage companies with access and growth opportunities. The programme has supported some of the UK's most prominent unicorns, including Monzo, Darktrace, Revolut, Starling, Skyscanner and Deliveroo.


The 3 Best Quantum Computing Stocks to Buy in June 2024 – InvestorPlace

Technology firms, both public and private, have been working hard to develop quantum computing technologies for decades. The reasons for that are straightforward. Quantum machines, which harness the quantum mechanics undergirding subatomic particles, have a number of advantages over classical computers. Tasks such as portfolio optimization and climate prediction, which benefit from handling greater complexity, are better suited to quantum computers.

U.S. equities markets have surged with the rise of generative artificial intelligence (AI) and its potential to create enormous efficiencies and profits for firms across various industries. While AI has brought quantum computing back into the spotlight, a lack of practical ways to scale these complex products has severely dented the performance of pure-play quantum computing stocks, such as IonQ (NYSE:IONQ) and Rigetti Computing (NASDAQ:RGTI).

Fortunately, not every public company invested in quantum computing has seen doom and gloom. Below are the three best quantum computing stocks investors should buy in June.


International Business Machines (NYSE:IBM) is a legacy American technology business. It has its hands in everything from cloud infrastructure, artificial intelligence, and technology consulting services to quantum computers.

The firm committed to developing quantum computing technologies in the early 2000s and tends to publish new findings in the burgeoning field frequently. In December 2023, IBM released a new quantum chip system, Quantum System Two, that leverages the firm's Heron processor, which has 133 qubits. Qubits are analogous to bits on a classical computer. But instead of being confined to states of 0s and 1s, qubits, by way of superposition, can assume both states at the same time.
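
The state of a single qubit can be illustrated with a toy state-vector simulation; this is plain NumPy for intuition, not IBM's software stack.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)        # the |0> state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposition = hadamard @ zero               # (|0> + |1>) / sqrt(2)
probabilities = np.abs(superposition) ** 2    # Born rule: measurement odds
print(probabilities)                          # [0.5 0.5] -- both states at once
```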

Moreover, what makes Quantum System Two particularly innovative is its use of both quantum and classical computing technologies. In a press release, IBM states, "It combines scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics." IBM believes the combination of quantum computation and communication with classical computing resources can create a scalable quantum machine.

IBM's innovations in quantum computing as well as AI have not gone unnoticed either. Shares have risen 31.3% over the past 12 months. The computing giant's relatively cheap valuation, coupled with its exposure to novel, high-growth fields, could boost the value of its shares over the long term.


Investors have given Nvidia (NASDAQ:NVDA) attention and praise over the past 12 months due to its critical role in AI computing technologies. The chipmaker's advanced GPUs, including the H100 and H200 processors, are some of the most coveted chips on the market. The new Blackwell chips, coming to market in the second half of 2024, bring even better performance to the table.

Though Nvidia's prowess in the world of AI captures much of the headlines, the firm has already made inroads into the next stage of computing. In 2023, Nvidia announced a new quantum system in conjunction with startup Quantum Machines. It leverages what Nvidia calls the Grace Hopper Superchip (GH200) as well as the chipmaker's advanced CUDA Quantum (CUDA-Q) developer software.

In 2024, Nvidia released its Quantum Cloud platform, which allows users to build and test quantum computing algorithms in the cloud. The chipmaker's GPUs and its open-source CUDA-Q platform will likely be essential to scaling up the quantum computing space.
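
For a flavor of what programming against CUDA-Q looks like, here is a minimal Bell-state kernel patterned on NVIDIA's published Python examples; treat it as a sketch, since exact names and syntax can differ between CUDA-Q releases.

```python
import cudaq

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)     # allocate two qubits
    h(qubits[0])                  # put the first into superposition
    x.ctrl(qubits[0], qubits[1])  # entangle them with a controlled-NOT
    mz(qubits)                    # measure both

# Samples should split between "00" and "11", the entangled outcomes.
print(cudaq.sample(bell))
```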

Nvidia's share price has surged 214.2% over the past 12 months.


Quantum computers are complex machines that require all kinds of components. Furthermore, quantum systems must be kept at extremely low temperatures to operate efficiently.

FormFactor (NASDAQ:FORM) specializes in developing cryogenic systems, or systems meant to operate at low temperatures. The company provides everything from wafer-testing probes to low-vibration probe stations and sophisticated refrigerators called cryostats. Also, the firm's analytical probe tools are useful for developing advanced chips, such as NAND flash memory.

With quantum computing systems and advanced memory chips in greater demand these days, FormFactor could see revenues and earnings rise in the near and medium terms. FormFactor's share price has surged 77.5% over the past 12 months, underscoring that investors are taking notice of the company's long-term value.

At the beginning of May, FormFactor released first-quarter results for fiscal year 2024, topping revenue estimates while EPS came in line with market expectations. The firm expects that strong demand for advanced memory chips, such as DRAM, will help propel revenue growth in the following quarters.

On the date of publication, Tyrik Torres did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tyrik Torres has been studying and participating in financial markets since he was in college, and he has a particular passion for helping people understand complex systems. His areas of expertise are semiconductor and enterprise software equities. He has work experience in both investing (public and private markets) and investment banking.


Quantum data assimilation: A quantum leap in weather prediction – EurekAlert

Image caption: The novel quantum data assimilation method can significantly reduce the computation time required for numerical weather prediction, enabling deeper understanding and improved predictions. Credit: Brett Jordan from Openverse, https://openverse.org/image/563410ca-1385-475c-a7f6-fd521f910623

Data assimilation is a mathematical discipline that integrates observed data and numerical models to improve the interpretation and prediction of dynamical systems. It is a crucial component of earth sciences, particularly in numerical weather prediction (NWP). Data assimilation techniques have been widely investigated in NWP in the last two decades to refine the initial conditions of weather models by combining model forecasts and observational data. Most NWP centers around the world employ variational and ensemble-variational data assimilation methods, which iteratively reduce cost functions via gradient-based optimization. However, these methods require significant computational resources.
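
For intuition, a variational scheme produces an analysis by minimizing a cost function that balances a prior forecast against observations. The sketch below is a hypothetical four-variable toy in plain NumPy, not an operational NWP code, using the gradient-based iteration this paragraph describes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
background = rng.normal(size=n)               # prior model forecast
truth = background + 0.5                      # unknown true state
H = np.eye(n)                                 # observe every variable
obs = H @ truth + 0.1 * rng.normal(size=n)    # noisy observations
B_inv = np.eye(n) / 0.25                      # inverse background covariance
R_inv = np.eye(n) / 0.01                      # inverse observation covariance

# Minimize J(x) = (x - xb)^T B^-1 (x - xb) + (Hx - y)^T R^-1 (Hx - y).
x = background.copy()
for _ in range(200):
    grad = 2 * B_inv @ (x - background) + 2 * H.T @ R_inv @ (H @ x - obs)
    x -= 0.002 * grad                         # gradient-based update
print("analysis error:", np.linalg.norm(x - truth))
```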

Recently, quantum computing has emerged as a new avenue of computational technology, offering a promising solution for overcoming the computational challenges of classical computers. Quantum computers can take advantage of quantum effects such as tunneling, superposition, and entanglement to significantly reduce computational demands. Quantum annealing machines, in particular, are powerful for solving optimization problems.

In a recent study, Professor Shunji Kotsuki from the Institute for Advanced Academic Research/Center for Environmental Remote Sensing/Research Institute of Disaster Medicine, Chiba University, along with his colleagues Fumitoshi Kawasaki from the Graduate School of Science and Engineering and Masanao Ohashi from the Center for Environmental Remote Sensing, developed a novel data assimilation technique designed for quantum annealing machines. "Our study introduces a novel quantum annealing approach to accelerate data assimilation, which is the main computational bottleneck for numerical weather predictions. With this algorithm, we successfully solved data assimilation on quantum annealers for the first time," explains Prof. Kotsuki. Their study has been published in the journal Nonlinear Processes in Geophysics on June 07, 2024.

In the study, the researchers focused on the four-dimensional variational data assimilation (4DVAR) method, one of the most widely used data assimilation methods in NWP systems. However, since 4DVAR is designed for classical computers, it cannot be directly used on quantum hardware. Prof. Kotsuki clarifies, "Unlike the conventional 4DVAR, which requires a cost function and its gradient, quantum annealers require only the cost function. However, the cost function must be represented by binary variables (0 or 1). Therefore, we reformulated the 4DVAR cost function, a quadratic unconstrained optimization (QUO) problem, into a quadratic unconstrained binary optimization (QUBO) problem, which quantum annealers can solve."
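
A QUBO asks for the binary vector x that minimizes x^T Q x. The toy below, with a made-up three-variable Q matrix rather than the paper's actual formulation, brute-forces the answer that an annealer would find by sampling low-energy states.

```python
import itertools
import numpy as np

Q = np.array([[-1.0, 2.0, 0.0],    # arbitrary illustrative coefficients
              [ 0.0, -1.0, 2.0],
              [ 0.0,  0.0, -1.0]])

def energy(bits):
    x = np.array(bits)
    return x @ Q @ x               # the QUBO objective x^T Q x

best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))          # (1, 0, 1) with energy -2.0
```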

The researchers applied this QUBO approach to a series of 4DVAR experiments using a 40-variable Lorenz-96 model, a dynamical system commonly used to test data assimilation. They conducted the experiments using the D-Wave Advantage physical quantum annealer (Phy-QA) and the Fixstars Amplify simulated quantum annealer (Sim-QA). Moreover, they tested the conventionally used quasi-Newton-based iterative approaches, using the Broyden-Fletcher-Goldfarb-Shanno formula, for solving linear and nonlinear QUO problems and compared their performance to that of the quantum annealers.
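
For reference, the Lorenz-96 system is compact enough to state in a few lines. The sketch below integrates the standard 40-variable version with forcing F = 8 using a basic Runge-Kutta step; it is illustrative, not the authors' experimental code.

```python
import numpy as np

# Lorenz-96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, cyclic in i.
def tendency(x, forcing=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt):
    k1 = tendency(x)
    k2 = tendency(x + 0.5 * dt * k1)
    k3 = tendency(x + 0.5 * dt * k2)
    k4 = tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)    # rest state equal to the forcing
x[0] += 0.01             # small perturbation that chaos amplifies
for _ in range(500):
    x = rk4_step(x, 0.05)
print(x[:5])
```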

The results revealed that the quantum annealers produced analyses with accuracy comparable to conventional quasi-Newton-based approaches, but in a fraction of the time. D-Wave's Phy-QA required less than 0.05 seconds of computation, much faster than conventional approaches. However, it also exhibited slightly larger root mean square errors, which the researchers attributed to inherent stochastic quantum effects. To address this, they found that reading out multiple solutions from the quantum annealer improved stability and accuracy. They also noted that the scaling factor for quantum data assimilation, which is important for regulating analysis accuracy, differed between the D-Wave Phy-QA and the Sim-QA, owing to the stochastic quantum effects associated with the former annealer.

These findings signify the role of quantum computers in reducing the computational cost of data assimilation. "Our approach could revolutionize future NWP systems, enabling a deeper understanding and improved predictions with much less computational time. In addition, it has the potential to advance the practical applications of quantum annealers in solving complex optimization problems in earth science," remarks Prof. Kotsuki.

Overall, the proposed innovative method holds great promise for inspiring future applications of quantum computers in advancing data assimilation, potentially leading to more accurate weather predictions.

About Professor Shunji Kotsuki

Dr. Shunji Kotsuki is currently a Professor at the Institute for Advanced Academic Research (IAAR), Chiba University, leading "Environmental Prediction Science." He received his B.S. (2009), M.S. (2011), and Ph.D. (2013) degrees in civil engineering from Kyoto University. He has over 40 publications and has received over 500 citations. Dr. Kotsuki is a leading scientist in data assimilation and deep learning for numerical weather prediction, with over ten years of research experience in the development of the global atmospheric data assimilation system (a.k.a. NICAM-LETKF). His research interests include data assimilation mathematics, model parameter estimation, observation diagnosis including impact estimates, satellite data analysis, hydrological modeling, and atmospheric and hydrological disaster predictions. He is currently the project manager for Goal 8 of Japan's Moonshot Program, where he leads an interdisciplinary research team. This team includes experts in meteorology, disaster mathematics, information science, computer vision, ethics, and legal studies, all working together to achieve a weather-controlled society.

Journal: Nonlinear Processes in Geophysics
Method of Research: Computational simulation/modeling
Subject of Research: Not applicable
Article Title: Quantum Data Assimilation: A New Approach to Solve Data Assimilation on Quantum Annealers
Article Publication Date: 7-Jun-2024
COI Statement: The authors have no competing interests to declare.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


Better Qubits: Quantum Breakthroughs Powered by Silicon Carbide – SciTechDaily

By U.S. Department of Energy June 14, 2024

Artist's representation of the formation pathway of vacancy complexes for spin-based qubits in the silicon carbide host lattice and, to the right, the associated energy landscape. Credit: University of Chicago

Quantum computers, leveraging the unique properties of qubits, outperform classical systems by simultaneously existing in multiple states. Focused research on silicon carbide aims to optimize qubits for scalable application, with studies revealing new methods to control and enhance their performance. This could lead to breakthroughs in large-scale quantum computing and sensor technologies.

While conventional computers use classical bits for calculations, quantum computers use quantum bits, or qubits, instead. While classical bits can have the values 0 or 1, qubits can exist in a mix of probabilities of both values at the same time. This makes quantum computing extremely powerful for problems conventional computers aren't good at solving. To build large-scale quantum computers, researchers need to understand how to create and control materials that are suitable for industrial-scale manufacturing.

Semiconductors are very promising qubit materials. Semiconductors already make up the computer chips in cell phones, computers, medical equipment, and other applications. Certain types of atomic-scale defects, called vacancies, in the semiconductor silicon carbide (SiC) show promise as qubits. However, scientists have a limited understanding of how to generate and control these defects. By using a combination of atomic-level simulations, researchers were able to track how these vacancies form and behave.

Quantum computing could revolutionize our ability to answer challenging questions. Existing small-scale quantum computers have given a glimpse of the technology's power. To build and deploy large-scale quantum computers, researchers need to know how to control qubits made of materials that make technical and economic sense for industry.

The research identified the stability and molecular pathways to create the desired vacancies for qubits and determine their electronic properties.

These advances will help the design and fabrication of spin-based qubits with atomic precision in semiconductor materials, ultimately accelerating the development of next-generation large-scale quantum computers and quantum sensors.

The next technological revolution in quantum information science requires researchers to deploy large-scale quantum computers that ideally can operate at room temperature. The realization and control of qubits in industrially relevant materials is key to achieving this goal.

In the work reported here, researchers studied qubits built from vacancies in silicon carbide (SiC) using various theoretical methods. Until now, researchers knew little about how to control and engineer the selective formation process for the vacancies. The barrier energies involved in vacancy migration and combination pose the most difficult challenges for theory and simulations.

In this study, a combination of state-of-the-art materials simulations and a neural-network-based sampling technique led researchers at the Department of Energy's (DOE) Midwest Center for Computational Materials (MICCoM) to discover the atomistic generation mechanism of qubits from spin defects in a wide-bandgap semiconductor.

The team showed the generation mechanism of qubits in SiC, a promising semiconductor with long qubit coherence times and all-optical spin initialization and read-out capabilities.

MICCoM is one of the DOE Computational Materials Sciences centers across the country that develops open-source, advanced software tools to help the scientific community model, simulate, and predict the fundamental properties and behavior of functional materials. The researchers involved in this study are from Argonne National Laboratory and the University of Chicago.

Reference: "Stability and molecular pathways to the formation of spin defects in silicon carbide" by Elizabeth M. Y. Lee, Alvin Yu, Juan J. de Pablo and Giulia Galli, 3 November 2021, Nature Communications. DOI: 10.1038/s41467-021-26419-0

This work was supported by the Department of Energy (DOE) Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division and is part of the Basic Energy Sciences Computational Materials Sciences Program in Theoretical Condensed Matter Physics. The computationally demanding simulations used several high-performance computing resources: Bebop in Argonne National Laboratory's Laboratory Computing Resource Center; the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility; and the University of Chicago's Research Computing Center. The team was awarded access to ALCF computing resources through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. Additional support was provided by NIH.


European telecoms leading the way in quantum tech adoption, report finds – TNW

Say "quantum technologies" and most people probably still imagine something decades into the future. But, as a new report released today demonstrates, quantum is already here, especially as it relates to the telecom industry.

After years of incremental progress confined to research institutions, the emerging quantum technology sector has begun to gather commercial momentum. While most of the developments have been related to the quantum computing domain and its future promises, there are many other use cases for quantum tech applicable already today.

Quantum communications, including networks and forms of encryption, are currently being commercialised by a growing number of major telecom industry players and startups throughout the world. And Europe has a major part to play.

According to a report released today by Infinity, a startup and ecosystem support branch of Quantum Delta NL, 32% of the 100 quantum startups, scaleups, and SMEs servicing the telecom and telecom infrastructure sector are based in continental Europe. Germany, the Netherlands, France, Switzerland, and Spain are the strongest ecosystems. An additional 14% are in the UK and Ireland.

In addition, 50% of the enterprises that serve as consumers of the technology are located in continental Europe, with a further 11% in the UK and Ireland. Indeed, there are already more than 25 quantum networks being deployed in Europe today.

This includes a commercial quantum network in London, launched through a partnership between BT and Toshiba Europe, and an EU-wide quantum communications network being developed by Deutsche Telekom and two consortia called Petrus and Nostradamus.

"Telecom companies are becoming a driving force for real-world adoption of quantum technology," said Teun van der Veen, Quantum Lead at the Netherlands Organisation for Applied Scientific Research (TNO). "They are at the forefront of integrating quantum into existing infrastructures, and for them it is all about addressing end-user needs."

Quantum networks utilise the unique properties of quantum mechanics, such as superposition and entanglement, to connect systems and transmit data securely. This is done through quantum channels, which can be implemented using optical fibres, free-space optics, or satellite links.

The promise of quantum networks and quantum encryption is that they would be near-impossible, if not entirely impossible, to hack, thus offering ultra-secure forms of communication.

As Infinity's report states, they can be used to establish quantum-secure links between data centres, Earth and spacecraft and satellites, military and governments, trains and rail network control centres, hospital and health care sites, and more.

Quantum networks can also form the backbone of a global quantum internet, connecting quantum computers in different locations. Furthermore, they can offer opportunities for blind cloud quantum computing, which keeps quantum operations a secret to everyone but the user.

With geopolitical tensions on the rise and looming cybersecurity threats, companies and governments are increasingly looking into ways of securing IT infrastructure and data.

Perhaps unsurprisingly, then, Infinity's report finds that Quantum Key Distribution (QKD) is the most popular use of quantum technology in the telecom sector. QKD utilises quantum mechanics to allow parties to generate a key that is known only to them and is used to encrypt and decrypt messages.
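
A toy simulation of the sifting step in BB84, the canonical QKD protocol, shows where that shared key comes from. This idealized sketch omits the photon transmission itself, channel noise, and eavesdropping checks.

```python
import secrets

n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 rectilinear, 1 diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# When bases match, Bob's measurement reproduces Alice's bit exactly;
# mismatched rounds are discarded during the public basis comparison.
sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
              if a == b]
print(f"kept {len(sifted_key)} of {n} bits as shared key material")
```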

One startup that knows a lot about QKD technology is Q*Bird. The Delft-based communications security company just raised €2.5mn to further develop its QKD product Falqon, already in trial with the Port of Rotterdam (the largest port in Europe).

"Quantum communications solutions see increased interest across digital infrastructure in the EU," said Ingrid Romijn, co-founder and CEO of Q*Bird. "Together with partners like Cisco, Eurofiber, Intermax, Single Quantum, Portbase and InnovationQuarter, Q*Bird is already testing quantum secure communications in the Port of Rotterdam using our novel quantum cryptography (QKD) technology."

Romijn further stated that moving forward, more industries and companies will be able to implement scalable solutions protecting data communications, leveraging next-generation QKD technology.

Another technology garnering interest is post-quantum cryptography (PQC). Q-day (the day when a quantum computer breaks the internet) is, in all probability, still some way into the future.

However, most classical cryptography methods will be vulnerable to hacking from a sufficiently powerful quantum computer sooner. PQC algorithms are designed to be secure against both classical and quantum attacks.

Other technologies with potential applications for the telecom industry are quantum sensors, clocks, simulation, random number generation, and, naturally, quantum computing.

Meanwhile, despite the increasing market interest, the report also finds that Europe's quantum technology startups require more support and investment to help achieve the technical and market breakthroughs needed to drive the field forward.

Currently, only 42% of the quantum tech for telecom startups worldwide have external funding, having raised a total of €1.9bn between them. And despite the relatively forward-thinking approach of the EU, as demonstrated by the Deutsche Telekom network project, the US still leads in terms of private sector activity and investment.

Other challenges include raising awareness among business leaders, growing the skilled workforce, overcoming technical limitations, and building a stronger business narrative.

These can be surmounted partially through more regulatory standardisation, more collaboration with industry, and more early-stage support and investment for startups, the report says.

The key market opportunities for the quantum communications sector going forward are in government bodies including military and security services, financial institutions, and critical infrastructure departments, as well as companies in the energy, defence, space, and technology sectors.

"Growing collaboration between enterprises and startups in telecom signals the industry's commitment to integrating quantum solutions into commercial applications," said Pavel Kalinin, Operations and Platforms Lead at Infinity. "Successful implementation of such technologies will depend on coordinated efforts to prepare the workforce, facilitate collaborations, and set industry benchmarks and standards."

You can read the report in its entirety here.


Quantum, AI Combine to Transform Energy Generation, AI Summit London – AI Business

The electrical grid is very complicated. Nobody ever thinks about it until it doesn't work. But it is critical infrastructure that runs minute-to-minute: energy being consumed now was generated milliseconds ago, somewhere far away, instantaneously shot through power lines and delivered.

This gets more complicated when locally generated sustainable energy joins the mix, pushing it beyond the capabilities of classical computing solutions. Home energy supplier E.ON is trialing quantum computer solutions to manage this future grid.

Speaking at the AI Summit London, E.ON chief quantum scientist Corey O'Meara explained the challenges presented by future decentralized grids.

"The way grids are changing now is, if buildings have solar panels on the roofs, you want to use that renewable energy yourself, or you might want to inject that back into the grid to power your neighbor's house," he said.

This decentralized energy production and peer-to-peer energy-sharing model presents a massive overhead for an aging grid that was never meant to be digital. E.ON is working on solving this renewable energy integration optimization problem using quantum computing.

E.ON also uses AI extensively and some functions could in the future be enhanced using quantum computing. An important example is AI-driven predictive maintenance for power plants.


"Power plants are complex objects that have thousands of sensors that measure and monitor factors such as temperatures and pressures and store the data in the cloud. We have AI solutions to analyze them to make sure that they're functioning correctly," said O'Meara.

"We published a paper where we invented a novel anomaly detection algorithm using quantum computing as a subroutine. We used it with our gas turbine data as well as academic benchmark data sets from the computer science field and found that the quantum-augmented solution did perform better, but only for certain metrics."

E.ON plans to develop this trial into an integrated quantum software solution that could run on today's noisy, intermediate-scale quantum computers rather than waiting for next-generation fully error-corrected devices.


Quantum Computers May Break Bitcoin by 2030, But We Won’t Know About It – Cryptonews

Last updated: June 13, 2024 09:00 EDT

Quantum computers might sound like another buzzword in the tech world, yet their threat to cryptocurrency is very real and approaching fast. Scientists may differ on the timeline, but they all agree: Q-day is not a matter of if, but when.

We've spoken to quantum experts around the world to hear the latest estimates on when it will happen, what can be done to protect cryptocurrency, and whether these powerful machines could somehow benefit the crypto world.

Unlike traditional computers, which use bits as the smallest unit of data, each bit being a 1 or a 0, quantum computers use quantum bits, or qubits. These qubits can exist in 0 and 1 states or in multiple states at once, a property called superposition.

This allows quantum computers to perform calculations simultaneously and process large amounts of data much faster than standard computers.

As quantum computers can hold and process many possible outcomes at once, they reduce the time needed to solve problems that depend on trying many different solutions, such as factoring large numbers, which is the foundation of most cryptocurrency encryption.

Factoring large numbers, or integer factorization, is a mathematical process of breaking down a large number into smaller, simpler numbers called factors, which, when multiplied together, result in the original number. The process is called prime factorization if these integers are further restricted to prime numbers.
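
A tiny example shows the asymmetry: multiplying two primes is instant, while recovering them by naive trial division takes work that grows exponentially with the number's bit length. The primes below are small illustrative values.

```python
p, q = 104729, 1299709     # small illustrative primes
n = p * q                  # the easy direction: one multiplication

def factor(n):
    # Naive trial division; real keys use ~2048-bit moduli, far beyond
    # this kind of brute-force search on classical hardware.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(factor(n))           # (104729, 1299709), recovered only by search
```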

In cryptocurrency, security heavily relies on the mathematical relationship between private and public keys. A public key is a long string of characters associated with the wallet address. It can be shared openly. A private key, used to sign transactions, must remain confidential. This mathematical relationship is one-way, meaning that a public key can be derived from the private key but not the other way around.

Itan Barmes, who is the Global quantum cyber readiness capability lead at Deloitte, explained in a conversation with Cryptonews:

"The quantum computer breaks this one-way relationship between the two. So, if you have someone's public key, you can calculate their private key, impersonate them, transfer their funds elsewhere."

The task is currently nearly impossible for conventional computers. However, in 1994, mathematician Peter Shor showed that a quantum computer could solve the factoring problem much faster. Shor's algorithm can also solve the Discrete Logarithm Problem, which is the basis for the security of most blockchains. This means that if such a powerful quantum computer existed, it could break the cryptocurrency security model.
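
The one-way relationship is easy to demonstrate on secp256k1, the curve Bitcoin uses. This sketch relies on the third-party Python ecdsa package and a placeholder message; deriving the public key from the private key is one line, while the reverse is exactly the problem Shor's algorithm would make tractable.

```python
from ecdsa import SigningKey, SECP256k1  # pip install ecdsa

private_key = SigningKey.generate(curve=SECP256k1)  # must stay secret
public_key = private_key.get_verifying_key()        # safe to publish

signature = private_key.sign(b"transfer 1 BTC")     # placeholder message
assert public_key.verify(signature, b"transfer 1 BTC")

# Easy direction: private -> public. The reverse requires solving the
# elliptic curve discrete logarithm problem.
print(public_key.to_string().hex()[:32], "...")
```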

Not all cryptocurrencies would face the same level of risk from quantum attacks. In 2020, Itan Barmes and a team of Deloitte researchers examined the entire Bitcoin blockchain to determine how many coins were vulnerable. They discovered that about 25% of Bitcoins could be at risk.

Pay To Public Key (P2PK): These addresses directly use the public key, making them visible and vulnerable to quantum attacks.

Pay to Pubkey Hash (P2PKH): These addresses use a cryptographic hash of the public key. They don't expose the public key directly until coins are moved.
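
The difference is essentially one of hashing, as this sketch with Python's hashlib and a placeholder public key illustrates; note that RIPEMD-160 availability depends on the local OpenSSL build.

```python
import hashlib

pubkey = bytes.fromhex("02" + "11" * 32)   # placeholder compressed public key

# A P2PKH address commits only to HASH160(pubkey) = RIPEMD160(SHA256(pubkey)),
# so the key itself stays hidden until the coins are spent.
sha = hashlib.sha256(pubkey).digest()
hash160 = hashlib.new("ripemd160", sha).hexdigest()

print("P2PK exposes:      ", pubkey.hex())
print("P2PKH exposes only:", hash160)
```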

Vulnerable coins include those held in P2PK (Pay To Public Key) addresses, which directly expose the public key, making them easy targets for a quantum attack. Coins in reused P2PKH (Pay to Pubkey Hash) addresses are also at risk because these addresses display their public key when the owner moves the funds. This attack is called the storage attack, as it applies to coins residing in static addresses. Itan Barmes further explained:

"A quantum attack only applies to specific coins, not everything. If we conducted the same research today, the percentage of vulnerable coins would be lower because the number of vulnerable addresses remains more or less the same, but due to mining, there are more coins in circulation."

Itan Barmes added that in addition to the storage attack, there is also an attack on active transactions, as the public key is exposed for the first time.

"Such an attack must be performed within the mining time (for Bitcoin, around 10 minutes), which adds a requirement for the quantum computer to not only be powerful enough but also fast. This so-called transit attack is likely to be possible later than the storage attack due to this additional requirement."

Ideally, Bitcoin users should generate a new address for each transaction. Yet recent research by BitMEX suggests that about 50% of transaction outputs still go to previously used addresses, which means the practice of address reuse is more common in Bitcoin transactions than we may think.

Are we nearing the point where quantum computers can pose a real threat? In 2017, a group of researchers, including Divesh Aggarwal and Gavin Brennen, published an article warning that the elliptic curve signature scheme used by Bitcoin could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates.

Cryptonews reached out to the authors to ask whether their estimation has shifted. Gavin Brennen from Macquarie University in Australia replied that although a lot has changed in the quantum computing space since then, the basic message is still the same:

"Quantum computers pose a threat to blockchains, primarily by attacks on digital signatures, and cryptocurrencies should get started sooner rather than later to upgrade their systems to use post-quantum cryptography before their asset valuations are threatened."

To be able to break cryptocurrency security, quantum computers will likely need thousands, if not millions, of qubits. Currently, the most advanced machines have around 1000.

Another critical challenge is error reduction. Quantum bits are highly sensitive to their environment; even the slightest disturbance, like a change in temperature or vibration, can cause errors in computations, a problem known as quantum decoherence.

Dozens of companies, both public and private, are now actively advancing the development of large quantum computers. IBM has ambitious plans to build a 100,000-qubit chipset and 100 million gates by the end of this decade.

PsiQuantum aims to achieve 1 million photonic qubits within the same timeframe. Quantum gate fidelities and quantum error correction have also significantly advanced. Gavin Brennen continued:

"What all this means is that estimates on the size of quantum computers needed to crack the 256-bit elliptic curve digital signatures used in Bitcoin have dropped from 10-20 million qubits to around a million. One article published by the French quantum startup Alice & Bob estimates that it could be cracked with 126,000 physical qubits, though that does assume a highly specialized error model for the quantum computer. In my opinion, a plausible timeline for cracking 256-bit digital signatures is by the mid-2030s."

Gavin Brennen added that substantial technological improvements would be required to reduce all types of gate errors, connect modules, and combine fast classical and quantum control, which is a challenging but surmountable problem.

Yet, if quantum technology becomes powerful enough to break cryptocurrency security, we may not even know about it, believes Marcos Allende, a quantum physicist and CTO of the LACChain Global Alliance. In an email conversation with Cryptonews, Allende wrote:

"What is certain is that those who reach that power first will use it silently, making it impossible to guess that selected hackings are happening because of having quantum computers."

Many scientists remain skeptical about the quantum threat to cryptocurrency. Winfried Hensinger, a physicist at the University of Sussex in Brighton, UK, speaking to Nature magazine, dismissed current quantum computers: "They're all terrible. They can't do anything useful."

Several challenges keep quantum computing from reaching its full potential. The delicate nature of qubits makes it difficult to maintain them in a quantum state for extended periods. Another challenge is cooling requirements. Many quantum processors must operate at temperatures close to absolute zero, which means they need complicated and costly refrigeration technology. Finally, the quantum systems would need to be integrated with the existing classical ones.

"Just having 200 million qubits not connected to each other is not going to do anything. There are a lot of fundamental physics problems that need to be resolved before we get there. We are still very much at the beginning. But even in the past year, there's been tremendous improvement. The technology can accelerate in a way that all the timelines will be much shorter than we expect," Itan Barmes told Cryptonews.

Tommie van der Bosch, Partner at Deloitte and Blockchain & Digital Asset Leader of Deloitte North and South Europe, believes that the question is not if quantum computing will break cryptocurrency security but when: "The fact that it's a possibility is enough to start taking action. You should have a plan."

Indeed, this year several key crypto companies and the World Economic Forum (WEF) have shared concerns about the implications of quantum computing on cryptocurrency security.

The WEF, in a post published in May, warned that central bank digital currency (CBDC) could become a prime target for quantum attacks. Ripple's recent report has also said that quantum computers could break the digital signatures that currently protect blockchain assets.

Earlier this year, Ethereum co-founder Vitalik Buterin suggested the Ethereum blockchain would need to undergo a recovery fork to avoid the scenario in which bad actors gain access to quantum computers and are able to use them to steal users' funds.

To protect against these potential quantum attacks, blockchain systems will need to integrate post-quantum cryptographic algorithms. However, incorporating them into existing blockchain protocols is not easy.

New cryptographic methods must first be developed, tested, and standardized. This process can take years and requires the consensus of the cryptographic community to ensure the new methods are secure and efficient.

In 2016, the National Institute of Standards and Technology (NIST) started a project to set new standards for post-quantum cryptography. The project aims to finalize these standards later this year. In 2022, three digital signature methods, CRYSTALS-Dilithium, FALCON, and SPHINCS+, were chosen for standardization.

Once standardized, these new cryptographic algorithms need to be implemented within the blockchains existing framework. After that, all network participants need to adopt the updated protocol.
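
As a sketch of what adopting a standardized scheme could look like, here is a sign-and-verify round trip with CRYSTALS-Dilithium, assuming the open-source liboqs-python bindings; class and method names may differ between versions, so treat the API usage as indicative.

```python
import oqs  # assumed: pip install liboqs-python (requires liboqs)

message = b"post-quantum signed transaction"

with oqs.Signature("Dilithium2") as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

with oqs.Signature("Dilithium2") as verifier:
    assert verifier.verify(message, signature, public_key)

# Dilithium signatures run to kilobytes, versus roughly 72 bytes for
# ECDSA -- the size overhead discussed below.
print(f"signature size: {len(signature)} bytes")
```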

Itan Barmes explained, "Let's say someone could tell us exactly the date, three years from now, when we will have these kinds of quantum computers. How quickly do you think we can change the Bitcoin protocol to make it resilient to these attacks? The decentralized governance of Bitcoin can turn out to be a double-edged sword, by preventing timely action."

Quantum-resistant algorithms often require more processing power and larger key sizes, which could lead to performance issues on the blockchain. These include slower transaction times and increased computational requirements for mining and verification processes.

Tommie van der Bosch told Cryptonews that, ultimately, the rise of quantum computing could affect the entire economic model of cryptocurrencies.

Coins that upgrade to quantum-resistant protocols in time might gain a competitive advantage. Investors and users could prefer these quantum-safe cryptocurrencies, as they may see them as more secure long-term holdings. This shift could lead to an increase in demand for such cryptocurrencies, potentially enhancing their value and market share compared to those that are slower to adapt. Tommie van der Bosch told Cryptonews:

"Let's draw a parallel with the banking system. We've all seen the effects of a bank collapsing, or even the rumor of one. Your money suddenly seems at risk. How quickly do people shift their assets? It can trigger a domino effect."

The development of quantum computing could also bring regulatory changes. Regulators could start enforcing stricter standards around trading and custody of cryptocurrencies that havent updated their cryptographic protocols. Such measures would aim to protect investors from sinking funds into potentially vulnerable assets.

Itan Barmes remarked, "Not many people are aware that the cryptographic algorithm used in Bitcoin and essentially all cryptocurrencies is not part of the NIST recommendation (NIST SP800-186). The issue is already present if organizations require compliance to NIST standards. The issue becomes even more complex if algorithms need to be replaced: whose responsibility is it to replace them?"

Could quantum computing actually benefit the cryptocurrency industry? Gavin Brennen suggests it might. In an email exchange with Cryptonews, Brennen discussed the development of quantum-enabled blockchains.

Quantum computers could accelerate mining, although Brennen notes that the improvement over traditional mining rigs would be limited and would require quantum computers with hundreds of millions of qubits, far beyond current capabilities.

New computational problems have been suggested, like the boson sampling problem, that are slow for all types of classical computers but would be fast on a quantum device. Interestingly, the boson sampler is a small, specialized processor that uses photons of light; it is not as powerful as a full quantum computer but is much cheaper to build, and it solves a problem immune to ASIC speedups with an energy footprint that is orders of magnitude lower for reaching PoW consensus.

Currently, proof-of-work (PoW) requires vast amounts of electrical power for mining, raising concerns about sustainability and environmental impact. Boson sampling could become a greener alternative, significantly reducing the energy footprint of blockchain operations while maintaining security and efficiency.
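The classical hardness behind boson sampling comes from the matrix permanent: the probability of each output configuration is proportional to the squared permanent of a submatrix of the device's transfer matrix, and the best known exact classical algorithms run in exponential time. Below is a small sketch of Ryser's formula, purely to illustrate that scaling, not a mining implementation:

```python
from itertools import combinations

# Ryser's formula for the matrix permanent, the quantity whose
# classical hardness underpins boson sampling. Runtime grows as
# O(2^n * n^2); the permanent is #P-hard in general.

def permanent(A):
    n = len(A)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# Sanity check: perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10.
print(permanent([[1.0, 2.0], [3.0, 4.0]]))  # 10.0
```

Each additional photon roughly doubles the classical work, and that asymmetry between a cheap photonic device and any classical simulator is what a boson-sampling-based proof of work would exploit.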

More here:
Quantum Computers May Break Bitcoin by 2030, But We Won't Know About It - Cryptonews


Reduce AI Hallucinations With This Neat Software Trick – WIRED

To start off, not all RAGs are of the same caliber. The accuracy of the content in the custom database is critical for solid outputs, but that isn't the only variable. "It's not just the quality of the content itself," says Joel Hron, a global head of AI at Thomson Reuters. "It's the quality of the search, and retrieval of the right content based on the question." Mastering each step in the process is critical, since one misstep can throw the model completely off.
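As a rough illustration of the retrieval step Hron describes, here is a minimal sketch using TF-IDF similarity; the document snippets, question, and prompt template are invented for the example, and production systems typically use dense embeddings and a vector database:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for the custom database.
documents = [
    "The notice period for commercial leases is 90 days.",
    "Residential tenants may terminate with 30 days of notice.",
    "Security deposits must be returned within 21 days.",
]
question = "How much notice must a residential tenant give?"

# Rank documents by lexical similarity to the question.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
best = scores.argmax()

# The retrieved passage is placed in the prompt so the model answers
# from it rather than from its parametric memory.
prompt = (
    f"Answer using only this context:\n{documents[best]}\n\n"
    f"Question: {question}"
)
print(prompt)
```

If retrieval surfaces the wrong passage, everything downstream inherits the error, which is exactly the misstep Hron warns about.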

"Any lawyer who's ever tried to use a natural language search within one of the research engines will see that there are often instances where semantic similarity leads you to completely irrelevant materials," says Daniel Ho, a Stanford professor and senior fellow at the Institute for Human-Centered AI. Ho's research into AI legal tools that rely on RAG found a higher rate of mistakes in outputs than the companies building the models reported.

Which brings us to the thorniest question in the discussion: How do you define hallucinations within a RAG implementation? Is it only when the chatbot generates a citation-less output and makes up information? Is it also when the tool overlooks relevant data or misinterprets aspects of a citation?

According to Lewis, hallucinations in a RAG system boil down to whether the output is consistent with what the model found during data retrieval. The Stanford research into AI tools for lawyers, though, broadens this definition a bit by examining whether the output is grounded in the provided data as well as whether it's factually correct, a high bar for legal professionals who are often parsing complicated cases and navigating complex hierarchies of precedent.
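One way to make the narrower definition concrete is a groundedness check that compares each answer sentence against the retrieved context. The heuristic below is a deliberately crude sketch (word overlap against an arbitrary threshold); real evaluations rely on entailment models, LLM judges, or expert review:

```python
# Crude groundedness heuristic: flag answer sentences whose content
# words barely overlap the retrieved context. This only illustrates
# the idea; it cannot catch subtle misreadings of a citation.

STOPWORDS = {"the", "a", "an", "is", "are", "of", "to", "in", "and", "with"}

def grounded(sentence: str, context: str, threshold: float = 0.5) -> bool:
    words = {w.lower().strip(".,") for w in sentence.split()} - STOPWORDS
    context_words = {w.lower().strip(".,") for w in context.split()}
    return not words or len(words & context_words) / len(words) >= threshold

context = "Residential tenants may terminate with 30 days of notice."
answers = [
    "Residential tenants must give 30 days of notice.",  # supported
    "Commercial tenants must give 90 days of notice.",   # not in context
]
for sentence in answers:
    print(sentence, "->", "grounded" if grounded(sentence, context) else "check")
```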

While a RAG system attuned to legal issues is clearly better at answering questions on case law than OpenAI's ChatGPT or Google's Gemini, it can still overlook the finer details and make random mistakes. All of the AI experts I spoke with emphasized the continued need for thoughtful human interaction throughout the process to double-check citations and verify the overall accuracy of the results.

Law is an area where there's a lot of activity around RAG-based AI tools, but the process's potential is not limited to a single white-collar job. "Take any profession or any business. You need to get answers that are anchored on real documents," says Arredondo. "So, I think RAG is going to become the staple that is used across basically every professional application, at least in the near to mid-term." Risk-averse executives seem excited about the prospect of using AI tools to better understand their proprietary data without having to upload sensitive info to a standard, public chatbot.

It's critical, though, for users to understand the limitations of these tools, and for AI-focused companies to refrain from overpromising the accuracy of their answers. Anyone using an AI tool should still avoid trusting the output entirely and should approach its answers with a healthy sense of skepticism, even if the answer is improved through RAG.

"Hallucinations are here to stay," says Ho. "We do not yet have ready ways to really eliminate hallucinations." Even when RAG reduces the prevalence of errors, human judgment reigns paramount. And that's no lie.

See the rest here:

Reduce AI Hallucinations With This Neat Software Trick - WIRED


Accelerating the next wave of generative AI startups | Amazon Web Services – AWS Blog

Since day one, AWS has helped startups bring their ideas to life by democratizing access to the technology powering some of the largest enterprises around the world, including Amazon. Each year since 2020, we have provided startups nearly $1 billion in AWS Promotional Credits. It's no coincidence, then, that 80% of the world's unicorns use AWS. I am lucky to have had a front-row seat to the development of so many of these startups over my time at AWS, companies like Netflix, Wiz, and Airtasker. And I'm enthusiastic about the rapid pace at which startups are adopting generative artificial intelligence (AI) and how this technology is creating an entirely new generation of startups. A staggering 96% of AI/ML unicorns run on AWS.

These generative AI startups have the ability to transform industries and shape the future, which is why today we announced a commitment of $230 million to accelerate the creation of generative AI applications by startups around the world. We are excited to collaborate with visionary startups, nurture their growth, and unlock new possibilities. In addition to this monetary investment, today we're also announcing the second annual AWS Generative AI Accelerator in partnership with NVIDIA. This global 10-week hybrid program is designed to propel the next wave of generative AI startups. This year, we're expanding the program 4x to serve 80 startups globally. Selected participants will each receive up to $1 million in AWS Promotional Credits to fuel their development and scaling needs. The program also provides go-to-market support as well as business and technical mentorship. Participants will tap into a network that includes domain experts from AWS as well as key AWS partners such as NVIDIA, Meta, Mistral AI, and venture capital firms investing in generative AI.

In addition to these programs, AWS is committed to making it possible for startups of all sizes and developers of all skill levels to build and scale generative AI applications with the most comprehensive set of capabilities across the three layers of the generative AI stack. At the bottom layer of the stack, we provide infrastructure to train large language models (LLMs) and foundation models (FMs) and produce inferences or predictions. This includes the best NVIDIA GPUs and GPU-optimized software, custom machine learning (ML) chips including AWS Trainium and AWS Inferentia, and Amazon SageMaker, which greatly simplifies the ML development process. In the middle layer, Amazon Bedrock makes it easier for startups to build secure, customized, and responsible generative AI applications using LLMs and other FMs from leading AI companies. And at the top layer of the stack, we have Amazon Q, the most capable generative AI-powered assistant for accelerating software development and leveraging companies' internal data.
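As a rough sketch of what building on that middle layer looks like, the snippet below calls a Claude 3 model through the Bedrock runtime API with boto3. The region, model ID, and prompt are illustrative and depend on which models are enabled in a given account:

```python
import json
import boto3

# Minimal Amazon Bedrock call: send one user message to a Claude 3
# model and print the reply. Assumes AWS credentials are configured
# and the model has been enabled in this region.

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize the value of RAG in one line."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```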

Customers are innovating using technologies across the stack. For instance, during my time at the VivaTech conference in Paris last month, I sat down with Michael Chen, VP of Strategic Alliances at PolyAI, which offers customized voice AI solutions for enterprises. PolyAI develops natural-sounding text-to-speech models using Amazon SageMaker, builds on Amazon Bedrock to ensure responsible and ethical AI practices, and uses Amazon Connect to integrate its voice AI into customer service operations.

At the bottom layer of the stack, NinjaTech uses Trainium and Inferentia2 chips, along with Amazon SageMaker, to build, train, and scale custom AI agents. From conducting research to scheduling meetings, these AI agents save time and money for NinjaTech's users by bringing the power of generative AI into their everyday workflows. I recently sat down with Sam Naghshineh, co-founder and CTO, to discuss how this approach delivers those savings.

Leonardo.AI, a startup from the 2023 AWS Generative AI Accelerator cohort, is also harnessing the capabilities of AWS Inferentia2 to enable artists and professionals to produce high-quality visual assets with unmatched speed and consistency. By reducing their inference costs without sacrificing performance, Leonardo.AI can offer their most advanced generative AI features at a more accessible price point.

Leading generative AI startups, including Perplexity, Hugging Face, AI21 Labs, Articul8, Luma AI, Hippocratic AI, Recursal AI, and DatologyAI are building, training, and deploying their models on Amazon SageMaker. For instance, Hugging Face used Amazon SageMaker HyperPod, a feature that accelerates training by up to 40%, to create new open-source FMs. The automated job recovery feature helps minimize disruptions during the FM training process, saving them hundreds of hours of training time a year.

At the middle layer, Perplexity leverages Amazon Bedrock with Anthropic's Claude 3 to build its AI-powered search engine. Bedrock ensures robust data protection, ethical alignment through content filtering, and scalable deployment of Claude 3. Meanwhile, Nexxiot, an innovator in transportation and supply chain solutions, quickly moved its Scope AI assistant to Amazon Bedrock with Anthropic Claude to give customers real-time, conversational insights into their transport assets.

At the top layer, Amazon Q Developer helps developers at startups build, test, and deploy applications faster and more efficiently, allowing them to focus their valuable energy on driving innovation. Ancileo, an insurance SaaS provider for insurers, re-insurers, brokers, and affinity partners, uses Amazon Q Developer to reduce the time to resolve coding-related issues by 30% and is integrating ticketing and documentation with Amazon Q to speed up onboarding and allow anyone in the company to quickly find answers. Amazon Q Business enables everyone at a startup to be more data-driven and make better, faster decisions using the organization's collective knowledge. Brightcove, a leading provider of cloud video services, deployed Amazon Q Business to streamline its customer support workflow, allowing the team to expedite responses, provide more personalized service, and ultimately enhance the customer experience.

The future of generative AI belongs to those who act now. The application window for the AWS Generative AI Accelerator program is open from June 13 to July 19, 2024, and we'll be selecting a global cohort of the most promising generative AI startups. Don't miss this unique chance to redefine what's possible with generative AI, and apply now!

Other helpful resources include:

Twitch series: Let's Ship It with AWS! Generative AI

AWS Generative AI Accelerator Program: Apply now

Apply now, explore the resources, and join the generative AI revolution with AWS.

See the original post:

Accelerating the next wave of generative AI startups | Amazon Web Services - AWS Blog
