Category Archives: Quantum Computer
Quantinuum Unveils Quantum Computer Featuring 56 Trapped-Ion Qubits, Setting New Performance Benchmarks – HPCwire
BROOMFIELD, Colo., and LONDON, June 5, 2024 – Quantinuum today announced the development of the industry's first quantum computer featuring 56 trapped-ion qubits. The H2-1 model has improved its already leading fidelity.
In collaboration with JPMorgan Chase, Quantinuum ran a Random Circuit Sampling (RCS) algorithm, achieving a significant 100-fold improvement over previous industry results from Google in 2019 and setting a new benchmark for the cross-entropy metric. H2-1's combination of scale and hardware fidelity presents a challenge that today's most powerful supercomputers and other quantum computing architectures cannot easily match.
"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Rajeeb Hazra, CEO of Quantinuum. "Our focus on quality of qubits versus quantity of qubits is changing what's possible, and bringing us closer to the long-awaited commercialization of quantum's applications across industries like finance, logistics, transportation and chemistry."
Quantinuum's analysis also indicates that the H2-1 executes RCS at 56 qubits with an estimated 30,000x reduction in power consumption compared to classical supercomputers, reinforcing it as the preferred solution for a wide array of computational challenges.
"The fidelity achieved in our random circuit sampling experiment shows unprecedented system-level performance of the Quantinuum quantum computer. We are excited to leverage this high fidelity to advance the field of quantum algorithms for industrial use cases broadly, and financial use cases in particular," said Marco Pistoia, Head of Global Technology Applied Research at JPMorgan Chase.
Today's announcement is the latest in a string of breakthroughs made by Quantinuum in 2024.
"Microsoft looks forward to a continued collaboration with Quantinuum as they release their high-fidelity 56-qubit machine," said Dennis Tom, General Manager of Microsoft Azure Quantum. "Recently, the teams created four highly reliable logical qubits by applying Azure Quantum's qubit-virtualization system to Quantinuum's 32-qubit machine. With the additional physical qubits available on Quantinuum's new machine, we anticipate creating more logical qubits with even lower error rates. As we reach these milestones, we will continue to increase the resiliency of quantum operations as well as the utility of quantum computing."
To read the scientific paper, please visit: https://arxiv.org/abs/2406.02501
About Quantinuum
Quantinuum, the world's largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum's technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With almost 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents. Since its formation by Honeywell and Cambridge Quantum in 2021, Quantinuum has raised approximately $625 million to further the development and commercialization of quantum computing.
Source: Quantinuum
Quantum Computer ‘Surpasses Simulation by Supercomputer’ – IoT World Today
Quantum computing company Quantinuum has demonstrated that its quantum computer, when running a benchmark algorithm, can no longer be simulated by the world's best supercomputers.
In tests with partner JPMorgan Chase, the system also outperformed the previous best results on a well-known industry-standard benchmark of quantum computing power by 100x.
According to Quantinuum, its new H2-1 quantum computer, with 56 trapped-ion qubits, has improved fidelity and is now impossible for a classical computer to fully simulate.
In quantum computing, fidelity is a measure of the accuracy of quantum operations and indicates how closely an executed operation matches the operation it is intended to perform.
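For pure quantum states, fidelity is commonly computed as the squared overlap between the intended state and the state the hardware actually produced. The sketch below (with made-up example states, not Quantinuum's data) illustrates the definition:

```python
import numpy as np

def state_fidelity(psi, phi):
    """Fidelity |<psi|phi>|^2 between two pure quantum states."""
    psi = np.asarray(psi, dtype=complex)
    phi = np.asarray(phi, dtype=complex)
    # Normalize so the result is a proper probability in [0, 1].
    psi = psi / np.linalg.norm(psi)
    phi = phi / np.linalg.norm(phi)
    return abs(np.vdot(psi, phi)) ** 2

# Intended state: |+> = (|0> + |1>)/sqrt(2)
ideal = [1, 1]
# Hypothetical actual state after a slightly miscalibrated gate.
actual = [1, 0.95 + 0.05j]
print(state_fidelity(ideal, actual))  # close to, but below, 1.0
```

A fidelity of 1.0 means the operation did exactly what was intended; noise and calibration errors pull it below 1.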
A team from Quantinuum and JPMorgan Chase ran a benchmark algorithm known as the Random Circuit Sampling (RCS) algorithm, which is the one Google used in the claim to quantum supremacy it made in 2019.
The researchers reported achieving a 100x improvement over Google's result, setting a new world record for the cross entropy benchmark.
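The cross-entropy benchmark scores a device by how often its samples land on bitstrings the ideal circuit makes likely. A minimal sketch of the linear cross-entropy fidelity (a toy random distribution stands in for a real circuit's output; this is not Quantinuum's or Google's code):

```python
import numpy as np

def linear_xeb_fidelity(ideal_probs, samples, n_qubits):
    """Linear cross-entropy benchmark: F_XEB = 2^n * mean(p_ideal(x)) - 1.
    Roughly 1 for a perfect device, roughly 0 for uniformly random output."""
    mean_p = np.mean([ideal_probs[x] for x in samples])
    return (2 ** n_qubits) * mean_p - 1

rng = np.random.default_rng(0)
n = 10
dim = 2 ** n
# The ideal output distribution of a deep random circuit is highly uneven
# (Porter-Thomas-like); we draw a random distribution as a stand-in.
p = rng.exponential(size=dim)
p /= p.sum()
# A perfect device samples from p itself...
good = rng.choice(dim, size=5000, p=p)
# ...while a fully decohered device samples uniformly.
noise = rng.integers(dim, size=5000)
print(linear_xeb_fidelity(p, good, n))   # near 1
print(linear_xeb_fidelity(p, noise, n))  # near 0
```

The benchmark is demanding because computing `ideal_probs` classically requires simulating the circuit, which becomes infeasible at 56 qubits.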
"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Quantinuum CEO Rajeeb Hazra.
"Our focus on quality of qubits versus quantity of qubits is changing what's possible and bringing us closer to the long-awaited commercialization of quantum's applications across industries like finance, logistics, transportation and chemistry."
Quantinuum also reported that H2-1 executed the RCS algorithm using an estimated 30,000x less power than a classical supercomputer.
"The fidelity achieved in our random circuit sampling experiment shows unprecedented system-level performance of the Quantinuum quantum computer," said JPMorgan Chase head of global technology applied research Marco Pistoia.
"We are excited to leverage this high fidelity to advance the field of quantum algorithms for industrial use cases broadly and financial use cases in particular."
NIST Q&A: Getting Ready for the Post Quantum Cryptography Threat? You Should be. – HPCwire
With the National Institute of Standards and Technology (NIST) set to publish the first Post Quantum Cryptography (PQC) standards in a few weeks, attention is shifting to how to put the new quantum-resistant algorithms into practice. Indeed, the number of companies with practices to help others implement PQC is mushrooming and includes familiar (IBM, Deloitte, et al.) and unfamiliar names (QuSecure, SandboxAQ, etc.).
The Migration to Post-Quantum Cryptography project, being run out of NIST's National Cybersecurity Center of Excellence (NCCoE), is running at full tilt and includes on the order of 40 commercial participants.
In its own words, "The project will engage industry in demonstrating use of automated discovery tools to identify all instances of public-key algorithm use in an example network infrastructure's computer and communications hardware, operating systems, application programs, communications protocols, key infrastructures, and access control mechanisms. The algorithm employed and its purpose would be identified for each affected infrastructure component."
Getting to that goal remains a work in progress that started with NIST's PQC program in 2016. NIST scientist Dustin Moody leads the PQC project and talked with HPCwire about the need to take post-quantum cryptography seriously now, not later.
"The United States government is mandating its agencies to do it, but industry is also going to need to do this migration. The migration is not going to be easy [and] it's not going to be pain-free," said Moody, whose Ph.D. specialized in elliptic curves, a commonly used base for encryption. "Very often, you're going to need to use sophisticated tools that are being developed to assist with that. Also talk to your vendors, your CIOs, your CEOs to make sure they're aware and that they're planning budgets to do this. Just because a quantum computer [able to decrypt] isn't going to be built for, who knows, maybe 15 years, they may think, 'I can just put this off,' but understanding that the threat is coming sooner than you realize is important."
Estimates vary wildly around the size of the threat, but perhaps 20 billion devices will need to be updated with PQC safeguards. NIST has held four rounds of submissions, and the first set of standards will encompass algorithms selected in the first three rounds. These are the main weapons against quantum decryption attacks. The next round seeks to provide alternatives with, in some instances, less burdensome computational characteristics.
The discussion with Moody was wide-ranging, if perhaps a little dry. He covers PQC strategy and progress and the need to monitor the constant flow of new quantum algorithms; Shor's algorithm is the famous threat, but others are percolating. He notes that many submitted algorithms broke under testing but says not to make much of that, as that's the nature of the standards development process. He talks about pursuing cryptoagility and offers a few broad tips on preparation.
Moody also touched on geopolitical rivalries amid what has been a generally collaborative international effort.
"There are some exceptions, like China never trusting the United States. They're developing their own PQC standards. They're actually very, very similar to the algorithms [we're using], but they were selected internally. Russia has been doing their own thing; they don't really communicate with the rest of the world very much. I don't have a lot of information on what they're doing. China, even though they are doing their own standards, did have researchers participate in the process; they hosted one of the workshops in the field a few years back. So the community is small enough that people are very good at working together, even if sometimes a country will develop its own standards," said Moody.
How soon quantum computers will actually be able to decrypt current RSA codes is far from clear, but early confidence that it would take many decades has diminished. If you're looking for a good primer on the PQC threat, he recommended the Quantum Threat Timeline Report released in December by the Global Risk Institute (GRI) (figures from its study below).
HPCwire: Let's talk a little bit about the threat. How big is it, and when do we need to worry?
Dustin Moody: Well, cryptographers have known for a few decades that if we are able to build a big enough quantum computer, it will threaten all of the public-key cryptosystems that we use today. So it's a serious threat. We don't know when a quantum computer will be built that's large enough to attack current levels of security. There have been estimates of 10 to 15 years, but, you know, nobody knows for certain. We have seen progress in companies building quantum computers; systems from IBM and Google, for example, are getting larger and larger. So this is definitely a threat to take seriously, especially because you can't just wait until the quantum computer is built and then say, "now we'll worry about the problem." We need to solve this 10 to 15 years in advance to protect your information for the long term. The threat of harvest-now-decrypt-later helps you understand that.
HPCwire: Marco Pistoia, who leads quantum research for JPMorgan Chase, said he'd seen a study suggesting as few as 1,300 or so logical qubits might be able to break conventional RSA code, although it would take six months to do so. That was a year ago. It does seem like our ability to execute Shor's algorithm on these systems is improving, not just the brute force, but our cleverness in getting the algorithm to run.
Dustin Moody: Yep, that's true. And it'll take a lot of logical qubits, so we're not there yet. But yeah, progress has been made. You have to have the problem solved and migrate to new solutions before we ever get to that point.
HPCwire: We tend to focus on Shor's algorithm because it's a direct threat to the current encryption techniques. Are there others in the wings that we should be worried about?
Dustin Moody: There's a large number of quantum algorithms that we are aware of, Shor's being one of them and Grover's being another that has an impact on cryptography. But there are plenty of other quantum algorithms that do interesting things. So whenever anyone is designing a cryptosystem, they have to take a look at all of those and see whether any could attack the system in some way. There's a list of, I don't know, maybe around 15 or so that people potentially have to look at and figure out, "do I need to worry about these?"
HPCwire: Does NIST have that list someplace?
Dustin Moody: There was a guy at NIST who kept up such a list. I think he's at Microsoft now. It's been a little while, but he maintained something called the Quantum Algorithm Zoo.
HPCwire: Let's get back to the NIST effort to develop quantum-resistant algorithms. As I understand it, the process began around 2016 and has been iterative: you invite submissions of potential quantum-resistant algorithms from the community, then test them and come up with some selections. Three rounds have been completed, with those selections in the process of becoming standards, and a fourth round is ongoing. Walk me through the project and its progress.
Dustin Moody: So these kinds of cryptographic competitions have been done in the past to select some of the algorithms that we use today. A widely used block cipher was selected through a competition, and more recently a hash function. Back in 2016, we decided to do one of these competitions for the new post-quantum algorithms we needed standards for. We let the community know about that. The community was excited, and we got 82 submissions, of which 69 met the requirements that we'd set out. Then we had a process over six or seven years during which we evaluated them through a series of rounds. In each round, we narrowed the field down to the most promising candidates. There was a ton of work going on, both internally at NIST and by the cryptographic community, doing research, benchmarks, and experiments.
The third round had seven finalists and eight alternates and concluded in July of 2022, when we announced the items that we would be standardizing as a result; that included one encryption algorithm and three signature algorithms. We also kept a few encryption algorithms in a fourth round for further study. They weren't quite ready to be selected for standardization. That fourth round is still ongoing and will probably end this fall, and we'll pick one or two of those to also standardize. We'll have two or three encryption [methods] and three signatures as well.
HPCwire: It sounds like a relatively smooth process?
Dustin Moody: That process got a lot of attention from the community. A lot of the algorithms ended up being broken, some late in the process; that's kind of the nature of how this thing works. That's where we are now. We're just about done writing the standards for the first ones that we selected; our expectation is to publish them this summer. The fourth round will end this fall, and then we'll write standards for those, which will take another year or two.
We also have ongoing work to select a few more digital signature algorithms. The reason for that is that many of the algorithms we selected are based on what are called lattices; they're the most promising family, with good performance and good security. For signatures, we had two based on lattices and one not based on lattices. The one that wasn't based on lattices, called SPHINCS+, turns out to be bigger and slower, so for applications that needed to use it, it might not be ideal. We wanted to have a backup not based on lattices that could be used easily. That's what this ongoing digital signature process is about, [and] we're encouraging researchers to try to design new solutions that are not based on lattices and that perform better.
HPCwire: When NIST assesses these algorithms, it must look to see how many computational resources are required to run them?
Dustin Moody: There are specific evaluation criteria that we look at. Number one is security. Number two is performance. And number three is a laundry list of everything else. We have a team of experts working internally at NIST, and we try to work with cryptography and industry experts around the world who are independently doing evaluations. Sometimes we're doing joint research with them in the field.
Security has a wide number of ways to look at it. There's theoretical security, where you're trying to create security proofs, where you're trying to say: if you can break my cryptosystem, then you can break this hard mathematical problem. We can give a proof for that, and because that hard mathematical problem has been studied, that gives us a little more confidence. Then it gets complicated, because we're used to doing this with classical computers and looking at how they can attack things. But now we have to look at how quantum computers can attack things, and they don't yet exist. We don't know their performance capabilities. So we have to extrapolate and do the best that we can. But it's all thrown into the mix.
Typically, you don't end up needing supercomputers. You're able to analyze how long the attacks would take, and how many resources, if you were to fully try to break the security parameters at current levels. The parameters are chosen so that it's [practically] infeasible to do so. You can figure out that if you were to break this, it would take, you know, 100 years, so there's no use in actually trying to do that unless you find a breakthrough, a different way in. (See the descriptive list of NIST strength categories at the end of the article.)
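The back-of-envelope arithmetic Moody describes can be sketched in a few lines. The guess rate below is a made-up, deliberately generous figure, just to show why exhaustive key search against standardized parameter sizes is dismissed on paper rather than attempted:

```python
def years_to_exhaust(key_bits, guesses_per_second):
    """Expected wall-clock years to search half of a key space
    (the average work to find a key) at a given guess rate."""
    seconds = (2 ** (key_bits - 1)) / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# Even at a hypothetical 10^18 guesses per second, far beyond any real
# machine, classical key search on a 128-bit key is hopeless:
print(f"{years_to_exhaust(128, 1e18):.2e} years")
```

Each extra key bit doubles the work, which is why the analysis is done with resource estimates rather than actual attack runs.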
HPCwire: Do you test on today's NISQ (noisy intermediate-scale quantum) computers?
Dustin Moody: They're too small right now to really have any impact in assessing how a larger quantum computer would fare against concrete parameters chosen at high enough security levels. So it's more theoretical when you're figuring out how many resources it would take.
HPCwire: So, summarizing a little bit: you think in the fall you'll finish this fourth round. Those would all be candidates for standards, which anyone could then use for incorporation into encryption schemes that would be quantum-computer resistant.
Dustin Moody: That's correct. The main ones that we expect people to use were already selected in our first batch, so those are kind of the primary ones; most people will use those. But we need to have some backups in case, you know, someone comes up with a new breakthrough.
HPCwire: When you select them, do you deliberately cover a range of computational requirements, knowing that not everyone is going to have supercomputers at their doorstep? Many organizations may need to use more modest resources when running these encryption codes, so people could pick and choose a little bit based on the computational requirements.
Dustin Moody: Yes, there's a range of security categories from one to five. Category five has the highest security, but performance is impacted, so there's a trade-off. We include parameters for categories one, three, and five so people can choose the one that's best suited for their needs.
HPCwire: Can you talk a little bit about the Migration to PQC project, which is also, I believe, a NIST initiative to develop a variety of tools for implementing PQC? What's your involvement? How is that going?
Dustin Moody: That project is being run by NIST's National Cybersecurity Center of Excellence (NCCoE). I'm not one of the managers, but I attend all the meetings and I'm there to support what goes on. They've collaborated with, I think, 40 or 50 industry partners; the list is on their website. It's a really strong collaboration. A lot of these companies would typically be competing with each other, but here they're all working for the common good of making the migration as smooth as possible and getting experience developing the tools people are going to need to do cryptographic inventories. That's one of the first steps an organization is going to need to take. They're trying to make sure everything will be interoperable, and asking what lessons we can learn as we go. Some people are further along than others, and how can we share that information best? It's really good to have weekly calls, [and] we hold events from time to time. Mostly these industry collaborators are driving it and talking with each other, and we just organize them and help them keep moving.
HPCwire: Is there any effort to build best practices in this area? Something that NIST and these collaborators from industry, academia, DOE, and DOD could all provide? It would perhaps have the NIST stamp of authority on best practices for implementing quantum-resistant cryptography.
Dustin Moody: Well, the standards that my team is writing are written by NIST, and those are the algorithms that people will implement. They'll also get tested and validated by some of our labs at NIST. The migration project is producing documents in a series (NIST SP 1800-38A, NIST SP 1800-38B, NIST SP 1800-38C), updated from time to time, where they're sharing what they've learned and putting best practices in. They are NIST documents, written jointly by the NIST team and these collaborators to share what they've got so far.
HPCwire: What can the potential user community do to be involved? I realize the project is quite mature; it's been around for a while, and you've got lots of people who've been involved already. Are we at the stage where the main participants are working with each other and NIST in developing these algorithms, and it's now a matter of monitoring the tools that come out?
Dustin Moody: I would say every organization should be getting educated: understanding the quantum threat, knowing what's going on with standardization, knowing that you're going to need to migrate, and knowing what that's going to involve for your organization. It's not going to be easy and pain-free, so plan ahead. If they want to join that collaboration (Migration to PQC), people are still joining from time to time, and it is still open if they have something to share. But for most organizations or groups, it's going to be about creating your plan and preparing for the migration. We don't want you implementing something that's 99% of the final standard; we want you to wait until the final standards are published. But you can prepare now.
HPCwire: When will they be final?
Dustin Moody: Of the four that we selected, we put out draft standards for three of them a year ago, got public feedback, and have been revising since. The final versions are going to be published this summer. We don't have an exact date, but it will be this summer.
HPCwire: At that point, will a variety of requirements come around using these algorithms, for example in the U.S. government and perhaps in industry, requiring compliance?
Dustin Moody: Technically, NIST isn't a regulatory agency. So, yes, the US government can. I think the OMB says that all agencies need to use our standards. So the federal government has to use the standards that we write for cryptography, but we know that a wider audience, industry in the United States and globally, tends to use the algorithms that we standardize as well.
HPCwire: We're in a world in which geopolitical tensions are real. Are we worried about rivals from China or Russia, or other competing nations, not sharing their advances? Or is the cryptanalyst community small enough that those kinds of things are not likely to happen because the people know each other?
Dustin Moody: There is a real geopolitical threat in terms of who gets the quantum computer quickest. If China develops it and they're able to break into our cryptography, that's a real threat. In terms of designing the algorithms and making the standards, it's been a very cooperative effort internationally. Industry benefits when a lot of people are using the same algorithms all over the world. And we've seen other countries and global standards organizations say they're going to use the algorithms that came out of our process.
There are some exceptions, like China never trusting the United States. They're developing their own PQC standards. They're actually very, very similar to the algorithms [we're using], but they were selected internally. Russia has been doing their own thing; they don't really communicate with the rest of the world very much. I don't have a lot of information on what they're doing. China, even though they are doing their own standards, did have researchers participate in the process; they hosted one of the workshops in the field a few years back. So the community is small enough that people are very good at working together, even if sometimes a country will develop its own standards.
HPCwire: How did you get involved in cryptography? What drew you into this field?
Dustin Moody: Well, I love math, and the math I was studying has some applications in cryptography; specifically, something called elliptic curves. There are cryptosystems we use today that are based on these curves, which are beautiful mathematical objects that probably no one ever thought would be of any use in the real world. But it turns out they are, for cryptography. So that's kind of my hook into cryptography.
I ended up at NIST because NIST has elliptic curve cryptography standards. I didn't know anything about post-quantum cryptography. Around 2014, my boss said, "We're going to put you in this project dealing with post-quantum cryptography," and I was like, "What's this? I've no idea what this is." Within a couple of years, it really took off and grew, and has become a high priority for the United States government. It's been a fun journey to be on.
HPCwire: Will the PQC project just continue or will it wrap up at some point?
Dustin Moody: We'll continue for a number of years. We still have the fourth round to finish. We're still doing this additional digital signature process, which will take several more years. But then again, everything we do in the future needs to protect against quantum computers. So these initial standards will get published and they'll be done at some point, but all future cryptography standards will have to take the quantum threat into account. So it's built in that we have to keep going.
HPCwire: When you talk to the vendor community, they all say encryption has been implemented in such a haphazard way across systems that it's everywhere, and that simply finding where it exists in all those things is difficult. The real goal, they argue, should be to move to a more modular, predictable approach. Is there a way NIST can influence that? Or the selection of the algorithms?
Dustin Moody: Yes and no. It's very tricky. For that idea you're talking about, sometimes the word "cryptoagility" gets thrown out there. A lot of people are saying, okay, we're going to need to migrate these algorithms; this is an opportunity to redesign systems and protocols, and maybe we can do it a little more intelligently than we did in the past. At the same time, it's difficult to do, because you've got so many interconnected pieces doing so many things. So it's tricky, but we are encouraging people, and having lots of conversations like those in the Migration to PQC project. We're encouraging people to think about this, to redesign systems and protocols when designing applications. Knowing I need to transition to these algorithms, maybe I can redesign my system so that if I need to upgrade again at some point, it'll be much easier to do. I can keep track of where my cryptography is, what happens when I'm using it, and what information I'm protecting. I hope that we'll get some benefit out of this migration, but it's certainly going to be very difficult, complicated, and painful as well.
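One common way to pursue the cryptoagility Moody describes is to have application code depend on an abstract signing interface keyed by algorithm name, so swapping a classical scheme for a PQC one is a registry change rather than a rewrite. A minimal sketch; the scheme name and keyed-hash "signature" below are stand-ins for illustration, not real cryptography:

```python
import hashlib
from dataclasses import dataclass
from typing import Callable

@dataclass
class SignatureScheme:
    name: str
    sign: Callable[[bytes, bytes], bytes]
    verify: Callable[[bytes, bytes, bytes], bool]

REGISTRY: dict[str, SignatureScheme] = {}

def register(scheme: SignatureScheme) -> None:
    REGISTRY[scheme.name] = scheme

def get_scheme(name: str) -> SignatureScheme:
    return REGISTRY[name]

# Stand-in scheme (NOT real cryptography): "signature" is a keyed hash.
def _toy_sign(key: bytes, msg: bytes) -> bytes:
    return hashlib.sha256(key + msg).digest()

def _toy_verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return _toy_sign(key, msg) == sig

register(SignatureScheme("toy-sha256", _toy_sign, _toy_verify))

# Application code names the algorithm in exactly one place; migrating
# to a PQC scheme later means registering it and changing this string.
scheme = get_scheme("toy-sha256")
sig = scheme.sign(b"secret-key", b"hello")
print(scheme.verify(b"secret-key", b"hello", sig))
```

Centralizing the algorithm choice like this is also what makes the cryptographic inventory tractable: there is one registry to audit instead of call sites scattered through the codebase.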
HPCwire: Do you have an off-the-top-of-your-head checklist, sort of five things you should be thinking about now to prepare for post-quantum cryptography?
Dustin Moody: I'd say number one, just know that the migration is coming. The United States government is mandating its agencies to do it, but industry is also going to need to do this migration. The migration is not going to be easy, and it's not going to be pain-free. You should be educating yourself as to what PQC is and the whole quantum threat, and starting to figure out where you are using cryptography and what information is protected with cryptography. As you noted, that's not as easy as it should be. Very often, you're going to need to use sophisticated tools that are being developed to assist with that. Also talk to your vendors, your CIOs, your CEOs to make sure they're aware and that they're planning budgets to do this. Just because a quantum computer [able to decrypt] isn't going to be built for, who knows, maybe 15 years, they may think, "I can just put this off," but understanding that the threat is coming sooner than you realize is important.
HPCwire: Thank you for your time!
Strength Categories from NIST
In accordance with the second and third goals above (Submission Requirements and Evaluation Criteria for the Post-Quantum Cryptography Standardization Process), NIST will base its classification on the range of security strengths offered by the existing NIST standards in symmetric cryptography, which NIST expects to offer significant resistance to quantum cryptanalysis. In particular, NIST will define a separate category for each of the following security requirements (listed in order of increasing strength):
1) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 128-bit key (e.g. AES-128)
2) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 256-bit hash function (e.g. SHA-256/ SHA3-256)
3) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 192-bit key (e.g. AES-192)
4) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 384-bit hash function (e.g. SHA-384/ SHA3-384)
5) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 256-bit key (e.g. AES-256)
Successful demonstration of a superconducting circuit for qubit control within large-scale quantum computer systems – NEC Global
Tokyo, June 3, 2024 - In support of the development of large-scale superconducting quantum computers, researchers with the National Institute of Advanced Industrial Science and Technology (AIST), one of the largest public research organizations in Japan, in collaboration with Yokohama National University, Tohoku University, and NEC Corporation, proposed and successfully demonstrated a superconducting circuit that can control many qubits at low temperature.
To realize a practical quantum computer, it is necessary to control the state of a huge number of qubits (as many as one million) operating at low temperature. In conventional quantum computers, microwave signals for controlling qubits are generated at room temperature and are individually transmitted to qubits at low temperature via separate cables. This results in numerous cables running between room temperature and the low-temperature stage and limits the number of controllable qubits to approximately 1,000.
In this study, a superconducting circuit that can control multiple qubits via a single cable using microwave multiplexing was successfully demonstrated in proof-of-concept experiments at 4.2 K in liquid helium. This circuit has the potential to increase the density of microwave signals per cable by approximately 1,000 times, thereby significantly increasing the number of controllable qubits and contributing to the development of large-scale quantum computers.
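The idea behind frequency multiplexing is that several control tones, each addressed to a different qubit, can share one cable and still be separated by filtering at each tone's frequency. A toy numerical illustration (the frequencies, tone count, and amplitudes are made up, and this models only the signal-combining idea, not the paper's superconductor logic):

```python
import numpy as np

fs = 1e9                     # sample rate, 1 GS/s
t = np.arange(10_000) / fs   # 10 microseconds of samples
# Four hypothetical control channels at 50, 60, 70, 80 MHz.
tones = {f"qubit_{i}": 50e6 + 10e6 * i for i in range(4)}

# All tones summed onto a single shared "cable".
line = sum(np.cos(2 * np.pi * f * t) for f in tones.values())

# Recover each channel's amplitude from the shared line via an FFT
# (equivalent to narrowband filtering at each tone frequency).
spectrum = np.abs(np.fft.rfft(line)) / (len(t) / 2)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for name, f in tones.items():
    amp = spectrum[np.argmin(np.abs(freqs - f))]
    print(f"{name}: tone at {f/1e6:.0f} MHz, recovered amplitude ~ {amp:.2f}")
```

Each channel comes back with its original amplitude despite sharing one line, which is the property that lets one cable replace many.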
The above results will be published in "npj Quantum Information" on June 3 at 10 a.m. London time.
Article Information
Journal: npj Quantum Information
Title: Microwave-multiplexed qubit controller using adiabatic superconductor logic
Authors: Naoki Takeuchi, Taiki Yamae, Taro Yamashita, Tsuyoshi Yamamoto, and Nobuyuki Yoshikawa
DOI: 10.1038/s41534-024-00849-2
Go here to read the rest:
Successful demonstration of a superconducting circuit for qubit control within large-scale quantum computer systems - NEC Global
Quantum firm D-Wave extends agreement with European Aramco Research Center – DatacenterDynamics
Quantum computing firm D-Wave has extended its agreement with the European arm of Saudi oil company Aramco.
D-Wave has partnered with the Aramco Research Center in Delft, the Netherlands, for the past two years, developing quantum technologies to solve geophysical optimization problems, including intensive seismic imaging.
It's unclear for how long the partnership has been extended.
According to the Aramco Research Center, the team has used quantum technology provided by D-Wave to create its first subsurface maps, using tens of gigabytes of seismic data as input. The center aims to process a terabyte of seismic data with the D-Wave quantum computer this year.
In a statement, Marcin Dukalski, quantum applications lead at the Aramco Research Center, said: "I'm excited to see how far we've been able to push quantum technologies to tackle such a large optimization problem as subsurface imaging."
He added: "We look forward to expanding our work with D-Wave, which will be centered on reaping even greater tangible benefits from the Advantage2 system."
First unveiled in 2022, D-Wave's Advantage2 prototype features more than 1,200 qubits and 10,000 couplers and reportedly demonstrates a 20x faster time-to-solution on hard optimization problems.
In March, the company avoided being delisted from the New York Stock Exchange for the third time after the company was able to bring its share price back above a $1 average closing share price over a 30-day period.
Its stock price has not dipped below the $1 mark since the company was relisted.
Meanwhile, state-owned Aramco is the world's largest producer of oil and has rights to the world's second-largest proven crude oil reserves. In 2017, a report by CDP and the Climate Accountability Institute found that Aramco was responsible for 4.5 percent of global industrial greenhouse gas emissions from 1988 to 2015, second only to China.
In May 2024 the company announced it was partnering with French startup Pasqal to install the first quantum computer in Saudi Arabia.
Under the terms of that agreement, neutral atom quantum vendor Pasqal will install, maintain, and operate a 200-qubit quantum computer scheduled for deployment in the second half of 2025.
View post:
Quantum firm D-Wave extends agreement with European Aramco Research Center - DatacenterDynamics
Singapore looks to boost AI with plans for quantum computing and data centers – ZDNet
Singapore is looking to carve out a global footprint in artificial intelligence (AI) with the release of international standards for large language model (LLM) testing and investments in quantum computing and new data center capacity.
"Quantum has the potential to unlock new value, where higher processing capabilities can be harnessed in areas such as simulating complex molecules for drug discovery," said Deputy Prime Minister Heng Swee Keat at last week's Asia Tech x Singapore 2024 summit.
He added that quantum computing can also have synergies with AI, for example, in improving the efficiency of developing and training advanced AI models. This development, in turn, can further drive innovations in deep learning, natural language processing, and computer vision.
However, there still are challenges to resolve in quantum, including requirements for cryogenic cooling and error correction, Heng said. He noted that researchers worldwide were assessing different approaches to achieve scale and enable quantum computing to be commercially viable.
Singapore wants to address these challenges with its National Quantum Strategy, coupled with almost SG$300 million ($221.99 million) in investment. This cash is on top of a previous SG$96.6 million commitment announced in 2022. The new investment is earmarked for five years, through to 2030, to boost the country's position as a leading hub in the development and deployment of quantum technologies, Heng said.
This roadmap focuses on four areas, including initiatives in quantum research, such as quantum communications and security and quantum processors, and a scholarship program to produce 100 PhD and 100 master's-level graduates over the next five years, he said.
Efforts are underway for Singapore to build capabilities in the design and development of quantum processors. This work will encompass research on qubit technologies, including photonic networks, neutral atoms, and superconducting circuits.
ZDNET understands Singapore's target is to have the first prototype ready in the next three years and scale out production in five years.
The government in 2022 unveiled a three-year initiative to build a quantum-safe network that it hopes will showcase "crypto-agile connectivity" and facilitate trials with both public and private organizations. The initiative also includes a quantum security lab for vulnerability research.
Singapore last week also launched its green data center roadmap to chart "digital sustainability" and green growth pathways for such facilities, supporting AI and computing developments.
The country has over 1.4 gigawatts of data center capacity and is home to more than 70 cloud, enterprise, and co-location data centers.
Singapore is aiming to add at least 300 megawatts of additional data center capacity "in the near term" and another 200 megawatts through green energy deployments, said Janil Puthucheary, senior minister of state for the Ministry of Communications and Information, at the summit.
Efforts will be made to enhance efficiency through both hardware and software, Puthucheary said, pointing to technologies that maximize energy efficiency and capacity, and green software tools.
He added that improving data center efficiency is also about greening software, so the carbon emissions of applications can be reduced.
He said the focus will be placed on data centers to accelerate their use of green energy, with the government offering support via grants and incentives to switch to energy-efficient IT equipment. In addition, the Infocomm Media Development Authority (IMDA) will work with PUB to help data centers push their water usage effectiveness (WUE) down to 2.0 cubic meters or less per megawatt-hour, from the 2021 median WUE of 2.2 cubic meters.
IMDA will jointly develop standards and certifications with industry partners to drive the development and operation of data centers with power usage effectiveness (PUE) of 1.3 or lower.
In addition, the BCA-IMDA Green Mark for data centers will be refreshed by year-end to raise the standards for energy efficiency in data centers. IMDA will also introduce standards for IT equipment energy efficiency and liquid cooling by 2025, to drive the adoption of these technologies in Singapore.
The green data center roadmap outlines plans to reduce energy use for air-cooling by raising operating temperatures via IMDA's tropical DC methodology.
According to the government agency, data centers can achieve 2% to 5% energy savings for every 1°C increase in operating temperature.
It also pointed to simulations that have found existing data centers can achieve a 50% reduction in energy consumption of supporting infrastructure, with energy-efficient retrofits and upgrades for key equipment, such as chiller plants and uninterruptible power supplies.
"We aim to uplift all data centers in Singapore to achieve PUE of less than 1.3 at 100% IT load over the next 10 years," IMDA said. "This gives existing data centers sufficient time to plan for upgrades."
The tech industry today emits an estimated 1.5% to 4% of global greenhouse gas emissions, Heng noted, with this figure projected to climb as the use of AI expands alongside the need for data storage and processing.
He said technologies that drive the country's digital economy, such as cloud and AI, fuel demand for powerful and energy-intensive computing.
"Data centers lie at the heart of such activities and require large amounts of energy for processing and cooling. Greening ICT, especially data centers, is therefore crucial in a digital and carbon-constrained world," he said.
"There is a need to balance the economic and social benefits of digital applications with the environmental effects from the resultant emissions," he said, noting that Singapore has committed to a net-zero target by 2050.
"The [green data center] roadmap sets out low-carbon energy sources that data centers can explore, which include bioenergy, fuel cells with carbon capture, low-carbon hydrogen and ammonia for a start," Puthucheary explained. "We welcome proposals from the industry to push boundaries in realizing these pathways in Singapore."
Meanwhile, the country wants to lead the way by releasing standards for large language model (LLM) testing, developed via partnerships with global organizations such as MLCommons, IBM, and Singtel.
Dubbed Project Moonshot, the LLM testing tool provides benchmarking, red-teaming, and testing baselines to help developers and organizations mitigate risks associated with LLM deployment.
LLMs without guardrails can reinforce biases and create harmful content, with unintended consequences. "IMDA is seeking to establish guardrails to manage the risks while enabling space for innovation," the government agency said.
"It is important to adopt an agile, test-and-iterate approach to address key risks in model development and use. Project Moonshot provides intuitive results, so testing unveils the quality and safety of a model or application in an easily understood manner, even for a non-technical user."
The testing tool provides a five-tier scoring system where each completed scoring sheet will place the application on a scale. Grade cut-offs can be determined by the author of each of these scoring sheets.
AI Verify Foundation and MLCommons jointly developed the LLM testing benchmarks. The latter is an open-engineering consortium supported by Qualcomm, Google, Intel, and NVIDIA and recognized by the US National Institute of Standards and Technology under its AI Safety Consortium. AI Verify Foundation is Singapore's not-for-profit foundation that focuses on developing AI testing tools.
Project Moonshot is currently available as an open beta.
IMDA said it is working with companies such as Anthropic to develop a practical guide to multilingual and multicultural red-teaming for LLMs. The guide is slated for release later this year for global use.
Read more here:
Singapore looks to boost AI with plans for quantum computing and data centers - ZDNet
Quantum Computing Revolutionizes AGV Scheduling – AZoQuantum
In an article recently published in the journal Scientific Reports, researchers investigated the potential of quantum computing technology for solving the automated guided vehicle (AGV) scheduling problem.
Currently, AGVs are used extensively in production, transportation, and logistics, which has significantly improved industrial intelligence and automation levels and enhanced efficiency. The amount of parallel work AGVs perform is increasing to meet the requirements of application scenarios, which greatly increases the AGV scheduling challenge.
The AGV scheduling problem is a challenging combinatorial optimization problem. Although several studies have examined AGV scheduling across multiple scenarios, such as terminals and workshops, finding high-quality scheduling solutions within a short timeframe remains a major challenge.
Significant progress has been achieved recently in both practical applications and theoretical understanding of quantum computing. Quantum computers differ fundamentally from traditional computers in their reliance on quantum mechanical principles.
Specifically, quantum bits are used as the fundamental information storage units in quantum computers, which enables these computers to hold substantially more information than traditional computers. Additionally, quantum computers are well suited to problems like combinatorial optimization: such problems can be mapped to the Ising model's ground state search problem.
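As an illustration of that mapping, consider a tiny hypothetical 3-variable QUBO (not an instance from the paper). The substitution s_i = 2x_i - 1 turns the same objective into an Ising energy whose ground state an Ising machine such as a CIM searches physically; for a toy this small, brute force suffices:

```python
import itertools

# Hypothetical 3-variable QUBO (illustrative only, not from the paper):
# minimize E(x) = sum over (i, j) of Q[i, j] * x_i * x_j, x_i in {0, 1}.
# Negative diagonal terms reward setting a variable; positive off-diagonal
# terms penalize conflicting pairs, as a scheduling QUBO would.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
     (0, 1): 2.0, (1, 2): 2.0}

def qubo_energy(x, Q):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def brute_force_ground_state(Q, n):
    # Exhaustive search over all 2^n assignments: fine for a toy,
    # hopeless at scale, which is exactly the paper's motivation.
    return min(itertools.product([0, 1], repeat=n),
               key=lambda x: qubo_energy(x, Q))

best = brute_force_ground_state(Q, 3)
print(best, qubo_energy(best, Q))
```

Here the penalty terms on pairs (0, 1) and (1, 2) forbid adjacent variables from being set together, so the ground state activates variables 0 and 2 only.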
In this regard, the scheduling problem of AGVs could be considered as a type of routing problem.
Traditional solutions for routing problems often require significant computational resources. However, quantum computing techniques have displayed great potential in solving optimization and routing problems. Although several studies have utilized quantum computing to solve practical optimization problems, quantum computing research on AGV scheduling remains at the nascent stage, with several researchers using simulators to solve them.
In this study, researchers applied quantum computing technology to the AGV scheduling problem and proposed new quadratic unconstrained binary optimization (QUBO) models adapted to solving the problem under two separate criteria: minimizing the overall AGV travel time and minimizing the task completion time (makespan).
Specifically, two types of QUBO models suitable for various AGV scheduling objectives were constructed, and the scheduling scheme was coded into the Hamiltonian operator's ground state. The problem was solved using an optical coherent Ising machine (CIM).
The objective of the study was to effectively meet the requirements of large-scale scheduling.
In traditional AGV scheduling problem research, the computation time significantly increases with the rising number of tasks and AGVs. In practical scenarios, dispatchers set several scheduling objectives based on the nature of the work, with minimizing the total travel time and task completion time being the most common objectives. Thus, researchers constructed the QUBO models based on different objectives and presented the solutions and theoretical underpinnings for each.
The CIM and a traditional computer were used to perform the numerical experiments on the proposed QUBO model and the traditional model, respectively. Gurobi solver was utilized to solve the proposed mixed integer programming (MIP) model on a traditional computer, and its computing performance was demonstrated under various problem scales.
Additionally, an optical quantum computer was employed to solve the arc and node models' problem cases at different scales, and the computation performance was compared with the performance of traditional computers. The components of the CIM used in this study were primarily composed of electrical and optical parts.
The machine's optical part was composed of periodically poled lithium niobate crystals, fiber rings, erbium-doped fiber amplifiers, and pulsed lasers. The electrical part consisted of field-programmable gate arrays, analog-to-digital/digital-to-analog converters, and optical balanced homodyne detectors.
The comparison of the arc and node model performance on a quantum computer with the MIP model performance on traditional computers showed that the solutions obtained using CIM were all optimal. In small-scale examples, the CIM was significantly faster than the traditional computer.
Unlike traditional computers, CIM's computation time did not increase significantly with increasing problem scales. This indicates CIM's great application and development potential. Additionally, little difference was observed in the computing performance between the arc model and the node model on the quantum computer.
Specifically, the node model was slightly faster than the arc model, while the arc model was the more universal of the two. Overall, the experimental results showed that the optical quantum computer could save 92% computation time on average compared to the traditional calculation method.
To summarize, the findings of this study demonstrated that CIM has significant application potential in solving the AGV scheduling problem and other similar combinatorial optimization problems. However, the benefits of quantum computing in large-scale situations/problems could not be demonstrated due to hardware constraints, which was the major limitation of this study.
Tang, L., Yang, C., Wen, K., Wu, W., Guo, Y. (2024). Quantum computing for several AGV scheduling models. Scientific Reports, 14(1), 1-16. https://doi.org/10.1038/s41598-024-62821-6, https://www.nature.com/articles/s41598-024-62821-6
Disclaimer: The views expressed here are those of the author expressed in their private capacity and do not necessarily represent the views of AZoM.com Limited T/A AZoNetwork the owner and operator of this website. This disclaimer forms part of the Terms and conditions of use of this website.
Link:
Quantum Computing Revolutionizes AGV Scheduling - AZoQuantum
UK quantum company reaches new milestones on the path to powerful quantum computers – E&T Magazine
Quantinuum has announced that its H-Series processor has surpassed the ability to be simulated by the world's best supercomputers, enabling the company to extend its lead in the race towards fault-tolerant quantum computing.
The UK quantum computing company has announced a major qubit (or quantum bit) count enhancement to its flagship System Model H2 quantum computer from 32 to 56 trapped-ion qubits.
Formed in 2021 by Honeywell and Cambridge Quantum, Quantinuum has to date raised approximately $625m to further the development and commercialisation of quantum computing. Its mission is to move the quantum computing industry beyond the era in which quantum computers can be simulated by a classical computer.
This has now been achieved: upgrading the H2-1 from 32 to 56 qubits makes the processor impossible to simulate classically.
This upgrade is also challenging the worlds most powerful supercomputers. It proved this in a demonstration in partnership with US finance company JPMorgan Chase and a team from research institutions Caltech and Argonne National Lab.
This demonstration tackled random circuit sampling, a well-known test of quantum computing power, and measured the quality of the results with a suite of tests including the linear cross-entropy benchmark, an approach first made famous by Google in 2019 in a bid to demonstrate quantum supremacy.
The results on H2-1 reveal a significant 100x improvement in performance over prior industry results from Google, setting a new world record for the cross-entropy benchmark.
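The linear cross-entropy benchmark itself is straightforward to state: sample bitstrings from the device, look up each sample's ideal probability under a classical simulation of the circuit, and rescale the average. A minimal sketch with a toy 2-qubit distribution (hypothetical numbers, not the experiment's data):

```python
import random

def linear_xeb(samples, ideal_probs, n_qubits):
    """Linear cross-entropy benchmark fidelity: F = 2^n * <p_ideal(x)> - 1,
    averaged over sampled bitstrings. F is near 1 for a perfect device
    and near 0 for uniformly random (fully depolarized) output."""
    mean_p = sum(ideal_probs[x] for x in samples) / len(samples)
    return (2 ** n_qubits) * mean_p - 1

# Toy 2-qubit "ideal" output distribution over bitstrings 00..11.
ideal = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

random.seed(0)
# A faithful device samples from the ideal distribution, so F > 0 ...
good = random.choices(list(ideal), weights=ideal.values(), k=100_000)
# ... while a maximally noisy device samples uniformly, so F is near 0.
noisy = random.choices(list(ideal), k=100_000)
print(linear_xeb(good, ideal, 2), linear_xeb(noisy, ideal, 2))
```

For this toy distribution the faithful sampler converges to F = 4 * (0.4² + 0.3² + 0.2² + 0.1²) - 1 = 0.2, while the uniform sampler converges to 0.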
According to Quantinuum, H2-1's combination of scale and hardware fidelity makes it difficult for today's most powerful supercomputers and other quantum computing architectures to match this result.
"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Rajeeb Hazra, CEO of Quantinuum.
"Our focus on quality of qubits versus quantity of qubits is changing what's possible, and bringing us closer to the long-awaited commercialisation of quantum's applications across industries like finance, logistics, transportation and chemistry."
Quantinuum's analysis also indicates that not only does H2-1 achieve significant system-level performance, it does so with an estimated 30,000x reduction in power consumption compared to classical supercomputers.
Continue reading here:
UK quantum company reaches new milestones on the path to powerful quantum computers - E&T Magazine
US-returned Chinese physicist and team achieve world first in quantum computing – South China Morning Post
Chinese scientists are one step closer to a future large-scale quantum computer after building the world's largest quantum simulation machine based on the trapped-ion technique, praised by one academic journal reviewer as a milestone to be recognised.
The breakthrough was achieved under the leadership of Duan Luming, a quantum physicist renowned for his pioneering research, who returned to China in 2018 after 15 years of teaching in the United States.
Duan received his doctorate in 1998 from the University of Science and Technology of China, the countrys premier institute for quantum research, before joining the University of Michigan in the early 2000s.
Since his return, he has been a full-time professor at Tsinghua Universitys Institute for Interdisciplinary Information Sciences.
Duan and his colleagues, along with several research groups at universities and hi-tech companies around the world, have been chasing the trapped-ion approach to qubits.
Quantum bits, or qubits, are the building blocks of quantum computers, just as bits are in regular computers.
However, qubits are extremely difficult to harness in a controlled and repeatable way because of what is called their hazy nature.
Regular bits can be described as switches that are either on or off. But because uncertainty and probability hold sway in quantum physics, qubits can be both on and off at the same time, and also exist in a variety of in-between states.
Ions, or charged atomic particles, can be trapped and suspended in free space using electromagnetic fields. The qubits are stored in stable electronic states of each ion, and quantum information can be transferred through the collective motion of the ions in a shared trap.
But scalability remains a key challenge for this system.
This is where the trapped-ion approach comes in, as it offers one of the most promising architectures for a scalable, universal quantum computer.
Researchers earlier achieved quantum simulations with up to 61 ions in a one-dimensional crystal. Ion crystals are solids made up of ions bound together in a regular lattice, the symmetrical three-dimensional structural arrangement of atoms, ions or molecules inside a solid.
But Duan and his team's quantum simulator was able to achieve the stable trapping and cooling of a two-dimensional crystal of up to 512 ions, in a first for science.
The feat holds great significance for the future of quantum computing, given that scalability is a major hurdle. The team's scaling up of the ions in a stable simulation system is seen as likely to pave the way to building more powerful quantum computers.
The findings of their study were published on Wednesday in the peer-reviewed journal Nature.
"This is the largest quantum simulation or computation performed to date in a trapped-ion system," commented one reviewer.
Quantum simulators are devices that actively use quantum effects to answer questions about model systems and, through them, real systems. They are increasingly popular tools in the world of quantum computing for their role in advancing scientific knowledge and developing technologies.
Duan and his team also managed to perform a quantum simulation calculation using 300 ion qubits. They found the computational complexity of 300 ion qubits working simultaneously to be astronomical, far exceeding the direct simulation capability of classical computers.
The rest is here:
US-returned Chinese physicist and team achieve world first in quantum computing - South China Morning Post
Unveiling Protein Structures with Quantum Computing – AZoQuantum
May 31, 2024. Reviewed by Lexie Corner
Recent findings from IBM and Cleveland Clinic researchers may pave the way for applying quantum computing techniques to protein structure prediction. These findings are published in the Journal of Chemical Theory and Computation. This publication represents the Cleveland Clinic-IBM Discovery Accelerator collaboration's first peer-reviewed paper on quantum computing.
For many years, researchers have used computational methods to predict protein structures. A protein folds into a structure that controls its molecular interactions and mode of action. These structures determine numerous facets of human health and illness.
Researchers can create more effective treatments by better understanding how diseases spread through precise protein structure predictions. Bryan Raubenolt, Ph.D., a Postdoctoral Fellow at the Cleveland Clinic, and Hakan Doga, Ph.D., a researcher at IBM, led a team to discover how quantum computing can enhance existing techniques.
Machine learning techniques have significantly advanced protein structure prediction in recent years. To make predictions, these techniques rely on training data, a database of protein structures determined through experimentation. This means the number of proteins they have been trained to identify is a limitation. When such programs encounter a protein that is mutated or significantly different from the ones they were trained on, as is frequently the case with genetic disorders, accuracy can decrease.
A different approach is to model the physics involved in protein folding. Through simulations, scientists can examine multiple protein configurations and determine the most stable form, which is essential for drug design.
The challenge is that these simulations are nearly impossible on a classical computer beyond a certain protein size. In a way, increasing the size of the target protein is comparable to increasing the dimensions of a Rubik's cube. For a small protein with 100 amino acids, a classical computer would need the time equal to the age of the universe to exhaustively search all the possible outcomes.
Dr. Bryan Raubenolt, Postdoctoral Fellow, Cleveland Clinic
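Raubenolt's Rubik's-cube comparison is the classic Levinthal-style counting argument. Under the simplified, hypothetical assumption that each residue junction can take just three discrete backbone states (real backbones are continuous), the conformational space already grows exponentially:

```python
def conformations(n_residues, states_per_residue=3):
    """Number of chain conformations if each of the n-1 junctions
    between residues can take a fixed number of discrete states.
    (Illustrative toy model, not the paper's actual energy landscape.)"""
    return states_per_residue ** (n_residues - 1)

for n in (10, 50, 100):
    print(n, conformations(n))

# Even checking a billion conformations per second, a 100-residue chain
# (3**99, about 1.7e47 states) would take roughly 5e30 years, vastly
# longer than the ~1.4e10-year age of the universe.
```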
The research team combined quantum and classical computing techniques to get around these restrictions. Within this framework, quantum algorithms can tackle problems that current state-of-the-art classical computing finds difficult, such as the physics of protein folding, intrinsic disorder, mutations, and protein size.
The accuracy with which the framework predicted, on a quantum computer, the folding of a small fragment of the Zika virus protein, compared to the most advanced classical methods, served as validation.
The initial results of the quantum-classical hybrid framework outperformed both AlphaFold2 and a method based on classical physics. The comparison with AlphaFold2, which is optimized for larger proteins, shows that this framework can produce accurate models without relying directly on large amounts of training data.
The most computationally intensive part of the calculation usually involves modeling the lowest energy conformation for the fragment's backbone, which the researchers accomplish using a quantum algorithm. After that, classical methods were employed to translate the quantum computer's output, rebuild the protein along with its sidechains, and refine the structure one last time using force fields from classical molecular mechanics.
The project illustrates how problems can be broken down into smaller components for better accuracy. Some components can be addressed by quantum computing techniques, while classical computing methods can handle others.
Working across disciplines was crucial to creating this framework.
One of the most unique things about this project is the number of disciplines involved. Our team's expertise ranges from computational biology and chemistry, structural biology, software, and automation engineering to experimental atomic and nuclear physics, mathematics, and, of course, quantum computing and algorithm design. It took the knowledge from each of these areas to create a computational framework that can mimic one of the most important processes for human life.
Dr. Bryan Raubenolt, Postdoctoral Fellow, Cleveland Clinic
The teams combination of classical and quantum computing methods is essential for advancing our understanding of protein structures and how they impact our ability to treat and prevent disease. The team plans to continue developing and optimizing quantum algorithms that can predict the structure of larger and more sophisticated proteins.
This work is an important step forward in exploring where quantum computing capabilities could show strengths in protein structure prediction. Our goal is to design quantum algorithms that can find how to predict protein structures as realistically as possible.
Dr. Hakan Doga, Researcher, IBM
Doga, H., et al. (2024) A Perspective on Protein Structure Prediction Using Quantum Computers. Journal of Chemical Theory and Computation. doi.org/10.1021/acs.jctc.4c00067
Follow this link:
Unveiling Protein Structures with Quantum Computing - AZoQuantum