
How QRL Is Securing The Blockchain Of The Future – Benzinga

Technological innovation seems to be advancing at an accelerated pace, from artificial intelligence to blockchain to quantum computing. But these advances can be leveraged by bad actors as well, bringing risks that must be accounted for. Quantum computing and blockchain technology are set to collide soon, and the cryptocurrency industry must be prepared; otherwise, these powerful next-generation computers stand to put digital assets, and the blockchains that support them, at risk.

One pioneering project that is helping prepare for this future is QRL, a blockchain and digital asset shielded by a post-quantum secure digital security scheme. In a nutshell, QRL has developed a decentralized ledger on which digital signatures on the blockchain are secure in a world where quantum computing exists.

Quantum computing is a field that blends computer science, physics and mathematics while harnessing quantum mechanics to solve complicated problems. It does so in a more efficient manner than traditional computers ever could.

Quantum computers are designed to solve problems in seconds that would take even the world's fastest supercomputer billions of years to tackle, and their mainstream adoption is on the horizon. By exploiting superposition, they can explore many possible configurations of a problem at once, making them exponentially faster than traditional computers for certain tasks. They will not replace legacy computers, but they are expected to be the go-to machines for sophisticated technological advancements across sectors of the economy, enabling breakthroughs in fields from logistics to molecular engineering.

Business investment in quantum computers is growing by leaps and bounds, in areas ranging from electric vehicle batteries to the blockchain. While still in their infancy in terms of scale, quantum computers have been proven to work. For example, Google's 54-qubit Sycamore processor completed a computation in 200 seconds that would have taken the most powerful classical computer in the world 10,000 years. According to a report by IBM, a sufficiently large quantum computer could break widely used cryptographic protocols within a few hours. Blockchain networks are buttressed by that cryptography, which is used to attach digital signatures to transactions on the network.

As such, the nearing of quantum computing adoption has revealed vulnerabilities that exist in high-tech innovations like blockchains, for which security and integrity are of the utmost importance.

Blockchains are built to be inherently secure, owing to the decentralized nature of ledgers combined with cryptography.

While this level of security is sufficient in a world of conventional computers, it has not been battle-tested against quantum computers, making today's blockchains "quantum insecure," reports QRL. As a result, signatures that are protected from conventional computing power suddenly become vulnerable in the quantum-computing age.

The stakes are indeed high. Not only are both Bitcoin and Ethereum (which currently account for over 65% of the entire cryptocurrency market capitalization) vulnerable to a sufficiently powerful quantum computer, but Ethereum itself has over $90 billion locked up across the layer-2 applications building on it.

However, most crypto innovators are so busy building that many have not taken the time to recognize the threat quantum computing poses to the blockchain networks of tomorrow. According to a recent study by Deloitte, about 65% of all ether coins are vulnerable to a quantum attack, and this number has been continuously increasing. Enter The Quantum Resistant Ledger, which estimates that 99% of all blockchains and cryptocurrencies are vulnerable, and was created in response to the future threat posed by quantum computing to blockchain technology.

Quantum Resistant Ledger founder Dr. Peter Waterland, who is also a colorectal cancer surgeon, published a whitepaper for the QRL in 2016. In it, he identified how the Bitcoin and Ethereum blockchains were both fundamentally flawed. He explained how quantum computers stand to eventually decrypt the vital cryptography on which blockchains are dependent, which could lead to disaster for the industry and destroy any hopes of mainstream adoption.

But he didn't only name the problem. Dr. Waterland also proposed a solution, which lies in the project he formed nearly six years ago now.

The Quantum Resistant Ledger uses the NIST-approved XMSS hash-based signature scheme instead of standard elliptic curve cryptography (ECC) for digital signatures on the blockchain. Unlike ECC, which is expected to be broken at some stage by sufficiently powerful quantum computers, XMSS comes with a mathematical proof of security resting only on its underlying hash function.

The XMSS algorithm underpinning QRL is a forward-secure signature scheme that is designed to make blockchains "future-proof and to ensure their security even in the light of practical quantum computers," as noted in Yale research.
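XMSS is a hash-based signature scheme: its security rests only on properties of the underlying hash function, which quantum computers weaken far less than they weaken ECC. The core idea can be illustrated with a Lamport one-time signature, a simple ancestor of the constructions XMSS builds on (a toy sketch for intuition, not QRL's actual implementation):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # One pair of random secrets per bit of the 256-bit message hash
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]  # public key = hashes of the secrets
    return sk, pk

def bits(digest: bytes):
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret from each pair, chosen by the message-hash bits
    return [sk[i][b] for i, b in enumerate(bits(H(msg)))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(H(msg)))))

sk, pk = keygen()
sig = sign(sk, b"transfer 1 QRL")
print(verify(pk, b"transfer 1 QRL", sig))   # True
print(verify(pk, b"transfer 2 QRL", sig))   # False: signature does not transfer
```

A Lamport key can safely sign only one message; XMSS adds Winternitz chains and a Merkle tree so a single public key can authorize many signatures.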

Consider supporting the network and running your own node on the Quantum Resistant Ledger.

In addition to Dr. Waterland, the QRL project is led by J.P. Lomas in the capacity of technical advisor. The development team is spearheaded by Dubai-based Kaushal Singh, while the wider non-technical team is spread across the UK, the United States and Canada. The project has kept a low profile until now, as the team has had its nose down, prioritizing development of its ledger technology over marketing.

Now that the project has matured, it is ready for the limelight. QRL is gearing up for the release of its Go Zond project, a proof-of-stake blockchain set to make its debut in Q3/Q4 2024. QRL says this platform is not only provably secure against quantum computing but will also support Ethereum Virtual Machine contracts, meaning that contracts written for the Ethereum blockchain will be easily portable to QRL.

QRL invites you to become part of their growing community and be part of the next generation of blockchain technology.

Featured photo by TheDigitalArtist on Pixabay.

This post contains sponsored content. This content is for informational purposes only and is not intended to be investing advice.


Quantinuum Unveils Quantum Computer Featuring 56 Trapped-Ion Qubits, Setting New Performance Benchmarks – HPCwire

BROOMFIELD, Colo. and LONDON, June 5, 2024. Quantinuum today announced the development of the industry's first quantum computer featuring 56 trapped-ion qubits. The H2-1 model has improved its already leading fidelity.

In collaboration with JPMorgan Chase, Quantinuum ran a Random Circuit Sampling (RCS) algorithm, achieving a significant 100-fold improvement over previous industry results from Google in 2019 and setting a new benchmark for the cross entropy metric. H2-1's combination of scale and hardware fidelity presents a challenge for today's most powerful supercomputers and other quantum computing architectures to match.

"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Rajeeb Hazra, CEO of Quantinuum. "Our focus on quality of qubits versus quantity of qubits is changing what's possible, and bringing us closer to the long-awaited commercialization of quantum's applications across industries like finance, logistics, transportation and chemistry."

Quantinuum's analysis also indicates that the H2-1 executes RCS at 56 qubits with an estimated 30,000x reduction in power consumption compared to classical supercomputers, reinforcing it as the preferred solution for a wide array of computational challenges.

"The fidelity achieved in our random circuit sampling experiment shows unprecedented system-level performance of the Quantinuum quantum computer. We are excited to leverage this high fidelity to advance the field of quantum algorithms for industrial use cases broadly, and financial use cases in particular," said Marco Pistoia, Head of Global Technology Applied Research at JPMorgan Chase.

Today's announcement is the latest in a string of breakthroughs made by Quantinuum in 2024.

"Microsoft looks forward to a continued collaboration with Quantinuum as they release their high fidelity 56-qubit machine," said Dennis Tom, General Manager of Microsoft Azure Quantum. "Recently, the teams created four highly reliable logical qubits by applying Azure Quantum's qubit-virtualization system to Quantinuum's 32-qubit machine. With the additional physical qubits available on Quantinuum's new machine, we anticipate creating more logical qubits with even lower error rates. As we reach these milestones, we will continue to increase the resiliency of quantum operations as well as the utility of quantum computing."


To read the scientific paper, please visit: https://arxiv.org/abs/2406.02501

About Quantinuum

Quantinuum, the world's largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum's technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With almost 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents. Since its formation by Honeywell and Cambridge Quantum in 2021, Quantinuum has raised approximately $625 million to further the development and commercialization of quantum computing.

Source: Quantinuum


Quantum Computer ‘Surpasses Simulation by Supercomputer’ – IoT World Today

Quantum computing company Quantinuum has demonstrated that its quantum computer has surpassed the ability of the world's best supercomputers to simulate it running a benchmark algorithm.

In tests with partner JPMorgan Chase, the system also outperformed by 100x the previous results of a well-known industry-standard benchmark of quantum computing power.

According to Quantinuum, its new H2-1 quantum computer, with 56 trapped-ion qubits, has improved fidelity and is now impossible for a classical computer to fully simulate.

In quantum computing, fidelity is a measure of the accuracy of quantum operations: it indicates how closely an executed operation matches the operation it was intended to perform.
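The "cross entropy" figure referenced in these results is usually the linear cross-entropy benchmarking (XEB) fidelity: sample bitstrings from the device, look up each one's probability under the ideal circuit, and rescale so a perfect device scores near 1 and pure noise scores 0. A minimal sketch (the 2-qubit distribution below is illustrative, not from Quantinuum's experiment):

```python
def linear_xeb_fidelity(n_qubits: int, ideal_probs: dict, samples: list) -> float:
    """Linear XEB: F = 2^n * <p(x)> - 1, where p(x) is the ideal
    probability of each sampled bitstring. Uniform noise gives F = 0."""
    mean_p = sum(ideal_probs[s] for s in samples) / len(samples)
    return (2 ** n_qubits) * mean_p - 1

# Illustrative ideal output distribution of some 2-qubit circuit
ideal = {"00": 0.5, "01": 0.125, "10": 0.125, "11": 0.25}

# 80 samples drawn in the same proportions as the ideal distribution,
# versus uniform random guessing
good = ["00"] * 40 + ["01"] * 10 + ["10"] * 10 + ["11"] * 20
noise = ["00", "01", "10", "11"] * 20

print(linear_xeb_fidelity(2, ideal, good))   # 0.375
print(linear_xeb_fidelity(2, ideal, noise))  # 0.0
```

A real experiment compares the device's score against what any classical spoofing strategy could achieve, which is why a 100x improvement in this metric is notable.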

A team from Quantinuum and JPMorgan Chase ran a benchmark algorithm known as the Random Circuit Sampling (RCS) algorithm, which is the one Google used in the claim to quantum supremacy it made in 2019.

The researchers reported achieving a 100x improvement over Google's result, setting a new world record for the cross entropy benchmark.

"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Quantinuum CEO Rajeeb Hazra.


"Our focus on quality of qubits versus quantity of qubits is changing what's possible and bringing us closer to the long-awaited commercialization of quantum's applications across industries like finance, logistics, transportation and chemistry."

Quantinuum also reported that H2-1 executed the RCS algorithm using an estimated 30,000x less power than a classical supercomputer.

"The fidelity achieved in our random circuit sampling experiment shows unprecedented system-level performance of the Quantinuum quantum computer," said JPMorgan Chase head of global technology applied research Marco Pistoia.

"We are excited to leverage this high fidelity to advance the field of quantum algorithms for industrial use cases broadly and financial use cases in particular."


Successful demonstration of a superconducting circuit for qubit control within large-scale quantum computer systems – NEC Global

Tokyo, June 3, 2024 - In support of the development of large-scale superconducting quantum computers, researchers with the National Institute of Advanced Industrial Science and Technology (AIST), one of the largest public research organizations in Japan, in collaboration with Yokohama National University, Tohoku University, and NEC Corporation, proposed and successfully demonstrated a superconducting circuit that can control many qubits at low temperature.

To realize a practical quantum computer, it is necessary to control the state of a huge number of qubits (as many as one million) operating at low temperature. In conventional quantum computers, microwave signals for controlling qubits are generated at room temperature and are individually transmitted to qubits at low temperature via different cables. This results in numerous cables between room and low temperature and limits the number of controllable qubits to approximately 1,000.

In this study, a superconducting circuit that can control multiple qubits via a single cable using microwave multiplexing was successfully demonstrated in proof-of-concept experiments at 4.2 K in liquid helium. This circuit has the potential to increase the density of microwave signals per cable by approximately 1,000 times, thereby significantly increasing the number of controllable qubits and contributing to the development of large-scale quantum computers.
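Microwave multiplexing here is the same frequency-division idea used in classical communications: each control channel rides on its own carrier frequency, the carriers are summed onto a single cable, and each channel is recovered at the other end by correlating against its carrier. A toy numerical illustration in pure Python (the frequencies and amplitudes are made up for the example; the real circuit does this with superconducting logic):

```python
import math

N = 1000  # samples per analysis window

def tone(freq_bins: int, amp: float) -> list:
    """One control channel as a cosine carrier (frequency in whole bins per window)."""
    return [amp * math.cos(2 * math.pi * freq_bins * t / N) for t in range(N)]

def demux(signal: list, freq_bins: int) -> float:
    """Recover one channel's amplitude by correlating with its carrier (a DFT bin)."""
    re = sum(s * math.cos(2 * math.pi * freq_bins * t / N) for t, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq_bins * t / N) for t, s in enumerate(signal))
    return 2 * math.hypot(re, im) / N

# Two channels multiplexed onto one "cable" by simple addition
cable = [a + b for a, b in zip(tone(50, 0.8), tone(120, 0.3))]

print(round(demux(cable, 50), 6))   # 0.8  (channel 1 recovered)
print(round(demux(cable, 120), 6))  # 0.3  (channel 2 recovered)
```

Because the carriers are orthogonal over the window, each channel is recovered without crosstalk, which is what lets one cable replace many.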

The above results will be published in "npj Quantum Information" on June 3 at 10 a.m. London time.

Article Information
Journal: npj Quantum Information
Title: Microwave-multiplexed qubit controller using adiabatic superconductor logic
Authors: Naoki Takeuchi, Taiki Yamae, Taro Yamashita, Tsuyoshi Yamamoto, and Nobuyuki Yoshikawa
DOI: 10.1038/s41534-024-00849-2


NIST Q&A: Getting Ready for the Post Quantum Cryptography Threat? You Should be. – HPCwire

With the National Institute of Standards and Technology (NIST) set to publish the first Post-Quantum Cryptography (PQC) standards in a few weeks, attention is shifting to how to put the new quantum-resistant algorithms into practice. Indeed, the number of companies with practices to help others implement PQC is mushrooming and contains familiar (IBM, Deloitte, et al.) and unfamiliar names (QuSecure, SandboxAQ, etc.).

The Migration to Post-Quantum Cryptography project, being run out of NIST's National Cybersecurity Center of Excellence (NCCoE), is running at full tilt and includes on the order of 40 commercial participants.

In its own words, "The project will engage industry in demonstrating use of automated discovery tools to identify all instances of public-key algorithm use in an example network infrastructure's computer and communications hardware, operating systems, application programs, communications protocols, key infrastructures, and access control mechanisms. The algorithm employed and its purpose would be identified for each affected infrastructure component."

Getting to that goal remains a work in progress that started with NIST's PQC program in 2016. NIST scientist Dustin Moody leads the PQC project and talked with HPCwire about the need to take post-quantum cryptography seriously now, not later.

"The United States government is mandating their agencies to do it, but industry is going to need to be doing this migration as well. The migration is not going to be easy [and] it's not going to be pain free," said Moody, whose Ph.D. specialized in elliptic curves, a commonly used basis for encryption. "Very often, you're going to need to use sophisticated tools that are being developed to assist with that. Also talk to your vendors, your CIOs, your CEOs to make sure they're aware and that they're planning for budgets to do this. Just because a quantum computer [able to decrypt] isn't going to be built for, who knows, maybe 15 years, they may think 'I can just put this off,' but understanding that the threat is coming sooner than you realize is important."

Estimates vary wildly around the size of the threat, but perhaps 20 billion devices will need to be updated with PQC safeguarding. NIST has held four rounds of submissions, and the first set of standards will encompass algorithms selected in the first three. These are the main weapons against quantum decryption attacks. The next round seeks to provide alternatives with, in some instances, less burdensome computational characteristics.

The discussion with Moody was wide-ranging, if perhaps a little dry. He covers PQC strategy and progress and the need to monitor the constant flow of new quantum algorithms. Shor's algorithm is the famous threat, but others are percolating. He notes that many submitted algorithms broke down under testing but says not to make much of that, as that's the nature of the standards development process. He talks about pursuing crypto-agility and offers a few broad tips on preparation.

Moody also touched on geopolitical rivalries amid what has been a generally collaborative international effort.

"There are some exceptions, like China never trusting the United States. They're developing their own PQC standards. They're actually very, very similar to the algorithms [we're using], but they were selected internally. Russia has been doing their own thing; they don't really communicate with the rest of the world very much. I don't have a lot of information on what they're doing. China, even though they are doing their own standards, did have researchers participate in the process; they hosted one of the workshops in the field a few years back. So the community is small enough that people are very good at working together, even if sometimes a country will develop its own standards," said Moody.

How soon quantum computers will actually be able to decrypt current RSA codes is far from clear, but early confidence that it would be many decades away has diminished. If you're looking for a good primer on the PQC threat, he recommended the Quantum Threat Timeline Report released in December by the Global Risk Institute (GRI) as one (figures from its study below).

HPCwire: Let's talk a little bit about the threat. How big is it, and when do we need to worry?

Dustin Moody: Well, cryptographers have known for a few decades that if we are able to build a big enough quantum computer, it will threaten all of the public-key cryptosystems that we use today. So it's a serious threat. We don't know when a quantum computer will be built that's large enough to attack current levels of security. There have been estimates of 10 to 15 years, but, you know, nobody knows for certain. We have seen progress in companies building quantum computers; systems from IBM and Google, for example, are getting larger and larger. So this is definitely a threat to take seriously, especially because you can't just wait until the quantum computer is built and then say, now we'll worry about the problem. We need to solve this 10 to 15 years in advance, because you need to protect your information for a long time. The threat of harvest-now-decrypt-later helps you understand that.
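The harvest-now-decrypt-later point Moody raises is often summarized by Mosca's inequality: if the time your data must stay confidential plus the time migration takes exceeds the time until a cryptographically relevant quantum computer exists, that data is already at risk. A sketch with illustrative numbers (the figures are assumptions for the example, not Moody's estimates):

```python
def at_risk(shelf_life_years: float, migration_years: float,
            years_to_quantum: float) -> bool:
    """Mosca's inequality: data is exposed when x + y > z."""
    return shelf_life_years + migration_years > years_to_quantum

# Records that must stay secret 10 years, a 5-year migration,
# and a capable quantum computer assumed ~12 years out:
print(at_risk(10, 5, 12))  # True: ciphertext harvested today outlives its protection
print(at_risk(2, 3, 12))   # False: short-lived data migrated in time
```

The inequality is why "wait until the machine exists" fails: the exposure begins when an adversary records the traffic, not when they decrypt it.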

HPCwire: Marco Pistoia, who leads quantum research for JPMorgan Chase, said he'd seen a study suggesting as few as 1,300 or so logical qubits might be able to break conventional RSA code, although it would take six months to do so. That was a year ago. It does seem like our ability to execute Shor's algorithm on these systems is improving; not just the brute force, but our cleverness in getting the algorithm to run.

Dustin Moody: Yep, that's true. And it'll take a lot of logical qubits, so we're not there yet. But yeah, progress has been made. You have to have the problem solved and migrate to new solutions before we ever get to that point.

HPCwire: We tend to focus on Shor's algorithm because it's a direct threat to the current encryption techniques. Are there others in the wings that we should be worried about?

Dustin Moody: There's a large number of quantum algorithms that we are aware of, Shor's being one of them, Grover's being another one that has an impact on cryptography. But there's plenty of other quantum algorithms that do interesting things. So whenever anyone is designing a cryptosystem, they have to take a look at all those and see if any of them could attack the system in any way. There's a list of, I don't know, maybe around 15 or so that people potentially have to look at and figure out: do I need to worry about these?
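Of the two algorithms Moody singles out, Shor's breaks RSA and elliptic-curve cryptography outright, while Grover's gives only a quadratic speedup on brute-force search, roughly halving the effective bit strength of symmetric keys. A quick back-of-the-envelope illustration:

```python
def grover_effective_bits(key_bits: int) -> int:
    """Grover search needs ~2^(n/2) queries, so an n-bit key offers ~n/2 bits
    of security against a quantum brute-force attack."""
    return key_bits // 2

for n in (128, 256):
    print(f"AES-{n}: ~2^{grover_effective_bits(n)} quantum search steps")
# This is why doubling symmetric key sizes (e.g. moving to AES-256) is widely
# considered a sufficient hedge against Grover, while public-key schemes
# vulnerable to Shor must be replaced entirely.
```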

HPCwire: Does NIST have that list someplace?

Dustin Moody: There was a guy at NIST who kept up such a list. I think he's at Microsoft now. It's been a little while, but he maintained something called the Quantum Algorithm Zoo.

HPCwire: Let's get back to the NIST effort to develop quantum-resistant algorithms. As I understand it, the process began around 2016 and has gone through an iterative process where you invite submissions of potential quantum-resistant algorithms from the community, then test them and come up with some selections. There have been three rounds completed and in the process of becoming standards, with an ongoing fourth round. Walk me through the project and progress.

Dustin Moody: So these kinds of cryptographic competitions have been done in the past to select some of the algorithms that we use today. A widely used block cipher was selected through a competition; more recently, a hash function. Back in 2016, we decided to do one of these [competitions] for new post-quantum algorithms that we needed standards for. We let the community know about that. They were all excited, and we got 82 submissions, of which 69 met the requirements that we'd set out to be involved. Then we had a process over six or seven years [during which] we evaluated them through a series of rounds. In each round, we narrowed down to the most promising to advance. There was a ton of work going on, both internally at NIST and by the cryptographic community, doing research and benchmarks and experiments and everything.

The third round had seven finalists and eight alternates and concluded in July of 2022, when we announced the items that we would be standardizing as a result; that included one encryption algorithm and three signature algorithms. We did also keep a few encryption algorithms on into a fourth round for further study; they weren't quite ready to be selected for standardization. That fourth round is still ongoing and will probably end this fall, and we'll pick one or two of those to also standardize. So we'll have two or three encryption [methods] and three signatures as well.

HPCwire: It sounds like a relatively smooth process?

Dustin Moody: That process got a lot of attention from the community. A lot of the algorithms ended up being broken, some late in the process; that's kind of the nature of how this thing works. That's where we are now. We're just about done writing the standards for the first ones that we selected; our expected date for publishing them is this summer. The fourth round will end this fall, and then we'll write standards for those, which will take another year or two.

We also have ongoing work to select a few more digital signature algorithms. The reason for that is that many of the algorithms we selected are based on what are called lattices; they're the most promising family, [with] good performance and good security. For signatures, we had two based on lattices and one not based on lattices. The one that wasn't based on lattices, called SPHINCS+, turns out to be bigger and slower, so for applications that need to use it, it might not be ideal. We wanted to have a backup not based on lattices that could be used easily. That's what this ongoing digital signature process is about; we're encouraging researchers to try to design new solutions that are not based on lattices and are better performing.

HPCwire: When NIST assesses these algorithms, it must look to see how many computational resources are required to run them?

Dustin Moody: There are specific evaluation criteria that we look at. Number one is security. Number two is performance. And number three is a laundry list of everything else. We work internally at NIST, where we have a team of experts, and we try to work with cryptography and industry experts around the world who are independently doing evaluations. Sometimes we're doing joint research with them in the field.

Security has a wide number of ways to look at it. There's theoretical security, where you're trying to create security proofs that say: if you can break my cryptosystem, then you can break this hard mathematical problem. We can give a proof of that, and because that hard mathematical problem has been studied, it gives us a little more confidence. Then it gets complicated, because we're used to doing this with classical computers and looking at how they can attack things. Now we have to look at how quantum computers can attack things, and they don't yet exist. We don't know their performance capabilities, so we have to extrapolate and do the best that we can. But it's all thrown into the mix.

Typically, you don't end up needing supercomputers. You're able to analyze how long the attacks would take and how many resources they would take if you fully tried to break the security parameters at current levels. The parameters are chosen so that it's [practically] infeasible to do so. You can figure out: if I were to break this, it would take, you know, 100 years, so there's no use in actually trying unless you find a breakthrough, a different way in. (See the descriptive list of NIST strength categories at the end of the article.)

HPCwire: Do you test on today's NISQ (noisy intermediate-scale quantum) computers?

Dustin Moody: They're too small right now to really have any impact in looking at how a larger quantum computer will fare against concrete parameters chosen at high enough security levels. So it's more theoretical when you're figuring out how many resources it would take.

HPCwire: So, summarizing a little bit: you think in the fall you'll finish this last, fourth round. Those would all be candidates for standards, which anyone could then use for incorporation into encryption schemes that would be resistant to quantum computers.

Dustin Moody: That's correct. The main ones that we expect to be used were already selected in our first batch. So those are kind of the primary ones; most people will use those. But we need to have some backups in case, you know, someone comes up with a new breakthrough.

HPCwire: When you select them, do you deliberately cover a range in terms of computational requirements, knowing that not everyone is going to have supercomputers at their doorstep? Many organizations may need to use more modest resources when running these encryption codes, so people could pick and choose a little bit based on the computational requirements.

Dustin Moody: Yes, there's a range of security categories from one to five. Category five has the highest security, but performance is impacted, so there's a trade-off. We include parameters for categories one, three, and five so people can choose the one that's best suited for their needs.
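NIST defines these categories by comparison against attacks on reference primitives; the odd-numbered categories Moody mentions correspond to key search on AES at increasing sizes. A small lookup sketch of the published definitions:

```python
# NIST PQC security categories: breaking the scheme must be at least as hard
# as the stated attack on the reference primitive.
CATEGORIES = {
    1: "key search on AES-128",
    2: "collision search on SHA-256",
    3: "key search on AES-192",
    4: "collision search on SHA-384",
    5: "key search on AES-256",
}

for cat in (1, 3, 5):  # the categories Moody says parameters are provided for
    print(f"Category {cat}: at least as hard as {CATEGORIES[cat]}")
```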

HPCwire: Can you talk a little bit about the Migration to PQC project, which I believe is also a NIST initiative, to develop a variety of tools for implementing PQC? What's your involvement? How is that going?

Dustin Moody: That project is being run by NIST's National Cybersecurity Center of Excellence (NCCoE). I'm not one of the managers, but I attend all the meetings and I'm there to support what goes on. They've collaborated with, I think the list is up to 40 or 50, industry partners, and the list is on their website. It's a really strong collaboration. A lot of these companies would typically be competing with each other, but here they're all working for the common good of making the migration as smooth as possible: getting experience, developing tools that people are going to need to do cryptographic inventories, which is one of the first steps an organization will need to take, and trying to make sure everything will be interoperable. What lessons can we learn as we go? Some people are further along than others, and how can we share that information best? It's really good to have weekly calls, [and] we hold events from time to time. Mostly these industry collaborators are driving it and talking with each other, and we just organize them and help them keep moving.

HPCwire: Is there any effort to build best practices in this area? Something that NIST and these collaborators from industry, academia, DOE and DOD could all provide? It would perhaps carry the NIST stamp of authority on best practices for implementing quantum-resistant cryptography.

Dustin Moody: Well, there are the standards that my team is writing; those are written by NIST, and those are the algorithms that people will implement. They'll also get tested and validated by some of our labs at NIST. The migration project is producing documents in a series (NIST SP 1800-38A, NIST SP 1800-38B, NIST SP 1800-38C), and those are updated from time to time; they're sharing what they've learned and putting best practices in there. They are NIST documents, written jointly by the NIST team and these collaborators to share what they've got so far.

HPCwire: What can the potential user community do to be involved? I realize the project is quite mature; it's been around for a while, and you've got lots of people who've been involved already. Are we at the stage where the main participants are working with each other and NIST in developing these algorithms, and it's now a matter of sort of monitoring the tools that come out?

Dustin Moody: I would say every organization should be becoming educated: understanding the quantum threat, knowing what's going on with standardization, knowing that you're going to need to migrate, and what that's going to involve for your organization. It's not going to be easy and pain free, so plan ahead, and all that. If they want to join that collaboration (Migration to PQC), people are still joining from time to time and it is still open, if they have something to share. But for most organizations or groups, it's going to be about creating your plan and preparing for the migration. We want you to wait till the final standards are published, so you're not implementing something that's 99% of the final standard; we want you to wait until that's there, but you can prepare now.

HPCwire: When will they be final?

Dustin Moody: Of the four that we selected, for three of them we put out draft standards a year ago, got public feedback, and have been revising since. The final versions are going to be published this summer. We don't have an exact date, but it'll be this summer.

HPCwire: At that point, will a variety of requirements come around using these algorithms, for example in the U.S. government and perhaps in industry, requiring compliance?

Dustin Moody: Technically, NIST isn't a regulatory agency. So yes, the US government can. I think the OMB says that all agencies need to use our standards. So the federal government has to use the standards that we issue for cryptography, but we know that a wider audience, industry in the United States and globally, tends to use the algorithms that we standardized as well.

HPCwire: We're in a world in which geopolitical tensions are real. Are we worried about rivals from China or Russia, or other competing nations, not sharing their advances? Or is the cryptanalyst community small enough that those kinds of things are not likely to happen because the people know each other?

Dustin Moody: There is a real geopolitical threat in terms of who gets the quantum computer quickest. If China develops that and they're able to break into our cryptography, that's a real threat. In terms of designing the algorithms and making the standards, it's been a very cooperative effort internationally. Industry benefits when a lot of people are using the same algorithms all over the world. And we've seen other countries in global standards organizations say they're going to use the algorithms that were involved in our process.

There are some exceptions, like China never trusting the United States. They're developing their own PQC standards. They're actually very, very similar to the algorithms [we're using], but they were selected internally. Russia has been doing their own thing; they don't really communicate with the rest of the world very much, and I don't have a lot of information on what they're doing. China, even though they are doing their own standards, did have researchers participate in the process; they hosted one of the workshops in the field a few years back. So the community is small enough that people are very good at working together, even if sometimes a country will develop its own standards.

HPCwire: How did you get involved in cryptography? What drew you into this field?

Dustin Moody: Well, I love math, and the math I was studying has some applications in cryptography, specifically something called elliptic curves. There are cryptosystems we use today that are based on elliptic curves, which are these beautiful mathematical objects that probably no one ever thought would be of any use in the real world. But it turns out they are, for cryptography. So that's kind of my hook into cryptography.

I ended up at NIST because NIST has elliptic curve cryptography standards. I didn't know anything about post-quantum cryptography. Around 2014, my boss said, we're going to put you on this project dealing with post-quantum cryptography, and I was like, what's this? I've no idea what this is. Within a couple of years, it really took off and grew and has become this high priority for the United States government. It's been kind of a fun journey to be on.

HPCwire: Will the PQC project just continue or will it wrap up at some point?

Dustin Moody: We'll continue for a number of years. We still have the fourth round to finish. We're still doing this additional digital signature process, which will take several more years. But then again, everything we do in the future needs to protect against quantum computers. So these initial standards will get published, they'll be done at some point, but all future cryptography standards will have to take the quantum threat into account. So it's kind of built in that we have to keep going for the future.

HPCwire: When you talk to the vendor community, they all say, encryption has been implemented in such a haphazard way across systems that it's everywhere, and simply finding where it exists in all those things is difficult. The real goal, they argue, should be to move to a more modular, predictable approach. Is there a way NIST can influence that? Or can the selection of the algorithms influence that?

Dustin Moody: Yes and no. It's very tricky. That idea you're talking about, sometimes the word crypto-agility gets thrown out there in that direction. A lot of people are talking about, okay, we're going to need to migrate these algorithms; this is an opportunity to redesign systems and protocols, maybe we can do it a little bit more intelligently than we did in the past. At the same time, it's difficult to do that, because you've got so many interconnected pieces doing so many things. So it's tricky to do, but we are encouraging people and having lots of conversations, like with the Migration to PQC project. We're encouraging people to think about this, to redesign systems and protocols when you're designing your applications. Knowing I need to transition to these algorithms, maybe I can redesign my system so that if I need to upgrade again at some point, it'll be much easier to do. I can keep track of where my cryptography is, what happens when I'm using it, what information I'm protecting. I hope that we'll get some benefit out of this migration, but it's certainly going to be very difficult, complicated and painful as well.
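The crypto-agility idea Moody describes can be sketched as a thin abstraction layer between application code and the signing algorithm, so a future swap (say, to a PQC signature scheme) is a one-line configuration change rather than a code rewrite. This is a hypothetical illustration, not a real library: the `Signer` interface, registry, and algorithm names are all made up, and HMAC stands in for a signature scheme purely for brevity.

```python
"""Hypothetical sketch of a crypto-agile signing layer."""
import hashlib
import hmac
import secrets


class Signer:
    """Minimal interface every algorithm plugin must implement."""
    def sign(self, message: bytes) -> bytes: ...
    def verify(self, message: bytes, sig: bytes) -> bool: ...


class HmacSha256Signer(Signer):
    """Stand-in 'current' scheme (a MAC, used here only for illustration)."""
    def __init__(self, key: bytes):
        self.key = key

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self.key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), sig)


# Application code only ever touches the registry; migrating to a
# post-quantum scheme would mean registering a new plugin and
# changing ACTIVE_ALGORITHM, nothing else.
REGISTRY = {"hmac-sha256": HmacSha256Signer}
ACTIVE_ALGORITHM = "hmac-sha256"


def get_signer(key: bytes) -> Signer:
    return REGISTRY[ACTIVE_ALGORITHM](key)


key = secrets.token_bytes(32)
signer = get_signer(key)
sig = signer.sign(b"invoice #42")
print(signer.verify(b"invoice #42", sig))
```

The point of the design is the indirection: inventorying "where my cryptography is" reduces to inspecting the registry, which is exactly the visibility Moody says most organizations lack today.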

HPCwire: Do you have an off-the-top-of-your-head checklist, sort of five things you should be thinking about now to prepare for post-quantum cryptography?

Dustin Moody: I'd say number one, just know that the migration is coming. The United States government is mandating that its agencies do it, but industry is also going to need to be doing this migration. The migration is not going to be easy, and it's not going to be pain free. You should be educating yourself as to what PQC is and the whole quantum threat, and starting to figure out where you are using cryptography and what information is protected with it. As you noted, that's not as easy as it should be; very often, you're going to need to use sophisticated tools that are being developed to assist with that. Also talk to your vendors, your CIOs, your CEOs to make sure they're aware and that they're planning budgets to do this. Just because a quantum computer [able to decrypt] isn't going to be built for, who knows, maybe 15 years, they may think they can just put this off, but understanding that the threat is coming sooner than you realize is important.

HPCwire: Thank you for your time!

Strength Categories from NIST

In accordance with the second and third goals above (Submission Requirements and Evaluation Criteria for the Post-Quantum Cryptography Standardization Process), NIST will base its classification on the range of security strengths offered by the existing NIST standards in symmetric cryptography, which NIST expects to offer significant resistance to quantum cryptanalysis. In particular, NIST will define a separate category for each of the following security requirements (listed in order of increasing strength):

1) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 128-bit key (e.g. AES-128)

2) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 256-bit hash function (e.g. SHA-256/ SHA3-256)

3) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 192-bit key (e.g. AES-192)

4) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 384-bit hash function (e.g. SHA-384/ SHA3-384)

5) Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 256-bit key (e.g. AES-256)
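The five categories above can be summarized in a small table. As a rough illustration (not part of the NIST text), the classical bit strengths follow from the named primitives: exhaustive key search on an n-bit cipher costs about 2^n operations, while a birthday-style collision search on an n-bit hash costs about 2^(n/2).

```python
# Illustrative summary of the five NIST PQC security categories.
# Classical strength: 2^n key search for AES-n; 2^(n/2) birthday
# collision search for an n-bit hash.
CATEGORIES = {
    1: ("AES-128 key search", 128),
    2: ("SHA-256 collision search", 128),
    3: ("AES-192 key search", 192),
    4: ("SHA-384 collision search", 192),
    5: ("AES-256 key search", 256),
}

for level, (attack, bits) in sorted(CATEGORIES.items()):
    print(f"Category {level}: {attack:26s} ~{bits}-bit classical strength")
```

Note that categories 2 and 4 match the classical bit strength of categories 1 and 3, but NIST ranks them higher because quantum speedups for collision search are believed to be weaker than Grover's quadratic speedup for key search.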

Original post:
NIST Q&A: Getting Ready for the Post Quantum Cryptography Threat? You Should be. - HPCwire


Quantum firm D-Wave extends agreement with European Aramco Research Center – DatacenterDynamics

Quantum computing firm D-Wave has extended its agreement with the European arm of Saudi oil company Aramco.

D-Wave has partnered with the Aramco Research Center in Delft, the Netherlands, for the past two years, developing quantum technologies to solve geophysical optimization problems, including intensive seismic imaging.

It's unclear by how long the partnership has been extended.

According to the Aramco Research Center, the team has used quantum technology provided by D-Wave to create its first subsurface maps, using tens of gigabytes of seismic data as input. The center aims to process a terabyte of seismic data with the D-Wave quantum computer this year.

In a statement, Marcin Dukalski, quantum applications lead at the Aramco Research Center, said: "I'm excited to see how far we've been able to push quantum technologies to tackle such a large optimization problem as subsurface imaging."

He added: "We look forward to expanding our work with D-Wave, which will be centered on reaping even greater tangible benefits from the Advantage2 system."

First unveiled in 2022, D-Wave's Advantage2 prototype features more than 1,200 qubits and 10,000 couplers and reportedly demonstrates a 20x faster time-to-solution on hard optimization problems.

In March, the company avoided being delisted from the New York Stock Exchange for the third time after it brought its average closing share price back above $1 over a 30-day period.

Its stock price has not dipped below the $1 mark since.

Meanwhile, state-owned Aramco is the world's largest producer of oil and has rights to the world's second-largest proven crude oil reserves. In 2017, a report by CDP and the Climate Accountability Institute found that Aramco was responsible for 4.5 percent of global industrial greenhouse gas emissions from 1988 to 2015, placing second only to the country of China.

In May 2024 the company announced it was partnering with French startup Pasqal to install the first quantum computer in Saudi Arabia.

Under the terms of that agreement, neutral atom quantum vendor Pasqal will install, maintain, and operate a 200-qubit quantum computer scheduled for deployment in the second half of 2025.

View post:
Quantum firm D-Wave extends agreement with European Aramco Research Center - DatacenterDynamics


Quantum Computing Revolutionizes AGV Scheduling – AZoQuantum

In an article recently published in the journal Scientific Reports, researchers investigated the potential of quantum computing technology for solving the automated guided vehicle (AGV) scheduling problem.

Currently, AGVs are used extensively in production, transportation, and logistics, which has significantly improved industrial intelligence and automation levels and enhanced efficiency. The amount of parallel work AGVs do is increasing to meet the requirements of application scenarios, which greatly increases the AGV scheduling challenge.

The AGV scheduling problem is a challenging combinatorial optimization problem. Although several studies have examined AGV scheduling across multiple scenarios, such as terminals and workshops, finding high-quality scheduling solutions within a short timeframe remains a major challenge.

Significant progress has been achieved recently in both practical applications and theoretical understanding of quantum computing. Quantum computers' dependence on quantum mechanical principles is their fundamental difference from traditional computers.

Specifically, quantum bits are utilized as fundamental information storage units in quantum computers, which enable these computers to hold substantially more information than traditional computers. Additionally, quantum computers are advantageous for addressing problems like combinatorial optimization. Combinatorial optimization problems can be mapped to the Ising model's ground state search problem.
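The mapping from a combinatorial problem to the Ising (or equivalently QUBO) ground-state search can be illustrated on a toy instance. The example below is not the paper's model; it is a minimal, made-up sketch: choose exactly one of three routes, preferring the cheapest, encoded as an energy function whose minimum is the desired solution. An Ising machine searches the same energy landscape physically; here we brute-force it.

```python
# Toy QUBO: E(x) = sum(c_i * x_i) + P * (sum(x_i) - 1)^2
# The penalty term P enforces the "choose exactly one route" constraint;
# the linear term prefers cheaper routes. Costs are illustrative.
from itertools import product

costs = [3.0, 1.0, 2.0]   # hypothetical route costs
P = 10.0                  # penalty weight (must exceed any cost gap)


def energy(x):
    return sum(c * xi for c, xi in zip(costs, x)) + P * (sum(x) - 1) ** 2


# Brute-force the 2^3 assignments; a quantum annealer or coherent
# Ising machine would relax toward this minimum instead.
best = min(product([0, 1], repeat=3), key=energy)
print(best)  # (0, 1, 0): the cheapest route is selected
```

Real AGV scheduling QUBOs follow the same pattern at much larger scale: objective terms encode travel time or makespan, and penalty terms encode constraints such as each task being assigned exactly once.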

In this regard, the scheduling problem of AGVs could be considered as a type of routing problem.

Traditional solutions for routing problems often require significant computational resources. However, quantum computing techniques have displayed great potential in solving optimization and routing problems. Although several studies have utilized quantum computing to solve practical optimization problems, quantum computing research on AGV scheduling remains at the nascent stage, with several researchers using simulators to solve them.

In this study, researchers applied quantum computing technology to the AGV scheduling problem and proposed new quadratic unconstrained binary optimization (QUBO) models adapted to solving the problem under two separate criteria: minimizing the overall AGV travel time and minimizing the task completion time, or makespan.

Specifically, two types of QUBO models suitable for various AGV scheduling objectives were constructed, and the scheduling scheme was coded into the Hamiltonian operator's ground state. The problem was solved using an optical coherent Ising machine (CIM).

The objective of the study was to effectively meet the requirements of large-scale scheduling.

In traditional AGV scheduling problem research, the computation time significantly increases with the rising number of tasks and AGVs. In practical scenarios, dispatchers set several scheduling objectives based on the nature of the work, with minimizing the total travel time and task completion time being the most common objectives. Thus, researchers constructed the QUBO models based on different objectives and presented the solutions and theoretical underpinnings for each.

The CIM and a traditional computer were used to perform the numerical experiments on the proposed QUBO model and the traditional model, respectively. Gurobi solver was utilized to solve the proposed mixed integer programming (MIP) model on a traditional computer, and its computing performance was demonstrated under various problem scales.

Additionally, an optical quantum computer was employed to solve the arc and node models' problem cases at different scales, and the computation performance was compared with the performance of traditional computers. The components of the CIM used in this study were primarily composed of electrical and optical parts.

The machine's optical part was composed of periodically poled lithium niobate crystals, fiber rings, erbium-doped fiber amplifiers, and pulsed lasers. The electrical part consisted of field-programmable gate arrays, analog-to-digital/digital-to-analog converters, and optical balanced homodyne detectors.

The comparison of the arc and node model performance on a quantum computer with the MIP model performance on traditional computers showed that the solutions obtained using CIM were all optimal. In small-scale examples, the CIM was significantly faster than the traditional computer.

Unlike traditional computers, CIM's computation time did not increase significantly with increasing problem scales. This indicates CIM's great application and development potential. Additionally, little difference was observed in the computing performance between the arc model and the node model on the quantum computer.

Specifically, the node model was slightly faster than the arc model, while the arc model was more universal. Overall, the experimental results showed that the optical quantum computer could save 92% of the computation time on average compared to the traditional calculation method.

To summarize, the findings of this study demonstrated that CIM has significant application potential in solving the AGV scheduling problem and other similar combinatorial optimization problems. However, the benefits of quantum computing in large-scale situations/problems could not be demonstrated due to hardware constraints, which was the major limitation of this study.

Tang, L., Yang, C., Wen, K., Wu, W., Guo, Y. (2024). Quantum computing for several AGV scheduling models. Scientific Reports, 14(1), 1-16. https://doi.org/10.1038/s41598-024-62821-6, https://www.nature.com/articles/s41598-024-62821-6

Disclaimer: The views expressed here are those of the author expressed in their private capacity and do not necessarily represent the views of AZoM.com Limited T/A AZoNetwork the owner and operator of this website. This disclaimer forms part of the Terms and conditions of use of this website.

Link:
Quantum Computing Revolutionizes AGV Scheduling - AZoQuantum


Singapore looks to boost AI with plans for quantum computing and data centers – ZDNet


Singapore is looking to carve out a global footprint in artificial intelligence (AI) with the release of international standards for large language model (LLM) testing and investments in quantum computing and new data center capacity.

"Quantum has the potential to unlock new value, where higher processing capabilities can be harnessed in areas such as simulating complex molecules for drug discovery," said Deputy Prime Minister Heng Swee Keat at last week's Asia Tech x Singapore 2024 summit.


He added that quantum computing can also have synergies with AI, for example, in improving the efficiency of developing and training advanced AI models. This development, in turn, can further drive innovations in deep learning, natural language processing, and computer vision.

However, there still are challenges to resolve in quantum, including requirements for cryogenic cooling and error correction, Heng said. He noted that researchers worldwide were assessing different approaches to achieve scale and enable quantum computing to be commercially viable.


Singapore wants to address these challenges with its National Quantum Strategy, coupled with almost SG$300 million ($221.99 million) in investment. This cash is on top of a previous SG$96.6 million commitment announced in 2022. The new investment is earmarked for five years, through to 2030, to boost the country's position as a leading hub in the development and deployment of quantum technologies, Heng said.

This roadmap focuses on four areas, including initiatives in quantum research, such as quantum communications and security and quantum processors, and a scholarship program to produce 100 PhD and 100 master's-level graduates over the next five years, he said.

Efforts are underway for Singapore to build capabilities in the design and development of quantum processors. This work will encompass research on qubit technologies, including photonic networks, neutral atoms, and superconducting circuits.

ZDNET understands Singapore's target is to have the first prototype ready in the next three years and scale out production in five years.

The government in 2022 unveiled a three-year initiative to build a quantum-safe network that it hopes will showcase "crypto-agile connectivity" and facilitate trials with both public and private organizations. The initiative also includes a quantum security lab for vulnerability research.

Singapore last week also launched its green data center roadmap to chart "digital sustainability and chart green growth pathways" for such facilities, supporting AI and computing developments.

The country has over 1.4 gigawatts of data center capacity and is home to more than 70 cloud, enterprise, and co-location data centers.

Singapore is aiming to add at least 300 megawatts of additional data center capacity "in the near term" and another 200 megawatts through green energy deployments, said Janil Puthucheary, senior minister of state for the Ministry of Communications and Information, at the summit.

Efforts will be made to enhance efficiency through both hardware and software, Puthucheary said, pointing to technologies that maximize energy efficiency and capacity, and green software tools.

He added that improving data center efficiency is also about greening software, so the carbon emissions of applications can be reduced.

He said the focus will be placed on data centers to accelerate their use of green energy, with the government offering support via grants and incentives to switch to energy-efficient IT equipment. In addition, the Infocomm Media Development Authority (IMDA) will work with PUB, the national water agency, to help data centers push their water usage effectiveness (WUE) down to 2.0 cubic meters or less per megawatt hour, from the 2021 median WUE of 2.2 cubic meters.


IMDA will jointly develop standards and certifications with industry partners to drive the development and operation of data centers with power usage effectiveness (PUE) of 1.3 or lower.

In addition, the BCA-IMDA Green Mark for data centers will be refreshed by year-end to raise the standards for energy efficiency in data centers. IMDA will also introduce standards for IT equipment energy efficiency and liquid cooling by 2025, to drive the adoption of these technologies in Singapore.

The green data center roadmap outlines plans to reduce energy use for air-cooling by raising operating temperatures via IMDA's tropical DC methodology.

According to the government agency, data centers can achieve 2% to 5% energy savings for every 1°C increase in operating temperature.
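The rough arithmetic behind that figure compounds per degree. The numbers below are hypothetical (a made-up baseline and the midpoint of the quoted 2% to 5% range), just to show the scale of savings from a 4°C setpoint increase.

```python
# Compounded cooling-energy savings from raising the operating
# temperature, using the midpoint of the quoted 2-5% per degC range.
baseline_kwh = 1_000_000   # hypothetical annual cooling energy
per_degree = 0.03          # 3% savings per degC (illustrative midpoint)
degrees = 4                # hypothetical setpoint increase

remaining = baseline_kwh * (1 - per_degree) ** degrees
saved_fraction = 1 - remaining / baseline_kwh
print(f"Energy after +{degrees} degC: {remaining:,.0f} kWh "
      f"({saved_fraction:.1%} saved)")
```

At 3% per degree, four degrees compounds to roughly an 11.5% reduction, which is why small temperature changes figure prominently in data center efficiency roadmaps.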

It also pointed to simulations that have found existing data centers can achieve a 50% reduction in energy consumption of supporting infrastructure, with energy-efficient retrofits and upgrades for key equipment, such as chiller plants and uninterruptible power supplies.

"We aim to uplift all data centers in Singapore to achieve PUE of less than 1.3 at 100% IT load over the next 10 years," IMDA said. "This gives existing data centers sufficient time to plan for upgrades."

The tech industry today emits an estimated 1.5% to 4% of global greenhouse gas emissions, Heng noted, with this figure projected to climb as the use of AI expands alongside the need for data storage and processing.


He said technologies that drive the country's digital economy, such as cloud and AI, fuel demand for powerful and energy-intensive computing.

"Data centers lie at the heart of such activities and require large amounts of energy for processing and cooling. Greening ICT, especially data centers, is therefore crucial in a digital and carbon-constrained world," he said.

"There is a need to balance the economic and social benefits of digital applications with the environmental effects from the resultant emissions," he said, noting that Singapore has committed to a net-zero target by 2050.

"The [green data center] roadmap sets out low-carbon energy sources that data centers can explore, which include bioenergy, fuel cells with carbon capture, low-carbon hydrogen and ammonia for a start," Puthucheary explained. "We welcome proposals from the industry to push boundaries in realizing these pathways in Singapore."

Meanwhile, the country wants to lead the way by releasing standards for large language model (LLM) testing, developed via partnerships with global organizations such as MLCommons, IBM, and Singtel.

Dubbed Project Moonshot, the LLM testing tool provides benchmarking, red-teaming, and testing baselines to help developers and organizations mitigate risks associated with LLM deployment.


LLMs without guardrails can reinforce biases and create harmful content, with unintended consequences. "IMDA is seeking to establish guardrails to manage the risks while enabling space for innovation," the government agency said.

"It is important to adopt an agile, test-and-iterate approach to address key risks in model development and use. Project Moonshot provides intuitive results, so testing unveils the quality and safety of a model or application in an easily understood manner, even for a non-technical user."

The testing tool provides a five-tier scoring system in which each completed scoring sheet places the application on a scale. Grade cut-offs can be determined by the author of each scoring sheet.
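The five-tier idea can be sketched in a few lines. This is not Project Moonshot's actual implementation; the tier labels, cut-offs, and function names below are illustrative stand-ins for the author-defined scoring sheets the article describes.

```python
# Hypothetical sketch of an author-configurable five-tier grading scheme.
# Cut-offs are ascending thresholds chosen by the scoring-sheet author;
# a raw score in [0, 1] is mapped onto the E-A scale.
from bisect import bisect_right


def grade(score: float,
          cutoffs: tuple = (0.2, 0.4, 0.6, 0.8),
          tiers: str = "EDCBA") -> str:
    """Return the tier whose band contains the score."""
    return tiers[bisect_right(cutoffs, score)]


print(grade(0.95), grade(0.5), grade(0.1))  # A C E
```

Because the cut-offs are parameters rather than constants, two scoring sheets can grade the same raw results differently, which matches the article's point that the sheet's author controls the grade boundaries.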

AI Verify Foundation and MLCommons jointly developed the LLM testing benchmarks. The latter is an open-engineering consortium supported by Qualcomm, Google, Intel, and NVIDIA and recognized by the US National Institute of Standards and Technology under its AI Safety Consortium. AI Verify Foundation is a Singapore not-for-profit foundation that focuses on developing AI testing tools.


Project Moonshot is currently available as an open beta.

IMDA said it is working with companies such as Anthropic to develop a practical guide to multilingual and multicultural red-teaming for LLMs. The guide is slated for release later this year for global use.

Read more here:
Singapore looks to boost AI with plans for quantum computing and data centers - ZDNet


UK quantum company reaches new milestones on the path to powerful quantum computers – E&T Magazine

Quantinuum has announced that its H-Series processor has surpassed the ability of the world's best supercomputers to simulate it, enabling the company to extend its lead in the race towards fault-tolerant quantum computing.

The UK quantum computing company has announced a major qubit (quantum bit) count enhancement to its flagship System Model H2 quantum computer, from 32 to 56 trapped-ion qubits.

Formed in 2021 by Honeywell and Cambridge Quantum, Quantinuum has to date raised approximately $625m to further the development and commercialisation of quantum computing. Its mission is to see the quantum computing industry depart the era when quantum computers could be simulated by a classical computer.

This has now been achieved: the upgrade of H2-1 from 32 to 56 qubits makes the processor impossible to simulate classically.

The upgrade is also challenging the world's most powerful supercomputers, as the company showed in a demonstration conducted in partnership with US finance company JPMorgan Chase and a team from research institutions Caltech and Argonne National Laboratory.

The demonstration tackled a well-known test of quantum computing power, random circuit sampling, and measured the quality of results with a suite of tests including the linear cross-entropy benchmark, an approach first made famous by Google in 2019 in a bid to demonstrate quantum supremacy.

The results on H2-1 reveal a significant 100x lift in performance over prior industry results from Google, setting a new world record for the cross-entropy benchmark.

According to Quantinuum, H2-1's combination of scale and hardware fidelity makes it difficult for today's most powerful supercomputers and other quantum computing architectures to match this result.
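The linear cross-entropy benchmark mentioned above has a simple form: F = 2^n * mean(p_ideal(x)) - 1, where p_ideal(x) is the classically simulated probability of each bitstring x the device actually measured. F is near 1 for a perfect device and near 0 for uniformly random output. The sketch below uses made-up probabilities for a toy 2-qubit case; it illustrates the scoring formula only, not Quantinuum's or Google's experiment.

```python
# Sketch of the linear cross-entropy benchmark (XEB) score.
def linear_xeb(n_qubits, ideal_probs, samples):
    """ideal_probs: dict bitstring -> classically simulated probability;
    samples: list of bitstrings measured on the device."""
    mean_p = sum(ideal_probs[s] for s in samples) / len(samples)
    return (2 ** n_qubits) * mean_p - 1


# Toy 2-qubit example with a peaked ideal distribution (made-up numbers).
ideal = {"00": 0.40, "01": 0.10, "10": 0.10, "11": 0.40}
good_samples = ["00", "11", "00", "11", "01"]   # device tracks the peaks
print(round(linear_xeb(2, ideal, good_samples), 2))  # 0.36
```

Intuitively, a device that preferentially outputs the bitstrings the ideal circuit makes likely scores high, while a noisy device whose output looks uniform scores near zero, which is why XEB is used as a fidelity proxy for random circuit sampling.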

"We're extending our lead in the race towards fault-tolerant quantum computing, accelerating research for customers like JPMorgan Chase in ways that aren't possible with any other technology," said Rajeeb Hazra, CEO of Quantinuum.

"Our focus on quality of qubits versus quantity of qubits is changing what's possible, and bringing us closer to the long-awaited commercialisation of quantum's applications across industries like finance, logistics, transportation and chemistry."

Quantinuum's analysis also indicates that not only does H2-1 achieve significant system-level performance, it does so with an estimated 30,000x reduction in power consumption compared to classical supercomputers.

Continue reading here:
UK quantum company reaches new milestones on the path to powerful quantum computers - E&T Magazine


AWS Custom Silicon Chips Range a Sign of What’s Coming to APAC Cloud Computing – TechRepublic

The surge in AI computing has resulted in delays to the supply of AI-capable chips, as demand has outstripped supply. Global giants Microsoft, Google and AWS are ramping up custom silicon production to reduce dependence on the dominant suppliers of GPUs, NVIDIA and AMD.

As a result, APAC enterprises may soon find themselves utilising an expanding array of chip types in cloud data centres. The chips they choose will depend on the compute power and speed required for different application workloads, cost and cloud vendor relationships.

Compute-intensive tasks like training an AI large language model require massive amounts of computing power. As demand for AI computing has risen, super advanced semiconductor chips from the likes of NVIDIA and AMD have become very expensive and difficult to secure.

The dominant hyperscale cloud vendors have responded by accelerating the production of custom silicon chips in 2023 and 2024. These programs will reduce dependence on the dominant suppliers, so the vendors can deliver AI compute services to customers globally, including in APAC.

Google debuted its first-ever custom ARM-based CPU with the release of the Axion processor at its Cloud Next conference in April 2024. Building on a decade of custom silicon work, the step up to producing its own CPUs is designed to support a variety of general-purpose computing, including CPU-based AI training.

For Google's cloud customers in APAC, the chip is expected to enhance Google's AI capabilities within its data center footprint, and will be available to Google Cloud customers later in 2024.

Microsoft, likewise, has unveiled its first in-house custom accelerator optimised for AI and generative AI tasks, which it has badged the Azure Maia 100 AI Accelerator. This is joined by its own ARM-based CPU, the Cobalt 100, both of which were formally announced at Microsoft Ignite in November 2023. The firm's custom silicon for AI has already been in use for tasks like running OpenAI's ChatGPT 3.5 large language model. The global tech giant said it was expecting a broader rollout into Azure cloud data centres for customers from 2024.

AWS' investment in custom silicon chips dates back to 2009. The firm has now released four generations of its Graviton CPU, which have been rolled out into data centres worldwide, including in APAC; the processors were designed to increase price performance for cloud workloads. These have been joined by two generations of Inferentia chips for deep learning and AI inferencing, and two generations of Trainium for training AI models with 100B+ parameters.

At a recent AWS Summit held in Australia, Dave Brown, vice president of AWS Compute & Networking Services, told TechRepublic the cloud provider's reason for designing custom silicon was about providing customers choice and improving the price performance of available compute.

"Providing choice has been very important," Brown said. "Our customers can find the processors and accelerators that are best for their workload. And with us producing our own custom silicon, we can give them more compute at a lower price," he added.

AWS has long-standing relationships with major suppliers of semiconductor chips. For example, AWS' relationship with NVIDIA, the now-dominant player in AI, dates back 13 years, while Intel, which has released Gaudi accelerators for AI, has been a supplier of semiconductors since the cloud provider's beginnings. AWS has been offering chips from AMD in data centres since 2018.

Brown said the cost optimisation fever that has gripped organisations over the last two years, as the global economy has slowed, has seen customers moving to AWS Graviton in every single region, including in APAC. He said the chips have been widely adopted, by more than 50,000 customers globally, including all of the hyperscaler's top 100 customers. "The largest institutions are moving to Graviton because of performance benefits and cost savings," he said.


The wide deployment of custom AWS silicon is seeing customers in APAC utilise these options.

Enterprise customers in APAC could benefit from an expanding range of compute options, whether that is measured by performance, cost or appropriateness to different cloud workloads. Custom silicon options could also help organisations meet sustainability goals.

The competition provided by cloud providers, in tandem with chip suppliers, could drive advances in chip performance, whether that is in the high-performance computing category for AI model training, or innovation for inferencing, where latency is a big consideration.

Cloud cost optimisation has been a major issue for enterprises, as expanding cloud workloads have led customers into ballooning costs. More hardware options give customers more options for reducing overall cloud costs, as they can more discerningly choose appropriate compute.

A growing range of custom silicon chips within cloud services will allow enterprises to better match their application workloads to the specific characteristics of the underlying hardware, ensuring they can use the most appropriate silicon for the use cases they are pursuing.

Sustainability is predicted to become a top-five factor for customers when procuring cloud vendors by 2028. Vendors are responding: for instance, AWS said carbon emissions can be slashed using Graviton4 chips, which it says are 60% more energy efficient. Custom silicon will help improve overall cloud sustainability.

More:
AWS Custom Silicon Chips Range a Sign of What's Coming to APAC Cloud Computing - TechRepublic
