
Cohesity goes epic on AMD Epyc – ComputerWeekly.com

Data security and management company Cohesity is following (if not leading) the infrastructure efficiency efforts being seen across the wider technology industry with recent work focused on energy-efficient computing.

Cohesity Data Cloud now supports AMD Epyc CPU-powered servers.

Epyc (pronounced "epic") is AMD's brand of multi-core x86-64 microprocessors based on the company's Zen microarchitecture.

The two firms have collaborated to make sure users can deploy Cohesity Data Cloud on AMD Epyc CPU-based all-flash and hybrid servers from Dell, Hewlett Packard Enterprise (HPE) and Lenovo.

Reminding us that organisations face challenges from ransomware and cyberattacks to stringent regulatory requirements, IT constraints, tight budgets and tough economic conditions, Cohesity says that to solve these challenges, companies need to take advantage of technology that is best suited to their specific requirements.

"Customers each have unique needs but a common goal: securing and gaining insight from their data. They trust Cohesity, in part, because we strive to offer the largest ecosystem with the most choices to suit their preferences," said John Davidson, group vice president, Americas sales, Cohesity.

By supporting AMD Epyc CPU-powered servers, Davidson says his firm is opening up new options for users to customise and modernise their datacentres.

"[Customers can] increase performance and deliver energy, space and cost savings so they can execute their data security and management strategy on their preferred hardware configurations," he added.

All-flash servers have become an increasingly popular choice for organisations with high-demand applications and workloads, stringent power budgets for their datacentres, or increasing storage capacity requirements and little physical space within their datacentre.

As Supermicro notes, "All-flash data storage refers to a storage system that uses flash memory for storing data instead of spinning hard disk drives. These systems contain only solid-state drives (SSDs), which use flash memory for storage. They are renowned for their speed, reliability, low energy consumption and reduced latency, making them ideal for data-intensive applications and workloads."

Cohesity now offers AMD-powered all-flash servers from HPE to modernise customer datacentres and meet the requirements of green initiatives through the greater density, performance and cost savings all-flash servers provide over traditional servers.

Single-socket 1U HPE servers based on AMD Epyc can reduce the number of required nodes and power costs by up to 33% when compared with dual-socket 2U servers based on other CPUs.

Cohesity's AI-powered data security and management capabilities are now generally available on AMD-powered all-flash servers from HPE and hybrid servers from Dell and Lenovo.


Ethereum co-founders reflect on 10-year anniversary EthCC – Cointelegraph

Ethereum has evolved extensively since its inception in 2014, but usability and technical improvements remain hurdles as the protocol enters its second decade of life.

Ethereum co-founders Vitalik Buterin and Joseph Lubin highlighted these focal points as the ecosystem gathered in Brussels for a week-long program of events centered around EthCC.

Recent years have seen Ethereum successfully shift to proof-of-stake consensus and adopt a layer-2 centric approach to scaling the base layer of the network.

This journey has been dotted with successes and challenges as the ecosystem looks to attract mainstream acceptance and adoption.

Cointelegraph spoke exclusively to Lubin during a side event hosted by Linea, Consensys' Ethereum zkEVM.

Buterin and Lubin have openly hailed the advent of zero-knowledge (ZK) proof-powered layer 2s (L2s) as the future of Ethereums development.

The ability of these platforms to batch transactions offchain and submit cryptographic proofs to Ethereums base layer has brought significant performance and cost reductions to end-users.
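As a conceptual sketch of that batching (the names and the stub proof below are illustrative assumptions, not Linea's actual protocol or APIs), a rollup publishes one small commitment-plus-proof payload to the base layer for an arbitrarily large batch of offchain transactions:

```python
import hashlib
import json

# Toy rollup batching: many offchain transactions are compressed into one
# small onchain payload. A real zkEVM posts a validity proof alongside the
# state commitment; the "proof" below is just a placeholder string.

def batch_commitment(txs):
    """Hash a batch of transactions into a single fixed-size commitment."""
    return hashlib.sha256(json.dumps(txs, sort_keys=True).encode()).hexdigest()

offchain_batch = [{"from": "alice", "to": "bob", "amount": n} for n in range(1000)]

l1_payload = {
    "state_commitment": batch_commitment(offchain_batch),
    "validity_proof": "<zk proof bytes would go here>",  # placeholder, not a real proof
}

# One fixed-size payload lands on L1, so its cost is amortized
# across all 1,000 transactions in the batch.
print(l1_payload)
```

The cost savings come from exactly this amortization: the L1 fee for the commitment and proof is shared by every transaction in the batch.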

Reflecting on the current state of the ecosystem, the Consensys CEO said major strides had been achieved in terms of performance.

Lubin also added that zkEVMs have proven to be the key technological approach that is best suited to bringing high-speed, low-cost functionality to the ecosystem.

Optimistic rollups presented a stopgap for scaling challenges. As Lubin explained, optimistic approaches had "edge use cases," and fraud proofs that required up to two weeks for final settlement were "just not going to be the real direction of travel."

"With ZK tech, we believe we developed the right course. I don't imagine that these network states will be heavily built on optimistic technology," Lubin said.

Related: Exclusive: Joe Lubin unpacks SEC battle, Ethereum roadmap and Vitalik Buterin

While he admitted that optimistic technology will "remain useful for certain things," Lubin said it would be "foolish" to use technology that is less efficient and less effective when validity proofs will be available.

User experience remains a crucial hurdle to overcome as the wider Ethereum ecosystem looks to abstract some complexity away from end-users.

Buterin drew a full house for his keynote talk at EthCC, where he focused on the challenges remaining for Ethereum as the base layer of its L2-centric approach.

Key concerns highlighted by the Ethereum co-founder included the barriers stopping Ether (ETH) holders from solo staking and pooled staking.

Buterin also said that censorship is one of the biggest risks facing the ecosystem. This could include the censorship of transactions by nodes and other more technical concerns related to malicious parties influencing how transactions are processed.

A proposed approach to combating network attacks or censorship is to increase the quorum threshold to 75% or more. Buterin explained that this could avoid a potentially censored chain reaching finality.

Related: MetaMask unveils new toolkit to streamline Web3 user onboarding

The Ethereum co-founder also highlighted his vision for the majority of Ethereum users to be able to run light clients on mobile devices to verify the base layer and L2s.

Protocol simplification was top of mind as Buterin wrapped up his address, highlighting how the layer 1 can be improved while still leaning heavily on the benefits afforded by L2 networks.

"Ethereum has unique strengths as a robust base layer, including some that are not even held by Bitcoin," Buterin said.

Magazine: As Ethereum phishing gets harder, drainers move to TON and Bitcoin


Vitalik Explores Ethereums Strengths and Weaknesses at ETHCC – CoinJournal

At the Ethereum Community Conference, the cryptocurrency's co-founder, Vitalik Buterin, gave a keynote speech on its strengths and weaknesses. He began by highlighting the strength of the Ethereum ecosystem, describing it as large and reasonably decentralised, and pointing to a range of applications that shows Ethereum's versatility.

Buterin mentioned that Ethereum's weaknesses still need to be addressed. The co-founder is known for openly discussing the cryptocurrency's weaknesses despite its numerous successes.

Among the more prominent weaknesses is Ethereum's usability, which renders the network incredibly complicated for non-experts and discouraging for new users. He pushed for simplifying the existing protocol to ease things for developers and users alike.

Furthermore, Buterin noted that solo staking is still very difficult, as the current process requires 32 ETH before one can become a validator, in addition to the complicated process of running a node. However, he reassured the crowd that these issues are "very addressable."

He also advocated for preparedness in the hypothetical case of a 51% attack and stated that his biggest concern is that it could result in network censorship. He admitted that developing a solution for this is not simple, as "it depends on a lot of assumptions around coordination, ideology, and various other things, and it's not clear how to do something like that as well in 10 years."

However, he proposed that increasing the quorum threshold from 75% to 80% could help the network recover from attacks by preventing a censored chain from reaching finality.

"We want to make the response to 51% attacks be as automated as possible," Buterin stated. Essentially, if a validator or transaction is censored, it will counter-censor the majority chain, and all honest nodes will coordinate on the same minority software.

Vitalik Buterin presented an honest and unbiased picture of Ethereum's current state while proffering solutions to certain issues through various technological innovations.

For example, to address Ethereums scalability issues, Vitalik stated that a solution lies in shards, which will allow for the division of workload among parallel chains. He also stated that continuous research will be conducted to improve and address security concerns.

To round up his speech, he stressed the value of doubling down on strengths while still recognising and fixing any inadequacies.


Strange Motion of Neutrons Proves Nature Is Fundamentally Bizarre – ScienceAlert

At the very smallest scales, our intuitive view of reality no longer applies. It's almost as if physics is fundamentally indecisive, a truth that gets harder to ignore as we zoom in on the particles that pixelate our Universe.

In order to better understand it, physicists had to devise an entirely new framework to place it in, one based on probability over certainty. This is quantum theory, and it describes all sorts of phenomena, from entanglement to superposition.

Yet in spite of a century of experiments showing just how useful quantum theory is at explaining what we see, it's hard to shake our 'classical' view of the Universe's building blocks as reliable fixtures in time and space. Even Einstein was forced to ask his fellow physicist, "Do you really believe the Moon is not there when you are not looking at it?"

Numerous physicists have asked over the decades whether there is some way the physics we use to describe macroscopic experiences can also be used to explain all of quantum physics.

Now a new study has also determined that the answer is a big fat nope.

Specifically, neutrons fired in a beam in a neutron interferometer can exist in two places at the same time, something that is impossible under classical physics.

The test is based on a mathematical assertion called the Leggett-Garg inequality, which states that a system is always determinately in one or the other of the states available to it. Basically, Schrödinger's Cat is either alive or dead, and we are able to determine which of those states it is in without our measurements having an effect on the outcome.

Macro systems, those we can reliably understand using classical physics alone, obey the Leggett-Garg inequality. But systems in the quantum realm violate it. The cat is alive and dead simultaneously, an analogy for quantum superposition.
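For reference, the simplest three-time version of the inequality can be stated compactly; here \(C_{ij}\) denotes the correlation between measurements at times \(t_i\) and \(t_j\), and this is the standard textbook form rather than notation from the study itself:

```latex
K_3 = C_{21} + C_{32} - C_{31} \le 1
```

Macrorealist systems respect this bound, while quantum mechanics allows \(K_3\) to reach 3/2 for a two-level system; a violation of this kind is what the neutron experiment looks for.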

"The idea behind it is similar to the more famous Bell's inequality, for which the Nobel Prize in Physics was awarded in 2022," says physicist Elisabeth Kreuzgruber of the Vienna University of Technology.

"However, Bell's inequality is about the question of how strongly the behavior of a particle is related to another quantum entangled particle. The Leggett-Garg inequality is only about one single object and asks the question: how is its state at specific points in time related to the state of the same object at other specific points in time?"

The neutron interferometer involves firing a beam of neutrons at a target. As the beam travels through the apparatus, it splits in two, with each of the beam's prongs traveling separate paths until they are later recombined.

Leggett and Garg's theorem states that a measurement on a simple binary system can effectively give two results. Measure it again in the future, and those results will be correlated, but only up to a certain point.

For quantum systems, Leggett and Garg's bound no longer holds, permitting correlations above this threshold. In effect, this gives researchers a way to distinguish whether a system needs quantum theory to be understood.

"However, it is not so easy to investigate this question experimentally," says physicist Richard Wagner of the Vienna University of Technology. "If we want to test macroscopic realism, then we need an object that is macroscopic in a certain sense, i.e. that has a size comparable to the size of our usual everyday objects."

In order to achieve this, the space between the two parts of the neutron beam in the interferometer is on a scale that's more macro than quantum.

"Quantum theory says that every single neutron travels on both paths at the same time," says physicist Niels Geerits of the Vienna University of Technology. "However, the two partial beams are several centimeters apart. In a sense, we are dealing with a quantum object that is huge by quantum standards."

Using several different measurement methods, the researchers probed the neutron beams at different times. And, sure enough, the measurements were too closely correlated for the classical rules of macro reality to be at play. The neutrons, their measurements suggested, were actually traveling simultaneously on two separate paths, separated by a distance of several centimeters.

It's just the latest in a long string of Leggett-Garg experiments that show we really do need quantum theory in order to describe the Universe we live in.

"Our experiment shows: Nature really is as strange as quantum theory claims," says physicist Stephan Sponar of the Vienna University of Technology. "No matter which classical, macroscopically realistic theory you come up with: It will never be able to explain reality. It doesn't work without quantum physics."

The research has been published in Physical Review Letters.


With spin centers, quantum computing takes a step forward – EurekAlert

Image: Shan-Wen Tsai (left) and Troy Losey. Credit: Tsai lab, UC Riverside.

RIVERSIDE, Calif. -- Quantum computing, which uses the laws of quantum mechanics, can solve pressing problems in a broad range of fields, from medicine to machine learning, that are too complex for classical computers. Quantum simulators are devices made of interacting quantum units that can be programmed to simulate complex models of the physical world. Scientists can then obtain information about these models, and, by extension, about the real world, by varying the interactions in a controlled way and measuring the resulting behavior of the quantum simulators.

In a paper published in Physical Review B, and selected by the journal as an editors' suggestion, a UC Riverside-led research team has proposed a chain of quantum magnetic objects, called spin centers, that, in the presence of an external magnetic field, can quantum simulate a variety of magnetic phases of matter as well as the transitions between these phases.
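The model named in the paper's title, a spin-1/2 XYZ chain in an external magnetic field, is conventionally written as the following Hamiltonian; this is the textbook form of the model, and the paper's exact couplings and conventions may differ:

```latex
H = \sum_{i} \left( J_x\, S^x_i S^x_{i+1} + J_y\, S^y_i S^y_{i+1} + J_z\, S^z_i S^z_{i+1} \right) + B \sum_{i} S^z_i
```

Tuning the anisotropic couplings \(J_x, J_y, J_z\) against the field \(B\) is what sweeps the chain through different magnetic phases and the transitions between them.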

"We are designing new devices that house the spin centers and can be used to simulate and learn about interesting physical phenomena that cannot be fully studied with classical computers," said Shan-Wen Tsai, a professor of physics and astronomy, who led the research team. "Spin centers in solid state materials are localized quantum objects with great untapped potential for the design of new quantum simulators."

According to Troy Losey, Tsai's graduate student and first author of the paper, advances with these devices could make it possible to study more efficient ways of storing and transferring information, while also developing methods needed to create room-temperature quantum computers.

"We have many ideas for how to make improvements to spin-center-based quantum simulators compared to this initial proposed device," he said. "Employing these new ideas and considering more complex arrangements of spin centers could help create quantum simulators that are easy to build and operate, while still being able to simulate novel and meaningful physics."

Below, Tsai and Losey answer a couple of questions about the research:

Q: What is a quantum simulator?

Tsai: It is a device that exploits the unusual behaviors of quantum mechanics to simulate interesting physics that is too difficult for a regular computer to calculate. Unlike quantum computers that operate with qubits and universal gate operations, quantum simulators are individually designed to simulate/solve specific problems. By trading off the universal programmability of quantum computers in favor of exploiting the richness of different quantum interactions and geometrical arrangements, quantum simulators may be easier to implement and provide new applications for quantum devices, which is relevant because quantum computers aren't yet universally useful.

A spin center is a roughly atom-sized quantum magnetic object that can be placed in a crystal. It can store quantum information, communicate with other spin centers, and be controlled with lasers.

Q: What are some applications of this work?

Losey: We can build the proposed quantum simulator to simulate exotic magnetic phases of matter and the phase transitions between them. These phase transitions are of great interest because at these transitions the behaviors of very different systems become identical, which implies that there are underlying physical phenomena connecting these different systems.

The techniques used to build this device can also be used for spin-center-based quantum computers, which are a leading candidate for the development of room temperature quantum computers, whereas most quantum computers require extremely cold temperatures to function. Furthermore, our device assumes that the spin centers are placed in a straight line, but it is possible to place the spin centers in up to 3-dimensional arrangements. This could allow for the study of spin-based information devices that are more efficient than methods that are currently used by computers.

As quantum simulators are easier to build and operate than quantum computers, we can currently use quantum simulators to solve certain problems that regular computers don't have the ability to address, while we wait for quantum computers to become more refined. However, this doesn't mean that quantum simulators can be built without challenge, as we are just now getting close to being good enough at manipulating spin centers, growing pure crystals, and working at low temperatures to build the quantum simulator that we propose.

The University of California, Riverside is a doctoral research university, a living laboratory for groundbreaking exploration of issues critical to Inland Southern California, the state and communities around the world. Reflecting California's diverse culture, UCR's enrollment is more than 26,000 students. The campus opened a medical school in 2013 and has reached the heart of the Coachella Valley by way of the UCR Palm Desert Center. The campus has an annual impact of more than $2.7 billion on the U.S. economy. To learn more, visit http://www.ucr.edu.

Journal: Physical Review B

Method of Research: Computational simulation/modeling

Subject of Research: Not applicable

Article Title: Quantum simulation of the spin-1/2 XYZ model using solid-state spin centers

Article Publication Date: 8-Jul-2024

COI Statement: Authors have no conflict of interest.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


New quantum computer smashes 'quantum supremacy' record by a factor of 100, and it consumes 30,000 times less power – Livescience.com

A new quantum computer has broken a world record in "quantum supremacy," topping the benchmark performance set by Google's Sycamore machine by 100-fold.

Using the new 56-qubit H2-1 computer, scientists at quantum computing company Quantinuum ran various experiments to benchmark the machine's performance levels and the quality of the qubits used. They published their results June 4 in a study uploaded to the preprint database arXiv. The study has not been peer-reviewed yet.

To demonstrate the potential of the quantum computer, the scientists at Quantinuum used a well-known algorithm to measure how noisy, or error-prone, qubits were.

Quantum computers can perform calculations in parallel thanks to the laws of quantum mechanics and entanglement between qubits, meaning the fates of different qubits can instantly change each other. Classical computers, by contrast, can work only in sequence.

Adding more qubits to a system also scales up the power of a machine exponentially; scientists predict that quantum computers will one day perform complex calculations in seconds that a classical supercomputer would have taken thousands of years to solve.
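Some rough arithmetic makes that scaling concrete; the memory estimate below is an illustration assuming a brute-force classical state-vector simulation with 16-byte complex amplitudes:

```latex
n \text{ qubits} \rightarrow 2^n \text{ amplitudes}; \quad 2^{56} \approx 7.2 \times 10^{16}; \quad 7.2 \times 10^{16} \times 16~\text{bytes} \approx 1.2 \times 10^{18}~\text{bytes} \approx 1~\text{exabyte}
```

By this measure, simulating even the 56-qubit H2-1 exactly would strain what a classical machine could store, let alone compute on.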

The point where quantum computers overtake classical ones is known as "quantum supremacy," but achieving this milestone in a practical way would need a quantum computer with millions of qubits. The largest machine today has only about 1,000 qubits.

Related: Quantum computing breakthrough could happen with just hundreds, not millions, of qubits using new error-correction system


The reason we would need so many qubits for "quantum supremacy" is that they are inherently prone to error, so many would be needed to correct those errors. That's why many researchers are now focusing on building more reliable qubits, rather than simply adding more qubits to machines.

The team tested the fidelity of H2-1's output using what's known as the linear cross entropy benchmark (XEB). XEB spits out results between 0 (none of the output is error-free) and 1 (completely error-free), Quantinuum representatives said in a statement.
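Concretely, the linear XEB fidelity is usually defined as below, where \(n\) is the number of qubits and \(P(x_i)\) is the ideal probability of each bitstring \(x_i\) actually sampled from the hardware; this is the standard definition from the quantum supremacy literature, and the assumption here is that Quantinuum's benchmark takes the same form:

```latex
\mathcal{F}_{\mathrm{XEB}} = 2^{n} \left\langle P(x_i) \right\rangle_i - 1
```

Uniformly random output gives \(\langle P(x_i)\rangle = 2^{-n}\) and hence a score of 0, while an ideal, noiseless device scores close to 1.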

Scientists at Google first tested the company's Sycamore quantum computer using XEB in 2019, demonstrating that it could complete a calculation in 200 seconds that would have taken the most powerful supercomputer at the time 10,000 years to finish. They registered an XEB result of approximately 0.002 with the 53 superconducting qubits built into Sycamore.

But in the new study, Quantinuum scientists, in partnership with JPMorgan, Caltech and Argonne National Laboratory, achieved an XEB score of approximately 0.35. This means the H2 quantum computer can produce error-free results 35% of the time.

"We are entirely focused on the path to universal fault tolerant quantum computers," Ilyas Khan, chief product officer at Quantinuum and founder of Cambridge Quantum Computing, said in the statement. "This objective has not changed, but what has changed in the past few months is clear evidence of the advances that have been made possible due to the work and the investment that has been made over many, many years."

Quantinuum previously collaborated with Microsoft to demonstrate "logical qubits" that had an error rate 800 times lower than physical qubits.

In the study, published in April, scientists demonstrated they could run experiments with the logical qubits with an error rate of just 1 in 100,000, which is far lower than the 1-in-100 error rate of physical qubits, Microsoft representatives said.

"These results show that whilst the full benefits of fault tolerant quantum computers have not changed in nature, they may be reachable earlier than was originally expected," added Khan.


The 3 Smartest Quantum Computing Stocks to Buy With $5K Right Now – InvestorPlace

Quantum computing stocks present a thrilling frontier for investors. The industry's promise to exponentially accelerate problem-solving capabilities across various sectors, such as cryptography, drug discovery, and optimization problems, is increasingly appealing. This transformative potential has positioned quantum computing stocks as pivotal investments for those looking to capitalize on next-generation technology.

As per McKinsey, public investments in quantum technology have grown significantly, increasing by over 50% since 2022, totaling approximately $42 billion globally. This increase is driven by countries like Germany, the UK, and South Korea, indicating a strong governmental interest in quantum technology.

For investors ready to engage with this high-potential market, the current moment offers a strategic entry point. In this context, here are three quantum computing stocks to buy for significant rewards in the future.


Rigetti Computing (NASDAQ:RGTI) stands out in the burgeoning field of quantum computing. Founded in 2013, Rigetti made significant strides in developing full-stack quantum computing services, offering them through its cloud-based platform, Forest.

The company reported generating over $3 million in revenue in Q1, marking 39% year-over-year (YOY) growth. This growth primarily stems from its technology development contracts and quantum processing unit (QPU) sales. Despite ongoing operating losses, Rigetti's financial health shows signs of improvement, with a narrowed net loss and a robust liquidity position of over $102 million in cash and short-term investments by the end of Q1.

Rigetti's dedication to innovation is evident in its recent launch of the Novera QPU partnership program. This initiative advances quantum computing by fostering collaboration and technological integration across various aspects of the quantum stack.

The company has already secured partnerships with notable firms such as Riverlane in the UK for error correction and Quantum Machines in Israel for control systems. These collaborations are expected to enhance Rigetti's market position and drive future QPU sales.


IonQ (NYSE:IONQ) is a pioneering force in the quantum computing industry. Despite being a relatively young player, IonQ has made remarkable strides in advancing quantum computing technology, especially with its unique approach using trapped ion technology.

IonQ has carved out a niche in the quantum computing sector with its trapped ion technology, which is considered to have several advantages over other quantum systems, such as superconducting qubits. This technology offers longer coherence times and potentially more scalable solutions. These features are pivotal for solving complex computational problems currently beyond the reach of classical supercomputers.

In 2023, IonQ announced significant advancements, including the development of next-generation quantum systems such as the IonQ Forte and the upcoming Tempo system. Moreover, the company has been proactive in forming strategic alliances across various sectors, including finance, automotive, and aerospace. Partnerships with notable companies like Airbus and Hyundai aim to explore quantum computing's utility in optimizing logistics and material sciences. These collaborations not only validate IonQ's technology but also enhance its credibility and market presence.


Honeywell (NASDAQ:HON) stands out as a pivotal player in the diversified industrial sector. The company has positioned itself as a significant player in the rapidly evolving field of quantum computing through its dedicated unit, Honeywell Quantum Solutions. This division focuses on developing advanced quantum computing technologies that utilize trapped ion technology.

In the Q1 2024 earnings call, the company exceeded its adjusted earnings-per-share (EPS) guidance and achieved healthy organic sales growth. Honeywell's Q1 revenue was $9.11 billion, a 2.72% YOY increase, with an EPS of $2.25, surpassing estimates by $0.07.

Honeywell's Quantum Solutions division has set records in quantum volume, indicating strong performance and scalability of its quantum systems. Strategic collaborations, such as with Microsoft on quantum computing experiments, demonstrate Honeywell's commitment to innovation and the commercialization of quantum technologies.

The company plans to monetize its quantum business around 2025, leveraging advancements and market readiness.

On the date of publication, Mohammed Saqib did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

On the date of publication, the responsible editor did not have (either directly or indirectly) any positions in the securities mentioned in this article.

Mohammed Saqib is a research analyst with experience in equity research and financial modeling. He has extensively covered stocks listed in the tech sector, using fundamental analysis as the cornerstone of his approach. Currently pursuing a master's degree in finance, Saqib is dedicated to obtaining the CFA charter to further augment his expertise in the field.


Why every quantum computer will need a powerful classical computer – Ars Technica

A single logical qubit is built from a large collection of hardware qubits.

One of the more striking things about quantum computing is that the field, despite not having proven itself especially useful, has already spawned a collection of startups that are focused on building something other than qubits. It might be easy to dismiss this as opportunism, an attempt to cash in on the hype surrounding quantum computing. But it can be useful to look at the things these startups are targeting, because they can be an indication of hard problems in quantum computing that haven't yet been solved by any one of the big companies involved in that space, companies like Amazon, Google, IBM, or Intel.

In the case of a UK-based company called Riverlane, the unsolved piece that is being addressed is the huge amount of classical computations that are going to be necessary to make the quantum hardware work. Specifically, it's targeting the huge amount of data processing that will be needed for a key part of quantum error correction: recognizing when an error has occurred.

All qubits are fragile, tending to lose their state during operations, or simply over time. No matter what the technology (cold atoms, superconducting transmons, whatever), these error rates put a hard limit on the amount of computation that can be done before an error is inevitable. That rules out doing almost any useful computation operating directly on existing hardware qubits.

The generally accepted solution to this is to work with what are called logical qubits. These involve linking multiple hardware qubits together and spreading the quantum information among them. Additional hardware qubits are linked in so that they can be measured to monitor errors affecting the data, allowing them to be corrected. It can take dozens of hardware qubits to make a single logical qubit, meaning even the largest existing systems can only support about 50 robust logical qubits.

Riverlane's founder and CEO, Steve Brierley, told Ars that error correction doesn't only stress the qubit hardware; it stresses the classical portion of the system as well. Each of the measurements of the qubits used for monitoring the system needs to be processed to detect and interpret any errors. We'll need roughly 100 logical qubits to do some of the simplest interesting calculations, meaning monitoring thousands of hardware qubits. Doing more sophisticated calculations may mean thousands of logical qubits.

That error-correction data (termed syndrome data in the field) needs to be read between each operation, which makes for a lot of data. "At scale, we're talking a hundred terabytes per second," said Brierley. "At a million physical qubits, we'll be processing about a hundred terabytes per second, which is Netflix's global streaming."

It also has to be processed in real time; otherwise, computations will get held up waiting for error correction to happen. To avoid that, errors must be detected in real time. For transmon-based qubits, syndrome data is generated roughly every microsecond, so real time means completing the processing of the data, possibly terabytes of it, at a frequency of around a megahertz. And Riverlane was founded to provide hardware that's capable of handling it.

The system the company has developed is described in a paper that it has posted on the arXiv. It's designed to handle syndrome data after other hardware has already converted the analog signals into digital form. This allows Riverlane's hardware to sit outside any low-temperature hardware that's needed for some forms of physical qubits.

That data is run through an algorithm the paper terms a "Collision Clustering decoder," which handles the error detection. To demonstrate its effectiveness, they implemented it on a typical field-programmable gate array (FPGA) from Xilinx, where it occupies only about 5 percent of the chip but can handle a logical qubit built from nearly 900 hardware qubits (simulated, in this case).
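To illustrate what decoding syndrome data involves, here is a deliberately minimal Python sketch for a three-qubit repetition code. It is a toy stand-in for the concept only, not Riverlane's Collision Clustering algorithm, whose actual workings are described in the arXiv paper:

```python
# Toy syndrome decoding for a distance-3 repetition code. A real decoder
# must perform the analogous inference for hundreds of qubits every
# microsecond, which is why Riverlane targets FPGAs and custom silicon.

def measure_syndrome(data_bits):
    """Parity checks between neighbouring data bits (1 = error signal)."""
    return [data_bits[i] ^ data_bits[i + 1] for i in range(len(data_bits) - 1)]

def decode(syndrome):
    """Map a syndrome to the single bit-flip it most likely signals."""
    # (1,0) -> flip bit 0, (1,1) -> flip bit 1, (0,1) -> flip bit 2.
    lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    return lookup[tuple(syndrome)]

encoded = [1, 1, 1]          # logical |1> spread across three physical bits
encoded[1] ^= 1              # inject a single bit-flip error
flip = decode(measure_syndrome(encoded))
if flip is not None:
    encoded[flip] ^= 1       # apply the correction
assert encoded == [1, 1, 1]  # logical state recovered
```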

The company also demonstrated a custom chip that handled an even larger logical qubit, while only occupying a tiny fraction of a square millimeter and consuming just 8 milliwatts of power.

Both of these versions are highly specialized; they simply feed the error information for other parts of the system to act on. So, it is a highly focused solution. But it's also quite flexible in that it works with various error-correction codes. Critically, it also integrates with systems designed to control a qubit based on very different physics, including cold atoms, trapped ions, and transmons.

"I think early on it was a bit of a puzzle," Brierley said. "You've got all these different types of physics; how are we going to do this?" It turned out not to be a major challenge. "One of our engineers was in Oxford working with the superconducting qubits, and in the afternoon he was working with the ion trap qubits. He came back to Cambridge and he was all excited. He was like, 'They're using the same control electronics.'" It turns out that, regardless of the physics involved in controlling the qubits, everybody had borrowed the same hardware from a different field (Brierley said it was a Xilinx radiofrequency system-on-a-chip built for 5G base station prototyping). That makes it relatively easy to integrate Riverlane's custom hardware with a variety of systems.


Kipu Quantum Acquires Quantum Computing Platform Built by Anaqor AG to Accelerate Development of Industrially Relevant Quantum Solutions – PR Newswire

KARLSRUHE, Germany, July 11, 2024 /PRNewswire/ -- Kipu Quantum, the worldwide leading quantum software company, announced today the strategic acquisition of PlanQK, the German quantum computing platform successfully built and commercialized by Anaqor AG. Along with its platform, key members of its team of experts joined Kipu Quantum to lead the way in making useful quantum computing accessible to organizations of all sizes, including industrial, academic, and governmental ones. This shrewd move follows Kipu Quantum's successful €11.4 million second closing of the seed funding round led by HV Capital and DTCF in late 2023.

The well-established and industry-recognized PlanQK platform, with a constantly growing quantum ecosystem of more than 100 organizations, enhances accessibility to quantum computing across various sectors, serving a broad range of users from leading companies such as BASF, DB Systel GmbH, T-LABS and TRUMPF.

The acquisition will drastically accelerate the commercialization of Kipu's application- and hardware-specific algorithms as services through the PlanQK platform, enabling frictionless access for organizations to integrate quantum solutions into their existing processes.

Combining their strengths, Kipu Quantum and the PlanQK team are set to greatly enhance quantum computing accessibility across various industries, including pharmaceuticals, chemicals, logistics, and finance, accelerating these sectors towards achieving quantum advantage.

Strategic Integration for Accelerated Growth

Kipu Quantum's CEO, Daniel Volz, stated, "Kipu Quantum's massively compressed algorithms enable the use of today's quantum processors across multiple industries to solve industrially sized problems, without waiting a decade for massive quantum computers. Our work with customers such as BASF, DLR, and MasOrange has demonstrated that this is feasible. Making our world-leading algorithm services accessible through the PlanQK platform will make our capabilities available to a much wider audience."

Michael Falkenthal, lead architect of the PlanQK platform, stated, "By merging our deep technical expertise and strengths with Kipu Quantum, we aim to deliver unparalleled value and accelerate access to quantum computing capabilities sooner than anticipated. PlanQK will remain an open, community-driven platform, enhanced by our partnerships and research initiatives to support all quantum innovators. Together, we will facilitate seamless integration of quantum solutions into existing organizational processes, significantly broadening the reach and application of advanced quantum technologies."

Enrique Solano, co-founder and Chief Visionary Officer of Kipu Quantum, emphasized, "We are delighted to welcome the elite team of PlanQK, bolstering our capabilities. We share the mission to achieve a first demonstration of quantum advantage. We appreciate the unique qualities and talents our new colleagues bring to achieve this mission by 2026. Together, we are here to make history and make useful quantum computing as soon as possible."

Era of Useful Quantum Computing

Alexandra Beckstein, CEO and co-founder of QAI Ventures, from whose Swiss-based accelerator program Kipu Quantum graduated in 2023, commented, "We highly appreciate the strategic move in acquiring the PlanQK platform. This confirms Kipu's commitment to the era of useful quantum computing and positions the company as a frontrunner. Their innovative methods will enhance the ability to deliver high-impact quantum solutions and drive significant advancements in complex real-life computational processes. We will continue to support their journey towards business-relevant quantum computing."

About Kipu

Kipu Quantum is a German company that operates at the intersection of quantum computer hardware and application software layers, developing disruptive application- and hardware-specific quantum algorithms for a wide range of industries. These algorithms are based on a one-of-a-kind compression technology that requires orders of magnitude fewer quantum processor resources to solve a given problem than comparable approaches. Kipu Quantum's technology has the potential to solve industry-relevant problems with on the order of 1,000 physical qubits and is compatible with any leading quantum hardware. The company is currently testing its technology with customers in the pharmaceutical, chemical, logistics and financial industries.

About PlanQK

PlanQK is the first open, community-powered platform and ecosystem for quantum applications, connecting developers, industrial users, researchers, and quantum hardware providers with a platform for the integration, deployment, development, and monetization of quantum services. With over 30 successfully tested use cases and more than 100 partners, PlanQK is a pioneer in the field of quantum platforms. Initiated in 2019 and supported through a research grant from the German Federal Ministry for Economic Affairs and Climate Action, it has been continuously developed in collaboration with leading universities and companies.

Press contact: Joanna Folberth [emailprotected] +49 1523 4621 156

http://www.kipu-quantum.com

http://www.planqk.de


SOURCE Kipu Quantum GmbH


Quantinuum and STFC Hartree Centre Partner to Advance Quantum Research in the UK – HPCwire

LONDON and BROOMFIELD, Colo., July 11, 2024 – Quantinuum has signed a Joint Statement of Endeavor with the STFC Hartree Centre, one of Europe's largest supercomputing centers dedicated to industry engagement. The partnership will provide UK industrial and scientific users access to Quantinuum's H-Series, the world's highest-performing trapped-ion quantum computers, via the cloud and on-premise.

"Research and scientific discovery are central to our culture at Quantinuum, and we are proud to support the pioneers at the Hartree Centre," said Raj Hazra, CEO of Quantinuum. "As we accelerate quantum computing, the Hartree Centre and the UK quantum ecosystem will be at the forefront of building solutions powered by quantum computers at scale."

Both organizations aim to support UK businesses and research organizations in exploring quantum advantage in quantum chemistry, computational biology, quantum artificial intelligence and quantum-augmented cybersecurity. The UK has a strong global reputation in each domain, and quantum computing is expected to accelerate development in the coming years.

"Quantinuum's H-Series hardware will benefit scientists across various areas of research, including exascale computing algorithms, fusion energy development, climate resilience and more," said Kate Royse, Director of the STFC Hartree Centre. "This partnership also furthers our five-year plan to unlock the high growth potential of advanced digital technologies for UK industry."

The Hartree Centre is part of the Science and Technology Facilities Council (STFC) within UK Research and Innovation, building on a wealth of established scientific heritage and a network of international expertise. The centre's experts collaborate with industry and the research community to explore the latest technologies, upskill teams and apply practical digital solutions across supercomputing, data science and AI.

Quantinuum's H-Series quantum computers are the highest-performing in the world, having consistently held the world record for quantum volume, a widely used benchmark for quantum computing performance, for over three years; the record currently stands at 2^20.
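For context on the benchmark (this gloss follows the common definition of quantum volume rather than wording from either organization): a quantum volume of \(2^n\) indicates the machine can reliably run random "square" circuits on \(n\) qubits with depth \(n\), so the current record corresponds to:

```latex
\mathrm{QV} = 2^{20} = 1{,}048{,}576 \quad \text{i.e., passing 20-qubit, depth-20 circuits}
```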

In April 2024, Quantinuum and Microsoft reported a breakthrough demonstration of four reliable logical qubits using quantum error correction, an important technology necessary for practical quantum computing. During the same month, Quantinuum extended its industry leadership when its H-Series computer became the first to achieve "three 9s" (99.9%) two-qubit gate fidelity across all qubit pairs in a production device, a critical milestone that enables fault-tolerant quantum computing.

This achievement was immediately available to Quantinuum customers, who depend on using the very best quantum hardware and software, enabling them to push the boundaries on new solutions in areas such as materials development, drug discovery, machine learning, cybersecurity, and financial services.

Quantinuum, formerly known as Cambridge Quantum prior to its 2021 combination with Honeywell Quantum Solutions, was one of the UK government's delivery partners following the 2014 launch of the National Quantum Technologies Programme. Cambridge Quantum ran the Quantum Readiness Programme for several years to inspire UK business and industry to invest in quantum computing and explore the potential use cases of this revolutionary technology.

Earlier this year, Quantinuum was selected as a winner in the £15m SBRI Quantum Catalyst Fund, to support the UK Government in delivering the benefits of quantum technologies, with an initial focus on simulating actinide chemistry using quantum computers.

About Quantinuum

Quantinuum, the world's largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum's technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.

About The STFC Hartree Centre

The Hartree Centre helps UK businesses and organizations of any size to explore and adopt supercomputing, data analytics and artificial intelligence (AI) technologies for enhanced productivity, smarter innovation and economic growth. Backed by significant UK government funding and strategic partnerships with industry leaders such as the University of Liverpool, the Hartree Centre is home to some of the most advanced digital technologies and experts in the UK. In 2021, the Hartree National Centre for Digital Innovation (HNCDI) program was established to provide a safe and supportive environment for UK businesses and public sector organizations to acquire the skills needed to adopt AI, develop proofs-of-concept and de-risk investment into emerging digital technologies such as quantum computing. The Hartree Centre is part of the Science and Technology Facilities Council (STFC).

Source: Quantinuum
