
Finland develops quantum algorithms for the future – ComputerWeekly.com

Many experts believe that once quantum computers are big enough and reliable enough to solve useful problems, the most common deployment architecture will be to have them serve as accelerators for supercomputers. Several algorithms have already been developed to run on this architecture, including the most famous, Shor's algorithm, which will one day crack many of today's public-key cryptosystems.

To experiment with this arrangement in Finland, VTT Technical Research Centre of Finland cooperated with CSC, which runs LUMI, Europe's fastest supercomputer. VTT and CSC have been connecting progressively larger quantum computers to LUMI to let users experiment with algorithms that make the best use of both types of computers. The classical computer stores most of the data and executes most of the steps in the algorithm, handing off other steps to the quantum computer and then reading the results.
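
The hand-off described above can be sketched as a classical optimisation loop that treats the quantum computer as a black-box evaluator. Everything below is a hypothetical illustration: `quantum_evaluate` stands in for a real QPU call (which in practice would go through a cloud service), and is faked here with a classical function.

```python
import math

def quantum_evaluate(params):
    """Stand-in for running a parameterised circuit on a QPU and
    reading out an expectation value; here it is faked classically."""
    return sum(math.sin(p) for p in params)

def hybrid_minimise(params, steps=200, lr=0.1, eps=1e-4):
    """Classical outer loop: estimate gradients by finite differences,
    with every evaluation delegated to the (mock) quantum backend."""
    for _ in range(steps):
        base = quantum_evaluate(params)
        grads = []
        for i in range(len(params)):
            shifted = list(params)
            shifted[i] += eps
            grads.append((quantum_evaluate(shifted) - base) / eps)
        params = [p - lr * g for p, g in zip(params, grads)]
    return params

best = hybrid_minimise([0.3, 1.2])
print([round(p, 2) for p in best])  # both parameters settle near -1.57
```

The classical side owns the data and the optimisation state; the quantum side is queried only at the points where (in a real deployment) it would outperform classical evaluation.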

VTT's first quantum computer was completed in 2021 and had 5 qubits. It was upgraded to 20 qubits last year, and VTT is aiming for 50 qubits by the end of 2024. Universities and research organisations have already started using the hybrid service to solve trivial problems.

"This is sort of a rehearsal for the future," says Ville Kotovirta, leader of the VTT team that develops quantum algorithms and software. "It allows people who develop supercomputing algorithms to start thinking about what they will be able to do when a bigger quantum computer works alongside a classical computer. They can use the existing service to practice writing algorithms."

"We were the first to set up the hybrid configuration in Europe, but others are following," says Kotovirta. In June 2023, the EuroHPC Joint Undertaking accepted six other projects to build similar architectures. These will be in Czechia, France, Germany, Italy, Poland and Spain.

Kotovirta is responsible for research into quantum algorithms and for developing the software that provides access to VTT's quantum computers. Part of his job is to hire new talent, which he says is not easy. Some of the graduating students already in Finland are interested in quantum computing, but they don't have real-world experience.

The people who have work experience in Finland already have a job somewhere else in the ecosystem, and people from outside Finland are hesitant to move to a cold climate. Having said that, some outsiders are impressed enough with the Finnish ecosystem to overcome any concerns they may have about the weather.

"We're all learning because it's a new field and it's changing all the time," says Kotovirta. "There are new inventions, new platforms, new devices, new algorithms and new ways of programming. To keep up, we try to hire mathematicians, physicists and computer scientists."

Kotovirta's team is developing several types of algorithms for hybrid architectures. One targets a class of optimisation problems, called quadratic unconstrained binary optimisation (QUBO), which can be solved using quantum annealing or quantum approximate optimisation algorithms (QAOAs).
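
As a rough illustration of what a QUBO instance looks like, here is a tiny made-up problem solved by brute force; in a hybrid workflow, the same matrix would instead be handed to a quantum annealer or a QAOA circuit. The matrix `Q` is invented purely for the example.

```python
# A QUBO asks for the binary vector x minimizing x^T Q x.
from itertools import product

Q = [
    [-1,  2,  0],
    [ 0, -1,  2],
    [ 0,  0, -1],
]

def qubo_energy(x, Q):
    """Energy x^T Q x for a binary vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Enumerate all 2^n assignments and return the minimizer."""
    n = len(Q)
    return min(product((0, 1), repeat=n), key=lambda x: qubo_energy(x, Q))

best = brute_force_qubo(Q)
print(best, qubo_energy(best, Q))  # (1, 0, 1) -2
```

Brute force scales as 2^n, which is exactly why quantum annealing and QAOA are being explored for larger instances of this problem class.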

"We have built quantum algorithms for analysing graph data and identifying the community structure of networks," he says. The data comes from complex networks, such as technological or biological systems.

The team is also developing algorithms for quantum chemistry, with a focus on reducing the complexity of a molecular Hamiltonian to improve simulations of molecules. Similarly, they are working on synthetic biology, generating new proteins with certain desirable features.

Another area of focus is quantum machine learning, especially quantum generative machine learning: models that learn from existing data to produce novel samples.

"Most people have heard of generative AI in the context of fake AI, where it is used to create images, text and sound," says Kotovirta. "These same techniques can be applied to science, learning from something that already exists to create something new. We are finding ways of improving these techniques with quantum computing to generate new proteins."

The most difficult part is proving that quantum machines have benefits over their classical counterparts, says Kotovirta. "Current quantum computers are real computers, and they can do real calculations and solve real problems."

"In that sense, they are already doing useful things, but the problem is that the sizes are very small, because the current systems are inefficient in comparison to their classical counterparts. In order for quantum computers to solve something more efficiently than classical computers, fidelities need to improve."

Fidelity refers to the success rate of a single operation on a quantum computer. The higher the fidelities, the better the success rate of the overall computation. We are currently in an era of noisy intermediate-scale quantum (NISQ) devices, which have already shown that quantum devices can simulate things classical computers struggle to simulate. However, so far, the results are so trivial that they don't solve real problems.
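
The compounding effect of per-operation fidelity can be seen with simple arithmetic: if each gate succeeds with probability f, a circuit of n gates succeeds with probability roughly f^n. The numbers below are illustrative, not measured values from any real device.

```python
# Back-of-the-envelope sketch: error-free execution probability of a
# circuit, assuming independent gate errors (a simplification).
def circuit_success(gate_fidelity: float, num_gates: int) -> float:
    return gate_fidelity ** num_gates

# 99% looks good per gate, but decays fast over 1,000 gates...
print(f"{circuit_success(0.99, 1000):.6f}")    # 0.000043
# ...while 99.99% keeps most of the signal.
print(f"{circuit_success(0.9999, 1000):.4f}")  # 0.9048
```

This is why small improvements in fidelity translate into large improvements in usable circuit depth.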

"As fidelities improve, we're approaching the era of utility-scale quantum computing," says Kotovirta. "This will happen when the fidelities are good enough to run certain algorithms that are tailored for the topology of the device you're using. That will give us results that classical computers cannot replicate, but only for very specific use cases."

These algorithms could be used to simulate quantum systems related to materials science or chemistry, for example. Although you can't claim general quantum advantage, for those specific use cases you can demonstrate advantage. Finland's strategy is to make a difference on the world stage through the use cases in which quantum advantage can be achieved in the not-so-distant future.

It isn't easy for smaller countries to compete with larger economies. However, it is possible for them to find a niche that allows them to actively contribute on a global scale in one or more specific areas. "In that regard, we are doing the right things," he says. "Hopefully, we will continue to do so in the future."


Advancing Spin Qubit Technology: Cryogenic Probing – AZoQuantum

In an article recently published in the journal Nature, researchers developed a 300-mm cryogenic probing process to obtain high-volume data on spin qubit devices across full wafers. They also optimised an industry-compatible process for fabricating spin qubit devices on a low-disorder host material, enabling automated probing of single electrons in spin qubit arrays across 300-mm wafers.

Substantial numbers of physical qubits are required to build a fault-tolerant quantum computer. Recently, silicon quantum dot spin qubits (qubits based on electrons confined in silicon) have displayed single-qubit and two-qubit fidelities above 99%, satisfying error-correction thresholds.

Integrated spin qubit arrays have currently attained sizes of up to six quantum dots, and bigger quantum dot platforms in one-dimensional (1D) and two-dimensional (2D) configurations have also been demonstrated. However, the number of physical qubits must increase significantly to realize real-world applications of spin qubit technology. Thus, spin qubit devices must be fabricated with uniformity, volume, and density comparable to classical computing chips, which currently contain billions of transistors.

Fabricating spin qubit devices using a similar infrastructure as classical computing chips can facilitate the development of fault-tolerant quantum computers using the spin qubit technology and unlock the spin qubits' potential for scaling.

This is because spin qubit technology possesses inherent advantages for scaling due to the approximate qubit size of 100 nm and built-in compatibility with modern complementary metal-oxide-semiconductor (CMOS) manufacturing infrastructure, specifically in the case of silicon-based devices.

Currently, yield and process variation are the major challenges for spin qubits. Additionally, the cryogenic electrical testing bottleneck hinders the scaling of solid-state quantum technologies such as superconducting, topological, and spin qubits. Thus, the scale of cryogenic device testing must keep pace with the increasing fabrication complexity to ensure efficient device screening, with process changes guided by statistical measurements of indicators such as component yield and voltage variation.

In this study, researchers proposed a testing process using a cryogenic 300-mm wafer prober to obtain high-volume data on the performance of hundreds of industry-manufactured spin qubit devices at 1.6 K across full wafers. Additionally, they combined low process variation with a low-disorder host material to optimize an industry-compatible process for spin qubit device fabrication on silicon/silicon-germanium (Si/SiGe) heterostructures.

These proposed advancements were synergistic as the development of the full-wafer cryogenic test capability can enable the complex 300-mm fabrication process optimization and the optimized fabrication process can improve the reliability of the devices, enabling high-fidelity automated measurements across wafers.

Collectively, these advancements culminated in automated single-electron probing in spin qubit arrays across 300-mm wafers. In this work, the spin qubit devices were synthesized in Intel's D1 factory, where the CMOS logic processes are developed. A Si/SiGe heterostructure grown on 300-millimeter silicon wafers was used as the host material.

Researchers selected this structure to exploit the prolonged electron spin coherence in silicon and its applicability for multiple qubit encodings. All patterning was performed using optical lithography. Extreme ultraviolet lithography was employed for quantum dot gate patterning in a single pass.

Additionally, all device sub-components were fabricated using fundamental industry techniques such as chemical-mechanical polishing, etching, and deposition. The cryogenic wafer prober ("cryo-prober") used in this work, manufactured by AEM Afore and Bluefors, can load 300-millimeter wafers and cool them to a 1.0 K base temperature at the chuck and a 1.6 ± 0.2 K electron temperature. Thousands of test structures and spin qubit arrays on the wafer were measured after cooldown.

Low process variation and high yield were successfully achieved across the 300-mm wafer using the proposed approach. The proposed cryogenic testing method provided fast feedback to enable the CMOS-compatible fabrication process's optimization, resulting in low process variation and high yield.

Using this proposed system, measurements of the spin qubits' operating point were successfully automated and the transitions of single electrons were thoroughly investigated across full wafers. Results obtained by analyzing the random variation in single-electron operating voltages demonstrated that the optimized fabrication process results in low levels of disorder at the 300-mm scale.

The high device yield combined with the cryogenic wafer prober enabled a simple path from device fabrication to the investigation of spin qubits, eliminating failures due to electrostatics or yield at the dilution refrigerator stage. Overall, an extensible and large unit cell of up to 12 qubits was realized using a high-volume cryogenic testing method, an all-CMOS-industry-compatible fabrication process with low process variation, and a low-disorder Si/SiGe host material.

To summarize, the findings of this study established a new standard for the reliability and scale of spin qubit devices and paved the way for more complex and much larger spin qubit arrays of the future.

Neyens, S., et al. (2024). Probing single electrons across 300-mm spin qubit wafers. Nature, 629(8010), 80-85. https://doi.org/10.1038/s41586-024-07275-6, https://www.nature.com/articles/s41586-024-07275-6



Quantum computing may be the future of technology: what does that really mean? – cuindependent

Physics Professor David Wineland in his lab at the National Institute of Standards and Technology (NIST) in Boulder. Professor Wineland was awarded the 2012 Nobel Prize for his ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems. (Courtesy of The University of Colorado)

Envision a computer that runs faster and more securely than ever before. A new age of technology is coming, and it's called quantum computing.

Global investors are pouring billions of dollars into quantum research and technologies. Many research universities are pursuing quantum computing development. The University of Colorado Boulder is among the schools researching quantum, through its CUbit Quantum Initiative.

"It's an exciting field," said Scott Sternberg, the executive director of the CUbit Quantum Initiative at CU Boulder. "There's not [only] just a lot of money that the United States is putting forward to this."

Quantum in a nutshell

Quantum mechanics refers to the laws of physics that were discovered in the 1920s, said Karl Mayer, who received his Ph.D. in quantum engineering from CU Boulder.

"These are the laws of physics that apply to really small things, like electrons, atoms and things like that. The laws of physics at that small scale behave differently than what physicists call Newtonian physics," Mayer said.

Newton's laws of motion describe how an object responds when forces act upon it. While Newton's laws work extremely well at everyday scales, they break down for very small bodies, such as electrons, and for objects moving close to the speed of light. That's where quantum mechanics comes into play.

"Quantum physics starts to present itself in a very different mathematical way," Sternberg said. "It becomes a window into some of the fundamental forces of the universe."

CUbit Quantum Initiative

Under the quantum umbrella, CUbit is working on three main focus areas: quantum sensing and measurement; quantum networks and communications; and quantum computing and simulation.

CUbit aims to advance science and build a foundation for quantum technology. With Colorado being a hub for quantum research, the program has an advantage in partnering with local companies and startups.

Quantum sensing and measurement and qubits

Quantum sensing and measurement allow technological devices to be more efficient, accurate and productive. GPS systems, for example, rely on quantum physics to provide mapping tools.

According to Sternberg, sensors based on quantum techniques help form atomic clocks, very precise timekeepers, in addition to mapping tools. In 2017, CU Boulder scientists created a new design for an atomic clock that outperformed previous attempts.

Part of what allows for devices to be more efficient and productive is the use of bits, an essential part of both classical and quantum computing.

Bits, the smallest unit of information in a computer, have two states: 0 and 1, which correspond to off and on. Using combinations of bits, engineers transform simple electrical signals into complex logic gates. A traditional bit holds one definite value at a time, whereas a qubit can exist in a superposition, a weighted combination of 0 and 1. This extra expressive power is part of what makes qubits more powerful than traditional bits.
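
The "one and zero at the same time" idea can be made concrete with a small sketch: a qubit can be written as a pair of amplitudes whose squared magnitudes give the measurement probabilities. This is a minimal classical illustration of the arithmetic, not a quantum simulation library.

```python
import math

def qubit(theta: float):
    """Qubit state cos(theta)|0> + sin(theta)|1> (real amplitudes only)."""
    return (math.cos(theta), math.sin(theta))

def probabilities(state):
    """Measurement probabilities: |a|^2 for outcome 0, |b|^2 for outcome 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

# A classical bit is the special case theta = 0 (or pi/2).
print(probabilities(qubit(0.0)))   # (1.0, 0.0) -> definitely 0
# An equal superposition sits halfway in between.
p0, p1 = probabilities(qubit(math.pi / 4))
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Varying `theta` continuously sweeps through "everything in between" the two classical states, which is the property Sternberg describes below.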

"Instead of a traditional computer that encodes into one or zero, you have this bit, this qubit, which in this particular state operates very differently than a traditional transistor one/zero would," Sternberg said. "If you think about a one or a zero, it [a qubit] could be a one and a zero at the same time and everything in between."

Quantum computing and simulation

Quantum physics deals with electrons and other minuscule moving objects. According to Mayer, electrons do not have exact positions, but they have wave functions. This is important for prediction-making.

A wave function is an equation used to describe the behavior of quantum particles, mainly electrons. It is useful for finding the probability that a particle will end up in a specific place. While probabilities can be computed, the outcome of any single measurement cannot be predicted exactly.

Sternberg used a deck of cards as an example. When drawing from a deck of cards, there is a one in 52 chance of getting a certain number or face and a certain suit. In this scenario, a wave function would describe the probability of which face and suit will appear.

Quantum computing and simulation use qubits to speed up computing processes. When computers perform at such a high level, they can simulate materials or chemistry processes. This is where probability and wave functions come into play.

Using electrons and the principles of quantum physics, researchers have found that such probabilities can be estimated better through experiments run on quantum computers. For example, quantum computers can predict the likelihood of certain events, such as the sun rising tomorrow.

Quantum networks and communications

According to Mayer, quantum computing also encompasses cryptography.

Cryptography is the process that protects digital information. Encryption and decryption are its two essential parts. Encryption turns data from plaintext, the readable form of a message, into ciphertext; decryption transforms ciphertext back into plaintext.

As a way to block third-party viewers from reading messages, they are encrypted and turned into ciphertext, which is not easily readable. When someone sends a text message, it is encrypted and when the receiver opens the message, it is decrypted.

Traditional cryptography uses math when encrypting messages whereas quantum cryptography uses physics. It sends subatomic particles known as photons through a filter and to a receiver. When an eavesdropper tries to read the photons, the state of the photon changes. This makes quantum cryptography more secure than traditional cryptography.
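
The eavesdropper-detection idea described above can be illustrated with a toy classical simulation in the spirit of the BB84 protocol: an eavesdropper who measures in the wrong basis disturbs the photon, which shows up as errors when sender and receiver compare part of their sifted key. The protocol details here are deliberately simplified for intuition.

```python
import random

def sifted_error_rate(n_photons, eavesdrop, rng):
    """Fraction of errors among rounds where sender/receiver bases match."""
    errors = kept = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)        # bit encoded on the photon
        basis = rng.randint(0, 1)      # sender's random basis
        # An eavesdropper measuring in the wrong basis disturbs the state.
        disturbed = eavesdrop and rng.randint(0, 1) != basis
        if rng.randint(0, 1) != basis: # receiver guessed the wrong basis
            continue                   # round is discarded during sifting
        kept += 1
        measured = rng.randint(0, 1) if disturbed else bit
        errors += measured != bit
    return errors / kept

rng = random.Random(42)
print(sifted_error_rate(2000, eavesdrop=False, rng=rng))           # 0.0
print(round(sifted_error_rate(2000, eavesdrop=True, rng=rng), 2))  # ~0.25
```

A clean channel shows no errors, while an intercepted one shows roughly 25% errors in the sifted key, which is how the tampering is detected.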

"The context of quantum networks and communication is being able to create a secure connection that is so unbreakable," Sternberg said.

Problems with quantum

Experts are hopeful about the future of quantum, but the research is still developing. Quantum computers still exhibit an array of operational errors, and researchers don't yet know the most effective way to build them.

"The hope with quantum computers is that you would kind of do this experimental research of building the material and studying it," Mayer said. "Quantum computers enable you to do that simulation efficiently where classical computers do not."

Mayer works at Quantinuum, a company that develops technology and conducts research in quantum computing. One of their products, known as Quantum Origin, is designed to use quantum cryptography in a way that better secures data.

Part of Mayers research at Quantinuum involves identifying errors in the system.

"One problem present with current quantum computers is that they are very noisy, or very prone to errors," Mayer said, "and there's a whole zoo of possible errors that can occur to quantum bits. And what I do is I design experiments to get run on our system that measure things like what types of errors are occurring and with what rates."

Additionally, Sternberg said that quantum computers are hard to build due to the advanced cooling technology the process requires.

"It's an engineering problem to be able to manipulate these and scale them to the point where quantum computers could now grow," Sternberg said.

While quantum computers have the potential to improve security, they can also pose a threat.

Sternberg said that some members of the quantum community fear that advanced quantum computers in other nations could access private information in the United States.

"And so there's sort of this half security side of it and half benefit for overall mankind," Sternberg said.

CU Boulder and the future of quantum

CU Boulder has a particular advantage in quantum research as a university based in Colorado.

"We have industries that are quantum-like, meaning that we have aerospace, we have biosciences, we have IT as part of our major industry sectors here in the state," Sternberg said. "We have those skill sets that are adjacent to the quantum space in other industries that we have here in the state."

Although quantum computing is not yet the industry standard, technology is heading in that direction. Companies around the globe are working to progress humanity toward an era of quantum technology.

However, as quantum technology develops, potential users shouldn't expect to see any quantum laptops or computers in their local tech stores anytime soon.

"We're at this kind of frontier, this beginning of engineering quantum computers," Sternberg said. "I don't know if we'll see them on a laptop, but we'll certainly be able to use a laptop to access a quantum computer for a computation."

Contact Guest Writer Hope Munoz at Hope.Munoz-Stanley@colorado.edu.


BBVA runs successful trial of distributed quantum simulation in the cloud – Finextra

BBVA has completed a successful trial of the execution of quantum algorithms across multiple conventional servers in the AWS cloud.

Javier Recuenco Andrés, head of the Technical Architecture Innovation area at BBVA CIB and in charge of the pilot project, says of the project: "With these trials, we have shown that at BBVA we can have a proprietary architecture for executing quantum algorithms, which would help further our exploration of their use in complex financial tasks."

To run its test, BBVA worked with the Quantum Computing team of the digital transformation company VASS and AWS, using Qiskit software to distribute the execution of quantum algorithms across multiple classical compute servers located in the AWS cloud, and created a platform to automate and streamline the distribution process.

During the tests, BBVA was able to run quantum algorithms scaling up to a total computing power of 38 qubits, a scale that is difficult to reach with the use of a single classical computer. The higher the number of qubits, the more complex the problems the system can tackle.
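
A quick back-of-the-envelope calculation shows why 38 qubits is hard to reach on a single machine: a full statevector simulation stores 2^n complex amplitudes, typically 16 bytes each (double-precision complex), so memory doubles with every added qubit.

```python
# Memory needed to hold a full n-qubit statevector.
def statevector_bytes(n_qubits: int, bytes_per_amp: int = 16) -> int:
    """2**n complex amplitudes at 16 bytes each (complex128)."""
    return (2 ** n_qubits) * bytes_per_amp

# 30 qubits ~ 16 GiB; 38 qubits ~ 4 TiB; 50 qubits ~ 16 PiB.
for n in (30, 38, 50):
    print(n, statevector_bytes(n) / 2**30, "GiB")
```

At 38 qubits the statevector alone needs about 4 TiB, far beyond a single commodity server, which is why the simulation was distributed across multiple machines in the cloud.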

Alongside its own in-house trials, BBVA is also a founding member of the Quantum Safe Financial Forum (QSFF), a safe space for collaboration between European and US financial firms promoted by Europol's European Cybercrime Centre (EC3). The alliance aims to foster the creation of new technological systems within the financial industry that are safe, secure and resilient to malicious attacks that rely on quantum computing.

"At BBVA we explore the potential of quantum computing for two main reasons: to try to find better solutions to business problems, and to strengthen the security of our communications and data to counteract the malicious use of quantum computing by third parties," explains Escolástico Sánchez, leader of the Quantum discipline at BBVA. "The distributed quantum simulation pilot we have successfully completed is a further step in this exploration, which could enable different business units of the bank to leverage this technology."


Unveiling the Potential of Hole Spin Qubits in Quantum Computing – AZoQuantum

Scientists from the University of Basel and the NCCR SPIN have achieved the first controllable interaction between two hole spin qubits in a conventional silicon transistor. The discovery makes it possible to use established manufacturing techniques to combine millions of these qubits on a single chip. The research was published in the journal Nature Physics.

The race to build a practical quantum computer is well underway, with researchers around the world working on a huge variety of qubit technologies. Up until now, there has been no consensus on what type of qubit is most suitable for maximizing the potential of quantum information science.

A quantum computer's qubits, which manage data processing, transport, and storage, are its fundamental components. They need to process information quickly and store it accurately to function properly. Stable and quick interactions among several qubits whose states are reliably controllable externally constitute the foundation for fast information processing.

Millions of qubits need to fit on a single chip for a quantum computer to be useful. With only a few hundred qubits, the most sophisticated quantum computers available today are limited to performing tasks that can already be completed (and frequently done more quickly) by conventional computers.

Researchers at the University of Basel and the NCCR SPIN are addressing the challenge of arranging and linking thousands of qubits by utilizing a type of qubit that exploits the spin (intrinsic angular momentum) of either an electron or a hole.

A hole is essentially a missing electron in a semiconductor. Both holes and electrons possess spin, which can adopt one of two states: up or down, analogous to 0 and 1 in classical bits. An advantage of a hole spin over an electron spin is that it can be controlled entirely through electrical means, eliminating the need for additional components such as micromagnets on the chip.

As early as 2022, physicists from Basel demonstrated that hole spins could be trapped and utilized as qubits in existing electronic devices. These devices, known as "FinFETs" (fin field-effect transistors), are integral components of modern smartphones and are manufactured through widespread industrial processes.

Recently, a team led by Dr. Andreas Kuhlmann achieved a breakthrough by successfully facilitating a controllable interaction between two qubits within this setup for the first time.

Quantum computers require "quantum gates" to perform calculations; these gates are operations that manipulate qubits and link them together. As detailed in the journal Nature Physics, researchers have successfully coupled two qubits and achieved a controlled flip of one qubit's spin based on the state of the other's spin, a process referred to as a controlled spin-flip.

Hole spins allow us to create two-qubit gates that are both fast and high-fidelity. This principle now also makes it possible to couple a larger number of qubit pairs.

Dr. Andreas Kuhlmann, Department of Physics, University of Basel

The exchange interaction between two indistinguishable particles that interact electrostatically provides the basis for the coupling of two spin qubits.

Surprisingly, the exchange energy of holes is not only electrically controllable but strongly anisotropic. This is due to the spin-orbit coupling, meaning that the spin state of a hole is influenced by its motion through space.

Experimental and theoretical physicists from the NCCR SPIN and the University of Basel joined forces to describe this observation in a model.

The anisotropy makes two-qubit gates possible without the usual trade-off between speed and fidelity. Qubits based on hole spins not only leverage the tried-and-tested fabrication of silicon chips, they are also highly scalable and have proven to be fast and robust in experiments.

Dr. Andreas Kuhlmann, Department of Physics, University of Basel

The study emphasizes how promising this strategy is for creating a large-scale quantum computer.

Geyer, S., et al. (2024) Anisotropic exchange interaction of two hole-spin qubits. Nature Physics. doi.org/10.1038/s41567-024-02481-5.

Source: https://www.unibas.ch/en.htm


BBVA runs successful trial of distributed quantum simulation in the cloud – BBVA

To run its test, BBVA worked with the Quantum Computing team of the digital transformation company VASS and AWS, using Qiskit software to distribute the execution of quantum algorithms across multiple classical compute servers located in the AWS cloud, and created a platform to automate and streamline the distribution process.

With this distributed quantum simulation, one of the first of its kind in the financial sector, BBVA was able to run quantum algorithms scaling up to a total computing power of 38 qubits, a scale that is difficult to reach with the use of a single classical computer. The higher the number of qubits, the more complex the problems the system can tackle.

BBVA is a founding member of the Quantum Safe Financial Forum (QSFF), a safe space for collaboration between European and US financial firms promoted by Europol's European Cybercrime Centre (EC3). The alliance aims to foster the creation of new technological systems within the financial industry that are safe, secure and resilient to malicious attacks that rely on quantum computing.

The trial also served to demonstrate that classical computers can be used to test quantum algorithms at scale and in an ideal computing environment. Quantum computing is an emerging technology and todays hardware is highly susceptible to noise. Running large-scale simulations allows BBVA to explore potential applications in a noise-free environment, with the potential to bring these applications to larger, more fault-tolerant quantum hardware as it matures.

"The results were exactly what we expect to obtain in a fault-tolerant quantum computer," said Javier Recuenco Andrés, head of the Technical Architecture Innovation area at BBVA CIB in charge of the pilot project. "With these trials, we have shown that at BBVA we can have a proprietary architecture for executing quantum algorithms, which would help further our exploration of their use in complex financial tasks."

"At BBVA we explore the potential of quantum computing for two main reasons: to try to find better solutions to business problems, and to strengthen the security of our communications and data to counteract the malicious use of quantum computing by third parties," explained Escolástico Sánchez, leader of the Quantum discipline at BBVA. "The distributed quantum simulation pilot we have successfully completed is a further step in this exploration, which could enable different business units of the bank to leverage this technology."


Apple to employ AI cloud servers using its own processors – Macworld

A new Bloomberg report details a project code-named ACDC, for Apple Chips in Data Centers, in which Apple will use its own silicon to provide cloud AI services. Apple is going big on AI with iOS 18 and macOS 15, and while on-device processing will be a big differentiator for the company, more advanced tasks will require the resources of big server infrastructure.

According to the report, the plan to use Apple's own chips for cloud infrastructure began three years ago but has been accelerated by the need to quickly bring advanced AI features to market. The first AI server chips will be M2 Ultra processors, it says, but there are already plans to upgrade to chips based on the M4 series in the future.

Apple is expected to perform relatively simple generative AI tasks (like summarizing your missed text messages) on-device, especially those that use your private data which Apple will surely want to ensure stays on your iPhone. The cloud would be used for more intensive gen-AI tasks like image generation or composing lengthier emails. According to the report, the upgraded version of the Siri voice assistant would use cloud processing as well, though we expect simple answers and tasks that use the information contained on your iPhone to still be processed and executed entirely on-device as they are now.

The company is expected to use its own data centers at first but, just as it does with iCloud, will augment them with third-party data centers from Google or other partners.

We'll hear more about Apple's AI plans and products in about one month at WWDC.

See the original post here:
Apple to employ AI cloud servers using its own processors - Macworld

Read More..

Chinese server CPU beats Microsoft, Google and AWS rivals to grab performance crown Alibaba’s Yitian 710 is … – TechRadar

Alibaba Cloud's Yitian 710 processor is the most efficient Arm-based server processor for database tasks in hyperscale cloud environments around today, new research has claimed.

A recent study published in the IEEE journal Transactions on Cloud Computing found the 128-core processor, developed in 2021, not only trumps rival Arm-based chips but is also reported to run circles around Intel's Xeon Platinum (Sapphire Rapids) processor when it comes to specific database tasks in the cloud.

This finding comes from a research paper titled "Are Arm Cloud Servers Ready for Database Workloads? An Experimental Study," produced by research assistant professor Dumitrel Loghin from the School of Computing at the National University of Singapore. The study, conducted across eight cloud servers, tested the performance of five Arm-powered server CPUs, including the Yitian 710, and pitted them against Intel's x86 Xeon Platinum 8488C processor (launched in 2023).

Key players such as AWS, Microsoft Azure, and Google Cloud Platform are no strangers to 64-bit Arm CPUs, having introduced their own versions of virtual machines running on these servers. AWS's Graviton2 and Graviton3, Alibaba's Yitian 710, Huawei's Kunpeng 920, and the Ampere Altra CPUs used by Azure and GCP were all included in the analysis.

Alibaba's Yitian 710 was ahead of its rivals in the synthetic Dhrystone and Whetstone benchmarks, and the study concluded that it and AWS's Graviton3 are genuine rivals to Intel's Xeon CPUs, showing equal or even superior results for in-memory workloads. That said, for Online Analytical Processing (OLAP), machine learning inference, and blockchain tasks, Arm-based servers struggled to match the Xeon. The lag was chalked up mainly to unoptimized software, lower clock frequencies, and subpar per-core performance.
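Dhrystone and Whetstone are classic synthetic benchmarks: they time a fixed mix of integer (Dhrystone) or floating-point (Whetstone) operations so results can be compared across CPUs. The sketch below shows only the general idea with the Python standard library; it is not the real Dhrystone or Whetstone code, and the kernels are invented for illustration.

```python
# Illustrative micro-benchmark in the spirit of Dhrystone (integer mix)
# and Whetstone (floating-point mix): time a fixed workload and report
# loop iterations per second. Not the actual benchmark suites.
import time

def integer_kernel(n: int) -> int:
    """Dhrystone-style: integer arithmetic, assignment, and a modulus."""
    acc = 0
    for i in range(n):
        acc = (acc + i * 3) % 65536
    return acc

def float_kernel(n: int) -> float:
    """Whetstone-style: repeated floating-point multiply, add, divide."""
    x = 1.0
    for _ in range(n):
        x = (x * 1.000001 + 0.5) / 1.0000005
    return x

def score(kernel, n: int = 100_000) -> float:
    """Run a kernel once and return loop iterations per second."""
    start = time.perf_counter()
    kernel(n)
    elapsed = time.perf_counter() - start
    return n / elapsed

print(f"integer: {score(integer_kernel):,.0f} iters/s")
print(f"float:   {score(float_kernel):,.0f} iters/s")
```

Real synthetic benchmarks use carefully chosen instruction mixes and report standardized scores (DMIPS, MWIPS), but the measurement loop above is the same shape: fixed work, wall-clock timing, throughput as the score.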

You can view the full set of benchmarks in The Register's report, which also notes that the Yitian 710 has some inherent advantages: it uses a newer version of the Arm ISA, and speedy DDR5 RAM that some rival CPUs can't utilize.

The report concludes that while "ARM servers spend 2X more time in Linux kernel system calls compared to Xeon servers," they show great potential: "Given their lower cloud computing price, ARM servers could be the ideal choice when the performance is not critical."
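"Time in kernel system calls" is the system (kernel-mode) share of a process's CPU time, as opposed to user-mode time spent in its own code. A small standard-library sketch of how that split can be observed, using a deliberately syscall-heavy loop (the workload here is invented for illustration, not from the study):

```python
# Sketch: observe user-mode vs. kernel-mode CPU time from Python.
# Each flushed one-byte write forces a write() system call, so this loop
# accumulates "system" time -- the quantity behind the study's 2X figure.
import os
import tempfile

before = os.times()
with tempfile.TemporaryFile() as f:
    for _ in range(20_000):
        f.write(b"x")   # buffered in user space...
        f.flush()       # ...flushed, issuing a write() syscall
after = os.times()

user_time = after.user - before.user        # CPU time in our own code
system_time = after.system - before.system  # CPU time inside kernel syscalls
print(f"user: {user_time:.3f}s  system: {system_time:.3f}s")
```

On a syscall-heavy workload like this, the system component typically dominates; tools such as `time(1)` or `strace -c` report the same breakdown at the process level.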


See original here:
Chinese server CPU beats Microsoft, Google and AWS rivals to grab performance crown Alibaba's Yitian 710 is ... - TechRadar

Read More..

Apple plans to use M2 Ultra chips in the cloud for AI – The Verge

Apple plans to start its foray into generative AI by offloading complex queries to M2 Ultra chips running in data centers before moving to its more advanced M4 chips.

Bloomberg reports that Apple plans to put its M2 Ultra on cloud servers to run more complex AI queries, while simple tasks are processed on devices. The Wall Street Journal previously reported that Apple wanted to make custom chips to bring to data centers to ensure security and privacy in a project the publication says is called Project ACDC, or Apple Chips in Data Center. But the company now believes its existing processors already have sufficient security and privacy components.

The chips will be deployed to Apple's data centers and eventually to servers run by third parties. Apple runs its own servers across the United States and has been working on a new center in Waukee, Iowa, which it first announced in 2017.

While Apple has not moved as fast on generative AI as competitors like Google, Meta, and Microsoft, the company has been putting out research on the technology. In December, Apple's machine learning research team released MLX, a machine learning framework that can make AI models run efficiently on Apple silicon. The company has also been releasing other research around AI models that hints at what AI could look like on its devices and how existing products, like Siri, may get an upgrade.

Apple put a big emphasis on AI performance in its announcement of the new M4 chip, saying its new neural engine is "an outrageously powerful chip for AI."

Read more here:
Apple plans to use M2 Ultra chips in the cloud for AI - The Verge

Read More..

For complex iPhone AI tasks, Apple will use cloud-based servers running M-series chips – PhoneArena

Apple is planning to have more complex AI tasks for iPhones, iPads, and Macs sent through the cloud to data centers using servers powered by Apple's own chips. Less complicated AI tasks will be handled directly on-device, which will make them faster and more secure.

According to a report in Bloomberg written by the news agency's chief Apple correspondent Mark Gurman, the first chips used to power the servers in the data centers will be the M2 Ultra, the chip currently used in the Mac Pro and Mac Studio. The scuttlebutt calls for Apple to eventually develop an M4 Ultra chip to power those servers.

Apparently Apple came up with a plan to use its own chips and cloud-based servers to run complex AI tasks three years ago but decided to accelerate the timeline once OpenAI kicked off the latest AI craze with the ChatGPT chatbot. In December 2022, when ChatGPT first started to become known to the public, Gmail developer Paul Buchheit said that AI will do to internet search what Google did to the Yellow Pages: make the older technology obsolete.

Apple will reportedly use the M2 Ultra chip to power the first servers used in the data centers

If you're like me, you can't wait to see how Siri is affected by Apple's AI initiative. The digital assistant, originally launched with the iPhone 4s in 2011, soon found itself not as useful as Google Assistant, with too many responses consisting of excerpts from three websites. Hopefully the use of AI will help Siri deliver more precise responses to queries.

Read more:
For complex iPhone AI tasks, Apple will use cloud-based servers running M-series chips - PhoneArena

Read More..