Category Archives: Quantum Computer

Top Academics: Here’s How We Facilitate the Next Big Leap in Quantum Computing – PCMag Middle East

Table of Contents
From Quantum Physics to Quantum Computing
Grand Challenges and Error Correction
The Road to Quantum Advantage
Education and Workforce Development
The Quantum Bottom Line

In advance of the ribbon-cutting for its new IBM System One quantum computer, the first one on a college campus, Rensselaer Polytechnic Institute (RPI) last week hosted a quantum computing day which featured several prominent speakers who together provided a snapshot of where the field is now. I've been writing about quantum computing for a long time, and have noted some big improvements, but there are also a host of challenges that still need to be overcome.

Here are some highlights.

The first plenary speaker was Jay M. Gambetta, Vice President of Quantum Computing at IBM, who gave an overview of the history and progress of quantum computing, as well as the challenges and opportunities ahead. He explained that quantum computing is based on exploiting the quantum mechanical properties of qubits, such as superposition and entanglement, to perform computations that are impossible or intractable for classical computers. He talked about watching the development of superconducting qubits, as they moved from single-qubit systems in 2007, to 3-qubit systems in 2011, and now to IBM's Eagle chip, which has 127 qubits and is the heart of the Quantum System One.

He then asked how we could make quantum computing useful. His answer: We need to keep building larger and larger systems and we need to improve error correction.

"There are very strong reasons to believe there are problems that are going to be easy for a quantum computer but hard for a classical computer, and this is why we're all excited," Gambetta said. He discussed the development of quantum circuits and that while the number of qubits was important, equally important was the "depth," detailing how many operations you can do and the accuracy of the results. Key to solving this are larger and larger systems, and also error mitigation, a topic that would be discussed in much greater detail later in the day.

To get to "quantum utility"which he said would be reached when a quantum computer is better than a brute force simulation of a quantum computer on a classical machineyou would need larger systems with at least 1000 gates, along with improved accuracy and depth, and new efficient algorithms.

He talked about quantum algorithmic discovery, which means finding new and efficient ways to map problems to quantum circuits. One example is a new variation on Shor's algorithm, which allows factorization far faster than would be possible on a classical computer. "The future of running error-mitigated circuits and mixing classical and quantum circuits sets us up to explore this space," he said.

In a panel discussion that followed, James Misewich from Brookhaven National Laboratory discussed his interest in using quantum computing to understand quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. QCD is a hard problem that scales well with the number of qubits and the circuit depth, and he is looking at entanglement between jets coming out of particle collisions as a possible avenue to explore quantum advantage.

Jian Shi and Ravishankar Sundararaman from RPI's Materials Science and Engineering faculty talked about computational materials science, and applying quantum computing to discover new materials and properties. Shi noted there was a huge community now doing quantum chemistry, but there is a gap between that and quantum computing. He stressed that a partnership between the two groups will be important, so each learns the language of the other and can approach the problems from a different perspective.

One of the most interesting talks was given by Steve M. Girvin, Eugene Higgins Professor of Physics at Yale University, who discussed the challenges of creating an error-corrected quantum computer.

Girvin described how the first quantum revolution was the development of things like the transistor, the laser, and the atomic clock, while the second quantum revolution is based on a new understanding of how quantum mechanics works. He said he usually tells his students that they do the things Einstein said were impossible just to make sure that what they have is a quantum computer and not a classical one.

He thought there was a bit too much hype around quantum computing today: quantum is going to be revolutionary and do absolutely amazing things, but its time has not yet arrived. We still have massive problems to solve.

He noted that quantum systems are extremely sensitive to external perturbations and noise, which is great for making sensors but bad for building computers. That is why error correction is so important.

Among the issues Girvin discussed was making measurements to detect errors; he said we also need calculations to decide whether something truly is an error, where it is located, and what kind of error it is. Then there is the question of what signals to send to correct those errors. Beyond that, there is the challenge of putting these pieces together into a system that reduces overall errors, perhaps borrowing from the flow-control techniques used in fields such as telephony.

In addition to quantum error detection, Girvin said there are "grand challenges all up and down the stack," from materials to measurement to machine models and algorithms. We need to know how to make each layer of the stack more efficient, using less energy and fewer qubits, and get to higher performance so people can use these to solve science problems or economically interesting problems.

Then there are the algorithms. Girvin noted that there were algorithms way before there were computers, but it took time to decide on the best ones for classical computing. For quantum computing, this is just the beginning, and over time, we need people to figure out how to build up their algorithms and how to do heuristics. They need to discover why quantum computers are so hard to program and develop clever tools to solve those problems.

Another challenge he described was routing quantum information. He noted that having two quantum computers that can communicate only classically is exponentially worse than having two quantum computers that can exchange quantum information and entangle with each other.

He talked about fault tolerance, which is the ability to correct errors even when your error correction circuit makes errors. He believes the fact that it's possible to do that in a quantum system, at least in principle, is even more amazing than the fact that a perfect quantum computer could do interesting quantum calculations.

Girvin described the difficulty of correcting errors: you have an unknown quantum state, and you're not allowed to know what it is, because it comes from the middle of a quantum computation. (If you learn what it is, you've destroyed the superposition, and if you measure it to see whether there's an error, it will randomly change due to state collapse.) Your job is nevertheless to fix any error that develops.

"That's pretty hard, but miraculously it can be done in principle, and it's even been done in practice," he said. We're just entering the era of being able to do it. The basic idea is to build in redundancy, such as building a logical qubit that consists of multiple physical qubits, perhaps nine. Then you have two possible giant entangled states corresponding to a logical Zero and a logical One. Note the one and zero aren't living in any single physical qubit, both are only the superposition of multiple ones.

In that case, Girvin says, if the environment reaches in and measures one of those qubits, it doesn't actually learn the logical state. There's an error, but the environment doesn't know which state it has disturbed, so there's still a chance that you haven't totally collapsed anything and lost the information.

He then discussed measuring the probability of errors and seeing whether it exceeds some threshold value, with some complex math, and then correcting the errors, hopefully quickly, something that should improve with new error-correction methods and better, more precise physical qubits.

All this is still largely theoretical. In Girvin's view, fault tolerance is a journey, with improvements being made continuously. (This contrasted with Gambetta, who said systems are either fault tolerant or they aren't.) Overall, Girvin said, "We still have a long way to go, but we're moving in the right direction."

Later in the morning, Austin Minnich, Professor of Mechanical Engineering and Applied Physics at Caltech, described "mid-circuit measurement" and the need for hybrid circuits as a way of finding, and thus mitigating, errors.

In a discussion that followed, Kerstin Kleese van Dam, Director of the Computational Science Initiative at Brookhaven National Laboratory, explained that her team was looking for answers to problems, whether solved on traditional or quantum machines. She said there were problems they can't solve accurately on a traditional computer, but there remains the question of whether the accuracy will matter. There are areas, such as machine learning, where quantum computers can do things accurately. She predicts that quantum advantage will come when we have systems that are large enough. But she also wondered about energy consumption, noting that a lot of power goes into today's AI models, and asked whether quantum could be more efficient.

Shekhar Garde, Dean of the School of Engineering, RPI, who moderated this part of the discussion, compared the status of quantum computing today to where traditional computing was in the late 70s or early 80s. He asked what the next 10 years would bring.

Kleese van Dam said that within 10 years, we would see hybrid systems that combine quantum and classical computing, but she also hoped we would see libraries transferred from high-performance computing to quantum systems, so a programmer could use them without having to understand how the gates work. Aparna Gupta, Professor and Associate Dean of RPI's Lally School of Management, would bet on the hybrid approach offering easier access and better cost-effectiveness, as well as "taking away the intrigue and the spooky aspects of quantum, so it is becoming real for all of us."

Antonio Córcoles, Principal Research Scientist at IBM Quantum, said he hoped users who don't know quantum will be able to use the system because the complexity will be abstracted away, but that can take a long time. In the meantime, they can develop quantum error correction in a way that is not as disruptive as current methods. Minnich talked about "blind quantum computing," where many smaller machines might be linked together.

One of the most interesting talks came from Lin Lin, Professor of Mathematics at the University of California, Berkeley, who discussed the theoretical aspects and challenges of achieving quantum advantage for scientific computation. He defined quantum advantage as the ability to solve problems that are quantumly easy but classically hard, and proposed a hierarchy of four levels of problems.

Lin said that for the first two levels, a lot of people think quantum advantage will be achieved, as the methods are generally understood. But on the next two levels, there needs to be a lot of work on the algorithms to see if it will work. That's why this is an exciting time for mathematicians as well as physicists, chemists, and computer scientists.

This talk was followed by a panel during which Lin said that he is interested in solving quantum many-body problems, as well as applying quantum computing to other areas of mathematics, such as numerical analysis and linear algebra.

Like Garde above, Lin compared where quantum is today to the past, going even further to say it's where classical computing was 60 or 70 years ago, when error correction was still a central concern. Quantum computing will need to be a very interdisciplinary field: it will require people who are very good at building the machines, but those machines will always produce errors, so it will also require both mathematical and engineering ways of correcting them.

Ryan Sweke from IBM Research noted that one of the things that has allowed classical computing to develop to its current point is its layers of abstraction: if you want to work on developing algorithms, you don't have to understand how the compiler works, and if you want to understand how the compiler works, you don't have to understand how the hardware works.

The interesting thing in the quantum regime, as seen in error mitigation, for example, is that people working at the top level of abstraction have to interact with the people developing the devices. This is an exciting aspect of the time we're in.

Di Fang, Assistant Professor of Mathematics, Duke University, said now was a "golden time for people who work on proving algorithms." She talked about the varying levels of complexity, and the need to see where new algorithms can solve theoretical problems, then look at the hardware and solve practical problems.

Brian McDermott, Principal R&D Engineer at the Naval Nuclear Laboratory, said he was looking at this in reverse, seeing what the problems are and then working backward toward the quantum hardware and software. His job involved matching applications of new and emerging computing architectures to the types of engineering problems that are important to the lab's mission for new nuclear propulsion.

The panelists discussed where quantum algorithms could have the most impact. McDermott talked about things like finite elements and computational fluid dynamics, going up to materials science. As a nuclear engineer, he was first attracted to the field because of the quantum properties of the nucleus itself, moving from predicting behaviors in astrophysics, such as the synthesis of nuclei in a supernova, to engineering applications in nuclear reactors and things like fusion. Lin discussed the possibilities for studying molecular dynamics.

Olivia Lanes, Global Lead and Manager for IBM Quantum Learning and Education, gave the final talk of the day, in which she discussed the need for workforce development in the quantum field.

Already the US is projected to face a shortfall of nearly two million STEM workers by next year. She quoted Carl Sagan, who said "We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology," and agreed with him that this is a recipe for disaster.

She noted that not only do very few people understand quantum computing, very few actually understand how classical computers work. She cited a McKinsey study which found that there are three open jobs in quantum for every person qualified to fill those positions. It's probably just going to get worse from here to 2026.

She focused on upskilling and said it was unrealistic to expect that we'll make everyone into experts in quantum computing. But, there were a lot of other jobs that are part of the quantum ecosystem that will be required, and urged students to focus on the areas they are particularly interested in.

In general, she recommended getting a college degree (not surprising, since she was talking at a college), considering graduate school, or finding some other way to get relevant experience in the field, and building up rare skills. "Find the one thing that you can do better than anybody else and market that thing. You can make that thing applicable to any career that you really want for the most part," she said. "Stop letting the physicists hog quantum; they've had a monopoly here for too long and that needs to change."

Similar concepts were voiced in a panel that followed. Anastasia Marchenkova, Quantum Researcher, Bleximo Corporation, said that there was lots of pop science, and lots of research, but not much in the middle. She said we need to teach people enough so they can use quantum computing, even if they aren't computer scientists.

Richard Plotka, Director of Information Technology and Web Science at RPI, said it was important to create middleware tools that can be applied to quantum so that the existing workforce can take advantage of these computers. He also said it was important to prepare students for a career in the future, with foundational knowledge, so they have the ability to adapt, because quantum in five or ten years won't look like it does today.

All told, it was a fascinating day of speakers. I was intrigued by software developers explaining the challenge in writing languages, compilers, and libraries for quantum. One explained that you can't use traditional structures such as "if-then" because you won't know "if." Parts of it were beyond my understanding, and I remain skeptical about how quickly quantum will become practical and how broad the applications may be.

Still, it's an important and interesting technology that is sure to get even more attention in the coming years, as researchers meet some of the challenges. It's good to see students getting a chance to try out the technology and discover what they can do with it.

Read more from the original source:
Top Academics: Here's How We Facilitate the Next Big Leap in Quantum Computing - PCMag Middle East

This University in New York Is the First With a Full-Fledged Quantum Computer – PCMag

On Friday, April 5, I attended the ribbon-cutting for the first quantum computer installed on a university campus, an IBM Quantum System One machine at Rensselaer Polytechnic Institute. While quantum computing has the potential to solve some problems that traditional computers can't, and has been advancing at a steady rate, there are still many questions and challenges around the technology. Installing the machine on a college campus will allow researchers to examine many of these issues and allow students to get hands-on experience with the technology.

RPI President Martin A. Schmidt (Credit: Michael J. Miller)

RPI President Martin A. Schmidt says that with this quantum computer, we will explore applications, develop algorithms, and in so doing help humanity solve some very large problems. He states that while it's easy to predict that quantum systems will rapidly become essential because of their computational power, we don't yet fully know how best to use them. He says we can anticipate that there will be important applications in biomedicine, in modeling climate and predicting weather, and in materials design; but there will be applications in many other fields.

With IBM's research in Yorktown Heights, manufacturing in Poughkeepsie, and partnerships with the University of Albany as well as RPI, he hopes for "an agglomeration effect," in which organizations in a region working together can create something where the whole is greater than the sum of the parts. Schmidt notes that there are already partnerships in the area for semiconductor research, and this has led to new factories being built in upstate New York: "Adding 'quantum valley' aspects to 'tech valley' is not only going to draw new businesses here and encourage startups, but also offer the region's existing businesses early insights into what it means to be quantum advantaged."

Schmidt hopes the system and its use by RPI and the University of Albany will help answer the question of how the United States educates a quantum-ready workforce for the near future. He notes RPI's history of 'hands-on' education and that students at all levels will be encouraged to use the machine.

Separately, Schmidt also tells me that he believes the quantum computer will be useful in attracting both faculty and students.

Curtis Priem, a cofounder of Nvidia and vice chairman of RPI, and the donor who arranged for the machine to come to RPI, notes that he enrolled at RPI initially because of this 'hands-on' approach and remarked on how even undergraduates today can use RPI's supercomputer.

IBM CEO Arvind Krishna (Credit: Michael J. Miller)

IBM CEO Arvind Krishna says that quantum systems will solve problems that we cannot solve on today's computers: problems in materials, in carbon sequestration, in drug discovery, and in lightweight materials, lubricants, and EV battery materials. "When you think about it intuitively," he says, "they come from a world of physical chemistry, which means that they are subject to the principles of quantum mechanics, which is why these systems, which kind of simulate nature, are the ones that are going to let us make progress on these problems." They also have the potential to solve problems around stochastics and financial risk.

Krishna believes that the university could uniquely help with workforce development, saying "Students are going to imagine using these systems in ways that even the inventors of these systems can't conceive." Listing a set of potential use cases, he says, "I'll make a bet that within five years students and faculty here are going to bring up use cases that are far beyond what we are imagining."

The unveiling was preceded by a day of discussions about the opportunities and the many challenges facing quantum computing before it is ready for commercial applications. I'll talk about those in my next post.


See the original post here:
This University in New York Is the First With a Full-Fledged Quantum Computer - PCMag

Quantum Encryption Advances at Oxford University Physics – yTech

Oxford researchers have made a significant leap in quantum security, which may lead to the safe deployment of quantum computing in domestic settings. The team, directed by postdoctoral research assistant Peter Drmota at Oxford University Physics, has successfully demonstrated a blind quantum computing technique on a trapped-ion quantum processor, a technology touted for its scalable quantum computing prospects.

This new approach marries quantum computing with quantum cryptography in a manner that hasn't been achieved before. It does so by ensuring that both the processed data and the algorithms used remain hidden from both the server and potential eavesdroppers. The concept relies on the principles of quantum mechanics, which state that attempting to observe or duplicate a quantum state will inevitably alter it.

In practical terms, the team's experiments used a standard fiber network to link a quantum computing server with a simple light-detecting device at a separate client computer. This allowed the client to perform computations remotely on the server without the server having access to any of the data or the algorithms being used.

Drmota finds great potential in the blind aspect of this technology, particularly in verifying the correctness of computations done by a remote quantum computer. This is crucial for problems that are beyond the scope of classical computing. The relative simplicity and scalability of the Oxford approach, incorporating existing technology like fiber networks and photon detectors, herald a future where cloud-hosted quantum servers could engage with clients worldwide to process sensitive data securely.

The research is a stride towards enabling secure, confidential quantum computations by clients with minimal resources, thereby potentially bringing quantum computing's formidable power to everyday users. This development was made possible thanks to collaborative efforts funded by the UK's Quantum Computing and Simulation Hub and contributions from various international institutions. The findings appear in the journal Physical Review Letters.

Advancements in Quantum Computing and Quantum Security

The groundbreaking research conducted by Oxford University is a notable achievement in the rapidly expanding field of quantum computing. Quantum computing is an emerging industry that boasts the potential to revolutionize various fields by performing complex computations much faster than current classical computers can. Given that quantum computing involves processing and storing information in quantum states, it brings forward not only unprecedented computational power but also unique challenges concerning data security and privacy.

Quantum security is particularly crucial as quantum computers have the potential to break current encryption methods, which would jeopardize data integrity and privacy. The blind quantum computing technique developed by Dr. Peter Drmota and his team adds an additional layer of security, allowing computations to take place without revealing the data or the algorithms to the server, thus ensuring the confidentiality of sensitive information.

Market Forecasts and Industry Growth

The global quantum computing market has been projected to grow significantly in the coming years, fueled by investments from both public and private sectors. Market analysts foresee that with continued advancements and reductions in cost, quantum computing services could become widely accessible through cloud-based models, similar to how classical computing services are offered today.

Industry Challenges and Potential Issues

Despite the optimism surrounding quantum computing, the industry is not without its challenges. One of the major hurdles lies in current technological limitations, including error rates and quantum decoherence, which can affect the stability of quantum states. Moreover, securing quantum communications to safeguard against potential quantum attacks is an ongoing area of investigation, highlighted by advancements such as the one from Oxford researchers.

Addressing the broader concerns, there is also the need to develop new standards and protocols for quantum security to ensure compatibility and protection across the various platforms and networks that may emerge. Furthermore, the issue of accessibility and education must be addressed, as the complexity of quantum computing could create a barrier to entry for many users and businesses.

As the quantum computing industry evolves, companies, governments, and educational institutions must work collaboratively to establish an ecosystem that not only fosters innovation but also ensures a secure and equitable framework for its use. Partnerships and funding, such as those from the Quantum Computing and Simulation Hub in the UK, are pivotal in supporting research that bridges the gap between theoretical quantum computing and practical, secure applications.

For readers seeking to stay updated on the latest in this transformative field or to learn more about the market and its influencers, reputable sources include the official websites for quantum technology development and research centers. One may find these sources at the main domains without any specific subpage links:

Oxford University Physics Department: physics.ox.ac.uk
Quantum Computing and Simulation Hub: qcshub.org
Physical Review Letters: aps.org

These platforms often provide insights and updates on current research, industry trends, and market forecasts, helping individuals and businesses to navigate the complexities of quantum technologies and their implications for the future.


Read this article:
Quantum Encryption Advances at Oxford University Physics - yTech

A New Dawn for Quantum Computing: Major Advancements on the Horizon – yTech

Recent research by a global consortium of scientists has reached a pivotal milestone in quantum physics that may usher in a new era of computing and technological innovation. Their study could dramatically change the landscape of everyday technology by incorporating quantum attributes into nonmagnetic materials using light at ambient conditions. This paves the way for practical quantum computing in day-to-day life.

The typically frigid realm of quantum mechanics has made a significant leap toward practical application. Scientists have discovered how to induce magnetic properties in nonmagnetic materials with light, remarkably, without requiring subzero temperatures. Considering their potential for enabling superconductivity and extraordinary magnetism in everyday materials, these findings signify an impending revolution, particularly in quantum computing applications.

The impact of this discovery is far-reaching, potentially altering every facet of technological development, from data security enhancements to magnetic-based medical technologies like MRI scanners. The notion of a quantum computer in every household, once seen as science fiction, is now a viable future prospect.

However, adapting this breakthrough to consumer-level technology is not without its challenges. Producing quantum states outside of strict laboratory settings remains a significant hurdle, and advances in production and infrastructure will be necessary to sustain this quantum leap.

This breakthrough underscores a pivotal period in technological progress and highlights the need for thoughtful deliberation on the implications of widespread quantum computing, including ethical, safety, and privacy issues. Industry experts and research institutions, such as IBM and government initiatives like Quantum.gov, continue to lead the path towards harnessing these quantum advancements.

Summary: With quantum computing set to revolutionize industries and infrastructures, scientists have made a breakthrough by inducing magnetism in nonmagnetic materials using light at room temperature. This advancement could simplify quantum computer designs and reduce costs, leading to a more practical and commercially viable technology. The excitement around this development is tempered by challenges in maintaining quantum coherence outside of lab conditions, talent shortages, and potential cybersecurity risks. Nonetheless, this transformative period in computing is poised to offer innovative solutions and a wealth of technological advancements.

Introduction to Quantum Computing Industry

Quantum computing is poised to be the next great leap in computational power, capable of addressing problems that are currently intractable for classical computers. Unlike conventional computers, which use bits that represent either a 0 or a 1, quantum computers use quantum bits or qubits that can represent both 0 and 1 simultaneously through a property known as superposition. This, combined with entanglement and quantum interference, allows quantum computers to process vast amounts of data at unprecedented speeds.
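As a minimal illustration of that superposition property (our own example in Python with Qiskit, not something drawn from the article), a single Hadamard gate puts one qubit into an equal mixture of 0 and 1:

```python
# A minimal, illustrative sketch of superposition using Qiskit (an assumed
# toolkit): a Hadamard gate turns |0> into an equal superposition of 0 and 1.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)                                   # |0> -> (|0> + |1>)/sqrt(2)

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())         # ~{'0': 0.5, '1': 0.5}
```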

Market Forecasts

The quantum computing market is projected to grow significantly in the coming years. According to recent market research, the global quantum computing market size is expected to reach multi-billion-dollar levels by the end of the decade, growing at a compound annual growth rate (CAGR) of over 20%. This growth is fueled by increasing investments from governments and private sectors in quantum technologies and research and development activities.

Industry Applications and Challenges

Industries ranging from finance and pharmaceuticals to automotive and aerospace are anticipated to benefit from quantum computing capabilities, particularly in optimization problems, machine learning applications, and simulations of molecular and chemical processes. In the financial sector, quantum computing could transform risk analysis and fraud detection, while in medicine, it could accelerate drug discovery and the personalization of treatments.

However, there are significant issues facing the industry as it moves toward commercialization. The production of qubits and the maintenance of their coherence require exacting conditions, such as extremely low temperatures and vacuum environments. One of the key challenges is to develop technology that can operate at ambient conditions while preserving quantum states, which the current breakthrough aims to address.

In addition, there are concerns about cybersecurity, as the ability of quantum computers to break traditional encryption methods could render current safety protocols obsolete. This has led to considerable interest in developing quantum-safe encryption techniques. Furthermore, integrating quantum computing into current infrastructures will require considerable development of new algorithms and software capable of exploiting quantum computational advantages.

Conclusion and Related Links

The achievement of inducing magnetism in nonmagnetic materials using light at room temperature is a considerable step toward making quantum computing more accessible and cost-effective. If these early scientific triumphs can be transitioned into practical applications, we may see quantum computing move from the realm of research labs to commercial reality.

This progress in quantum computing foreshadows an era of accelerated innovation with wide-ranging positive implications for various sectors. For further understanding of the domain and industry insights, you are encouraged to visit the main domains of leading institutions and initiatives in this field:

IBM Research for its pioneering work in quantum computing
Quantum.gov for details on the United States National Quantum Initiative

Continued research and investment are essential to overcoming the remaining technical barriers, and with the combined efforts of the scientific community and industry partners, the full potential of quantum computing may soon be realized.


See the rest here:
A New Dawn for Quantum Computing: Major Advancements on the Horizon - yTech

New method of measuring qubits promises ease of scalability in a microscopic package – EurekAlert

An artistic illustration shows how microscopic bolometers (depicted on the right) can be used to sense very weak radiation emitted from qubits (depicted on the left). (Credit: Aleksandr Käkinen/Aalto University)

Chasing ever-higher qubit counts in near-term quantum computers constantly demands new feats of engineering.

Among the troublesome hurdles of this scaling-up race is refining how qubits are measured. Devices called parametric amplifiers are traditionally used to do these measurements. But as the name suggests, the device amplifies weak signals picked up from the qubits to conduct the readout, which causes unwanted noise and can lead to decoherence of the qubits if not protected by additional large components. More importantly, the bulky size of the amplification chain becomes technically challenging to work around as qubit counts increase in size-limited refrigerators.

Cue the Aalto University research group Quantum Computing and Devices (QCD). They have a hefty track record of showing how thermal bolometers can be used as ultrasensitive detectors, and they just demonstrated in an April 10 Nature Electronics paper that bolometer measurements can be accurate enough for single-shot qubit readout.

A new method of measuring

To the chagrin of many physicists, the Heisenberg uncertainty principle determines that one cannot simultaneously know a signal's position and momentum, or voltage and current, with accuracy. So it goes with qubit measurements conducted with parametric voltage-current amplifiers. But bolometric energy sensing is a fundamentally different kind of measurement, serving as a means of evading Heisenberg's infamous rule. Since a bolometer measures power, or photon number, it is not bound to add quantum noise stemming from the Heisenberg uncertainty principle in the way that parametric amplifiers are.

Unlike amplifiers, bolometers very subtly sense microwave photons emitted from the qubit via a minimally invasive detection interface. This form factor is roughly 100 times smaller than its amplifier counterpart, making it extremely attractive as a measurement device.

"When thinking of a quantum-supreme future, it is easy to imagine high qubit counts in the thousands or even millions could be commonplace. A careful evaluation of the footprint of each component is absolutely necessary for this massive scale-up. We have shown in the Nature Electronics paper that our nanobolometers could seriously be considered as an alternative to conventional amplifiers. In our very first experiments, we found these bolometers accurate enough for single-shot readout, free of added quantum noise, and they consume 10,000 times less power than the typical amplifiers, all in a tiny bolometer, the temperature-sensitive part of which can fit inside of a single bacterium," says Aalto University Professor Mikko Möttönen, who heads the QCD research group.

Single-shot fidelity is an important metric physicists use to determine how accurately a device can detect a qubit's state in just one measurement, as opposed to an average of multiple measurements. In the case of the QCD group's experiments, they were able to obtain a single-shot fidelity of 61.8% with a readout duration of roughly 14 microseconds. When correcting for the qubit's energy relaxation time, the fidelity jumps up to 92.7%.
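For readers unfamiliar with the metric, one common way to define single-shot (assignment) fidelity is in terms of the two conditional misidentification probabilities; the sketch below uses that textbook definition with made-up numbers, and the paper itself may use a slightly different convention.

```python
# A hedged sketch of a common definition of single-shot readout (assignment)
# fidelity: F = 1 - (P(read 1 | prepared 0) + P(read 0 | prepared 1)) / 2.
# The convention and the numbers below are illustrative, not from the paper.

def assignment_fidelity(p_read1_given_0: float, p_read0_given_1: float) -> float:
    """Average probability of labelling the qubit correctly in a single shot."""
    return 1.0 - 0.5 * (p_read1_given_0 + p_read0_given_1)

# Made-up error rates for illustration only:
print(assignment_fidelity(p_read1_given_0=0.05, p_read0_given_1=0.10))  # 0.925
```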

"With minor modifications, we could expect to see bolometers approaching the desired 99.9% single-shot fidelity in 200 nanoseconds. For example, we can swap the bolometer material from metal to graphene, which has a lower heat capacity and can detect very small changes in its energy quickly. And by removing other unnecessary components between the bolometer and the chip itself, we can not only make even greater improvements on the readout fidelity, but we can achieve a smaller and simpler measurement device that makes scaling up to higher qubit counts more feasible," says András Gunyhó, the first author on the paper and a doctoral researcher in the QCD group.

Prior to demonstrating the high single-shot readout fidelity of bolometers in their most recent paper, the QCD research group first showed that bolometers can be used for ultrasensitive, real-time microwave measurements in 2019. They then published in 2020 a paper in Nature showing how bolometers made of graphene can shorten readout times to well below a microsecond.

The work was carried out in the Research Council of Finland Centre of Excellence for Quantum Technology (QTF) using OtaNano research infrastructure in collaboration with VTT Technical Research Centre of Finland and IQM Quantum Computers. It was primarily funded by the European Research Council Advanced Grant ConceptQ and the Future Makers Program of the Jane and Aatos Erkko Foundation and the Technology Industries of Finland Centennial Foundation.

Full paper:

András M. Gunyhó, Suman Kundu, Jian Ma, Wei Liu, Sakari Niemelä, Giacomo Catto, Vasilii Vadimov, Visa Vesterinen, Priyank Singh, Qiming Chen, Mikko Möttönen, Single-Shot Readout of a Superconducting Qubit Using a Thermal Detector, Nature Electronics, https://doi.org/10.1038/s41928-024-01147-7 (2024).

Contact information:

Mikko Möttönen

Professor, QCD group leader

Aalto University

mikko.mottonen@aalto.fi

m. +358505940950

András Gunyhó

Doctoral researcher

Aalto University

andras.gunyho@aalto.fi

Nature Electronics

Experimental study

Single-Shot Readout of a Superconducting Qubit Using a Thermal Detector

10-Apr-2024

M.M. declares that he is a Co-Founder and Shareholder of the quantum-computer company IQM Finland Oy. M.M. declares that he is an inventor in granted patents FI122887B, US9255839B2, JP5973445B2, and EP2619813B1, titled "Detector of single microwave photons propagating in a guide," applied for by Aalto-korkeakoulusäätiö and invented by M.M. and Jukka Pekola. This patent family describes an ultrasensitive microwave detector concept. M.M. declares that he is an inventor in pending patent applications WO2022248759A1 and TW202303466A, titled "Quantum-state readout arrangement and method," applied for by IQM Finland Oy and invented by M.M. and Juha Hassel. This patent family describes a concept of measuring the states of qubits using bolometers. The other authors declare no competing interests.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

Original post:
New method of measuring qubits promises ease of scalability in a microscopic package - EurekAlert

Breakthrough in Quantum Computing: ETH Zurich Innovates with Static Fields in Ion Trapping – yTech

ETH Zurich's recent foray into the realm of ion trapping has yielded promising advancements for quantum computing. A team of researchers at the esteemed institution has developed a method for trapping ions that could potentially enable the creation of quantum computers with greater numbers of qubits than currently possible. Utilizing static electric and magnetic fields, the group has taken quantum operations a step further, signaling a leap forward in computing capabilities.

**Summary:** Researchers at ETH Zurich have made a significant stride in quantum computing by devising an ion trapping technique that employs static electric and magnetic fields. This novel approach, utilizing Penning traps on a microfabricated chip, allows for arbitrary ion transport and offers a scalable solution that promises to increase the number of qubits in quantum computers considerably.

Quantum computer scientists are working tirelessly to overcome the limitations imposed by traditional oscillating field ion traps, such as the Paul trap, which restricts ions to linear motion and complicates the integration of multiple traps on a single chip. By means of steady fields, the ETH team's Penning traps have unlocked new potential for maneuvering ions in two dimensions without the constraints of oscillating fields, offering a boon for future quantum computing applications.

The ETH researchers, led by Jonathan Home, have reimagined the ion trap architecture, traditionally used in precision experiments, to suit the demands of quantum computing. Despite encountering initial skepticism, the team constructed an advanced Penning trap that incorporated a superconducting magnet producing a field strength of 3 Tesla. They effectively implemented precise control over the ions' energy states, proving their method's viability for quantum computation.

The trapped ions' ability to stay put for several days within this new system has marked a remarkable achievement. This stable trapping environment, free from oscillating fields and external disturbances, allowed the researchers to maintain quantum mechanical superpositions essential for operations in quantum computers.

Looking ahead, the ETH group aims to harness these innovations for multi-qubit operations by trapping two ions in adjacent Penning traps on the same chip. This ambitious endeavor would illustrate the practicality of large-scale quantum computers using static field ion traps, potentially leading to more powerful computing technologies than any seen before.

The research at ETH Zurich represents an exciting development in the field of quantum computing, an industry that is expected to revolutionize the world of computing as we know it. With the progress made in ion trapping techniques, the scalability of quantum computers could improve dramatically, culminating in machines far exceeding the capabilities of today's supercomputers.

Industry Background: Quantum computing harnesses the phenomena of quantum mechanics to perform computation. Unlike classical bits, quantum computers use qubits, which can exist in states of 0, 1, or any quantum superposition of these states. This allows quantum computers to solve certain problems, like factoring large numbers or running simulations of quantum materials, much faster than classical computers.

Market Forecasts: The quantum computing market is projected to grow significantly in the coming years. According to industry analysis, the global market size, which was valued at several hundred million dollars, is expected to reach into the billions by the end of the decade, with a compound annual growth rate (CAGR) often cited in strong double digits. This growth is driven by increasing investments from both public and private sectors and advancements in quantum computing technologies.

Industry-Related Issues: There are several challenges that the quantum computing industry faces. One of the main hurdles is quantum decoherence, where the qubits lose their quantum state due to environmental interference, posing a significant issue for maintaining quantum superpositions. Another challenge involves error rates in quantum calculations that require complex error correction methods. Furthermore, the creation and maintenance of qubits are technically demanding and expensive, requiring precise control over the physical systems that host them, like ions or other particles.

The breakthrough by ETH Zurich's researchers addresses some of these challenges by using static fields, which can potentially improve the stability and coherence times of the qubits. This could lead to advancements in quantum error correction and enable the implementation of more complex quantum algorithms.

As the demand for quantum computing continues to rise, collaboration and investment in research and development are crucial. Successful implementation of quantum computers can impact various industries, including cryptography, materials science, pharmaceuticals, and finance. For those interested in the cutting-edge developments in this field, the following sources offer valuable insights:

IBM Quantum: IBM is one of the companies at the forefront of quantum computing. They provide access to quantum computers through the cloud and are actively involved in advancing quantum computation technology.

D-Wave Systems Inc.: D-Wave is known for developing quantum annealing-based computers, specializing in solving optimization and sampling problems.

Google Quantum AI: Google's Quantum AI lab is working on developing quantum processors and novel quantum algorithms to help researchers and developers solve near-term problems across various sectors.

The innovations from the team at ETH Zurich are poised to contribute significantly to this burgeoning industry, potentially overcoming some of the critical challenges and pushing us closer to the realization of fully functional quantum computers.


Follow this link:
Breakthrough in Quantum Computing: ETH Zurich Innovates with Static Fields in Ion Trapping - yTech

Could quantum computing be South Carolina’s next economic draw? This statewide initiative says yes – Columbia … – columbiabusinessreport.com

The future of cutting-edge computer technology in South Carolina is getting a huge boost from an initiative announced March 25.

The South Carolina Quantum Association has launched an effort to develop quantum computing technology and talent in the state through $15 million approved by the South Carolina legislature in the fiscal year 2023-24 budget, the state's largest-ever investment in a tech initiative, according to information from SCQA.

SC Quantum hopes to increase collaboration among academia, entrepreneurs, industry and government to further the advancement of this technology in the Midlands and South Carolina in general, officials said.

Columbia Mayor Daniel Rickenmann, state Sen. Dick Harpootlian, and Joe Queenan, executive director of SC Quantum, announced the landmark project at an event held at the Boyd Innovation Center on Saluda Avenue in Columbia's Five Points district.

Quantum computing is a concept that many people haven't even heard of and one that is still in development. In a nutshell, it's a computing system that uses the principles of quantum physics to simulate and solve problems that are difficult for traditional digital systems to manage, according to MIT's Sloan School of Management. Quantum computing was first proposed in the 1980s, and the first well-known quantum algorithm emerged from MIT in the 1990s.

Unlike traditional computers, which use binary electric signals, quantum computers use subatomic particles called qubits, which can represent combinations of both ones and zeroes. Experts say the technology could be used to help scientists, businesses, economists and others work through complex problems and find solutions in a more efficient way.

The funds will go toward education including workforce development, certificate and micro-credential programs, entrepreneurship support and engagement projects such as gatherings of experts and quantum demonstration projects.

"Quantum is a new way of solving problems, and this initiative will allow us to build out a quantum-ready workforce able to solve important real-world problems with this cutting-edge technology," Queenan said.

Queenan noted the importance of funding quantum development because massive efforts are already underway overseas. China has recently dedicated $15 billion to development of quantum technology, and the European Union is devoting $8 billion.

The U.S. government has named quantum an industry of the future on par with artificial intelligence and 5G, and committed more than $1.2 billion for quantum research and development budgets in 2022, according to information from SCQA.

Work in the quantum computing field is already underway at the University of South Carolina, where students recently came in third at a quantum competition held at MIT and others have recently developed a prototype quantum-based hedge fund, which is showing strong returns, Queenan said.

Mayor Rickenmann said the new initiative will help develop Columbia's role as a technology research hub for the state.

"This is the right project at the right time," he said. "This is an investment in the intellectual capital of our city and state. I think we're going to see a renaissance of intellectual development here in this community."

Sen. Harpootlian, who has lived in the Five Points area for more than 50 years, said the quantum initiative being launched from there is just the latest marker of the dramatic change that has transformed the neighborhood since he first moved to the area to attend law school in 1971.

"I look back fondly on the days when this was a sleepy little village, of going to get breakfast at Gibson's and then a hot dog at Frank's Hot Dogs," Harpootlian said, referencing two iconic eateries that were symbols of the area's previous incarnation. "But those days are long gone and they aren't coming back; what's coming is much better. South Carolina Quantum is putting South Carolina ahead of the curve. Columbia could be a major hub of innovation for this technology that is rapidly growing in use across the globe."

See the rest here:
Could quantum computing be South Carolina's next economic draw? This statewide initiative says yes - Columbia ... - columbiabusinessreport.com

Shaping the Future: South Carolina’s Quantum Computing Education Initiative – yTech

A summary of the new initiative by the South Carolina Quantum Association reveals the state's forward-thinking investment in quantum computing expertise. South Carolina is funneling resources into a groundbreaking educational partnership aimed at equipping University of South Carolina students with real-world quantum computing skills. Backed by taxpayer dollars, this project is providing a platform for students to train on a cutting-edge quantum supercomputer, fostering their growth into in-demand tech professionals and invigorating local industries with innovative solutions.

In a significant development for South Carolina's aspiring quantum scientists, the state's Quantum Association is collaborating with the University of South Carolina to offer an extraordinary educational experience. The venture is supported by a $20,000 research project fund and is already yielding promising outcomes.

Finance and computer science majors at the university are piloting a quantum computing project, enhancing investment strategies for a regional bank. The quantum computer, funded through a substantial $15 million state budget allocation and accessed remotely from the University of Maryland, serves as a cornerstone for the state's burgeoning intellectual and industrial advancements.

The initiative's participants are already making waves on the national stage, having secured a top position at a notable MIT hackathon. With aspirations extending well into the investment sphere, these students are founding a hedge fund to apply their unique quantum computing insights.

South Carolina's project goes beyond mere technological enhancement. It aims to nurture top-tier talent through an expansive quantum computing curriculum and an online training platform, positioning the state to become a nexus of high-tech finance and industry professionals.

The global quantum computing industry is poised for exponential growth as this powerful technology promises to transform diverse sectors. The South Carolina initiative reflects a strategic movement to prepare for a future that demands advanced computational knowledge, amidst challenges like hardware stability and the need for specialists.

By marrying academic learning with practical application, South Carolina's initiative is setting the stage for building a proficient quantum workforce. This workforce would be adept at addressing industry challenges and leveraging the opportunities offered by this emergent technological field.

Industry and Market Forecasts

The quantum computing industry represents one of the most exciting frontiers in technology and science. As of 2023, the industry is expected to grow substantially in the coming years. According to market research, the global quantum computing market is projected to reach billions of dollars by the end of this decade, with a compounded annual growth rate (CAGR) that underscores its dynamic potential. This growth is fueled by increased investments from both public and private sectors, along with breakthroughs in quantum algorithms and hardware.

Companies across various industries, such as finance, pharmaceuticals, automotive, and aerospace, are exploring quantum computing to gain a competitive advantage. This technology holds the promise of solving complex problems that are currently intractable by classical computers, such as optimizing enormous data sets, modeling molecular interactions for drug discovery, and improving encryption methods.

Current Issues in Quantum Computing

One of the most significant issues facing the quantum computing industry is the stabilization of qubits, the basic units of quantum information. Unlike classical bits, qubits can exist in multiple states simultaneously, through a phenomenon known as superposition. However, they are also highly susceptible to interference from their environment, which can lead to errors in computation. Overcoming this challenge, often referred to as quantum decoherence, is a key focus for researchers.

Another issue is the need for a highly specialized workforce, as quantum computing requires not just expertise in computer science but also in quantum mechanics. The intricacies of quantum algorithms and the underpinning physical principles necessitate a new breed of professionals who can bridge the gap between theory and practical application.

Moreover, the industry is also working on making quantum computing more accessible. Currently, only a handful of organizations have the resources to develop and maintain a quantum computer. However, the rise of quantum computing as a service (QCaaS) models has begun to democratize access to quantum resources, allowing more players to explore the potential of this technology.

South Carolina's Role in the Quantum Computing Ecosystem

South Carolina's initiative to invest in quantum computing education highlights the importance of building a smart workforce that can contribute to and benefit from this promising industry. With their practical projects, such as improving banking investment strategies through quantum computing, students are not only contributing to innovation but are also showcasing how these complex technologies can have real-world applications.

Such initiatives prepare a new generation to play an active role in the industry, ensuring that the U.S. remains at the forefront of technological advancements. For more information on related topics, consider visiting the websites of professional organizations and industry leaders in quantum computing. Here are a couple of related links:

IBM Quantum
Google Quantum AI

By promoting education and funding in quantum computing, South Carolina is positioning itself not only as a contributor to the global quantum revolution but also as a beneficiary of the quantum economy to come. It is an example of how regional initiatives can have significant outcomes in a rapidly evolving high-tech landscape.


View post:
Shaping the Future: South Carolina's Quantum Computing Education Initiative - yTech

Call for Participation in Workshop on Potential NSF CISE Quantum Initiative – HPCwire

Editor's Note: Next month there will be a workshop to discuss what a quantum initiative led by NSF's Computer, Information Science and Engineering (CISE) directorate could entail. The details are posted below in a Call for Participation announcement. A key contact for interested quantum community members is Frank Mueller, N.C. State University.

Call for Participation: Planning Workshop on Quantum Computing (PlanQC 2024), April 28, 2024, https://www.asplos-conference.org/asplos2024/, held in conjunction with the ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2024), San Diego, CA (USA)

Funding for quantum computing has come from a variety of programs at the National Science Foundation (NSF), which have been multi-disciplinary and have cut across multiple NSF divisions. However, no NSF quantum initiative has been led by the Computer, Information Science and Engineering (CISE) directorate. Meanwhile, there is a surge in demand driven by open positions in academia and industry focused on the computing side of quantum. There is arguably a need for a focused program on quantum computing, led by CISE in cooperation with other directorates, to enable the next generation of quantum algorithms, architectures, communication, systems, software, and compilers. The objective of this workshop is to identify areas of quantum computing that can enable new discoveries in quantum science and engineering and meaningfully contribute to creating a quantum-computing-ready workforce.

To articulate this need and to develop a plan for a new CISE-led quantum program, we plan to bring together several leading senior and junior researchers in quantum computing for a planning workshop, complemented by selected participants from beyond the CISE community to foster interdisciplinary interactions.

This workshop will lead to a comprehensive report that will provide a detailed assessment of the need for a new CISE program and will provide a description of the research areas that such a program should focus on. With such a new program, NSF would be able to focus its efforts on the computing aspects of quantum science and greatly enhance its ability to sustain the leadership role of the United States in this area of strategic interest. This workshop and report will be the first stepping stone in bootstrapping such a program.

Call for participation

We invite researchers, both those already active in quantum computing and those aspiring to become active, to participate in a one-day workshop, along with some participants from quantum application domains. The focus of the workshop will be to engage in discussions and summarize findings in writing, creating a report on open quantum problems and motivating workforce development in the area. The workshop format will alternate plenary and break-out sessions to provide a unifying vision while identifying research challenges in a diverse set of subareas.

Quantum Topics

Algorithms

Architectures

Communication

Compilers/Languages

Simulation

Software

Theory

Applications

Classical control and peripheral hardware

Workshop Chairs and Organizers

Frank Mueller, North Carolina State University

Fred Chong, University of Chicago

Vipin Chaudhary, Case Western Reserve University

Samee Khan, Mississippi State University

Gokul Ravi, University of Michigan

Read the original post:
Call for Participation in Workshop on Potential NSF CISE Quantum Initiative - HPCwire

Quantum computing progress: Higher temps, better error correction – Ars Technica

There's a strong consensus that tackling most useful problems with a quantum computer will require that the computer be capable of error correction. There is absolutely no consensus, however, about what technology will allow us to achieve that. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have all committed to different technologies to get there, while a collection of startups are exploring an even wider range of potential solutions.

We probably won't have a clearer picture of what's likely to work for a few years. But there's going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we're going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

Error correction will require connecting multiple hardware qubits to act as a single unit termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error-correction schemes require over a hundred hardware qubits for each logical qubit, meaning we'd need tens of thousands of hardware qubits before we could do anything practical.
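As a hedged illustration of this idea, the following sketch uses the textbook three-qubit bit-flip code (a deliberately simple example, not the specific scheme any of these companies is pursuing), written here with Qiskit as an assumed toolkit. One logical qubit is spread across three data qubits, and two ancilla qubits measure parities that reveal which data qubit flipped without disturbing the encoded state.

```python
from qiskit import QuantumCircuit

# Three data qubits hold one logical qubit; two ancillas watch for bit-flip errors.
qc = QuantumCircuit(5, 2)

# Encode: spread the logical state of qubit 0 across qubits 1 and 2 (|0> -> |000>, |1> -> |111>).
qc.cx(0, 1)
qc.cx(0, 2)

# Inject a single bit-flip error on one data qubit for demonstration.
qc.x(1)

# Syndrome extraction: ancilla 3 records the parity of qubits (0, 1), ancilla 4 that of (1, 2).
qc.cx(0, 3)
qc.cx(1, 3)
qc.cx(1, 4)
qc.cx(2, 4)
qc.measure(3, 0)
qc.measure(4, 1)

# The two-bit syndrome (here "11") pinpoints the flipped qubit, so a correction can be
# applied without ever measuring, and thereby destroying, the logical information itself.
print(qc.draw())
```

Practical codes such as the surface code follow the same pattern at much larger scale, which is where the hundred-plus hardware qubits per logical qubit come from.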

A number of companies have looked at that problem and decided that we already know how to create hardware on that scale: just look at any silicon chip. So, if we could etch useful qubits through the same processes we use to make current processors, then scaling wouldn't be an issue. Typically, this has meant fabricating quantum dots on the surface of silicon chips and using these to store single electrons that can hold a qubit in their spin. The rest of the chip holds more traditional circuitry that performs the initiation, control, and readout of the qubit.

This creates a notable problem. Like many other qubit technologies, quantum dots need to be kept below 1 Kelvin in order to keep the environment from interfering with the qubit. And, as anyone who has ever owned an x86-based laptop knows, all the other circuitry on the silicon generates heat. So, there's the very real prospect that trying to control the qubits will raise the temperature to the point that the qubits can't hold onto their state.

That might not be as big a problem as we thought, according to work published in Wednesday's Nature. A large international team that includes researchers from the startup Diraq has shown that a silicon quantum dot processor can work well at the relatively toasty temperature of 1 Kelvin, up from the millikelvin temperatures at which these processors normally operate.

The work was done on a two-qubit prototype made with materials that were specifically chosen to improve noise tolerance; the experimental procedure was also optimized to limit errors. The team then performed normal operations starting at 0.1 K and gradually ramped the temperature up to 1.5 K, checking performance as they did so. They found that a major source of errors, state preparation and measurement (SPAM), didn't change dramatically in this temperature range: "SPAM around 1 K is comparable to that at millikelvin temperatures and remains workable at least until 1.4 K."

The error rates they did see depended on the state they were preparing. One particular state (both spin-up) had a fidelity of over 99 percent, while the rest were less constrained, at somewhere above 95 percent. States had a lifetime of over a millisecond, which qualifies as long-lived in the quantum world.
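For a rough sense of how such a lifetime is typically characterized, here is an illustrative Python sketch with synthetic numbers (not the paper's data) that fits the standard exponential decay model P(t) = exp(-t/T1) to simulated survival probabilities measured after increasing delays.

```python
import numpy as np

# Illustrative only: survival probability of a prepared state after varying delays,
# assuming the usual exponential decay model P(t) = exp(-t / T1).
delays_us = np.array([0, 200, 400, 600, 800, 1000])   # delay times in microseconds
true_T1_us = 1200.0                                    # "true" lifetime used to generate fake data
rng = np.random.default_rng(0)
survival = np.exp(-delays_us / true_T1_us) + rng.normal(0, 0.01, delays_us.size)

# Fit T1 from the slope of log(P) versus t, since log P = -t / T1.
slope, _intercept = np.polyfit(delays_us, np.log(np.clip(survival, 1e-6, None)), 1)
print(f"estimated T1 ~ {-1 / slope:.0f} microseconds, i.e. on the order of a millisecond")
```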

All of which is pretty good and suggests that the chips can tolerate reasonable operating temperatures, meaning on-chip control circuitry can be used without causing problems. The error rates of the hardware qubits are still well above those that would be needed for error correction to work. However, the researchers suggest that they've identified error processes that can potentially be compensated for. They expect that the ability to do industrial-scale manufacturing will ultimately lead to working hardware.

Read this article:
Quantum computing progress: Higher temps, better error correction - Ars Technica