Category Archives: Quantum Computing
Enhancing Quantum Error Correction Effectiveness – AZoQuantum
Apr 30, 2024. Reviewed by Lexie Corner
In a study published in the journal Nature Physics, a team of scientists led by researchers from the University of Chicago's Pritzker School of Molecular Engineering (PME) created the blueprint for a quantum computer that can fix errors more efficiently.
Although quantum computers are an extremely potent computational tool, their delicate qubits challenge engineers: how can they design useful, functional quantum systems using bits that are easily disrupted and erased of data by minute changes in their environment?
Engineers have long grappled with how to make quantum computers less error-prone, frequently creating methods to identify and rectify problems rather than preventing them in the first place. However, many of these error-correction systems entail replicating information over hundreds or thousands of physical qubits simultaneously, making it difficult to scale up efficiently.
The system pairs a new framework based on quantum low-density parity-check (qLDPC) codes, which detect errors by examining the relationships between bits, with reconfigurable atom array hardware, which enables qubits to communicate with more neighbors and, consequently, allows the qLDPC data to be encoded in fewer qubits.
With this proposed blueprint, we have reduced the overhead required for quantum error correction, which opens new avenues for scaling up quantum computers.
Liang Jiang, Study Senior Author and Professor, Pritzker School of Molecular Engineering, University of Chicago
While standard computers rely on digital bits, in an on or off position, to encode data, qubits can exist in states of superposition, giving them the ability to tackle new computational problems. However, qubits' unique properties also make them incredibly sensitive to their environment; they change states based on the surrounding temperature and electromagnetism.
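For readers who want a concrete picture, here is a minimal sketch (illustrative only, not from the study) of the difference: a classical bit is a single definite value, while a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

bit = 1  # a classical bit: definitely 0 or definitely 1

# An equal superposition: measurement yields 0 or 1 with 50% probability each.
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

probabilities = np.abs(qubit) ** 2
print(bit, probabilities)                     # 1 [0.5 0.5]
print(np.isclose(probabilities.sum(), 1.0))   # True: amplitudes are normalized
```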
Quantum systems are intrinsically noisy. There's really no way to build a quantum machine that won't have error. You need to have a way of doing active error correction if you want to scale up your quantum system and make it useful for practical tasks.
Qian Xu, Graduate Student, Pritzker School of Molecular Engineering, University of Chicago
For the past few decades, scientists have primarily relied on one type of error correction, known as surface codes, for quantum systems. In these systems, users encode the same logical information into several physical bits grouped in a wide two-dimensional grid. Errors can be detected by comparing qubits to their immediate neighbors. A mismatch indicates that one qubit misfired.
Xu added, "The problem with this is that you need a huge resource overhead. In some of these systems, you need one thousand physical qubits for every logical qubit, so in the long run, we don't think we can scale this up to very large computers."
Jiang, Xu, and colleagues from Harvard University, Caltech, the University of Arizona, and QuEra Computing designed a novel method to fix errors using qLDPC codes. This type of error correction had long been contemplated but not included in a realistic plan.
With qLDPC codes, data in qubits is compared not only to immediate neighbors but also to more distant qubits. This enables a smaller grid of qubits to perform the same number of comparisons for error correction. However, long-distance communication between qubits has always been a challenge when implementing qLDPC.
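As a rough illustration of the idea (a classical analogy, not the paper's actual quantum code), a low-density parity-check code uses sparse checks in which each check touches only a few bits, but those bits need not be adjacent; the pattern of violated checks (the syndrome) points to the error. The matrix below is hypothetical.

```python
import numpy as np

# Hypothetical sparse parity-check matrix: 3 checks over 7 data bits.
# Check 0 couples bit 0 with the distant bits 3 and 6 (long-range connectivity).
H = np.array([
    [1, 0, 0, 1, 0, 0, 1],
    [0, 1, 0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1, 0, 0],
])

codeword = np.zeros(7, dtype=int)   # all-zeros codeword: syndrome should be zero
error = np.zeros(7, dtype=int)
error[3] = 1                        # a single bit-flip on data bit 3

syndrome = (H @ ((codeword + error) % 2)) % 2
print(syndrome)                     # [1 0 1] -> checks 0 and 2 fire, pointing at bit 3
```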
The researchers devised a solution in the form of new hardware: reconfigurable atoms that can be relocated using lasers to enable qubits to communicate with new partners.
With today's reconfigurable atom array systems, we can control and manipulate more than a thousand physical qubits with high fidelity and connect qubits separated by a large distance. By matching the structure of quantum codes and these hardware capabilities, we can implement these more advanced qLDPC codes with only a few control lines, putting the realization of them within reach with today's experimental systems.
Harry Zhou, Ph.D. Student, Harvard University
When researchers paired qLDPC codes with reconfigurable neutral-atom arrays, they achieved a lower error rate than surface codes using only a few hundred physical qubits. When scaled up, quantum algorithms requiring thousands of logical qubits might be completed with fewer than 100,000 physical qubits, vastly outperforming the gold-standard surface codes.
"There's still redundancy in terms of encoding the data in multiple physical qubits, but the idea is that we have reduced that redundancy by a lot," Xu added.
Though scientists are developing atom-array platforms quickly, the framework is still theoretical and represents a step toward the real-world use of error-corrected quantum computation. The PME team is now striving to improve its design even more and ensure that reconfigurable atom arrays and logical qubits relying on qLDPC codes can be employed in computation.
Xu concluded, "We think in the long run, this will allow us to build very large quantum computers with lower error rates."
Xu, Q., et al. (2024) Constant-overhead fault-tolerant quantum computation with reconfigurable atom arrays. Nature Physics. doi:10.1038/s41567-024-02479-z
Source: https://www.uchicago.edu/en
Read the original post:
Enhancing Quantum Error Correction Effectiveness - AZoQuantum
Global Quantum Processors Industry Research 2024: A $5+ Billion Market by 2033 – Collaborations and Partnerships … – GlobeNewswire
Dublin, May 01, 2024 (GLOBE NEWSWIRE) -- The "Global Quantum Processors Market - A Global and Regional Analysis: Focus on Application, Type, Business Model, and Regional and Country-Level Analysis - Analysis and Forecast, 2023-2033" report has been added to ResearchAndMarkets.com's offering.
The global quantum processors market is projected to reach a value of $5.02 billion by 2033 from $1.07 billion in 2023, growing at a CAGR of 16.7%.
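The headline figures are internally consistent; a quick back-of-the-envelope check using only the numbers quoted in the press release:

```python
start, end, years = 1.07, 5.02, 10   # $ billions, 2023 -> 2033
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")                 # ~16.7%, matching the stated CAGR
```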
The global quantum processors market is experiencing rapid growth, driven by advancements in quantum computing technology and increasing investments from both the public and private sectors. Quantum processors, the core components of quantum computers, offer the potential to solve complex problems at speeds far beyond traditional computing systems.
This has led to heightened interest from industries such as healthcare, finance, and cybersecurity, where quantum computing promises groundbreaking solutions. Key players in the quantum processors market are continuously striving to enhance processor performance, scalability, and reliability to meet the evolving demands of various applications.
Additionally, collaborations between technology companies, research institutions, and government agencies are fostering innovation and accelerating the commercialization of quantum processors. Despite these advancements, challenges such as maintaining qubit coherence and error correction remain significant barriers to widespread adoption. However, ongoing research efforts and investments in quantum computing infrastructure are expected to drive the market forward, unlocking new possibilities across industries and reshaping the computing landscape in the years to come.
Market Lifecycle Stage
The global quantum processors market is undergoing rapid evolution, characterized by distinct phases of introduction, growth, maturity, and potential decline. In the introductory phase, pioneering companies and research institutions are driving innovation, developing prototypes, and exploring potential applications. As technological advancements and investments surge, the market enters a phase of rapid growth, marked by increasing demand from various sectors such as finance, healthcare, and cybersecurity.
This growth phase sees the emergence of new players, intensified competition, and acceleration of commercialization efforts. In the maturity phase, quantum processors become more mainstream, with established use cases and a growing customer base. Market saturation may occur as competition reaches its peak, leading to price stabilization and consolidation among key players. However, innovation remains crucial to sustaining market momentum and staying ahead of competitors.
The future trajectory of the quantum processors market depends on factors such as technological breakthroughs, regulatory environment, and market acceptance. While the potential for transformative impact is immense, challenges such as scalability, error correction, and cost-effectiveness need to be addressed to ensure sustained growth and market relevance.
Industrial Impact
The advent of quantum processors marks a revolutionary stride in computing technology, promising unprecedented capabilities that could redefine various industries. In the realm of finance, quantum processors hold the potential to revolutionize complex calculations, optimizing trading strategies, risk assessment, and portfolio management.
Additionally, quantum computing can enhance data encryption techniques, crucial for safeguarding sensitive financial information in the banking and cybersecurity sectors. In healthcare, quantum processors promise to accelerate drug discovery processes by simulating molecular interactions and predicting compound behaviors with unparalleled accuracy.
Furthermore, industries reliant on optimization problems, such as logistics and supply chain management, stand to benefit from quantum computing's ability to solve complex logistical challenges efficiently. As quantum computing continues to advance, its impact across industries is poised to reshape business operations, drive innovation, and unlock new avenues for growth and development.
Key Market Players and Competition Synopsis
The global quantum processors market has been segmented by type. In terms of value in 2022, superconducting qubits accounted for around 43.05% of the total market, photonic qubits around 20.94%, trapped-ion qubits around 20.29%, quantum dots around 6.15%, cold atom processors around 4.69%, topological qubits around 2.76%, and cell assembly around 2.14%.
Key Attributes:
Market Dynamics Overview
Photonics: The Next Big Quantum Computing Technology
Trends: Current and Future Impact Assessment
Market Drivers
Market Challenges
Market Opportunities
Company Profiles
Superconducting Qubits
Trapped-Ion Qubits
Topological Qubits
Quantum Dots
Photonic Qubits
Cell Assembly
Cold Atom Quantum Processors
Supply Chain Overview
Research and Development Review
Snapshot of the Quantum Computing Market
For more information about this report visit https://www.researchandmarkets.com/r/tb6y6k
About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.
See original here:
Global Quantum Processors Industry Research 2024: A $5+ Billion Market by 2033 - Collaborations and Partnerships ... - GlobeNewswire
Intel Research Opens the Door for Mass Production of Silicon-based Quantum Processors, a Requirement for Making … – IndianWeb2.com
Intel has made a significant advancement in quantum computing by demonstrating high fidelity and uniformity in single-electron control on spin qubit wafers. This achievement, as reported in a recent research paper, published in Nature, indicates a major step towards the scalability of silicon-based quantum processors, which are essential for the development of fault-tolerant quantum computers.
Quantum computing researchers at Intel Foundry Technology Research developed a 300-millimeter (mm) cryogenic probing process to collect high-volume data on the performance of spin qubit devices across full wafers, resulting in state-of-the-art uniformity, fidelity, and measurement statistics of spin qubits.
For the uninitiated, spin qubits are a type of quantum bit, or qubit, the fundamental building block of quantum computers. They are based on the quantum property of electron spin. In classical computing, a bit can be in one of two states: 0 or 1. In quantum computing, however, due to the principle of superposition, a qubit such as a spin qubit can be in a state that is a complex combination of both 0 and 1 simultaneously.
With this work, Intel has advanced the control of single-electron spins with high fidelity and uniformity across a wafer. This is significant because it suggests the possibility of scaling up the production of spin qubits using established semiconductor fabrication methods, which is a crucial step towards building practical quantum computers.
Intel is taking steps toward building fault-tolerant quantum computers by improving three factors: (1) qubit density, (2) reproducibility of uniform qubits, and (3) measurement statistics from high-volume testing.
This research is being conducted by Samuel Neyens and colleagues and demonstrates the application of CMOS industry techniques to the fabrication and measurement of spin qubits. The researchers successfully automated measurements of the operating point of spin qubits and probed the transitions of single electrons across full wafers. Their analysis of the random variation in single-electron operating voltages indicated that this fabrication process leads to low levels of disorder at the 300 mm scale.
This breakthrough is a key step towards scalable quantum computers capable of tackling real-world applications, as it leverages the mature chipmaking industry's methods for fabricating and testing conventional computer chips. The ability to probe single electrons with such precision is essential for the development of fault-tolerant quantum computers that require vast numbers of physical qubits.
The practical applications of probing single electrons in spin qubit wafers are still largely in the developmental stage, but the technology holds significant promise for the future of quantum computing. The ability to probe single electrons with high precision is crucial for creating scalable quantum computers, which could revolutionize various fields by performing complex computations much faster than traditional computers.
Commodore 64 claimed to outperform IBM's quantum system: sarcastic researchers say 1 MHz computer is faster … – Tom's Hardware
A paper released during the SIGBOVIK 2024 conference details an attempt to simulate the IBM quantum utility experiment on a Commodore 64. The idea might seem preposterous - pitting a 40-year-old home computer against a device powered by a 127-qubit Eagle quantum processing unit (QPU). However, the anonymous researcher(s) conclude that the Qommodore 64 performed faster, and more efficiently, than IBM's pride-and-joy, while being decently accurate on this problem.
At the beginning of the paper, the researchers admit that their Qommodore 64 project is a joke, but, sadly for IBM, its proof of quantum utility was also built upon shaky foundations, and the Qommodore 64 team came up with some convincing-looking benchmarks. There was some controversy about IBM's claims at the time, and we are reminded it took just five days for the quantum experiment to be simulated on an ordinary MacBook M1 Pro laptop. The jokey Quantum Disadvantage paper (PDF link, headlining section starts at page 199) ports this experiment to a machine packing the far more humble MOS Technology 6510 processor.
To get deep into the weeds with the quantum theory and math behind the quantum utility experiment, please follow the above PDF link. However, to summarize, the C64-based experiment uses the sparse Pauli dynamics technique developed by Begušić, Hejazi, and Chan to approximate the behavior of ferromagnetic materials. "Famously, IBM claimed such calculations were too difficult to perform on a classical computer to an acceptable accuracy, using the leading approximation techniques," recalls the paper. Not quite, and as already mentioned above, an ordinary laptop can obtain similar results.
The anonymous C64 user(s) provide some interesting details of their quantum-defeating feat. Their aggressively truncated and shallow depth-first search model used just 15kB of the spacious 64kB available on the iconic Commodore machine. Meanwhile, the final code consisted of about 2,500 lines of 6502 assembly, stored on a cartridge that fitted in the C64's expansion port. This code was handled by the mighty 1 MHz 8-bit MOS 6510 CPU. The C64 took approximately 4 minutes per data point. (Testing the same code on a modern laptop achieved roughly 800 µs per data point.)
In conclusion, the researcher(s) assert that the Qommodore 64 "is faster than the quantum device datapoint-for-datapoint," that it is "much more energy efficient," and that it is "decently accurate on this problem." On the topic of how applicable this research is to other quantum problems, it is snarkily suggested that it "probably won't work on almost any other problem (but then again, neither do quantum computers right now)." Overall, it is difficult to know whether the results are entirely genuine, though a lot of detail is provided and the linked research references in the paper seem genuine.
We know many readers are retro computing enthusiasts, as well as DIYers and makers. So it is good to know that the author(s) of this paper say that they will provide source code to allow others to replicate their results. However, source code will only be supplied in one of three formats, they say: a copy handwritten on papyrus, a slide-show of blurry screenshots recorded on a VHS tape, or "that I dictate it to you personally over the phone." So please add an extra pinch of salt to this story for that.
Read more from the original source:
Commodore 64 claimed to outperform IBM's quantum system sarcastic researchers say 1 MHz computer is faster ... - Tom's Hardware
Guaranteeing Security and Privacy: New Quantum Breakthrough Could Benefit Millions of People – SciTechDaily
The process allows a remote user (right) to access a quantum computer in the cloud (left) with complete security. Credit: Helene Hainzer, Oxford University Physics.
A recent breakthrough guaranteeing security and privacy by Oxford University physicists could enable millions of people and businesses to tap into the capabilities of next-generation quantum computing. This advance promises to unlock the transformative potential of cloud-based quantum computing and is detailed in a new study published in the influential U.S. scientific journal Physical Review Letters.
Professor David Lucas, co-head of the Oxford University Physics research team and lead scientist at the UK Quantum Computing and Simulation Hub. Credit: Martin Small
Quantum computing is developing rapidly, paving the way for new applications that could transform services in many areas like healthcare and financial services. It works in a fundamentally different way to conventional computing and is potentially far more powerful. However, it currently requires controlled conditions to remain stable and there are concerns around data authenticity and the effectiveness of current security and encryption systems.
Several leading providers of cloud-based services, like Google, Amazon, and IBM, already separately offer some elements of quantum computing. Safeguarding the privacy and security of customer data is a vital precursor to scaling up and expanding its use and for the development of new applications as the technology advances. The new study by researchers at Oxford University Physics addresses these challenges.
"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.
Experiments on quantum computing in the Beecroft facility, Oxford University Physics. Credit: David Nadlinger, Oxford University Physics.
In the new study, the researchers use an approach dubbed blind quantum computing, which connects two totally separate quantum computing entities (potentially an individual at home or in an office accessing a cloud server) in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations.
Peter Drmota, author of the new study who led the experiments on blind quantum computing at Oxford University Physics. Credit: Martin Small.
"Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr Peter Drmota, of Oxford University Physics.
The researchers created a system comprising a fiber network link between a quantum computing server and a simple device detecting photons, or particles of light, at an independent computer remotely accessing its cloud services. This allows so-called blind quantum computing over a network. Every computation incurs a correction which must be applied to all that follow and needs real-time information to comply with the algorithm. The researchers used a unique combination of quantum memory and photons to achieve this.
"Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence," said Professor David Lucas. "As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect."
The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.
Researchers exploring quantum computing and technologies at Oxford University Physics have access to the state-of-the-art Beecroft laboratory facility, specially constructed to create stable and secure conditions including eliminating vibration.
Reference: Verifiable Blind Quantum Computing with Trapped Ions and Single Photons by P. Drmota, D. P. Nadlinger, D. Main, B. C. Nichol, E. M. Ainley, D. Leichtle, A. Mantri, E. Kashefi, R. Srinivas, G. Araneda, C. J. Ballance and D. M. Lucas, 10 April 2024, Physical Review Letters. DOI: 10.1103/PhysRevLett.132.150604
Funding for the research came from the UK Quantum Computing and Simulation (QCS) Hub, with scientists from the UK National Quantum Computing Centre, the Paris-Sorbonne University, the University of Edinburgh, and the University of Maryland, collaborating on the work.
See more here:
Guaranteeing Security and Privacy: New Quantum Breakthrough Could Benefit Millions of People - SciTechDaily
Crossing the Quantum Threshold: The Path to 10,000 Qubits – HPCwire
Editor's Note: Why do qubit count and quality matter? What's the difference between physical qubits and logical qubits? Quantum computer vendors toss these terms and numbers around as indicators of the strengths of their systems. For seasoned quantum computing watchers, the rationale behind the claims is well known and appreciated. However, many people are new to quantum information science, and for them a qubit count/quality 101 backgrounder can be helpful. Here's a brief explanation from Yuval Boger of QuEra Computing. BTW, QuEra has a nice glossary of quantum terms on its website.
In recent months, several quantum companies have made roadmap announcements with plans to reach 10,000 physical qubits in the next five years or sooner. This is a dramatic increase from the current 20 to 300 qubits, especially given that several of these companies have yet to release their first product.
What makes 10,000 qubits such an important milestone, and what will quantum computers be capable of once that number is reached?
The effort to achieve 10,000 physical qubits in quantum computing is more than a mere pursuit of quantity; it embodies strategic milestones toward unlocking the full potential of quantum computation. Broadly speaking, 10,000 physical qubits allow for the practical realization of over 100 logical qubits, essential for performing longer, more complex computations with a lower chance of errors. Below, I explain the important distinction between physical and logical qubits, the significance of reaching and crossing the 100 logical qubit threshold, and the varied path different quantum computing implementations take to get there.
While increasing the number of qubits is good, increasing qubit quality is even more important. Key attributes of good qubits are the error rates associated with single- and two-qubit operations and the lifetime of the qubit. The error rate indicates how often qubit operations are successful. These might be operations on single qubits, such as flipping a qubit, or operations on two qubits, such as entangling them. The state-of-the-art in two-qubit operations is approaching 99.9% success. While 99.9% might sound great, this success rate implies that about 1 in 1,000 operations fails. Thus, if an algorithm requires several thousand two-qubit operations, it will likely produce incorrect results. Truly useful algorithms require millions of such operations.
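To see why a 99.9% gate success rate is not enough, here is a back-of-the-envelope sketch (illustrative numbers only) of the probability that a circuit with N two-qubit gates runs with no gate error at all:

```python
fidelity = 0.999                      # two-qubit gate success rate cited above
for n_gates in (1_000, 10_000, 1_000_000):
    p_no_error = fidelity ** n_gates
    print(f"{n_gates:>9,} gates -> P(no gate error) ~ {p_no_error:.3g}")
# ~0.37 at 1,000 gates, ~4.5e-05 at 10,000, effectively 0 at a million gates
```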
While pursuing 10,000 physical qubits is critical, it's imperative to acknowledge that effective quantum error correction is necessary, since it is unlikely that physical qubit error rates will improve enough to enable these longer, more complex algorithms. This is where logical qubits come in. A logical qubit is a collection of physical qubits that addresses this problem. By cleverly spreading the information from a single qubit across several qubits, detecting and correcting many errors becomes possible. The exact way to do so and the number of physical qubits required to create a good enough logical qubit is an active area of research, but depending on the desired error rate and the selected qubit technology, dozens, hundreds, or thousands of physical qubits will be required to create one good fault-tolerant logical qubit.
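As a hedged illustration of why the physical-per-logical overhead can range from dozens to thousands, the sketch below uses the commonly quoted surface-code scaling estimate (roughly 2d² physical qubits per logical qubit of code distance d, with logical error falling as (p/p_th)^((d+1)/2)). The constants are textbook approximations, not figures from this article.

```python
# Constants below are common textbook approximations, not figures from this article.
p_phys = 1e-3        # assumed physical error rate per operation
p_threshold = 1e-2   # approximate surface-code threshold

for d in (3, 7, 11, 15, 21):                      # code distance
    physical_per_logical = 2 * d * d              # rough qubit count (data + ancilla)
    p_logical = 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2)
    print(f"distance {d:2d}: ~{physical_per_logical:4d} physical qubits per logical, "
          f"logical error ~ {p_logical:.0e}")
```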
The transition from noisy, physical qubits to fault-tolerant, logical qubits is not merely technical; it's transformative, marking the difference between quantum computing as an experimental curiosity and a practical technological powerhouse. The leap toward 10,000 physical qubits is intrinsically aimed at enabling the construction of a significant number of logical qubits, with 100 being a critical milestone for demonstrating practical quantum advantage in various computational tasks.
One reason reaching 100 logical qubits is significant is the simulation limit. When simulating quantum algorithms, classical computers face exponential growth in computational requirements. Today's most powerful supercomputers can simulate quantum algorithms with about 50 perfect qubits; this is called the simulation limit. Thus, the ability to run algorithms with 100 logical, error-corrected qubits would open an exciting era in which quantum computers far exceed the computational capabilities of classical machines while also certifying that the calculation results are accurate. Achieving 100 logical qubits would signify the transition from theoretical or small-scale experimental quantum computing to practical, impactful applications, heralding a new era of computational capabilities.
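The simulation limit follows from simple arithmetic: a brute-force statevector simulation of n qubits stores 2^n complex amplitudes, so memory doubles with every added qubit. A small illustrative calculation, assuming 16 bytes per double-precision complex amplitude:

```python
BYTES_PER_AMPLITUDE = 16   # one double-precision complex number
for n in (30, 40, 50, 53):
    gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB of state vector")
# 30 qubits: 16 GiB (a laptop); 50 qubits: ~16.8 million GiB (16 PiB), supercomputer territory
```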
Imagine a plane with a range of 20 miles. Useful? Not really. Now imagine a plane with a 1,000-mile range. That would be useful for short-haul flights but not for longer trips. A plane with a 10,000-mile range? This is useful for most applications. Similarly, a 100-logical-qubit quantum computer can provide real business value for some applications, such as optimization or machine learning. Larger problems, such as molecular simulations, still require many more logical qubits. Those may require 1,000 logical qubits, while 4,000 logical qubits are expected to be required to crack RSA-2048.
Multiple paths to 10,000 qubits
The journey to 10,000 qubits is navigated through diverse quantum computing technologies, each with unique challenges and advantages:
Each of these technologies is on a unique path to overcoming their respective challenges, with the collective goal of achieving the scale necessary for practical quantum computing.
In conclusion, the quantum computing industry's roadmap toward 10,000 physical qubits, and thereby over 100 logical qubits, encapsulates both the challenges and the transformative potential of quantum computing. While the winning approach is yet to be determined, it appears that we are getting closer and closer to truly useful quantum computers.
Read more from the original source:
Crossing the Quantum Threshold: The Path to 10,000 Qubits - HPCwire
America is the undisputed world leader in quantum computing even though China spends 8x more on the technology … – Fortune
Processors that crunch through supercomputing tasks in the blink of an eye. Batteries that recharge in a flash. Accelerated drug discovery, encryption and decryption, and machine learning. These are just a few of the possibilities that may be enabled by quantum computing, which harnesses the laws of physics to perform calculations much faster than even the most powerful traditional computers. They all hinge on research here in the United States, the world's undisputed leader in quantum computing.
How did America become the epicenter of this technological revolution? It didn't happen by accident. Quantum computing and world-class U.S. research universities have grown hand in hand, fostered by a policy environment that encourages scientists and entrepreneurs to commercialize academic research.
Consider our quantum computing company, IonQ. As engineering and physics professors from Duke and the University of Maryland (UMD), we founded the company in 2015 using our research, which was largely funded by the Defense Department and the Intelligence Advanced Research Projects Activity (IARPA), a government organization investing in cutting-edge technology for the intelligence community. We've also received significant funding from the National Science Foundation, the National Institute of Standards and Technology (NIST), and the Department of Energy.
In 2020, we opened a 23,000-square-foot, $5.5 million center in College Park to house our state-of-the-art quantum machinery. The next year, IonQ was valued at $2 billion upon our IPO, and became the first publicly traded pure-play quantum hardware and software company.
Along with government financing, we owe much of our success to both UMD's and Duke's investment in our quantum research. UMD boasts more than 200 quantum researchers, including a Nobel laureate, at a joint institute shared between the university and NIST, and has awarded more than 100 doctorates in physics with a quantum focus. Duke recently established the only vertical quantum computing center in the world, which conducts research and development combining every stage of the quantum computing process, from assembling individual atoms and engineering their electronic controllers to designing quantum algorithms and applications.
But we also owe it to a little-known law, without which none of this would have been possible: the Bayh-Dole Act of 1980. Before its passage, the federal government owned the patents on inventions resulting from academic research that had received any amount of federal funding. However, the government lacked the capacity to further develop university breakthroughs, so the vast majority simply gathered dust on shelves.
Bayh-Dole allowed universities to own the patents on the inventions of their scientists, which has had a galvanizing impact. Suddenly, academic institutions were incentivized to license those patents to the private sector where they could be transformed into valuable goods and services, while stimulating entrepreneurship among the researchers who came up with those inventions in the first place.
Unfortunately, the federal government may soon undermine the Bayh-Dole system, which could massively stifle new advances in quantum computing. The Biden administration just announced that it seeks to use the law's march-in provision to impose price controls on inventions that were originally developed with federal funds if the "price at which the product is currently offered to the public [is] not reasonable." This notion arises from ignorance of the core value in entrepreneurship and commercialization: While the ideas are conceived and tested at universities using federal funding, it is the huge amount of effort invested by the licensee that turns those ideas and patents into useful products and services.
Abusing march-in wouldn't make new technologies more accessible for consumers or anyone else; it would do just the opposite. Devaluing the investment needed to turn these ideas into successful and practical products could disincentivize private-sector companies from taking risks by licensing university research in the first place.
When it comes to quantum computing, that chilling effect on research and development would enormously jeopardize U.S. national security. Our projects received ample funding from defense and intelligence agencies for good reason. Quantum computing may soon become the gold standard technology for codebreaking and defending large computer networks against cyberattacks.
Adopting the proposed march-in framework would also have major implications for our future economic stability. While still a nascent technology today, quantum computing's ability to rapidly process huge volumes of data is set to revolutionize business in the coming decades. It may be the only way to capture the complexity needed for future AI and machine learning in, say, self-driving vehicles. It may enable companies to hone their supply chains and other logistical operations, such as manufacturing, with unprecedented precision. It may also transform finance by allowing portfolio managers to create new, superior investment algorithms and strategies.
Given the technology's immense potential, it's no mystery why China committed what is believed to be more than $15 billion in 2022 to develop its quantum computing capacity, more than double the budget for quantum computing of EU countries and eight times what the U.S. government plans to spend.
Thankfully, the U.S. still has a clear edge in quantum computing, for now. Our universities attract far more top experts and leaders in the field than any other nation's, including China's, by a wide margin. Our entrepreneurial startup culture, often bred from the innovation of our universities, is the envy of the world. And unlike Europe, our government incentivizes risk-taking and entrepreneurship through public-private partnerships.
However, if the Biden administration dismantles the law that makes this collaboration possible, there's no guarantee that our global dominance in quantum computing will persist in the long term. That would have devastating second-order effects on our national security and economic future. Computer scientists, ordinary Americans, and the intelligence and defense communities can only hope our officials rethink their proposal.
Jungsang Kim is a professor of ECE and physics at Duke University. Christopher Monroe is a professor of ECE and physics at Duke University and the University of Maryland, College Park. In 2015 they co-founded IonQ, Inc., the first publicly traded pure-play quantum hardware and software company.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.
Go here to read the rest:
America is the undisputed world leader in quantum computing even though China spends 8x more on the technology ... - Fortune
AI and Quantum Computing: High Risks or Big Boons to Fintech? – InformationWeek
Fintech startups and even incumbent banks continue to explore ways to leverage widely popular artificial intelligence for a host of tasks.
This includes producing and personalizing policy documents, the extraction of information from documents, and communication with customers. AI could be tasked to work with big data, which banks have plenty of, with generative AI also being put to work. There are concerns, however, about AI potentially introducing hallucinations into processes as well as the potential for bad actors to use AI to assail the security of banks and smaller fintechs.
The risks could be further compounded if quantum-powered AI, a potential future tech tag team, gets into the wrong hands -- a nightmare scenario where current encryption protection might be at risk of becoming vulnerable.
In the latest episode of DOS Won't Hunt, Doug Hathaway, vice president of engineering with Versapay; Prashant Kelker, chief strategy officer and partner with ISG; and Sitaram Iyer, senior director of cloud native solutions with Venafi, discuss how innovations that could transform fintech might also require conversations about guardrails and safeguards as technologies converge. Though quantum computing is still down the road, AI is making moves here and now, including in fintech.
Related:AI, Bitcoin, and Distilled Spirits at New York Fintech Week
Listen to the full podcast here.
Read the rest here:
AI and Quantum Computing: High Risks or Big Boons to Fintech? - InformationWeek
SNOLAB collaborating on quantum computing research – Sudbury.com
World's deepest cleanroom will be used to study how radiation affects qubits in quantum computers
SNOLAB, Sudbury's underground science research facility, is partnering with two other organizations to study how radiation impacts quantum computing.
Researchers at SNOLAB, located two kilometres below the surface at Creighton Mine, are teaming up with researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo and Chalmers University of Technology in Sweden on the work.
"SNOLAB maintains the lowest muon flux in the world and advanced cryogenics testing capabilities, making it an ideal place to conduct valuable research on quantum technologies," Dr. Jeter Hall, director of research at SNOLAB, said in a news release. "In addition, SNOLAB's next generation dark matter experiments promise to be early adopters of quantum technology, so we have multiple, vested interests in the outcome of this project."
Titled Advanced Characterization and Mitigation of Qubit Decoherence in a Deep Underground Environment, the research is sponsored by the Army Research Office, a directorate of the U.S. Combat Capabilities Development Command's Army Research Laboratory.
The grant was awarded to Dr. Chris Wilson, a faculty member at IQC and professor in Waterloo's Department of Electrical and Computer Engineering, alongside SNOLAB director Hall and Dr. Per Delsing, a professor at Chalmers University and director of the Wallenberg Center for Quantum Technology.
"By partnering with the experts in dark matter and cosmic radiation at SNOLAB, we can bring together their expertise and strengths with the superconducting qubit skills we have at IQC and Chalmers," said Wilson. "We're also able to connect to the quantum communities and funding within the United States while showcasing the unique facilities and capabilities in Canada's scientific ecosystem."
But what does it all mean?
Just as classical computers use something called a bit to describe a unit of digital information, quantum computing uses something similar: a qubit, or quantum bit. Digital bits and quantum qubits don't behave the same way, though, and qubits are susceptible to error when hit by high-energy particles, such as cosmic rays or radioactivity.
This results in an error hotspot, which spreads out to neighbouring qubits, and has been seen happening at a rate of about once every 10 seconds, setting an upper limit on quantum calculation time, SNOLAB said in a news release.
While digital computers can use design rules and error correction to account for these high-energy particles, the same is not true in quantum computing. Sometimes all the qubits will suffer errors in response to radiation, creating a challenge known as decoherence, where the qubit loses its quantum state.
"With this project, we hope to start understanding what's going on with the qubit decoherence in relation to cosmic rays, and then start understanding how the radiation affects the qubits in more controlled ways," Wilson said.
Using the Canadian Shield to create a low-background environment, SNOLAB's unique setting allows the research collaboration to isolate the qubits from the cosmic radiation at the surface.
High-quality superconducting qubits will be manufactured in the fabrication facilities at Chalmers University, and then tested at the surface in both Sweden and Waterloo, as well as underground at SNOLAB to study the differences in each environment, SNOLAB said.
"We are super excited about this project, since it addresses the very important issue of how cosmic radiation affects quantum bits and quantum processors," said Delsing, the researcher from Chalmers. "Getting access to the underground facility at SNOLAB is crucial to understand how the effects of cosmic radiation can be mitigated."
Continue reading here:
SNOLAB collaborating on quantum computing research - Sudbury.com
Quantum Leap: Google’s Sycamore and the New Frontier in Computing – WebProNews
In the ever-accelerating race of technological advancement, quantum computing is the new frontier, promising to revolutionize our approach to complex problem-solving that current supercomputers cannot efficiently address. At the forefront of this quantum revolution is Google's quantum computer, Sycamore, which achieved a milestone known as quantum supremacy in 2019 by performing a complex computation in 200 seconds that would take the world's most powerful classical computer approximately 10,000 years to complete.
The Quantum Difference
Traditional computers use bits as the basic unit of data, which are binary and can represent either a 0 or a 1. Quantum computers, like Sycamore, however, use qubits that can represent both 0 and 1 simultaneously thanks to the principle of superposition. This ability allows quantum computers to handle more information than classical computers and quickly solve complex problems.
Sycamore has 54 qubits, although one was inactive during its historic feat, leaving 53 to do the work. These qubits are made from superconducting circuits that can be controlled and read electronically. The arrangement of these qubits in a two-dimensional grid enhances their connectivity, which is crucial for executing complex quantum algorithms.
The video bloggers at LifesBiggestQuestions recently explored what the future has in store for Google Quantum Computer Sycamore.
Challenges of Quantum Computing
Despite their potential, quantum systems like Sycamore are not without their challenges. They are extremely sensitive and prone to errors. The quantum gates, which are operations on qubits, must maintain a critically low error rate, which is pivotal for preserving the integrity of computations. These systems require an ultra-cold environment to operate effectively, achieved through sophisticated cooling systems, notably dilution refrigerators that use helium isotopes to reach temperatures close to absolute zero.
This cooling is not only about achieving low temperatures but also about isolating the qubits from external disturbances like cosmic rays or stray photons, which can cause quantum decoherence, a loss of the orderly quantum state that qubits need to perform computations.
Energy Efficiency and Future Applications
One of the surprising elements of quantum computing, particularly highlighted by Sycamore's operation, is its energy efficiency. Unlike classical supercomputers that can consume up to 10 megawatts of power, quantum computers use significantly less power for computational tasks. Most of the energy is used to maintain the operational environment of the quantum processor rather than to perform the computations themselves.
The potential applications for quantum computing are vast and include fields like material science and complex system simulations, which are currently not feasible with classical computers due to the computational load.
Looking Ahead
As we advance further into quantum computing, the technology promises to expand our computational capacity and enhance energy efficiency and sustainability. However, as with all emerging technologies, quantum computing presents new challenges and risks, particularly in cybersecurity and privacy. Quantum computers could, theoretically, crack encryption systems that currently protect our most sensitive data, prompting a need for quantum-resistant cryptographic methods.
Ethical and Safety Considerations
The advent of quantum computing also underscores the need for robust ethical guidelines and safety measures to mitigate risks associated with advanced computing capabilities. This includes potential misuse in creating sophisticated weaponry or personal and national security threats. Transparent international collaboration and regulation will be critical in shaping the safe development of quantum technologies.
In conclusion, while quantum computing, like Googles Sycamore, represents a monumental leap forward, it compels us to navigate the associated risks carefully. The journey into quantum computing is about harnessing new technology and ensuring it contributes positively to society, bolstering security rather than undermining it. As this technology continues to develop, it will require innovation and a balanced approach to harness its full potential while safeguarding against its inherent risks.
Read this article:
Quantum Leap: Google's Sycamore and the New Frontier in Computing - WebProNews