Category Archives: Quantum Computing

Commodore 64 claimed to outperform IBM's quantum system: sarcastic researchers say 1 MHz computer is faster … – Tom's Hardware

A paper released during the SIGBOVIK 2024 conference details an attempt to simulate the IBM quantum utility experiment on a Commodore 64. The idea might seem preposterous: pitting a 40-year-old home computer against a device powered by a 127-qubit Eagle quantum processing unit (QPU). However, the anonymous researcher(s) conclude that the Qommodore 64 performed faster, and more efficiently, than IBM's pride-and-joy, while being decently accurate on this problem.

At the beginning of the paper, the researchers admit that their Qommodore 64 project is a joke, but, sadly for IBM, its proof of quantum utility was also built upon shaky foundations, and the Qommodore 64 team came up with some convincing-looking benchmarks. There was some controversy about IBM's claims at the time, and we are reminded that it took just five days for the quantum experiment to be simulated on an ordinary MacBook M1 Pro laptop. The jokey "Quantum Disadvantage" paper (PDF link, headlining section starts at page 199) ports this experiment to a machine packing the far more humble MOS Technology 6510 processor.


To get deep into the weeds with the quantum theory and math behind the quantum utility experiment, please follow the above PDF link. However, to summarize, the C64-based experiment uses the sparse Pauli dynamics technique developed by Begušić, Hejazi, and Chan to approximate the behavior of ferromagnetic materials. "Famously, IBM claimed such calculations were too difficult to perform on a classical computer to an acceptable accuracy, using the leading approximation techniques," recalls the paper. Not quite, and as already mentioned above, an ordinary laptop can obtain similar results.
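For readers curious what sparse Pauli dynamics actually does, here is a minimal toy sketch of the core idea: evolve an observable through a circuit in the Heisenberg picture as a weighted sum of Pauli strings, and keep the sum "sparse" by discarding terms whose coefficients fall below a cutoff. This is our own illustrative reconstruction, not the Qommodore 64 code or the Begušić-Hejazi-Chan implementation; the gates, angles, and cutoff below are arbitrary.

```python
# Toy sparse Pauli dynamics: Heisenberg-picture propagation with truncation.
import math

# Single-qubit Pauli products: (phase, result) for a*b.
MUL = {
    ('I','I'):(1,'I'), ('I','X'):(1,'X'), ('I','Y'):(1,'Y'), ('I','Z'):(1,'Z'),
    ('X','I'):(1,'X'), ('X','X'):(1,'I'), ('X','Y'):(1j,'Z'), ('X','Z'):(-1j,'Y'),
    ('Y','I'):(1,'Y'), ('Y','X'):(-1j,'Z'), ('Y','Y'):(1,'I'), ('Y','Z'):(1j,'X'),
    ('Z','I'):(1,'Z'), ('Z','X'):(1j,'Y'), ('Z','Y'):(-1j,'X'), ('Z','Z'):(1,'I'),
}

def commutes(p, q):
    """Pauli strings commute iff they differ on an even number of non-identity sites."""
    return sum(1 for a, b in zip(p, q) if a != 'I' and b != 'I' and a != b) % 2 == 0

def product(p, q):
    """Return (phase, r) such that p*q = phase * r for Pauli strings p, q."""
    phase, out = 1, []
    for a, b in zip(p, q):
        ph, r = MUL[(a, b)]
        phase *= ph
        out.append(r)
    return phase, ''.join(out)

def apply_rotation(terms, p, theta, cutoff=1e-4):
    """Conjugate observable terms by exp(-i*theta/2 * P):
    Q -> cos(theta)*Q + i*sin(theta)*P*Q when P and Q anticommute.
    Terms with tiny coefficients are dropped -- the 'sparse' in sparse Pauli dynamics."""
    new = {}
    for q, c in terms.items():
        if commutes(p, q):
            new[q] = new.get(q, 0) + c
        else:
            ph, r = product(p, q)
            new[q] = new.get(q, 0) + c * math.cos(theta)
            new[r] = new.get(r, 0) + c * math.sin(theta) * (1j * ph)
    return {q: c for q, c in new.items() if abs(c) > cutoff}

# Observable <Z0> on three qubits, pushed through one layer of ZZ and X rotations.
obs = {'ZII': 1.0}
for gate, angle in [('ZZI', 0.3), ('XII', 0.7), ('IXI', 0.7)]:
    obs = apply_rotation(obs, gate, angle)
print(obs)  # the surviving Pauli terms and their coefficients
```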

The anonymous C64 user(s) provide some interesting details of their quantum-defeating feat. Their aggressively truncated, shallow depth-first search model used just 15 kB of the spacious 64 kB available on the iconic Commodore machine. Meanwhile, the final code consisted of about 2,500 lines of 6502 assembly, stored on a cartridge that fitted into the C64's expansion port. This code was handled by the mighty 1 MHz 8-bit MOS 6510 CPU. The C64 took approximately four minutes per data point. (Testing the same code on a modern laptop achieved roughly 800 s per data point.)

In conclusion, the researcher(s) assert that the Qommodore 64 "is faster than the quantum device (datapoint-for-datapoint), it is much more energy efficient, and it is decently accurate on this problem." On the topic of how applicable this research is to other quantum problems, it is snarkily suggested that it "probably won't work on almost any other problem (but then again, neither do quantum computers right now)." Overall, it is difficult to know whether the results are entirely genuine, though a lot of detail is provided and the research references linked in the paper seem genuine.

We know many readers are retro computing enthusiasts, as well as DIYers and makers. So it is good to know that the author(s) of this paper say they will provide source code to allow others to replicate their results. However, source code will only be supplied in one of three formats, they say: a copy handwritten on papyrus, a slide-show of blurry screenshots recorded on a VHS tape, or, in the author's words, "that I dictate it to you personally over the phone." So please take this story with an extra pinch of salt.


Read more from the original source:
Commodore 64 claimed to outperform IBM's quantum system: sarcastic researchers say 1 MHz computer is faster ... - Tom's Hardware

Guaranteeing Security and Privacy: New Quantum Breakthrough Could Benefit Millions of People – SciTechDaily

The process allows a remote user (right) to access a quantum computer in the cloud (left) with complete security. Credit: Helene Hainzer, Oxford University Physics.

A recent breakthrough guaranteeing security and privacy by Oxford University physicists could enable millions of people and businesses to tap into the capabilities of next-generation quantum computing. This advance promises to unlock the transformative potential of cloud-based quantum computing and is detailed in a new study published in the influential U.S. scientific journal Physical Review Letters.

Professor David Lucas, co-head of the Oxford University Physics research team and lead scientist at the UK Quantum Computing and Simulation Hub. Credit: Martin Small

Quantum computing is developing rapidly, paving the way for new applications that could transform services in many areas like healthcare and financial services. It works in a fundamentally different way to conventional computing and is potentially far more powerful. However, it currently requires controlled conditions to remain stable and there are concerns around data authenticity and the effectiveness of current security and encryption systems.

Several leading providers of cloud-based services, like Google, Amazon, and IBM, already separately offer some elements of quantum computing. Safeguarding the privacy and security of customer data is a vital precursor to scaling up and expanding its use and for the development of new applications as the technology advances. The new study by researchers at Oxford University Physics addresses these challenges.

"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.

Experiments on quantum computing in the Beecroft facility, Oxford University Physics. Credit: David Nadlinger, Oxford University Physics.

In the new study, the researchers use an approach dubbed "blind quantum computing", which connects two totally separate quantum computing entities (potentially an individual at home or in an office accessing a cloud server) in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations.

Peter Drmota, author of the new study who led the experiments on blind quantum computing at Oxford University Physics. Credit: Martin Small.

"Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr Peter Drmota, of Oxford University Physics.

The researchers created a system comprising a fiber network link between a quantum computing server and a simple device that detects photons, or particles of light, at an independent computer remotely accessing its cloud services. This allows so-called blind quantum computing over a network. Every computation incurs a correction that must be applied to all subsequent ones, which requires real-time information to keep pace with the algorithm. The researchers used a unique combination of quantum memory and photons to achieve this.
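To make that feed-forward requirement concrete, here is a toy sketch of the classical control flow in measurement-based blind quantum computing, in the style of the Broadbent-Fitzsimons-Kashefi (UBQC) protocol on a linear cluster. It is emphatically not the Oxford trapped-ion implementation: the "server" below returns random bits instead of genuine quantum measurement outcomes, and the angle bookkeeping is the standard textbook pattern, shown only to illustrate how each outcome corrects every subsequent instruction in real time.

```python
# Client-side feed-forward in blind, measurement-based quantum computing (toy).
import random, math

def server_measure(delta):
    # Stand-in for the quantum server: it learns only the padded angle delta
    # and returns a measurement outcome. No quantum simulation here.
    return random.randint(0, 1)

def blind_compute(phi):
    """phi: the client's secret measurement angles for a linear cluster."""
    n = len(phi)
    theta = [random.choice([k * math.pi / 4 for k in range(8)]) for _ in range(n)]
    r = [random.randint(0, 1) for _ in range(n)]   # one-time pads for outcomes
    s = [0] * n                                    # corrected outcomes so far
    for j in range(n):
        sx = s[j - 1] if j >= 1 else 0             # X-correction from previous qubit
        sz = s[j - 2] if j >= 2 else 0             # Z-correction from the one before
        # The reported angle hides phi[j] behind theta[j] and r[j], while folding
        # in the corrections demanded by earlier measurement outcomes:
        delta = ((-1) ** sx) * phi[j] + sz * math.pi + theta[j] + r[j] * math.pi
        b = server_measure(delta)
        s[j] = b ^ r[j]                            # un-pad the outcome
    return s

print(blind_compute([0.0, math.pi / 2, math.pi / 4]))
```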

"Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence," said Professor David Lucas. "As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect."

The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.

Researchers exploring quantum computing and technologies at Oxford University Physics have access to the state-of-the-art Beecroft laboratory facility, specially constructed to create stable and secure conditions, including the elimination of vibration.

Reference: "Verifiable Blind Quantum Computing with Trapped Ions and Single Photons" by P. Drmota, D. P. Nadlinger, D. Main, B. C. Nichol, E. M. Ainley, D. Leichtle, A. Mantri, E. Kashefi, R. Srinivas, G. Araneda, C. J. Ballance and D. M. Lucas, 10 April 2024, Physical Review Letters. DOI: 10.1103/PhysRevLett.132.150604

Funding for the research came from the UK Quantum Computing and Simulation (QCS) Hub, with scientists from the UK National Quantum Computing Centre, the Paris-Sorbonne University, the University of Edinburgh, and the University of Maryland collaborating on the work.

See more here:
Guaranteeing Security and Privacy: New Quantum Breakthrough Could Benefit Millions of People - SciTechDaily

Crossing the Quantum Threshold: The Path to 10,000 Qubits – HPCwire

Editor's Note: Why do qubit count and quality matter? What's the difference between physical qubits and logical qubits? Quantum computer vendors toss these terms and numbers around as indicators of the strengths of their systems. For seasoned quantum computing watchers, the rationale behind the claims is well known and appreciated. However, there are many who are new to quantum information science and for whom a qubit count/quality 101 backgrounder can be helpful. Here's a brief explanation from Yuval Boger of QuEra Computing. BTW, QuEra has a nice glossary of quantum terms on its website.

In recent months, several quantum companies have made roadmap announcements with plans to reach 10,000 physical qubits in the next five years or sooner. This is a dramatic increase from the current 20 to 300 qubits, especially given that several of these companies have yet to release their first product.

What makes 10,000 qubits such an important milestone, and what will quantum computers be capable of once that number is reached?

The effort to achieve 10,000 physical qubits in quantum computing is more than a mere pursuit of quantity; it embodies strategic milestones toward unlocking the full potential of quantum computation. Broadly speaking, 10,000 physical qubits allow for the practical realization of over 100 logical qubits, essential for performing longer, more complex computations with a lower chance of errors. Below, I explain the important distinction between physical and logical qubits, the significance of reaching and crossing the 100-logical-qubit threshold, and the varied paths different quantum computing implementations take to get there.

While increasing the number of qubits is good, increasing qubit quality is even more important. Key attributes of good qubits are the error rates associated with single- and two-qubit operations, and the lifetime of a qubit. The error rate indicates how often qubit operations fail. These might be operations on single qubits, such as flipping a qubit, or operations on two qubits, such as entangling them. The state of the art in two-qubit operations is approaching 99.9% success. While 99.9% might sound great, this success rate implies that about 1 in 1,000 operations fails. Thus, if an algorithm requires several thousand two-qubit operations, it will likely produce incorrect results. Truly useful algorithms require millions of such operations.
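A quick back-of-the-envelope check of that claim, assuming independent gate errors (a simplification):

```python
# With 99.9%-fidelity two-qubit gates, the chance an algorithm runs error-free
# falls off exponentially with the number of operations.
for n_ops in (1_000, 10_000, 1_000_000):
    p_clean = 0.999 ** n_ops          # independent-error assumption
    print(f"{n_ops:>9,} ops -> P(no error) ~ {p_clean:.3g}")
# ~0.37 at 1,000 ops, ~4.5e-5 at 10,000, effectively zero at 1,000,000
```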

While pursuing 10,000 physical qubits is critical, it's imperative to acknowledge that effective quantum error correction is necessary, since it is unlikely that physical qubit error rates will improve enough to enable these longer, more complex algorithms. This is where logical qubits come in. A logical qubit is a collection of physical qubits that addresses this problem. By cleverly spreading the information from a single qubit across several qubits, detecting and correcting many errors becomes possible. Exactly how to do so, and how many physical qubits are required to create a good-enough logical qubit, is an active area of research; depending on the desired error rate and the selected qubit technology, dozens, hundreds, or thousands of physical qubits will be required to create one good fault-tolerant logical qubit.
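As a rough illustration of that trade-off, a commonly quoted surface-code heuristic puts the logical error rate near 0.1·(p/p_th)^((d+1)/2) at code distance d, with roughly 2d² physical qubits per logical qubit. The constants below are assumptions chosen for illustration, not figures from the article:

```python
# Error suppression is bought with qubit count (standard surface-code scaling).
p, p_th = 1e-3, 1e-2                  # assumed physical error rate vs threshold
for d in (3, 7, 11, 15):
    p_log = 0.1 * (p / p_th) ** ((d + 1) / 2)
    print(f"d={d:>2}: ~{2*d*d:>4} physical qubits/logical, p_logical ~ {p_log:.1e}")
# Distance 3 needs ~18 physical qubits; distance 15 needs ~450 but suppresses
# the logical error rate to ~1e-9 -- hence "dozens, hundreds, or thousands".
```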

The transition from noisy, physical qubits to fault-tolerant, logical qubits is not merely technical; it's transformative, marking the difference between quantum computing as an experimental curiosity and a practical technological powerhouse. The leap toward 10,000 physical qubits is intrinsically aimed at enabling the construction of a significant number of logical qubits, with 100 being a critical milestone for demonstrating practical quantum advantage in various computational tasks.

One reason reaching 100 logical qubits is significant is the simulation limit. When simulating quantum algorithms, classical computers face exponential growth in computational requirements. Today's most powerful supercomputers can simulate quantum algorithms with about 50 perfect qubits; this is called the simulation limit. Thus, the ability to run algorithms with 100 logical, error-corrected qubits would open an exciting era in which quantum computers far exceed the computational capabilities of classical machines while also certifying that the calculation results are accurate. Achieving 100 logical qubits would signify the transition from theoretical or small-scale experimental quantum computing to practical, impactful applications, heralding a new era of computational capabilities.
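The simulation limit follows from simple arithmetic: a brute-force classical simulation stores 2^n complex amplitudes, each taking 16 bytes at double precision, so memory doubles with every added qubit:

```python
# Why ~50 qubits marks the brute-force statevector simulation limit.
for n in (30, 40, 50, 60):
    nbytes = (2 ** n) * 16            # 2^n amplitudes, 16 bytes each
    print(f"{n} qubits: 2^{n} amplitudes -> {nbytes / 2**30:,.0f} GiB")
# 30 qubits fit in a laptop (16 GiB); 50 qubits need ~16 million GiB,
# i.e. the memory of the largest supercomputers; 60 is out of reach.
```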

Imagine a plane with a range of 20 miles. Useful? Not really. Now imagine a plane with a 1,000-mile range. That would be useful for short-haul flights but not for longer trips. A plane with a 10,000-mile range? This is useful for most applications. Similarly, a 100-logical-qubit quantum computer can provide real business value for some applications, such as optimization or machine learning. Larger problems, such as molecular simulations, still require many more logical qubits. Those may require 1,000 logical qubits, while 4,000 logical qubits are expected to be required to crack RSA-2048.

Multiple paths to 10,000 qubits

The journey to 10,000 qubits is being navigated through diverse quantum computing technologies, each with its own challenges and advantages.

Each of these technologies is on a unique path to overcoming its respective challenges, with the collective goal of achieving the scale necessary for practical quantum computing.

In conclusion, the quantum computing industry's roadmap toward 10,000 physical qubits, and thereby more than 100 logical qubits, encapsulates both the challenges and the transformative potential of quantum computing. While the winning approach is yet to be determined, it appears that we are getting closer and closer to truly useful quantum computers.

Read more from the original source:
Crossing the Quantum Threshold: The Path to 10,000 Qubits - HPCwire

America is the undisputed world leader in quantum computing even though China spends 8x more on the technology … – Fortune

Processors that crunch through supercomputing tasks in the blink of an eye. Batteries that recharge in a flash. Accelerated drug discovery, encryption and decryption, and machine learning. These are just a few of the possibilities that may be enabled by quantum computing, which harnesses the laws of physics to perform calculations much faster than even the most powerful traditional computers. They all hinge on research here in the United States, the world's undisputed leader in quantum computing.

How did America become the epicenter of this technological revolution? It didn't happen by accident. Quantum computing and world-class U.S. research universities have grown hand in hand, fostered by a policy environment that encourages scientists and entrepreneurs to commercialize academic research.

Consider our quantum computing company, IonQ. As engineering and physics professors from Duke and the University of Maryland (UMD), we founded the company in 2015 using our research, which was largely funded by the Defense Department and the Intelligence Advanced Research Projects Activity (IARPA), a government organization investing in cutting-edge technology for the intelligence community. We've also received significant funding from the National Science Foundation, the National Institute of Standards and Technology (NIST), and the Department of Energy.

In 2020, we opened a 23,000-square-foot, $5.5 million center in College Park to house our state-of-the-art quantum machinery. The next year, IonQ was valued at $2 billion upon our IPO and became the first publicly traded pure-play quantum hardware and software company.

Along with government financing, we owe much of our success to both UMD's and Duke's investment in our quantum research. UMD boasts more than 200 quantum researchers, including a Nobel laureate, at a joint institute shared between the university and NIST, and has awarded more than 100 doctorates in physics with a quantum focus. Duke recently established the only vertical quantum computing center in the world, which conducts research and development combining every stage of the quantum computing process: from assembling individual atoms and engineering their electronic controllers to designing quantum algorithms and applications.

But we also owe it to a little-known law, without which none of this would have been possible: the Bayh-Dole Act of 1980. Before its passage, the federal government owned the patents on inventions resulting from academic research that had received any amount of federal funding. However, the government lacked the capacity to further develop university breakthroughs, so the vast majority simply gathered dust on shelves.

Bayh-Dole allowed universities to own the patents on the inventions of their scientists, which has had a galvanizing impact. Suddenly, academic institutions were incentivized to license those patents to the private sector where they could be transformed into valuable goods and services, while stimulating entrepreneurship among the researchers who came up with those inventions in the first place.

Unfortunately, the federal government may soon undermine the Bayh-Dole system, which could massively stifle new advances in quantum computing. The Biden administration just announced that it seeks to use the law's march-in provision to impose price controls on inventions that were originally developed with federal funds if "the price at which the product is currently offered to the public [is] not reasonable." This notion arises from ignorance of the core value in entrepreneurship and commercialization: While the ideas are conceived and tested at universities using federal funding, it is the huge amount of effort invested by the licensee that turns those ideas and patents into useful products and services.

Abusing march-in wouldn't make new technologies more accessible for consumers or anyone else; it would do just the opposite. Devaluing the investment needed to turn these ideas into successful and practical products could disincentivize private-sector companies from taking risks by licensing university research in the first place.

When it comes to quantum computing, that chilling effect on research and development would enormously jeopardize U.S. national security. Our projects received ample funding from defense and intelligence agencies for good reason. Quantum computing may soon become the gold standard technology for codebreaking and defending large computer networks against cyberattacks.

Adopting the proposed march-in framework would also have major implications for our future economic stability. While still a nascent technology today, quantum computing's ability to rapidly process huge volumes of data is set to revolutionize business in the coming decades. It may be the only way to capture the complexity needed for future AI and machine learning in, say, self-driving vehicles. It may enable companies to hone their supply chains and other logistical operations, such as manufacturing, with unprecedented precision. It may also transform finance by allowing portfolio managers to create new, superior investment algorithms and strategies.

Given the technology's immense potential, it's no mystery why China committed what is believed to be more than $15 billion in 2022 to develop its quantum computing capacity, more than double the budget for quantum computing of EU countries and eight times what the U.S. government plans to spend.

Thankfully, the U.S. still has a clear edge in quantum computing, for now. Our universities attract far more top experts and leaders in the field than any other nation's, including China's, by a wide margin. Our entrepreneurial startup culture, often bred from the innovation of our universities, is the envy of the world. And unlike Europe, our government incentivizes risk-taking and entrepreneurship through public-private partnerships.

However, if the Biden administration dismantles the law that makes this collaboration possible, there's no guarantee that our global dominance in quantum computing will persist in the long term. That would have devastating second-order effects on our national security and economic future. Computer scientists, ordinary Americans, and the intelligence and defense communities can only hope our officials rethink their proposal.

Jungsang Kim is a professor of ECE and physics at Duke University. Christopher Monroe is a professor of ECE and physics at Duke University and the University of Maryland, College Park. In 2015 they co-founded IonQ, Inc., the first publicly traded pure-play quantum hardware and software company.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.

Go here to read the rest:
America is the undisputed world leader in quantum computing even though China spends 8x more on the technology ... - Fortune

AI and Quantum Computing: High Risks or Big Boons to Fintech? – InformationWeek

Fintech startups and even incumbent banks continue to explore ways to leverage widely popular artificial intelligence for a host of tasks.

This includes producing and personalizing policy documents, extracting information from documents, and communicating with customers. AI could be tasked to work with big data, which banks have plenty of, with generative AI also being put to work. There are concerns, however, about AI potentially introducing hallucinations into processes, as well as the potential for bad actors to use AI to assail the security of banks and smaller fintechs.

The risks could be further compounded if quantum-powered AI, a potential future tech tag team, gets into the wrong hands -- a nightmare scenario in which current encryption protections could become vulnerable.

In the latest episode of DOS Won't Hunt, Doug Hathaway, vice president of engineering with Versapay; Prashant Kelker, chief strategy officer and partner with ISG; and Sitaram Iyer, senior director of cloud native solutions with Venafi, discuss how innovations that could transform fintech might also require conversations about guardrails and safeguards as technologies converge. Though quantum computing is still down the road, AI is making moves here and now, including in fintech.


Listen to the full podcast here.

Read the rest here:
AI and Quantum Computing: High Risks or Big Boons to Fintech? - InformationWeek

SNOLAB collaborating on quantum computing research – Sudbury.com

World's deepest cleanroom will be used to study how radiation affects qubits in quantum computers

SNOLAB, Sudbury's underground science research facility, is partnering with two other organizations to study how radiation impacts quantum computing.

Researchers at SNOLAB, located two kilometres below the surface at Creighton Mine, are teaming up with researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo and Chalmers University of Technology in Sweden on the work.

"SNOLAB maintains the lowest muon flux in the world and advanced cryogenics testing capabilities, making it an ideal place to conduct valuable research on quantum technologies," Dr. Jeter Hall, director of research at SNOLAB, said in a news release. "In addition, SNOLAB's next-generation dark matter experiments promise to be early adopters of quantum technology, so we have multiple, vested interests in the outcome of this project."

Titled "Advanced Characterization and Mitigation of Qubit Decoherence in a Deep Underground Environment," the research is sponsored by the Army Research Office, a directorate of the U.S. Combat Capabilities Development Command's Army Research Laboratory.

The grant was awarded to Dr. Chris Wilson, a faculty member at IQC and professor in Waterloo's Department of Electrical and Computer Engineering, alongside SNOLAB research director Hall and Dr. Per Delsing, a professor at Chalmers University and director of the Wallenberg Center for Quantum Technology.

"By partnering with the experts in dark matter and cosmic radiation at SNOLAB, we can bring together their expertise and strengths with the superconducting qubit skills we have at IQC and Chalmers," said Wilson. "We're also able to connect to the quantum communities and funding within the United States while showcasing the unique facilities and capabilities in Canada's scientific ecosystem."

But what does it all mean?

Just as classical computers use something called a bit to describe a unit of digital information, quantum computing uses something similar: a qubit, or quantum bit. Digital bits and quantum qubits don't behave the same way, though, and qubits are susceptible to errors when hit by high-energy particles, such as cosmic rays or radioactivity.

"This results in an error hotspot, which spreads out to neighbouring qubits, and has been seen happening at a rate of about once every 10 seconds, setting an upper limit on quantum calculation time," SNOLAB said in a news release.
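A rough feel for why that rate caps calculation time, assuming radiation strikes arrive at random (a Poisson process, our simplification) with a 10-second mean interval:

```python
# Probability a computation of duration T escapes a radiation-induced error
# hotspot, given random strikes with a 10 s mean interval: exp(-T/10).
import math
for T in (0.1, 1.0, 10.0, 60.0):
    print(f"T={T:>5} s: P(survive) ~ {math.exp(-T / 10):.3f}")
# A 60-second computation survives untouched only ~0.2% of the time.
```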

While digital computers can use design rules and error correction to account for these high-energy particles, the same is not yet true in quantum computing. Sometimes all the qubits will suffer errors in response to radiation, creating a challenge known as decoherence, in which a qubit loses its quantum state.

"With this project, we hope to start understanding what's going on with the qubit decoherence in relation to cosmic rays, and then start understanding how the radiation affects the qubits in more controlled ways," Wilson said.

Using the Canadian Shield to create a low-background environment, SNOLAB's unique setting allows the research collaboration to isolate the qubits from the cosmic radiation present at the surface.

High-quality superconducting qubits will be manufactured in the fabrication facilities at Chalmers University, and then tested at the surface in both Sweden and Waterloo, as well as underground at SNOLAB to study the differences in each environment, SNOLAB said.

"We are super excited about this project, since it addresses the very important issue of how cosmic radiation affects quantum bits and quantum processors," said Delsing, the researcher from Chalmers. "Getting access to the underground facility at SNOLAB is crucial to understand how the effects of cosmic radiation can be mitigated."

Continue reading here:
SNOLAB collaborating on quantum computing research - Sudbury.com

Quantum Leap: Google’s Sycamore and the New Frontier in Computing – WebProNews

In the ever-accelerating race of technological advancement, quantum computing is the new frontier, promising to revolutionize our approach to complex problems that current supercomputers cannot efficiently address. At the forefront of this quantum revolution is Google's quantum computer, Sycamore, which achieved a milestone known as quantum supremacy in 2019 by performing in 200 seconds a computation that would take the world's most powerful classical supercomputer approximately 10,000 years to complete.

The Quantum Difference

Traditional computers use bits as the basic unit of data; bits are binary and can represent either a 0 or a 1. Quantum computers like Sycamore, however, use qubits, which can represent both 0 and 1 simultaneously thanks to the principle of superposition. This ability allows quantum computers to handle more information than classical computers and to solve certain complex problems quickly.
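A one-qubit toy model makes this concrete: a qubit's state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measurement collapses it to 0 with probability |a|² or 1 with probability |b|². This is generic textbook quantum mechanics, not Sycamore-specific code:

```python
# Simulate repeated measurements of a qubit in an equal superposition,
# such as the state produced by a Hadamard gate.
import math, random

a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)   # amplitudes: |a|^2 + |b|^2 = 1
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[0 if random.random() < abs(a) ** 2 else 1] += 1
print(counts)   # roughly 5000 / 5000 -- each outcome with probability 1/2
```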

Sycamore has 54 qubits, although one was inactive during its historic feat, leaving 53 to do the work. These qubits are made from superconducting circuits that can be controlled and read electronically. The arrangement of these qubits in a two-dimensional grid enhances their connectivity, which is crucial for executing complex quantum algorithms.

The video bloggers at LifesBiggestQuestions recently explored what the future has in store for Google Quantum Computer Sycamore.

Challenges of Quantum Computing

Despite their potential, quantum systems like Sycamore are not without their challenges. They are extremely sensitive and prone to errors. The quantum gates, which are operations on qubits, must maintain a critically low error rate, which is pivotal for preserving the integrity of computations. These systems also require an ultra-cold environment to operate effectively, achieved through sophisticated cooling systems, notably dilution refrigerators that use helium isotopes to reach temperatures close to absolute zero.

This cooling is not only about achieving low temperatures but also about isolating the qubits from external disturbances like cosmic rays or stray photons, which can cause quantum decoherence: a loss of the orderly quantum state that qubits need to perform computations.

Energy Efficiency and Future Applications

One of the surprising elements of quantum computing, particularly highlighted by Sycamore's operation, is its energy efficiency. Unlike classical supercomputers, which can consume up to 10 megawatts of power, quantum computers use significantly less power for computational tasks. Most of the energy goes to maintaining the operational environment of the quantum processor rather than to the computations themselves.

The potential applications for quantum computing are vast and include fields like material science and complex system simulations, which are currently not feasible with classical computers due to the computational load.

Looking Ahead

As we advance further into quantum computing, the technology promises to expand our computational capacity and enhance energy efficiency and sustainability. However, as with all emerging technologies, quantum computing presents new challenges and risks, particularly in cybersecurity and privacy. Quantum computers could, theoretically, crack encryption systems that currently protect our most sensitive data, prompting a need for quantum-resistant cryptographic methods.

Ethical and Safety Considerations

The advent of quantum computing also underscores the need for robust ethical guidelines and safety measures to mitigate risks associated with advanced computing capabilities. This includes potential misuse in creating sophisticated weaponry or personal and national security threats. Transparent international collaboration and regulation will be critical in shaping the safe development of quantum technologies.

In conclusion, while quantum computing, exemplified by Google's Sycamore, represents a monumental leap forward, it compels us to navigate the associated risks carefully. The journey into quantum computing is about harnessing new technology and ensuring it contributes positively to society, bolstering security rather than undermining it. As this technology continues to develop, it will require innovation and a balanced approach to harness its full potential while safeguarding against its inherent risks.

Read this article:
Quantum Leap: Google's Sycamore and the New Frontier in Computing - WebProNews

Advancing Quantum Technologies with Magnetic Butterfly – AZoNano

Apr 15, 2024. Reviewed by Lexie Corner

Researchers at the National University of Singapore (NUS) have created a novel design idea for next-generation carbon-based quantum materials in the form of a small magnetic nanographene with a distinct butterfly shape that houses strongly correlated spins.

This innovative design has the potential to expedite the growth of quantum materials, which are critical for the development of advanced quantum computing technologies that will revolutionize information processing and high-density storage capacities.

Associate Professor Lu Jiong of the NUS Department of Chemistry and Institute for Functional Intelligent Materials headed the project, which also included Professor Wu Jishan from the NUS Department of Chemistry and international collaborators.

Magnetic nanographene, a nanostructure comprised of graphene molecules, has exceptional magnetic capabilities due to the behavior of particular electrons in the carbon atoms' π-orbitals. These unique electrons can be controlled by precisely arranging the carbon atoms at the nanoscale.

This makes nanographene particularly promising for producing incredibly small magnets and the essential building blocks required for quantum computers, known as quantum bits or qubits.

The researchers' butterfly-shaped magnetic graphene features four rounded triangles that resemble butterfly wings. Each wing hosts an unpaired π-electron, responsible for the observed magnetic characteristics. The structure was created using an atomically precise design of the π-electron network in nanostructured graphene.

"Magnetic nanographene, a tiny molecule composed of fused benzene rings, holds significant promise as a next-generation quantum material for hosting fascinating quantum spins due to its chemical versatility and long spin coherence time. However, creating multiple highly entangled spins in such systems is a daunting yet essential task for building scalable and complex quantum networks."

Lu Jiong, Associate Professor, Department of Chemistry, National University of Singapore

The remarkable breakthrough is the result of close collaboration involving synthetic chemists, materials scientists, and physicists, including key contributors Professor Pavel Jelinek and Dr. Libor Veis of the Czech Academy of Sciences in Prague.

On February 19th, 2024, Nature Chemistry published this groundbreaking study.

The magnetic characteristics of nanographene are often determined by the configuration of its unique electrons, known as π-electrons, or the strength of their interactions. However, combining these features to generate numerous correlated spins is difficult. Nanographene also has a distinct magnetic order, with spins aligning in either the same direction (ferromagnetic) or opposing directions (antiferromagnetic).

The researchers devised a strategy to circumvent these obstacles. Their butterfly-shaped nanographene, which exhibits both ferromagnetic and antiferromagnetic characteristics, is created by merging four smaller triangles into a rhombus in the center. The nanographene is roughly 3 nanometers in size.

To make the butterfly nanographene, the researchers first created a unique molecule precursor using traditional in-solution chemistry. This precursor was then employed for the on-surface synthesis, a novel type of solid-phase chemical reaction carried out in a vacuum environment. This method enabled the researchers to accurately manipulate the form and structure of nanographene at the atomic level.

The butterfly nanographene has four unpaired π-electrons, with spins delocalized in the wing regions and entangled together. The researchers used an ultra-cold scanning probe microscope with a nickelocene tip as an atomic-scale spin sensor to probe the magnetism of the butterfly nanographene.

This novel technique also allows scientists to probe entangled spins directly, to better understand how nanographene's magnetism operates at the atomic level. The innovation not only addresses current obstacles but also opens up new avenues for precisely manipulating magnetic characteristics at the smallest scale, promising advances in quantum materials research.

Lu added, "The insights gained from this study pave the way for creating new-generation organic quantum materials with designer quantum spin architectures. Looking ahead, our goal is to measure the spin dynamics and coherence time at the single-molecule level and manipulate these entangled spins coherently. This represents a significant stride towards achieving more powerful information processing and storage capabilities."

Song, S., et al. (2024) Highly entangled polyradical nanographene with coexisting strong correlation and topological frustration. Nature Chemistry. doi:10.1038/s41557-024-01453-9

Source: https://nus.edu.sg/

Read more here:
Advancing Quantum Technologies with Magnetic Butterfly - AZoNano

Microsoft, Quantinuum Usher in the Next Age of Quantum Computing – Thomas Insights

Microsoft and Quantinuum have achieved a groundbreaking advancement in quantum error correction, pushing the boundaries of quantum computing beyond the noisy intermediate-scale quantum (NISQ) era. Their collaboration combines Quantinuum's ion-trap hardware with Microsoft's innovative qubit-virtualization system, resulting in an impressive feat: over 14,000 error-free experiments.

In the realm of quantum computing, where even small environmental disturbances can disrupt results, error correction is crucial. The teams' achievement not only ensures error-free operation but also allows for the detection and correction of errors without compromising logical qubits.

A key aspect of this milestone is the ability to perform active syndrome extraction, a process that identifies and fixes errors while preserving the logical qubits, a vital step towards reliable quantum computing. Dennis Tom and Krysta Svore of Microsoft highlight the importance of this achievement, considering it a fundamental milestone in quantum error correction.
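For intuition about what syndrome extraction means, here is a deliberately simplified classical sketch of a 3-qubit bit-flip repetition code: parity checks locate an error without ever reading the encoded value itself. Real codes, including the Microsoft/Quantinuum scheme, also handle phase errors and operate on live quantum states; this toy captures only the classical shadow of the idea:

```python
# Syndrome extraction on a 3-bit repetition code (classical toy model).
def extract_syndrome(bits):
    s1 = bits[0] ^ bits[1]        # parity check on bits 0,1
    s2 = bits[1] ^ bits[2]        # parity check on bits 1,2
    return (s1, s2)               # the syndrome never reveals the logical value

def correct(bits):
    # Each single-bit error produces a unique syndrome pointing to its location.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(extract_syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1           # undo the error the syndrome identified
    return bits

encoded = [1, 1, 1]               # logical "1", spread across three bits
encoded[2] ^= 1                   # inject one bit-flip error
print(correct(encoded))           # -> [1, 1, 1]; fixed without reading the value
```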

As the quantum community works to replicate and build upon these results, the collaboration between Microsoft and Quantinuum establishes a new standard for the resilience and potential of quantum computing.

Image Credit: Shutterstock.com / Gorodenkoff

Continued here:
Microsoft, Quantinuum Usher in the Next Age of Quantum Computing - Thomas Insights

Combatting disruptive noise in quantum comm – EurekAlert

Image: PhD researcher Luis Villegas Aguilar conducting the experiment. Credit: Griffith University

In a significant milestone for quantum communication technology, an experiment has demonstrated how networks can be leveraged to combat disruptive noise in quantum communications.

The international effort, led by researchers from Griffith University's Centre for Quantum Dynamics, highlights the potential of quantum networks in revolutionising communication technologies on a quantum level.

Researchers Dr Nora Tischler and Dr Sergei Slussarenko, Program Managers at the ARC Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) node at Griffith University, believe their findings are a first step towards large-scale quantum networks, which may fundamentally change how we communicate on a global scale.

The study delves into the intricate world of quantum entanglement, a phenomenon where particles maintain a connection regardless of the distance between them. Quantum entanglement, long recognised as a cornerstone of quantum technology, has intrigued scientists due to its potential applications in hyper-sensitive sensors and ultra-private communication channels.

CQC2T PhD Researcher Luis Villegas-Aguilar, alongside the team at Griffith University, embarked on a journey to explore the relationship between quantum entanglement and nonlocality, the mysterious correlations that Einstein famously referred to as "spooky action at a distance."

The degradation of these quantum effects due to noise has posed a major challenge in realising their practical applications. The experiment conducted by the research team addressed this challenge head-on.

"In essence, our experiment demonstrates how networks can be utilised to overcome noise in quantum communications," explains Villegas-Aguilar. By simulating real-world conditions within a controlled environment, we aimed to enhance noise tolerance and 'activate' quantum nonlocality within a network structure.

To realise this goal, they joined forces with researchers from the University of New South Wales, Sorbonne University, France, and the National Institute of Standards and Technology in the US. The team set up a three-station quantum network in their laboratories, mimicking configurations one might find in a future quantum internet.

"In our experiment, we sent the entangled particles to different stations inside the lab. We used entangled single photons, which are quantum particles of light," Dr Tischler said.

The three-station quantum network simulated noisy conditions that one might encounter in a larger, field-deployed network. "First, we started with only two entangled photons and proved they could not produce quantum nonlocality past a specific noise limit."

Then, through meticulous design and implementation, the researchers observed a remarkable phenomenon: the previously lost quantum nonlocality could be recovered by adding an extra connectivity link.

"We observed that adding the third station to the network configuration allowed us to overcome the effects of noise and activate quantum nonlocality," says Dr Emanuele Polino, a Postdoctoral Researcher involved with the experiment.
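For a sense of what such a noise limit looks like in the standard two-party setting (our illustration, not the paper's exact threshold): for a Werner state p|Φ⁺⟩⟨Φ⁺| + (1−p)·I/4, the best achievable CHSH value is S = 2√2·p, so the Bell bound S ≤ 2 stops being violated once the visibility p drops below 1/√2 ≈ 0.71.

```python
# CHSH value of a two-qubit Werner state as a function of visibility p:
# nonlocality (S > 2) is lost once noise pushes p below 1/sqrt(2).
import math
for p in (1.0, 0.8, 1 / math.sqrt(2), 0.6):
    S = 2 * math.sqrt(2) * p
    verdict = "(nonlocal)" if S > 2.0001 else "(no violation)"
    print(f"visibility p={p:.3f}: CHSH S = {S:.3f} {verdict}")
```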

The team is confident that their results not only advance our understanding of quantum phenomena but also pave the way for the development of resilient and robust quantum technologies.

As the world continues to progress towards an era of quantum computing and communication, this research represents a significant milestone in harnessing the full potential of quantum mechanics.

The study, "Nonlocality activation in a photonic quantum network," has been published in Nature Communications.



Read the original:
Combatting disruptive noise in quantum comm - EurekAlert