
Superconducting qubit promises breakthrough in quantum computing – Advanced Science News

A radical superconducting qubit design promises to extend quantum computers' runtime by addressing the challenge of decoherence.

A new qubit design based on superconductors could revolutionize quantum computing. By leveraging the distinct properties of single-atom-thick layers of materials, this new approach to superconducting circuits promises to significantly extend the runtime of a quantum computer, addressing a major challenge in the field.

This limitation on continuous operation time arises because the quantum state of a qubit, the basic computing unit of a quantum computer, can be easily destabilized due to interactions with its environment and other qubits. This destruction of the quantum state is called decoherence and leads to errors in computations.

Among the various types of qubits that scientists have created, including photons, trapped ions, and quantum dots, superconducting qubits are desirable because they can switch between different states in the shortest amount of time.

Their operation is based on the fact that, due to subtle quantum effects, the electric current flowing through the superconductor can take discrete values, each corresponding to a state of 0 and/or 1 (or even higher values in some designs).

For superconducting qubits to work correctly, they require a gap in the superconducting circuit called a Josephson junction, through which an electrical current flows via a quantum phenomenon called tunneling: the passage of particles through a barrier that, according to the laws of classical physics, they should not be able to cross.
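For readers who want the underlying equations, here is a minimal sketch of the standard Josephson relations (textbook physics, not taken from the study itself): the supercurrent through the junction depends sinusoidally on the quantum phase difference across it, and a voltage appears only while that phase evolves in time.

```latex
% Standard Josephson relations for a tunnel junction (textbook form).
% I_c is the critical current; \varphi is the phase difference between
% the two superconductors on either side of the insulating barrier.
\begin{align}
  I(\varphi) &= I_c \sin\varphi ,
  &
  V &= \frac{\hbar}{2e} \frac{\mathrm{d}\varphi}{\mathrm{d}t} .
\end{align}
```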

The problem is that the switching-speed advantage of superconducting qubits comes at a cost: they are more susceptible to decoherence, which sets in within milliseconds or even faster. To mitigate this issue, scientists typically resort to meticulous adjustments of circuit configurations and qubit placements, with few net gains.

Addressing this challenge with a more radical approach, an international team of researchers proposed a novel Josephson junction design using two single-atom-thick flakes of a superconducting copper-based material called a cuprate. They called their design the "flowermon."

In their study published in Physical Review Letters, the team applied the fundamental laws of quantum mechanics to analyze the current flow through a Josephson junction and discovered that if the angle between the crystal lattices of the two superconducting cuprate sheets is 45 degrees, the qubit exhibits more resilience to external disturbances compared to conventional designs based on materials like niobium and tantalum.
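A rough sketch of why 45 degrees is special, based on the general behavior of twisted d-wave junctions reported in the literature (the amplitudes below are illustrative placeholders, not values from the paper): the d-wave symmetry of the cuprate order parameter suppresses the usual first harmonic of the current-phase relation at that angle, leaving a second-harmonic term.

```latex
% Illustrative current-phase relation for a junction between two
% d-wave superconductors twisted by angle \theta; A and B are
% junction-dependent amplitudes (assumed here for illustration).
\begin{equation}
  I(\varphi) \;\approx\; A \cos(2\theta)\,\sin\varphi \;+\; B \sin(2\varphi) .
\end{equation}
% At \theta = 45^\circ the \sin\varphi term vanishes and tunneling is
% dominated by \sin(2\varphi), i.e. Cooper pairs crossing in pairs;
% qualitatively, this is the origin of the flowermon's protection.
```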

"The flowermon modernizes the old idea of using unconventional superconductors for protected quantum circuits and combines it with new fabrication techniques and a new understanding of superconducting circuit coherence," Uri Vool, a physicist at the Max Planck Institute for Chemical Physics of Solids in Germany, explained in a press release.

The team's calculations suggest that the noise reduction promised by their design could increase the qubit's coherence time by orders of magnitude, thereby enhancing the continuous operation of quantum computers. However, they view their research as just the beginning, envisioning future endeavors to further optimize superconducting qubits based on their findings.

"The idea behind the flowermon can be extended in several directions: searching for different superconductors or junctions yielding similar effects; exploring the possibility to realize novel quantum devices based on the flowermon, which would combine the benefits of quantum materials and coherent quantum circuits; or using the flowermon or related designs to investigate the physics of complex superconducting heterostructures," said Valentina Brosco, a researcher at the Institute for Complex Systems, Consiglio Nazionale delle Ricerche, and the Physics Department of the University of Rome.

"This is only the first simple concrete example of utilizing the inherent properties of a material to make a new quantum device, and we hope to build on it and find additional examples, eventually establishing a field of research that combines complex material physics with quantum devices," Vool added.

Since the team's study was purely theoretical, even the simplest heterostructure-based qubit design they proposed requires experimental validation, a step that is currently underway.

"Experimentally, there is still quite a lot of work towards implementing this proposal," concluded Vool. "We are currently fabricating and measuring hybrid superconducting circuits which integrate these van der Waals superconductors, and hope to utilize these circuits to better understand the material, and eventually design and measure protected hybrid superconducting circuits to make them into real useful devices."

Reference: Uri Vool et al., "Superconducting Qubit Based on Twisted Cuprate van der Waals Heterostructures," Physical Review Letters (2024). DOI: 10.1103/PhysRevLett.132.017003

Feature image credit: SuttleMedia on Pixabay

Here is the original post:

Superconducting qubit promises breakthrough in quantum computing - Advanced Science News

Read More..

U.S. weighs National Quantum Initiative Reauthorization Act – TechTarget

While artificial intelligence and semiconductors capture global attention, some U.S. policymakers want to ensure Congress doesn't fail to invest and stay competitive in other emerging technologies, including quantum computing.

Quantum computing regularly lands on the U.S. critical and emerging technologies list, which pinpoints technologies that could affect U.S. national security. Quantum computing -- an area of computer science that uses quantum physics to solve problems too complex for traditional computers -- not only affects U.S. national security, but intersects with other prominent technologies and industries, including AI, healthcare and communications.

The U.S. first funded quantum computing research and development in 2018 through the $1.2 billion National Quantum Initiative Act. It's something policymakers now want to continue through the National Quantum Initiative Reauthorization Act. Reps. Frank Lucas (R-Okla.) and Zoe Lofgren (D-Calif.) introduced the legislation in November 2023, and it has yet to pass the House despite having bipartisan support.

Continuing to invest in quantum computing R&D means staying competitive with other countries making similar investments to not only stay ahead of the latest advancements, but protect national security, said Isabel Al-Dhahir, principal analyst at GlobalData.

"Quantum computing's geopolitical weight and the risk a powerful quantum computer poses to current cybersecurity measures mean that not only the U.S., but also China, the EU, the U.K., India, Canada, Japan and Australia are investing heavily in the technology and are focused on building strong internal quantum ecosystems in the name of national security," she said.

Global competition in quantum computing will increase as the technology moves from theoretical to practical applications, Al-Dhahir said. Quantum computing has the potential to revolutionize areas such as drug development and cryptography.

Al-Dhahir said while China is investing $15 billion over the next five years in its quantum computing capabilities, the EU's Quantum Technologies Flagship program will provide $1.2 billion in funding over the next 10 years. To stay competitive, the U.S. needs to continue funding quantum computing R&D and studying practical applications for the technology.

"If reauthorization fails, it will damage the U.S.'s position in the global quantum race," she said.

Lofgren, who spoke during The Intersect: A Tech and Policy Summit earlier this month, said it's important to pass the National Quantum Initiative Reauthorization Act to "maintain our competitive edge." The legislation aims to move beyond scientific research and into practical applications of quantum computing, along with ensuring scientists have the necessary resources to accomplish those goals, she said.

Indeed, Sen. Marsha Blackburn (R-Tenn.) said during the summit that the National Quantum Initiative Act needs to be reauthorized for the U.S. to move forward. Blackburn, along with Sen. Ben Ray Luján (D-N.M.), has also introduced the Quantum Sandbox for Near-Term Applications Act to advance commercialization of quantum computing.

The 2018 National Quantum Initiative Act served a "monumental" purpose in mandating agencies such as the National Science Foundation, NIST and the Department of Energy to study quantum computing and create a national strategy, said Joseph Keller, a visiting fellow at the Brookings Institution.

Though the private sector has made significant investments in quantum computing, Keller said the U.S. would not be a leader in quantum computing research without federal support, especially with goals to eventually commercialize the technology at scale. He said that's why it's pivotal for the U.S. to pass the National Quantum Initiative Reauthorization Act, even amid other congressional priorities such as AI.

"I don't think you see any progress forward without the passage of that legislation," Keller said.

Despite investment from numerous big tech companies, including Microsoft, Intel, IBM and Google, significant technical hurdles remain for the broad commercialization of quantum computing, Al-Dhahir said.

She said the quantum computing market faces issues such as overcoming high error rates -- for example, suppressing error rates requires "substantially higher" qubit counts than what is being achieved today. A qubit, short for quantum bit, is considered a basic unit of information in quantum computing.

IBM released the first quantum computer with more than 1,000 qubits in 2023. However, Al-Dhahir said more is needed to avoid high error rates in quantum computing.

"The consensus is that hundreds of thousands to millions of qubits are required for practical large-scale quantum computers," she said.

Indeed, industry is still trying to identify the economic proposition of quantum computing, and the government has a role to play in that, Brookings' Keller said.

"It doesn't really have these real-world applications, things you can hold and touch," he said. "But there are breakthroughs happening in science and industry."

Lofgren said she recognizes that quantum computing has yet to reach the stage of practical, commercial applications, but she hopes that legislation such as the National Quantum Initiative Reauthorization Act will help the U.S. advance quantum computing to that stage.

"Quantum computing is not quite there yet, although we are making tremendous strides," she said.

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget Editorial, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.

Read more:

U.S. weighs National Quantum Initiative Reauthorization Act - TechTarget

Read More..

Electrons become fractions of themselves in graphene, study finds – EurekAlert

Image: The fractional quantum Hall effect has generally been seen under very high magnetic fields, but MIT physicists have now observed it in simple graphene. In a five-layer graphene/hexagonal boron nitride (hBN) moiré superlattice, electrons (blue ball) interact with each other strongly and behave as if they are broken into fractional charges.

Credit: Sampson Wilcox, RLE

The electron is the basic unit of electricity, as it carries a single negative charge. This is what we're taught in high school physics, and it is overwhelmingly the case in most materials in nature.

But in very special states of matter, electrons can splinter into fractions of their whole. This phenomenon, known as fractional charge, is exceedingly rare, and if it can be corralled and controlled, the exotic electronic state could help to build resilient, fault-tolerant quantum computers.

To date, this effect, known to physicists as the fractional quantum Hall effect, has been observed a handful of times, and mostly under very high, carefully maintained magnetic fields. Only recently have scientists seen the effect in a material that did not require such powerful magnetic manipulation.

Now, MIT physicists have observed the elusive fractional charge effect, this time in a simpler material: five layers of graphene, an atom-thin layer of carbon that stems from graphite and common pencil lead. They report their results in Nature.

They found that when five sheets of graphene are stacked like steps on a staircase, the resulting structure inherently provides just the right conditions for electrons to pass through as fractions of their total charge, with no need for any external magnetic field.

The results are the first evidence of the fractional quantum anomalous Hall effect (the term "anomalous" refers to the absence of a magnetic field) in crystalline graphene, a material that physicists did not expect to exhibit this effect.

"This five-layer graphene is a material system where many good surprises happen," says study author Long Ju, assistant professor of physics at MIT. "Fractional charge is just so exotic, and now we can realize this effect with a much simpler system and without a magnetic field. That in itself is important for fundamental physics. And it could enable the possibility for a type of quantum computing that is more robust against perturbation."

Ju's MIT co-authors are lead author Zhengguang Lu, Tonghang Han, Yuxuan Yao, Aidan Reddy, Jixiang Yang, Junseok Seo, and Liang Fu, along with Kenji Watanabe and Takashi Taniguchi at the National Institute for Materials Science in Japan.

A bizarre state

The fractional quantum Hall effect is an example of the weird phenomena that can arise when particles shift from behaving as individual units to acting together as a whole. This collective "correlated" behavior emerges in special states, for instance when electrons are slowed from their normally frenetic pace to a crawl that enables the particles to sense each other and interact. These interactions can produce rare electronic states, such as the seemingly unorthodox splitting of an electron's charge.

In 1982, scientists discovered the fractional quantum Hall effect in heterostructures of gallium arsenide, where a gas of electrons confined in a two-dimensional plane is placed under high magnetic fields. The discovery later won the group a Nobel Prize in Physics.

"[The discovery] was a very big deal, because these unit charges interacting in a way to give something like fractional charge was very, very bizarre," Ju says. "At the time, there were no theory predictions, and the experiments surprised everyone."

Those researchers achieved their groundbreaking results using magnetic fields to slow down the material's electrons enough for them to interact. The fields they worked with were about 10 times stronger than what typically powers an MRI machine.

In August 2023, scientists at the University of Washington reported the first evidence of fractional charge without a magnetic field. They observed this anomalous version of the effect in a twisted semiconductor called molybdenum ditelluride. The group prepared the material in a specific configuration, which theorists predicted would give the material an inherent magnetic field, enough to encourage electrons to fractionalize without any external magnetic control.

The "no magnets" result opened a promising route to topological quantum computing, a more secure form of quantum computing in which the added ingredient of topology (a property that remains unchanged in the face of weak deformation or disturbance) gives a qubit added protection when carrying out a computation. This computation scheme is based on a combination of the fractional quantum Hall effect and a superconductor, which used to be almost impossible to realize: one needs a strong magnetic field to get fractional charge, while the same magnetic field will usually kill the superconductor. In this scheme, the fractional charges would serve as qubits (the basic units of a quantum computer).

Making steps

That same month, Ju and his team happened to also observe signs of anomalous fractional charge in graphene, a material for which there had been no predictions of such an effect.

Ju's group has been exploring electronic behavior in graphene, which by itself has exhibited exceptional properties. Most recently, Ju's group has looked into pentalayer graphene, a structure of five graphene sheets, each stacked slightly off from the other, like steps on a staircase. Such a pentalayer graphene structure is embedded in graphite and can be obtained by exfoliation using Scotch tape. When placed in a refrigerator at ultracold temperatures, the structure's electrons slow to a crawl and interact in ways they normally wouldn't when whizzing around at higher temperatures.

In their new work, the researchers did some calculations and found that electrons might interact with each other even more strongly if the pentalayer structure were aligned with hexagonal boron nitride (hBN), a material that has a similar atomic structure to that of graphene, but with slightly different dimensions. In combination, the two materials should produce a moiré superlattice, an intricate, scaffold-like atomic structure that could slow electrons down in ways that mimic a magnetic field.

"We did these calculations, then thought, 'Let's go for it,'" says Ju, who happened to install a new dilution refrigerator in his MIT lab last summer, which the team planned to use to cool materials down to ultralow temperatures to study exotic electronic behavior.

The researchers fabricated two samples of the hybrid graphene structure by first exfoliating graphene layers from a block of graphite, then using optical tools to identify five-layered flakes in the steplike configuration. They then stamped the graphene flake onto an hBN flake and placed a second hBN flake over the graphene structure. Finally, they attached electrodes to the structure and placed it in the refrigerator, set to near absolute zero.

As they applied a current to the material and measured the voltage output, they started to see signatures of fractional charge, where the voltage equals the current multiplied by a fractional number and some fundamental physics constants.
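In the quantum Hall language, that measurement corresponds to the Hall resistance locking onto values set by Planck's constant, the electron charge, and a fractional filling factor (a standard relation; the fractions listed are generic examples, not the specific ones reported in the paper):

```latex
% Hall resistance quantization: h is Planck's constant, e the electron
% charge, and \nu the filling factor.
\begin{equation}
  R_{xy} = \frac{h}{\nu\, e^{2}} , \qquad
  \nu = \tfrac{1}{3},\ \tfrac{2}{5},\ \tfrac{3}{5},\ \ldots
\end{equation}
% Integer \nu gives the ordinary quantum Hall effect; fractional \nu
% signals quasiparticles that carry fractions of the electron charge.
```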

"The day we saw it, we didn't recognize it at first," says first author Lu. "Then we started to shout as we realized, this was really big. It was a completely surprising moment."

"This was probably the first serious samples we put in the new fridge," adds co-first author Han. "Once we calmed down, we looked in detail to make sure that what we were seeing was real."

With further analysis, the team confirmed that the graphene structure indeed exhibited the fractional quantum anomalous Hall effect. It is the first time the effect has been seen in graphene.

"Graphene can also be a superconductor," Ju says. "So, you could have two totally different effects in the same material, right next to each other. If you use graphene to talk to graphene, it avoids a lot of unwanted effects when bridging graphene with other materials."

For now, the group is continuing to explore multilayer graphene for other rare electronic states.

"We are diving in to explore many fundamental physics ideas and applications," he says. "We know there will be more to come."

This research is supported in part by the Sloan Foundation and the National Science Foundation.

###

Written by Jennifer Chu, MIT News

Fractional Quantum Anomalous Hall Effect in Multilayer Graphene

Read more from the original source:

Electrons become fractions of themselves in graphene, study finds - EurekAlert

Read More..

Breaking the Temperature Barrier: How Quantum Ground State Acoustics Could Revolutionize Quantum Physics – SciTechDaily

Researchers from the Stiller Research Group have significantly cooled sound waves in an optical fiber to a near quantum ground state, reducing thermal noise and potentially bridging classical and quantum mechanics. This breakthrough, achieved through laser cooling and stimulated Brillouin scattering, marks a promising step towards utilizing long acoustic phonons in quantum technologies.

Artist's impression of cooled acoustic waves in an optical fiber taper. Credit: Long Huy Da

The quantum ground state of an acoustic wave of a certain frequency can be reached by completely cooling the system. In this way, the number of quantum particles, the so-called acoustic phonons, which cause disturbance to quantum measurements, can be reduced to almost zero and the gap between classical and quantum mechanics bridged.

Over the past decade, major technological advances have been made, making it possible to put a wide variety of systems into this state. Mechanical vibrations oscillating between two mirrors in a resonator can be cooled to very low temperatures, even as far as the quantum ground state. This has not yet been possible for optical fibers, in which high-frequency sound waves can propagate. Now researchers from the Stiller Research Group have taken a step closer to this goal.

In their study, recently published in Physical Review Letters, they report that they were able to lower the temperature of a sound wave in an optical fiber, initially at room temperature, by 219 K using laser cooling, ten times further than had previously been reported. Ultimately, the initial phonon number was reduced by 75%, at a temperature of 74 K (−199 °C).
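A short worked example shows why a 75% reduction is exactly what cooling from room temperature to 74 K should give (assuming a Brillouin phonon frequency of roughly 10 GHz, typical for silica fiber though not stated in this excerpt):

```latex
% Thermal occupation of a phonon mode of frequency \omega at
% temperature T follows the Bose-Einstein distribution:
\begin{equation}
  \bar{n} = \frac{1}{e^{\hbar\omega / k_B T} - 1}
  \;\approx\; \frac{k_B T}{\hbar\omega}
  \qquad (k_B T \gg \hbar\omega) .
\end{equation}
% For \omega/2\pi \approx 10\,\mathrm{GHz}, \hbar\omega/k_B \approx 0.5\,\mathrm{K},
% so at both 293 K and 74 K the classical limit holds and \bar{n} \propto T:
% 74/293 \approx 0.25, i.e. the reported 75% reduction in phonon number.
% The quantum ground state corresponds to \bar{n} \to 0.
```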

Birgit Stiller's research team in the lab: Birgit Stiller, Laura Blázquez Martínez, Andreas Geilen, Changlong Zhu, Philipp Wiedemann (f.l.t.r.). Credit: MPL, Florian Ritter

Such a drastic reduction in temperature was made possible by the use of laser light. Cooling of the propagating sound waves was achieved via the nonlinear optical effect of stimulated Brillouin scattering, in which light waves are efficiently coupled to sound waves. Through this effect, the laser light cools the acoustic vibrations and creates an environment with less of the thermal noise that disturbs, for example, a quantum communication system.

"An interesting advantage of glass fibers, in addition to this strong interaction, is the fact that they can conduct light and sound excellently over long distances," says Laura Blázquez Martínez, one of the lead authors of the article and a doctoral student in the Stiller research group. "Most physical platforms previously brought to the quantum ground state were microscopic."

However, in this experiment the length of the optical fiber was 50 cm, and a sound wave extending over the full 50 cm of the core of the fiber was cooled to extremely low temperatures. "These results are a very exciting step towards the quantum ground state in waveguides, and the manipulation of such long acoustic phonons opens up possibilities for broadband applications in quantum technology," according to Dr. Birgit Stiller, head of the quantum optoacoustics group.

Experimental setup in the laboratory. Credit: SAOT Max Gmelch

Sound, in the day-to-day classical world, can be understood as a density wave in a medium. However, from the perspective of quantum mechanics, sound can also be described as a particle: the phonon. This particle, the sound quantum, represents the smallest amount of energy that occurs as an acoustic wave at a certain frequency. In order to see and study single quanta of sound, the number of phonons must be minimized. The transition from the classical to quantum behavior of sound is often more easily observed in the quantum ground state, where the number of phonons is close to zero on average, such that the vibrations are almost frozen and quantum effects can be measured.

Stiller: "This opens the door to a new landscape of experiments that allow us to gain deeper insights into the fundamental nature of matter. The advantage of using a waveguide system is that light and sound are not bound between two mirrors, but propagate along the waveguide. The acoustic waves exist as a continuum, not only for certain frequencies, and can have a broad bandwidth, making them promising for applications such as high-speed communication systems."

"We are very enthusiastic about the new insights that pushing these fibers into the quantum ground state will bring," emphasizes the research group leader. "Not only from the fundamental research point of view, allowing us to peek into the quantum nature of extended objects, but also because of the applications this could have in quantum communication schemes and future quantum technologies."

Reference: "Optoacoustic Cooling of Traveling Hypersound Waves" by Laura Blázquez Martínez, Philipp Wiedemann, Changlong Zhu, Andreas Geilen and Birgit Stiller, 11 January 2024, Physical Review Letters. DOI: 10.1103/PhysRevLett.132.023603

Go here to read the rest:

Breaking the Temperature Barrier: How Quantum Ground State Acoustics Could Revolutionize Quantum Physics - SciTechDaily

Read More..

The surprising origins of wave-particle duality – Big Think

One of the most powerful, yet counterintuitive, ideas in all of physics is wave-particle duality. It states that whenever a quantum propagates through space freely, without being observed-and-measured, it exhibits wave-like behavior, doing things like diffracting and interfering not only with other quanta, but with itself. However, whenever that very same quantum is observed-and-measured, or compelled to interact with another quantum in a fashion that reveals its quantum state, it loses its wave-like characteristics and instead behaves like a particle. First discovered in the early 20th century in experiments involving light, it's now known to apply to all quanta, including electrons and even composite particles such as atomic nuclei.

But the story of how we discovered wave-particle duality doesn't begin and end in the early 20th century; rather, it goes back hundreds of years, to the time of Isaac Newton. It all began with an argument over the nature of light, one that went unresolved (despite both sides declaring victory at various times) until we came to understand the bizarre quantum nature of reality. While wave-particle duality owes its origin to the quantum nature of the Universe, the human story of how we revealed it was full of important steps and missteps, driven at all times by the only source of information that matters: experiments and direct observations. Here's how we finally arrived at our modern picture of reality.

What appears to be a simple plane wave, such as light or water passing through a partly obscured barrier, was conceived of (brilliantly) by Christiaan Huygens as a series of waves that propagate spherically outward, all superimposed atop one another. This idea of wave mechanics would apply not only to scalar waves such as water waves, but to light and particles as well.

Huygens: light is a wave

Picture a wave propagating through water, such as in the ocean: it appears to move linearly, at a particular speed and with a particular height, only to change and crash against the shore as the water's depth lessens. Back in 1678, Dutch scientist Christiaan Huygens recognized that these waves could be treated, rather than as linear, coherent entities, as a sum of an infinite number of spherical waves, where each spherical wave became superimposed atop one another along the propagating wavefront. (Illustrated above.)

Huygens noted the existence of phenomena like interference, refraction, and reflection, and saw that they applied equally well to water waves as they did to light, and so he theorized that light is a wave as well. This provided the first successful explanation of both linear and spherical wave propagation, both for water waves as well as for light waves. However, Huygens' work had its limitations.

The idea that light is a wave was born with Huygens and became quite popular across the European continent upon his treatise's publication in 1690, but didn't catch on elsewhere due to the presence of a much more famous competitor.

Light passing from a negligible medium through a dense medium, exhibiting refraction. Light comes in from the left, strikes the prism and partially reflects (top), while the remainder is transmitted through the prism and exits at right. The light that passes through the prism appears to bend, as it travels at a slower speed than the light traveling through air did earlier. When it re-emerges from the prism, it refracts once again, returning to its original speed. Note that different wavelengths correspond to different colors, and that they are separated by their passage through the prism, not before nor after.

Newton: light is a corpuscle

In 1704, Newton published his treatise on Opticks, based on experiments that he first presented in 1672. Instead of a wave, Newton described light as a series of rays, or corpuscles, that behaved in a particle-like fashion. The deductions made in Newton's Opticks arise as direct inferences from the experiments performed, and focused on the phenomena of refraction and diffraction. By passing white light through a prism, Newton was the first to show that light was not inherently white and altered to have color by its interactions with matter, but rather that white light itself was composed of all of the different colors of the spectrum.

He performed experiments on refraction with prisms and lenses, on diffraction with closely spaced sheets of glass, and on color mixtures both with lights of individual colors that were brought together and with pigment powders. Newton was the first to coin the ROY G. BIV palette of colors, noting that white light could be broken up into red, orange, yellow, green, blue, indigo, and violet. Newton was the first to understand that what appears to us as color arises from the selective absorption, reflection, and transmission of the various components of light: what we now know as wavelength, an idea antithetical to Newton's conception.

This diagram, dating back to Thomas Young's work in the early 1800s, is one of the oldest pictures that demonstrate both constructive and destructive interference as arising from wave sources originating at two points: A and B. This is a physically identical setup to a double slit experiment, even though it applies just as well to water waves propagated through a tank.

Young's double slit experiment

Throughout the 1700s, Newton's ideas became popular worldwide, heavily influencing Voltaire, Benjamin Franklin, and Lavoisier, among others. But at the end of the century, from 1799 to 1801, scientist Thomas Young began experimenting with light, making two enormous advances in our understanding of light in the process.

The first, and arguably most famous, advance is illustrated above: Young performed what's known as the double slit experiment with light for the first time. By passing light of a single color through two closely spaced slits, Young was able to observe a phenomenon that's only explicable through wave behavior: the constructive and destructive interference of that light in the pattern it produces, in a fashion that depends on the color of the light being used. Young was further able to prove, through quantitative investigation, that what we perceive as the color of light is, in fact, determined by the wavelength of that light: that wavelength and color, barring the mixture of different colors, were directly related to one another.

While Newton's conception of light still had its advantages, it was clear that the wave theory of light had its advantages too, and succeeded where Newton's corpuscular theory did not. The mystery would only deepen as the 19th century unfolded.

Light of different wavelengths, when passed through a double slit, exhibits the same wave-like properties that other waves do. Changing the wavelength of light, as well as changing the spacing between the slits, will change the specifics of the pattern that emerges.
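To make the wavelength dependence quantitative, here is the standard small-angle result with generic textbook numbers (illustrative values, not Young's actual apparatus):

```latex
% Bright-fringe spacing \Delta y for slit separation d, slit-to-screen
% distance L, and wavelength \lambda (small-angle approximation):
\begin{equation}
  \Delta y = \frac{\lambda L}{d} .
\end{equation}
% Example: \lambda = 600\,\mathrm{nm}, d = 0.1\,\mathrm{mm}, L = 1\,\mathrm{m}
% gives \Delta y = 6\,\mathrm{mm}; halving the wavelength (bluer light)
% packs the bright fringes twice as tightly, which is how Young tied
% color to wavelength.
```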

Siméon Poisson and the world's most absurd calculation

In 1818, the French Academy of Sciences held an essay competition on uncovering the nature of light, and physicist Augustin-Jean Fresnel decided to enter. In that competition, he wrote an essay detailing the wave theory of light quantitatively, accounting for Huygens' wave principle and Young's principle of interference in the process. He was able to account for the effects of diffraction within this framework as well, adding the principle of superposition to his essay and explaining the scintillating colors of stars along the way.

Initially, however, one of the adherents to Newton's corpuscular idea who was serving as a judge on the committee, Siméon Poisson, attempted to have Fresnel laughed out of the competition. (Despite the fact that the only other entrant, who remains anonymous more than 200 years later, was ignorant of Young's work.) Poisson was able to show that if you took a monochromatic light source, widened it in a spherical fashion, and passed it around a small, smooth sphere, then Fresnel's theory would predict that rather than a solid shadow, there would be a bright, luminous point in the shadow's center. Even worse, that point would be just as bright as the part of the beam lying outside of the sphere's shadow. Clearly, Poisson reasoned, this idea is absurd, and therefore light simply cannot have a wave nature to it.

The results of an experiment, showcased using laser light around a spherical object, with the actual optical data. Note the extraordinary validation of the prediction of Fresnel's theory: that a bright, central spot would appear in the shadow cast by the sphere, verifying the "absurd" prediction of the wave theory of light. Logic, alone, would not have gotten us here.

François Arago shows the absurdity of experiment

However, there were five men on the committee, and one of them was François Arago: abolitionist, politician, and a man who would, in 1848, become President of France. Arago was moved by Poisson's argument against Fresnel's idea, but not in the reductio ad absurdum sense that Poisson intended. Instead, Arago became motivated to actually perform the experiment himself: to create a monochromatic light source, widen it in a spherical fashion, and pass it around a smooth, small sphere, to see what the results of the experiment were.

To the great surprise of perhaps all, Arago's experiment revealed that the spot does in fact exist! Moreover, it appeared exactly where, and as brightly as, Fresnel's theory predicted.

Huygens' ideas had finally been placed on a solid theoretical footing, and had been developed into a full-fledged theory that could now account for phenomena such as polarization. Over the course of the 1800s, the wave nature of light became widely accepted in scientific circles.

Light is nothing more than an electromagnetic wave, with in-phase oscillating electric and magnetic fields perpendicular to the direction of light's propagation. The shorter the wavelength, the more energetic the photon, but the more susceptible it is to changes in the speed of light through a medium.

Maxwell demonstrates how light is a wave

The 1800s were also a spectacular time for advances and discoveries in the fields of electricity and magnetism. The work of Ampère, Faraday, Gauss, Coulomb, Franklin, and many others laid the groundwork for what would arguably be the 19th century's greatest scientific achievement: the development of Maxwell's equations and the science of electromagnetism. Among the revelations was that electricity and magnetism are two facets of a single phenomenon: changing magnetic fields induce electric fields, while electric currents and changing electric fields generate magnetic fields.

One of the consequences of Maxwell's equations, as was shown in the 1870s, is that there would be some sort of electromagnetic radiation that arose under the right conditions: radiation that was made of oscillating, in-phase electric and magnetic fields that propagated at one universal speed, which happens to be the speed of light in a vacuum. At last, we had what appeared to be a full explanation: light wasn't just a wave, but an electromagnetic wave that always traveled at one universal speed, the speed of light.

The photoelectric effect details how electrons can be ionized by photons based on the wavelength of individual photons, not on light intensity or any other property. Below a certain wavelength threshold for the incoming photons (i.e., above a threshold photon energy), regardless of intensity, electrons will be kicked off. Above that wavelength threshold, no electrons will be kicked off, even if you turn the intensity of the light way up. Both electrons and the energy in each photon are discrete.

Einstein demonstrates that light's energy is quantized

Of course, physics didn't end with the discovery of classical electromagnetism, and the dawn of the 1900s would bring with it the earliest stages of the quantum revolution. One of the key aspects of this new conception of our reality came from none other than Albert Einstein himself, whose 1905 treatise on the photoelectric effect would forever change our understanding of light. Taking a conducting metal plate, Einstein was able to show that shining light on it caused electrons to spontaneously be emitted from the metal, as though these electrons were being kicked off by the light that struck them. Clearly, with enough energy, the electrons were becoming unbound from the metal they were a part of.


What Einstein did next was nothing short of brilliant.

It was as though light was made up of individual energy packets, known today as photons, which carry energy in proportion to their frequency (or, inversely proportional to their wavelength). Even though light propagated as a wave, it interacted with matter like a corpuscle (or particle), bringing about the beginnings of the modern idea of wave-particle duality.
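Einstein's conclusion fits in one line, with a representative number attached (the work function below is a typical value for sodium, chosen purely for illustration):

```latex
% Photon energy and the photoelectric effect: h is Planck's constant,
% f the light's frequency, \lambda its wavelength, \phi the metal's
% work function, and K_max the maximum kinetic energy of an electron.
\begin{equation}
  E = h f = \frac{h c}{\lambda} , \qquad
  K_{\max} = h f - \phi .
\end{equation}
% Example with \phi \approx 2.3\,\mathrm{eV} (roughly sodium):
% 400 nm light carries hf \approx 3.1 eV and ejects electrons with
% K_max \approx 0.8 eV, while 600 nm light (hf \approx 2.1 eV) ejects
% none, no matter how intense the beam.
```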

The wave pattern for electrons passing through a double slit, one at a time. If you measure which slit the electron goes through, you destroy the quantum interference pattern shown here. However, the wave-like behavior remains so long as the electrons have a de Broglie wavelength that's smaller than the size of the slit they're passing through. This wave-like and particle-like behavior has been demonstrated for electrons, photons, and even larger, composite entities.

The modern double-slit and the dual nature of reality

It turns out that photons, electrons, and all other particles exhibit this odd quantum behavior of wave-particle duality: if you observe-and-measure them during their journey, or otherwise force them to interact and exchange energy-and-momentum with other quanta, they behave as particles, but if you don't, they behave as waves. This is perhaps best exemplified by the modern version of Young's double-slit experiment, which doesn't rely on monochromatic light, but can even be performed with single particles, like photons or electrons, passed through a double slit one at a time.

If you perform this experiment without measuring your particles until they reach the screen, you'll find that they do, in fact, reproduce the classic interference pattern once you've accumulated enough individual quanta. Bright spots, which correspond to the locations where large numbers of particles land, are spaced apart by dark bands, where few-to-no particles land, consistent with the notion of an interference pattern.

However, if you measure whether the quantum passes through slit #1 or slit #2 during its journey, you no longer get an interference pattern on the screen, but simply two lumps: one lump corresponding to particles that passed through the first slit and the other corresponding to particles that passed through the other.

If you measure which slit an electron (or a photon) goes through when performing a one-at-a-time double slit experiment, you don't get an interference pattern on the screen behind it. Instead, the electrons behave not as waves, but as classical particles. A similar effect can be seen for single-slit (left) experiments as well.

Many have commented that "it's like nature knows whether you're watching it or not!" And in some sense, this counterintuitive statement is actually true. When you don't measure a quantum, but rather simply allow it to propagate, it behaves like a wave: a classical wave that interferes not just with other waves but also with itself, exhibiting wave-like behavior such as diffraction and superposition. However, when you do measure a quantum, or otherwise compel it to interact with another quantum of high-enough energy, your original quantum behaves like a particle, with a deterministic, particle-like trajectory that it follows, just as tracks in particle-physics detectors reveal.

So, is light a wave or a particle?

The answer is yes: it's both. It's wave-like when it's freely propagating, and it's particle-like when it's interacting, a set of phenomena that's been probed in an enormous variety of ways over the past ~100 years or so. Despite the proposal of hidden variables to attempt to reconcile wave-particle duality into a single deterministic framework, all experiments point to nature still being non-deterministic, as you cannot predict the outcome of an unmeasured, wave-like trial with any more accuracy than the probabilistic approach of the Schrödinger equation. Wave-particle duality began in the 1600s, and despite our attempts to pin down the true nature of reality, the answer that the Universe itself reveals is that our quantum reality is both, simultaneously, and really does depend on whether or not we measure or interact with it.

Read this article:

The surprising origins of wave-particle duality - Big Think

Read More..

‘Constellation’ Review: Alice in Wonderspace – The New York Times

In "Constellation," on Apple TV+, the Swedish actress Noomi Rapace stars as Jo Ericsson, an astronaut whose time on the International Space Station takes a tragic and mysterious turn. The superbly capable Jo battles overwhelming odds to get back to Earth and to decipher why she feels so out of place once she's there. But the real hero of the story, its emotional center and vigilant conscience, is Jo's young daughter, a solemn girl with a significant name: Alice. To understand what's up with her mom, she'll have to go through the looking glass.

The uneven but seductively spooky "Constellation," which premieres with three of its eight episodes on Wednesday, is a space adventure, mystery and family drama spun from the unstable fabric of quantum physics. People, places and events look different from episode to episode and scene to scene; when a NASA scientist tells Jo that curiosity killed the cat, he is definitely referring to the poor animal inside Schrödinger's box.

In storytelling terms, though, the real quantum entanglement is that of straight science-fiction action with dark fairy tale. The show's creator and writer, Peter Harness, working with the directors Michelle MacLaren, Oliver Hirschbiegel and Joseph Cedar, carries off both with aplomb, and maintains a dry tone and an appealing atmosphere of foreboding. The mechanics of the narrative, as "Constellation" shifts through its different gears, can be creaky, but the show continually draws you in.

The main action begins with a bang, as an unidentified bit of debris cripples the space station during an experiment that seeks a new state of matter. Across two episodes, the echoes of Alfonso Cuarón's "Gravity" are heavy as Jo, left alone in the station, deals with a cascade of problems while trying to escape in a Soyuz capsule. Where "Gravity" ended, though, "Constellation" is just getting started. The resourcefulness and sanity Jo displays in space define her for the audience, so that we stay on her side when things start to go wrong on Earth.

Jo's memories of names, cars, relationships do not completely jibe with what she finds when she gets home to Sweden, and the show slides from adventure into increasingly paranoid thriller, smoothly, though perhaps with more time-jumping confusion and open questions than some viewers will have patience for. It plays fair, however; by Episode 6, things begin to come clear. At which point Jo and Alice head into the dark northern woods.


Read more:

'Constellation' Review: Alice in Wonderspace - The New York Times

Read More..

In the quantum world, the future causes the past | Emily Adlam – IAI

Two particles affecting one another faster than light seemed unimaginable, but there is no denying the facts. This mystery has inspired much skepticism from lay folk and Nobel prize winners alike. But how do we solve it? Emily Adlam argues that retrocausality, the idea that the future can affect the past, is the key to solving this quantum enigma.

One of the most famous and puzzling features of quantum mechanics is the fact that it exhibits what Einstein called "spooky action at a distance": correlations between measurements performed on particles very far apart in space. Now, of course, distant correlations in and of themselves are nothing surprising. For example, suppose I take a pair of socks out of my drawer, separate them, and then send one off to my friend Alice in London and the other to my friend Bob in Auckland. If my friends observe the socks and then compare their results and discover that they both have socks of the same colour, there's no mystery about that - the socks are the same colour because of a common cause in their shared past, i.e. the fact that they both came from the same pair. But the correlations appearing in quantum mechanics are special because they cannot be explained in this way. For example, distant quantum correlations are typically demonstrated in a Bell experiment, in which I prepare a pair of quantum particles and send one to Alice and another to Bob, and then Alice and Bob both choose a measurement and perform that measurement on their particle. A famous theorem due to the physicist John Bell tells us that in this scenario, correlations explained by a common cause in the past must obey a certain inequality - but it turns out that we can choose preparations and measurements in quantum mechanics which lead to a violation of Bell's inequality. This seems to indicate that Alice's decision to make one measurement rather than another has an instantaneous influence on Bob's particle and thereby affects the results that Bob obtains in his own measurement, regardless of how far apart they are.
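The inequality Adlam refers to is most often written in its CHSH form (standard in the Bell-test literature; the notation below is the generic textbook one, not from Bell's original paper):

```latex
% CHSH form of Bell's inequality. E(a,b) denotes the correlation
% between outcomes when Alice measures along setting a and Bob along b;
% a' and b' are their alternative settings.
\begin{equation}
  S = \bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\le\; 2 .
\end{equation}
% Any common-cause (local hidden variable) account obeys S <= 2,
% whereas quantum mechanics reaches S = 2\sqrt{2} \approx 2.83 for
% suitable measurements on an entangled pair.
```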

___

It's troubling to think that the deep structure of quantum mechanics violates such a foundational principle of relativity.

___

These instantaneous nonlocal influences are troubling to many physicists, not least because they seem strongly in tension with our understanding of special relativity. The problem is that special relativity tells us that there is no fact of the matter about what is instantaneous - to determine whether two distant events are simultaneous we must choose a reference frame relative to which we can assess simultaneity, but special relativity ultimately says that no reference frame is preferred and therefore there are no observer-independent facts about simultaneity. Now, quantum nonlocality doesn't produce any observable contradictions with this relativistic principle, since the instantaneous influences work in a subtle way which means they are only detectable after the results of the measurements on both particles have been compared, and therefore we can't experimentally detect any preferred reference frame. But still, if the influences are instantaneous there must be some preferred reference frame, even if observers like us can't detect it, and it's troubling to think that the deep structure of quantum mechanics violates such a foundational principle of relativity. This seems particularly problematic if we hope to one day unify quantum mechanics and relativity in a theory of quantum gravity, for it seems unlikely we will be able to do this if the two theories are incompatible in such a foundational way.

Consequently, a number of physicists have sought to understand if there is any way we can explain these distant correlations without spooky action at a distance. In order to do this, we will necessarily have to deny one of the assumptions going into Bell's theorem, and most work on the topic has focused on an assumption known as statistical independence, which simply says that the state of the two particles together at the time that I prepared them is independent of the later choices that Alice and Bob make about which measurements they are going to perform. This assumption is clearly necessary to prove the existence of non-locality, because if we know in advance what measurements Alice and Bob are going to perform, we can just pre-program the particles with outcomes which will give the appearance of non-locality even though everything is completely local. So if there is any plausible way to deny statistical independence in a Bell experiment, that would allow us to resist the existence of spooky action at a distance.


However, we can't simply stipulate that statistical independence is violated - that would amount to simply switching one spooky nonlocal influence for another. Rather we will have to come up with some local mechanism by which the state of the particle could come to be correlated with the later measurement choices. And there are two obvious ways to do that. The first, known as superdeterminism, involves suggesting that there is some common cause in the past of the measurement choices and the preparation events which results in correlations between them. The second, known as retrocausality, involves suggesting that the measurement choices have a backwards-in-time causal influence on the state of the particle at the time of its preparation.

Superdeterminism might at first sound like the more natural approach, but it actually has some very strange consequences. For Alice and Bob can make their measurement choices in any way you can possibly imagine - flipping a coin, as a function of the date of their birthdays, by observing the light from distant galaxies, and so on. So superdeterminism requires us to suggest that there is some common causal factor which always influences the measurement choices, regardless of whatever weird and wonderful method Alice and Bob might use to make their choice. Moreover, we will probably have to just stipulate that the desired correlations are simply written into the initial state at the beginning of time, and that seems problematic, since our current understanding of statistical mechanics and thermodynamics suggests that we can only make sense of the arrow of time if we assume that the initial state of the universe does not contain too many fine correlations. Therefore adopting superdeterminism seems to create serious tensions with well-established scientific methodology.

On the other hand, retrocausality does not require any special adjustments to the initial state of the universe: it allows us to say, as we would naturally be inclined to, that Alice and Bob's choices of measurements are independent of the rest of the variables involved in the experiments, but nonetheless the state of the particles does end up correlated with the measurement choices. A number of interesting retrocausal models of the Bell experiments have thus been proposed, including proposals by Wharton and Schulman - these models demonstrate that it is indeed possible in principle to account for the Bell experiments in a local way by allowing retrocausal influences.

However, it's important to note that there are two importantly different conceptions of retrocausality that one might adopt here. The first is the "two arrows" approach, which is perhaps what most people think of first when they hear the term retrocausality - it suggests there are literally two distinct arrows of causation pointing in opposite directions, so we have evolution both forwards in time and backwards in time. The alternative is the "all-at-once" approach, in which there is no process of evolution at all, and instead the laws of nature work in an atemporal way to pick out the whole course of history at once, in much the same way as the rules of sudoku constrain the whole grid at once rather than starting at one side and moving to the other side. In all-at-once models the past and future influence each other in a mutual and reciprocal manner, and therefore in such models it is natural to expect that there will be correlations which appear, from our internal point of view, to involve something like backwards-in-time causal influences. So we could certainly get violations of statistical independence in an all-at-once model.

Which of these conceptions of retrocausality should we choose? Well, there is an obvious problem with the "two arrows" conception: it seems liable to produce logical paradoxes. For example, consider the grandfather paradox, in which a time traveller goes back in time and kills her own grandfather before he can father any children - so then the time traveller will not be born, but then of course she can't kill her grandfather, so it seems that there is no logically consistent way to resolve this course of events. We can easily imagine a retrocausal analogue of this paradox in which the time traveller uses a backwards-in-time causal influence rather than a time machine to kill her grandfather, ultimately producing the same paradoxical result. Since the universe presumably cannot contain logical paradoxes, it seems that this kind of composition of causal processes cannot be possible. But this is hard to achieve in the "two arrows" conception of retrocausality - it seems that we have to add some kind of consistency conditions, and in order to fully rule out logical contradictions these conditions will have to apply all at once at a global level, and once we allow that, it seems we are really moving towards the all-at-once picture rather than the two arrows picture.

___

In an all-at-once account of the Bell experiments, we have no need to say that there is an instantaneous influence from one particle on another.

___

The all-at-once picture, meanwhile, can't possibly produce logical contradictions, because the whole point of this approach is that the entire course of history is selected all together, in a logically consistent way. But there's a catch - it's not entirely clear that an all-at-once model is really local in the ordinary sense. For in an all-at-once model, the correlations between distant events are brought about by constraints imposed directly on the whole of history, so there's really no need for information to be carried from one point to another by a local process. In such a model events at distant points can depend directly on each other via the global constraints, and thus we should expect to see many strong correlations which are not mediated by any physical system carrying information between their locations.

Therefore one may worry that using retrocausality to avoid nonlocality will ultimately be self-defeating, since the most reasonable kind of retrocausal model turns out to be non-local in any case. But that depends on the reasons one has for wanting to get rid of nonlocality in the first place. Someone who simply thinks that nonlocality is spooky will perhaps not be happy with an all-at-once model, but someone who merely worries about nonlocality because of the issue of consistency with relativity can happily accept an all-at-once model. For although all-at-once models are generically very non-local, they don't implement that non-locality in a way that requires a preferred reference frame. For example, in an all-at-once account of the Bell experiments, we have no need to say that there is an instantaneous influence from one particle on another: we can simply impose a global constraint which requires that the results of the two measurements are correlated in certain ways, regardless of when and where the measurements are performed. There is no temporal process by which the information is carried from one particle to another, and thus there is no need to identify a preferred reference frame on which that happens.


Ultimately, it seems the contradiction with relativity in the Bell correlations arises because we are trying to fit these correlations into a model based on time evolution, which forces us to pick a reference frame on which the correlations take effect. If we stop trying to do that, much of the tension with relativity goes away, so we end up with a model which is non-local but still entirely in line with the underlying principles of special relativity. Thus introducing retrocausality in the all-at-once sense offers a very interesting route to reconciling the Bell experiments with relativity, and we are just beginning to explore the implications of this possibility for our ideas about time, causation and gravity.

More:

In the quantum world, the future causes the past | Emily Adlam - IAI

Read More..

Harmonics Rewrite the Fundamental Equation for Superconducting Quantum Bits – SciTechDaily

Cryogenic microwave setup used for quantum device measurements. Credit: Qinu GmbH, qinu.de

Quantum bits can be described more precisely with the help of newly discovered harmonics, as a team of 30 researchers reports in Nature Physics.

Physicists have uncovered that Josephson tunnel junctions, the fundamental building blocks of superconducting quantum computers, are more complex than previously thought. Just like overtones in a musical instrument, harmonics are superimposed on the fundamental mode. As a consequence, corrections may lead to quantum bits that are 2 to 7 times more stable. The researchers support their findings with experimental evidence from multiple laboratories across the globe, including the University of Cologne, the École Normale Supérieure in Paris, and IBM Quantum in New York.

It all started in 2019, when Dennis Willsch and Dennis Rieger, two PhD students from FZJ and KIT at the time and joint first authors of the paper, were having a hard time understanding their experiments using the standard model for Josephson tunnel junctions. This model won Brian Josephson the Nobel Prize in Physics in 1973.

Excited to get to the bottom of this, the team led by Ioan Pop scrutinized further data from the École Normale Supérieure in Paris and a 27-qubit device at IBM Quantum in New York, as well as data from previously published experiments. Independently, researchers from the University of Cologne observed similar deviations of their data from the standard model.

"Fortunately, Gianluigi Catelani, who was involved in both projects and realized the overlap, brought the research teams together!" recalls Dennis Willsch from FZ Jülich. "The timing was perfect," adds Chris Dickel from the University of Cologne, "since, at that time, we were exploring quite different consequences of the same underlying problem."

Bottom part: By exciting superconducting circuits (yellow/blue) with microwave signals (red wiggly arrow), the researchers can analyze the fundamental equation that describes the Josephson tunnel junction of the circuit. Right part: The researchers have observed significant deviations (red curve) from the sinusoidal standard model (green curve). Left part: schematic zoom-in of a tunnel junction consisting of two superconductors (yellow/blue) with a thin insulating barrier in between. The large conduction channels (red loops) can be responsible for the observed deviations from the standard model. Credit: Dennis Rieger, Patrick Winkel

Josephson tunnel junctions consist of two superconductors with a thin insulating barrier in between and, for decades, these circuit elements have been described with a simple sinusoidal model.

However, as the researchers demonstrate, this standard model fails to fully describe the Josephson junctions that are used to build quantum bits. Instead, an extended model including higher harmonics is required to describe the tunneling current between the two superconductors. The same principle can be found in music: when the string of an instrument is struck, the fundamental frequency is overlaid by several harmonic overtones.
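Schematically, and using standard notation rather than the paper's exact parameterization (the relative sizes of the coefficients below are an illustrative assumption, not the fitted values), the correction replaces the purely sinusoidal current-phase relation with a sum over harmonics:

```latex
% Standard current-phase relation vs. extended model with Josephson harmonics
\[
  I(\varphi) \;=\; I_c \sin\varphi
  \qquad\longrightarrow\qquad
  I(\varphi) \;=\; \sum_{m \ge 1} I_m \sin(m\varphi),
  \qquad I_1 \approx I_c, \quad |I_{m>1}| \ll I_1 .
\]
```

Here φ is the superconducting phase difference across the junction and I_c the critical current; the higher terms sin(2φ), sin(3φ), ... play the role of the overtones in the musical analogy.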

"It's exciting that the measurements in the community have reached the level of accuracy at which we can resolve these small corrections to a model that has been considered sufficient for more than 15 years," Dennis Rieger remarks.

When the four coordinating professors - Ioan Pop from KIT and Gianluigi Catelani, Kristel Michielsen, and David DiVincenzo from FZJ - realized the impact of the findings, they brought together a large collaboration of experimentalists, theoreticians, and materials scientists to join their efforts in presenting a compelling case for the Josephson harmonics model. In the Nature Physics publication, the researchers explore the origin and consequences of Josephson harmonics.

"As an immediate consequence, we believe that Josephson harmonics will help in engineering better and more reliable quantum bits by reducing errors by up to an order of magnitude, which brings us one step closer towards the dream of a fully universal superconducting quantum computer," the two first authors conclude.

Reference: "Observation of Josephson harmonics in tunnel junctions" by Dennis Willsch, Dennis Rieger, Patrick Winkel, Madita Willsch, Christian Dickel, Jonas Krause, Yoichi Ando, Raphaël Lescanne, Zaki Leghtas, Nicholas T. Bronn, Pratiti Deb, Olivia Lanes, Zlatko K. Minev, Benedikt Dennig, Simon Geisert, Simon Günzler, Sören Ihssen, Patrick Paluch, Thomas Reisinger, Roudy Hanna, Jin Hee Bae, Peter Schüffelgen, Detlev Grützmacher, Luiza Buimaga-Iarinca, Cristian Morari, Wolfgang Wernsdorfer, David P. DiVincenzo, Kristel Michielsen, Gianluigi Catelani and Ioan M. Pop, 14 February 2024, Nature Physics. DOI: 10.1038/s41567-024-02400-8

See the original post:

Harmonics Rewrite the Fundamental Equation for Superconducting Quantum Bits - SciTechDaily

Read More..

UCalgary celebrates 25 years of quantum physics | News – University of Calgary

When we think of the word quantum, we often think of the future.

With cutting-edge initiatives like Quantum City, Calgary's new quantum innovation hub, and the Canadian government championing the field through the National Quantum Strategy, quantum technologies promise to become transformational in industries from energy development to finance to agriculture.

It's easy to look forward when we think of the word quantum, and we should. According to the Department of National Defence and Canadian Armed Forces Quantum Science and Technology Strategy, in less than 20 years the quantum sector's projected revenues are expected to match what the entire aerospace sector contributes to the national economy today.

Yet, to move forward with clarity and vision, we must remember our foundation. The University of Calgary has a rich history in quantum physics. From researching the nature of antimatter to developing nanotechnologies that exploit quantum physics, it's this very history that has brought us to where we are today: on the path to becoming a leader in the quantum revolution.

The university first became involved in the world of quantum through Dr. Richard Cleve, PhD, who was hired in the Department of Computer Science in 1990 and made seminal contributions to quantum information and quantum computing during his time at the university.

Rob Thompson in 2023.

Riley Brandt, University of Calgary

Dr. Robert Thompson, PhD, a professor of physics and associate vice-president (research), was the university's official quantum hire 25 years ago.

"I came here to do my job interview on June 1, 1998. When I left Houston, it was 104 degrees Fahrenheit, and when I arrived in Calgary there was frost on the ground. I loved it," says Thompson.

Thompson grew up in British Columbia. He completed his undergraduate degree at the University of British Columbia and his graduate studies at the University of Toronto, where he first got into quantum research, then limited to atomic, molecular and optical physics (AMO). After this, Thompson took a staff position at the Max Planck Institute for Quantum Optics in Munich, Germany, and then completed a one-year postdoctoral position at Rice University in Houston, Texas.

Why does Thompson study quantum physics? The answer is simple. He wants to decode the mystery.

"Quantum physics has this level of fascination because it's relatively young - it was discovered 115 years ago - and so the foundations of quantum are actually very close to the work we do in the lab. Those foundational principles we know are incomplete. We've solved some of the mysteries since I started in 1998, but there are pieces missing. It's that pursuit of understanding that fascinates me."

Things were different at UCalgary in 1998. Because quantum physics research was in its infancy, Thompson, who at the time was studying ion trap physics, found himself working alone both physically and in his field.

Rob in his office when he first started at the university in 1998.

The Thompson family

"It was interesting because the AMO group consisted solely of me," he says. "Plus, almost all of the other faculty members had their offices on the third, fifth or sixth floors, and there I was on the ground floor."

The ground floor: a perfect metaphor for how Thompson - and, soon, other quantum researchers - built up the quantum physics world at UCalgary.

In 1999, Thompson was joined by Dr. Nasser Moazzen-Ahmadi, PhD, in the AMO physics group. Both Thompson and Moazzen-Ahmadi asked the university for the same things: lab space, technical support and more than two faculty members. In 2002, Dr. David Feder, PhD, a theoretical physicist, joined the faculty.

"At that point, quantum technology was really starting to grow. That was what brought Barry Sanders to town in 2003. He was another theorist, a quantum information guy, and that really launched us on the quantum science and technology side," says Thompson.

Dr. Barry Sanders, PhD, who joined as an Alberta iCORE Professor, started the Institute for Quantum Information Science (now the Institute for Quantum Science and Technology), an organization that leads research in key theoretical and experimental topics, as well as Quantum Alberta, a network of academic and industry experts from across the province.

Several years after that, with Dr. Wolfgang Tittel, PhD, starting at UCalgary as an industrial research chair to build and translate quantum communication, quantum at UCalgary started to see a fundamental expansion toward knowledge translation. Quantum was no longer just about discovery. Quantum was a whole new ecosystem involving everything from foundational research to the commercialization of quantum technologies.

"When I started in this field, quantum physics was a pure science," says Thompson. "But as we discovered things, we started to discover tools that can be used more broadly in society, in industry and the corporate sector. We started to move on from 'Can I discover something and publish a paper about it?' to 'Can I work with a company or organization, or establish a company myself, to start to market some of this intellectual property?'"

In 2022, Quantum City was born. The initiative is building quantum-focused fabrication infrastructure, new talent development programs and commercialization and adoption pathways to support the development of a vibrant economic and scientific hub right here in Calgary.

Erika Janitz.

Fritz Tolentino

Many of the Quantum City initiatives are what drew the university's latest quantum hire, Dr. Erika Janitz, PhD, to Calgary. Janitz started almost exactly 25 years after Thompson. An assistant professor in the Schulich School of Engineering, she works at the intersection of electrical engineering and physics to build diamond-based quantum technologies for long-distance communication and sensing. Specifically, Janitz's lab will look for atom-like defects in diamonds for building things like quantum networks and molecular sensors.

She loves her research because it satisfies her desire to understand how things work while also allowing her to build useful technologies.

"This field is attractive to me because you can do it all: develop the theory and perform simulations, design and build the experiment, take and analyze the data," says Janitz.

Janitz is also thrilled about Quantum City's forthcoming qLab: a state-of-the-art cleanroom facility that will be dedicated to serving as a dynamic and collaborative centre for quantum innovation.

"It's really exciting that we will have the necessary infrastructure here at the university for our research to lead on an international level," says Janitz.

She chose to make Calgary her home because of the widespread support and community when it comes to quantum research.

"It was important to me to work somewhere where it's clear that your research is a priority. Quantum has been identified as an important area on the institutional, provincial and national scales," she says. "It's a vibrant, collaborative ecosystem. You're not just an island, working alone."

Janitz hopes the excitement and traction we have now in Alberta leads to "a self-sustaining ecosystem, where we have cutting-edge infrastructure, leading talent and the ability to train talent locally."

Thompson, for his part, wants to see a healthy, vibrant quantum technology centre here in Alberta, for quantum to be a catalyst for diversifying the economy, and for the province to be the go-to place for a broad range of industries looking for quantum solutions.

As for the past 25 years, Thompson knows it's the very heart and soul of UCalgary that's allowed quantum physics to grow and thrive the way it has.

"The University of Calgary bills itself as an entrepreneurial university. I know sometimes that can come off as just lip service, but I've been here 25 years, and the opportunities that it's offered me, I don't think I would have seen anywhere else in the world. When you see what needs to be done, this institution will let you do it. It doesn't just let you. It helps you."

When Thompson walked into the Science B building in September 1998, there was no quantum physics group. And now? "It's half the building and we're taking over Earth Science."

"It's been a hell of a ride," he continues, "and UCalgary has facilitated it all."

Go here to see the original:

UCalgary celebrates 25 years of quantum physics | News - University of Calgary

Read More..

Contact Lost With Spacecraft Carrying Experimental Quantum Drive – Futurism

Dud on arrival. Quantum of Solace

A test involving a highly controversial propellantless propulsion system called the quantum drive has failed because the satellite it was aboard fell silent, The Debrief reports.

In a press release, Rogue Space Systems explained that its Barry-1 cubesat, ferried to orbit by a SpaceX Falcon 9 rocket, was plagued with ongoing power-system issues.

A test of the quantum drive, developed by IVO Limited, was supposed to demonstrate whether the engine could alter the orbit of the satellite. But for some reason, after over two months in orbit, the test was never initiated, and contact with the satellite was lost on February 9 - an unceremonious end for a demo that was supposedly going to upend the laws of physics.

"Rogue's Barry-1 satellite didn't make it all the way through LEOP (Launch and Early Orbit Phase)," IVO founder and President Richard Mansell told The Debrief. "Sadly, we never even got to turn on the Drives!"

Had the test actually been conducted, skeptical scientists - and there are many - believed that the quantum drive wouldn't have done anything. That's because it's a type of engine known as a "reactionless drive," in which a reaction mass - essentially the propellant - is not used to generate thrust.

According to Newtonian physics, such a device wouldn't work, because propulsion without a propellant is impossible. Nevertheless, IVO claimed that its quantum drive could generate an eyebrow-raising 52 millinewtons of thrust per watt of electricity - which would be a ludicrous twelve times more efficient than existing ion drives used on satellites, Forbes noted.
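To spell out the Newtonian objection - this is the textbook momentum-conservation identity, not anything from the article - the thrust of a free spacecraft is fixed by the rate at which it expels mass:

```latex
% Thrust from momentum conservation (Newtonian, differential form)
\[
  F \;=\; \frac{dp}{dt} \;=\; \dot{m}\, v_e ,
  \qquad\text{so}\qquad
  \dot{m} = 0 \;\;\Longrightarrow\;\; F = 0 .
\]
```

Here ṁ is the propellant mass flow rate and v_e its exhaust velocity. With no propellant expelled, ṁ = 0 and hence F = 0 - which is why a drive producing thrust from electricity alone would, as the article puts it, upend the laws of physics.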

It's an appealing idea. The problem with propellant is that the stuff runs out. It's also heavy, taking up a significant portion of a spacecraft's weight, which not only imposes design limitations but drives up launch costs.

And so there's been a lot of excitement and controversy over lab tests claiming that spacecraft could ditch propellant by harnessing spooky properties of quantum mechanics. But so far, none of these claims have been borne out in large-scale tests, let alone ones conducted in space (of which, according to The Debrief, IVO's test would have been the first).

Undeterred by the setback - and, seemingly, by the laws of physics - Mansell said that IVO will be conducting further orbital tests of its quantum drive.

"The overall configuration of the Drives will not change," he told The Debrief. "While waiting for the Barry-1 tests, we have been continuously working to improve the Drives. Those improvements will be part of the next set that goes to space."

It has to be said: it seems a little convenient that the cubesat went bust - after months of waiting in orbit - before the quantum drives could even be activated. But hey, maybe they'll remember to turn the thing on next time and, fingers crossed, even manage to prove physics wrong.

More on space: New Chinese Lander to Start Building Base From Moon Dust Bricks

Read the original here:

Contact Lost With Spacecraft Carrying Experimental Quantum Drive - Futurism

Read More..