
What God, Quantum Mechanics and Consciousness Have in Common – Scientific American

In my 20s, I had a friend who was brilliant, charming, Ivy-educated and rich, heir to a family fortune. I'll call him Gallagher. He could do anything he wanted. He experimented, dabbling in neuroscience, law, philosophy and other fields. But he was so critical, so picky, that he never settled on a career. Nothing was good enough for him. He never found love for the same reason. He also disparaged his friends' choices, so much so that he alienated us. He ended up bitter and alone. At least that's my guess. I haven't spoken to Gallagher in decades.

There is such a thing as being too picky, especially when it comes to things like work, love and nourishment (even the pickiest eater has to eat something). That's the lesson I gleaned from Gallagher. But when it comes to answers to big mysteries, most of us aren't picky enough. We settle on answers for bad reasons, for example, because our parents, priests or professors believe them. We think we need to believe something, but actually we don't. We can, and should, decide that no answers are good enough. We should be agnostics.

Some people confuse agnosticism (not knowing) with apathy (not caring). Take Francis Collins, a geneticist who directs the National Institutes of Health. He is a devout Christian, who believes that Jesus performed miracles, died for our sins and rose from the dead. In his 2006 bestseller The Language of God, Collins calls agnosticism a "cop-out." When I interviewed him, I told him I am an agnostic and objected to "cop-out."

Collins apologized. "That was a put-down that should not apply to earnest agnostics who have considered the evidence and still don't find an answer," he said. "I was reacting to the agnosticism I see in the scientific community, which has not been arrived at by a careful examination of the evidence." I have examined the evidence for Christianity, and I find it unconvincing. I'm not convinced by any scientific creation stories, either, such as those that depict our cosmos as a bubble in an oceanic multiverse.

People I admire fault me for being too skeptical. One is the late religious philosopher Huston Smith, who called me "convictionally impaired." Another is megapundit Robert Wright, an old friend, with whom I've often argued about evolutionary psychology and Buddhism. Wright once asked me in exasperation, "Don't you believe anything?" Actually, I believe lots of things, for example, that war is bad and should be abolished.

But when it comes to theories about ultimate reality, I'm with Voltaire. "Doubt is not a pleasant condition," Voltaire said, "but certainty is an absurd one." Doubt protects us from dogmatism, which can easily morph into fanaticism and what William James calls "a premature closing of our accounts with reality." Below I defend agnosticism as a stance toward the existence of God, interpretations of quantum mechanics and theories of consciousness. When considering alleged answers to these three riddles, we should be as picky as my old friend Gallagher.

THE PROBLEM OF EVIL

Why do we exist? The answer, according to the major monotheistic religions, including the Catholic faith in which I was raised, is that an all-powerful, supernatural entity created us. This deity loves us, as a human father loves his children, and wants us to behave in a certain way. If we're good, He'll reward us. If we're bad, He'll punish us. (I use the pronoun He because most scriptures describe God as male.)

My main objection to this explanation of reality is the problem of evil. A casual glance at human history, and at the world today, reveals enormous suffering and injustice. If God loves us and is omnipotent, why is life so horrific for so many people? A standard response to this question is that God gave us free will; we can choose to be bad as well as good.

The late, great physicist Steven Weinberg, an atheist who died in July, slaps down the free will argument in his book Dreams of a Final Theory. Noting that Nazis killed many of his relatives in the Holocaust, Weinberg asks: Did millions of Jews have to die so the Nazis could exercise their free will? That doesn't seem fair. And what about kids who get cancer? Are we supposed to think that cancer cells have free will?

On the other hand, life isn't always hellish. We experience love, friendship, adventure and heartbreaking beauty. Could all this really come from random collisions of particles? Even Weinberg concedes that life sometimes seems more beautiful than strictly necessary. If the problem of evil prevents me from believing in a loving God, then the problem of beauty keeps me from being an atheist like Weinberg. Hence, agnosticism.

THE PROBLEM OF INFORMATION

Quantum mechanics is science's most precise, powerful theory of reality. It has predicted countless experiments, spawned countless applications. The trouble is, physicists and philosophers disagree over what it means, that is, what it says about how the world works. Many physicists (most, probably) adhere to the Copenhagen interpretation, advanced by Danish physicist Niels Bohr. But that is a kind of anti-interpretation, which says physicists should not try to make sense of quantum mechanics; they should "shut up and calculate," as physicist David Mermin once put it.

Philosopher Tim Maudlin deplores this situation. In his 2019 book Philosophy of Physics: Quantum Theory, he points out that several interpretations of quantum mechanics describe in detail how the world works. These include the GRW model proposed by Ghirardi, Rimini and Weber; the pilot-wave theory of David Bohm; and the many-worlds hypothesis of Hugh Everett. But here's the irony: Maudlin is so scrupulous in pointing out the flaws of these interpretations that he reinforces my skepticism. They all seem hopelessly kludgy and preposterous.

Maudlin does not examine interpretations that recast quantum mechanics as a theory about information. For positive perspectives on information-based interpretations, check out Beyond Weird by journalist Philip Ball and The Ascent of Information by astrobiologist Caleb Scharf. But to my mind, information-based takes on quantum mechanics are even less plausible than the interpretations that Maudlin scrutinizes. The concept of information makes no sense without conscious beings to send, receive and act upon the information.

Introducing consciousness into physics undermines its claim to objectivity. Moreover, as far as we know, consciousness arises only in certain organisms that have existed for a brief period here on Earth. So how can quantum mechanics, if it's a theory of information rather than matter and energy, apply to the entire cosmos since the big bang? Information-based theories of physics seem like a throwback to geocentrism, which assumed the universe revolves around us. Given the problems with all interpretations of quantum mechanics, agnosticism, again, strikes me as a sensible stance.

MIND-BODY PROBLEMS

The debate over consciousness is even more fractious than the debate over quantum mechanics. How does matter make a mind? A few decades ago, a consensus seemed to be emerging. Philosopher Daniel Dennett, in his cockily titled Consciousness Explained, asserted that consciousness clearly emerges from neural processes, such as electrochemical pulses in the brain. Francis Crick and Christof Koch proposed that consciousness is generated by networks of neurons oscillating in synchrony.

Gradually, this consensus collapsed, as empirical evidence for neural theories of consciousness failed to materialize. As I point out in my recent book Mind-Body Problems, there is now a dizzying variety of theories of consciousness. Christof Koch has thrown his weight behind integrated information theory, which holds that consciousness might be a property of all matter, not just brains. This theory suffers from the same problems as information-based theories of quantum mechanics. Theorists such as Roger Penrose, who won last year's Nobel Prize in Physics, have conjectured that quantum effects underpin consciousness, but this theory is even more lacking in evidence than integrated information theory.

Researchers cannot even agree on what form a theory of consciousness should take. Should it be a philosophical treatise? A purely mathematical model? A gigantic algorithm, perhaps based on Bayesian computation? Should it borrow concepts from Buddhism, such as anatta, the doctrine of no self? All of the above? None of the above? Consensus seems farther away than ever. And that's a good thing. We should be open-minded about our minds.

So, what's the difference, if any, between me and Gallagher, my former friend? I like to think it's a matter of style. Gallagher scorned the choices of others. He resembled one of those mean-spirited atheists who revile the faithful for their beliefs. I try not to be dogmatic in my disbelief, and to be sympathetic toward those who, like Francis Collins, have found answers that work for them. Also, I get a kick out of inventive theories of everything, such as John Wheeler's "it from bit" and Freeman Dyson's principle of maximum diversity, even if I can't embrace them.

I'm definitely a skeptic. I doubt we'll ever know whether God exists, what quantum mechanics means, how matter makes mind. These three puzzles, I suspect, are different aspects of a single, impenetrable mystery at the heart of things. But one of the pleasures of agnosticism (perhaps the greatest pleasure) is that I can keep looking for answers and hoping that a revelation awaits just over the horizon.

This is an opinion and analysis article; the views expressed by the author or authors are not necessarily those of Scientific American.

Further Reading:

I air my agnostic outlook in my two most recent books, Mind-Body Problems, available for free online, and Pay Attention: Sex, Death, and Science.

See also my podcast Mind-Body Problems, where I talk to experts, including several mentioned above, about God, quantum mechanics and consciousness.


This Is Why Quantum Mechanics Isn’t Enough To Explain The Universe – Forbes

Going to smaller and smaller distance scales reveals more fundamental views of nature, which means that if we can understand and describe the smallest scales, we can build our way to an understanding of the largest ones.

Of all the revolutionary ideas that science has entertained, perhaps the most bizarre and counterintuitive one is the notion of quantum mechanics. Previously, scientists had assumed that the Universe was deterministic, in the sense that the laws of physics would enable you to predict with perfect accuracy how any system would evolve into the future. We assumed that our reductionist approach to the Universe (where we searched for the smallest constituents of reality and worked to understand their properties) would lead us to the ultimate knowledge of things. If we could know what things were made of and could determine the rules that governed them, nothing, at least in principle, would be beyond our ability to predict.

This assumption was quickly shown not to be true when it comes to the quantum Universe. When you reduce what's real to its smallest components, you find that you can divide all forms of matter and energy into indivisible parts: quanta. However, these quanta no longer behave in a deterministic fashion, but only in a probabilistic one. Even with that addition, however, another problem still remains: the effects that these quanta cause on one another. Our classical notions of fields and forces fail to capture the real effects of the quantum mechanical Universe, demonstrating the need for them to be somehow quantized, too. Quantum mechanics isn't sufficient to explain the Universe; for that, quantum field theory is needed. This is why.

Schematic animation of a continuous beam of light being dispersed by a prism. Note how the wave nature of light is both consistent with and a deeper explanation of the fact that white light can be broken up into differing colors. However, radiation doesn't occur continuously at all wavelengths and frequencies, but is quantized into individual energy packets: photons.

It's possible to imagine a Universe where nothing at all was quantum, and where there was no need for anything beyond the physics of the mid-to-late 19th century. You could divide matter into smaller and smaller chunks as much as you like, with no limit. At no point would you ever encounter a fundamental, indivisible building block; you could reduce matter down into arbitrarily small pieces, and if you had a sharp or strong enough divider at your disposal, you could always break it down even further.

In the early 20th century, however, this idea was shown to be incompatible with reality. Radiation from heated objects doesn't get emitted at all frequencies, but rather is quantized into individual packets each containing a specific amount of energy. Electrons can only be ionized by light whose wavelength is shorter (or frequency is higher) than a certain threshold. And particles emitted in radioactive decays, when fired at a thin piece of gold foil, would occasionally ricochet back in the opposite direction, as though there were hard chunks of matter in there that those particles couldn't pass through.

If atoms had been made of continuous structures, then all the particles fired at a thin sheet of gold would be expected to pass right through it. The fact that hard recoils were seen quite frequently, even causing some particles to bounce back from their original direction, helped illustrate that there was a hard, dense nucleus inherent to each atom.

The overwhelming conclusion was that matter and energy couldn't be continuous, but rather were divisible into discrete entities: quanta. The original idea of quantum physics was born with this realization that the Universe couldn't be entirely classical, but rather could be reduced into indivisible bits which appeared to play by their own, sometimes bizarre, rules. The more we experimented, the more of this unusual behavior we uncovered.

These discoveries didn't just pose philosophical problems, but physical ones as well. For example, there's an inherent uncertainty relationship between the position and the momentum of any quantum of matter or energy. The better you measure one, the more inherently uncertain the other one becomes. In other words, positions and momenta can't be considered to be solely a physical property of matter, but they must be treated as quantum mechanical operators, yielding only a probability distribution of outcomes.
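The uncertainty relation described above can be checked numerically. Below is a minimal sketch, not from the article, showing that a Gaussian wave packet saturates the Heisenberg bound Δx·Δp = ħ/2; it works in natural units where ħ = 1, and the grid size and packet width are arbitrary assumed choices.

```python
import numpy as np

# Numerical sketch (not from the article): a Gaussian wave packet saturates
# the Heisenberg bound, delta_x * delta_p = hbar / 2. Work in natural units
# where hbar = 1, so momentum equals wavenumber k.
N = 4096
x = np.linspace(-40.0, 40.0, N)
dx = x[1] - x[0]
sigma = 1.7                                        # arbitrary packet width
psi = np.exp(-x**2 / (4 * sigma**2))
psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize to unit probability

# Position spread: sqrt(<x^2> - <x>^2)
prob_x = np.abs(psi)**2
mean_x = np.sum(x * prob_x) * dx
delta_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x) * dx)

# Momentum spread from the Fourier transform of the wavefunction
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = 2 * np.pi / (N * dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
prob_p = np.abs(phi)**2
mean_p = np.sum(k * prob_p) * dk
delta_p = np.sqrt(np.sum((k - mean_p)**2 * prob_p) * dk)

print(delta_x * delta_p)  # ~0.5, i.e. hbar/2
```

Narrowing the packet (smaller sigma) shrinks delta_x but inflates delta_p by the same factor, which is the trade-off the text describes.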

Trajectories of a particle in a box (also called an infinite square well) in classical mechanics (A) and quantum mechanics (B-F). In (A), the particle moves at constant velocity, bouncing back and forth. In (B-F), wavefunction solutions to the Time-Dependent Schrödinger Equation are shown for the same geometry and potential. The horizontal axis is position, the vertical axis is the real part (blue) or imaginary part (red) of the wavefunction. (B,C,D) are stationary states (energy eigenstates), which come from solutions to the Time-Independent Schrödinger Equation. (E,F) are non-stationary states, solutions to the Time-Dependent Schrödinger Equation. Note that these solutions are not invariant under relativistic transformations; they are only valid in one particular frame of reference.

Why would this be a problem?

Because these two quantities, measurable at any instant in time that we so choose, have a time-dependence. The positions that you measure or the momenta that you infer a particle possesses will change and evolve with time.

That would be fine on its own, but then there's another concept that comes to us from special relativity: the notion of time is different for different observers, so the laws of physics that we apply to systems must remain relativistically invariant. After all, the laws of physics shouldn't change just because you're moving at a different speed, in a different direction, or are at a different location from where you were before.

As originally formulated, quantum physics was not a relativistically invariant theory; its predictions were different for different observers. It took years of developments before the first relativistically invariant version of quantum mechanics was discovered, which didn't happen until the late 1920s.

Different frames of reference, including different positions and motions, would see different laws of physics (and would disagree on reality) if a theory is not relativistically invariant. The fact that we have a symmetry under 'boosts,' or velocity transformations, tells us we have a conserved quantity: linear momentum. This is much more difficult to comprehend when momentum isn't simply a quantity associated with a particle, but is rather a quantum mechanical operator.

If we thought the predictions of the original quantum physics were weird, with their indeterminism and fundamental uncertainties, a whole slew of novel predictions emerged from this relativistically invariant version, among them the existence of negative energy states.

Later on, those negative energy states were identified with an equal-and-opposite set of quanta that were shown to exist: antimatter counterparts to the known particles. It was a great leap forward to have a relativistic equation that described the earliest known fundamental particles, such as the electron, positron, muon, and more.

However, it couldn't explain everything. Radioactive decay was still a mystery. The photon had the wrong particle properties, and this theory could explain electron-electron interactions but not photon-photon interactions. Clearly, a major component of the story was still missing.

Electrons exhibit wave properties as well as particle properties, and can be used to construct images or probe particle sizes just as well as light can. Here, you can see the results of an experiment where electrons are fired one-at-a-time through a double-slit. Once enough electrons are fired, the interference pattern can clearly be seen.

Here's one way to think about it: imagine an electron traveling through a double slit. If you don't measure which slit the electron goes through (and for these purposes, assume that we don't), it behaves as a wave: part of it goes through both slits, and those two components interfere to produce a wave pattern. The electron is somehow interfering with itself along its journey, and we see the results of that interference when we detect the electrons at the end of the experiment. Even if we send those electrons one-at-a-time through the double slit, that interference property remains; it's inherent to the quantum mechanical nature of this physical system.

Now ask yourself a question about that electron: what happens to its electric field as it goes through the slits?

Previously, quantum mechanics had replaced our notions of quantities like the position and momentum of particles (which had previously been simply quantities with values) with what we call quantum mechanical operators. These mathematical functions operate on quantum wavefunctions, and produce a probabilistic set of outcomes for what you might observe. When you make an observation (which really just means when you cause that quantum to interact with another quantum whose effects you then detect), you only recover a single value.

If you have a point charge and a metal conductor nearby, it's an exercise in classical physics alone to calculate the electric field and its strength at every point in space. In quantum mechanics, we discuss how particles respond to that electric field, but the field itself is not quantized as well. This seems to be the biggest flaw in the formulation of quantum mechanics.

But what do you do when you have a quantum that's generating a field, and that quantum itself is behaving as a decentralized, non-localized wave? This is a very different scenario than what we've considered in either classical physics or in quantum physics so far. You can't simply treat the electric field generated by this wave-like, spread-out electron as coming from a single point, and obeying the classical laws of Maxwell's equations. If you were to put another charged particle down, such as a second electron, it would have to respond to whatever weird sort of quantum-behavior this quantum wave was causing.

Normally, in our older, classical treatment, fields push on particles that are located at certain positions and change each particle's momentum. But if the particle's position and momentum are inherently uncertain, and if the particle(s) that generate the fields are themselves uncertain in position and momentum, then the fields themselves cannot be treated in this fashion: as though they're some sort of static background that the quantum effects of the other particles are superimposed atop.

If we do, we're short-changing ourselves, inherently missing out on the quantum-ness of the underlying fields.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. Whether space (or time) itself is discrete or continuous is not yet decided, as is the question of whether gravity is quantized at all, or particles, as we know them today, are fundamental or not. But if we hope for a fundamental theory of everything, it must include quantized fields.

This was the enormous advance of quantum field theory, which didn't just promote certain physical properties to being quantum operators, but promoted the fields themselves to being quantum operators. (This is also where the idea of "second quantization" comes from: not just the matter and energy are quantized, but the fields as well.) All of a sudden, treating the fields as quantum mechanical operators enabled an enormous number of phenomena that had already been observed to finally be explained.

With quantum field theory, all of these phenomena now made sense, and many other related ones could now be predicted, including the very exciting modern disagreement between the experimental results for the muon's magnetic moment and two different theoretical methods of calculating it: a non-perturbative one, which agrees with experiment, and a perturbative one, which doesn't.

The Muon g-2 electromagnet at Fermilab, ready to receive a beam of muon particles. This experiment began in 2017 and continues to take data, having reduced the uncertainties in the experimental values significantly. Theoretically, we can compute the expected value perturbatively, through summing Feynman diagrams, getting a value that disagrees with the experimental results. The non-perturbative calculations, via Lattice QCD, seem to agree, however, deepening the puzzle.

One of the key things that comes along with quantum field theory that simply wouldn't exist in normal quantum mechanics is the potential to have field-field interactions, not just particle-particle or particle-field interactions. Most of us can accept that particles will interact with other particles, because we're used to two things colliding with one another: a ball smashing against a wall is a particle-particle interaction. Most of us can also accept that particles and fields interact, like when you move a magnet close to a metallic object, the field attracts the metal.

Although it might defy your intuition, the quantum Universe doesn't really pay any mind to what our experience of the macroscopic Universe is. It's much less intuitive to think about field-field interactions, but physically, they're just as important. Without them, you couldn't have phenomena such as photon-photon interactions.

When a nucleus experiences a double beta decay, two electrons and two neutrinos get emitted conventionally. If neutrinos obey this see-saw mechanism and are Majorana particles, neutrinoless double beta decay should be possible. Experiments are actively looking for this.

The Universe, at a fundamental level, isn't just made of quantized packets of matter and energy, but the fields that permeate the Universe are inherently quantum as well. It's why practically every physicist fully expects that, at some level, gravitation must be quantized as well. General Relativity, our current theory of gravity, functions in the same way that an old-style classical field does: it curves the backdrop of space, and then quantum interactions occur in that curved space. Without a quantized gravitational field, however, we can be certain we're overlooking quantum gravitational effects that ought to exist, even if we aren't certain of what all of them are.

In the end, we've learned that quantum mechanics is fundamentally flawed on its own. That's not because of anything weird or spooky that it brought along with it, but because it wasn't quite weird enough to account for the physical phenomena that actually occur in reality. Particles do indeed have inherently quantum properties, but so do fields: all of them relativistically invariant. Even without a current quantum theory of gravity, it's all but certain that every aspect of the Universe, particles and fields alike, is itself quantum in nature. What that means for reality, exactly, is something we're still trying to puzzle out.


The Quantum Theory of Light Transformed Physics as We Know It – Interesting Engineering

1905 is referred to as the "miracle year" by physicists. In that one year, Albert Einstein published four papers that laid the foundations of modern physics.

One of the major breakthroughs proposed by Einstein in 1905 was the quantum theory of light, which posited that light is made up of small particles, known as photons, and these quantum particles have the ability to show wave-like properties.

From laser technology to television screens, there are many inventions that would have never been possible without the knowledge imparted through Einstein's theory. It not only transformed the domain of quantum mechanics but also influenced various other branches of science.

Scientists began to explore the various properties of light as early as the 17th century, in order to understand the behavior, motion, and origin of light and develop ways to use this knowledge.

Sir Isaac Newton's corpuscular theory argued against Christiaan Huygens' theory, which stated that light was made of waves, by suggesting that the geometric nature of reflection and refraction of light could only be explained if light were made up of particles. He referred to these particles as corpuscles. Newton proposed that every time light rays strike a surface, corpuscles are reflected back, and that the density of a medium affects the velocity of light.

Contrary to Newton, Dutch mathematician Christiaan Huygens argued that light is made up of waves that propagate in a perpendicular fashion with respect to the direction of light. He further explained that every point that a luminous disturbance meets turns into a source of the wave itself. A new wave is then determined by the sum of the secondary waves that result from the disturbance. Huygens' principle was introduced in 1678 to explain the reflection and refraction caused by light rays.

Many years later, in 1801, British scientist Thomas Young conducted his 'double-slit experiment', which validated Huygens' findings on the wave-like behavior of light.

In Young's experiment, a beam of light from a single source was split into two beams, and the two beams were then recombined and superimposed onto a screen, resulting in a pattern of light and dark fringes on the screen. Young concluded that the fringes resulted from the fact that when the beams recombined, their peaks and troughs were not always in phase. When two peaks coincide, they reinforce each other, and a line of light results; when a peak and a trough coincide, they cancel each other, and a dark line results.

The formation of the resultant wave or interference pattern by the superimposition of two waves was referred to as interference.

The double-slit experiment produced evidence contrary to Newton's corpuscular theory, and it was the first practical proof of the wave theory of light. Thomas Young mentioned the experiment in Lecture 39 of his famous book A Course of Lectures on Natural Philosophy and the Mechanical Arts.
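Young's light-and-dark fringes can also be quantified: bright fringes occur where the path difference between the two beams is a whole number of wavelengths, d·sin(θ) = mλ, and for small angles the fringe spacing on the screen is λL/d. A hedged sketch, with all numeric values assumed for illustration rather than taken from the article:

```python
# Hedged sketch of the fringe geometry in Young's experiment. Bright fringes
# satisfy d * sin(theta) = m * wavelength; for small angles, adjacent bright
# lines on a screen at distance L are separated by wavelength * L / d.
# All numeric values below are assumptions for illustration.
wavelength = 550e-9   # green light, in meters
slit_sep = 0.20e-3    # slit separation d, in meters
screen_dist = 1.0     # slit-to-screen distance L, in meters

fringe_spacing = wavelength * screen_dist / slit_sep
print(f"fringe spacing = {fringe_spacing * 1e3:.2f} mm")  # 2.75 mm
```

Millimeter-scale fringes from a sub-micron wavelength is what made the pattern visible to Young with early-1800s equipment.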

In the years that followed, French engineer Augustin-Jean Fresnel's findings on diffraction, the phenomenon by which light spreads when passed through a narrow aperture, also confirmed the relevance of the double-slit experiment.

James Clerk Maxwell formulated the theory that electric and magnetic fields propagate with the speed of light, and concluded that light is an electromagnetic (EM) wave. He also predicted the presence of the numerous EM waves that form the electromagnetic spectrum.

According to Maxwell's wave theory of light:

ν = c/λ

where ν = frequency, c = speed of light, and λ = wavelength.
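As a quick numerical check of the frequency-wavelength relation, with an assumed wavelength (the 633 nm helium-neon laser line, not a value from the article):

```python
# Quick check of nu = c / lambda, with an assumed wavelength (the 633 nm
# helium-neon laser line, not a value from the article).
c = 2.998e8            # speed of light, m/s
wavelength = 633e-9    # meters
frequency = c / wavelength
print(f"frequency = {frequency:.3e} Hz")  # ~4.7e14 Hz, visible red light
```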

Later, in 1886, Heinrich Hertz built a spark-gap transmitter, composed of an induction coil and a Leyden jar (a capacitor), to create electromagnetic waves, and a spark gap between two brass spheres to detect them. Using this apparatus, he detected radio waves (which also traveled at the speed of light). Hertz's experiment proved the existence of the EM waves proposed by Maxwell.

In 1900, Max Planck postulated that the energy of light is emitted in the form of small packets of energy called quanta, and that the energy of each quantum is directly proportional to its frequency. Planck won the Nobel Prize in 1918 for his work, which also set the stage for the development of quantum mechanics.

The notion that, like matter, light exists in the form of both particle and wave was further explained by Einstein and Louis de Broglie.

The emission of photoelectrons from a metal surface when light strikes the metal is called the photoelectric effect. The electrons released during this process are called photoelectrons and their emission is influenced by the frequency of the incident beam of light.

The photoelectric effect was first observed in 1887 by Heinrich Hertz, who noticed the occurrence of electric charge in a cathode ray tube when UV light hit the cathode. In 1897, physicist J.J. Thomson performed a cathode-ray tube experiment, which led to the discovery of electrons. Thomson also proposed the plum pudding model of the atom, in which negatively-charged electrons were embedded like raisins within a positively-charged "plum pudding".

The photoelectric effect was explained in detail by Albert Einstein in 1905, when he proposed that light is made of tiny particles called photons (previously called quanta), with the energy of a photon given as

E = hν (Planck's equation) or E = hc/λ

where E = energy of a photon, h = Planck's constant (6.626 × 10⁻³⁴ m² kg/s), ν = frequency of incident light, λ = wavelength of light, and c = speed of light in vacuum.

The minimum amount of energy required by an electron to leave the metal surface is referred to as threshold energy, and the minimum value of frequency of light that is sufficient to cause the photoemission of an electron is called threshold frequency.

Φ = hν_th

Φ = hc/λ_th

where Φ = threshold energy, ν_th = threshold frequency, and λ_th = threshold wavelength.

The photoelectric effect follows the law of conservation of energy, which states that energy can neither be created nor destroyed. The energy of a photon is equal to the sum of the energy required to emit an electron and the kinetic energy of the emitted electron.

hν = W + E

where h = Planck's constant, ν = frequency of the incident photon, W = work function (the minimum photon energy required to liberate an electron from a substance), and E = maximum kinetic energy of ejected electrons (½mv²).
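Plugging representative numbers into the photoelectric energy balance gives a feel for the scales involved. The wavelength and work function below are assumptions for illustration (the work function is roughly that of cesium), not figures from the article:

```python
# Hedged illustration of h*nu = W + E for the photoelectric effect. The
# wavelength and work function are assumed values (work function roughly
# that of cesium), not figures from the article.
h = 6.626e-34          # Planck's constant, J*s
c = 2.998e8            # speed of light, m/s
eV = 1.602e-19         # joules per electron-volt

wavelength = 400e-9        # incident violet light, meters
work_function_eV = 2.1     # approximate work function of cesium, eV

photon_energy_eV = h * c / wavelength / eV
max_ke_eV = photon_energy_eV - work_function_eV
print(f"photon energy = {photon_energy_eV:.2f} eV")  # ~3.10 eV
print(f"max electron KE = {max_ke_eV:.2f} eV")       # ~1.00 eV
```

If the photon energy dropped below the 2.1 eV work function (i.e., the wavelength exceeded the threshold wavelength), no electrons would be emitted at all, no matter how intense the light.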

The photoelectric effect not only validated the particle nature of light but also strengthened the possibility of photons acting as a wave (since Einstein's equation involved both frequency and wavelength). In 1921, Albert Einstein was awarded the Nobel Prize in Physics for his exceptional work on the photoelectric effect and the quantum theory of light.

De Broglie put forward the idea that wave-particle duality is not a special case for light but the fundamental nature of energy and matter alike. In 1924, he combined Einstein's mass-energy relation from special relativity with Planck's equation for energy to reveal the wave nature of matter.

E = mc²

E = hν

mc² = hν

mc = hν/c = p

here, p = momentum

Now, we know that frequency and wavelength share an inverse relationship:

λ = c/ν

p = h/λ

λ = h/p = h/mv

here,
λ = de Broglie wavelength
v = velocity of the particle

In his theory, de Broglie explained that λ = h/mv demonstrates the wave nature of particles. He concluded that if a wave can show particle behavior, then a particle should also be able to exhibit the properties of a wave.
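The relation λ = h/mv is easy to evaluate. A minimal Python sketch, where the electron speed (3 × 10⁶ m/s, about 1% of c) is an illustrative assumption:

```python
# De Broglie relation: lambda = h / (m * v).
# The electron speed (3e6 m/s, ~1% of c) is an illustrative assumption.
H = 6.626e-34           # Planck's constant, J*s
M_ELECTRON = 9.109e-31  # electron rest mass, kg

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength (m) of a particle with the given mass and speed."""
    return H / (mass_kg * speed_m_s)

# An electron at 3e6 m/s has a wavelength of a few tenths of a nanometer,
# comparable to atomic spacings -- the reason electron diffraction works.
print(de_broglie_wavelength(M_ELECTRON, 3.0e6))
```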

More than 100 years have passed since the quantum theory of light was introduced, yet it remains so relevant that many modern discoveries and inventions rest on its underlying principles.

From cosmology to holograms, our understanding of light has changed the world in numerous ways.

Read the rest here:

The Quantum Theory of Light Transformed Physics as We Know It - Interesting Engineering


Quantitative hyperspectral coherent diffractive imaging spectroscopy of a solid-state phase transition in vanadium dioxide – Science Advances

Abstract

Solid-state systems can host a variety of thermodynamic phases that can be controlled with magnetic fields, strain, or laser excitation. Many phases that are believed to exhibit exotic properties only exist on the nanoscale, coexisting with other phases that make them challenging to study, as measurements require both nanometer spatial resolution and spectroscopic information, which are not easily accessible with traditional x-ray spectromicroscopy techniques. Here, we use coherent diffractive imaging spectroscopy (CDIS) to acquire quantitative hyperspectral images of the prototypical quantum material vanadium oxide across the vanadium L2,3 and oxygen K x-ray absorption edges with nanometer-scale resolution. We extract the full complex refractive indices of the monoclinic insulating and rutile conducting phases of VO2 from a single sample and find no evidence for correlation-driven phase transitions. CDIS will enable quantitative full-field x-ray spectromicroscopy for studying phase separation in time-resolved experiments and other extreme sample environments where other methods cannot operate.

The rich diversity of thermodynamic phases in solid-state systems results in a wide range of properties and behaviors, and the study of phase diagrams (the behavior of the system under different thermodynamic conditions) constitutes a major branch of investigation in condensed matter physics (1). In complex materials, many different phases can emerge through the interplay of the spin, charge, and lattice degrees of freedom, and these different phases can coexist at the nanoscale. When under extreme stimuli such as high magnetic fields and temperatures or exposed to ultrafast laser excitation, previously unidentified material phases can emerge that have unexpected and potentially useful properties; for instance, room temperature superconductivity now famously emerges at extremely high pressures (2). In materials showing phase coexistence, these hidden phases may emerge only at the nanoscale, with the presence of other phases either intrinsically competing with or even being necessary for the stability of the novel phase. These nanoscale phases have been posited to exist in a wide range of materials including prototypical systems such as the cuprates (3–5) and manganites (6, 7) but are perhaps most famously proposed to exist in the vanadates. Vanadates form one of the most widely studied groups of quantum materials because of their prototypical insulator-to-metal (IMT) phase transition (8) and their catalytic properties (9, 10).

Many experiments in the vanadates have suggested that novel correlated phases can exist in equilibrium on the nanoscale close to the IMT, which are different from those found in the bulk. For example, both V2O3 (11) and VO2 (12, 13) have been reported to exhibit correlation-driven nanoscale metallic phases below the critical temperature (Tc). VO2 in particular is also claimed to exhibit a nonequilibrium nanoscale phase after optical excitation (14–16). Many of these claims do not rest on direct observation of the phase but infer it from combining multiple techniques. Such an approach has been shown to be unreliable, and the presence of these hidden phases remains unproven (17). Thus, understanding emergent phases in these complex and spatially inhomogeneous materials requires nanoscale imaging methods that are compatible with the extreme conditions under which they are generated and can return information on the local spin, charge, and lattice state. While electron diffraction has been shown to measure dynamical phase coexistence at the nanoscale (18), multiple probes are needed to determine the properties of materials; in particular, metal-insulator transitions may not show a structural change. Another promising method is x-ray spectromicroscopy, which leverages the power of x-ray spectroscopy to provide the sensitivity to electronic, chemical, and bond-angle makeup in nanoscale systems necessary to understand nanoscale phase separation (19, 20). X-ray spectroscopy is already widely used to study phase transitions because of this sensitivity (12, 21–23), and x-ray spectromicroscopy is already used to study emergent phase coexistence at the nanoscale in some systems (24). However, x-ray imaging methods are limited by available x-ray optics, restricting their spatial resolution, bandwidth, and sample geometries (19).
In particular, conventional x-ray optics need to be very close (~1 mm) to the sample to achieve nanometer resolution, hampering the use of high magnetic fields or laser excitation. To surpass these limits, coherent scattering methods such as coherent diffractive imaging (CDI) and ptychography have been developed, which do not rely on the x-ray optics but rather on the coherence of the beam to achieve high spatial resolution (25, 26). Coherent imaging methods can achieve diffraction-limited resolution while having the advantage of returning the full complex amplitude and phase of the sample. Coherent imaging methods work by numerically inverting the captured scattering pattern of the sample, and a variety of geometries are possible that are compatible both with time-resolved measurements (27, 28) and with samples in complex environments (25, 29). The important role that coherent scattering imaging methods could have in understanding phase transitions at the nanoscale has been previously recognized (26), but imaging is only part of the story. Methods that can also identify the properties of nanophases are also needed, and to date, no measurements discriminating different solid-state phases have been demonstrated.

Measuring nanoscale solid-state phases presents new challenges for coherent imaging methods. First, we note that while solid-to-solid phase transitions can modulate the density of the material, the density-length product is unchanged, and this quantity is the relevant value for imaging of two-dimensional (2D) objects. Hard x-ray coherent imaging can directly measure changes in structure and density (26), but many phase transitions show either no or marginal changes in structure; measuring the minor structural changes associated with some phase transitions is challenging even for large macroscopic crystals where the sensitivity is intrinsically many orders of magnitude larger than for nanoscale imaging methods (30). Thus, large density-related changes cannot be expected to provide a strong contrast mechanism as in other systems commonly studied with coherent imaging (31, 32). As already indicated, an alternative route is to use the sensitivity of x-ray spectroscopy to changes in the local electronic and nuclear structure to determine the phase makeup. Coherent imaging has been combined with x-ray spectroscopy in the past, most notably with x-ray ptychography where spectral information has been used to measure chemical makeup and charge states (33–36), although dichroic CDI measurements mapping the presence of a particular element or magnetic domains are common as well (37–39). Dichroic measurements rely on a priori knowledge of the material properties and usually leverage a strong contrast mechanism; changes of 15% are common for magnetic dichroism, while the presence or lack thereof of an element can easily change the scattering probability by orders of magnitude when tuned to an absorption edge.
Measurement of charge states (also commonly referred to as oxidization or valency states) on the other hand is conceptually very similar to the measurement of phases, as in both cases the different states manifest as changes in the x-ray absorption spectrum (XAS) due to local alterations of the electronic structure. However, discriminating and identifying phases can be more challenging than charge states; charge states are identifiable through the chemical shift of the absorption edge and exhibit other large changes in the XAS, while phase transitions can result in much more subtle changes distributed across the full absorption spectrum. Particularly for phases that emerge only at the nanoscale, the ability to perform truly quantitative measurements of the XAS becomes critically important, as each portion of the spectrum reports on different aspects of the material properties.

Here, we present the first quantitative x-ray spectromicroscopy of solid-state phases using coherent imaging. We extend CDI to a full spectromicroscopy method, CDI spectroscopy (CDIS), capable of returning the full spectroscopic and nanometer-scale spatial information from a variety of samples. In a geometry compatible with ultrafast laser excitation and high magnetic fields, we acquire hyperspectral images of a vanadium oxide thin film with 25-nm spatial and 0.25-eV spectral resolution across the vanadium L2,3 and oxygen K-edges. We note a previously unappreciated ambiguity of coherent imaging methods in the determination of the sample plane that leads to distortions of the extracted XAS spectrum and outline a procedure for finding the correct object plane, allowing us to recover the complex refractive index of the sample. We show that our sample is a heterogeneous mixture of approximately 80% VO2 and 20% V2O5 using this data. We then heat the sample to observe the phase transition from the monoclinic insulating (M) to rutile conducting (R) phase in the VO2 and extract the full amplitude and phase for all three different states simultaneously. We see no evidence of any intermediate phases, either insulating or metallic, and show that a previously observed intermediate phase we measured with x-ray holography resulted from the ambiguity in object plane. Our results pave the way for the quantitative study of novel nanoscale solid-state phases in a wide range of cases inaccessible to current x-ray spectromicroscopy methods.

A 75-nm-thick film of nominally vanadium oxide, masked to provide the finite spatial extent needed for numerical inversion, was illuminated with synchrotron radiation and the resulting diffraction pattern recorded on a charge-coupled device (CCD) detector (Fig. 1). The nearest beamline optic to the system is several meters away; hence, the method is compatible with laser excitation and strong magnetic fields, along with other environments. The required target stability is low, allowing us to heat the sample without the need for any active stabilization. Reference holes in the mask provide an absolute phase reference for the input wave, allowing us to quantitatively extract both components of the complex refractive index n = 1 − δ + iβ and not just the relative phase shift between different parts of the sample. While these references also permit a Fourier transform holography (FTH) analysis of the sample (40), substantial low-frequency noise in the reconstruction prevents quantitative spectroscopy (section S1 and fig. S1), and a full CDI analysis is found to be necessary.

A tunable synchrotron x-ray radiation source illuminates the sample, and the scattered radiation is collected on a CCD camera. Holographic reference holes in the sample mask provide an absolute phase reference. Long and short exposures are combined to yield a high dynamic range diffraction pattern. Images are recorded at a range of photon energies across the relevant absorption edges; three representative amplitude images are shown.

Although there has been an increasing push for coherent imaging methods that can operate with broadband illumination, spectral scanning of the x-ray wavelength is still required for identification of solid phases from their absorption spectra. In particular, recent broadband CDI measurements require negligible spectral structure in the sample (41), while proposed hybrid scanning-CDI methods return only coarse spectral structure (42); thus, neither can return the XAS spectrum required for phase identification. In our approach, no chromatic elements are used, allowing us to scan the x-ray photon energy from 510 to 535 eV without any adjustments in the optical layout between images, and 101 images were taken in 0.25-eV steps of photon energy. Further details on the sample and experimental methods are given in Materials and Methods.

We invert the diffraction patterns at each photon energy independently to generate real-space images using an iterative phase reconstruction algorithm robust to partial coherence (see Materials and Methods). Concatenating these images leads to a hyperspectral image where each pixel contains the full amplitude and phase of the local XAS spectrum or the local complex refractive index. Extraction of quantitative spectral information from coherent imaging methods is, however, complicated by the fact that coherent imaging methods return a complex real-space image, which can be numerically propagated to other planes. This propagation is necessary as the initial image will be at a particular plane in space, which may or may not correspond to the actual object plane. For instance, in holography, this is the plane of the mask, while in CDI, it is set by the applied spatial constraint: the plane of the aperture in a sample with a mask or the plane of the smallest spatial size for an isolated object (43). Away from the correct plane, the propagation leads to coupling between the amplitude and phase channels, as described by the mathematics of optical or contrast transfer functions (44, 45). This amplitude-phase coupling is perhaps most famously illustrated in the use of propagation to generate shadow images of phase-only objects (46). To extract quantitative information such as the and of the sample in a 2D geometry, it is thus critical to analyze features in the object plane.

The sample plane is most often selected by choosing a feature and propagating until it is in focus, which is to say when it forms the sharpest image. In noncoherent methods, this procedure is generally well defined, but in coherent imaging methods, there is an inherent ambiguity in choosing whether to bring the amplitude or phase component into focus. For instance, an amplitude-only object that was propagated until the phase component came into focus would map to the wrong plane and create an apparent but artificial phase component to the sample, as shown in Fig. 2; the inverse relationship also holds. If only one contrast mechanism (amplitude or phase) is consequential and is known a priori, then the propagation is performed to bring the correct component into focus. Alternatively, and more generally, the artificial propagation-induced amplitude or phase components flip sign as they are propagated through the object plane, while the real components maintain their physically meaningful sign. This can be seen from a Huygens principle view of a source plane, where each point acts as a source of a spherical wavelet; propagation through the object plane converts each outgoing spherical wave into an incoming one, inverting the sign. These principles allow us to identify the correct sample plane and extract quantitative information. We have combined both approaches here. Since a normal Lorentzian absorption peak introduces no phase shift at the peak of the absorption, we propagated the image until the amplitude component came into focus at photon energies corresponding to peaks of the absorption spectrum, where the contrast is dominantly in the amplitude channel, and verified that the phase flipped sign around this point. The propagation required is the same for all wavelengths as the iterative reconstruction algorithms reconstruct all wavelengths at the constraint (mask) plane.
How widespread this issue is in previous measurements is unclear as physically plausible images are returned away from this plane, but it appears to not have been noted previously.

Amplitude (A) and phase (φ) of an amplitude-only object imaged before, at, and after the object plane. Propagation causes features in the amplitude channel to appear in the phase channel. The artificial propagation-induced phase changes sign through the real focus at the object plane.
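This propagation-induced amplitude-phase coupling and the sign flip through the object plane can be reproduced numerically. The following 1D sketch is not the authors' code; the grid, wavelength, feature size, and defocus distances are illustrative assumptions:

```python
import numpy as np

def angular_spectrum(field, dx, wavelength, z):
    """Propagate a 1D complex field by distance z (Fresnel transfer function)."""
    fx = np.fft.fftfreq(field.size, d=dx)
    kernel = np.exp(-1j * np.pi * wavelength * z * fx**2)
    return np.fft.ifft(np.fft.fft(field) * kernel)

n, dx, wl = 1024, 20e-9, 2.4e-9                      # soft-x-ray-like numbers
x = (np.arange(n) - n // 2) * dx
obj = np.exp(1j * 0.1 * np.exp(-(x / 200e-9) ** 2))  # weak phase-only object

# A phase-only object has unit amplitude at its own plane; away from it,
# propagation produces an artificial amplitude feature whose sign flips
# through the object plane -- the criterion used to locate that plane.
devs = []
for z in (-5e-6, 0.0, 5e-6):
    img = angular_spectrum(obj, dx, wl, z)
    dev = np.abs(img[n // 2]) - 1.0  # apparent amplitude at the feature centre
    devs.append(dev)
    print(f"z = {z:+.0e} m: amplitude deviation {dev:+.5f}")
```

The deviation vanishes at z = 0 and takes opposite signs on either side of the object plane, mirroring the behavior described for the real phase component's artificial counterpart.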

Having identified the correct object plane, we use the hyperspectral image to generate false-color images where the different red-green-blue color channels encode a variety of spectral bands. Figure 3A shows a comparison between a scanning electron microscopy (SEM) micrograph and a false-color amplitude image of the sample at 320 K that we composed from eight different frequency bands chosen to highlight all states of interest while remaining visible to those with color blindness (see Materials and Methods). A comparison of the finest features observed in the CDIS reveals a resolution of 25 nm in the maximum q direction (fig. S6). Thickness changes, reflecting the topography of the sample, are apparent in all individual images and as blue-white features in the composite image, agreeing very well with the SEM micrograph. Most notably, however, we can see two major regions of the sample (pink and green) that are not apparent from the SEM image. To identify these regions, we extract the local x-ray absorption spectra from the hyperspectral image, which are plotted in Fig. 3B, along with the integrated spectrum of the whole sample. Clear differences appear particularly on the vanadium L2,3-edges (518 and 522 eV). The assignment of features in the XAS spectra of vanadium oxides is well established (22, 47–50), allowing us to identify the pink regions as monoclinic insulating phase (M1) VO2 and the green region as V2O5. In particular, the absorption peak at 516 eV and the resonance at 522 eV are much more pronounced in VO2, while the ratio of the two oxygen peaks at 529 and 532 eV also acts as a marker for the charge state (22, 50). Furthermore, while forward scattering CDI is generally insensitive to the crystal orientation when off-resonance, the XAS spectra of anisotropic materials are themselves strongly anisotropic, and clear differences depending on the crystal orientation relative to the x-ray polarization can be found.
In M1-phase VO2 (51), shifts of around 1 eV would be expected in the peak at 516 eV and the resonance at 522 eV for perpendicularly orientated samples, but such large features are absent, and a homogeneous spectrum is observed across all VO2 regions. In V2O5, the orientation manifests as the appearance or lack thereof of additional peaks between 516 and 520 eV and changes in the relative peak heights at the oxygen edge (52); no such changes are observed in the sample. Thus, we conclude that the crystallites in this field of view have the same orientation of their anisotropy axis.

(A) SEM image of the sample and comparison to a false-color composite image of the sample at 320 K. Scale bar, 200 nm. Two different regions (pink and green) are clearly visible in addition to the topography (blue-white). The composite is formed from a combination of images taken at individual photon energies. A total of 101 monochromatic images are taken, allowing us to extract a full transmission spectrum for each pixel in a hyperspectral imaging scheme. Comparison of lineouts around the finest spatial features reveals a spatial resolution of 25 nm. (B) Transmission spectra of the pink and green regions of the sample, along with the integrated spectrum as would be observed in absorption spectroscopy (black). The full spectral information allows us to identify the green region as V2O5 and the pink as monoclinic phase VO2.

Despite V2O5 making up 20% of the sample and the large differences in their spectra, the integrated spectrum is broadly consistent with that of the nominal film composition, VO2, underlining the necessity of spectroscopic imaging methods in accurately determining local material composition. We note that in the pre-edge spectrum, the overall transmission is lower than for either the VO2 or V2O5; this is due to defects in the sample that absorb more strongly off-resonance than the vanadate portion but on resonance become less important. Our results, which show a substantial V2O5 fraction, may be of interest for the manufacturing of VO2-based devices as the thin films were deposited using the commonly used technique of pulsed laser deposition (PLD) and subsequent annealing (53), although previous work has shown that some V2O5 in PLD films does not adversely affect the switching behavior of the entire film (54). The boundary between the V2O5 and VO2 regions does not always correspond to a distinct topographic feature in the SEM or off-resonant CDI images, demonstrating the ability of CDIS to track oxygen diffusion that is not visible to nonspectroscopic imaging.

We next use the coherent nature of CDIS together with our reference structure to examine the IMT phase transition of VO2 and extract the complex refractive index of each phase. As we heat the sample through the VO2 phase transition (336 K), regions of the rutile metallic phase (R phase) begin to nucleate near crystallite edges and other defects, eventually growing to encompass the entire VO2 volume (fig. S2) (55). Although the (M1) start and end (R) points of this phase transition are well known, numerous studies have speculated the existence of nanoscale intermediate phases, but no direct visualizations of these states have been possible. Therefore, we perform CDIS across the phase transition region to investigate the role of hidden phases. Figure 4A shows a false-color amplitude image of the sample when heated to 336 K, at the center of the transition. Purple regions have grown to encompass roughly 50% of the VO2. We can see an identical structure in the phase, as seen in Fig. 4B, where an alternative set of photon energies is used to generate the false-color image (Materials and Methods). This alternative set is necessary because the phase shift imparted by an absorption resonance changes sign passing through the peak of the absorption, leading to a minimal phase shift imparted at the maximum absorption and zero contrast in phase coinciding with maximum contrast in the amplitude.

(A) False-color amplitude of the VO2/V2O5 sample at 336 K. The new purple regions correspond to the R phase of VO2. Scale bar, 200 nm. (B) Corresponding phase image. (C) Extracted real (solid lines) and imaginary (faded lines) parts of the refractive index for V2O5 and the monoclinic and rutile phases of VO2, with colors corresponding to the false-color amplitude image. (D) Difference between the real and imaginary parts of the refractive index for VO2 R and M1 phase. (E) Difference between the real and imaginary parts of the refractive index for V2O5 and VO2 M1 phase. The standard error of the spectra calculated across their respective regions of interest results in error bars too small to be seen here.

These newly grown phase domains are readily identifiable as the R phase by comparing our data to absorption spectra found in the literature; in particular, the R phase results in a large shift in the 529-eV resonance at the oxygen edge. As a function of temperature, we only observe an increase in the metallic phase volume fraction and find no evidence for any intermediate structures, indicating direct transitions from the M1 to R phase. Our results, which show identical domains in both the amplitude and phase channels, also rule out the presence of strong strain fields that could induce a secondary transition (56) and show conclusively that nanoscale monoclinic metallic domains do not form during the transition.

This demonstrates the power of CDIS over conventional FTH analysis, which uses a beam block to suppress the direct beam for phase identification. Our previous FTH measurements suggested that the M2 phase formed in specific locations, which we attributed to strain. However, x-ray absorption spectra could not be obtained to confirm the M2 phase assignment, and that assignment now appears to be incorrect. While the beam stop used in FTH can increase the dynamic range by suppressing the central part of the hologram, it results in mixing the real and imaginary components of the scattered field, making quantitative spectroscopy impossible and preventing the correct assignment of the object plane unless the properties of the object are already known. Amplitude-to-phase coupling via propagation of coherent x-rays then leads to the appearance of artificial phase domains (see section S3); this artifact underlines the importance of precise determination of the object plane for quantitative measurement in coherent x-ray spectromicroscopy.

Using the full complex field, we can locally extract the complex refractive index of all three constituents (V2O5 and R-phase and M-phase VO2) from a single hyperspectral dataset as shown in Fig. 4C. The imaginary part of the refractive index is in excellent agreement with previous absorption measurements for all three constituents, while the previously unmeasured real part provides a new constraint for ab initio models of vanadium oxides (57). The retrieved δ is noisier than β because the phase shift is measured relative to the reference holes, which have low amplitude because of their small size and are imaged as near single-pixel size. However, as the values at different regions are measured simultaneously and relative to the same reference, this noise cancels when considering the differences in δ and β. This also ensures that the differences are robust with respect to drift during measurement, an advantage of full-field imaging like CDI over scanning methods like ptychography. This advantage was previously noted in conventional x-ray spectromicroscopy (58) but in such systems applied only for absorption or phase independently. The differences in δ and β between different constituents are shown in Fig. 4 (D and E) and show oscillations characteristic of the resonance structure, with δ and β oscillating out of phase. This is consistent with the Kramers-Kronig relation of the resonances and is a very effective way of identifying the exact energies of overlapping resonances. While no intermediate phases were observed in our dataset, in contrast to reports elsewhere (12, 55), in the event that phases without a macroscopic counterpart were observed, such a difference analysis would allow the parent phase to act as a local reference for both amplitude and phase of the nanoscale state. This differential analysis would allow very subtle changes in the XAS to be observed. This is particularly important for detecting domains that are below the resolution limit.
For example, experiments have suggested that transient domains, of an order of 10 to 20 nm in size, may exist in VO2 (15). While such domains could not be observed directly in the current approach, phase coexistence will still affect the optical properties, with reduced magnitude, on a scale that can be observed. Thus, subresolution domains with dimensions close to the resolution limit could still be inferred.

We have performed the first quantitative measurements of solid-state phases using coherent x-ray imaging by extending CDI to a full x-ray spectromicroscopy method, CDIS, demonstrating the first full-field coherent imaging spectromicroscopy method. By generating hyperspectral images of a vanadium oxide thin film spanning 510 to 535 eV and identifying the correct object plane, we recover the full complex amplitude and phase across the XAS and uniquely identify different oxidization states and phases at temperatures across the VO2 insulator-to-metal phase transition. We return the first quantitative measurement of the complex refractive indices for V2O5 and the M1 and R phases of VO2. We also rule out the presence of intermediate phases in the IMT for our sample and show that an intermediate phase previously observed with FTH is a propagation artifact, underlining the extra care that is needed for extracting even qualitative information from coherent imaging methods.

In comparison with conventional methods, we require no high-numerical-aperture diffractive optics (33, 58). Hence, CDIS is fully compatible with time-resolved measurements and with a wide range of sample environments, including high magnetic field and cryogenic samples, paving the way for multidimensional spectroscopic imaging of solid-state phase transitions with x-ray probes. In comparison to ptychography, CDIS requires no stabilization of the sample position, which may be prohibitive for high magnetic field geometries, and no scanning of the probe beam or sample, which is advantageous for multidimensional measurements. In addition, as CDIS is a full-field measurement, it has reduced sensitivity to drift in the x-ray source, which may be important for experiments at x-ray free-electron lasers or with high-harmonic sources.

While here a mask chosen to allow comparison of FTH and CDI was used to examine the optimal route for coherent x-ray spectromicroscopy, in the future, larger reference holes placed closer to the sample aperture can speed acquisition, reduce noise in the real part of the refractive index, and relax requirements on the beam coherence without reduction in the reconstruction quality or resolution (59). Although the current measurements were made with 0.25-eV spectral resolution and 25-nm spatial resolution, state-of-the-art beamlines allow for sub-10-meV spectral resolution if required, while the spatial resolution is set by the scattering geometry and is ultimately diffraction limited. The spectral resolution used here, which allows for clear discrimination between material phases, corresponds to 15-fs transform-limited pulses, which could allow for time-resolved studies of phase transitions beyond magnetic dynamics at x-ray free-electron lasers (XFELs) (60) or with high-harmonic generation sources (61). However, the slow recovery time of optically excited samples may favor the use of XFELs, which generally operate at reduced repetition rates and provide sufficient time for sample recovery (55, 62). We note that moving between absorption edges separated by hundreds of electron volts, relevant for many materials, would require no changes to the setup.

Here, we have considered soft x-rays, but hard x-ray XAS can also reveal local chemical and phase changes (63). CDIS in the hard x-ray could be used for phase-sensitive measurements in an even greater range of sample environments, particularly at the extreme pressures in diamond anvil cells where a range of exotic phases are produced. As the wealth of material phases that emerge at the nanoscale under extreme stimuli continue to be unveiled, the ability of coherent x-ray spectromicroscopy to reveal the quantitative nature of these phases will make it an increasingly valuable tool for condensed matter physics.

A 75-nm-thick layer of VO2 was deposited on a Si3N4 membrane using PLD and subsequent annealing (55). The opposite surface was coated with a [Cr(5 nm)/Au(55 nm)]20 multilayer (~1.1 µm integrated Au thickness) to block the x-rays. Using a focused ion beam (FIB), a 2-µm-diameter aperture was cut in the Cr/Au multilayer to define the field of view. Three 50- to 90-nm-diameter reference apertures were FIB-milled through all three layers on a 4.5-µm radius around the central aperture.

The experiments were performed at the ALICE x-ray scattering instrument at the UE52-SGM beamline of the BESSY II synchrotron-radiation source. The sample was heated to between 300 and 360 K with 0.1 K stability. Linearly polarized x-rays were focused using a long-focal-length Kirkpatrick-Baez-like mirror pair, giving a large focal spot (60 × 120 µm full width at half maximum) approximately 25 cm upstream from the sample itself. Diffraction patterns were acquired with a 2048 × 2048, 13.5-µm-pixel x-ray CCD (Princeton Instruments MTE 2048B) placed 40 cm downstream of the sample. The maximum q direction supports a resolution of 25 nm. Ten long (3.5 to 4 s) and short (0.03 s) exposures were combined to maximize the signal-to-noise ratio in the low- and high-amplitude regions before reconstruction, with a central beam block added for the long exposures to avoid camera damage and CCD bleed. The long-exposure images were scaled to ensure that the integral over the first Airy disk ring was the same between both images and combined using a circular mask with Gaussian edges.
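The exposure-combination step can be sketched as follows. This is an illustrative reconstruction of the procedure, not the authors' code; the function name, parameters, and mask profile are our own assumptions.

```python
import numpy as np

def combine_exposures(long_img, short_img, ring_mask, center, radius, edge_width):
    """Merge long and short CCD exposures into one high-dynamic-range
    diffraction pattern. The short exposure supplies the bright center
    (covered by the beam block during long exposures); the scaled long
    exposure supplies the weak high-q signal."""
    # Scale the long exposure so both images agree on the integral over
    # the first Airy-disk ring (ring_mask selects that ring).
    scale = short_img[ring_mask].sum() / long_img[ring_mask].sum()
    long_scaled = long_img * scale
    # Circular mask with Gaussian (soft) edges: weight 1 inside `radius`,
    # rolling off smoothly outside it.
    yy, xx = np.indices(short_img.shape)
    r = np.hypot(yy - center[0], xx - center[1])
    w = np.where(r <= radius, 1.0, np.exp(-((r - radius) / edge_width) ** 2))
    return w * short_img + (1.0 - w) * long_scaled
```

The soft edge avoids introducing a sharp ring into the combined pattern, which would otherwise appear as artifacts in the reconstruction.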

The reference holes used to provide an absolute phase reference additionally pin the zeros of the interference fringes, improving reconstruction robustness (38, 59, 64) but at the cost of increasing the sample size to near the typical transverse coherence length of third-generation synchrotrons (65). This necessitates the use of partial coherence reconstruction algorithms, which return both the complex object and the x-ray coherence function. Hence, reconstructions were performed using the partially coherent reconstruction algorithm of Clark et al. (66); 50 iterations of the error reduction algorithm were followed by 100 iterations of the hybrid input-output algorithm (67), ending with another 49 iterations of error reduction. Every 15 iterations, the coherence function was updated by 25 iterations of the Lucy-Richardson deconvolution algorithm (68, 69). The use of partially coherent methods was essential to obtain good reconstructions; fully coherent methods introduced substantial low-frequency noise into the reconstruction. We found no evidence of the underdetermination for partially coherent reconstruction of complex objects described in previous works (64), likely because the reference holes, while only partially coherent, lead to sufficiently clear interference fringes to pin the reconstructed phase. Most critical to obtaining high-quality reconstructions was the accurate determination of the mask function, particularly for the hybrid input-output method; a single-pixel error in the mask radius introduced substantial reconstruction noise. With correct determination of the mask, the algorithms were found to converge to the same value regardless of the initial guess. Images were reconstructed at 101 different energies and four different temperatures. By setting the pre-edge transmission to coincide with the atomic scattering factor transmission for a film of 75 nm thickness (90%), we convert the relative transmission values to absolute values.
As small drifts in the synchrotron mode over time can adversely affect the spectrum, we used the nominally temperature-independent spectrum of V2O5 to normalize out these effects. Example images at select photon energies, details on the object constraint, and resolution determination can be found in section S4.
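The iteration schedule above (50 error-reduction iterations, 100 hybrid input-output iterations, then 49 more error-reduction iterations) can be sketched in simplified form. The sketch below is illustrative only: it is fully coherent and single mode, omits the Lucy-Richardson coherence update, and uses our own variable names rather than the authors' implementation.

```python
import numpy as np

def phase_retrieve(intensity, support, beta=0.9):
    """Minimal fully coherent sketch of the ER/HIO schedule described in
    the text. `intensity` is the measured diffraction pattern; `support`
    is a boolean mask of where the object may be nonzero."""
    amplitude = np.sqrt(intensity)
    # Random initial guess confined to the support
    g = np.random.default_rng(0).random(intensity.shape) * support
    schedule = [("ER", 50), ("HIO", 100), ("ER", 49)]
    for method, n_iter in schedule:
        for _ in range(n_iter):
            G = np.fft.fft2(g)
            # Fourier-modulus constraint: keep phases, impose measured amplitude
            G = amplitude * np.exp(1j * np.angle(G))
            g_prime = np.fft.ifft2(G)
            if method == "ER":
                # Error reduction: hard-zero everything outside the support
                g = np.where(support, g_prime, 0)
            else:
                # HIO: negative feedback outside the support instead of zeroing
                g = np.where(support, g_prime, g - beta * g_prime)
    return g
```

In the paper's partially coherent variant, the measured pattern is modeled as the coherent pattern blurred by a coherence function, which is re-estimated periodically by deconvolution; this coherent loop conveys only the alternating-projection structure.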

False-color images are generated to show contrast for particular phases of VO2, using the different spectral responses of the phases in the real and imaginary parts of the refractive index. To that end, sum and difference images at various photon energies are considered to take advantage of the full spectral information. For amplitude images, the red channel was taken as the sum of images at 514.25 and 517 eV with images at 518.25 and 515.25 eV subtracted, the green channel as images at 521 and 529.5 eV added with 530.5 eV subtracted, and the blue channel as the image at 517 eV. A DC level of 0.5 was added to the red channel, and the channels were scaled by 1.6, 1, and 1.3, respectively, for plotting. Phase images were composed with the red channel taken as the sum of images at 514.75, 517.75, and 529.75 eV with images at 518.75, 522.75, and 531 eV subtracted; the green channel as images at 521.5, 524, 525.5, and 528.75 eV added; and the blue channel as the sum of 515 and 517.75 eV with 516.5 eV subtracted. A DC level of 0.5 was subtracted from the red channel, and the channels were scaled by 1, 1, and 1.2, respectively, for plotting. This scheme was chosen instead of a simple mapping of differences between the relative phases to different color channels for three reasons: one, to map different phases to different colors in such a way that the three phases are apparent to viewers with all major types of color blindness; two, to maximize the spectral information used to differentiate the phases, reducing the impact of reconstruction noise or sample contamination, particularly at the oxygen K-edge; and three, to enable simultaneous viewing of the sample topography and the phase/oxidation-state separation, to better understand the relationship, or lack thereof, between the two.
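The channel arithmetic above reduces to sums and differences of images drawn from the hyperspectral stack. A minimal sketch (illustrative only; the function name, call signature, and the order in which the DC offset and gain are applied are our own assumptions):

```python
import numpy as np

def compose_channel(stack, energies, plus, minus=(), dc=0.0, gain=1.0):
    """Build one false-color channel from a hyperspectral stack of shape
    (n_energies, ny, nx): sum the images at the photon energies in `plus`,
    subtract those in `minus`, add a DC offset, and scale."""
    idx = {e: i for i, e in enumerate(energies)}
    channel = np.zeros(stack.shape[1:])
    for e in plus:
        channel += stack[idx[e]]
    for e in minus:
        channel -= stack[idx[e]]
    return gain * (channel + dc)

# Red amplitude channel per the text: (514.25 + 517) eV minus (518.25 + 515.25) eV,
# DC offset 0.5, gain 1.6.
```

The same helper composes each of the red, green, and blue channels for both the amplitude and phase images by varying the energy lists and scaling constants.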

Acknowledgments: Funding: This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 Research and Innovation Programme (grant agreement no. 758461) and under the Marie Skłodowska-Curie grant agreement no. 754510 (PROBIST) as well as the Ministry of Science, Innovation and Universities (MCIU), State Research Agency (AEI), and European Regional Development Fund (FEDER) PGC2018-097027-B-I00 and was supported by Spanish MINECO (Severo Ochoa grants SEV-2015-0522 and SEV-2015-0496) as well as Fundació Privada Cellex and CERCA Programme/Generalitat de Catalunya. L.V. acknowledges financial support by the HZB. Research at Vanderbilt was supported by the NSF EECS-1509740. Author contributions: S.E. and S.W. conceived the project. K.A.H. and R.F.H. grew the samples, which were subsequently processed by C.M.G. L.V., D.P.-S., C.M.G., and B.P. measured the diffraction patterns. A.S.J., J.V.C., and D.P.-S. processed the images, and A.S.J. analyzed the data. A.S.J. and S.W. wrote the manuscript with input from all authors. Competing interests: The authors declare that they have no competing interests. Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper and/or the Supplementary Materials. Additional data related to this paper may be requested from the authors.

Quantitative hyperspectral coherent diffractive imaging spectroscopy of a solid-state phase transition in vanadium dioxide - Science Advances


Femi Fadugba: 'There's no reason why Peckham couldn't be the theoretical physics capital of the world' – The Guardian

Had it not been for his secondary school caretaker, physicist-turned-novelist Femi Fadugba might never have gone on to study material sciences and quantum computing at Oxford University. "I don't usually tell people this story because it sounds like something out of a movie," he says, laughing, on a video call from Peckham, south London. "He gave me a Quantum Physics for Dummies book when I was 11. It was only a couple of years ago that I found his phone number and called him up. He told me that he had a PhD and was really into physics, but just wasn't able to pursue it."

Similarly, had it not been for his career in quantum physics, Fadugba might never have written his debut sci-fi novel, The Upper World – a book about time travel set in Peckham and deeply informed by the study of atoms, matter, energy and relativity. "I decided I wanted to write this book because I'd have conversations with people who would ask me to explain quantum physics. They'd always be super fascinated and wanted me to recommend a book, but I couldn't find one that I could put my hand on my heart and say: 'You'll dig this.'"

So he set out to do exactly what Toni Morrison had asked of anyone frustrated by the lack of diverse stories in the landscape of literature: "If there's a book that you want to read, but it hasn't been written yet, then you must write it."

The Upper World is a uniquely thrilling, heart-wrenching young adult novel that follows two teenage protagonists. Esso and Rhia exist in different time periods, 2020 and 2035 respectively, but are connected by a life-changing event – a bullet heading for an alleyway and set to cause irreversible harm. When Esso is hit by a car, he is transported to a mysterious place where he discovers that he can see into the past and the future. He then seeks to change the course of this tragic event, which somehow involves Rhia – a girl living in foster care who is desperate to learn the truth about her parents.

"Peckham is full of people who look like me. People from somewhere else, but also from here," says Fadugba. His eyes light up whenever he talks about the neighbourhood. "I've seen two decades of change in Peckham, so I felt comfortable trying to project another couple of decades. I also just really like this place."

Now aged 34, Fadugba, who was born in Togo, moved to England from the US in 1997, when he was nine years old. He spent much of his childhood moving between a boarding school in Somerset and various African countries with his parents, when his father was working as an interpreter for the UN. But the summers and half terms at his aunt's house in a Peckham estate had the biggest impact on him. "As a Nigerian, there aren't many places in the world, including Nigeria and including most of England, where I feel so at home."

The idea of feeling at home is something Fadugba struggled with when it came to his career in science, however. In addition to Oxford, he studied at the University of Pennsylvania and taught science – "I published in PRL [a peer-reviewed scientific journal], which is where Einstein published. I was at the peak of my career. But at the same time, something about writing equations for my whole life seemed too abstract and removed from real life problems."

He eventually left academia and went into the energy sector, working full time at a solar finance company. He didn't start reading fiction until his late 20s (naming Chimamanda Ngozi Adichie, Stephen King and Orson Scott Card as particular favourites) but something clicked, and he decided to teach himself how to write. "I had a couple of chats that convinced me that writing was something you could learn and didn't have to be born with. That was the switch for me." He still had the urge to communicate scientific ideas and theories, but wanted to do that through fiction rather than academia.

Perhaps that's why The Upper World, despite its humour, is also enjoyably educational. Theories relating to time and space are woven into the narrative. The appendix is full of equations relevant to the plot, such as the speed of light and the Pythagorean theorem. But deep down, the novel is about grief, loss and hope. "I was dealing with a similar situation to what Esso goes through, in terms of losing someone before their time, because of some madness. The part in the book where someone gets shot: there are kids and adults who are dealing with this in real life. I felt a responsibility to explore what that meant."

An important part of the novel is its investigation into the concept of free will. As the two teenagers fight to change the future, the psychological and sociological influences on a person's destiny are a central part of the narrative. "For the black community in the UK, so much of the tension is fundamentally about free will. Are our people in the position that they're in because they made bad decisions or was it actually [out of their control]? It's tough. I do think we are a product of our environment, but at the same time, if I'm standing in front of a kid who is in a shit situation, that's not a helpful thing to say. We have an obligation to explore both sides, instead of making the false choice that only one of them is true."

What makes The Upper World so groundbreaking is how it straddles multiple realities and truths. It's geeky but cool, otherworldly but also very south London, a genre-defying book for which Penguin Random House Children's won the rights after a "crazy" 15-way auction. It also grabbed the heart of Daniel Kaluuya: the Oscar-winning actor will not only star in the Netflix film adaptation, he will also co-produce it.

"My book leaked to Hollywood," says Fadugba in disbelief, speaking of the whirlwind that ensued in June 2020, straight after he sent his manuscript to publishers. "I still don't know how that happened, but apparently, it happens. A bunch of studios got hold of it – the big ones." Jordan Peele's Monkeypaw Productions and Brad Pitt's Plan B Entertainment. "You can circle a month around the period where I got my book deal and I got the Netflix deal. It was exciting and also very hard to process. Even now, I still talk about it as if it happened to someone else."

A big reason why the mammoth successes have yet to sink in is because they came at a personally difficult time. In early 2020, Fadugba had been living with his wife in Kenya when the country's president tweeted that they would be shutting the borders due to the pandemic. "I packed up my life in two days and went to my aunt's house [in Peckham]. My wife is American and had to go back to her family. I spent a whole year in my aunt's spare bedroom, separated from my wife, while the world went down." The couple finally reunited in June 2021.

"It has all been incredibly stressful," Fadugba tells me. "The gradient of change was insane." But he's grateful, of course. I can see how visibly excited he is speaking about his new life. He humbly smiles while talking about the fact that he will be executive producer on the Netflix film. "It's a strong team," he says. "Eric Newman [the producer of Narcos, Children of Men and Bright] has the experience of making sick films and shows. Daniel knows how to navigate both worlds. He's from ends, but he's also an Oscar-winning actor."

I ask him if he's nervous about whether the adaptation will be as good as the book. "My agent put me in touch with Nick Hornby, who has had the experience of having his books adapted into films, and he gave me a metaphor. If you design ankara suits and then someone buys it and turns it into a bikini, that bikini could go on to sell more than your suit. Even though the Netflix team has been really faithful to the vision, you have to let it breathe in whatever direction it needs to."

So what does Fadugba see when he looks into his own future? "I'm currently writing a film with a couple of mates, and a well-known rapper called CS." He's also working on the sequel to The Upper World, which will focus on quantum mechanics and the multiverse. But, he says, "my biggest purpose has always been about education. I don't mean that necessarily in terms of getting all kids into Stem [science, technology, engineering and mathematics]. I think it's more about getting kids to explore all the different parts of their mind. There's no reason why Peckham couldn't be the theoretical physics capital of the world – I mean, there are reasons, but there are no good reasons."

His plan is to find a way to use music, virtual reality and gaming to facilitate maths and physics education. Looking at what he has achieved so far, with his physics career and his first ever attempt at writing fiction, very little seems impossible. "I was born in a civil war. There have been too many times that things could have gone left," he says, referring to everything from his family's immigration struggles to his time spent in Rwanda, to living in a council flat and seeing "all kinds of shit go on". "When I think about the stuff that has happened to me, I think to myself: I was given this [gift]. Enjoy yourself, take care of your mental health. But use it."

An edited extract from The Upper World

After the collision, I expect to turn and see a pumpkin-coloured bench stuffed with people waiting for the 78, 381, 63 or 363. And, on the other side of the road, I expect a barbershop, followed by a Western Union, then a pub, then a corner shop selling fufu and Oyster-card top-ups – the same rota of shops that repeats itself across Narm, interrupted only by the odd pound shop or chain cafe.

I expect to see a Range Rover with a dent in its front end and I'm ready to go ballistic on the driver, threaten to sue him, punch him, both. I expect – no, I hope – to see a little boy, sitting safely on the pavement, in roughly the same shape and condition I'd met him.

Instead I can barely see my own hands. Darkness has swallowed them. And inside the darkness are echoes: half-familiar screams and hushed voices, each one loud enough for me to hear, but not clear enough to make out the words. My mind draws its own imaginary lines in the dark, filling it with demonic creatures with jagged teeth and talons. Scenario A, I think: this is a dream, and I'm alive. Scenario B: I'm dead, and this is either heaven or hell.

A bead of sweat tumbles down my forehead. Above the echoes, I can hear my heart pounding and my breaths getting shorter. In all the Sunday school lessons I remember, not one mentioned heaven looking like a barren wasteland filled with screams. Not to mention the scorching heat. Please let this be scenario A.


Matter From Light. Physicists Create Matter and Antimatter by Colliding Just Photons. – Universe Today

In 1905 Albert Einstein wrote four groundbreaking papers on quantum theory and relativity. It became known as Einstein's annus mirabilis, or wondrous year. One was on Brownian motion, one earned him the Nobel Prize in 1921, and one outlined the foundations of special relativity. But it's Einstein's last 1905 paper that is the most unexpected.

The paper is just two pages long, and it outlines how special relativity might explain a strange aspect of radioactive decay. As Marie Curie famously demonstrated, some materials such as radium salts can emit particles with much more energy than is possible from simple chemistry. Einstein's little paper speculated that the excess energy might be balanced by a loss of mass of the nuclear particles. This idea eventually led to Einstein's most famous equation, E = mc2.

This equation is often taken to mean that matter and energy are two sides of the same coin. It actually means that the apparent mass and energy of an object depend upon the relative motion of an observer, and because of this, the two are intertwined, similar to the connection between space and time. But one consequence of this relation is that under the right circumstances objects should be able to produce energy via a loss of mass.

We now know this is exactly what happens in radioactive decay. The effect is also how stars create energy in their cores via nuclear fusion. Of course, if matter can become energy, then it should also be possible for energy to become matter. That trick's a bit more difficult, and it took particle accelerators to pull it off. These days we do this all the time. Accelerate particles to nearly the speed of light and slam them together. The large apparent mass of the particles releases tremendous energy, and some of that energy changes back into particles. All of modern particle physics can trace its history to Einstein's two-page paper.

But the laws of physics don't just say you can create energy from matter and vice versa; they place specific constraints on the nature of the created matter and energy. One of the simplest examples of this is electron-positron annihilation. This happens when an electron collides with its antimatter twin. The two particles have the same mass, but opposite charge, so when they collide they produce two high-energy photons. The mass of the electron and positron is transformed entirely into energy. This experiment was first proposed in the 1930s, but it wasn't done until 1970.

If you can convert matter entirely into energy, you should be able to do the reverse. It's known as the Breit–Wheeler process and involves colliding two photons to create an electron-positron pair. While we have used light to create matter several times, converting two photons directly into matter is very difficult. But a recent experiment shows it can be done.
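A quick back-of-envelope check (our own, not from the article) shows why direct two-photon pair creation is so hard: for a head-on collision, the invariant mass of the photon pair must reach twice the electron rest energy, which requires E1·E2 ≥ (mₑc²)², where mₑc² ≈ 0.511 MeV.

```python
ME_C2_MEV = 0.511  # electron rest energy in MeV

def breit_wheeler_allowed(e1_mev, e2_mev):
    """True if two head-on photons are above the e+e- pair-creation
    threshold: the pair's invariant mass, sqrt(4*E1*E2), must reach
    twice the electron rest energy."""
    return 4.0 * e1_mev * e2_mev >= (2.0 * ME_C2_MEV) ** 2

# Two gamma rays at the electron rest energy sit exactly at threshold,
# while two visible-light photons (about 2 eV each) miss it by roughly
# eleven orders of magnitude, hence the need for the intense photon
# bursts generated in heavy-ion collisions.
```

This is why crossing two laboratory laser beams does not produce pairs, and why the experiment relied on the extremely energetic photon fields surrounding colliding ions.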

The team used data from the Relativistic Heavy Ion Collider (RHIC) and looked at more than 6,000 events that created electron-positron pairs. They didnt simply beam two lasers at each other but instead used high-energy particle collisions to create intense bursts of photons. In some cases, these photons collided to create an electron-positron pair. From the data, they could show when a pair was created directly from light.

Since these pair productions occurred in an intense magnetic field, the team also demonstrated another interesting effect known as vacuum birefringence. Normal birefringence occurs when light is split into two beams of different polarization. This effect occurs naturally in materials such as Iceland spar. With vacuum birefringence, light passing through an intense magnetic field is split into two polarizations, with each polarization taking a slightly different path. It's an amazing effect if you think about it, because it means you can change the path of light in a vacuum, using only a magnetic field. Vacuum birefringence has been observed in the light coming from a neutron star, but this is the first time it's been observed in the lab.

Reference: Einstein, Albert. "Ist die Trägheit eines Körpers von seinem Energieinhalt abhängig?" Annalen der Physik 323.13 (1905): 639–641.

Reference: Sodickson, L., et al. "Single-quantum annihilation of positrons." Physical Review 124.6 (1961): 1851.

Reference: Breit, Gregory, and John A. Wheeler. "Collision of two light quanta." Physical Review 46.12 (1934): 1087.

Reference: Adam, Jaroslav, et al. "Measurement of e+e− Momentum and Angular Distributions from Linearly Polarized Photon Collisions." Physical Review Letters 127.5 (2021): 052302.



Albert Einstein: The Life and Legacy of the Great Genius – Interesting Engineering

It's hard to overstate the genius of Albert Einstein. As one of the world's foremost physicists, his discoveries revolutionized the way we see not just our world but the entirety of the universe. It's little wonder that the name Einstein has come to be synonymous with scientific genius.

He is most well known for his theory of relativity, but his brilliance did not end there. He helped lay the foundations for quantum mechanics with his Nobel Prize-winning work on the photoelectric effect and was instrumental in bringing the world into the atomic age, though he was generally opposed to the use of nuclear weapons.

By pushing our understanding of physics far beyond what anyone thought possible or could even imagine at the time, Einstein stands nearly alone in the pantheon of physicists with his unparalleled brilliance.

Albert Einstein was born on March 14, 1879, to Hermann Einstein and Pauline Koch-Einstein, Ashkenazi Jews living in Ulm, in the Kingdom of Württemberg in the southern part of the German Empire.

Shortly after his birth, his family moved to Munich, where his father and uncle founded an electrical equipment manufacturing company. Einstein began receiving a primary education at a Catholic school in 1885 before transferring to the Luitpold-Gymnasium (since renamed the Albert Einstein Gymnasium, for obvious reasons) in 1888.

Einstein was, surprisingly or maybe not so surprisingly, a mediocre student. So mediocre, in fact, that when Einstein wanted to attend the Eidgenössische Polytechnische Schule (mercifully renamed ETH in later years) in Zurich, Switzerland, in 1895, he failed the entrance examination and had to attend the Kantonsschule in Aarau, Switzerland, to remediate the subject areas whose test scores were insufficient.

Receiving a diploma from the school in 1896, he was able to enroll in ETH soon thereafter with the goal of becoming a math and physics teacher. Again, he was a passable student, but not much more than that, though he did manage to graduate with a diploma in July 1901.

By this point, he had already abandoned his German citizenship and had been formally granted Swiss citizenship in February 1901. He spent several months looking for a job, giving private instruction in math and physics to make ends meet, and taking short-term employment as a teacher from May 1901 to January 1902.

Albert Einstein's turn as the world's most famous patent clerk started with the help of a fellow student, Marcel Grossman, who helped get Einstein a probationary appointment at the Swiss Patent Office in Bern, where he had settled after school.

Einstein took up the position in December 1901 and by June 1902, he was promoted to Technical Expert, Third Class, giving him some measure of stability, and allowing him to pursue his research in theoretical physics.

At this time, he was also a founding member of the Akademie Olympia, a scientific society in Bern that greatly helped focus Einstein's work and thinking in the field of physics.

In April 1905, Einstein submitted a doctoral thesis to the University of Zurich titled, "A New Determination of Molecular Dimensions" which he had dedicated to Grossman. It was accepted by the University in July of that year, but by then Einstein was already well on his way to revolutionizing our understanding of the universe.

To say that the year 1905 was a landmark year for science is grossly underselling it. Einstein, still working as a "technical expert" in the Swiss patent office, published four revolutionary scientific papers in a span of just 7 months that would establish him as one of the greatest scientific minds of the time. Einstein later described the period by saying that it was as if "a storm broke loose in my mind."

The first of the papers was "On A Heuristic Point of View Concerning the Production and Transformation of Light," which was the first paper to theorize that electromagnetic radiation, including light, consisted of "quanta".

The paper argued that, in effect, radiation was carried through space by means of measurable particles which we know today as photons. Interestingly, this theory was rejected at first before it was eventually confirmed by Max Planck, who was initially critical of the theory himself. For this discovery, Einstein would win the 1921 Nobel Prize for Physics.

The next paper was published on July 18, 1905, titled "On the Movement of Small Particles Suspended in a Stationary Liquid, as Required by the Molecular-Kinetic Theory of Heat." Although it did not revolutionize the principles of physics, Einstein used the physical phenomenon of Brownian motion to provide empirical evidence that matter is composed of atoms. Although many scientists already believed this, it was by no means universally accepted. Einstein not only mathematically confirmed the existence of atoms and molecules but also opened a new field in the study of physics: statistical physics.

Einstein wasn't done, however. His next paper, "On the Electrodynamics of Moving Bodies", published in September 1905, was the most groundbreaking. It introduced the idea of special relativity, which addresses the problem of objects in different coordinate systems moving relative to each other at constant speeds.

It produced a new conception of space that would lay the groundwork for Einstein's theory of general relativity, which would come later, and also established that as an object accelerates toward the speed of light, its apparent mass increases, requiring ever more energy to accelerate it further. As a result, an object's mass effectively becomes infinite as it approaches the speed of light, making the speed of light the effective speed limit for all matter.

His next paper that year, "Does the Inertia of a Body Depend upon Its Energy Content?", was published in November 1905. It extended special relativity by establishing the equivalence of mass and energy and introduced arguably the most famous equation in human history, E = mc2.
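To make the equation concrete, here is a worked example with numbers of our own choosing (not from the article): converting a single gram of mass entirely to energy.

```python
# E = m * c^2 for one gram of mass (illustrative numbers, not from the text)
c = 2.998e8       # speed of light, m/s
m = 1.0e-3        # one gram, in kg
E = m * c ** 2    # energy in joules, about 9.0e13 J
# That is comparable to the yield of a roughly 20-kiloton nuclear explosion,
# which is why tiny mass deficits in nuclear reactions release enormous energy.
```

The factor c² is so large that everyday mass changes from chemistry are immeasurably small, while nuclear processes, which convert a noticeable fraction of rest mass, are dramatic.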

Finally, in 1907, Einstein published "Planck's Theory of Radiation and the Theory of Specific Heat", which was a foundational work of quantum mechanics.

While Einstein's Theory of Special Relativity was revolutionary in its own right, between 1909 and 1916, Einstein worked on a more general form of this theory that would be published in March 1916 as, "The Foundation of the General Theory of Relativity".

This paper was absolutely transformative. While Einstein's work on Special Relativity required an advanced understanding of math and physics, his theory of general relativity was much more accessible, owing to its elegance and (relative) simplicity.

Einstein envisioned gravity not as a force, the way Newton described it, but as a consequence of space and time forming a fabric stretching out in all directions. If that space is empty, an object moving through it travels in a straight line. But if that space has a massive object in the center, like the Sun, then the fabric of space warps toward that center of mass, turning the flat fabric of space into a kind of funnel.

An object passing through that space is affected by the shape of that funnel so that it no longer travels in straight lines through that space but instead gets pulled toward the mass in the center, effectively rolling down the slope of space towards the mass in the center.

Critically, if the speed of something passing through that space is great enough, like light, then it is not pulled into the center mass entirely, but its course is instead deflected as a consequence of the gravitational effect of that massive object.

It was this aspect of Einstein's theory that would help cement his reputation. Convinced that this deflection of light from distant stars could be seen in the gravitational field produced by the sun during a solar eclipse, Einstein sought but failed to verify his theory personally. In 1919, however, English astronomer Arthur Eddington and British astronomer Andrew Crommelin observed the deflection of light at two separate locations during the May 29 eclipse that year.
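The number the 1919 expeditions were chasing can be reproduced from the standard general-relativistic result α = 4GM/(c²R) for light grazing the Sun (the formula is textbook physics, not quoted in this article):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m
C = 2.998e8        # speed of light, m/s

# Deflection angle for a light ray grazing the solar limb
alpha_rad = 4 * G * M_SUN / (C ** 2 * R_SUN)
alpha_arcsec = math.degrees(alpha_rad) * 3600  # about 1.75 arcseconds
# This is twice the value a naive Newtonian calculation gives, and that
# factor of two is exactly what the eclipse measurements set out to test.
```

An arcsecond is 1/3600 of a degree, so the measurement demanded careful comparison of star positions on photographic plates taken during and away from the eclipse.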

Confirmation of Einstein's prediction was announced on November 6, 1919, during a meeting in London of the Royal Society and Royal Astronomical Society. Joseph John Thomson, the Royal Society's president, declared that "This is the most important result related to the theory of gravitation since the days of Newton...This result is among the greatest achievements of human thinking."

Confirmation of Einstein's theory of gravitation was printed on the front page of newspapers around the world, establishing him forevermore in the public consciousness as the greatest scientific mind since Isaac Newton, and possibly even greater.

While Einstein was working out his theory of general relativity, he had already established himself in 1905 as a brilliant scientist. He still had trouble landing an academic position for himself, though, being rejected by the University of Bern in 1907 for a professorial position. He was successful on his second go-around a year later, however, and landed a position in 1908, giving his first lecture as a professor at the end of that year.

Devoting himself to his scientific endeavors, he gave up his post with the patent office in 1909 and bounced around between Bern, Zurich, and Prague until 1914, when Planck and German chemist Walther Nernst convinced Einstein to take up a post in Berlin, then the world's epicenter for natural science research.

They offered him a non-teaching professorship at Berlin University, made him a member of the Prussian Academy of Sciences, and made him head of the yet-to-be-founded Kaiser Wilhelm Institute of Physics.

Einstein's global popularity led to invitations to speak from around the world, offers Einstein took up, traveling to the United States, France, Britain, Palestine, and elsewhere.

Einstein traveled to Asia as well, and contrary to his public image as a great humanitarian who decried racism as "a disease of white people," his travel diaries from that time expressed some sweeping and negative generalizations of the people he met in Asia, especially the Chinese.

People are a study in contradictions, and Einstein could believe that racism was a social cancer while holding some particularly abhorrent views himself. And while many of his recently published personal papers were written in the early 1920s, when such opinions would not have been seen as particularly out of the mainstream, this certainly does not absolve him, although he also clearly changed over time.

This is especially true as he himself was the subject of some especially ugly anti-Semitic attacks from those inside the scientific community and among the broader public. There were those in Germany, including Nobel laureates like Johannes Stark and Philipp Lenard, who advocated for a "German physics" separate from "Jewish physics".

In December 1932, Einstein and his wife Elsa left for the United States for a series of lectures just as the Nazi Party was on the rise, having secured the most seats in the German parliament elections held earlier that year. In January 1933, Adolf Hitler seized power and in response, Einstein cut all ties with any scientific and academic institution in Germany that he had, including resigning from the Prussian Academy of Sciences. He would never again return to Germany.

Now more or less a refugee, Einstein was quickly given a position at the Institute for Advanced Study in Princeton, New Jersey. He bought a house there, the famous 112 Mercer Street.

In 1940, Einstein was formally granted US citizenship and renounced his German citizenship for the second time, though he retained his Swiss citizenship. He would live the rest of his life in the United States.

Einstein was a committed pacifist, but his horror at the thought of Nazi Germany working on atomic weapons compelled him to sign a letter to then-President Franklin D. Roosevelt that raised the alarm, recommending that the United States begin researching atomic weaponry as well.

Though this would be Einstein's only direct involvement in the Manhattan Project, giving his imprimatur to the effort certainly helped make the case for the project, and his famous equation equating mass and energy was fundamental to the project's development.
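
That equation is the mass-energy equivalence relation:

```latex
E = mc^2
```

Because $c^2$ is so large, even a tiny amount of mass stores an enormous amount of energy: converting a single gram of matter releases roughly $9 \times 10^{13}$ joules, which is why fission of heavy nuclei could power a weapon of unprecedented yield.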

Einstein spent the rest of his life pursuing a unified field theory but was unable to make any breakthroughs in this area. His contemporaries had become enamored with some of what he regarded as the stranger aspects of quantum mechanics, which Einstein criticized.

Rejecting the use of probability and randomness in describing quantum effects, Einstein famously declared that, "[God] does not play dice with the universe."

This disagreement and his failure to make major progress in his work on unified field theory led to his isolation from the scientific community in his later years, though Einstein did not appear to be bitter about this fact.

On April 15, 1955, Albert Einstein suffered debilitating pain and was rushed to a hospital in Princeton. He was diagnosed with an aneurysm in his abdominal aorta, and doctors were unable to save him.

Einstein died on April 18, 1955. In accordance with his will, he was cremated that day and his ashes spread at an unknown location. Though his later career proved to be mostly fruitless, he exerted a substantial gravity of his own on those around him, even helping the likes of Niels Bohr refine the principles of quantum mechanics by virtue of his critiques of it.

Einstein's work redefined the universe as we know it and gave us the clearest, most elegant model to date to help even the layman understand it. The foundation he laid for theoretical physics has led to the discovery of gravitational lensing and the greatest cosmological monsters of all, black holes.

Albert Einstein, like Isaac Newton and other great minds before him, surely stood on the shoulders of giants, but few giants have ever stood as tall as Einstein, and it may be centuries before we see so revolutionary a scientific figure again.

Here is the original post:

Albert Einstein: The Life and Legacy of the Great Genius - Interesting Engineering


Postdoctoral Research Associate in Quantum Light and Matter job with DURHAM UNIVERSITY | 262887 – Times Higher Education (THE)

Department of Physics

Grade 7: £33,797 to £35,845 per annum
Fixed Term - Full Time
Contract Duration: 36 months
Contracted Hours per Week: 35
Closing Date: 27-Aug-2021, 6:59:00 AM

Durham University

Durham University is one of the world's top universities with strengths across the Arts and Humanities, Sciences and Social Sciences. We are home to some of the most talented scholars and researchers from around the world who are tackling global issues and making a difference to people's lives.

The University sits in a beautiful historic city where it shares ownership of a UNESCO World Heritage Site with Durham Cathedral, the greatest Romanesque building in Western Europe. A collegiate University, Durham recruits outstanding students from across the world and offers an unmatched wider student experience.

Less than 3 hours north of London, and an hour and a half south of Edinburgh, County Durham is a region steeped in history and natural beauty. The Durham Dales, including the North Pennines Area of Outstanding Natural Beauty, are home to breathtaking scenery and attractions. Durham offers an excellent choice of city, suburban and rural residential locations. The University provides a range of benefits including pension and childcare benefits, and the University's Relocation Manager can assist with potential schooling requirements.

Durham University seeks to promote and maintain an inclusive and supportive environment for work and study that assists all members of our University community to reach their full potential. Diversity brings strength and we welcome applications from across the international, national and regional communities that we work with and serve.

The Department

The Department of Physics at Durham University is one of the leading UK Physics departments with an outstanding reputation for excellence in teaching, research, and employability.

The Department is committed to advancing equality and we aim to ensure that our culture is inclusive, and that our systems support flexible and family-friendly working, as recognized by our Juno Champion and Athena SWAN Silver awards.

We recognise and value the benefits of diversity throughout our staff and students.

The Role

A Postdoctoral Research Associate position is available to pursue experimental research in the field of atomic and laser physics within the Durham Atomic and Molecular Physics group. The position is associated with an existing experiment funded by the UK Engineering and Physical Sciences Research Council (EPSRC) focused on Quantum Optics using Rydberg atoms.

The post holder will be expected to display the initiative and creativity, together with the appropriate skills and knowledge, required to lead and develop the existing experiment to meet the project goals. The post holder will be expected to be familiar with ultra-stable lasers, and to have experience in atomic physics, quantum optics or laser cooling and trapping. The post holder is expected to be able to work effectively both independently and as part of a small research team. It is expected that the post holder will enhance the international contacts of the group through the presentation of work at international conferences. The post holder will also be expected to aid in the supervision of graduate students within the group, as well as contributing to undergraduate teaching within the Department.

The goal of the Rydberg project is to realise strong photon interactions with a high fidelity (preservation of the properties of the incoming photons). The successful candidate will be required to take a lead role in all aspects of the project, contributing directly to the experiment and working closely with Prof Charles Adams, Prof Kevin Weatherill, project partners, and graduate students as well as other members of the research group, and will be expected to undertake an active role in the laboratory activity.

The Department of Physics is committed to building and maintaining a diverse and inclusive environment. It is pledged to the Athena SWAN charter, where we hold a silver award, and has the status of IoP Juno Champion. We embrace equality and particularly welcome applications from women, black and minority ethnic candidates, and members of other groups that are under-represented in physics. Durham University provides a range of benefits including pension, flexible and/or part time working hours, shared parental leave policy and childcare provision.

Responsibilities

The post is for a fixed term of 36 months as it is associated with an existing experiment with fixed-term funding from the UK Engineering and Physical Sciences Research Council (EPSRC) focused on Quantum Optics using Rydberg atoms.

The post-holder is employed to work on a research project which will be led by another colleague. Whilst this means that the post-holder will not be carrying out independent research in his/her own right, the expectation is that they will contribute to the advancement of the project through the development of their own research ideas and the adaptation and development of research protocols.

Successful applicants will ideally be in post by 1st October 2021.

How to Apply

For informal enquiries please contact Kevin Weatherill (K.j.weatherill@durham.ac.uk). All enquiries will be treated in the strictest confidence.

The Joint Quantum Centre (JQC) is one of the UK's leading centres for atomic, molecular and optical physics research. Members of the JQC span the Physics and Chemistry Departments at Durham University and the Applied Mathematics Department at Newcastle University. Projects within the JQC investigate experimental and theoretical topics ranging from laser cooling and Bose-Einstein Condensation to nonlinear optics and Rydberg physics. The atomic and molecular physics group in the Department of Physics comprises 10 faculty, 11 post-doctoral researchers and 22 Ph.D. students. Further details of the research activities of the group can be found at http://www.jqc.org.uk/

We prefer to receive applications online via the Durham University Vacancies Site: https://www.dur.ac.uk/jobs/. As part of the application process, you should provide details of 3 (preferably academic/research) referees and the details of your current line manager so that we may seek an employment reference.

Applications are particularly welcome from women and black and minority ethnic candidates, who are under-represented in academic posts in the University.

What to Submit

All applicants are asked to submit:

Next Steps

Shortlisted candidates will be invited for interview and assessments.

The Requirements

Essential Criteria:

Desirable Criteria:

DBS Requirement:Not Applicable.

See the article here:

Postdoctoral Research Associate in Quantum Light and Matter job with DURHAM UNIVERSITY | 262887 - Times Higher Education (THE)


Cloud Computing Trends & Future Technology 2021 – Datamation

For many years, tech experts have focused on the trend of enterprises making the initial move to the cloud. But Flexera's 2021 State of the Cloud Report tells a slightly different story, with 92% of enterprises already operating on a multicloud strategy and 82% on a hybrid cloud strategy.

So if most users already work on some kind of cloud, how is their cloud experience transforming right now?

Below are our top trends in cloud computing, which range from diversifying cloud infrastructure to considering how cloud use impacts the greater global environment:

Also read: Big Data Trends in 2021 and The Future of Big Data

Most major enterprises migrated data and operations to the cloud over the past several years, but during the height of the COVID-19 pandemic, companies learned the importance of a distributed, flexible infrastructure.

Enterprise leaders are quickly recognizing that not all clouds work for all of their needs, and some of their legacy systems and applications work better on their existing on-premises infrastructure. These realizations, and a growing need for both flexibility and reliable security, have ushered in a period of growth for hybrid and multicloud setups.

Kaushik Joshi, global managing director for strategic alliances at Equinix, a digital infrastructure and integration company, explained why hybrid and multicloud setups are taking off:

"The past year has seen a significant shift from private and public cloud-only deployments to hybrid and multicloud strategies," Joshi said.

He went on to share Equinix's recent Global Tech Trends Survey, which polled more than 2,600 IT decision-makers globally and highlighted that hybrid cloud is now the most common choice, with 46% of respondents now using a hybrid cloud (a 12% increase since their previous survey).

Hybrid cloud architectures represent the best path to engage with a rapidly changing infrastructure landscape, because they enable enterprises to more easily manage legacy, data-intensive processes while simultaneously embracing new born-in-the-cloud applications.

As more organizations recognize the different strengths of private clouds, public clouds, industry-specific clouds, and legacy on-premises setups, more cloud and data center vendors are working hard to create hybrid cloud and multicloud connections among the disparate systems.

More on hybrid cloud: Hybrid Cloud and Hyperconverged Infrastructure (HCI)

Security is a hot-button topic in the cloud computing world, with some users believing that cloud computing is more secure, while others believe it is less secure than their on-premises security infrastructure.

The truth that many of them are discovering is that managed vendor clouds and on-premises solutions alike need additional security support to combat a growing number of major data breaches and ransomware attacks.

Instead of relying on embedded, native security features, tech experts are advocating for the increased use of managed security service providers (MSSPs) and a better organizational policy for user access management. Organizations are recognizing that security incidents can come from both internal accidents and external actors, so its important that all users are trained and compliant with an organizations security policies.

Amit Bareket, CEO and co-founder of Perimeter81, a cybersecurity firm, believes that zero-trust security is the answer to growing cloud security concerns:

"Organizations are coming to realize they should not automatically trust anything inside or outside their perimeter," Bareket said. "That perimeter is more or less erased. So, now, we are seeing companies verifying everything trying to connect to their IT systems."

With zero-trust security, policy enforcement and protection are easily implemented by isolating applications and segmenting network access based on user permissions, authentication, and verification. By implementing the zero-trust network access (ZTNA) model for secure network access, IT teams can have full control over who is granted access to, enters, and leaves the network at all times. This model has gained much more recognition since being mandated in President Biden's executive order.
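
As a rough illustration (not any particular vendor's API), the deny-by-default logic at the heart of a zero-trust model can be sketched in a few lines: every request must pass identity, device, and permission checks before access to a network segment is granted, regardless of where the request originates. The names and policy structure below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    authenticated: bool      # identity verified (e.g., via MFA)
    device_compliant: bool   # device posture check passed

# Hypothetical policy: which network segments each user may reach.
POLICY = {
    "alice": {"app-frontend"},
    "bob": {"app-frontend", "database"},
}

def authorize(req: Request, segment: str) -> bool:
    """Zero-trust decision: deny by default, grant only when every check passes."""
    if not (req.authenticated and req.device_compliant):
        return False  # never trust an unverified identity or device
    return segment in POLICY.get(req.user, set())
```

The key design point is that a request coming from "inside" the network is evaluated exactly the same way as one from outside; there is no implicit trust based on network location.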

A large number of enterprise cloud users are moving toward Kubernetes, containerization, and other custom cloud efforts for cloud-native application development.

Simply having a cloud or several clouds is not enough. Dividing those clouds up into containers makes it possible for companies to develop microservices and applications that require different storage, security, and configuration features.

Anthony Cusimano, solutions evangelist at Veritas Technologies, an enterprise backup and recovery software company, explained that many users are leaning into containerization for the cost and efficiency benefits as well:

"The entire world is starting to shift its attention to Kubernetes and the orchestration of containers," Cusimano said. "It's the next iterative shift: we went from physical to virtual to cloud, and now we're going to microservices and containers."

"They make the hybrid cloud more cost effective and improve performance across the board. That's why you're starting to see some of the biggest cloud providers offering turnkey Kubernetes solutions."

Most of the top cloud vendors in the market offer separate advanced analytics and artificial intelligence (AI) capabilities, but a growing number of them are offering these features as cloud-native technology.

Tapan Patel, senior manager of AI and cloud at SAS, a major analytics software company, explained what expanding containerization and cloud-native features will mean for enterprises:

"Cloud-native technologies will usher in a new era of distributed enterprise analytics software designed to run wherever containers and a Kubernetes platform are available," Patel said. "Cloud-native technologies will also help companies to build, migrate, and modernize customer-facing analytics and artificial intelligence (AI) apps more easily and at scale."

Patel also explained that customers will be able to test out new analytics and AI operations when they adopt clouds with cloud-native features.

"Customers can get a taste of new and emerging analytics and artificial intelligence (AI) capabilities delivered in the cloud to attract new use cases and new users," Patel said. "Analytics in the cloud can be used as an ideation sandbox, which allows a lot of new users to build prototypes. Each phase of machine learning projects (build, train, deploy) requires different infrastructure and deployment considerations. Because the cloud is elastic, it provides the right level of scalability and availability."

Sustainability and climate care efforts are growing across industries, including technology and the cloud market. The technology industry is known for consuming a large amount of energy, and although cloud is typically more energy efficient than on-premises infrastructure, the growth of AI and the Internet of Things (IoT) is causing cloud technology to work harder than ever.

Ali Fenn, president of ITRenew, an asset recycler and data efficiency expert, explained that the circular cloud, a circular economic model of cloud asset recycling, is the next answer to sustainability concerns:

"Sustainability is a massive trend in IT, from enterprises seeking to ensure cloud providers leverage renewable energy in data centers to enterprises increasingly seeking to minimize their own supply chain carbon footprint via sustainably sourced, circular IT solutions," Fenn said.

Not only is this circular model good for sustainability goals, but it also makes a lasting impact on the bottom line. IT and business leaders are starting to realize they dont need to allocate additional money, time, and resources or compromise performance to have a more sustainable cloud computing model.

Coupled with an unstable supply chain, the circular cloud saves more than just the cost of supplies with its closed-loop supply chain that eliminates delays and constraints associated with inventory and shipment (a pain point for providers since pre-pandemic times).

Although few cloud vendors have adopted the circular model, most are changing their business models to emphasize more renewable energy use, carbon offsets, and data center efficiency boosts.

Read next: Data Analytics Market Trends 2021

See more here:
Cloud Computing Trends & Future Technology 2021 - Datamation


4 Top Cloud Computing Stocks To Watch Today | National | fwbusiness.com – FW Business

4 Hot Cloud Computing Stocks For Your Watchlist Now

Like it or not, the cloud computing industry continues to grow by the day. Accordingly, with the current shift towards digital workspaces, cloud computing stocks are also gaining traction in the stock market now. If anything, the demand, and possible use cases, for cloud computing are rapidly increasing in our world today. Before we dive into the details, what exactly is this emerging tech field, you might ask? Well, in essence, cloud computing is a suite of computing services that can be delivered over the Internet, or "the cloud." These can range from data storage, networking, analytics, and database services to other computing functions.

Notably, cloud computing serves a wide variety of purposes across numerous industries. We only need to look at the largest name in the industry today, Amazon (NASDAQ: AMZN), to see this. Just this week, the company's Amazon Web Services (AWS) division announced two major plays on the operational front. Firstly, General Electric's (NYSE: GE) Healthcare division is now moving several of its software platforms onto the AWS cloud. Secondly, news broke the day after that AWS also won a $10 billion cloud contract from the U.S. Defense Department.

Meanwhile, other names in the industry also continue to thrive amidst the current uptick in enterprise software spending. Last week, Datadog (NASDAQ: DDOG) smashed consensus estimates in its fiscal second quarter. The company reported total revenue of $233.5 million, marking a whopping 67% year-over-year increase. With all this activity in the cloud computing business now, I can understand if investors are keen to jump on. Should you be one of them, here are four to consider in the stock market today.

Best Cloud Computing Stocks To Buy [Or Sell] This Month

Microsoft Corporation

First off, we have Microsoft Corporation, a multinational tech company whose products and services are used by billions all over the world. Namely, its computer software and consumer services are immensely popular. For instance, its Microsoft Windows operating system is used by over 1.5 billion users worldwide. MSFT stock currently trades at $291.46 as of 11:22 a.m. ET and is up by over 35% in the past year. In late July, the company announced very strong fourth-quarter results.

To begin with, its revenue for the quarter was $46.2 billion, an increase of 21% year-over-year. The company also reported an operating income of $19.1 billion, up by 42% from a year earlier. Net income was $16.5 billion for the quarter, increasing by 49% year-over-year.

In fact, the company notes that its revenue this quarter is driven by cloud services. Impressively, revenue from its Intelligent Cloud segment was $17.4 billion, increasing by 30% year-over-year. Given the impressive financials, will you consider buying MSFT stock?

Alibaba Group Holding Ltd

Alibaba is a company that specializes in cloud computing and e-commerce. Its Alibaba Cloud, also known as Aliyun, provides cloud computing services to online businesses and also the company's own e-commerce ecosystem. Also, the company has a digital media and entertainment business. BABA stock currently trades at $187.92 as of 11:22 a.m. ET. On August 3, 2021, the company reported a stellar quarter. Firstly, it delivered revenue of $31.86 billion for the quarter, an increase of 34% year-over-year. Secondly, annual active consumers of the Alibaba Ecosystem across the world reached approximately 1.18 billion users.

This includes 912 million consumers in China and 265 million consumers overseas served by Lazada, AliExpress, Trendyol, and Daraz. The company also reported a net income of $6.63 billion and diluted earnings per share was $0.32. Alibaba says that it has multiple engines driving its long-term growth, expanding upon its consumer and industrial internet segments.

Daniel Zhang, Chairman and Chief Executive Officer of Alibaba Group, had this to say: "We believe in the growth of the Chinese economy and long-term value creation of Alibaba, and we will continue to strengthen our technology advantage in improving the consumer experience and helping our enterprise customers to accomplish successful digital transformations." All things considered, will you add BABA stock to your watchlist?

Salesforce.com Inc.

Salesforce is a cloud-based company with headquarters in San Francisco, California. The company is the world's No. 1 customer relationship management (CRM) platform. Also, its cloud-based CRM applications are used for sales, service, and marketing, and do not require IT experts to set up or manage. The company boasts more than 150,000 companies that use its platform to grow their businesses by strengthening customer relationships. CRM stock currently trades at $250.66 as of 11:23 a.m. ET.

On Tuesday, the company announced Salesforce+, a new streaming service for live experiences and original content series. The new service will include live experiences, original series, podcasts, and other programming. In detail, Salesforce+ will bring the magic of Dreamforce to viewers across the globe with speakers, success stories, and groundbreaking innovations.

Furthermore, with engaging stories and expert advice, Salesforce+ will illuminate the future of technology in the digital-first, work-anywhere world. For these reasons, will you consider CRM stock a top cloud computing stock to watch?

Snowflake Inc.

Next up, we will be taking a look at Snowflake. In brief, it identifies itself as a cloud computing-based data warehousing company. Through Snowflake's Data Cloud, organizations across the globe can make the most of their digital assets. According to Snowflake, it helps customers unite siloed data, discover and securely share data, and execute diverse analytic workloads. Among its thousands of clients spanning various industries, Snowflake now serves 187 of the Fortune 500 companies.

As it stands, SNOW stock currently trades at $290.08 a share as of 11:23 a.m. ET. Could the company have more room to grow moving forward? Well, for one thing, Snowflake continues to make strategic partnerships in the tech world. Just last month, the company announced support for The Trade Desk's (NASDAQ: TTD) Unified ID 2.0 (UID 2.0).

Namely, this would be a strategic play by Snowflake as more companies shift from conventional third-party ad trackers to TTD's ad tool. With UID 2.0 on the Snowflake Data Marketplace, customers can now optimize their conventional ad businesses to be more privacy-conscious. Overall, Snowflake appears to be catering to the shifting needs of its users. Could all this make SNOW stock a top pick in the stock market for you?

Read the rest here:
4 Top Cloud Computing Stocks To Watch Today | National | fwbusiness.com - FW Business
