
3 Alberta universities receive $25M in funding for quantum physics research – Global News

Three Alberta universities are pooling their resources to investigate the fundamentals of quantum science, with a focus on driving innovation decades from now.

"It's vital right now," said Rob Thompson, vice-president of research for Quantum Horizons Alberta.

"Because if we don't continue to push that end of our understanding of quantum (fundamentals), then 20 or 30 years from now, we'll run out of ideas."

Quantum physics, discovered in the early 1900s, is the study of the tiniest possible particles in the universe and allows for a deeper understanding of nature.

Quantum-powered tech is everywhere, from cellphones to home security systems to vehicles.

The current quantum industry, which includes semiconductors and medical imaging, relies on discoveries from three decades ago, said Thompson.


Scientists from the University of Alberta, the University of Calgary and the University of Lethbridge have received $25 million in private funding to answer several questions about the quantum world, which operates differently than the traditional understanding of physics.


"We've taken a step back and are looking at the foundational science on which some of today's technologies are built," said Andre McDonald, a mechanical engineering professor at the University of Alberta.

Dena McMartin from the University of Lethbridge said the research will go back to the basics of physics and mathematics to understand how the Earth works as a complex system and how it interacts with the solar system.

It will also look at how time moves.

"We're fascinated by the idea that time can be more circular," she said.


McMartin said many First Nations communities in Canada perceive time as circular, rather than linear, in a way that aligns closely with quantum science.

She said the Lethbridge node is working on bringing Indigenous quantum scientists to explore the concepts of time and gravity.

"It's hard to wrap our head around just how deep the questions are and how important they are."

The University of Lethbridge has already been working on quantum gravity, quantum sensing and quantum computing, said McMartin.


"We're looking at ways gravity interacts with Earth and other planets, and how Earth interacts with the solar system," she said.

Her team will also research how technologies are built to work on Earth and in space.


Quantum Horizons Alberta aims to hire at least seven quantum researchers over the next year, while also funding post-doctoral scholars and graduate students in their research.

Thompson said the specific areas of focus for the University of Calgary are still being worked out, in co-ordination with the two other nodes in Edmonton and Lethbridge.

"There are ranges of unanswered questions," he said.

One such question, said Thompson, is how two subatomic particles vast distances apart can be linked and change one another.

"That actually fundamentally violates relativity, another branch of physics, which says information can't travel instantaneously," he said.

"There are many, many questions at a foundational level still to be answered about quantum and every time we answer one of those questions, it opens up a whole new world for us to explore."

© 2023 The Canadian Press

View original post here:

3 Alberta universities receive $25M in funding for quantum physics research - Global News


Multiple worlds has been given artistic impetus by physics – Aeon

When I was in my mid-30s, I was faced with a difficult decision. It had repercussions for years, and at times the choice I made filled me with regret. I had two job offers. One was to work at a very large physics experiment on the West Coast of the United States called the National Ignition Facility (NIF). Last year, they achieved a nuclear fusion breakthrough. The other offer was to take a job at a university research institute. I agonised over the choice for weeks. There were pros and cons in both directions. I reached out to a mentor from graduate school, a physicist I respected, and asked him to help me choose. He told me to take the university job, and so I did.

In the years to come, whenever my work seemed dull and uninspiring, or the vagaries of funding forced me down an unwelcome path, or, worse, the NIF was in the news, my mind would turn back to that moment and ask: what if? Imagine if I were at that other job in that other state thousands of miles away. Imagine a different life that I would never live.

Then again, perhaps I had dodged a bullet, who knows?

Every life contains pain. Even the perfect life, the life where you have everything you want, hides its own unique struggles. Writing in The Genealogy of Morals (1887), Friedrich Nietzsche said: "Man, the bravest animal and most prone to suffer, does not deny suffering as such: he wills it, he even seeks it out, provided he is shown a meaning for it, a purpose of suffering." A life apparently perfect but devoid of meaning, no matter how comfortable, is a kind of hell.

In our search for meaning, we fantasise about the roads not taken, and these alternative lives take on a reality of their own, and, perhaps, they are real. In his novel The Midnight Library (2020), Matt Haig explores this concept. In it, a woman named Nora Seed is given the chance to live the lives she would have lived had she made different choices. Each life is a book in an infinite library. Opening the book takes her to live in that other world for as long as she feels comfortable there. Each possible world becomes a reality.

For centuries, philosophers have dreamed of possible worlds. But only with the advent of quantum physics and the need to interpret its counterintuitive predictions did it appear that these possibilities might be real. Introduced in the 1950s by a graduate student, Hugh Everett, to little fanfare, and promoted in the 1970s by the physicist Bryce DeWitt, the many-worlds interpretation of quantum physics has captured the public imagination and inspired a burst of art and culture. Born out of a need to interpret the behaviour of the smallest building blocks of our Universe, quantum physics has powered a cultural conversation from the depths of academic philosophy and science to the pinnacle of Hollywood's elite.

The modern concept of possible worlds is attributed to the German polymath, co-inventor of calculus, and rival to Isaac Newton, Gottfried Wilhelm Leibniz, in his work Theodicy: Essays on the Goodness of God, the Freedom of Man, and the Origin of Evil (1710). The phrase "best of all possible worlds" comes from this work and refers to Leibniz's attempt to solve the problem of evil by proposing that ours is the best possible world. In other words, any other possible world would contain more evil.

Could Socrates have been an alligator? Yes. His being a human is not necessary but contingent

Leibniz drew on the work of the 16th-century Spanish Jesuit priest Luis de Molina, who posited that God possesses "middle knowledge", the knowledge of what a person would do if placed in a given situation. In any given possible world, a person's actions are fixed but, from one world to another, they may act differently because of changes in their life circumstances. Hence, God gives us a kind of free will, which is essential to holding us responsible for our actions but, by his middle knowledge, places us in the best possible world for the greatest number of people; in this world, our choices are predetermined. Molina's theology proposes that even God requires some people to damn themselves to save others.

The contemporary American analytic philosopher Alvin Plantinga drew on Leibniz's theological ideas to produce his seminal work on possible worlds, The Nature of Necessity (1974). As in Haig's novel, Plantinga conceives of a library of books, each corresponding to a possible world. There, he defines a "book" on a world as everything that is true, including everything necessary (meaning true across all worlds) and everything that is contingent (meaning true only in some worlds). Each world has one, and only one, book of true things.

Plantinga illustrates the difference between necessary and contingent truths in this way: Could Socrates have been an alligator? Yes. There may be a possible world where Socrates wakes up, as in Franz Kafka's novella The Metamorphosis (1915), to find his body to be that of an alligator. Thus, Socrates' being a human is not necessary but contingent. It is not true in every book in the library. On the other hand, mathematical implications like 1 + 1 = 2 and logical proofs are true in all worlds. They are necessary.

Despite considering many possible worlds, like Leibniz and Molina, Plantinga asserts that there is only one real world. For him, alternative worlds are useful for philosophers to think about but do not actually exist.

The many-worlds interpretation (MWI) of quantum physics, on the other hand, says that all possible worlds exist, and the one we live in is no different from any of the others. According to one form of this belief, somewhere out there is an exact duplicate of you, your house, your family, but one small detail is different, perhaps something as tiny as a stray photon that went left instead of right, or maybe something big like you have a different significant other. Maybe a stray cosmic ray hit your DNA before you were born, and you have red hair instead of brown, or you developed a serious birth defect. Maybe you don't exist at all.

To the layperson, the idea of all these worlds existing out there might seem disturbing because it takes away from our own personal uniqueness. To philosophers like Plantinga, it is disturbing because it takes away from the uniqueness of truth.

A good example is Schrödinger's cat. In this classic thought experiment, a cat is placed in a box and the lid closed. Say I also put in the box a semi-reflective mirror that has a 50 per cent chance of letting light through, and a 50 per cent chance of deflecting light. Behind the mirror is Detector D (for Death), which can detect even a single photon of light and, if it does, it sends a signal that opens the lid of a vial of poison, filling the box with poison gas and killing the cat. Next to the mirror is Detector L (for Life), not hooked up to any poison. An automatic emitter inside the box is programmed to fire a single photon at the mirror at a certain time. We don't know which detector it will hit because it is random. Once it does, we wait a minute to ensure that the poison has had its effect.

Both are still possible: a single world containing two contradictory facts

If the box is completely sealed and impenetrable by anything external, we wont know what happened inside until we open it.

All this seems very ordinary until I take the quantum nature of light into account. A quantum particle, experimental science has shown, can be in two states at once until it is measured. Thus, when the photon is fired at the mirror, it does not go through or deflect. Rather, it enters a state where "having gone through" and "having been deflected" are both still possible: a single world containing two contradictory facts. These facts are, hypothetically, passed on to the cat, although nothing as large and complex as a warm-blooded animal could be put into such a state in practice.
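In the bra-ket notation physicists use for such states, the contents of the sealed box just before it is opened are often sketched as a single entangled superposition. A schematic rendering (an idealisation added here for illustration, using the detector labels above):

$$|\Psi\rangle = \frac{1}{\sqrt{2}}\Big(|\text{photon hits D}\rangle\,|\text{cat dead}\rangle + |\text{photon hits L}\rangle\,|\text{cat alive}\rangle\Big)$$

Neither term has happened; both remain on the books until a measurement picks one out.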

We know this is true for particles because of what physicists call the double-slit experiment. In it, a single beam of light is sent through two slits in a barrier to a screen on the other side. Even though the light originates as a straight beam, after it passes through the two slits, it emerges as two interfering waves hitting the screen together. This looks like alternating bars of light and dark.

We want to know if light is made of particles or a continuous wave. To do so, we fire the smallest units of light we can, little packets called photons, at the double slit. We hypothesise that if these appear at individual points, then photons are particles; but if they appear spread across the screen, then photons are waves. We begin the experiment and see immediately that the photons appear at individual points on the screen: score 1 for the particle hypothesis. If we continue firing photons, however, we find that the dots appear in the same alternating light and dark bars as if the photons were interfering with each other. Score 1 for the wave hypothesis.
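To see how individual dots can add up to fringes, here is a minimal Python sketch (added for illustration, not from the essay; the wavelength, slit separation and screen distance are invented values) that draws single-photon hits from the textbook two-slit intensity pattern:

```python
import numpy as np

# Illustrative geometry (assumed values, not from the essay)
wavelength = 500e-9    # 500 nm photon, green light
slit_sep = 250e-6      # separation between the two slits, metres
screen_dist = 1.0      # distance from slits to screen, metres

# Textbook two-slit fringe intensity; the single-slit envelope is ignored
x = np.linspace(-5e-3, 5e-3, 2001)           # positions across the screen
intensity = np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2
prob = intensity / intensity.sum()           # normalise into a probability

rng = np.random.default_rng(0)
hits = rng.choice(x, size=10_000, p=prob)    # each draw is one photon "dot"

# A handful of photons look like random specks; many thousands trace out
# the alternating bright and dark bars described above.
counts, _ = np.histogram(hits, bins=60)
for c in counts:
    print("#" * (c // 10))
```

Each photon lands at a single point, yet the accumulated histogram reproduces the wave-like interference bars.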

The reason this happens is that, when the photon goes through the barrier, it enters what physicists term a "superposition", where it has, in a sense, passed through both slits at the same time, like a wave, but arrived at one point on the panel, like a particle. This is called wave-particle duality.

In standard interpretations of quantum physics, we do not say that the photon has passed through both slits at the same time; rather, we say that its wavefunction, a kind of probability field, has passed through both slits at the same time. That wavefunction then "collapses", or vanishes, leaving the one photon on the panel. This resolves the contradiction neatly because we can assert that "the photon entered the left slit" and "the photon entered the right slit" are never simultaneously true. Rather, we say the wavefunction passed through the slits and collapsed into the photon's position on the screen.

According to the MWI of quantum physics, however, the entire wavefunction is a spectrum of alternative realities coexisting. These worlds are all connected, and the photons in them interact weakly before they are measured, but the very act of measurement causes them either to split apart or to appear to do so. When that split happens, copies of you and the rest of the Universe split apart as well.

The MWI is controversial and is itself subject to interpretation depending on whether you believe there is a quantum mechanism for world splitting, or if it is simply how human beings experience quantum phenomena.

Real or not, possible worlds explain strange quantum paradoxes. For example, in the double-slit experiment, if I place a detector in front of each slit, it will detect only a single photon going through one or the other. Never both. If I take the detectors away, I get the interference pattern as if the photon went through both slits. This creates a paradox. Why can it be one way when I measure, and another when I don't?

This doesn't happen in classical physics. If I shoot an arrow at a bullseye, I can be absolutely certain that the arrow will follow a single trajectory from my bow to the target, whether I watch it fly or not. If I don't watch it but imagine a world where I did, that is called a "counterfactual" world. In classical physics, counterfactual worlds and real worlds are always the same, but in quantum mechanics they are not. The world is really different if I look at a particle flying through space versus if I do not.

Physicists knew this to be true in the 1920s, but it took more than 60 years before anyone proposed a way to split the difference between looking and not looking. In 1988, the physicists Yakir Aharonov, David Albert and Lev Vaidman introduced such a method, called weak measurements. These measurements collect some information about particles and, over the course of many, many measurements, can give us statistical information that helps us understand what is going on inside a quantum superposition.

We are more like two-dimensional beings in a 3D world, perceiving only our little slice

Weak measurements let us detect traces of particles even when they are not present. If there is a trace of a particle, that means it had some measurement effect but was not necessarily there in any real sense. This is what researchers see during the double-slit experiment. A particle has a trace from both slits because of the pattern on the screen but has no presence in either. If a particle is present, that would be ascertained through a strong measurement where it is localised, literally appearing on a detector screen.

The MWI interprets trace and presence in a unique way. A trace is when particles in different worlds have not been measured strongly enough to stop interacting, so the worlds are not split. When the worlds cease interacting (split), then trace becomes presence.

Real-world studies of weak measurements have been designed with atoms, photons and other elements of the quantum world. For example, a lens can deflect photons in a laser slightly and cause them to interfere differently with another beam of photons than if the lens is not present. You can imagine, therefore, if you were to put lenses in front of the slits, they would have a measurable effect but, if the deflection is very slight, it would not be enough to collapse the wavefunction or split the worlds. Using that fact, you can construct experiments that allow you to see traces without presence.

Real experiments measure bizarre effects inside superpositions. For example, experiments with both photons and atoms have been done that show that sometimes a particle duplicates so that it can be in two places at once but each with 100 per cent probability, not the 50 per cent probability of the double slit. The particle will compensate by spawning a negative copy of itself, also with 100 per cent probability, somewhere else, so that the total still adds up to one.

These results are counterintuitive unless you believe the wavefunction is a real thing, in which case the particle is a wavefunction that has 100 per cent probability peaks in two spots and a (-100 per cent) trough in another.

For this reason, some flavours of the MWI, such as Vaidman's, maintain the primacy of the wavefunction over the concept of having multiple copies of the world that split. In other words, the multiverse isn't many worlds but one world, and we are more like two-dimensional beings in a 3D world, perceiving only our little slice. Worlds are like pieces in a jigsaw puzzle, fitting together in a commonsense way when together, but defying intuition when left apart.

This suggests that our lives too might be a jigsaw puzzle. Perhaps they make sense only when we look at them across a multiverse of possible lives and, if we could only talk to those other copies of ourselves, we could understand our experiences. Consider that, when we imagine ourselves in other possible worlds, we don't just want to know how our alternative selves are getting along. We want to know what they would think of us, what it would be like to speak to them, and we want to know what it might be like to live in those other worlds that those other selves inhabit. More than that, we want to resolve the uncertainty we have in our own past decisions by asking them: "How did it work out?" The only way to do that is to uncover the looking glass and glance through.

One means of connecting with our alternative selves is through literature, film and the arts. The MWI first appeared in Michael Moorcock's novella The Sundered Worlds (1962), a space opera that ranges across a vast multiverse. In this Star Wars-like action novel, the hero Renark von Bek undertakes to save the multiverse from Armageddon. This novel also hosted some of the earliest uses of virtual reality, computer tablets, digital displays and, of course, quantum physics, and it also launched Moorcock's long career.

Since then, numerous novels, movies and TV shows have made use of the concept, including children's fiction. The first book about a parallel universe that I recall reading was the children's book The Double Disappearance of Walter Fozbek (1980) by Steve Senn, about a boy who somehow swaps places with his dinosaur counterpart in a world where people are all dinosaurs. As a child, I was blown away by this idea of parallel worlds, and that remained my favourite book for many years.

A rupture opens a doorway, a necessary trope for reaching our parallel selves

The idea has captured the movies, too. Among the many multiverse films are those in the Back to the Future trilogy (1985-90), about what happens when we go back in time, change the past, and find the future is another world entirely. There's also Spider-Man: Into the Spider-Verse (2018), a computer-animated smash hit about a high-school student, Miles Morales, who becomes a Black Spider-Man in his own universe and teams up with Spider-people (men, women, and even Spider-Ham, a pig) from other universes to defeat his nemesis Kingpin. Also, Doctor Strange in the Multiverse of Madness (2022), a Marvel Universe battle between good and evil in parallel worlds; and the Academy Awards' Best Picture winner, Everything Everywhere All at Once (2022), about a heroine who learns that she can draw skills and powers from her alternative selves to battle villains who threaten the world.

In each work, a rupture opens a doorway, a necessary trope for reaching our parallel selves. Yet the MWI actually tells us that worlds are generally unreachable. The work on weak measurements means that worlds can diverge without completely disconnecting. A better device might be a hidden passage that already exists, more like the wardrobe portal in C S Lewis's Chronicles of Narnia series (1950-56) than a dangerous rip in space and time. I have yet to read a story where the plot revolves around keeping worlds from separating rather than worlds accidentally and catastrophically merging, but that might be more realistic.

In some cases, the literary purpose of the multiverse is not so much to connect parallel worlds as to tell different stories with the same characters. Star Trek, for example, depended on the multiverse for its James T Kirk reboot movies (2009-16), allowing the director J J Abrams to skirt around canon and change details to reimagine the young Kirk and his adventures on the USS Enterprise.

In Spider-Man: No Way Home (2021), the MWI is used to reboot Spider-Man once more, explaining how the different actors who have played Spider-Man over the years (Tobey Maguire, Andrew Garfield and Tom Holland) might all exist simultaneously in different universes, and how they might meet up to fight as a team. The multiverse is not only a fun way to have all three actors appear in the movie but also a means of exploring how their characters differ and what they thought of the choices they made and the challenges they each faced, both similar and unique.

The multiverse has also opened up new ways of looking at the human condition. One of the most fascinating areas where culture, philosophy and possible worlds collide is in the work of Robert Lanza on biocentrism, which is a philosophical approach to physics through the lens of living beings. Lanza, a professional biologist, proposes that the Universe arises directly from an individual's conscious observation of it. He hypothesises that, for this reason, a conscious being cannot cease to be conscious. This leads to the striking claim that it is impossible to be dead. Instead, one's consciousness simply splits off, by quantum processes, into worlds where that consciousness can continue to exist. Every wavefunction collapse or world splitting leaves us in a world where we remain alive.

Another novel, The Doors of Eden (2020) by Adrian Tchaikovsky, explores parallel worlds through the phenomenon of branching evolution. For each parallel Earth in the story, a different species dominates, having continued on, rather than suffering extinction. For instance, the author imagines what a society of trilobites might look like. As in many multiverse stories, reality collapses and the different worlds bleed into one another. The book contains many detailed and imaginative scenarios about speculative evolution, and, from an MWI perspective, it is perfectly reasonable to imagine many different potential evolutionary outcomes, since evolution is highly dependent on randomness, including quantum variations in cosmic rays striking DNA.

Even the art world has taken notice of the multiverse. In response to the COVID-19 pandemic, the Burning Man in the Multiverse experience in 2020 showcased the multiverse with immersive visual styles in a virtual event. In this project, eight teams developed different virtual universes, with a unique Burning Man in each. You could traverse the Burning Man Playa (the dry lake bed at Black Rock City where the event normally takes place) in virtual reality as an avatar, explore art and sculpture created within a virtual world, and imagine the parallel realities of the annual festival itself.

What greater despair than to believe you are living the wrong life?

The most powerful reason why the multiverse has infiltrated culture is because people are storytellers. Research shows that this tendency is universal and appears in early childhood. It is written in our DNA. Implicit in storytelling is the modification of details such that one possible world becomes another. Such narratives are essential to how our species has understood the world for millennia. Meta-stories containing conflicting possible worlds simultaneously become not only plausible but essential to how we interpret our perceptions: personal, nonlinear and qualitative, rather than objective, linear and quantitative.

The human mind even creates its own multiverses through dreams, where alternative realities appear. Who hasn't dreamed of a loved one acting in ways they never would, or living in a house that they've never seen before? Fundamentally, the human mind has evolved to imagine multiple possible futures branching out from the present. Whether this is actually the case is an open question that physics still must resolve, if it ever can.

While the many-worlds interpretation has at times been overused, the pervasiveness of the multiverse in culture is a shift with benefits. There is more than one way to see the world, and every conscious mind may create its own version of reality. In a world awash with data, hard facts have become difficult to come by, and everyone needs to have their minds open to the possibilities that what they believe or have been told is only one of many possible worlds.

On the other hand, when we start longing to live in one of those alternative realities, it can make us desperately unhappy. This is the curse of imagining all these branching pathways in our lives. As the American novelist James Branch Cabell wrote in The Silver Stallion (1926): "The optimist proclaims that we live in the best of all possible worlds; and the pessimist fears this is true." What greater despair than to believe you are living the wrong life? Yet, how can we claim a life is wrong? A life full of suffering is not a meaningless one, as Nietzsche points out.

As Nora understands at the end of Haig's The Midnight Library:

This Essay was made possible through the support of a grant to Aeon+Psyche from the John Templeton Foundation. The opinions expressed in this publication are those of the author and do not necessarily reflect the views of the Foundation. Funders to Aeon+Psyche are not involved in editorial decision-making.

Excerpt from:

Multiple worlds has been given artistic impetus by physics - Aeon


Glitches in the matrix – The Source – Washington University in St. Louis

The most interesting parts of nature are often the imperfections. That's especially true in quantum physics, the atomic-level world where tiny flaws can make a big difference in the ways particles behave and interact.

As reported in a paper in Nature Communications, Chong Zu, an assistant professor of physics in Arts & Sciences at Washington University in St. Louis, and his team are finding new ways to harness the quantum power of defects in otherwise flawless crystals.

The work is supported in part by the Center for Quantum Leaps, a signature initiative of the Arts & Sciences strategic plan that aims to apply quantum insights and technologies to physics, biomedical and life sciences, drug discovery and other far-reaching fields.

Zu's lab is looking at atomic flaws in boron nitride, a material that forms sheets so thin it can be considered two-dimensional. Boron nitride is generally unchanging and uniform but, every once in a while, a missing boron atom will leave a tiny space. These gaps can happen naturally, but Zu and his team, including graduate student Ruotian (Reginald) Gong, sped up the process by bombarding microscopic flakes of the material with atoms of helium, little atomic bullets that randomly knock out boron atoms.

The resulting gaps have important quantum potential. The voids naturally fill with electrons that are highly sensitive to their surroundings. For example, tiny shifts in magnetic fields and temperature can change the spin and energy state of the electrons. This sensitivity makes them potentially useful as quantum sensors. In the new study, Zu, Gong and colleagues showed for the first time that the electrons also react to changes in electric fields, expanding the range of potential applications.

Because these particular sensors are trapped in a thin, stable matrix of boron nitride, they could theoretically be applied to a wide variety of substances, from geologic to biologic. Other types of sensors are typically created in a vacuum environment that must be chilled to temperatures near absolute zero.

"You could never put something that cold next to a living cell," Zu said. The sensors made from boron nitride, however, operate at room temperature.

The boron nitride sensors could also be used in basic simulation experiments to study quantum interactions of particles, Zu said. Physicists often use computer programs to predict how particles might interact, he said, but the systems are so complex that even the highest-powered computers can only work so fast.

"Instead of trying to build the systems on a computer, you can just create the exact system that you want to study and then examine the interactions," he said.


See the rest here:

Glitches in the matrix - The Source - Washington University in St. Louis


Research Fellow (Energetics of Quantum Measurement), Centre For … – Times Higher Education

About the Centre for Quantum Technologies

The Centre for Quantum Technologies (CQT) is a research centre of excellence in Singapore. It brings together physicists, computer scientists and engineers to do basic research on quantum physics and to build devices based on quantum phenomena. Experts in this new discipline of quantum technologies are applying their discoveries in computing, communications, and sensing.

CQT is hosted by the National University of Singapore and also has staff at Nanyang Technological University. With some 180 researchers and students, it offers a friendly and international work environment.

Learn more about CQT at www.quantumlah.org

Job Description

We are searching for motivated and talented post-docs interested in the fundamental resource cost of quantum measurement and in the related advantages offered by quantum resources. The post-doc will join the Quantum Energy Team QET@Singapore led by A. Auffèves.

Quantum measurement lies at the crossroads between quantum foundations and quantum technologies. On the one hand, the measurement problem has fuelled all debates about the meaning and completeness of quantum theory; on the other, measurements are key processes in quantum technologies, as they bring results to the (macroscopic) level of the end user. The present project aims to analyze the resource cost of quantum measurement and how it relates to information extraction at the quantum and classical levels. We will optimize the resulting measurement energy efficiency, with special interest in possible advantages when quantum resources are exploited to perform the measurements [1], in the fact that quantum measurement can behave as an energetic resource in quantum engines [2,3], and in the intimate relation between energy cost and reversibility. The post-doc will develop theoretical concepts and models, interact with a wide network of top-level experimentalists, and supervise PhD students.

Website of the Quantum Energy Team |QET>: https://quantum-energy-team.cnrs.fr

Website of the Quantum Energy Initiative: https://quantum-energy-initiative.org

Job Requirements

More Information

For enquiries and details about the position, please contact AUFFEVES Alexia at alexia.auffeves@cnrs.fr.

Please include your consent by filling in the NUS Personal Data Consent for Job Applicants.

Employment Type: Full-time

Applications can be submitted via the link below and should contain: the latest CV, and letter of recommendation (if any).

Department: [[Centre For Quantum Technologies]]
Job requisition ID: [[18794]]

Covid-19 Message

At NUS, the health and safety of our staff and students are among our utmost priorities, and COVID-vaccination supports our commitment to ensure the safety of our community and to make NUS as safe and welcoming as possible. Many of our roles require a significant amount of physical interaction with students, staff and members of the public. Even for job roles that may be performed remotely, there will be instances where on-campus presence is required.

Taking into consideration the health and well-being of our staff and students, and to better protect everyone on campus, applicants are strongly encouraged to be fully vaccinated against COVID-19 to secure successful employment with NUS.

Read more here:

Research Fellow (Energetics of Quantum Measurement), Centre For ... - Times Higher Education


U.S., India Rapidly Expand Their Military Cooperation – Department of Defense

"This is a transformational moment in the U.S.-India defense partnership," a senior Defense Department official said at the Pentagon today.

"To have the world's largest democracies with some of the most innovative workers and companies working more closely together on strategic technologies and how we can leverage them for security is a natural next step in this relationship," the official said during a briefing.

The United States and India are increasingly doing things in their defense partnership that people wouldn't have said were possible 20 years ago, the official said.

For instance, 20 years ago, there were no U.S. defense sales to India at all, the official said. "Now, we're talking about co-producing and co-developing major systems together."

Also, India is joining the U.S. in annual air and maritime exercises in the region, the official said.

"We now have working groups on everything ranging from cyberspace and critical technologies to maritime security, and India is leading in those forums together with the U.S. and like-minded partners," the official said.

Critical technologies include artificial intelligence, advanced sensor development, unmanned systems, quantum physics and undersea domain awareness, the official said.

India will be a critical strategic partner with the United States in the coming decades. India's growing commitment to playing a more engaged international role, including in the Indo-Pacific Quad, demonstrates a new and growing willingness to join the United States to protect and advance a shared vision of a free, open and rules-based global order, the official said.

The Quadrilateral Security Dialogue, commonly called the Quad, is a strategic security dialogue among Australia, India, Japan and the United States.

Tomorrow, the new INDUS X initiative launches, bringing together U.S. and Indian stakeholders, research and academic institutions, industry, small startups and investors.

That initiative will focus on accelerating and scaling up commercial technologies that have military applications, providing agreed-upon standards for certification and testing, and making it easier for startups to move their technology into the defense spaces and obtain the capital to do so, the official said.

According to the U.S. Chamber of Commerce, which is hosting the event, INDUS X will be held in Washington just prior to Indian Prime Minister Narendra Modi's White House visit.

"INDUS X has the potential to be a catalyst for India to achieve its target of $5 billion in defense exports by 2025 and for India to diversify its defense supply chain. The conference will feature a defense exhibition where firms will showcase technologies and platforms that can benefit both countries' border security, maritime domain awareness, space situational awareness and more, contributing to a more stable and secure Indo-Pacific region," according to a statement on the Chamber's website.

Excerpt from:

U.S., India Rapidly Expand Their Military Cooperation - Department of Defense


Transforming Cell Phone Radio Frequency With Quantum Apertures – Now. Powered by Northrop Grumman.

The Rydberg sensor is a technology rooted in the fundamental structure of nature on the smallest scales that points to a complete revolution in the detection of radio waves. And it may be coming soon to that cell phone in your hand.

The current cell phone radio frequency network is an engineering marvel that most of us never think twice (or even once) about, except when a call gets dropped and we find ourselves talking into dead air. The engineers who designed and operate the cellular network work very hard to maintain it, but let's take a look at this new technology and how it may change cell service forever.

Switching Heavy Traffic

As outlined by UCSB, a cellular handset (aka your phone) contains a compact, low-powered two-way radio, with sufficient range to connect to a nearby base station mounted on a cellular tower. The base station, in turn, uses higher-powered long-range equipment to connect to broader regional and global networks.

The whole point of this system is to enable users to move around freely, including moving from one base stations reception area to anothers. According to Dr. Sid Ghosh of Northrop Grumman, the normal sequence of operations when making a phone call on the move is as follows:

"All of this works impressively well," says Ghosh, "and a call being dropped during handoff is quite rare." But mobility does put challenging demands on cell phone radio frequency technology. "When you are driving or on a train," explains Ghosh, "the handovers tend to get tricky since the user may not be optimally located for cell tower coverage."

Even if dropped calls are rare, multiply the basic technology challenge by the number of calls being made, and the problem of dropped calls and generally poor cell phone reception becomes serious, especially when it's your call that gets dropped or garbled.

The Antenna Fiddling Challenge

A key component of any radio frequency detector is the antenna that picks up the signal. This is, in principle, simply a length of wire. If the wire is the right length and oriented in the right direction, the electric field of passing radio waves will trigger an electric current in the wire, which can be detected and amplified.

This has been the principle of every standard radio receiver since, as noted by the AAAS, Heinrich Hertz first demonstrated and reported on the existence of radio waves in 1889. But it is not ideal for rapid and frequent handoffs from one cell phone radio frequency base station to the next. The thing about antennas is that their ability to pick up a signal depends on the antenna's physical size and geometry, which need to be adjusted to pick up a signal well.

Anyone who's moved a radio around to improve reception or fiddled with old-fashioned TV rabbit ears will appreciate that this can be tricky. But as Ghosh explains, quantum physics now provides an entirely different way of detecting radio waves that can revolutionize cell phone reception.

Giant Atoms and a Tiny Detector

Under the right circumstances and hit with the right frequency of laser light, atoms of alkali metals, such as cesium and rubidium, can swell up enormously to a diameter of about 1/25,000 of an inch: still submicroscopic, but about 10,000 times the size of ordinary atoms.

This atomic bloat is an effect of quantum mechanics. The atomic nucleus is unchanged, but the orbits of its outermost electrons are pumped up by the laser and pushed outward to the atom's far outer envelope, where they are only tenuously bound to the nucleus. This loose binding makes these outer electrons extremely sensitive to electric fields, such as those produced by radio waves.
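Those figures can be sanity-checked with the textbook scaling that a Rydberg atom's radius grows roughly as the square of its principal quantum number times the Bohr radius. A back-of-envelope Python sketch (my own illustration, not from the article; the quantum number n = 137 is an assumed, illustrative excitation level):

```python
# Rough check of the "1/25,000 of an inch" and "10,000 times" figures above.
BOHR_RADIUS_M = 5.29e-11       # metres
INCH_M = 0.0254                # metres per inch

def rydberg_radius_m(n: int) -> float:
    """Approximate orbital radius of a Rydberg state with quantum number n."""
    return BOHR_RADIUS_M * n ** 2

n = 137                        # illustrative "giant atom" excitation level
r = rydberg_radius_m(n)
print(f"n = {n}: radius ~ {r * 1e6:.2f} micrometres, ~1/{INCH_M / r:,.0f} inch")
print(f"ratio to a ~0.1 nm ordinary atom: ~{r / 1e-10:,.0f}x")
```

For n around 137 this gives a radius near one micrometre, about 1/25,000 of an inch and roughly 10,000 times a 0.1-nanometre atom, consistent with the figures quoted above.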

In effect, the radio signal's fluctuating field jiggles the loosely bound outer electrons, producing changes in the optical spectrum of the Rydberg atoms, and an optical sensor can detect these spectrum changes. It adds up to a nifty physics bank shot: radio waves jiggle the outer electrons of Rydberg atoms, producing spectrum changes in the (vastly shorter wavelength) optical band.

Moving Beyond the Antenna

What makes this one weird physics trick so important to radio technology is that, unlike any conventional radio antenna, Rydberg atoms do not need to be arranged at a particular length or pointed in a particular direction to pick up a signal. The only thing that needs to be adjusted is the frequency of laser light used to pump up the Rydberg atoms. This property drew the particular interest of the Defense Advanced Research Projects Agency (DARPA), most famous for having developed the internet.

Changing a laser frequency is a much faster and simpler process than physically adjusting the size and orientation of an antenna, particularly when the receiver is on the move and must continually switch frequency as it passes from base station to base station. Size also matters when it comes to mobile radio devices. Conventional antennas need to be proportionate in size to the radio waves they are intended to detect, a major constraint on the available radio frequency spectrum when it comes to devices, like cell phones, that must be physically compact.
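The constraint is easy to put in numbers using the textbook rule that a resonant half-wave dipole is about half the wavelength it receives. A quick sketch (added for illustration; the frequency bands are common examples, not taken from the article):

```python
# Why antenna size constrains mobile radios: half-wave dipole length = c / (2f)
C_M_PER_S = 299_792_458.0  # speed of light

def half_wave_dipole_m(freq_hz: float) -> float:
    return C_M_PER_S / (2.0 * freq_hz)

for label, freq in [("FM radio, 100 MHz", 100e6),
                    ("Cell band, 900 MHz", 900e6),
                    ("Wi-Fi, 2.4 GHz", 2.4e9),
                    ("5G mmWave, 28 GHz", 28e9)]:
    print(f"{label:>18s}: ~{half_wave_dipole_m(freq) * 100:.1f} cm")

# A Rydberg gas cell stays around one cubic centimetre across all of these
# bands; only the laser frequency that prepares the atoms has to change.
```

The resonant lengths run from about a metre and a half down to millimetres, which is exactly the size-versus-band tradeoff a Rydberg cell sidesteps.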

In contrast, explains Ghosh, Rydberg atoms "can act as electrically small antennas and detect the modulation of a carrier wave. They can detect and demodulate AM, FM and phase modulation over a broad range of carrier frequencies, making them a promising platform for advanced communication receivers."

Moreover, he adds, the Rydberg receiver is "frequency agnostic over a wide range of operating frequencies, and thus, the size of a Rydberg receiver does not need to scale with operating frequency to maintain optimal performance." The current standard for Rydberg sensor elements, notes Ghosh, is that the sensor must fit inside one cubic centimeter. This is far more compact than most conventional antennas.

The Future of Rydberg Sensors

The Rydberg sensor technology is currently still in its research and development stage in the Quantum Apertures program. But as Ghosh outlines, the gas cell technology for producing Rydberg atoms is maturing rapidly, while similar rapid development is taking place in adjacent peripheral technology in the commercial optical communication domain that's helping move the concept of a Rydberg receiver toward a viable product that can operate over a wide range of frequencies in various application scenarios.

As reported by EE Times, the wide scope of capabilities offered by Rydberg sensor technology has drawn the interest of NASA as well as DARPA, pointing toward space-based as well as ground-based applications. The possible range of applications is enormous, but few are more ubiquitous than the cell phone network, which makes it likely that people will soon be holding Rydberg sensor technology in their hands. But with improved reliability and performance, they'll probably spend even less time thinking about the technological marvel they're holding than they do now.


Read more:

Transforming Cell Phone Radio Frequency With Quantum Apertures - Now. Powered by Northrop Grumman.


Large Hadron Collider may be closing in on the universe’s missing … – Space.com

Physicists at the Large Hadron Collider (LHC) are closing in on an explanation for why we live in a universe of matter and not antimatter.

Matter and antimatter are two sides of the same coin. Every type of particle has an anti-particle, which is its equal and opposite. For instance, the antimatter equivalent of a negatively charged electron is a positively charged positron.

The Standard Model of physics tells us that if we substitute a particle for its antiparticle, it should still operate within the laws of physics in the same way. As such, the Big Bang should not have had a preference for creating one type over another: this symmetry at the heart of nature means that matter and antimatter should have formed in equal amounts in the Big Bang.


Lucky for us, this does not seem to have been the case, because when you put matter and antimatter together, the outcome is explosive to say the least. Had matter and antimatter been crafted in equal amounts, then they would have annihilated each other, creating a cosmos filled with a sea of radiation, no atoms and no life. Today, the only antimatter is that which is produced in particle decays and interactions.

However, physicists still don't have an explanation for why we are so fortunate. The fact that there's an excess of matter in the universe means that, somewhere along the line, the symmetry in the way that matter and antimatter interact with the laws of physics was broken.

Physicists call this symmetry-breaking a "charge-conjugation parity (CP) violation." One way to envisage it is to consider the rotational symmetry of a particle. Quantum physics theory holds that particles are not solid objects but rather strange little bodies that act like waves along a "wave function." Ordinarily, when you spin that wave function around 360 degrees, the properties of the particle should not change. But when there is a CP violation, the properties of some particles can change: for instance, their quantum spin can alter from +1/2 to -1/2.

CP violation is known to take place in the weak force, which is the fundamental force that is responsible for radioactive decay inside atoms, so we know it can happen (although the weak-force example is a different CP violation than the one that could have possibly created the matter-antimatter imbalance). However, in 2013, scientists working on the LHCb (LHC-beauty) experiment also detected CP violation in the decay of "beauty mesons" and "strange beauty mesons," in which the matter and antimatter versions of these particles behave differently when they decay.

The atoms in our bodies are made of protons and neutrons, which themselves are made of three smaller particles called quarks. Physicists call particles made of three quarks "baryons." Particles made of two quarks (one quark and one anti-quark) are called "mesons," and they tend to decay quickly. "Beauty" is another name for the "bottom" quark, while "strange" refers to a "strange" quark. (The names are just for descriptive purposes to differentiate quarks with slightly different properties and are not to be taken literally.)

Now, analysis of new and more comprehensive results from the LHCb experiment has measured more precisely than ever before the two most important parameters in the CP-violating decay of these mesons.

"These are key parameters that aid our search for unknown effects from beyond our current theory," said LHCb spokesperson Chris Parkes in a statement.

Probing the decay of approximately 349,000 mesons, the LHCb team measured the angle at which the particles that come from the decay of the mesons were emitted, and the time taken for the mesons to decay. Both properties vary, depending on whether the meson is a matter or antimatter particle.

In particular, the time taken for a meson to decay (which is on the scale of tenths of a nanosecond) is dependent on the quantum state of the meson.

Experiments have observed that mesons are able to oscillate between their matter and antimatter states, which have ever-so-slightly different masses. This is because mesons exist in a state of "mixing": they are a mixture of their matter and antimatter states, which allows them to oscillate back and forth between those states.

As the oscillations take place, the wave functions of the two states can interfere with one another, a bit like the constructive/destructive interference of light in the famous double-slit experiment. The time to decay depends strongly on the masses of the quantum states and the amount of interference between them, which results in a characteristic pattern of CP violation in the meson decays.
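In the simplest two-state mixing picture (a textbook idealisation added here for illustration; it ignores CP violation and the small width difference between the states), that decay-time dependence reads:

$$P_{\text{unmixed}}(t) \approx e^{-\Gamma t}\,\frac{1 + \cos(\Delta m\, t)}{2}, \qquad P_{\text{mixed}}(t) \approx e^{-\Gamma t}\,\frac{1 - \cos(\Delta m\, t)}{2}$$

Here $\Gamma$ is the decay width and $\Delta m$ is the tiny mass difference between the two quantum states; the oscillating cosine is the interference term, and distortions of this pattern between matter and antimatter are what signal CP violation in the decay angles and times that LHCb measures.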

"These measurements are interpreted within our fundamental theory of particle physics, the Standard Model, improving the precision with which we can determine the difference between the behavior of matter and antimatter," said Parkes. "Through more precise measurements, large improvements have been made in our knowledge."

The LHCb team was able to measure these properties with unprecedented accuracy. Although the decay of mesons will not fully answer why there is more matter than antimatter in the universe, understanding the symmetry-breaking CP violation at the heart of their decays will help constrain models that do attempt to explain this strange asymmetry, which acted in force at the beginning of time to create a universe dominated by matter.


Continued here:

Large Hadron Collider may be closing in on the universe's missing ... - Space.com


The expansion of the universe could be a mirage, new theoretical … – Livescience.com

The expansion of the universe could be a mirage, a potentially controversial new study suggests. This rethinking of the cosmos also suggests solutions for the puzzles of dark energy and dark matter, which scientists believe account for around 95% of the universe's total energy and matter but remain shrouded in mystery.

The novel approach is detailed in a paper published June 2 in the journal Classical and Quantum Gravity by University of Geneva professor of theoretical physics Lucas Lombriser.


Scientists know the universe is expanding because of redshift, the stretching of light's wavelength towards the redder end of the spectrum as the object emitting it moves away from us. Distant galaxies have a higher redshift than those nearer to us, suggesting those galaxies are moving ever further from Earth.
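For reference, the standard definition (not spelled out in the article): redshift $z$ compares observed and emitted wavelengths, and in conventional cosmology it tracks the cosmic scale factor $a(t)$:

$$z = \frac{\lambda_{\text{obs}} - \lambda_{\text{emit}}}{\lambda_{\text{emit}}}, \qquad 1 + z = \frac{a(t_{\text{obs}})}{a(t_{\text{emit}})}$$

Lombriser's reinterpretation, discussed below, keeps the first relation but retells the physical story behind why $\lambda_{\text{obs}}$ exceeds $\lambda_{\text{emit}}$.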

More recently, scientists have found evidence that the universe's expansion isn't fixed, but is actually accelerating faster and faster. This accelerating expansion is captured by a term known as the cosmological constant, or lambda.

The cosmological constant has been a headache for cosmologists because predictions of its value made by particle physics differ from actual observations by 120 orders of magnitude. The cosmological constant has therefore been described as "the worst prediction in the history of physics."

Cosmologists often try to resolve the discrepancy between the different values of lambda by proposing new particles or physical forces, but Lombriser tackles it by reconceptualizing what's already there.

"In this work, we put on a new pair of glasses to look at the cosmos and its unsolved puzzles by performing a mathematical transformation of the physical laws that govern it," Lombriser told Live Science via email.

In Lombriser's mathematical interpretation, the universe isn't expanding but is flat and static, as Einstein once believed. The effects we observe that point to expansion are instead explained by the evolution of the masses of particles such as protons and electrons over time.

In this picture, these particles arise from a field that permeates space-time. The cosmological constant is set by the field's mass and because this field fluctuates, the masses of the particles it gives birth to also fluctuate. The cosmological constant still varies with time, but in this model that variation is due to changing particle mass over time, not the expansion of the universe.

In the model, these field fluctuations result in larger redshifts for distant galaxy clusters than traditional cosmological models predict. And so, the cosmological constant remains true to the model's predictions.

"I was surprised that the cosmological constant problem simply seems to disappear in this new perspective on the cosmos," Lombriser said.

Lombriser's new framework also tackles some of cosmology's other pressing problems, including the nature of dark matter. This invisible material outnumbers ordinary matter particles by a ratio of 5 to 1, but remains mysterious because it doesn't interact with light.

Lombriser suggested that fluctuations in the field could also behave like a so-called axion field, with axions being hypothetical particles that are one of the suggested candidates for dark matter.

These fluctuations could also do away with dark energy, the hypothetical force stretching the fabric of space and thus driving galaxies apart faster and faster. In this model, the effect of dark energy, according to Lombriser, would be explained by particle masses taking a different evolutionary path at later times in the universe.

In this picture, "there is, in principle, no need for dark energy," Lombriser added.

Post-doctoral researcher at the Universidad ECCI, Bogotá, Colombia, Luz Ángela García, was impressed with Lombriser's new interpretation and how many problems it resolves.

"The paper is pretty interesting, and it provides an unusual outcome for multiple problems in cosmology," García, who was not involved in the research, told Live Science. "The theory provides an outlet for the current tensions in cosmology."

However, García urged caution in assessing the paper's findings, saying it contains elements in its theoretical model that likely can't be tested observationally, at least in the near future.

Editor's note: This article was corrected at 1:30 p.m. ET on June 20, to reflect that redshift is evidence of cosmic expansion, but not evidence of accelerated cosmic expansion.

Read the original post:

The expansion of the universe could be a mirage, new theoretical ... - Livescience.com


Research Fellow (Superconducting Devices), Centre For Quantum … – Times Higher Education

About the Centre for Quantum Technologies

The Centre for Quantum Technologies (CQT) is a research centre of excellence in Singapore. It brings together physicists, computer scientists and engineers to do basic research on quantum physics and to build devices based on quantum phenomena. Experts in this new discipline of quantum technologies are applying their discoveries in computing, communications, and sensing.

CQT is hosted by the National University of Singapore and also has staff at Nanyang Technological University. With some 180 researchers and students, it offers a friendly and international work environment.

Learn more about CQT at www.quantumlah.org

Job Description

The open position is funded by Singapore's ambitious Quantum Engineering Programme and is affiliated with Yvonne Gao's lab (QCrew) and Steven Touzard's lab (Qove Laboratory). Both PIs hold the Presidential Young Professorship and their laboratories are funded under the National Research Fellowship, which offers long-term funding prospects.

We have an opening for a post-doctoral research fellow specialising in the design and fabrication of superconducting devices. The applicant will lead the development of new broadband Josephson parametric amplifiers and of their application to quantum measurements. The successful candidate will also work closely with the PI to develop future experimental goals and shape the general research direction of the group. There will be ample opportunities to explore personal ideas and participate in grant-writing processes. We aim to provide a challenging and supportive environment to nurture research leadership skills and readiness for a high-level career in academia or in the quantum industry.

Job Requirements

Applicants would need to have the required skills of:

More Information

For enquiries and details about the position, please contact Steven Touzard at steven.touzard@nus.edu.sg.

Please include your consent by filling in the NUS Personal Data Consent for Job Applicants.

Employment Type: Full-time

Applications can be submitted via the link below and should contain the latest CV and letters of recommendation (if any).

Location: Kent Ridge
Organization: NUS
Department: Centre For Quantum Technologies
Job requisition ID: 18795

Covid-19 Message

At NUS, the health and safety of our staff and students are among our utmost priorities, and COVID-19 vaccination supports our commitment to ensure the safety of our community and to make NUS as safe and welcoming as possible. Many of our roles require a significant amount of physical interaction with students, staff, and members of the public. Even for job roles that may be performed remotely, there will be instances where on-campus presence is required.

Taking into consideration the health and well-being of our staff and students, and to better protect everyone on campus, applicants are strongly encouraged to be fully vaccinated against COVID-19 to secure successful employment with NUS.

Read more from the original source:

Research Fellow (Superconducting Devices), Centre For Quantum ... - Times Higher Education

Read More..

Did physicists get the idea of "fundamental" wrong? – Big Think

If all you start with are the fundamental building blocks of nature (the elementary particles of the Standard Model and the forces exchanged between them), you can assemble everything in all of existence with nothing more than those raw ingredients. That's the most common approach to physics: the reductionist approach. Everything is simply the sum of its parts, and these simple building blocks, when combined together in the proper fashion, can come to build up absolutely everything that could ever exist within the Universe, with absolutely no exceptions.

In many ways, it's difficult to argue with this type of description of reality. Humans are made out of cells, which are composed of molecules, which themselves are made of atoms, which in turn are made of fundamental subatomic particles: electrons, quarks, and gluons. In fact, everything we can directly observe or measure within our reality is made out of the particles of the Standard Model, and the expectation is that someday, science will reveal the fundamental cause behind dark matter and dark energy as well, which thus far are only indirectly observed.

But this reductionist approach might not be the full story, as it omits two key aspects that govern our reality: boundary conditions and the top-down formation of structures. Both play an important role in our Universe, and might be essential to our notion of "fundamental" as well.

On the right, the gauge bosons, which mediate the three fundamental quantum forces of our Universe, are illustrated. There is only one photon to mediate the electromagnetic force, there are three bosons mediating the weak force, and eight mediating the strong force. This suggests that the Standard Model is a combination of three groups: U(1), SU(2), and SU(3), whose interactions and particles combine to make up everything known in existence.

This might come as a surprise to some people, and might sound like a heretical idea on its surface. Clearly, there's a difference between phenomena that are fundamental (like the motions and interactions of the indivisible, elementary quanta that compose our Universe) and phenomena that are emergent, arising solely from the interactions of large numbers of fundamental particles under a specific set of conditions.

Take a gas, for example. If you look at this gas from the perspective of fundamental particles, you'll find that every fundamental particle is bound up into an atom or molecule that can be described as having a certain position and momentum at every moment in time: well-defined to the limits set by quantum uncertainty. When you take together all the atoms and molecules that make up a gas, occupying a finite volume of space, you can derive all sorts of thermodynamic properties of that gas, including entropy, pressure, and temperature.

Entropy, pressure, and temperature are emergent quantities associated with the system as a whole, and they can be derived from the more fundamental properties inherent to the full suite of component particles that compose that physical system.
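To make that concrete, here is a minimal sketch (my illustration, not from the article; the choice of argon, the particle count, and the container volume are all assumptions made for the example) of how temperature and pressure emerge from nothing but the velocities of individual particles:

```python
# A minimal sketch: emergent thermodynamic quantities from particle velocities,
# assuming an ideal monatomic gas (argon) in thermal equilibrium.
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.63e-26         # mass of one argon atom, kg (illustrative choice)
N = 100_000          # number of simulated particles
V = 1e-3             # container volume, m^3
T_true = 300.0       # temperature used to draw the velocities, K

# Each velocity component of an equilibrated gas follows a Gaussian with
# variance k_B * T / m (the Maxwell-Boltzmann distribution).
rng = np.random.default_rng(0)
v = rng.normal(0.0, np.sqrt(k_B * T_true / m), size=(N, 3))

# Temperature emerges from the mean kinetic energy: <KE> = (3/2) k_B T.
mean_KE = 0.5 * m * (v**2).sum(axis=1).mean()
T_emergent = 2.0 * mean_KE / (3.0 * k_B)

# Pressure emerges via the ideal-gas relation P V = N k_B T.
P_emergent = N * k_B * T_emergent / V

print(f"T ~ {T_emergent:.1f} K, P ~ {P_emergent:.3e} Pa")
```

Neither the temperature nor the pressure belongs to any single particle; both exist only as averages over the whole collection.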

This simulation shows particles in a gas of a random initial speed/energy distribution colliding with one another, thermalizing, and approaching the Maxwell-Boltzmann distribution. The quantum analogue of this distribution, when it includes photons, leads to a blackbody spectrum for the radiation. Macroscopic properties like pressure, temperature, and entropy can all be derived from the collective behavior of the individual component particles within the system.

But not every one of our familiar, macroscopic laws can be derived from these fundamental particles and their interactions alone. For example, when we look at our modern understanding of electricity, we recognize that it's fundamentally composed of charged particles in motion through a conductor (such as a wire), where the flow of charge over time determines the quantity that we know of as electric current. Wherever you have a difference in electric potential, or a voltage, the magnitude of that voltage determines how fast the electric charge flows, with voltage being proportional to current.

On macroscopic scales, the relation that comes out of it is the famous Ohm's law: V = IR, where V is voltage, I is current, and R is resistance.

Only, if you try to derive this from fundamental principles, you can't. You can derive that voltage is proportional to current, but you cannot derive that the thing turning your proportionality into an equality is resistance. You can derive that there's a property of every material known as resistivity, and you can derive the geometrical relationship describing how the cross-sectional area and the length of your current-carrying wire affect the current that flows through it, but that still won't get you to V = IR.
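Here is how far that route does get you: a minimal sketch (my illustration with made-up geometry; only copper's textbook room-temperature resistivity is a real material value) of resistivity plus geometry turning into a resistance, and only then into V = IR:

```python
# A minimal sketch: a material property (resistivity) plus geometry gives a
# resistance for one particular wire; V = I * R then holds for that wire.
rho_copper = 1.68e-8   # resistivity of copper at room temperature, ohm*m
length = 2.0           # wire length, m (illustrative)
area = 1.0e-6          # cross-sectional area, m^2, i.e. 1 mm^2 (illustrative)

# Geometry converts the material's resistivity into this wire's resistance.
R = rho_copper * length / area   # ohms

# Only with R in hand does "voltage proportional to current" become V = I * R.
I = 1.5                          # current, amperes (illustrative)
V = I * R                        # volts
print(f"R = {R:.4f} ohm, V = {V:.5f} V at I = {I} A")
```

The catch the article points to is that nothing in the fundamental description guarantees this relation stays valid as conditions change, and in a superconductor it does not.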

At temperatures greater than the critical temperature of a superconductor, magnetic flux can freely pass through the conductor's atoms. But below the critical superconducting temperature, all of the magnetic flux gets expelled. This is the essence of the Meissner effect, which enables flux-pinning inside regions of a superconductor and the resultant application of magnetic levitation.

In fact, there's a good reason you can't derive V = IR from fundamental principles alone: it's neither a fundamental nor a universal relation. After all, there's a famous experimental set of conditions where this relationship breaks down: inside all superconductors.

In most materials, as they heat up, the resistance of the material to current flowing through it increases, which makes some intuitive sense. At higher temperatures, the particles inside a material zip around more quickly, which makes pushing charged particles (such as electrons) through it more difficult. Common materials such as nickel, copper, platinum, tungsten, and mercury all have their resistances rise as their temperatures increase, as it becomes more and more difficult at higher temperatures to achieve the same flow of current through a material.

On the flipside, however, cooling a material down often makes it easier for current to flow through it. These same materials, as the temperature lowers, exhibit less and less resistance to the flow of current. Only, there's a specific transition point: all of a sudden, once a specific temperature threshold (unique to each material) is crossed, the resistance suddenly drops to zero.
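A toy model makes the abruptness of that transition easy to picture. Everything below is illustrative rather than derived; the zero-resistance state is imposed by hand, precisely because (as the article argues) it cannot be obtained from the fundamental particles alone, and the 4.2 K threshold simply echoes the mercury result mentioned below:

```python
# A toy model: resistance versus temperature for a metal that superconducts
# below a critical temperature Tc. All parameter values are illustrative.
def resistance(T, Tc=4.2, R0=0.1, slope=4e-4):
    """Resistance in ohms: exactly zero below Tc, roughly linear above it."""
    if T < Tc:
        return 0.0                    # superconducting state
    return R0 + slope * (T - Tc)      # ordinary metallic behaviour

for T in (2.0, 4.0, 4.2, 10.0, 300.0):
    print(f"T = {T:6.1f} K -> R = {resistance(T):.4f} ohm")
```

No smooth extrapolation of the normal-state curve predicts the jump; it marks a genuinely distinct phase of the material.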

When cooled to low enough temperatures, certain materials will superconduct: the electrical resistance inside them will drop to zero. When exposed to a strong magnetic field, some superconductors will exhibit levitation effects, and with a properly configured external magnetic field, it's possible to pin the superconducting object in place in one or more dimensions, resulting in spectacular applications like quantum levitation.

It's specifically when this occurs that we declare a material has entered a superconducting state. First discovered all the way back in 1911, when mercury was cooled to below 4.2 K, superconductivity still remains only partially explained even today; it cannot be derived or fully explained by fundamental principles alone.

Instead, one needs to apply another set of rules atop the fundamental particles and their interactions: a set of rules known collectively as boundary conditions. Simply giving the information about what forces and particles are at play, even if you include all the information you could possibly know about the individual particles themselves, is insufficient to describe how the full system will behave. You also need to know, in addition to what's going on within a specific volume of space, what's happening at the boundary that encloses that space. Two very common types of boundary conditions are Dirichlet conditions, which fix the value of a quantity everywhere along the boundary, and Neumann conditions, which instead fix how that quantity changes across the boundary.
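To see how decisively the boundary dictates the answer, here is a minimal sketch (my illustration, not from the article) solving the same interior law, Laplace's equation, with two different sets of fixed (Dirichlet) end values:

```python
# A minimal sketch: the same interior physics, d^2u/dx^2 = 0 on [0, 1],
# produces entirely different solutions under different boundary conditions.
import numpy as np

def relax_dirichlet(u_left, u_right, n=101, iters=20_000):
    """Solve Laplace's equation in 1-D with fixed (Dirichlet) end values."""
    u = np.zeros(n)
    u[0], u[-1] = u_left, u_right
    for _ in range(iters):
        # Jacobi relaxation: each interior point becomes the average of
        # its two neighbours, the discrete statement of d^2u/dx^2 = 0.
        u[1:-1] = 0.5 * (u[:-2] + u[2:])
    return u

rod_hot_cold = relax_dirichlet(100.0, 0.0)   # e.g. ends held at 100 and 0
rod_uniform  = relax_dirichlet(50.0, 50.0)   # same rod, both ends at 50

# Quarter-point values: ~75.0 versus 50.0. Identical interior law,
# different boundaries, different solution everywhere inside.
print(rod_hot_cold[25], rod_uniform[25])
```

Nothing about the interior equation changed between the two runs; only the boundary did, and that alone fixed the entire solution.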

If you want to create a propagating electromagnetic wave down a wire where the electric and magnetic fields of that propagating wave are always both perpendicular to the wire and perpendicular to one another, you have to tweak the boundary conditions (e.g., set up a coaxial cable for the wave to travel through) in order to get the desired outcome.

This diagram shows a cutaway of the interior of a coaxial cable. With current flowing in one direction down the central, interior cable and the opposite direction down the outer cable, these boundary conditions enable the propagation of an internal transverse electric-and-magnetic mode in the space between the conductors. This configuration, known as TEM, can only arise due to the specific boundary conditions present in a coaxial cable-like system.

Boundary conditions are of tremendous importance under a wide variety of physical circumstances as well: for plasmas in the Sun, for particle jets around the active black holes at the centers of galaxies, and for the ways that protons and neutrons configure themselves within an atomic nucleus. They're required if we want to explain why external magnetic and electric fields split the energy levels in atoms. And they're absolutely going to come into play if you want to learn how the first strings of nucleic acids came to reproduce themselves, as the constraints and inputs from the surrounding environment must be key drivers of those processes.

One of the most striking places where this arises is on the largest cosmic scales of all, where for decades, a debate took place between two competing lines of thought as to how the Universe grew up and formed stars, galaxies, and the grandest cosmic structures of all.

This image shows the view of JWST's NIRCam instrument as it looked at galaxy cluster Abell 2744 and revealed a number of galaxies that are members of a proto-cluster. The red squares show several of the galaxies for which spectroscopic measurements were obtained; the orange circles are photometric galaxy candidates that may yet turn out to be part of this cluster. Small, low-mass galaxies form earlier; larger, evolved galaxies and galaxy clusters only appear at later times.

In a top-down Universe, the largest imperfections are on the largest scales; they begin gravitating first, and as they do, these large imperfections fragment into smaller ones. They'll give rise to stars and galaxies, sure, but they'll mostly be bound into larger, cluster-like structures, driven by the gravitational imperfections on large scales. Galaxies that are a part of groups and clusters would have largely been a part of their parent group or cluster since the very beginning, whereas isolated galaxies would only arise in sparser regions: in between the pancake-and-filament regions where structure was densest.

A bottom-up Universe is the opposite, where gravitational imperfections dominate on smaller scales. Star clusters form first, followed later by galaxies, and only thereafter do the galaxies collect together into clusters. The primary way that galaxies form would be as the first-forming star clusters gravitationally grow and accrete matter, drawing adjacent star clusters into them to form galaxies. The formation of larger-scale structure would only occur as small-scale imperfections experience runaway growth, eventually beginning to affect larger and larger cosmic scales.

If the Universe were purely built based on a top-down scenario of structure formation, we'd see large collections of matter fragment into smaller structures like galaxies. If it were purely bottom-up, it would begin by forming small structures whose mutual gravitation brings them together later. Instead, the actual Universe appears to be an amalgam of both, meaning that it's not described well by either scenario on its own.

In order to answer this question from an observational perspective, cosmologists began attempting to measure what we call cosmic power, which describes on what scale(s) the gravitational imperfections that seed the Universe's structure first appear. If the Universe is entirely top-down, all of the power would be clustered on large cosmic scales, and there would be no power on small cosmic scales. If the Universe is entirely bottom-up, all the cosmic power is clustered on the smallest of cosmic scales, with no power on large scales.


But if there's at least some power on all manner of cosmic scales, we'd instead need to characterize the Universe's power spectrum by what we call a spectral index: a parameter that tells us how tilted the Universe's power is, and whether it favors large scales, favors small scales, or is perfectly scale-invariant, with power distributed evenly across all scales.

If it were this final case, the Universe would've been born with power evenly distributed on all scales, and only gravitational dynamics would drive the structure formation of the Universe, producing the structures we wind up observing at late times.

The evolution of large-scale structure in the Universe, from an early, uniform state to the clustered Universe we know today. The type and abundance of dark matter would deliver a vastly different Universe if we altered what our Universe possesses. Note that in all cases, small-scale structure arises before structure on the largest scales comes about, and that even the most underdense regions of all still contain non-zero amounts of matter.

When we look back at the earliest galaxies we can see (a set of records that are now being newly set all the time with the advent of JWST), we overwhelmingly see a Universe dominated by smaller, lower-mass, and less evolved galaxies than we see today. The first groups and proto-clusters of galaxies, as well as the first large, evolved galaxies, don't seem to appear until hundreds of millions of years later. And the larger-scale cosmic structures, like massive clusters, galactic filaments, and the great cosmic web, seem to take billions of years to emerge within the Universe.

Does this mean that the Universe really is bottom-up, and that we don't need to examine the birth conditions for the larger scales in order to understand the types of structure that will eventually emerge?

No; that's not true at all. Remember that, regardless of what types of seeds of structure the Universe begins with, gravitation can only send and receive signals at the speed of light. This means that the smaller cosmic scales begin to experience gravitational collapse before the larger scales can even begin to affect one another. When we actually measure the power spectrum of the Universe and recover the scalar spectral index, we measure it to be equal to 0.965, with an uncertainty of less than 1%. That tells us that the Universe was born nearly scale-invariant, but with slightly more (by about 3%) large-scale power than small-scale power, meaning that it's actually a little bit more top-down than bottom-up.
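In the standard notation of cosmology (a textbook convention; the article quotes only the measured tilt), that statement is usually written as a power spectrum:

```latex
% Standard parametrization of the primordial power spectrum. The amplitude
% A_s and pivot scale k_0 are conventional bookkeeping, not values from the
% article; only the tilt n_s ~ 0.965 appears in the text above.
\[
  \mathcal{P}(k) \;=\; A_s \left( \frac{k}{k_0} \right)^{n_s - 1}
\]
% n_s = 1 would mean perfect scale invariance: equal power on every scale.
% The measured n_s < 1 tilts the spectrum toward small k (large scales),
% which is the mildly "top-down" preference described above.
```

Since large scales correspond to small wavenumbers k, an exponent of n_s - 1 ≈ -0.035 gives those large scales a slight excess of power, exactly the few-percent tilt quoted above.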

The large, medium, and small-scale fluctuations from the inflationary period of the early Universe determine the hot and cold (underdense and overdense) spots in the Big Bang's leftover glow. These fluctuations, which get stretched across the Universe during inflation, should be of a slightly different magnitude on small scales versus large ones: a prediction that was observationally borne out at approximately the ~3% level. By the time we observe the CMB, 380,000 years after the end of inflation, there's a spectrum of peaks-and-valleys in the temperature/scale distribution of fluctuations, owing to interactions between normal/dark matter and radiation.

In other words, if you want to explain all of the phenomena that we actually observe in the Universe, simply looking at the fundamental particles and the fundamental interactions between them will get you far, but won't cover it all. A great many phenomena in a great many environments require that we throw in the additional ingredients of conditions, both initial and at the boundaries of the physical system, on much larger scales than the ones where fundamental particles interact. Even with no novel laws or rules, simply starting from the smallest scales and building up from there won't encapsulate everything that's already known to occur.

This doesn't mean, of course, that the Universe is inherently non-reductionist, or that there are some important and fundamental laws of nature that only appear when you look at non-fundamental scales. Although many have made cases along those lines, those arguments are tantamount to "God of the gaps" arguments: no such rules have ever been found, and no emergent phenomenon has ever come to be only because some new rule or law of nature was found on a non-fundamental scale. Nevertheless, we must be cautious against adopting an overly restrictive view of what "fundamental" means. After all, the elementary particles and their interactions might be all that make up our Universe, but if we want to understand how they assemble and what types of phenomena will emerge from them, much more is absolutely necessary.

Original post:

Did physicists get the idea of "fundamental" wrong? - Big Think

Read More..