
ATLAS Collaboration Sheds Light on the Strongest Force in Nature – AZoQuantum

The force responsible for binding quarks together to form protons, neutrons, and atomic nuclei is known as the strong force, and it's aptly named due to its incredible strength.

The ATLAS experiment at CERN (Image Credit: CERN)

This force, carried by particles called gluons, is the strongest among all the fundamental forces of nature, which include electromagnetism, the weak force, and gravity.

Interestingly, it is also the least precisely measured of these four forces. However, in a recently submitted paper to Nature Physics, the ATLAS collaboration has detailed how they harnessed the power of the Z boson, an electrically neutral carrier of the weak force, to determine the strength of the strong force with an unprecedented level of precision, achieving an uncertainty below 1%.

This measurement is important because the strength of the strong force is described by a fundamental parameter in the Standard Model of particle physics known as the strong coupling constant. Although knowledge of this constant has improved over the years, its uncertainty is still much larger than the uncertainties on the corresponding constants for the other fundamental forces.

A more precise measurement is needed for accurate calculations in particle physics and to answer big questions like whether all fundamental forces were once the same or if there are new, unknown forces at play.

Studying the strong force is not only vital for understanding the fundamental aspects of nature but also for addressing significant unanswered questions. For instance, could all the fundamental forces have equal strength at extremely high energies, hinting at a potential common origin? Additionally, could there be new and unknown interactions modifying the behavior of the strong force in specific processes or at certain energy levels?

In their latest examination of the strong coupling constant, the ATLAS collaboration focused on Z bosons generated during proton-proton collisions at CERN's Large Hadron Collider (LHC), operating at a collision energy of 8 TeV.

The production of Z bosons typically occurs when two quarks within the colliding protons annihilate. In this process driven by weak interactions, the strong force becomes involved through the emission of gluons from the annihilating quarks.

This gluon radiation imparts a "kick" to the Z boson, perpendicular to the collision axis, known as transverse momentum. The strength of this kick is directly linked to the strong coupling constant. By precisely measuring the distribution of Z-boson transverse momenta and comparing it with equally precise theoretical calculations, the researchers were able to determine the strong coupling constant.

In the new analysis, the ATLAS team focused on cleanly selected Z-boson decays to two leptons (electrons or muons) and measured the Z-boson transverse momentum via its decay products.

A comparison of these measurements with theoretical predictions enabled the researchers to precisely determine the strong coupling constant at the Z-boson mass scale to be 0.1183 ± 0.0009. With a relative uncertainty of only 0.8%, the result is the most precise determination of the strength of the strong force made by a single experiment to date.
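
As a quick sanity check, the quoted relative uncertainty follows directly from the two numbers above. The short Python sketch below just restates that arithmetic; the values are the article's, the script is only illustrative.

```python
# Quick arithmetic check of the precision quoted above (values from the article).
alpha_s = 0.1183       # strong coupling constant at the Z-boson mass scale
uncertainty = 0.0009   # absolute uncertainty reported by ATLAS

relative = uncertainty / alpha_s
print(f"relative uncertainty ~ {relative:.2%}")  # ~0.76%, i.e. roughly the quoted 0.8%
```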

It agrees with the current world average of experimental determinations and with state-of-the-art lattice quantum chromodynamics calculations.

This record precision was accomplished thanks to both experimental and theoretical advances. On the experimental side, the ATLAS physicists achieved a detailed understanding of the detection efficiency and momentum calibration of the two electrons or muons originating from the Z-boson decay, which resulted in momentum precisions ranging from 0.1% to 1%.

On the theoretical side, the ATLAS researchers used, among other ingredients, cutting-edge calculations of the Z-boson production process that consider up to four loops in quantum chromodynamics. These loops represent the complexity of the calculation in terms of contributing processes. Adding more loops increases the precision.

The strength of the strong nuclear force is a key parameter of the Standard Model, yet it is only known with percent-level precision. For comparison, the electromagnetic force, which is 15 times weaker than the strong force at the energy probed by the LHC, is known with a precision better than one part in a billion.

Stefano Camarda, Physicist, CERN

Stefano Camarda concludes, "That we have now measured the strong force coupling strength at the 0.8% precision level is a spectacular achievement. It showcases the power of the LHC and the ATLAS experiment to push the precision frontier and enhance our understanding of nature."

Source: https://home.cern/


Automotive Battery Market is Likely to Garner Revenue of US$ 82.80 … – GlobeNewswire

Wilmington, Delaware, United States, Sept. 26, 2023 (GLOBE NEWSWIRE) -- The Automotive Battery market was estimated to have acquired US$ 45 billion in 2020. It is anticipated to register a 5.70% CAGR from 2021 to 2031, and by 2031 the market is likely to reach US$ 82.80 billion.

The automotive battery market is experiencing a dynamic transformation driven by several key factors. The foremost driver is the increasing demand for electric vehicles (EVs), fueled by environmental concerns and stringent emission regulations worldwide. Lithium-ion batteries have emerged as the dominant technology, offering higher energy density and longer ranges.

Innovations are shaping the market, with a notable focus on solid-state batteries, promising improved safety and energy efficiency. As EVs become more accessible and affordable, consumer awareness is growing, amplifying market opportunities.

Sustainability and circular economy initiatives are gaining traction, with a growing emphasis on battery recycling and second-life applications. The industry is witnessing significant investments in research and development to enhance battery performance and cost-effectiveness.

Collaborations between automakers and battery manufacturers are becoming more prevalent, accelerating technological advancements. The automotive battery market is on an exciting trajectory, with sustainability, innovation, and electrification as its guiding trends.

Request Sample Copy (To Understand the Complete Structure of this Report [Summary + TOC]) - https://www.transparencymarketresearch.com/sample/sample.php?flag=S&rep_id=1814

Get Customization on this Report for Specific Research Solutions: https://www.transparencymarketresearch.com/sample/sample.php?flag=CR&rep_id=1814

Market for Automotive Batteries: Regional Outlook

The global market for automotive batteries is not only characterized by its rapid growth but also by distinct regional variations that shape its trajectory. Here's a snapshot of the regional outlook in this dynamic market:

Global Automotive Battery Market: Key Players

The competitive landscape in the automotive battery market is fierce, with major players like Panasonic, LG Chem, and CATL dominating, while newcomers and startups focus on niche innovations and regional markets. The following companies are well-known participants in the global Automotive Battery market:

Key developments in the global Automotive Battery market are:

Quick Buy This Premium Report: https://www.transparencymarketresearch.com/checkout.php?rep_id=1814&ltype=S

Global Automotive Battery Market Segmentation: Type, Propulsion, Vehicle Type, Sales Channel, Region

About Transparency Market Research

Transparency Market Research, a global market research company registered at Wilmington, Delaware, United States, provides custom research and consulting services. The firm scrutinizes factors shaping the dynamics of demand in various markets. The insights and perspectives on the markets evaluate opportunities in various segments. The opportunities in the segments based on source, application, demographics, sales channel, and end-use are analysed, which will determine growth in the markets over the next decade.

Our exclusive blend of quantitative forecasting and trends analysis provides forward-looking insights for thousands of decision-makers, made possible by experienced teams of Analysts, Researchers, and Consultants. The proprietary data sources and various tools & techniques we use always reflect the latest trends and information. With a broad research and analysis capability, Transparency Market Research employs rigorous primary and secondary research techniques in all of its business reports.

Contact:

Nikhil Sawlani
Transparency Market Research Inc.
CORPORATE HEADQUARTER DOWNTOWN, 1000 N. West Street, Suite 1200, Wilmington, Delaware 19801 USA
Tel: +1-518-618-1030
USA - Canada Toll Free: 866-552-3453
Website: https://www.transparencymarketresearch.com
Blog: https://tmrblog.com
Email: sales@transparencymarketresearch.com


Theoretical study shows that Kerr black holes could amplify new physics – Phys.org

by Ingrid Fadelli, Phys.org

Black holes are regions in space characterized by extremely strong gravity, which prevents all matter and electromagnetic waves from escaping them. These fascinating cosmic bodies have been the focus of countless research studies, yet their intricate physical nuances are yet to be fully uncovered.

Researchers at the University of California, Santa Barbara, the University of Warsaw and the University of Cambridge recently carried out a theoretical study focusing on a class of black holes known as extremal Kerr black holes, which are uncharged stationary black holes with a coinciding inner and outer horizon. Their paper, published in Physical Review Letters, shows that these black holes' unique characteristics could make them ideal "amplifiers" of new, unknown physics.

"This research has its origin in a previous project started during my visit to UC Santa Barbara," Maciej Kolanowski, one of the researchers who carried out the study, told Phys.org. "I started discussing very cold (so called, extremal) black holes with Gary Horowitz (UCSB) and Jorge Santos (at Cambridge). Soon we realized that in fact, generic extremal black holes look very different than it was previously believed."

In their previous paper, Kolanowski, Horowitz and Santos showed that in the presence of a cosmological constant extremal black holes are affected by infinite tidal forces. This means that if living beings were to fall into the black hole, they would be crushed by gravity before they moved even remotely close to the black hole's center. Yet the team showed that if the cosmological constant is zero, as it is assumed to be in many astrophysical scenarios, this effect vanishes.

"The spark for the current paper arose at UC Santa Barbara's weekly Gravity Lunch," Grant Remmen explained. "Chatting with Horowitz after a talk on his work on black hole horizon singularities, I asked whether other effects could give rise to such phenomena. My previous work on effective field theories (EFTs), particularly development of physics models with quantum corrections, gave me an idea. Talking with Horowitz, I wondered whether the higher-derivative terms in a gravitational EFT (i.e., quantum corrections to the Einstein equations) could themselves lead to singularities on the horizons of extreme black holes."

After Remmen shared his idea with Horowitz, they started a collaboration with Kolanowski and Santos aimed at testing it via a series of calculations. In their calculations, the researchers considered Einstein gravity coupled to its leading quantum corrections.

"The Einstein equations are linear in the Riemann tensor, a mathematical object describing the curvature of spacetime," Remmen explained. "In three space dimensions, the leading corrections to Einstein are terms that are cubic (third power) and quartic (fourth power) in the curvature. Because curvature is a measure of derivatives of the spacetime geometry, such terms are called 'higher-derivative terms.' We calculated the effect of these higher-derivative terms on rapidly spinning black holes."

Extremal black holes rotate at a maximum possible rate corresponding to the horizon moving at the speed of light. The researchers' calculations showed that the higher-derivative EFT corrections of extremal black holes make their horizons singular, with infinite tidal forces. This is in stark contrast with typical black holes, which have finite tidal forces that only become infinite at the center of the black hole.

"Surprisingly, EFT corrections make the singularity jump all the way from the center of the black hole out to the horizon, where you wouldn't expect it to be," Remmen said. "The value of the coefficient in front of a given EFT termthe 'dial settings' in the laws of physicsare dictated by the couplings and types of particle that are present at high energies and short distances. In this sense, EFT coefficients are sensitive to new physics."

Kolanowski, Horowitz, Remmen and Santos also found that the strength of the divergence in tides at the horizon of extremal black holes, and the possible occurrence of tidal singularity, heavily depends on the EFT coefficients. The results of their calculations thus suggest that the spacetime geometry near the horizon of these black holes is sensitive to new physics at higher energies.

"Interestingly, this unexpected singularity is present for the values of these EFT coefficients generated by the Standard Model of particle physics," Remmen said.

"Our results are surprising, since they imply that the low-energy description of physics can break down in a situation where you wouldn't expect that to happen. In physics, there's usually a sense of 'decoupling' between different distance scales. For example, you don't need to know the details of water molecules to describe waves using hydrodynamics. Yet for rapidly spinning black holes, that's precisely what happens: the low-energy EFT breaks down at the horizon."

Overall, the calculations carried out by this team of researchers hint at the promise of extremal Kerr black holes for probing new physical phenomena. While the horizon of these black holes can be very large, it was not expected to have an infinitely large curvature (i.e., infinite tidal forces) in the EFT. Their results show that it does.

"In future work, we are interested in exploring whether the singularities can be resolved by ultraviolet physics," Remmen added. "A pressing question is whether the sensitivity of the horizon to new physics persists all the way to the Planck scale, or whether the horizon 'smooths out' at the short-distance scale associated with the EFT. We are also looking for other situations in which short distance effects might show up unexpected at large distances."

More information: Gary T. Horowitz et al, Extremal Kerr Black Holes as Amplifiers of New Physics, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.131.091402

Journal information: Physical Review Letters



Researchers Studying the Quantum Realm Observe Alice in … – The Debrief

A team of researchers studying the quantum realm say they have observed an otherworldly mirror universe through the eye of a decaying monopole that is eerily reminiscent of the mirror universe written about by author Lewis Carroll in his Alice's Adventures in Wonderland.

Dubbed an "Alice ring" in honor of Carroll's mirror universe, these fleeting quantum-world events may help to unravel the mysteries of the quantum realm.

In quantum physics, monopoles are the proposed counterpart to dipoles, which have a positive and negative charge at opposing ends, just like a conventional magnet. In contrast, the monopole is only negatively or positively charged.

For decades, scientists have theorized how an actual magnetic monopole might decay, with the most common theory being that it would create a brief, fleeting ring-like structure that might open the door to an alternate mirror universe. As noted, the mirror universe revealed by these decaying rings reminded theorists of the mirror universe in Lewis Carroll's Alice's Adventures in Wonderland, where everything is the opposite of the real world.

Such theoretical Alice rings have remained particularly elusive for decades. But now, a team of researchers who have been studying the phenomenon for years say they have spotted these structures in nature for the first time ever. And as they suspected, Alice rings may indeed be a portal to what they describe as an otherworldly mirror universe.

The hunt for a real-world Alice ring involved a years-long collaboration between Professor Mikko Möttönen of Aalto University and Professor David Hall from Amherst College. In fact, their first discovery on the road to Carroll's mirror universe took place in 2014, when the duo successfully proved the existence of an analog of a quantum monopole. In 2015, they actually isolated a quantum monopole, and then in 2017 they observed one decaying into the other. Still, it wasn't until their latest research that they witnessed the appearance of the doorway to the mirror universe known as the elusive Alice ring.

"This was the first time our collaboration was able to create Alice rings in nature, which was a monumental achievement," Möttönen said.

According to the press release announcing this once-in-a-career feat, the research team, which was aided by Ph.D. candidate Alina Blinova, manipulated a gas of rubidium atoms prepared in a nonmagnetic state near absolute zero temperature. Then, operating under these extreme conditions, the researchers were able to create a monopole by steering a zero point of a three-dimensional magnetic field into the quantum gas. As previously theorized, the result was a perfectly formed Alice ring.

Notably, the researchers point out that Alice rings only last for a few milliseconds, as they are extremely fragile. This means that when a magnetic monopole is exposed to the slightest external force, it immediately decays into an Alice ring.

"Think of the monopole as an egg teetering at the top of a hill," Möttönen said. "The slightest perturbations can send it crashing down. In the same way, monopoles are subject to noise that triggers their decay into Alice rings."

Perhaps even more astonishing, and as the longtime collaborators had hoped, their Alice ring seemed to offer a glimpse into a mirror universe just like Carroll's.

"From a distance, the Alice ring just looks like a monopole, but the world takes a different shape when peering through the centre of the ring," Hall said.

"It is from this perspective that everything seems to be mirrored, as if the ring were a gateway into a world of antimatter instead of matter," Möttönen added.

Published in the journal Nature Communications, the researchers say that the verified observation of an Alice ring in the real world could one day lead to a better understanding of quantum physics. However, there is still no indication whether or not it will lead to attending a tea party with a mad hatter.

Christopher Plain is a Science Fiction and Fantasy novelist and Head Science Writer at The Debrief. Follow and connect with him on X, learn about his books at plainfiction.com, or email him directly at christopher@thedebrief.org.


Teleportation fidelity the big winner in the quantum lottery – ANU College of Science

Running your quantum system as a lottery turns out to be a way to improve the transmission of data via quantum teleportation.

Researchers at the Research School of Physics used a probabilistic twist to develop a new transmission protocol that set a new record in data transmission: 92 percent fidelity, which more than halves the loss in previous experiments.
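
Reading the numbers in that sentence: 92 percent fidelity corresponds to 8 percent "loss" (infidelity), so "more than halves the loss" implies earlier experiments lost over 16 percent, i.e. fidelities below roughly 84 percent. The sketch below only restates that arithmetic; the previous-record figure is an inference from the wording, not a number quoted by the researchers.

```python
# Inference from the article's wording, not a quoted figure.
new_fidelity = 0.92
new_loss = 1 - new_fidelity            # 0.08
implied_previous_loss = 2 * new_loss   # "more than halves the loss" => previous loss > 0.16
print(f"implied previous fidelity: below {1 - implied_previous_loss:.0%}")  # below 84%
```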

The new protocol will enable encrypted data, for example in finance or military settings, to be sent with higher accuracy.

"Our protocol improves the capability of the quantum teleporter to protect fragile quantum states during long-distance transmission, making the system resilient to noise and loss," said lead researcher Dr Sophie Jie Zhao, from the Department of Quantum Sciences and the CQC2T ARC Centre of Excellence, who is the lead author of the team's publication in Nature Communications.

Quantum teleportation is already being used in encrypted networks. It allows information to be shared instantly between linked, or entangled, quantum objects.

However, the entanglement between the objects can easily be destroyed by interactions with external entities. This at once makes quantum teleportation extremely secure (any tampering instantly destroys the data transfer) but also very prone to degradation through noise from environmental interactions.

With entanglement degradation limiting their existing teleportation's fidelity and distance, the team set their minds to improving the teleportation efficacy by leveraging the paradoxes of the Heisenberg Uncertainty Principle.

In these experiments, the ends of the teleportation link are two photons from the same source, which creates entanglement in their properties. These photons are sent to two separate locations, untouched, which leaves their properties unknown, and able to appear in any possible state.

The signaller then gets the information to be teleported to interact with one of the photons, and measures the photon's properties (in this case amplitude and phase), making the photon choose a state. This causes the other photon (the receiver) to instantly choose its state as well. Because the two photons are linked, information about the signaller's experiment can be deduced by the receiver.

This deduction relies on the sender separately conveying to the receiver the result of the experiment. This does not reveal the teleported information, as it is the result of the mashup between that information and the original photon. However, this result acts as a key that allows the receiver to work backwards from the result at their end and disentangle the teleported information.

"It is crucial that the sender can't know what the teleported information is; that would constitute a measurement and collapse the quantum information," said University of Queensland researcher and CQC2T member Professor Tim Ralph.

"The information needs to be hidden in uncertainty so the sender doesn't know exactly what they are sending. The more they know about the signal, the more they destroy it," he said.

Quantum uncertainty resulting from the mixing of possible states can be cancelled out with the key; however, uncertainty resulting from noise due to entanglement degradation is harder to cancel out.

To filter this noise, the team leveraged the fact that the mixed states have a Gaussian distribution. They realised that a lottery, a protocol in which a subset of the measurements is selected randomly in a way that narrows the Gaussian distribution while the other measurements are discarded, could help filter out noise.

"Adding an element of chance to our protocol has the effect of distilling the quantum information," Dr Zhao said.

The post-selection effectively biases the Gaussian distribution in favour of high-amplitude outcomes over outcomes close to the origin of phase space, hence acting as an amplifier. Since this amplification is noiseless and takes over part of the amplification applied by the receiver in standard teleportation protocols, the teleported states suffer less from the noise added due to imperfect entanglement.
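
A toy numerical sketch of that idea is shown below. It is not the team's protocol, just an illustration of how randomly accepting outcomes with a bias toward high amplitudes shifts a Gaussian distribution upward at the cost of a very low success rate; all numbers are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

signal, noise = 1.0, 0.5                        # arbitrary amplitude and Gaussian spread
samples = rng.normal(signal, noise, 1_000_000)  # simulated measurement outcomes

# Toy lottery: acceptance probability grows exponentially with amplitude, so only a
# tiny, high-amplitude subset of outcomes is kept.
k = 4.0
accept_prob = np.exp(k * (samples - samples.max()))
keep = rng.random(samples.size) < accept_prob

print(f"acceptance rate: {keep.mean():.1e}")
print(f"mean amplitude before post-selection: {samples.mean():.2f}")
print(f"mean amplitude after post-selection:  {samples[keep].mean():.2f}")
```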

An interesting quirk of the system is that the balance between the probabilistic factor and the noise reduction can be tuned. By simply reducing the probability of measurements being selected in the protocol the teleportation fidelity can be increased.

To achieve their record 92 percent fidelity the team used a success rate of less than one in a hundred thousand, sampling the system for around two hours.

"In the new protocol, the success of the teleportation relies on the stability of the laser system, instead of being limited by environmental noise," Dr Zhao said.

"You can always get better fidelity if you are willing to sacrifice your success rate. But then you need a longer sampling time."

"If the system were stable enough to allow us to sample for, say, 20 hours, then I believe we could go above 95 percent," she said.

This article was first published by ANU Research School of Physics.


Zentropy: A New Theory That Could Transform Material Science – SciTechDaily

A snapshot of the ab initio molecule dynamics simulations at 753 degrees Kelvin, showing the polarized titanium oxide bonding with local tetragonal structures in various orientations, which depict the local 90 and 180 degree domain walls. Credit: Courtesy Zi-Kui Liu

The universe naturally gravitates towards disorder, and only through the input of energy can we combat this inevitable chaos. This idea is encapsulated in the concept of entropy, evident in everyday phenomena like ice melting, fires burning, and water boiling. However, zentropy theory introduces an additional layer to this understanding.

This theory was developed by a team led by Zi-Kui Liu, the distinguished Dorothy Pate Enright Professor of Materials Science and Engineering at Penn State. The Z in zentropy is derived from the German term Zustandssumme, which translates to the "sum over states" of entropy.

Alternatively, Liu said, zentropy may be considered a play on the term "zen" from Buddhism and "entropy," used to gain insight into the nature of a system. The idea, Liu said, is to consider how entropy can occur over multiple scales within a system to help predict potential outcomes of the system when influenced by its surroundings.

Liu and his research team have published their latest paper on the concept, providing evidence that the approach may offer a way to predict the outcome of experiments and enable more efficient discovery and design of new ferroelectric materials. The work, which incorporates some intuition and a lot of physics to provide a parameter-free pathway to predicting how advanced materials behave, was published in Scripta Materialia.

Ferroelectrics have unique properties, making them valuable for a variety of applications both now and in developing materials, researchers said. One such property is spontaneous electric polarization that can be reversed by applying an electric field, which facilitates technologies ranging from ultrasounds to ink-jet printers to energy-efficient RAM for computers to the ferroelectric-driven gyroscope in smartphones that enable smooth videos and sharp photos.

To develop these technologies, researchers need to experiment to understand the behavior of such polarization and its reversal. For efficiency's sake, the researchers usually design their experiments based on predicted outcomes. Typically, such predictions require adjustments called fitting parameters to closely match real-world variables, which take time and energy to determine. But zentropy can integrate top-down statistical mechanics and bottom-up quantum mechanics to predict experimental measures of the system without such adjustments.

"Of course, at the end of the day, the experiments are the ultimate test, but we found that zentropy can provide a quantitative prediction that can narrow down the possibilities significantly," Liu said. "You can design better experiments to explore ferroelectric material and the research work can move much faster, and this means you save time, energy, and money and are more efficient."

While Liu and his team have successfully applied zentropy theory to predict the magnetic properties of a range of materials for various phenomena, discovering how to apply it to ferroelectric materials has been tricky. In the current study, the researchers reported finding a method to apply zentropy theory to ferroelectrics, focusing on lead titanate. Like all ferroelectrics, lead titanate possesses electric polarization that can be reversed when external electric fields, temperature changes, or mechanical stress is applied.

As an electric field reverses the electric polarization, the system transitions from ordered in one direction to disordered and then to ordered again as it settles into the new direction. However, this ferroelectricity occurs only below a critical temperature unique to each ferroelectric material. Above this temperature, ferroelectricity (the ability to reverse polarization) disappears and paraelectricity (the ability to become polarized) emerges. The change is called the phase transition. The measurement of those temperatures can indicate critical information about the outcome of various experiments, Liu said. However, predicting the phase transition prior to an experiment is nearly impossible.

"No theory and method can accurately predict the free energy of the ferroelectric materials and the phase transitions prior to the experiments," Liu said. "The best prediction of transition temperature is more than 100 degrees away from the experiment's actual temperature."

This discrepancy arises due to the unknown uncertainties in models, as well as fitting parameters that could not consider all salient information affecting the actual measurements. For example, an often-used theory characterizes macroscopic features of ferroelectricity and paraelectricity but does not consider microscopic features such as dynamic domain walls, the boundaries between regions with distinct polarization characteristics within the material. These configurations are building blocks of the system and fluctuate significantly with respect to temperature and electric field.

In ferroelectrics, the configuration of electric dipoles in the material can change the direction of polarization. The researchers applied zentropy to predict the phase transitions in lead titanate, including identifying three types of possible configurations in the material.

The predictions made by the researchers were effective and in agreement with observations made during experiments reported in the scientific literature, according to Liu. They used publicly available data on domain wall energies to predict a transition temperature of 776 degrees Kelvin, showing a remarkable agreement with the observed experimental transition temperature of 763 degrees Kelvin. Liu said the team is working on further reducing the difference between predicted and observed temperatures with better predictions of domain wall energies as a function of temperature.
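
Putting the quoted numbers side by side makes the improvement concrete; the short script below only restates the arithmetic using the figures in the paragraph above.

```python
predicted_K, observed_K = 776, 763   # zentropy prediction vs. measured transition temperature
diff = predicted_K - observed_K      # 13 K
print(f"deviation: {diff} K, or {diff / observed_K:.1%} of the observed value")
print(f"earlier best predictions: >100 K off, i.e. more than {100 / observed_K:.0%}")
```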

This ability to predict transition temperature so closely to the actual measurements can provide valuable insights into the physics of ferroelectric material and help scientists to better their experimental designs, Liu said.

"This basically means you can have some intuitions and a predictive approach on how a material behaves both microscopically and macroscopically before you conduct the experiments," Liu said. "We can start predicting the outcome accurately before the experiment."

Along with Liu, other researchers in the study from Penn State include Shun-Li Shang, research professor of materials science and engineering; Yi Wang, research professor of materials science and engineering; and Jinglian Du, research fellow in materials science and engineering at the time of the study.

Reference: "Parameter-free prediction of phase transition in PbTiO3 through combination of quantum mechanics and statistical mechanics" by Zi-Kui Liu, Shun-Li Shang, Jinglian Du and Yi Wang, 20 April 2023, Scripta Materialia. DOI: 10.1016/j.scriptamat.2023.115480

The Department of Energy's Basic Energy Sciences program supported this research.


Venice 2023: ‘The Theory of Everything’ is a Confusing Multiverse Tale – First Showing

by Alex Billington, September 24, 2023

The multiverse subgenre of cinema is growing. There are a handful of new films every year exploring this uncharted new territory, experimenting with big ideas and mind-bending storytelling. Not every story will work, though; not every equation will produce a correct answer. Even though this film has quite a few issues with it, I still can't stop thinking about it weeks after first seeing it at the 2023 Venice Film Festival. The Theory of Everything (originally Die Theorie von Allem in German) is a German-Austrian-Swiss co-production from a German filmmaker named Timm Kröger (also the director of The Council of Birds). Not to be confused with the Oscar-winning biopic (from 2014) about Stephen Hawking also called The Theory of Everything, this German The Theory of Everything is a unique multiverse tale. It's one of the first clever attempts at mixing film noir with multiverse theory, integrating quantum mechanics thinking into a shady character's mystery plot. Most of it is rather confounding and strange, and the film doesn't quite come together as coherently as it should, but it's still worth mentioning as another experiment in this intellectual subgenre.

Cinema is going through a multiverse renaissance right now - between Everything Everywhere All at Once rightfully winning Best Picture, the groundbreaking Spider-Man: Into the Spider-Verse / Across the Spider-Verse / Beyond the Spider-Verse movies, along with Marvel dipping their toes in with Doctor Strange in the Multiverse of Madness & the Loki series, and DC trying it out with The Flash (which was a big failure). Of course there have been multiverse movies before this current era (Jet Li's The One, The Butterfly Effect, Run Lola Run, Source Code, Donnie Darko, etc) but right now we're in a vibrant Golden Age of multiverse movies, which bothers some (because they think it's related to dumb comic book movies) and excites many others (who realize it's really about quantum mechanics / string theory / physics / philosophy, etc), myself included. This is where Kröger's The Theory of Everything fits right in. It doesn't take long to understand what he's trying to do: make a film noir multiverse movie meets romantic B&W mystery set in the Swiss mountains. It's a cool idea for a film and Kröger throws in some scientific aspects to make it more grounded, not so fantastical or comic booky, closer to "this could've actually happened and we'd never really know."

In The Theory of Everything, German actor Jan Bülow stars as Johannes Leinert, a young scientist who travels with his doctoral advisor to a physics congress in the Swiss Alps, where an Iranian scientist is set to reveal a "groundbreaking theory of quantum mechanics." Most of the film is set in 1962 and it's shot in lush B&W cinematography by DP Roland Stuprich. Most of the film is also set in the mountains at this remote lodge. When everyone arrives at the hotel, the Iranian guest is nowhere to be found. As we follow Johannes around while everyone else goes skiing, everything starts to get strange: he meets a peculiar jazz pianist woman named Karin, played by Olivia Ross, who seems to know secret details about him. One morning one of the physicists is found dead, and others start disappearing without a trace. As Johannes descends deeper into this mountainous mystery, he finds himself literally descending deeper into the mountain, discovering something incredible within. Kröger uses the film noir storytelling to turn this multiverse story into a scientific one, asking: if multiverse theory is real, could this be a thrilling example of what might happen to one person wrapped up in all this? As with most noir stories, there is no clear answer.

Aside from using the same title as the Stephen Hawking film, which doesn't really work well (what even is this theory, really?), the film has a number of other glaring flaws. The score by Diego Ramos Rodriguez & David Schweighart is obnoxiously loud and distracting, a highly melodic, symphonic sound that just doesn't fit with the mysterious vibe (usually I like these kinds of scores, but not for this film). Worst of all, the film's narrative is especially confounding and indecipherable in the second half. There are some magical scenes that wowed me, but everything else will make everyone watching wonder "huh? what is going on?" Everyone I talked to after the screening in Venice couldn't make sense of it either. I'm sure Kröger knows what he's doing and has all the different narratives laid out in his mind, but this is a case where that just doesn't translate and come across in the film. If most viewers can't make sense of it on their first viewing, it's a bad experience. Even if one day someone does explain everything and provide a guide as to who is from which multiverse and what happens to them, it still won't magically make the film any better. That said, I admire his attempt to tell this kind of complex, intertwined story of multiverses & scientists. I just wish it was better.

Despite my frustrations and everyone's confusion, I'm still thinking about this film and still thinking about how it tries to mix noir with quantum thinking. The Theory of Everything may not instantly join the ranks as one of the best modern multiverse movies, but it also doesn't deserve to be forgotten entirely. This is even a part of the plot, with a line about how everyone shrugs it off as "just a strange story" of something that happened to Johannes. Maybe it really did happen? Would you believe it if someone told a story like this?

Alex's Venice 2023 Rating: 6 out of 10
Follow Alex on Twitter - @firstshowing / Or Letterboxd - @firstshowing



Light and sound waves reveal negative pressure – Science Daily

Negative pressure is a rare and challenging-to-detect phenomenon in physics. Using liquid-filled optical fibers and sound waves, researchers at the Max Planck Institute for the Science of Light (MPL) in Erlangen have now discovered a new method to measure it. In collaboration with the Leibniz Institute of Photonic Technologies in Jena (IPHT), the scientists in the Quantum Optoacoustics research group, led by Birgit Stiller, can gain important insights into thermodynamic states.

As a physical quantity, pressure is encountered in various fields: atmospheric pressure in meteorology, blood pressure in medicine, or even in everyday life with pressure cookers and vacuum-sealed foods. Pressure is defined as a force per unit area acting perpendicular to a surface of a solid, liquid, or gas. Depending on the direction in which the force acts within a closed system, very high pressure can lead to explosive reactions in extreme cases, while very low pressure in a closed system can cause the implosion of the system itself. Overpressure always means that the gas or liquid pushes against the walls of its container from the inside, like a balloon expanding when more air is added. Regardless of whether it's high or low pressure, the numerical value of pressure is always positive under normal circumstances.

However, liquids exhibit a peculiar characteristic. They can exist in a specific metastable state corresponding to a negative pressure value. In this metastable state, even a tiny external influence can cause the system to collapse into one state or another. One can imagine it as sitting at the top of a roller coaster: the slightest touch on one side or the other sends you hurtling down the tracks. In their current research, the scientists are examining the metastable state of liquids with negative pressure. To achieve this, the research team combined two unique techniques in a study published in Nature Physics to measure various thermodynamic states. Initially, tiny amounts -- nanoliters -- of a liquid were encapsulated in a fully closed optical fiber, allowing both highly positive and negative pressures. Subsequently, the specific interaction of optical and acoustic waves in the liquid enabled the sensitive measurement of the influence of pressure and temperature in different states of the liquid. Sound waves act as sensors for examining negative pressure values, exploring this unique state of matter with high precision and detailed spatial resolution.

The influence of negative pressure on a liquid can be envisioned as follows: According to the laws of thermodynamics, the volume of the liquid will decrease, but the liquid is retained in the glass fiber capillary by adhesive forces, much like a water droplet sticking to a finger. This results in a "stretching" of the liquid. It is pulled apart and behaves like a rubber band being stretched. Measuring this exotic state typically requires complex equipment with heightened safety precautions. High pressures can be hazardous endeavors, particularly with toxic liquids. Carbon disulfide, used by the researchers in this study, falls into this category. Due to this complication, previous measurement setups for generating and determining negative pressures have required significant laboratory space and even posed a disturbance to the system in the metastable state. With the method presented here, the researchers have instead developed a tiny, simple setup in which they can make very precise pressure measurements using light and sound waves. The fiber used for this purpose is only as thick as a human hair.

"Some phenomena which are difficult to explore with ordinary and established methods can become unexpectedly accessible when new measurement methods are combined with novel platforms. I find that exciting," says Dr. Birgit Stiller, head of the Quantum Optoacoustics research group at MPL. The sound waves used by the group can detect temperature, pressure, and strain changes very sensitively along an optical fiber. Furthermore, spatially resolved measurements are possible, meaning that the sound waves can provide an image of the situation inside the optical fiber at centimeter-scale resolution along its length. "Our method allows us to gain a deeper understanding of the thermodynamic dependencies in this unique fiber-based system," says Alexandra Popp, one of the two lead authors of the article. The other lead author, Andreas Geilen, adds: "The measurements revealed some surprising effects. The observation of the negative pressure regime becomes abundantly clear when looking at the frequency of the sound waves."

The combination of optoacoustic measurements with tightly sealed capillary fibers enables new discoveries regarding the monitoring of chemical reactions in toxic liquids within otherwise difficult-to-investigate materials and microreactors. It can penetrate new, hard-to-access areas of thermodynamics. "This new platform of fully sealed liquid core fibers provides access to high pressures and other thermodynamic regimes," says Prof. Markus Schmidt from IPHT in Jena, and Dr. Mario Chemnitz, also from IPHT in Jena, emphasizes: "It is of great interest to investigate and even tailor further nonlinear optical phenomena in this type of fiber." These phenomena can unlock previously unexplored and potentially new properties in the unique thermodynamic state of materials. Birgit Stiller concludes: "The collaboration between our research groups in Erlangen and Jena, with their respective expertise, is unique in gaining new insights into thermodynamic processes and regimes on a tiny and easy-to-handle optical platform."


The ‘Green Cloud’: Four strategies for a sustainable and responsible … – Open Access Government

Green Cloud refers to a sustainable way of cloud computing. It reduces energy demand and saves money while keeping an eye on environmental issues at the same time. Moving traditional IT infrastructure to the cloud is beneficial for the environment in several ways; primarily, it reduces the number of physical servers and increases the average utilisation of available computing units. If cloud providers do it right, a measurable impact on a company's CO2 footprint can be achieved.

Recently, Green Cloud has become a buzzword as more companies consider the CO2 emissions and the overall carbon footprint of their new cloud service providers' facilities. Accordingly, sustainability and responsibility are becoming the main points of differentiation in the marketplace for global hyperscalers like AWS, Google Cloud or Microsoft Azure and European cloud companies like OVH.

Firstserv Ltd is putting all its efforts into ensuring its services are as environmentally friendly as possible. The climate crisis and rising energy costs demand future-proofing of the support given to their customers. Improving efficiency is a major step towards a sustainable cloud, particularly regarding physical data centres. Sebastian Tyc, CEO of Firstserv Ltd, outlines the four most effective strategies for creating a sustainable and responsible future of greener cloud services.

The operation of a data centre requires loads of energy. While most of this energy is needed to power the servers, a large part also goes into cooling them to protect the equipment. If data centre locations are picked strategically, their power demand can be substantially reduced. For example, data centres in cool regions such as Scandinavia or underground facilities need much less cooling than in desert or subtropical areas like the Southern US.

The main concept behind cloud computing is that services are shared over a network, optimising the effectiveness of resources. For example, a cloud facility that serves Sydney users during Sydney business hours with a specific service (e.g., a web server) could relocate the same resources to serve European users during European business hours with a different application.

As such, cloud services operate more efficiently than on-premises data centres. It is precisely because of the efficient utilisation of IT resources that cloud computing positively impacts the environment. As data-intensive technologies such as Artificial Intelligence (AI) and distributed manufacturing systems surge, cloud computing centres must remain energy efficient.

In this regard, modern data centres increasingly use advanced technologies to eliminate wastage at every level of their operations. For example, most of today's data centres use machine learning to automatically optimise the cooling of their environments. Besides machine learning, data centres also deploy smart temperature, lighting, and cooling controls to minimise energy use in their environments.

Firstserv Ltd data centres employ renewable energy sources such as geothermal, solar, wind or water-cooling technology. They introduce liquid cooling for processors to minimise their overall carbon footprint.

Even though cost savings and increased efficiency in business operations are the top drivers of virtualisation, they are not the only benefits. Cloud computing also uses virtualisation to contribute positively to environmental sustainability.

Virtualisation allows an organisation to create several virtual machines (VMs) and run multiple applications on the same physical server via a hypervisor. As such, high-carbon physical machines get replaced with their virtual equivalents.

For example, an organisation could use a single VM rather than a resource-heavy physical server to stream videos. This could help the company to minimise power consumption and its overall carbon footprint. Shifting an on-premises IT infrastructure to the cloud means you use fewer servers, which draw less power overall, potentially having a lower impact on the environment.
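
As a rough illustration of why consolidation saves energy, consider the back-of-the-envelope sketch below; every figure in it is an assumption chosen for the arithmetic, not a measurement from Firstserv Ltd or any other provider.

```python
# Hypothetical consolidation scenario (all wattages are assumptions).
physical_servers, watts_per_physical = 10, 250   # underutilised on-premises servers
virtual_hosts, watts_per_host = 2, 400           # fewer, busier virtualisation hosts

before = physical_servers * watts_per_physical   # 2500 W
after = virtual_hosts * watts_per_host           # 800 W
print(f"estimated power saving: {1 - after / before:.0%}")  # ~68% in this made-up example
```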

To reduce the overall need for energy in data centres, cloud providers strive to use optimised and modern hardware and software infrastructure. This is not limited to changing old light bulbs to energy-saving lights! Data centres employ energy-saving strategies such as dynamic voltage and frequency scaling (DVFS) or shifting to modern data storage devices. Solid state drives (SSDs) need less power, offer faster access to data, and last longer than the legacy technology they replace, HDDs. Using optimised hardware, data centres become more efficient and minimise energy demand.
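
The benefit of DVFS comes from the standard dynamic-power relation P ≈ C·V²·f: lowering voltage and frequency together reduces power superlinearly. The sketch below uses illustrative 20 percent reductions, not vendor figures.

```python
# Dynamic power scaling under DVFS: P_dynamic ~ C * V^2 * f (normalised units).
C, V, f = 1.0, 1.0, 1.0

p_nominal = C * V**2 * f
p_scaled = C * (0.8 * V)**2 * (0.8 * f)          # drop voltage and frequency by 20%
print(f"dynamic power vs. nominal: {p_scaled / p_nominal:.0%}")  # ~51%
```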

Firstserv Ltd uses multiple strategies to optimise IT workflows at every level. This might include shifting workloads to different times, modifying applications to reduce network traffic, optimising storage and server caches, automating routine tasks or taking other steps to reduce energy usage.

It is also important to ensure that your infrastructure is suitable for hosting your application environment. Firstserv Ltd offers a wide range of options: Hosted Private Cloud, Public Cloud, and a variety of Bare Metal servers. With several Bare Metal options and models available, Firstserv Ltd partners can precisely adjust their ratios (RAM per core ratio, storage per RAM or core, etc.) and ensure they use the best virtual machine for every workload.



Do SSD failures follow the bathtub curve? Ask Backblaze – The Register

Cloud-based storage and backup provider Backblaze has published the latest report on usage data gathered from its solid state drives (SSDs), asking if they show the same failure pattern as hard drives.

Backblaze uses SSDs as boot drives in the server infrastructure for its Cloud Storage platform, while high-capacity rotating drives are typically used for storing and serving up data.

However, they do more than just boot the storage servers; they also hold log files and temporary files produced by each server. The volume of data a boot drive will read, write, and delete thus depends on the activity of the storage server itself.

The company previously reported that its SSDs appeared to be at least as reliable as hard drives, but warned this could change as it has not collected SSD data for as long as hard drives and the accumulation of more data could alter the statistics.

Backblaze says it has added 238 SSDs to its infrastructure since the last SSD report, ending in Q4 2022. These comprised 110 Crucial drives (model: CT250MX500SSD1), 62 WDC drives (WD Blue SA510 2.5) and 44 Seagate drives (ZA250NM1000).

Looking at the Q1 2023 and Q2 2023 figures, Backblaze notes that some drives appear to have exceptionally high annualized failure rates, with the Seagate model SSDSCKKB240GZR listed with an annualized failure rate (AFR) of over 800 percent, for example.

This is a fluke of the statistics because of the low number of drives; in Q1 there were just two of this model, one of which failed shortly after being installed. During Q2, the remaining drive did not fail and thus the AFR for that period was zero.

These figures illustrate why Backblaze considers at least 100 instances of a specific drive model and 10,000 drive days of operation in a specific quarter as a minimum before the calculated AFR can be considered reasonable, according to Backblaze storage cloud evangelist Andy Klein.

Looking at the AFR over time, Backblaze reports that the AFR across its SSDs was 0.96 percent during Q1 of 2023 and 1.05 percent during Q2. This failure rate is thus up from the previous quarter, but down slightly from the same quarter a year ago. In fact, a chart of the AFR per quarter over the past three years shows that it has fluctuated between 0.36 percent and 1.72 percent, with no apparent underlying pattern.

However, Backblaze says that the quarterly data is still vital as it can reveal issues such as one particular drive model that was the primary cause of a jump in AFR from 0.58 percent in Q1 2021 to 1.51 percent in Q2 then 1.72 percent in Q3.

"It happens from time to time that a given drive model is not compatible with our environment, and we will moderate or even remove that drive's effect on the system as a whole," Klein said.

Backblaze earlier this year calculated the average age at which failure occurred for its entire collection of hard drives, and has repeated the calculation for SSDs in this latest report.

This involved collecting the SMART data for the 63 failed SSD drives the company has had to date, which is not a great dataset size for statistical analysis, as Klein admitted. The resulting figure calculated from the data is 14 months, compared with two years and seven months across all hard drives.

But Backblaze cautions this figure is likely to be unrepresentative, as the average age of the entire fleet of SSDs it has in operation is just 25 months.

Looking at three drive models for which the company has a reasonable amount of data, Klein found that the average age of the failed drives increases as the average age of drives in operation increases, and it is therefore reasonable to expect that the average age for an SSD failure will increase with time.

Turning to the lifetime annualized failure rate for all of its SSDs, Backblaze reports a figure of 0.9 percent, covering a period from Q4 2018 through to the end of Q2 2023. This figure is up slightly from the 0.89 percent it found at the end of Q4 2022, but down from the same quarter a year ago, when the figure was 1.08 percent.

However, this includes those drives which have high apparent failure rates because there is just not enough data to make the calculation reliable.

If the calculation is limited to just those drive models for which there are 100 units in operation and over 10,000 drive days, and also with a confidence interval of 1 percent or lower between the low and the high values, then it cuts the data down to just three drives and an AFR of just 0.6 percent.

Meanwhile, Backblaze has also produced a graph of SSD failures over time to see how well the data matches the classic bathtub curve used in reliability engineering, as the comparable graph for its hard drives does.

According to Klein, while the actual curve (blue line) showing the SSD failures over each quarter is a bit "lumpy," the trend line (red) does have "a definite bathtub curve look to it."

The trend line is about a 70 percent match to the actual data, so Backblaze says it cannot be totally confident at this point, but for the limited amount of data available, it would appear that the occurrences of SSD failures are on a path to conform to the tried-and-true bathtub curve.

As ever, Backblaze makes the raw data used in its report available on a Drive Stats Data page for anyone to download and analyze as long as you cite Backblaze as the source if you use the data, and don't sell it, of course.
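
For anyone downloading that dataset, a minimal sketch of recomputing a quarterly AFR table with the article's sanity cut might look like the following. The column names follow the published Drive Stats schema (date, serial_number, model, failure), the directory path is a placeholder, and this is an assumption-laden sketch rather than Backblaze's own tooling.

```python
import glob
import pandas as pd

# One CSV per day, one row per drive. The path is a placeholder for wherever the
# downloaded quarter's files live.
daily = pd.concat(pd.read_csv(path) for path in sorted(glob.glob("drive_stats_q2_2023/*.csv")))

per_model = daily.groupby("model").agg(
    drive_days=("failure", "size"),            # each row is one drive day
    failures=("failure", "sum"),               # failure == 1 on the day a drive fails
    drive_count=("serial_number", "nunique"),
)
per_model["afr_percent"] = per_model["failures"] / (per_model["drive_days"] / 365) * 100

# The article's cut: at least 100 drives of a model and 10,000 drive days in the quarter.
reliable = per_model[(per_model["drive_count"] >= 100) & (per_model["drive_days"] >= 10_000)]
print(reliable.sort_values("afr_percent"))
```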
