
Ethereum recovering, Pushd and Polkadot vie for top crypto position – crypto.news

Disclosure: This article does not represent investment advice. The content and materials featured on this page are for educational purposes only.

The crypto market constantly evolves, with new and innovative projects emerging daily. Some popular platforms include Polkadot (DOT) and Ethereum (ETH), each with unique strategies to redefine the market.

Among them is Pushd (PUSHD), a crypto project currently conducting its presale.

Ethereum has a robust ecosystem for creating and implementing decentralized apps (dapps) and smart contracts.

As a result, its decentralized finance (defi) has a total value locked (TVL) of over $30.7 billion.

Kiln, a staking platform on Ethereum, recently announced a successful $17 million funding round to democratize value creation.

This development has improved Ethereum's outlook, a net positive as the market recovers.

Ethereum is trading at $2,208, up 43% in the past year, and may recover after the recent retracement.

Polkadot promotes blockchain interoperability.

It seeks to provide a secure and efficient platform for different blockchains to exchange data.

However, DOT, its native currency, has been under pressure, fluctuating between $6.04 and $7.46.

As per the Directional Movement Index (DMI) indicator, DOT remains under pressure at spot rates.

The eCommerce market may reach $8 trillion by 2024.

Pushd, a new marketplace, has already shown strong demand before its official launch.

Its unique selling point is that it is built on a web3 platform. It offers a decentralized online marketplace that allows for peer-to-peer (P2P) commerce without intermediaries and excessive fees.

Market analysts predict PUSHD will rise above $3, buoyed by improving buyer sentiment.

There are over 22,000 sign-ups in the ongoing presale as PUSHD changes hands at $0.075 in the current stage.

The crypto market is fluctuating, with ETH likely to recover. Pushd and Polkadot will likely emerge stronger in the upcoming bull run. For this reason, more investors appear to be exploring Pushd, whose presale is ongoing.

Disclosure: This content is provided by a third party. crypto.news does not endorse any product mentioned on this page. Users must do their own research before taking any actions related to the company.


Shadows and Light: Discovering the Hidden Depths of Quantum Materials – SciTechDaily

Researchers have developed an advanced optical technique to uncover hidden properties of the quantum material Ta2NiSe5 (TNS) using light. By employing terahertz time-domain spectroscopy, the team observed anomalous terahertz light amplification, indicating the presence of an exciton condensate. This discovery opens up new possibilities for using quantum materials in entangled light sources and other applications in quantum physics. Credit: SciTechDaily.com

Scientists used a laser-based technique to reveal hidden quantum properties of the material Ta2NiSe5, potentially advancing the development of quantum light sources.

Certain materials have desirable properties that are hidden, and just as you would use a flashlight to see in the dark, scientists can use light to uncover these properties.

Researchers at the University of California San Diego have used an advanced optical technique to learn more about a quantum material called Ta2NiSe5 (TNS). Their work was published in the journal Nature Materials.

Materials can be perturbed through different external stimuli, often with changes in temperature or pressure; however, because light is the fastest thing in the universe, materials will respond very quickly to optical stimuli, revealing properties that would otherwise remain hidden.

Using an improved technique that gave access to a broader range of frequencies, the team was able to uncover some of the hidden properties of the TNS exciton condensate. Credit: Sheikh Rubaiat Ul Haque / Stanford University

"In essence, we shine a laser on a material and it's like stop-action photography where we can incrementally follow a certain property of that material," said Professor of Physics Richard Averitt, who led the research and is one of the paper's authors. "By looking at how constituent particles move around in that system, we can tease out these properties that are really tricky to find otherwise."

The experiment was conducted by lead author Sheikh Rubaiat Ul Haque, who graduated from UC San Diego in 2023 and is now a postdoctoral scholar at Stanford University. He, along with Yuan Zhang, another graduate student in Averitt's lab, improved upon a technique called terahertz time-domain spectroscopy. This technique allows scientists to measure a material's properties over a range of frequencies, and Haque's improvements allowed them access to a broader range of frequencies.

The work was based on a theory created by another of the paper's authors, Eugene Demler, a professor at ETH Zürich. Demler and his graduate student Marios Michael developed the idea that when certain quantum materials are excited by light, they may turn into a medium that amplifies terahertz frequency light. This led Haque and colleagues to look closely into the optical properties of TNS.

When an electron is excited to a higher level by a photon, it leaves behind a hole. If the electron and hole are bound, an exciton is created. Excitons may also form a condensate, a state that occurs when particles come together and behave as a single entity.

Using Haque's technique, backed by Demler's theory and density functional calculations by Angel Rubio's group at the Max Planck Institute for the Structure and Dynamics of Matter, the team was able to observe anomalous terahertz light amplification, which uncovered some of the hidden properties of the TNS exciton condensate.

Condensates are a well-defined quantum state and using this spectroscopic technique could allow some of their quantum properties to be imprinted onto light. This may have implications in the emerging field of entangled light sources (where multiple light sources have interconnected properties) utilizing quantum materials.

"I think it's a wide-open area," stated Haque. "Demler's theory can be applied to a suite of other materials with nonlinear optical properties. With this technique, we can discover new light-induced phenomena that haven't been explored before."

Reference: "Terahertz parametric amplification as a reporter of exciton condensate dynamics" by Sheikh Rubaiat Ul Haque, Marios H. Michael, Junbo Zhu, Yuan Zhang, Lukas Windgätter, Simone Latini, Joshua P. Wakefield, Gu-Feng Zhang, Jingdi Zhang, Angel Rubio, Joseph G. Checkelsky, Eugene Demler and Richard D. Averitt, 3 January 2024, Nature Materials. DOI: 10.1038/s41563-023-01755-2

Funding provided by the DARPA DRINQS Program (D18AC00014), the Swiss National Science Foundation (200021_212899), Army Research Office (W911NF-21-1-0184), the European Research Council (ERC-2015-AdG694097), the Cluster of Excellence Advanced Imaging of Matter (AIM), Grupos Consolidados (IT1249-19), Deutsche Forschungsgemeinschaft (170620586), and the Flatiron Institute.


The Odds Of A Quantum Tunneling Event Are One In A Hundred Billion – IFLScience

The rate at which the rare but crucial quantum phenomenon known as tunneling occurs has been measured experimentally for the first time, and found to match theoretical calculations. The theoretical estimates in this area had been regarded as highly uncertain, so confirmation in one specific case allows for greater confidence in estimating the frequency of other tunneling events.

Quantum tunneling is one of the many phenomena where subatomic particles behave in ways classical physics says are impossible. In this case, an object trapped in a way that classically requires a certain energy to escape leaves the trap, despite having less than that amount of energy. It's a consequence of, and proof of, the dual wave/particle nature of objects like electrons: a pure particle could not escape, but a wave occasionally can. Phenomena like alpha decay of atomic nuclei depend on quantum tunneling to occur.

Tunneling is essential to quantum physics, and calculations based on simple examples are set in undergraduate courses. Real-world examples are considerably more complex, however; knowing tunneling will occasionally occur in a specific situation, and knowing how often, are very different things. In a new paper, a team at the Universität Innsbruck provides the first measured rate for the reaction between a hydrogen molecule and a deuterium anion, finding it to be the slowest reaction involving charged particles ever observed.

Although there is no solid wall keeping deuterium anions and hydrogen molecules apart, physicists imagine the energy barrier as a physical wall, which quantum tunneling occasionally allows protons to penetrate.

Image credit: Universität Innsbruck/Harald Ritsch

The reaction (H2 + D⁻ → H⁻ + HD) involves a shift between a molecule of two hydrogen atoms (protons without neutrons) and an atom consisting of a proton and a neutron orbited by two electrons. After tunneling occurs, one of the components of the molecule has a neutron, while the unattached atom, still negatively charged, is neutron-less. Although it looks like a neutron has been transferred, the reaction is considered to represent proton exchange.

Since hydrogen still makes up most of the universe, events like this that require no heavier elements happen very frequently on a cosmic scale, despite the odds in any specific encounter between hydrogen and deuterium being low. Moreover, if we are to have any hope of modeling more complex tunneling events we need to anchor our estimates with measures of simpler examples like this.

The Innsbruck team tested the rate of occurrence experimentally by filling a trap with a mix of deuterium ions cooled to 10 K (-263°C/-441°F) (warmed by collisions to 15 K) and hydrogen gas. At these temperatures transfer is classically impossible, but the presence of negatively charged hydrogen ions after 15 minutes indicated it had happened, albeit not often.

The rate is measured in cubic centimeters per second, giving a value of 5.2 × 10⁻²⁰ cubic centimeters per second, with a margin of error of around a third, which is unlikely to mean much to anyone other than a quantum physicist.

It translates, however, to transfer occurring once in every hundred billion times a deuterium anion collides with a hydrogen molecule. This might seem too rare to worry about, but even a small patch of gas contains many billions of molecules. Add enough deuterium and the number of collisions becomes immense.

"Measuring the rate requires an experiment that allows very precise measurements and can still be described quantum-mechanically," senior author Professor Roland Wester said in a statement. The idea for the experiment came to Wester 15 years ago, but the tunneling is so rare it took considerable effort to construct an experiment where it could be measured.

The study is published in Nature.

An earlier version of this article was published inMarch 2023.


The Science Behind the Swirling Patterns in Your Morning Coffee – AZoQuantum

Add a dash of creamer and your morning coffee will swirl with clouds of white liquid. After a few seconds, though, those swirls fade, leaving a uniform brown liquid in the mug.

In the current study, Nandkishore and his colleagues used mathematical tools to envision a checkerboard pattern of theoretical qubits. The team discovered that if they arranged these zeros and ones in the right way, the patterns could flow around the checkerboard but might never disappear entirely. Image Credit: Stephen, Hart & Nandkishore

Information can quickly become jumbled in quantum computer chips, which are devices that tap into the strange features of the universe at its smallest scales. This limits the memory capacity of these devices.

That does not have to be the case, notes Rahul Nandkishore, Associate Professor of Physics at CU Boulder.

Using mathematical techniques, he and his colleagues have made a significant breakthrough in theoretical physics by demonstrating that it is possible to construct a situation in which milk and coffee do not mix, regardless of how vigorously they are stirred.

The team's research could result in improved quantum computer chips and give engineers new avenues to store data in minuscule items.

Think of the initial swirling patterns that appear when you add cream to your morning coffee; imagine if these patterns continued to swirl and dance no matter how long you watched.

Rahul Nandkishore, Senior Author and Associate Professor, Department of Physics, University of Colorado Boulder

To confirm that these infinite swirls are indeed feasible, more laboratory tests are required. However, the team's findings represent a significant advancement for physicists working on the project known as "ergodicity breaking," which aims to produce materials that stay out of equilibrium for extended periods of time.

The group's results were published in the journal "Physical Review Letters."

The study tackles a universal issue in quantum computing, one that drives co-authors David Stephen and Oliver Hart, postdoctoral physics researchers at CU Boulder.

Typically, "bits," which are represented by zeros or ones, power computers. Contrarily, Nandkishore clarified, quantum computers use "qubits," which are entities that can exist as either zero or one at the same moment due to the peculiarities of quantum mechanics.

Qubits have been created by engineers using a variety of materials, such as single atoms trapped by lasers or small components known as superconductors.

But qubits are readily confused, much like that cup of coffee. For instance, if every qubit is flipped to one, the qubits will ultimately flip back and forth until the chip as a whole becomes a disorganized mess.

In their recent research, Nandkishore and his colleagues may have identified a method to overcome the usual tendency of qubits to mix. The group conducted calculations suggesting that if scientists organize qubits into specific patterns, these configurations would preserve their information even when subjected to disturbances such as a magnetic field.

This finding raises the possibility of constructing devices with a form of quantum memory, according to the physicist.

This could be a way of storing information; you would write information into these patterns, and the information could not be degraded.

Rahul Nandkishore, Senior Author and Associate Professor, Department of Physics, University of Colorado Boulder

In the study, the researchers used mathematical modeling techniques to examine an array of hundreds to thousands of qubits arranged in a checkerboard-like pattern.

They found that packing the qubits into a small space was the key. According to Nandkishore, qubits can affect the actions of their neighbors if they are near enough to one another. It is similar to a throng of people attempting to cram themselves into a phone booth. Even if some of those individuals are either standing straight or on their heads, they are unable to turn around without shoving into other people.

According to their calculations, if these patterns were formed precisely, they might flow around a quantum computing chip and never break down, much like the clouds of cream that swirl indefinitely throughout the coffee.

Nandkishore said, "The wonderful thing about this study is that we discovered that we could understand this fundamental phenomenon through what is almost simple geometry."

The team's findings could influence a lot more than just quantum computers.

His recent discoveries, however, add to a growing body of evidence that certain arrangements of matter can resist thermal equilibrium, thereby defying some of the universe's most inflexible laws.

According to Nandkishore, nearly everything in the universe, from massive seas to coffee cups, tends to gravitate toward a state known as "thermal equilibrium." When you place an ice cube inside the mug, for instance, the heat from the coffee will cause the ice to melt and finally turn into a liquid that is all the same temperature.

We are not going to have to redo our math for ice and water. The field of mathematics that we call statistical physics is incredibly successful in describing things we encounter in everyday life. But there are settings where maybe it does not apply.

Rahul Nandkishore, Senior Author and Associate Professor, Department of Physics, University of Colorado Boulder

Stephen, D. T., et al., (2024). Ergodicity Breaking Provably Robust to Arbitrary Perturbations. Physical Review Letters. doi.org/10.1103/physrevlett.132.040401

Source: https://www.colorado.edu/


New research sheds light on a phenomenon known as ‘false vacuum decay’ – Phys.org


An experiment conducted in Italy, with theory support from Newcastle University, has produced the first experimental evidence of vacuum decay.

In quantum field theory, when a not-so-stable state transforms into the true stable state, it's called "false vacuum decay." This happens through the creation of small localized bubbles. While existing theoretical work can predict how often this bubble formation occurs, there hasn't been much experimental evidence.

The ultracold atoms lab of the Pitaevskii Center for Bose-Einstein Condensation in Trento reports for the first time the observation of phenomena related to the stability of our universe. The results arise from a collaboration among the University of Newcastle, the National Institute of Optics of CNR, the Physics Department of the University of Trento, and TIFPA-INFN, and have been published in Nature Physics.

The results are supported by both theoretical simulations and numerical models, confirming the quantum field origin of the decay and its thermal activation, opening the way to the emulation of out-of-equilibrium quantum field phenomena in atomic systems.

The experiment uses a supercooled gas at a temperature less than a microkelvin above absolute zero. At this temperature, bubbles are seen to emerge as the vacuum decays, and Newcastle University's Professor Ian Moss and Dr. Tom Billam were able to show conclusively that these bubbles are a result of thermally activated vacuum decay.

Ian Moss, Professor of Theoretical Cosmology at Newcastle University's School of Mathematics, Statistics and Physics, said, "Vacuum decay is thought to play a central role in the creation of space, time and matter in the Big Bang, but until now there has been no experimental test. In particle physics, vacuum decay of the Higgs boson would alter the laws of physics, producing what has been described as the 'ultimate ecological catastrophe.'"

Dr. Tom Billam, Senior Lecturer in Applied Math/Quantum, added, "Using the power of ultracold atom experiments to simulate analogs of quantum physics in other systemsin this case the early universe itselfis a very exciting area of research at the moment."

The research opens up new avenues in the understanding of the early universe, as well as ferromagnetic quantum phase transitions.

This groundbreaking experiment is only the first step in exploring vacuum decay. The ultimate goal is to find vacuum decay at the temperature of absolute zero, where the process is driven purely by quantum vacuum fluctuations. An experiment in Cambridge, supported by Newcastle as part of a national collaboration, QSimFP, aims to do just this.

More information: A. Zenesini et al, False vacuum decay via bubble formation in ferromagnetic superfluids, Nature Physics (2024). DOI: 10.1038/s41567-023-02345-4

Journal information: Nature Physics


What is Quantum Key Distribution? | by Sai Nitesh | Jan, 2024 – Medium

Quantum Key Distribution (QKD) is a method of secure communication that uses principles of quantum mechanics to enable two parties to produce a shared random secret key. This key can then be used to encrypt and decrypt messages, providing a secure means of communication.

The fundamental idea behind QKD is based on the principles of quantum superposition and entanglement. In traditional cryptographic systems, the security of the communication relies on mathematical algorithms, whereas QKD leverages the unique properties of quantum particles to achieve its security.

Here's a basic overview of how Quantum Key Distribution (QKD) works; a toy code sketch follows the steps below:

1. Quantum Superposition:

In quantum mechanics, particles like photons can exist in multiple states at once (superposition). In the context of QKD, a sender (Alice) can encode information in the quantum states of particles (e.g., photons) and send them to the receiver (Bob).

2. Quantum Entanglement:

Entanglement is a quantum phenomenon where particles become correlated in such a way that the state of one particle is directly related to the state of another, regardless of the distance between them. This property is used in some QKD protocols to ensure the security of the key exchange.

3. Measurement:

When Bob receives the quantum states from Alice, he performs measurements on these particles. The act of measurement, according to quantum mechanics, changes the state of the particles. Bob communicates the outcomes of his measurements to Alice over a classical communication channel.

4. Key Generation:

Alice and Bob compare a subset of their measurement outcomes to check for discrepancies. If an eavesdropper (an unauthorized third party) tries to intercept or measure quantum states, the act of measurement will disturb the quantum states, and Alice and Bob will notice inconsistencies.

5. Error Correction and Privacy Amplification:

If they detect any discrepancies, Alice and Bob can discard those bits and perform error correction to generate a final secret key. Additionally, privacy amplification techniques are used to enhance the security of the key.
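To make the flow above concrete, here is a toy, purely classical simulation of a BB84-style prepare-and-measure exchange (it covers steps 1 and 3-5; the entanglement-based variant mentioned in step 2 is not modeled). This is a sketch for illustration only, not a secure or physical implementation, and the run size and the fraction of bits sacrificed for error checking are arbitrary choices.

```python
# Toy BB84-style sketch: Alice encodes random bits in random bases, Bob measures in
# random bases, and they keep only positions where the bases matched (the sifted key).
import random

N = 32  # number of photons sent in this toy run

# Step 1: Alice picks random bits and random bases ('+' rectilinear, 'x' diagonal).
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice('+x') for _ in range(N)]

# Step 3: Bob picks his own random bases and "measures". If his basis matches Alice's,
# he recovers her bit; otherwise the outcome is random, mimicking quantum measurement.
bob_bases = [random.choice('+x') for _ in range(N)]
bob_bits  = [a if ab == bb else random.randint(0, 1)
             for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Step 4: over the classical channel they compare bases (never the bits themselves)
# and keep only the positions where the bases agree.
sifted_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [bit for bit, ab, bb in zip(bob_bits,   alice_bases, bob_bases) if ab == bb]

# Step 5: they reveal and discard a sample of sifted bits to estimate the error rate;
# with no eavesdropper (and no noise) the sample agrees and the rest is the shared key.
sample = set(range(0, len(sifted_alice), 4))          # reveal every 4th sifted bit
errors = sum(sifted_alice[i] != sifted_bob[i] for i in sample)
key = [bit for i, bit in enumerate(sifted_alice) if i not in sample]

print(f"sifted {len(sifted_alice)}/{N} bits, {errors} errors in sample, key length {len(key)}")
```

In a real deployment, an eavesdropper measuring the photons in transit would disturb them and show up as a nonzero error rate in the sampled bits, which is exactly the check described in the key generation step above.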

One of the key advantages of QKD is its ability to provide information-theoretic security, meaning that the security is based on the fundamental laws of physics and not on computational assumptions. However, it's important to note that while QKD offers a highly secure method for key distribution, it doesn't address all aspects of secure communication, and additional classical cryptographic protocols are often used in conjunction with QKD to achieve comprehensive security.

References: The information is summarised from multiple online resources.

Previous articles:

Why Quantum Computing matters and what do you need to know?

What is Quantum Computing?

Unleashing the Power: When Quantum Computing and AI Join Forces to Shape Tomorrow's World

Navigating the Quantum Revolution: A Closer Look at Cryptography's Future


Coffee, creamer, and the Quantum Realm – Earth.com

In the same way that cream blends into coffee, transforming it from a whirl of white to a uniform brown, quantum computer chips face a challenge.

These devices operate on the minuscule scale of the universe's fundamental particles, where data can quickly become chaotic, limiting memory efficiency.

However, new research spearheaded by Rahul Nandkishore, an associate professor of physics at the University of Colorado Boulder, suggests a groundbreaking approach that could revolutionize data retention in quantum computing.

Nandkishore and his team, through mathematical modeling, propose a scenario akin to cream and coffee that never mix, regardless of how much they are stirred.

This concept, if realized, could lead to significant advancements in quantum computer chips, providing engineers with novel methods for storing data in extremely small scales.

Nandkishore, the senior author of the study, illustrates his idea using the familiar sight of cream swirling in coffee, imagining these patterns remaining dynamic indefinitely.

"Think of the initial swirling patterns that appear when you add cream to your morning coffee," said Nandkishore. "Imagine if these patterns continued to swirl and dance no matter how long you watched."

This concept is central to the study, which involved David Stephen and Oliver Hart, postdoctoral researchers in physics at CU Boulder.

Quantum computers differ fundamentally from classical computers. While the latter operate on bits (zeros or ones), quantum computers use qubits, which can exist as zero, one, or both simultaneously.

Despite their potential, qubits can easily become disordered, leading to a loss of coherent data, much like the inevitable blending of cream into coffee.

Nandkishore and his team's solution lies in arranging qubits in specific patterns that maintain their information even under disturbances, like magnetic fields.

"This could be a way of storing information," he said. "You would write information into these patterns, and the information couldn't be degraded."

This arrangement could allow for the creation of devices with quantum memory, where data, once written into these patterns, remains uncorrupted.

The researchers employed mathematical models to envision an array of hundreds to thousands of qubits in a checkerboard pattern.

They discovered that tightly packing qubits influences their neighboring qubits' behavior, akin to a crowded phone booth where movement is severely limited.

This specific arrangement might enable the patterns to flow around a quantum chip without degrading, much like the enduring swirls of cream in a cup of coffee.

Nandkishore notes that this studys implications extend beyond quantum computing.

"The wonderful thing about this study is that we discovered that we could understand this fundamental phenomenon through what is almost simple geometry," Nandkishore said.

It challenges the common understanding that everything in the universe, from coffee to oceans, moves toward thermal equilibrium, where differences in temperature eventually even out, like ice melting in a warm drink.

His findings suggest that certain matter organizations might resist this equilibrium, potentially defying some long-standing physical laws.

While further experimentation is necessary to validate these theoretical swirls, the study represents a significant stride in the quest to create materials that stay out of equilibrium for extended periods.

This pursuit, known as ergodicity breaking, could redefine our understanding of statistical physics and its application to everyday phenomena.

As Nandkishore puts it, while we won't need to rewrite the math for ice and water, there are scenarios where traditional statistical physics might not apply, opening new frontiers in quantum computing and beyond.

The full study was published in the journal Physical Review Letters.



What if there were more than three dimensions? – Varsity

String theory works only in a 10-dimensional model. MEDHA SURAJPAL/THE VARSITY

What string theory says about existence in other dimensions

You've probably wondered at some point what life would be like in different dimensions. I certainly have.

Living in the first dimension would consist of us existing as points on a line. This dimension can be visualized as an infinite line or equated to the x-axis on a graph. We would have no concept of size, and any object would always appear as a point with no depth or breadth.

If we add another axis to this dimension, namely the y-axis, that would establish flat land or two-dimensional (2D) space. Here, we could exist as shapes and perceive other shapes around us. Notably, as flatlanders, we would view other objects from the side as lines with no depth existing on the same flat plane.

In actuality, we exist in the third dimension (3D). As 3D creatures, we can view the 2D plane from the top and see what a rectangle and other 2D objects fully look like. However, despite living in a 3D world, our eyes process visual information in 2D, which our brain supports with depth cues so that we can still perceive the three-dimensional things around us.

In this pattern, we could keep going to a higher dimension, where a creature in each dimension would process the world in a dimension lower than the one in which it exists and then use depth cues to perceive its own dimension.

In practice, sadly, physicists haven't really found much evidence of spatial dimensions beyond the usual three, plus a fourth dimension of time. However, there is one prevalent theory that has come closest to being a viable candidate for the Theory of Everything (a grand theory that aims to unify the fundamental forces of physics), and it posits that the universe has nine spatial dimensions and one dimension of time. This theory is called string theory.

String theory

String theory arose as an attempt to unify quantum mechanics and general relativity. The former studies and describes the movement and interactions of subatomic particles, while the latter is Albert Einstein's theory of how gravity affects space-time, the fabric of the universe that distorts under the movement of massive objects.

Together, these theories describe the four fundamental forces that govern interactions in the universe: the strong force, the weak force, the electromagnetic force, and gravity. The first three are described by quantum mechanics, and the last one is described by general relativity.

However, applying Einstein's idea of general relativity to quantum systems just yields nonsensical mathematical solutions. Since larger objects are made up of subatomic particles, using completely disconnected systems to describe the behaviour of subatomic particles and the behaviour of larger objects seems illogical. A Theory of Everything aims to bridge this gap.

In 1984, two physicists, John Schwarz and Michael Green, suggested the beginnings of string theory, which would ease the mathematical antagonism between general relativity and quantum mechanics.

Traditionally, physicists have seen nature's fundamental particles as the neutrons, protons, and electrons that make up atoms, which in turn make up everything. Neutrons and protons can be broken down even further into quarks.

String theory further breaks down these fundamental particles and posits that all particles are made of minuscule strings of energy vibrating at different frequencies. The theory unifies the fundamental forces by having all particles be made of the same underlying basic component of strings.

The three fundamental forces governed by quantum mechanics can be quantized to discrete particles: the strong nuclear force is carried and transferred by the gluon, the weak nuclear force by W and Z bosons, and the electromagnetic force by the photon.

However, a hypothetical particle that transmits gravity called the graviton is incompatible under a quantum mechanics framework like the other forces are.

Interestingly, under the paradigm of string theory, a graviton is associated with a string frequency. Similarly, there is an associated string frequency for gluons, W and Z bosons, and photons, uniting gravity with quantum mechanics by describing them under the same framework.

Dimensions in string theory

The math for string theory that unifies these particles does not work in our current four-dimensional model. In fact, it only works in 10 dimensions, one of which is time. The actual view we would have of each dimension is also different from the previous progression we explored from 1D to 3D worlds.

String theory postulates two basic types of dimensions: those that are very large and expanded and those that are really small and wound up. Large dimensions are those that we can experience.

Brian Greene, a physicist specializing in the study of string theory, compares a large dimension to a wide carpet and a small dimension to the wound-up circular loops that make up the carpet that you have to bend down to see. In other words, we may have three dimensions we can experience and easily navigate and six other spatial dimensions that might be so tightly wound up that we simply can't perceive them.

Unfortunately, there has been no substantial experimental support for string theory thus far. That doesn't mean that string theory has no basis, though. It results from decades of analysis and intense study and is the closest physicists have ever come to the Theory of Everything. String theory has truly been significant in expanding our understanding of reality.

And who knows? Maybe someday, we will have evidence for string theory and multiple dimensions. Until then, it's certainly fun imagining them.


AI and Semiconductors – A Server GPU Market Analysis and Forecast, 2023-2028: Global AI and Server GPU Demand … – Yahoo Finance UK


Dublin, Jan. 23, 2024 (GLOBE NEWSWIRE) -- The "AI and Semiconductors - A Server GPU Market - A Global and Regional Analysis: Focus on Application, Product, and Region - Analysis and Forecast, 2023-2028" report has been added to ResearchAndMarkets.com's offering.

The global AI and semiconductor - a server GPU market accounted for $15.4 billion in 2023 and is expected to grow at a CAGR of 31.99% and reach $61.7 billion by 2028. The proliferation of edge computing, where data processing occurs closer to the source of data generation rather than relying solely on centralized cloud servers, is driving the demand for GPU servers. The increasing trend toward virtualization in data centers and enterprise environments is also a significant driver for GPU servers.
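As a quick sanity check on those headline numbers, the short sketch below compounds the 2023 base at the stated CAGR over the report's five-year forecast window. It uses only the figures quoted above and is purely illustrative.

```python
# Back-of-the-envelope check: does a 31.99% CAGR over 2023-2028 take $15.4B to ~$61.7B?
start_value_bn = 15.4        # 2023 market size, USD billions (quoted above)
cagr = 0.3199                # compound annual growth rate (quoted above)
years = 2028 - 2023          # five compounding periods

forecast_bn = start_value_bn * (1 + cagr) ** years
implied_cagr = (61.7 / start_value_bn) ** (1 / years) - 1

print(f"Implied 2028 value: ${forecast_bn:.1f}B")           # ~$61.7B
print(f"CAGR implied by the endpoints: {implied_cagr:.2%}")  # ~31.99%
```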

The rapid development of machine learning and artificial intelligence applications is a major driver of this trend. A key element of AI and ML is the training of sophisticated neural networks, which is accelerated in large part by GPU servers. Companies such as Nvidia, for instance, have noticed a spike in demand for their GPU products, such as the Nvidia A100 Tensor Core GPU, which is intended especially for AI tasks. The global AI and semiconductor - server GPU market is growing as a result of the use of GPU servers by a variety of industries, including healthcare, finance, and autonomous cars, to handle large datasets and increase the precision of AI models.

The application segment of the worldwide AI and semiconductor - server GPU market is broken down by end-use application, covering cloud computing (private, public, and hybrid clouds) and HPC applications (scientific research, machine learning, artificial intelligence, and other applications). The market has also been segmented by facility type, which includes blockchain mining facilities, HPC clusters, and data centers (hyperscale, colocation, enterprise, modular, and edge data centers).

According to estimates, the data center category held the biggest market share in 2022 and will continue to lead the market during the projection period.

The push toward GPU-accelerated computing in data centers is fueled by GPU technological breakthroughs that provide increased energy efficiency and performance. GPUs offer an efficient way to strike a balance between processing capacity and power consumption, which is something that data center operators are looking for in solutions. GPU servers can transfer certain computations from conventional CPUs to GPU servers, which improves overall performance and reduces energy consumption. Consequently, the increasing use of GPU servers in data centers is in line with the changing requirements of companies and institutions that want to manage the sustainability and efficiency of their data center operations while achieving higher levels of processing capacity.


Data center expansion and the rise of cloud computing services have further propelled the demand for GPU servers in North America. Cloud service providers, including industry giants such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud, are investing heavily in GPU infrastructure to offer customers high-performance computing capabilities on a scalable and cost-effective basis. This trend is particularly prominent as businesses increasingly rely on cloud-based resources for AI training, simulation, and other GPU-intensive tasks.

Demand - Drivers, Challenges, and Opportunities

Market Drivers:

GPU server producers can capitalize on this need by providing customized cryptocurrency mining solutions, including rigs specifically designed for mining, cloud-based mining services, or GPU-as-a-service platforms. By charging fees, charging subscriptions, or entering into contracts, these systems can make money for the makers while giving miners access to strong and scalable GPU resources.

The need for data center GPUs derives from their key role in AI model training and execution, which is especially advantageous for businesses engaged in computationally demanding tasks like engineering simulations and scientific research. Manufacturers of GPU servers can take advantage of this demand by providing specialized solutions for high-performance computing (HPC) applications, such as GPU-as-a-service platforms, cloud-based GPU services, and dedicated GPU servers. In addition to giving businesses scalable GPU resources, these customized services bring in money for the manufacturers through fees, subscriptions, or contracts.

Market Challenges:

The economies of scale provided by GPU manufacturers, most notably Nvidia, create a significant barrier to entry for manufacturers of data center GPU servers wishing to integrate backward. A company trying to backward integrate into the GPU production process, for example, would find it difficult to achieve equivalent economies of scale. This has an impact on the business's capacity to maintain overall competitiveness, engage in research and development, and match prices. As a result, it might be difficult for producers of data center GPU servers to achieve comparable economies of scale, which could limit their efficacy in the extremely competitive market. Additionally, a recurring problem for manufacturers of data center GPU servers is the continual innovation by GPU manufacturers, demonstrated by the ongoing development of GPUs, CPUs, and data processing units (DPUs).

Market Opportunities:

OpenAI's GPT-4, the latest and largest language model, is one specific real-time illustration of how GPU servers may help HPC and AI. It needed a lot of processing power to train on a huge dataset with over 1 trillion words. A significant contribution was made by GPU servers, more specifically by Nvidia H100 Tensor Core GPUs, which accelerated training by up to 60 times compared with CPUs alone. Mixed-precision training was used to achieve this acceleration by optimizing both calculation performance and memory use. Because of this, GPT-4 could be trained in a few short weeks and accomplish cutting-edge results in natural language processing tasks.

Artificial intelligence (AI) and advanced analytics play a crucial role in smart cities as they optimize resource allocation, enhance public safety, and improve overall quality of life. Due to their suitability for AI and analytics workloads, GPU servers are becoming an essential part of the infrastructure for the development of smart cities.

Market Segmentation:

Segmentation by Application (End User)

Cloud Computing

HPC Application

Segmentation by Product (Configuration Type)

Single GPU

Dual to Quad GPU

High-Density GPU

Segmentation by Region

North America - U.S. and Rest-of-North America

Europe - Germany, France, Netherlands, Italy, Ireland, U.K., and Rest-of-Europe

Asia-Pacific - Japan, China, India, Australia, Singapore, and Rest-of-Asia-Pacific

Rest-of-the-World - Middle East and Africa and Latin America

Some prominent names established in this market are:

GPU Manufacturers

Nvidia Corporation (Nvidia)

Advanced Micro Devices, Inc. (AMD)

Intel Corporation (Intel)

Server GPU Manufacturers

Dell Inc.

Penguin Computing, Inc.

Exxact Corporation

Key Attributes:

No. of Pages: 127

Forecast Period: 2023 - 2028

Estimated Market Value (USD) in 2023: $15.4 Billion

Forecasted Market Value (USD) by 2028: $61.7 Billion

Compound Annual Growth Rate: 31.9%

Regions Covered: Global

Key Topics Covered:

1 Market
1.1 Industry Outlook
1.1.1 Ongoing Trends
1.1.1.1 Timeline of GPU and Server Design Upgrades
1.1.1.2 Data Center Capacities: Current and Future
1.1.1.3 Data Center Power Consumption Scenario
1.1.1.4 Other Industrial Trends
1.1.1.4.1 HPC Cluster Developments
1.1.1.4.2 Blockchain Initiatives
1.1.1.4.3 Super Computing
1.1.1.4.4 5G and 6G Developments
1.1.1.4.5 Impact of Server/Rack Density
1.1.2 Equipment Upgrades and Process Improvements
1.1.3 Adaptive Cooling Solutions for Evolving Server Capacities
1.1.3.1 Traditional Cooling Techniques
1.1.3.2 Hot and Cold Aisle Containment
1.1.3.3 Free Cooling and Economization
1.1.3.4 Liquid Cooling Systems
1.1.4 Budget and Procurement Model of Data Center End Users
1.1.5 Stakeholder Analysis
1.1.6 Ecosystem/Ongoing Programs
1.2 Business Dynamics
1.2.1 Business Drivers
1.2.1.1 Surging Demand for Cryptocurrency Mining
1.2.1.2 Rising Enterprise Adoption of Data Center GPUs for High-Performance Computing Applications
1.2.2 Business Challenges
1.2.2.1 High Bargaining Power of GPU Manufacturers
1.2.3 Market Strategies and Developments
1.2.4 Business Opportunities
1.2.4.1 Technological Advancement in High-Performing Computing (HPC)
1.2.4.2 Government Support for Smart City Development and Digitalization
1.3 Global Data Center GPU Market
1.3.1 Market Size and Forecast
1.3.1.1 Data Center GPU Market (by Application and Product)

2 Application
2.1 Global AI and Semiconductors - A Server GPU Market (by Application)
2.1.1 Global Server GPU Market (by End-Use Application)
2.1.2 Global Server GPU Market (by Facility Type)

3 Products
3.1 Global AI and Semiconductors - A Server GPU Market (by Product)
3.1.1 Server GPU Market (by Configuration Type)
3.1.2 Server GPU Market (by Form Factor)
3.2 Pricing Analysis
3.3 Patent Analysis

4 Region
4.1 Global AI and Semiconductor - A Server GPU Market (by Region)

5 Markets - Competitive Benchmarking & Company Profiles
5.1 Competitive Benchmarking
5.2 Market Share Analysis
5.2.1 By GPU Manufacturer
5.2.2 By GPU Server Manufacturer
5.3 Company Profiles

Nvidia Corporation

Advanced Micro Devices

Intel

Qualcomm Technologies

Imagination Technologies

ASUSTeK Computer

INSPUR

Huawei Technologies

Super Micro Computer

GIGA-BYTE Technology

Penguin Computing

Advantech

Fujitsu

Dell Inc.

Exxact

For more information about this report visit https://www.researchandmarkets.com/r/386r

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.



Arbitrum Close to All-Time High in ETH Held in Its Smart Contracts – Unchained – Unchained

The rollup holds about 4.5 million ETH in its smart contracts, a 13% increase in the past 30 days and a 104% jump in the past year.

Arbitrum has attracted the most ETH among all blockchains.

(Shutterstock)

Posted January 26, 2024 at 2:19 pm EST.

Arbitrum, the leading rollup scaling solution for the Ethereum blockchain, is close to its all-time high for the amount of ETH held in its smart contracts.

"Arbitrum has made a lot more progress in decentralizing/securing their L2," Paul Vaden, a core contributor at derivatives trading platform Lyra, wrote to Unchained on Telegram. "So, for people who want to move to a safe battle-tested L2, Arbitrum is the obvious choice."

Users bridge their ether to Arbitrum for a variety of reasons. For example, users may want to trade on protocols based only on Arbitrum such as GMX, transact with lower costs compared to Ethereum, or experiment with Arbitrum's technology stack.

The number of ETH in Arbitrum compared to other layer 2 blockchains. (Growthepie)

Arbitrum holds nearly 4.5 million ETH in its smart contracts, a 13% increase in the past 30 days and a 104% jump in the past year. Optimism, Base, zkSync Era, and Linea's combined amount of ETH is less than Arbitrum's figure, data from layer 2 analytics website growthepie.xyz shows.

Read more: Arbitrum Commands Nearly Half of Total Market Share in Ethereum Rollups: Nansen

Arbitrum reached an all-time high of more than 4.5 million ETH in its smart contracts roughly three weeks ago on January 5.

Rollups, considered scaling solutions for Ethereum, aim to optimize transaction speed and cost efficiency for users by moving computation off-chain and bundling a collection of transactions before posting them to Ethereum's base layer.

The main Arbitrum bridge currently holds more than 1.4 million ETH, making it the sixth-largest holder of ETH, behind Ethereum's staking deposit contract, the smart contract for wrapped ETH, one Binance wallet, one Kraken address, and Robinhood.

The main Arbitrum bridge has seen consistent growth in its ETH balance. (Etherscan)

The number of ETH in this Arbitrum bridge has more than doubled from about 700,000 in September 2022, data from blockchain explorer Etherscan shows, highlighting how people have been increasingly moving their ETH from Ethereum's base layer to Arbitrum's layer 2.
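For readers who want to track a contract balance like this themselves, here is a minimal sketch using web3.py. The RPC endpoint and the contract address below are placeholders to fill in (look up the canonical Arbitrum One bridge address on Etherscan first), so treat it as an illustration rather than a verified query.

```python
# Minimal sketch: read the ETH held by a contract (e.g., a bridge) on Ethereum's base layer.
# Requires `pip install web3`. RPC_URL and BRIDGE_ADDRESS below are placeholders.
from web3 import Web3

RPC_URL = "https://eth-mainnet.example-rpc.com"                 # placeholder JSON-RPC endpoint
BRIDGE_ADDRESS = "0x0000000000000000000000000000000000000000"   # placeholder contract address

w3 = Web3(Web3.HTTPProvider(RPC_URL))
balance_wei = w3.eth.get_balance(Web3.to_checksum_address(BRIDGE_ADDRESS))
print(f"Contract holds {Web3.from_wei(balance_wei, 'ether'):,.0f} ETH")
```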

Disclosure: Arbitrum is a sponsor for Unchained.
