
Using OpenAI and PandasAI for Series Operations | by Michael B Walker | Jun, 2024 – Towards Data Science

Incorporate natural language queries and operations into your Python data cleaning workflow. Red panda drawing donated by Karen Walker, the artist.

Many of the Series operations we need to do in our pandas data cleaning projects can be assisted by AI tools, including PandasAI. PandasAI takes advantage of large language models, such as those from OpenAI, to enable natural language queries and operations on data columns. In this post, we examine how to use PandasAI to query Series values, create new Series, set Series values conditionally, and reshape our data.

You can install PandasAI by entering pip install pandasai into a terminal or into Windows PowerShell. You will also need to get a token from openai.com to send a request to the OpenAI API.

As the PandasAI library is developing rapidly, you can anticipate different results depending on the versions of PandasAI and pandas you are using. In this article, I use version 1.4.8 of PandasAI and version 1.5.3 of pandas.

We will work with data from the National Longitudinal Study of Youth (NLS) conducted by the United States Bureau of Labor Statistics. The NLS has surveyed the same cohort of high school students for over 25 years, and has useful data items on educational outcomes and weeks worked for each of those years, among many other variables. It is available for public use at nlsinfo.org. (The NLS public releases are covered by the United States government Open Data Policy, which permits both non-commercial and commercial use.)

We will also work with COVID-19 data provided by Our World in Data. That dataset has one row per country per day with number of new cases and new deaths. This dataset is available for download at ourworldindata.org/covid-cases, with a Creative Commons CC BY 4.0 license. You can also download all code and data used in this post from GitHub.

We start by importing the OpenAI and SmartDataframe modules from PandasAI. We also have to instantiate an llm object:
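The code block that followed here was lost in extraction. A minimal sketch of the setup, assuming the PandasAI 1.x API (import paths changed in later versions) and with a placeholder API token:

```python
from pandasai import SmartDataframe
from pandasai.llm import OpenAI

# Instantiate the llm object with an OpenAI API token.
# "your-api-token" is a placeholder; use the token you got from openai.com.
llm = OpenAI(api_token="your-api-token")
```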

Next, we load the DataFrames we will be using and create a SmartDataframe object from the NLS pandas DataFrame:
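The loading code is also missing; a sketch under the same assumptions (the file paths are hypothetical stand-ins for wherever you keep the NLS and COVID-19 data):

```python
import pandas as pd
from pandasai import SmartDataframe

# Load the pandas DataFrames (hypothetical file paths).
nls97 = pd.read_csv("data/nls97.csv")
covidcases = pd.read_csv("data/covidcases.csv")

# Wrap the NLS DataFrame in a SmartDataframe, attaching the llm
# object instantiated earlier so chat queries can reach the API.
nls97sdf = SmartDataframe(nls97, config={"llm": llm})
```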

Now we are ready to generate summary statistics on Series from our SmartDataframe. We can ask for the average for a single Series, or for multiple Series:
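The query code did not survive extraction. As a stand-in, here is a self-contained sketch using a toy DataFrame (the values are invented, and the chat phrasings are my guesses at the original wording), alongside the plain pandas that PandasAI generates behind the scenes:

```python
import pandas as pd

# Toy stand-in for the NLS data; values are invented for illustration.
nls97 = pd.DataFrame({
    "satmath":   [470, 520, 610, 400],
    "satverbal": [400, 530, 600, 420],
})

# With a SmartDataframe, the natural-language versions would be roughly:
#   nls97sdf.chat("Show average of satmath")
#   nls97sdf.chat("Show averages of satmath and satverbal")
# PandasAI translates those into ordinary pandas code like this:
satmath_mean = nls97["satmath"].mean()
sat_means = nls97[["satmath", "satverbal"]].mean()
print(satmath_mean)   # 500.0
print(sat_means)
```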

We can also summarize Series values by another Series, usually one that is categorical:
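A sketch of the grouped version, again on an invented toy frame; the chat phrasing "Show satmath average by gender" is the one discussed later in the post:

```python
import pandas as pd

# Toy frame; column names mirror the NLS data, values are invented.
nls97 = pd.DataFrame({
    "gender":  ["Female", "Male", "Female", "Male"],
    "satmath": [500, 480, 540, 520],
})

# Roughly what nls97sdf.chat("Show satmath average by gender") generates:
satmath_by_gender = nls97.groupby("gender")["satmath"].mean()
print(satmath_by_gender)
# Female: 520.0, Male: 500.0
```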

We can also create a new Series with the chat method of SmartDataframe. We do not need to use the actual column names. For example, PandasAI will figure out that we want the childathome Series when we write "child at home":
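The example the post returns to later is a childnum Series; a self-contained pandas sketch of what that chat call produces (toy values, and the chat wording is an assumption):

```python
import pandas as pd

nls97 = pd.DataFrame({
    "childathome":    [2, 0, 1],
    "childnotathome": [0, 1, 1],
})

# Something like nls97sdf.chat("Add child at home to child not at home,
# assign the result to childnum") resolves "child at home" to the
# childathome column and generates:
nls97["childnum"] = nls97["childathome"] + nls97["childnotathome"]
print(nls97["childnum"].tolist())   # [2, 1, 2]
```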

We can use the chat method to create Series values conditionally:

PandasAI is quite flexible regarding the language you might use here. For example, the following provides the same results:
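Both phrasings boil down to the same conditional assignment; a runnable pandas sketch with toy data:

```python
import numpy as np
import pandas as pd

nls97 = pd.DataFrame({
    "maritalstatus": ["Married", "Never-married", "Divorced"],
})

# Either chat phrasing below generates code equivalent to np.where:
#   "evermarried is No when maritalstatus is Never-married, else Yes"
#   "if maritalstatus is Never-married set evermarried2 to No, otherwise Yes"
nls97["evermarried"] = np.where(
    nls97["maritalstatus"] == "Never-married", "No", "Yes")
print(nls97["evermarried"].tolist())   # ['Yes', 'No', 'Yes']
```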

We can do calculations across a number of similarly named columns:

This will calculate the average of all weeksworked00-weeksworked22 columns and assign that to a new column called weeksworked.
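A sketch of that row-wise average with just three of the weeksworked columns (the real data runs from weeksworked00 through weeksworked22; the chat wording is an assumption):

```python
import pandas as pd

nls97 = pd.DataFrame({
    "weeksworked00": [50, 20],
    "weeksworked01": [52, 40],
    "weeksworked02": [48, 30],
})

# Roughly what a chat instruction like "calculate the average of all
# weeksworked columns and assign it to weeksworked" generates:
weekscols = nls97.filter(like="weeksworked").columns
nls97["weeksworked"] = nls97[weekscols].mean(axis=1)
print(nls97["weeksworked"].tolist())   # [50.0, 30.0]
```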

We can easily impute values where they are missing based on summary statistics:
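A minimal sketch of mean imputation on a toy Series (the chat wording is an assumption):

```python
import numpy as np
import pandas as pd

nls97 = pd.DataFrame({"satmath": [400.0, 500.0, np.nan, 600.0]})

# Roughly what a chat instruction like "set satmath to the satmath
# average when satmath is missing" generates:
nls97["satmath"] = nls97["satmath"].fillna(nls97["satmath"].mean())
print(nls97["satmath"].tolist())   # [400.0, 500.0, 500.0, 600.0]
```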

We can also use PandasAI to do some reshaping. Recall that the COVID-19 case data has new cases for each day for each country. Let's say we only want the first row of data for each country. We can do that the traditional way with drop_duplicates:
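The traditional version might look like the following sketch; the toy frame mimics the one-row-per-country-per-day shape of the COVID-19 data:

```python
import pandas as pd

covidcases = pd.DataFrame({
    "location":  ["Afghanistan", "Afghanistan", "Albania", "Albania"],
    "casedate":  ["2020-03-01", "2020-03-02", "2020-03-09", "2020-03-10"],
    "new_cases": [1, 3, 2, 4],
})

# Sort by country and date, then keep only the first row per country.
firstcase = (covidcases
    .sort_values(["location", "casedate"])
    .drop_duplicates(["location"], keep="first")
    .set_index("location"))
print(firstcase["casedate"].tolist())   # ['2020-03-01', '2020-03-09']
```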

We can get the same results by creating a SmartDataframe and using the chat method. The natural language I use here is remarkably straightforward: "Show first casedate and location and other values for each country":

Notice that PandasAI makes smart choices about the columns to get. We get the columns we need rather than all of them. We could have also just passed the names of the columns we wanted to chat. (PandasAI sorted the rows by iso_code, rather than by location, which is why the first row is different.)

Much of the work when using PandasAI is really just importing the relevant libraries and instantiating large language model and SmartDataframe objects. Once that's done, simple sentences sent to the chat method of the SmartDataframe are sufficient to summarize Series values and create new Series.

PandasAI excels at generating simple statistics from Series. We don't even need to remember the Series name exactly. Often the natural language we might use can be more intuitive than traditional pandas methods like groupby. The "Show satmath average by gender" value passed to chat is a good example of that.

Operations on Series, including the creation of a new Series, are also quite straightforward. We created a total-number-of-children Series (childnum) by instructing the SmartDataframe to add the number of children living at home to the number of children not living at home. We didn't even provide the literal Series names, childathome and childnotathome respectively. PandasAI figured out what we meant.

Since we are passing natural language instructions to chat for our Series operations, there is no one right way to get what we want. For example, we get the same result when we passed "evermarried is No when maritalstatus is Never-married, else Yes" to chat as we did with "if maritalstatus is Never-married set evermarried2 to No, otherwise Yes".

We can also do fairly extensive DataFrame reshaping with simple natural language instructions, as in the last command we provided. We add "and other values" to the instructions to get columns other than casedate. PandasAI also figures out that location makes sense as the index.

You can read more about how to use PandasAI and SmartDataframes here:

Or in the second edition of my book, Python Data Cleaning Cookbook:

Good luck with your data cleaning and I would love to hear how things are going!


Lenovo adopts Chinese Loongson CPUs for cloud servers 16-core Loongson 3C5000 chips necessary to rebuff US … – Tom’s Hardware

This week, Chinese CPU developer Loongson published 105 programs from 53 developers that natively support its 5000- and 6000-series processors based on the proprietary LoongArch architecture. As the list revealed, Lenovo has quietly deployed Loongson's processors in its datacenters and is running cloud services on them, reports The Register. The scale of the deployment is unclear, but the revelation highlights Lenovo's commitment to using Chinese CPUs.

For now, Lenovo offers three software packages that support Loongson's LoongArch-based platforms: Wentian WxSphere Server Virtualization System Software V8.0 (16-core 3C5000L/3C5000), Wentian WxCloud Cloud Computing Management Platform V3.0 (16-core 3C5000L/3C5000), and Wentian WxStack Hyper-converged System Software V8.0 (quad-core 3A6000). For Lenovo, this is enough to deploy Loongson's 5000-series CPUs commercially for its cloud services and to prepare to deploy the next-generation Loongson 6000-series processors.

Loongson has quietly gained traction in China with mini PCs aimed at the channel, NAS, and the education sector. These moves align with China's increasing urgency to replace Western technology with homegrown solutions, driven by policy objectives and by necessity due to U.S.-led sanctions.

Deploying 16-core 3C5000 processors for cloud services is something new, but it shows that Lenovo is confident in these CPUs and their successors, which will feature up to 128 cores. Lenovo's support for Loongson's architecture is crucial in making Chinese hardware a viable alternative to existing enterprise technologies. This support is expected to challenge companies like AMD and Intel, especially given China's vast market, which includes major telecommunications companies with extensive customer bases.

It is unclear whether it makes much financial sense to use 16-core CPUs for cloud services nowadays, as there are more powerful equivalents from traditional x86 CPU vendors specifically architected for such workloads. However, Lenovo needs to learn how Loongson's CPUs behave with its instances today and try out the next-generation DragonChain microarchitecture-based processors that will be rolling out over the next couple of years.

Notably, Lenovo's software stack is not the only cloud platform in China to support Loongson's processors; there are ten more platforms from various vendors, so there are more Loongson-based cloud deployments in the country.



Is Apple going to give us the XServe replacement we need? – XDA Developers


There was plenty announced at WWDC this year, from new versions of iOS and macOS, to the AI-ification of everything. But hidden in Apple's plethora of announcements was Private Cloud Compute (PCC), a technical model for offloading AI processing capabilities to the cloud in a privacy-focused manner. There's a lot to unpack about PCC, but one thing did catch our eye. It runs exclusively on Apple Silicon, making use of several semi-proprietary technologies like Secure Enclave in the cloud.

Apple has made servers before, but they haven't shipped a new one for well over a decade. With this new market for Apple Silicon-based cloud computing, could Apple be about to finally announce a replacement for the classic XServe?


The original XServe marked Apple's entry into the enterprise server market. Released in 2002, XServe was a series of rack-mountable servers running a custom version of macOS. Capable of running either individually or in clusters, it was especially favored by creatives and educational institutes due to its collaborative features and easy onboarding for specific sectors. This server OS (at the time, known as Mac OS X Server) was relatively unique in the enterprise space in that it provided a graphical user interface not just to set up the operating system, but also to set up and enable core functionalities like hosting web pages, email servers, or databases.

XServe had several generations, even making the switch from PowerPC to Intel alongside the rest of the Mac lineup, and was initially very popular. The creative sector in particular loved the XServe, with its easy setup and seamless integrations with existing Mac products, and applications like Final Cut Server, all making it perfect for scaling up creativity.

An XServe cluster in use at NASA (Source: Wikimedia Commons)

Ultimately the original XServe was dropped in 2010 as its popularity faded. Less and less effort was put into maintaining the bespoke applications and GUIs that made the XServe popular, and eventually, the dedicated version of Mac OS X Server was rolled into mainline macOS. This spelled the end for XServe, and it was announced that any further development would cease.

In the aftermath of XServe's demise, a frustrated customer emailed Steve Jobs, only to be told "Hardly anyone was buying them."

In theory, the Mac Pro Server and Mac Mini Server replaced the XServe. In practice, though, this ushered in a decade of pain for anyone trying to run Mac OS X in the data center. The 'server' OS was bundled into mainline Mac OS X, and the server-specific components were made available as a download through the newly launched Mac App Store. Development continued on the server elements, and it still exists now in some form.

The hardware for the Mac Pro Server and Mac Mini Server didn't last long though. The Mac Pro Server was discontinued in 2013, after barely a single refresh from its late 2010 launch. Ouch. The fate of the Mac Mini Server wasn't much better, and it was discontinued in 2014. There was no clear replacement for either of these products, and running Mac OS X in the data center became even harder. In the wake of this, we got the trashcan Mac Pro, which, well, the less said the better. It was everything you didn't want in a server rack - circular, badly cooled, and difficult to repair.


Fast-forward to today and macOS is a nightmare for the data center. Due to the way Apple's ecosystem works for certain things, like Xcode or iOS app development, developers are required to build jobs on Apple hardware or run certain code tests. To do this at scale, big companies often need access to tens or hundreds of macOS machines in the cloud.

The combination of developers and businesses needing access to macOS in the cloud, the difficulty of virtualizing it, and the lack of any appropriate form factors has led to a whole side industry of Mac Mini hosting. You can rent Macs in the cloud either through dedicated providers like macincloud.com, or more recently they've also become available through AWS. Depending on the provider (or which spec you select), these will likely be Apple Silicon Mac Minis or Mac Pros mounted in a data center somewhere.

As we mentioned at the start, Apple is now running its PCC at scale, on Apple Silicon hardware. Apple doesn't mention what hardware they're using for this. The modern Mac Pro has a rack mount version, but it's prohibitively expensive and is more intended for professionals in the music space than for data center use. The Mac Mini is also far from ideal, and wasn't designed for data center utilization. The Mac Studio is probably the most reasonable candidate, but again, it is a far cry from the rack-mountable hardware we're used to in the data center.

This raises the question - does Apple have some more hardware in the pipeline? And could we finally see a worthy successor to XServe? Cloud hosting for Macs has been a nightmare for years, and it's a problem Apple is surely aware of (as it's running more and more of its silicon in the cloud right now). Apple is almost certainly never going to compete directly for cloud compute, but what has changed since the discontinuation of XServe is the scale of Apple's own usage. The company is using more and more of its hardware in the cloud, and building additional software services like PCC on top of its proprietary hardware. Given Apple's track record for software support, the company is likely to commit to years of supporting Apple Silicon in the data center.

Whether we will ever actually get new, proper data center hardware from Apple, we don't know. It's fully possible that Apple identifies its market as mostly internal, and already has specialized hardware that won't ever be released for public consumption. But if any of this potentially novel server hardware does see the light of day, you can guess it won't be by halves, and we'll be doing our utmost to dig into every detail of it. That said, it's unlikely I'll be using Apple Silicon to replace my NAS any time soon.



Quantum sensing: quantum technology you’ve never heard of – Cosmos

Quantum physics has become ubiquitous across science over the past few years, often in connection to advances and investment in quantum computing research.

However, it's quantum sensing where much of the investment in quantum technologies is directed, and it's a growing area of research.

It is true that quantum computers will one day offer us increased computing power and efficiency; however, the goal of creating a fully-fledged quantum computer is many years away.

This is due to the engineering challenges involved; it is extremely difficult to maintain a qubit (the building block of a quantum computer) in a quantum state long enough to use it. Any outside perturbations cause the system to collapse, rendering it useless. Even the tiniest fluctuations in properties like magnetic and electric fields or temperature can cause the collapse of a quantum state.

This sensitivity presents an obvious challenge to the development of a quantum computer; however, researchers can harness this sensitivity, and access interactions and phenomena at levels well outside the range of conventional sensing approaches.

Today's quantum sensors have their roots in well-established techniques such as magnetic resonance imaging (MRI), which is founded on similar quantum mechanical principles. In an MRI experiment, individual nuclei are used as qubits, which report on their surrounding environment. Similarly, most modern quantum sensing uses either a nuclear or electronic spin as a qubit.

As the name suggests, MRIs measure how the magnetic field environment around hydrogen nuclei affects their behaviour. In many cases, modern quantum sensors are also used as highly sensitive magnetic field detectors. Unlike MRI however, they often combine magnetic field sensitivity with extremely high spatial resolution and the prospect of low cost and portability. Together, these attributes make them useful across a diverse collection of industries and research areas.

For example, one promising application of quantum sensing is the identification of novel materials for use in classical computers.

To maintain the utility of classical computers into the future, considerations around power consumption and size constraints will need to be addressed. Electrical engineers are interested in new materials, such as graphene and perovskite, which will offer benefits over traditional silicon-based devices.

Quantum sensing is helping to understand the magnetic behaviour of these novel materials; a vital requirement for selecting those worth further development.

As molecular biology has advanced, questions about the nature of intracellular interactions, such as those within or between individual proteins, have become the target of fundamental research. Quantum sensors can offer unique information at a higher resolution than traditional techniques like light microscopy.

Researchers are hopeful that with this new level of detail, quantum sensing can be used to answer questions useful to medical science, such as how to design better drugs, the nature of neuronal signalling, and how to more accurately diagnose disease. These goals are being addressed by the new 7-year ARC Centre of Excellence in Quantum Biotechnology.

Quantum sensing has also seen strong uptake within the mineral resources sector, where it can be used to identify new mineral extraction sites via the subtle magnetic fields they produce. SQUID magnetometers (Superconducting QUantum Interference Devices, which use quantised superconducting states as the sensor) are already deployed for this task and can detect magnetic fields many times smaller than the Earth's.

Finally, given their unique sensitivity, physicists are also interested in the new physical regimes quantum sensors could access. Quantum sensors may end up helping scientists answer some of the most fundamental questions in physics, such as the nature of dark matter or gravity. SQUIDs have recently been deployed at the Simons Observatory in Chile to help detect cosmic microwave background (CMB) radiation. In this case, instead of a magnetic signal, what is detected is the heat created when a CMB photon collides with a SQUID, disrupting its quantum state.


Tiny Quantum Ghosts Might Be Creating Brand-New Elements – Popular Mechanics

In the beginning, there was lots and lots of hydrogen and helium; that is, until the fiery fusion furnaces of primordial stars began churning out heavier elements. Nuclear fusion can form elements only up to an atom containing 26 protons and 30 neutrons (aka iron) before the star inevitably collapses. Of course, there's just one problem. If you've happened to glance at a periodic table lately, there are many more elements with atomic masses far beyond iron. So what gives?

Turns out there's another element-producing process at work, and it's called neutron capture, or nucleosynthesis. This process breaks down into two different types, the rapid neutron-capture process (r-process) and the slow neutron-capture process (s-process), and each is roughly responsible for creating half of the known elements beyond iron. As their names suggest, these processes occur in very different environments. The r-process requires a high density of free neutrons (think neutron star mergers or supernova collapses), while the s-process occurs in asymptotic giant branch (AGB) stars.

But as with most things in astrophysics, things are not quite so black and white. Back in 1977, scientists proposed a third process, known as the intermediate process (i-process), that exists sort of in between both the r- and s-processes. The idea faded with time but has regained attention in recent years due to the enigma known as carbon-enhanced metal-poor (CEMP) r/s stars, which produce abundances of carbon and heavy elements associated with both processes. Now, a new study from the University of Wisconsin-Madison investigates how exactly such an i-process would work, and the solution to this very big mystery veers into the very small quantum world.

"When a supernova collapse occurs, you start with a big star, which is gravitationally bound, and that binding has energy," UW-Madison's Baha Balantekin, a co-author of a paper on the i-process published in The Astrophysical Journal, said in a press statement. While the i-process is a nucleosynthesis middle child, one aspect it shares with the r-process is that it only occurs in similarly violent conditions. "When it collapses, that energy has to be released, and it turns out that energy is released in neutrinos."

It's when those neutrinos experience quantum entanglement, due to interactions in a supernova, that the i-process can take over and produce heavy elements. This entanglement means the two neutrinos remember each other no matter how far apart they may be. Using well-known rates of neutron capture, catalogs of atomic spectra of various stars, and data surrounding neutrino production via supernova, the team ran simplified simulations (supernovae produce 10^58 neutrinos, after all) and arrived at differing abundances depending on whether these neutrinos were entangled or not.

"We have a system of, say, three neutrinos and three antineutrinos together in a region where there are protons and neutrons and see if that changes anything about element formation," Balantekin says. "We calculate the abundances of elements that are produced in the star, and you see that the entangled or not entangled cases give you different abundances."

There are a few things about this hypothesis that still need to be tested; chief among them is that neutrino-neutrino interactions are largely hypothetical at this point. However, this new process could help further explain how something came from nothing.

Darren lives in Portland, has a cat, and writes/edits about sci-fi and how our world works. You can find his previous stuff at Gizmodo and Paste if you look hard enough.


Manipulating the quantum dance of spinning electrons – Earth.com

In the world of spinning electrons and quantum states, an exciting realm that's reshaping our everyday lives through our gadgets, researchers have made a discovery that promises even more powerful storage and processing capacities.

Just like a compass needle aligns itself to a magnetic field, electrons possess an inherent angular momentum, termed spin.

Beyond their electric charge, which dictates behavior in electronic circuits, their spin has become pivotal for storing and processing data.

In our current gadgets, such as MRAM memory elements (magnetic random access memories), information is stored via small classical magnets.

These comprise a myriad of electron spins. The MRAMs, in turn, operate on spin-aligned electron currents, which can shift magnetization at a certain point in a material.

Researcher Pietro Gambardella and his team at ETH Zurich discovered that spin-polarized currents can also govern the quantum states of single electron spins.

Their findings, freshly published in the scientific journal Science, promise great potential for controlling quantum states of quantum bits (qubits).

"Electron spins have been traditionally manipulated utilizing electromagnetic fields like radio-frequency waves or microwaves," explains Sebastian Stepanow, a Senior Scientist in Gambardella's laboratory.

This established technique, known as electron paramagnetic resonance, traces back to the mid-1940s and has found use in assorted fields such as material research, chemistry, and biophysics.

However, the exact mechanism of inducing electron paramagnetic resonance in singular atoms has remained hazy.

To delve deeper into the quantum mechanical processes behind this mechanism, the researchers readied pentacene molecules (an aromatic hydrocarbon) onto a silver substrate.

A thin insulating layer of magnesium oxide, previously deposited on the substrate, ensures that the electrons in the molecule behave more or less as they would in free space.

The researchers used a scanning tunnelling microscope to measure the current created when the electrons tunnelled quantum mechanically from the tip of a tungsten needle to the molecule.

Classical physics would argue against this process, but quantum mechanics empowers the electrons to tunnel through the gap, generating a measurable current.

By applying a constant voltage and a rapidly oscillating voltage to a magnetized tungsten tip, and subsequently measuring the resulting tunnel current, the team was able to observe characteristic resonances in the tunnel current.

The shape of these resonances allowed them to infer the processes between the tunnelling electrons and those of the molecule.

Through their data analysis, Stepanow and his team reaped two critical insights.

Firstly, the electron spins in the pentacene molecule reacted to the electromagnetic field created by the alternating voltage, similar to ordinary electron paramagnetic resonance.

Secondly, they found an additional process at play that also influenced the spins of the electrons in the molecule.

"This process is the so-called spin transfer torque," says PhD student Stepan Kovarik. "Under the influence of a spin-polarized current, the spin of the molecule is altered without any direct action of an electromagnetic field."

The ETH researchers demonstrated that it's possible to create quantum mechanical superposition states of the molecular electron spin, states of the kind used in quantum technologies.

"Spin control by spin-polarized currents at the quantum level opens the way to numerous potential applications," Kovarik predicts.

Unlike electromagnetic fields, spin-polarized currents act locally and can be steered with a precision of less than a nanometer.

They could be deployed to address electronic circuit elements in quantum devices with extreme precision, thereby controlling the quantum states of magnetic qubits.

Time will tell how this exciting development will translate into practical applications in storing and processing data. But until then, thanks to the relentless curiosity of scientists like Gambardella, Stepanow, and Kovarik, our understanding of the quantum dance of electrons continues to evolve.

The full study was published in the journal Science.



The Future of Quantum Computing with Neutral-Atom Arrays – The Quantum Insider

At the recent MCQST Colloquium held at the Max Planck Institute for Quantum Optics, Johannes Zeiher provided a compelling overview of the advances in quantum simulation and quantum computing with neutral-atom arrays. His presentation offered valuable insights into how these systems are poised to transform quantum technology.

Zeiher started by explaining the core motivation behind their work.

"Our goal is to understand, control and create many-body systems using individually controllable neutral atoms," he stated. These neutral atoms, arranged using optical tweezers, serve as a powerful platform for studying quantum phenomena due to their high level of controllability and scalability.

One of the key advantages of neutral-atom arrays is their ability to simulate complex quantum systems.

"We can use these systems to study strongly correlated systems, transport, out-of-equilibrium dynamics, and phase transitions," Zeiher elaborated. This capability is vital for exploring fundamental aspects of quantum mechanics and for developing new technological applications.

Zeiher also stressed the importance of long-range interactions in these systems.

"Long-range interactions introduce competing length scales, which can lead to rich and complex physical phenomena," he noted. By manipulating these interactions, researchers can simulate various phases of matter, such as the superfluid and Mott insulator phases, and even more exotic states like the Haldane insulator and density wave phases.

In terms of practical applications, Zeiher discussed the potential of neutral-atom arrays in quantum computing.

"Neutral atoms offer a promising platform for quantum computing due to their scalability and the high fidelity of quantum gates," he said. Recent advancements have pushed the fidelity of two-qubit gates to over 99.5%, putting them on par with other leading quantum computing platforms.

One of the groundbreaking techniques Zeiher discussed is the use of Rydberg dressing. By coupling atoms off-resonantly to Rydberg states, researchers can induce long-range interactions while maintaining a high level of stability. He explained that Rydberg dressing allows them to significantly enhance the lifetime of these states, enabling complex quantum simulations and computations over extended periods.

Zeiher concluded his talk by drawing attention to the broader implications of their research.

"The ability to control and manipulate neutral atoms with such precision opens up new frontiers in both quantum simulation and quantum computing," he remarked.

The insights from these systems do not just push understanding of quantum mechanics further; they will also serve as a gateway to innovative technologies with revolutionary potential in many fields, from materials science to cryptography.

In his talk at the MCQST Colloquium, Zeiher laid out the revolutionary potential that neutral-atom arrays hold for quantum technology. Given developments in controlling long-range interactions and in the fidelity of quantum gates, these systems will be of great importance for the future of quantum computing and simulation.


Rare form of quantum matter created with molecules for the first time – Earth.com

Scientists have produced a rare form of quantum matter known as a Bose-Einstein condensate (BEC) using molecules instead of atoms.

Made from chilled sodium-cesium molecules, these BECs are as chilly as five nanokelvin, or about -459.66 F, and stay stable for a remarkable two seconds.

"These molecular BECs open up new research arenas, from understanding truly fundamental physics to advancing powerful quantum simulations," noted Columbia University physicist Sebastian Will. "We've reached an exciting milestone, but it's just the kick-off."

A Bose-Einstein Condensate (BEC) represents a state of matter that occurs when a collection of bosons, particles that follow Bose-Einstein statistics, are cooled to temperatures very close to absolute zero.

Under such extreme conditions, a significant fraction of the bosons occupy the lowest quantum state, resulting in macroscopic quantum phenomena.

This means that they behave as a single quantum entity, effectively collapsing into a single wave function that can be easily described using the principles of quantum mechanics.
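
For an ideal, uniform Bose gas, the textbook result for the fraction of particles occupying that lowest state below the critical temperature T_c is N0/N = 1 − (T/T_c)^(3/2); a minimal sketch of that formula (standard statistical mechanics, not a calculation from the study):

```python
def condensate_fraction(t, t_c):
    """Ideal Bose gas condensate fraction N0/N at temperature t (both in kelvin)."""
    if t >= t_c:
        return 0.0  # above the transition, no macroscopic ground-state occupation
    return 1.0 - (t / t_c) ** 1.5

# At absolute zero every boson sits in the ground state...
print(condensate_fraction(0.0, 1.0))  # 1.0
# ...and the fraction falls smoothly to zero as T approaches T_c.
print(condensate_fraction(0.5, 1.0))  # ~0.65
```

Real interacting gases deviate from this ideal-gas curve, but it captures why a "significant fraction" of particles pile into a single quantum state near absolute zero.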

The fascinating aspect of BECs stems from their superfluid properties: they exhibit zero viscosity as they flow, which allows them to move without dissipating energy.

This unique property enables BECs to simulate other quantum systems and explore new realms of physics.

For instance, studying BECs can provide insights into quantum coherence, phase transitions, and many-body interactions in quantum gases.

The creation of molecular BECs, like those involving sodium-cesium molecules, extends this exploration even further, potentially leading to breakthroughs in quantum computing and precision measurements.

The journey of BECs is a long and winding one, dating back a century to the works of physicists Satyendra Nath Bose and Albert Einstein.

They predicted that a cluster of particles cooled to the brink of standstill would merge into a single macroscopic entity, governed by the dictates of quantum mechanics. The first true atomic BECs emerged in 1995, 70 years after the original theoretical predictions.

Atomic BECs have always been relatively simple round objects with minimal polarity-based interactions. But the scientific community came to crave a more complex version of BECs composed of molecules, albeit to no avail.

Finally, in 2008, the first breakthrough came when a duo of physicists chilled a gas of potassium-rubidium molecules to about 350 nanokelvin. The quest for achieving an even lower temperature to cross the BEC threshold continued.

In 2023, the initial step towards this goal was achieved when the research group created their desired ultracold sodium-cesium molecule gas using a blend of laser cooling and magnetic manipulations. To further decrease the temperature, they decided to introduce microwaves.

Microwaves can construct small shields around each molecule, preventing them from colliding and leading to a drop in the overall temperature of the sample.

The group's achievement of creating a molecular BEC represents a spectacular accomplishment in quantum control technology.

This brilliant piece of scientific work is bound to impact a multitude of scientific fields, from the study of quantum chemistry to the exploration of complex quantum materials.

"We really have a thorough understanding of the interactions in this system, which is vital for the subsequent steps, like exploring dipolar many-body physics," said co-author and Columbia postdoc Ian Stevenson.

The research team developed schemes to control interactions, tested them theoretically, and executed them in the actual experiment. "It's truly wondrous to witness the realization of these microwave shielding concepts in the lab."

The creation of molecular BECs enables the fulfilment of numerous theoretical predictions. The stable nature of these molecular BECs allows extensive exploration of quantum physics.

One proposal is to build artificial crystals by holding BECs in a laser-made optical lattice, which could provide a comprehensive simulation of the interactions in natural crystals.

On switching from a three-dimensional system to a two-dimensional one, new physics is expected to emerge. This area of research opens up a plethora of possibilities in the study of quantum phenomena, including superconductivity and superfluidity, amongst others.

"This feels like a whole new universe of possibilities unveiling itself," Sebastian Will concluded, summing up the enthusiasm in the scientific community.

In summary, this research chronicles the successful creation of a Bose-Einstein Condensate (BEC) using ultracold sodium-cesium molecules, reaching a stable state at five nanoKelvin for two seconds.

Leveraging a combination of laser cooling, magnetic manipulations, and innovative microwave shielding, the research group and their theoretical collaborator achieved unprecedented control over molecular interactions at quantum levels.

This milestone enables comprehensive exploration of quantum phenomena such as coherence, phase transitions, and many-body interactions, potentially unlocking new avenues in quantum simulations, quantum computing, and precision measurements.

The full study was published in the journal Nature.

Special thanks to Ellen Neff from Columbia University.



Yes, the Most Massive Particle Shows Some ‘Spooky Action At a Distance’ – Popular Mechanics

At its bare essentials, every atom is made of two kinds of fundamental particles: electrons and quarks. But not all quarks are the same. In fact, quarks come in six flavors, and the top quark is by far the heaviest.

Top quarks weigh in at an impressive 175.6 gigaelectron volts (GeV), about the same mass as an atomic nucleus of gold, but only exist for roughly 10^-25 seconds before decaying into free particles. Because of the top quark's (relatively) massive bulk, it took decades after the discovery of the bottom quark for the U.S.-based Fermilab to create an accelerator capable of detecting the elusive particle.
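
To put the 175.6 GeV figure in everyday units, E = mc^2 converts the rest energy to a mass in kilograms; a quick sketch (the calculation is ours, using standard SI constants):

```python
GEV_TO_JOULES = 1.602176634e-10  # 1 GeV in joules (exact under the 2019 SI)
C = 2.99792458e8                 # speed of light in m/s (exact)

def gev_to_kg(energy_gev):
    """Convert a rest energy in GeV to a mass in kilograms via E = m c^2."""
    return energy_gev * GEV_TO_JOULES / C ** 2

top_quark_kg = gev_to_kg(175.6)
print(f"{top_quark_kg:.2e} kg")  # ~3.13e-25 kg, comparable to a gold nucleus
```

A gold-197 nucleus weighs about 3.3e-25 kg, which is why the comparison in the article works.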

In the 30 years since, investigating the top quark has opened up new worlds of particle physics, and the discovery of the Higgs boson in 2012 revealed the two particles' close association. Now, scientists at CERN have been busy investigating the top quark's quantum properties, and have discovered that top quarks experience quantum entanglement like other elementary particles, despite their mass.

In the fall of 2023, the Toroidal LHC Apparatus (ATLAS) experiment discovered entanglement between two top quarks, and earlier this week, another CERN detector, the Compact Muon Solenoid (CMS), also detected quantum entanglement between top quarks, according to CERN. Specifically, the team discovered entanglement between the unstable top quark and its antimatter partner "across distances farther than what can be covered by information transferred at the speed of light," according to a press release. In the famous words of Albert Einstein, that is what's known as "spooky action at a distance."
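
The non-classical nature of such correlations is usually illustrated with the CHSH test: for a spin-singlet pair, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between measurements along angles a and b, giving a CHSH value of 2*sqrt(2), above the classical bound of 2. A toy calculation of that textbook result (not the CMS analysis, which reconstructs spin correlations from the quarks' decay products):

```python
import math

def singlet_correlation(angle_a, angle_b):
    """Quantum spin correlation E(a, b) = -cos(a - b) for a singlet pair."""
    return -math.cos(angle_a - angle_b)

# CHSH combination S = |E(a,b) - E(a,b') + E(a',b) + E(a',b')|
# with the standard angle choices that maximize the quantum value.
a, a_p = 0.0, math.pi / 2
b, b_p = math.pi / 4, 3 * math.pi / 4
s = abs(singlet_correlation(a, b) - singlet_correlation(a, b_p)
        + singlet_correlation(a_p, b) + singlet_correlation(a_p, b_p))
print(round(s, 3))  # 2.828, i.e. 2*sqrt(2), beyond the classical limit of 2
```

No classical local theory can push S above 2, which is what makes measuring such correlations between top quarks a genuine test of entanglement.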

To help illustrate this strange effect of quantum mechanics, Regina Demina from the University of Rochester, who was part of the original team that discovered the top quark in 1995, co-led a team that built the tracking device for finding the Higgs boson, and now leads the CMS team at the Large Hadron Collider at CERN, describes the idea in colorful terms in a Facebook video.

This kind of entanglement has been a hot topic when it comes to exploring quantum information and quantum computers, but top quarks can only be made in colliders. So, while they won't be used in these types of next-gen machines, the discovery of their entanglement could answer questions about the nature of this "spooky action at a distance": questions like whether the entanglement continues once a particle decays, and what eventually breaks it.

It's been a long journey of discovery when it comes to the top quark, and there are likely many more mysteries yet to uncover.

Darren lives in Portland, has a cat, and writes/edits about sci-fi and how our world works. You can find his previous stuff at Gizmodo and Paste if you look hard enough.


What is a ‘kugelblitze’ and why should you care? – Earth.com

For nearly seventy years, the corridors of astrophysics have echoed with the murmur of an exciting theory: the existence of kugelblitze.

These are not your average black holes. Rather than forming from the collapse of matter, they would be born from incredibly dense concentrations of light.

Kugelblitze have been proposed as a potential key to unlocking mysteries of the universe, such as dark matter, and, perhaps more enticingly, as a power source for the spaceships of the future.

However, this extraordinary theory has just hit a roadblock.

A formidable team of researchers from the University of Waterloo and the Universidad Complutense de Madrid, led by Eduardo Martín-Martínez, a professor of applied mathematics and mathematical physics, has established that kugelblitze might not be a reality in our universe.

Their compelling research, aptly titled "No black holes from light," is soon to be published in Physical Review Letters, following a preprint on arXiv.

The quantum realm and black holes share intriguing connections. Quantum mechanics governs the behavior of particles at the smallest scales, while black holes represent extreme gravity at cosmic scales.

Scientists believe quantum effects become important near a black holes center. Hawking radiation, a quantum phenomenon, causes black holes to slowly evaporate.

The black hole information paradox arises from conflicts between quantum theory and general relativity.

Researchers study black holes to better understand quantum gravity. Some theories propose black holes as gateways to other universes via quantum effects.

The relationship between these realms remains an active area of research in theoretical physics.

"The most commonly known black holes are those caused by enormous concentrations of regular matter collapsing under its own gravity," said Prof. Martín-Martínez, who is also affiliated with the Perimeter Institute for Theoretical Physics. "However, this prediction was made without considering quantum effects."

Trying to shed light on the matter, the team built a mathematical model incorporating quantum effects.

They found that the concentration of light needed to spawn a kugelblitz outpaces the light intensity found in quasars, the brightest objects in our cosmos, by tens of orders of magnitude.

"Long before you could reach that intensity of light, certain quantum effects would occur first," remarked José Polo-Gómez, a Ph.D. candidate in applied mathematics and quantum information.

"That strong of a concentration of light would lead to the spontaneous creation of particles like electron-positron pairs, which would move very quickly away from the area."

Though testing such effects on Earth isn't possible with current technologies, the team's confidence in their findings stems from the rock-solid principles of mathematics and science that also power positron emission tomography (PET) scans.

"Electrons and their antiparticles (positrons) can annihilate each other and disintegrate into pairs of photons, or light particles," Martín-Martínez explained.

"When there is a large concentration of photons, they can disintegrate into electron-positron pairs, which are quickly scattered away, taking the energy with them and preventing the gravitational collapse."
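
The field strength at which the vacuum starts producing electron-positron pairs is conventionally benchmarked by the Schwinger critical field, E_S = m_e^2 c^3 / (e * hbar); a quick calculation of that standard QED figure (our illustration, not a number from the paper):

```python
M_E = 9.1093837015e-31       # electron mass, kg
C = 2.99792458e8             # speed of light, m/s
E_CHARGE = 1.602176634e-19   # elementary charge, C
HBAR = 1.054571817e-34       # reduced Planck constant, J*s

# Electric field strength at which the vacuum spontaneously
# produces electron-positron pairs (Schwinger limit).
schwinger_field = M_E ** 2 * C ** 3 / (E_CHARGE * HBAR)
print(f"{schwinger_field:.2e} V/m")  # ~1.32e18 V/m
```

Fields of this order are far beyond anything achievable in the lab today, which is consistent with the team's point that pair creation would drain the energy long before a light-sourced horizon could form.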

While the sprint toward kugelblitze might have hit an unexpected speed bump, the research is a significant victory for fundamental physics.

This collaborative effort between applied mathematics, the Perimeter Institute, and the Institute for Quantum Computing at Waterloo is laying the foundation for future significant scientific breakthroughs.

"While these discoveries may not have known applications right now, we are laying the groundwork for our descendants' technological innovations," Polo-Gómez said.

"The science behind PET scan machines was once just as theoretical, and now there's one in every hospital."

As we conclude this enlightening journey, let's remember that science is not just about confirming theories but also about disproving them. Today's quantum quirks are the stepping stones for tomorrow's path-breaking technologies.

While kugelblitze may not have panned out the way they were imagined, they've undoubtedly illuminated a new direction for further exploration.

Whether its understanding the universe or developing technologies for future generations, every eureka moment counts, including those that remind us of what is not possible.

And in this vast expanse of what we know and what remains unknown, one fact remains resolute: on the journey of scientific discovery, there's never a dull moment.

A preprint of the full study is available on arXiv ahead of publication in Physical Review Letters.

