
Scientists are using quantum computing to help them discover signs of life on other planets – ZDNet

Scientists will use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.

Quantum computers are assisting researchers in scouting the universe in search of life outside of our planet -- and although it's far from certain they'll find actual aliens, the outcomes of the experiment could be almost as exciting.

Zapata Computing, which provides quantum software services, has announced a new partnership with the UK's University of Hull, which will see scientists use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.

During the eight-week program, quantum resources will be combined with classical computing tools to resolve complex calculations with better accuracy, with the end goal of finding out whether quantum computing could provide a useful boost to the work of astrophysicists, despite the technology's current limitations.

See also: There are two types of quantum computing. Now one company says it wants to offer both.

Detecting life in space is as tricky a task as it sounds. It all comes down to finding evidence of molecules that have the potential to create and sustain life -- and because scientists don't have the means to go out and observe the molecules for themselves, they have to rely on alternative methods.

Typically, astrophysicists pay attention to light, which can be analyzed through telescopes. This is because light -- for example, infrared radiation generated by nearby stars -- often interacts with molecules in outer space. And when it does, the particles vibrate, rotate, and absorb some of the light, leaving a specific signature on the spectral data that can be picked up by scientists back on Earth.
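As a rough illustration of where such a signature comes from (this example is not from the article), a diatomic molecule like carbon monoxide can be treated as a simple harmonic oscillator: an approximate bond stiffness and the atoms' masses predict the infrared frequency at which it absorbs. The force constant below is an approximate literature value, used only for the sketch.

```python
import math

# Toy harmonic-oscillator estimate of a molecular absorption line (illustrative,
# not taken from the article). Approximate values for carbon monoxide:
k = 1.86e3          # bond force constant, N/m (approximate)
amu = 1.6605e-27    # kg per atomic mass unit
mu = (12.0 * 16.0) / (12.0 + 16.0) * amu   # reduced mass of C-O, in kg

nu = math.sqrt(k / mu) / (2.0 * math.pi)   # vibrational frequency, Hz
wavenumber = nu / 2.998e10                 # cm^-1, the unit spectroscopists quote

print(f"CO stretch: ~{nu:.2e} Hz, ~{wavenumber:.0f} cm^-1 "
      f"(the observed CO stretch is about 2143 cm^-1)")
```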

Therefore, for researchers, all that is left to do is detect those signatures and trace back to which molecules they correspond.

The problem? MIT researchers have previously established that over 14,000 molecules could indicate signs of life in exoplanets' atmospheres. In other words, there is still a long way to go before astrophysicists have built a database of all the different ways that those molecules might interact with light -- of all the signatures that they should be looking for when pointing their telescopes at other planets.

That's the challenge that the University of Hull has set for itself: the institution's Centre for Astrophysics is effectively hoping to generate a database of detectable biological signatures.

For over two decades, explains David Benoit, senior lecturer in molecular physics and astrochemistry at the University of Hull, researchers have been using classical means to try and predict those signatures. Still, the method is rapidly running out of steam.

The calculations carried out by the researchers at the center in Hull involve describing exactly how electrons interact with each other within a molecule of interest -- think hydrogen, oxygen, nitrogen and so on. "On classical computers, we can describe the interactions, but the problem is this is a factorial algorithm, meaning that the more electrons you have, the faster your problem is going to grow," Benoit tells ZDNet.

"We can do it with two hydrogen atoms, for example, but by the time you have something much bigger, like CO2, you're starting to lose your nerve a little bit because you're using a supercomputer, and even they don't have enough memory or computing power to do that exactly."

Simulating these interactions with classical means, therefore, ultimately comes at the cost of accuracy. But as Benoit says, you don't want to be the one claiming to have detected life on an exoplanet when it was actually something else.

Unlike classical computers, however, quantum systems are built on the principles of quantum mechanics -- those that govern the behavior of particles when they are taken at their smallest scale: the same principles as those that underlie the behavior of electrons and atoms in a molecule.

This prompted Benoit to approach Zapata with a "crazy idea": to use quantum computers to solve the quantum problem of life in space.

"The system is quantum, so instead of taking a classical computer that has to simulate all of the quantum things, you can take a quantum thing and measure it instead to try and extract the quantum data we want," explains Benoit.

Quantum computers, by nature, could therefore allow for accurate calculations of the patterns that define the behavior of complex quantum systems like molecules without calling for the huge compute power that a classical simulation would require.

The data that is extracted from the quantum calculation about the behavior of electrons can then be combined with classical methods to simulate the signature of molecules of interest in space when they come into contact with light.

It remains true that the quantum computers that are currently available to carry out this type of calculation are limited: most systems don't break the 100-qubit count, which is not enough to model very complex molecules.

See also: Preparing for the 'golden age' of artificial intelligence and machine learning.

Benoit explains that this has not put off the center's researchers. "We are going to take something small and extrapolate the quantum behavior from that small system to the real one," says Benoit. "We can already use the data we get from a few qubits, because we know the data is exact. Then, we can extrapolate."
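The article does not describe the extrapolation scheme in detail, but the general idea (compute a quantity exactly for small systems, fit a trend, and estimate the larger target) can be sketched as follows. The numbers and the polynomial fit are purely illustrative assumptions, not the group's actual procedure.

```python
import numpy as np

# Toy illustration only: made-up "exact" results from small-system runs.
sizes = np.array([2, 3, 4, 5])                      # number of electrons treated exactly
energies = np.array([-0.04, -0.09, -0.15, -0.22])   # arbitrary units, invented values

# Fit a low-order polynomial in system size and extrapolate to a larger target.
coeffs = np.polyfit(sizes, energies, deg=2)
target_size = 16
estimate = np.polyval(coeffs, target_size)
print(f"Extrapolated estimate for {target_size} electrons: {estimate:.3f} (arbitrary units)")
```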

That is not to say that the time has come to get rid of the center's supercomputers, continues Benoit. The program is only starting, and over the course of the next eight weeks, the researchers will be finding out whether it is possible at all to extract those exact physics on a small scale, thanks to a quantum computer, in order to assist large-scale calculations.

"It's trying to see how far we can push quantum computing," says Benoit, "and see if it really works, if it's really as good as we think it is."

If the project succeeds, it could constitute an early use case for quantum computers -- one that could demonstrate the usefulness of the technology despite its current technical limitations. That in itself is a pretty good achievement; the next milestone could be the discovery of our exoplanet neighbors.


Quantum computing will break today’s encryption standards – here’s what to do about it – Verizon Communications

"When you come to the fork in the road, take it." -- Yogi Berra

For cryptologists, Yogi Berra's words have perhaps never rung more true. As a future with quantum computing approaches, our internet and stored secrets are at risk. The tried-and-true encryption mechanisms that we use every day, like Transport Layer Security (TLS) and Virtual Private Networks (VPN), could be cracked and exposed by a hacker equipped with a large enough quantum computer running Shor's algorithm, a powerful algorithm offering an exponential speedup over classical algorithms. The result? The security algorithms we use today, which would take roughly 10 billion years to break, could be cracked in as little as 10 seconds. To prevent this, it's imperative that we augment our security protocols, and we have two options to choose from: one using physics as its foundation, and one using math -- our figurative fork in the road.

To understand how to solve the impending security threats of the quantum era, we first need to understand the fundamentals of our current encryption mechanisms. The most commonly used in nearly all internet activities, TLS, is invoked any time someone performs an online activity involving sensitive information, like logging into a banking app, completing a sale on an online retailer's website, or simply checking email. It works by combining the data with a 32-byte key of random 1s and 0s in a complicated and specific way, so that the data is completely unrecognizable to anyone except the two parties sending and receiving it. This process is called public key encryption, and it currently leverages a few popular algorithms for key exchange, e.g., Elliptic-curve Diffie-Hellman (ECDH) or RSA (each named after cryptologists), each of which is vulnerable to quantum computers. The data exchange has two steps: the key exchange and the encryption itself. The encryption of the data with a secure key will still be safe, but the delivery of the key that unlocks that information (key distribution) will not be secure in the future quantum era.
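To make the key-exchange step concrete, here is a minimal sketch of an ECDH exchange followed by derivation of a 32-byte session key, using the widely available Python cryptography package. It illustrates the classical mechanism that Shor's algorithm threatens; it is not Verizon's or any particular TLS stack's implementation, though TLS 1.3 combines essentially these ingredients (an ephemeral key exchange plus a key-derivation function) inside its handshake.

```python
# Requires the "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral elliptic-curve key pair.
alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# They exchange public keys and each computes the same shared secret.
alice_shared = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_shared = bob_private.exchange(ec.ECDH(), alice_private.public_key())
assert alice_shared == bob_shared

# A key-derivation function turns the shared secret into the 32-byte
# symmetric key that actually encrypts the session data.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"illustrative tls-style session key",  # label is an arbitrary example
).derive(alice_shared)
print(len(session_key))  # 32
```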

To be ready for quantum computers, we need to devise a new method of key distribution, a way to safely deliver the key from one end of the connection to the other.

Imagine a scenario wherein you and a childhood friend want to share secrets, but can only do so once you each have the same secret passcode in front of you (and there are no phones). One friend has to come up with a unique passcode, write it down on a piece of paper (while keeping a copy for themselves), and then walk it down the block so the other has the same passcode. Once you and your friend have the shared key, you can exchange secrets (encrypted data) that even a quantum computer cannot read.

While walking down the block, though, your friend could be vulnerable to the school bully accosting him or her and stealing the passcode, and we can't let that happen. What if your friend lives across town, and not just down the block? Or, even more difficult, in a different country? (And where is that secret decoder ring we got from a box of sugar-coated-sugar cereal we ate as kids?)

In a world where global information transactions are happening nonstop, we need a safe way of delivering keys no matter the distance. Quantum physics can provide a way to deliver shared keys quickly, in large volume and, most importantly, immune to interception. Using fiber-optic cables (like the ones used by telecommunications companies), special Quantum Key Distribution (QKD) equipment can send tiny particles (or light waves) called photons to each party in the exchange of data. The sequence of the photons encodes the identity of the key, a random sequence of 1s and 0s that only the intended recipients can receive and use to construct the key.
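The article does not name a specific protocol, but the best-known QKD scheme, BB84, follows the pattern described: photons are prepared and measured in randomly chosen bases, and only the matching-basis results are kept as key bits. Below is a toy simulation of just that bookkeeping (no real photons, and the eavesdropper-detection step is omitted); function and variable names are illustrative.

```python
import secrets

def bb84_sift(n_photons: int = 32):
    """Toy BB84 bookkeeping: random bits and bases, keep only matching-basis bits."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_photons)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_photons)]

    # When Bob happens to measure in the same basis Alice used, he recovers her bit;
    # mismatched-basis results are discarded during the public "sifting" step.
    sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    return sifted_key

key = bb84_sift()
print(f"{len(key)} sifted key bits:", key)
```

In the full protocol the two parties also compare a random sample of sifted bits in public; an elevated error rate reveals the kind of tampering the no-cloning theorem guarantees an eavesdropper must introduce.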

Quantum Key Distribution also has a sort of built-in anti-hacker bonus. Because of the no-cloning theorem (which essentially states that, by their very nature, photons cannot be cloned), QKD also renders the identity of the key untouchable by any hacker. If an attacker tried to grab the photons and alter them, the tampering would automatically be detected, and the affected key material would be discarded.

The other way we could choose to solve the security threats posed by quantum computers is to harness the power of algorithms. Although it's true that the RSA and ECDH algorithms are vulnerable to Shor's algorithm on a suitable quantum computer, the National Institute of Standards and Technology (NIST) is working to develop replacement algorithms that will be safe from quantum computers as part of its post-quantum cryptography (PQC) effort. Some are already in the process of being vetted, like ones called McEliece, Saber, Crystals-Kyber, and NTRU.

Each of these algorithms has its own strong and weak points that NIST is working through. For instance, McEliece is one of the most trusted by virtue of its longstanding resistance to attack, but it is also handicapped by excessively long public keys that may make it impractical for small devices or web browsing. The other algorithms, especially Saber, run very well on practically any device, but, because they are relatively new, cryptographers' confidence in them is still relatively low.

With such a dynamic landscape of ongoing efforts, there is promise that a viable solution will emerge in time to keep our data safe.

The jury is still out. We at Verizon -- and most of the world -- rely heavily on e-commerce to sell our products and on encryption to communicate via email, messaging, and cellular voice calls. All of these need secure encryption technologies in the coming quantum era. But whether we choose pre-shared keys (implemented by the awesome photon) or algorithms that further leverage mathematics, our communications software will need updating. And because the post-quantum cryptography effort is relatively new, it is not yet clear which algorithms will withstand scrutiny from the cryptographic community. In the meantime, we continue to peer down each fork in the road to seek the best option to take.


A novel way to heat and cool things – The Economist

Oct 7th 2021

REFRIGERATORS AND air-conditioners are old and clunky technology, and represent a field ripe for disruption. They consume a lot of electricity. And they generally rely on chemicals called hydrofluorocarbons which, if they leak into the atmosphere, have a potent greenhouse-warming effect. Buildings' central-heating systems, meanwhile, are often powered by methane in the form of natural gas, which releases carbon dioxide, another greenhouse gas, when it is burned, and also has a tendency to leak from the pipes that deliver it -- which is unfortunate, because methane, too, is a greenhouse gas, and one much more potent than CO2.


One potential way of getting around all this might be to exploit what is known as the thermoelectric effect, a means of carrying heat from place to place as an electric current. Thermoelectric circuits can be used either to cool things down, or to heat them up. And a firm called Phononic, based in Durham, North Carolina, has developed a chip which does just that.

The thermoelectric effect was discovered in 1834 by Jean Charles Peltier, a French physicist. It happens in an electrical circuit that includes two materials of different conductivity. A flow of electrons from the more conductive to the less conductive causes cooling. A flow in the other direction causes heating.

The reason for this is that electrons are able to vibrate more freely when pushed into a conductive material. They thereby transfer energy to their surroundings, warming them. When shunted into a less conductive one, electrons' vibrations are constrained, and they absorb energy from their surroundings, cooling those surroundings down. An array of thermoelectric circuits built with all the high-conductivity materials facing in one direction and all the low-conductivity ones in the other can thus move heat in either direction, by switching the polarity of the current. For reasons buried in the mathematics of quantum physics, the heat thus flowing does so in discrete packages, called phonons. Hence the name of the firm.
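The article stays qualitative, but the standard lumped-parameter model of a thermoelectric (Peltier) cooler makes the trade-off explicit: heat pumped from the cold side grows linearly with current, while resistive heating works against it with the square of the current. Here is a sketch using that textbook model; the module parameters are illustrative assumptions, not Phononic's figures.

```python
# Standard thermoelectric-cooler model: heat pumped from the cold side is the
# Peltier term minus half the Joule heating minus conduction back across the module:
#   Q_cold = S * T_cold * I - 0.5 * I**2 * R - K * (T_hot - T_cold)
# Parameter values below are illustrative only.

S = 0.05        # Seebeck coefficient of the module, V/K
R = 1.5         # electrical resistance, ohms
K = 0.02        # thermal conductance, W/K
T_cold = 285.0  # cold-side temperature, K
T_hot = 300.0   # hot-side temperature, K

def heat_pumped(current_a: float) -> float:
    """Net heat lifted from the cold side (watts) at a given drive current."""
    return S * T_cold * current_a - 0.5 * current_a**2 * R - K * (T_hot - T_cold)

for i in (0.5, 1.0, 2.0, 4.0):
    print(f"I = {i:.1f} A -> Q_cold = {heat_pumped(i):.2f} W")
```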

The thermoelectric effect works best when the conductors involved are actually semiconductors, with bismuth and tin being common choices. Fancy cameras contain simple cooling chips which use these, as do some scientific instruments. But Phononic's boss, Tony Atti, thinks that is small beer. Using the good offices of Fabrinet, a chipmaker in Thailand, he has started making more sophisticated versions at high volume, using the set of tools and techniques normally employed to etch information-processing circuits onto wafers made of silicon. In this case, though, the wafers are made of bismuth.

The results are, admittedly, still a long way from something that could heat or cool a building. But they are already finding lucrative employment in applications where space is at a premium. At the moment, the fastest-growing market is cooling the infrared lasers used to fire information-encoding photons through fibre-optic cables, for the long-distance transmission of data. They are also being used, though, in the 5G mobile-phone base stations now starting to blanket street corners, to keep the batteries of electric vehicles at optimal operating temperatures, and as components of the optical-frequency radar-like systems known as LIDAR, that help guide autonomous vehicles.

The crucial question from Mr Atti's point of view is whether semiconductor-based thermoelectronics can break out of these niches and become more mainstream, in the way that semiconductor-based electronics and lighting have done. In particular, he would like to incorporate heat-pumping chips into buildings, to provide them with integral thermoregulation.

In their current form, thermoelectric chips are unlikely to replace conventional air conditioning and central heating because they cannot move heat over the distances required to pump it in and out of a building in bulk. But they could nonetheless be used as regulators. Instead of turning a big air-conditioning system on or off, to lower or raise the temperature by the small amounts required to maintain comfort, with all the cost that entails, thermoelectric chips might tweak matters by moving heat around locally.

Phononic has already run trials of such local-temperature-tweaking chips in Singapore, in partnership with Temasek, that country's state-run investment fund. In 2019 SP Group, Singapore's utility company, installed eight of the firm's heat pumps, which comprise an array of chips pointed down at people, pumping heat out of the air above them, on the boardwalk at Clarke Quay in the city. Phononic claims the devices lowered the temperature in their vicinity by up to 10°C and, as a bonus, consequently reduced humidity by 15%. If that can be scaled up, it would certainly be a cool result.

This article appeared in the Science & technology section of the print edition under the headline "Cool thinking"


New Fundamental Limit of Trapping and Exploiting Light at the Nanoscale – SciTechDaily

Metasurface of split-ring resonators, partially overlaid with 3D colourmaps showing the simulated electric-field distribution. High-momentum magnetoplasmons lead to the break-down of polaritons (blue spheres with photon energies in red). Credit: Urban Senica, ETH Zurich

Physicists from the University of Southampton and ETH Zürich have reached a new threshold of light-matter coupling at the nanoscale.

The international research, published recently in Nature Photonics, combined theoretical and experimental findings to establish a fundamental limitation of our ability to confine and exploit light.

The collaboration focused on photonic nano-antennas fabricated at ever-smaller sizes on top of a two-dimensional electron gas. The setup is commonly used in laboratories all over the world to explore the effects of intense electromagnetic coupling, taking advantage of the antennas' ability to trap and focus light close to electrons.

Professor Simone De Liberato, Director of the Quantum Theory and Technology group at the University of Southampton, says: "The fabrication of photonic resonators able to focus light in extremely small volumes is proving a key technology which is presently enabling advances in fields as different as material science, optoelectronics, chemistry, quantum technologies, and many others.

"In particular, the focussed light can be made to interact extremely strongly with matter, making electromagnetism non-perturbative. Light can then be used to modify the properties of the materials it interacts with, thus becoming a powerful tool for material science. Light can be effectively woven into novel materials."

Scientists discovered that light could no longer be confined in the system below a critical dimension, of the order of 250nm in the sample under study, when the experiment started exciting propagating plasmons. This caused waves of electrons to move away from the resonator and spill the energy of the photon.

Experiments performed in the group of Professors Jérôme Faist and Giacomo Scalari at ETH Zürich had obtained results that could not be interpreted with the state-of-the-art understanding of light-matter coupling. The physicists approached Southampton's School of Physics and Astronomy, where researchers led the theoretical analysis and built a novel theory able to quantitatively reproduce the results.

Professor De Liberato believes the newfound limits could yet be exceeded by future experiments, unlocking dramatic technological advances that hinge on ultra-confined electromagnetic fields.

Read "Exploring the Quantitative Limits of Light-Matter Coupling at the Nanoscale" for more on this research.

Reference: "Polaritonic nonlocality in light-matter interaction" by Shima Rajabali, Erika Cortese, Mattias Beck, Simone De Liberato, Jérôme Faist and Giacomo Scalari, 9 August 2021, Nature Photonics. DOI: 10.1038/s41566-021-00854-3


The best way to win a Nobel is to get nominated by another laureate – The Economist

Oct 9th 2021

THE NOBEL prizes, whose winners are announced this month, may be the world's most coveted awards. As soon as a new crop of laureates is named, critics start comparing the victors' achievements with those of previous winners, reigniting debates over past snubs.


A full account of why, say, Stephen Hawking was passed over will have to wait until 2068: the Nobel Foundation's rules prevent disclosure about the selection process for 50 years. But once this statute of limitations ends, the foundation reveals who offered nominations, and whom they endorsed. Its data start in 1901 and end in 1953 for medicine; 1966 for physics, chemistry and literature; and 1967 for peace. (The economics prize was first awarded in 1969.)

Nomination lists do not explain omissions like Leo Tolstoy (who got 19 nominations) or Mahatma Gandhi (who got 12). But they do show that in 1901-66, Nobel voters handed out awards more in the style of a private members' club than of a survey of expert opinion. Whereas candidates with lots of nominations often fell short, those with the right backers -- like Albert Einstein or other laureates -- fared better.

The bar to a Nobel nomination is low. For the peace prize, public officials, jurists and the like submit names to a committee, chosen by Norway's parliament, that picks the winner. For the others, Swedish academies solicit names from thousands of people, mostly professors, and hold a vote for the laureate. On average, 55 nominations per year were filed for each prize in 1901-66.

Historically, voters paid little heed to consensus among nominators. In literature and medicine, the candidate with the most nominations won just 11% and 12% of the time, respectively; in peace and chemistry, the rates were 23% and 26%. Only in physics, at 42%, did nomination leaders have a big advantage. In 1956 Ramón Menéndez Pidal, a linguist and historian, got 60% of nominations for the literature prize, but still lost.

However, voters did make one group of nominators happy: current and future laureates. Candidates put forward by past victors went on to win at some point in the future 40% more often than did those whose nominators never won a Nobel. People whose nominators became laureates later on also won unusually often. This implies that being accomplished enough to merit future Nobel consideration was sufficient to gain extra influence over voters.
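The Economist does not publish its method, but the comparison behind that figure can be sketched: group candidates by whether any of their nominators was (or later became) a laureate, then compare eventual win rates. Below is a hypothetical pandas sketch; the rows are made up and the column names are assumptions, not the Nobel Foundation's actual schema.

```python
import pandas as pd

# Hypothetical rows standing in for the Nobel Foundation's nomination archive.
nominations = pd.DataFrame({
    "candidate": ["A", "A", "B", "C", "D", "E"],
    "nominator_is_laureate": [True, False, False, True, False, True],
    "candidate_won": [True, True, False, True, False, False],
})

# Collapse to one row per candidate: ever backed by a laureate, and ever won?
# (max over booleans behaves like a logical "any".)
per_candidate = nominations.groupby("candidate").agg(
    backed_by_laureate=("nominator_is_laureate", "max"),
    won=("candidate_won", "max"),
)

# Compare win rates for the two groups; the article's "40% more often"
# corresponds to the ratio of these two rates being roughly 1.4.
win_rates = per_candidate.groupby("backed_by_laureate")["won"].mean()
print(win_rates)
```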

In theory, this imbalance could simply reflect laureates nominating stronger candidates. However, at least one Nobel winner seems to have boosted his nominees' chances, rather than merely naming superstars who would have won anyway.

According to the Nobel Foundation's online archive, all 11 of Einstein's nominees won a prize. Some were already famous, like Max Planck; others, like Walther Bothe, were lesser known. In two cases, his support seems to have been decisive.

In 1940 Einstein supported Otto Stern, a physicist who had already had 60 nominations. Stern won the next time the prize was given. Similarly, Wolfgang Pauli, whose exclusion principle is central to quantum mechanics, had received 20 nominations before Einstein backed him in 1945. He got his prize that same year.

Source: Nobel Foundation

This article appeared in the Graphic detail section of the print edition under the headline "Noblesse oblige"


Top 10 Quantum Computing Workshop and Conferences to Attend in 2021 – Analytics Insight

Quantum computing is a type of computation that harnesses the collective properties of quantum states, such as superposition, interference, and entanglement, to perform calculations. To discuss the future of the field, a number of workshops and conferences are taking place in 2021 that are well worth attending.

Here are the top ten quantum computing workshops and conferences:

IEEE Quantum Week -- the IEEE International Conference on Quantum Computing and Engineering (QCE) -- is bridging the gap between the science of quantum computing and the development of an industry surrounding it. IEEE Quantum Week is a multidisciplinary quantum computing and engineering venue that gives attendees the unique opportunity to discuss challenges and opportunities with quantum researchers, scientists, engineers, entrepreneurs, and more.

The International Conference on Quantum Communication ICQOM 2021 will take place at the Jussieu campus in Paris, France from the 18th to the 22nd of October 2021. The scope of the conference is focused on Quantum Communications, including theoretical and experimental activities related to Quantum Cryptography and Quantum Networks in a broad sense.

Quantum Techniques in Machine Learning (QTML) is an annual international conference focusing on the interdisciplinary field of quantum technology and machine learning. The goal of the conference is to gather leading academic researchers and industry players to interact through a series of scientific talks focused on the interplay between machine learning and quantum physics.

The 23rd Annual SQuInT Workshop is co-organized by the Center for Quantum Information and Control (CQuIC) at the University of New Mexico (UNM) and the Oregon Center for Optical Molecular and Quantum Science (OMQ) at the University of Oregon (UO). The last date of registration is October 11, 2021.

Keysight World 2021 will be held as a virtual conference. As part of a track focusing on "Driving the Digital Transformation", there will be a session titled "Pushing the Envelope on Quantum Computing" that will include panel sessions with authorities from Rigetti, Google, IQC, and Keysight.

The Quantum Startup Foundry at the University of Maryland will be holding an Investment Summit for quantum startups to showcase their companies to potential investors on October 12-13, 2021. The focus of the event is to link the most promising early- and growth-stage companies with investors and to inform key stakeholders about the unique aspects of investing in quantum.

The Inside Quantum Technology (IQT) Fall Conference will be held as a hybrid conference, both in-person and online, in New York City. The conference will be a gathering of business leaders, product developers, marketing strategists, and investors anywhere in the world focused on quantum technology.

The annual Chicago Quantum Summit engages scientific and government leaders, the industries that will scale and drive the applications of emerging quantum research, and the trainees that will lead this future. Focusing on fostering a domestic and international community, experts discuss the future of quantum information science and technology research, the companies in the quantum ecosystem, and strategies to educate and build tomorrow's quantum workforce.

The Quantum Computing Summit Silicon Valley, organized by Informa Tech, will take place on November 3-4, 2021. It will run alongside the AI Summit and has been designed to provide business, technical, research, academic, and innovation insight, grounded in application-based quantum experiences that showcase how quantum is delivering real business value, driving process efficiency, and optimizing costs.

The Optical Society (OSA) will hold its Quantum Information and Measurement VI as a virtual conference. The conference topics will cover the latest theoretical developments and experimental implementations of quantum information technology, including the advanced engineering needed to realize such technologies, in addition to the conference's traditional focus on quantum optics.


A key part of the Big Bang remains troublingly elusive – Popular Science

It all started with a bang. During an unimaginably brief fraction of a second, the embryonic universe ballooned in size with unimaginable swiftness. In a flash, dimples of imperfection stretched into cosmic scars and locked in the universe we experience today, a milieu filled with galaxies, stars, planets, and humans.

The circumstantial evidence for this origin story, known as inflation, is overwhelming. It has inspired a generation of cosmologists to write papers, teach classes, and publish textbooks about the sundry ways inflation could have played out. And yet, a smoking gun remains elusive: Ancient ripples in spacetime should have left a particular imprint on the sky, but searches have repeatedly come up short.

A group of astronomers known as the BICEP/Keck collaboration leads the hunt for these primordial gravitational waves. On Monday the researchers released their latest results, the culmination of years of painstaking labor in one of the harshest places on Earth. Once more, they found no sign of their quarry. If an inflating universe reverberated with gravitational waves -- as most cosmologists still fully expect it did -- it must have done so in a rather subtle way.

"The simplest flavors, we are right now ruling out," says Clem Pryke, an astrophysicist at the University of Minnesota and member of the BICEP/Keck collaboration. "[This result] is killing previously very popular theories of inflation."

[Related: What did the universe look like just after the big bang?]

The oldest light in the universe has been streaming through space for more than 13 billion years, ever since the cosmos cooled enough to become transparent. Astronomers have precisely mapped this Cosmic Microwave Background (CMB) and used it to learn that the universe was, and has remained, strikingly uniform overall. The CMB indicates that when the universe was just 380,000 years old, it had nearly the same density of matter everywhere. And today, astronomers see galaxies in every direction.

But the CMB is ever so slightly clumpy, and it clumps in a special way. Dense and thin spots come in all sizes, from very small to very large. Today we see a related pattern, from single galaxies to giant mega clusters of them.

How did the universe get this way? Inflation winds back the clock even further, attempting to explain how lumps of all sizes developed during the cosmos's first 0.00000000000000000000000000000000001 of a second. During this period, the minuscule universe seethed with energy, and quantum theory ruled the day. In the quantum realm, nothing holds perfectly steady. Subatomic jitters continuously introduced tiny flaws into the inflating universe, tweaking the density of substances that would eventually become light, matter, dark matter, and more. The growing universe continuously stretched these blemishes, even as newer, smaller fluctuations kept appearing, resulting in blips of all sizes. Eventually, the CMB recorded the final product. "Inflation naturally produces lumpiness of exactly that type," Pryke says.

Or so the story goes. Inflation has become the leading theory of the birth of the cosmos because it explains exactly what astronomers see when they study the large-scale patterns formed by matter, dark matter, and more.

But one pattern has eluded them. The fabric of spacetime itself cannot hold perfectly still at the quantum scale, and inflation should have stretched those initial tremors into proper waves just as it did with matter and everything else. These primordial gravitational waves would have left faint fingerprints in the CMB, specific whorls in the light known as B-mode polarization. Astronomers have the capability to directly detect these whorls today, if the pattern is prominent enough, but have yet to find any.

Frustratingly, the universe is awash in materials that shine in a similarly whirly way. The BICEP team triumphantly announced the discovery of primordial gravitational waves in 2014, for instance, only to later learn that they had picked up the dim heat glow of dust grains streaming along the magnetic fields that fill the space between stars in the Milky Way.

The BICEP/Keck collaboration has now spent years refining their methods and building a series of telescopes at the south pole, where the crisp and arid air offers a crystal-clear view of the cosmos. Their newest results blend data from the last three generations of their Antarctic telescopes with other experiments.

For more than a decade, they have increased the number of sensors from dozens to thousands. And crucially, they have expanded the set of colors in which they observe, from one wavelength to three. Any B-mode swirls in the CMB, which fills the entire universe, should show up evenly in all wavelengths. Polarization that comes through more strongly at different wavelengths, however, can be blamed on local dust.

The key measure of how much inflation rattled the universe goes by the name tensor-to-scalar ratio, or r to those in the field. This single number describes how forcefully spacetime rippled compared to other fluctuations. An r of zero would imply that inflation didn't rock the fabric of the cosmos at all, suggesting that cosmology textbooks might need to rip out their first chapter.
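For reference (the article does not spell this out), r is conventionally defined as the ratio of the amplitudes of the primordial tensor (gravitational-wave) and scalar (density) power spectra, evaluated at a chosen pivot scale:

```latex
r \;\equiv\; \frac{A_t}{A_s}
  \;=\; \frac{\mathcal{P}_t(k_*)}{\mathcal{P}_{\mathcal{R}}(k_*)},
\qquad \text{with limits such as } r < 0.036 \text{ commonly quoted at } k_* = 0.05\ \mathrm{Mpc}^{-1}.
```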

BICEP/Keck observations have successively lowered the ceiling for primordial gravitational waves, showing that r should be smaller than 0.09 in 2016 and less than 0.07 in 2018. With the latest results, published in Physical Review Letters, the collaboration states with 95 percent confidence that r should be less than 0.036, a value that makes one commonly studied class of inflationary models impossible.

The shrinking limit for gravitational waves has obliged theorists to crouch lower and lower, but plenty of riffs on the general theme of inflation still fit comfortably below the new BICEP/Keck roof. The situation is getting cozy though, and if the limit falls below 0.01, many inflation researchers will start to sweat.

"It's pretty hard to get a value less than that in any basic textbook model of inflation," theorist Marc Kamionkowski of Johns Hopkins University told Physics Today in 2019.

[Related: Wait a second: What came before the big bang?]

The nearly hundred members of the BICEP/Keck collaboration plan to reach that level of precision in a matter of years. They are currently building a new array of four telescopes, which should allow for a measurement of r up to three times as precise. By the end of the decade, a mega-collaboration between BICEP/Keck and other CMB teams known as CMB-S4 should get a few times more sensitive still, limiting r to roughly 0.001.

Many cosmologists hope that primordial gravitational waves will show up in one of these ever-sharper images of the CMB, proving that theorists really do have a handle on the universes initial bang. If not, the theory may languish in limbo a bit longer. It would take an r ten times lower still to cull most straggler inflationary models, and that would require experimentalists like Pryke to once again come up with even better ways to measure the nearly imperceptible ripples theorists forecast.

"From an experimental point of view, it just seems unobtainable," he says. "But when I got into the business 20 years ago, measuring B modes at all seemed ridiculous."


New chess engine: Fisherov 0.98c NNUE

Fisherov - UCI chess engine (NNUE). Rating JCER = 3542

Fisherov is a chess engine derived from Stockfish. Code has been added in various places to achieve a different style, which becomes more evident when its neural networks are deactivated (although this causes it to lose significant strength, it gains in aggressiveness, which can be interesting for human training or for matches against other non-neural chess engines).

For greater robustness and competitiveness, the neural network should be left activated (as it is by default). The engine already has a network built in, although another network can also be selected from the UCI options.

This engine was developed by Luis Damián Primo and Andrés Iván Primo, residents of the province of Córdoba, Argentina.

Meaning of the name: Fisherov = Stock"fish" + Fisch"er" + Kaspar"ov".

Fisherov 0.98c download - the engine is not publicly available and is non-commercial.

"Due to a request from the Stockfish team in doubt about compliance with the GNU General Public License, Version 3 (GPL) of the Fisherov engine will not be available to the public. Until the situation is clarified, the engine will be used for testing purposes only and the results of the tournaments with the Fisherov engine can be found on our website.

At the same time, we ask people who have downloaded previous versions of the Fisherov engine not to share the files with other people.

Chess Engines Diary Team"


RIT professor and team discover new method to measure motion of superfluids – RochesterFirst

HENRIETTA, N.Y. (WROC) -- According to the Rochester Institute of Technology, Mishkat Bhattacharya, an associate professor at RIT's School of Physics and Astronomy and Future Photon Initiative, has proposed a new method for detecting superfluid motion in an article published in Physical Review Letters.

Bhattacharya's theoretical team on the paper consisted of RIT postdoctoral researchers Pardeep Kumar and Tushar Biswas, and alumnus Kristian Feliz '21 (physics). The international collaborators were professors Rina Kanamoto of Meiji University, Ming-Shien Chang of Academia Sinica, and Anand Jha of the Indian Institute of Technology. Bhattacharya's work was supported by a CAREER Award from the National Science Foundation.

The laser is shined through the superfluid in a minimally destructive manner, and the system can then read how the superfluid and light react, so the subatomic movements can be observed and studied.

This new research represents the first time that scientists will be able to get a closer look at how this seemingly physics-defying liquid moves. As scientists come to understand this wonder-liquid, they can start to harness it for incredibly efficient power generation.

Bhattacharya's measuring method can also be used in quantum information processing.

So, clearly, there's a lot going on here. Let's break it down.

A superfluid is a gas or a liquid that can move without viscosity, or internal friction.

"This means that the particles don't jostle each other," Bhattacharya said. "They're not elbowing each other, or colliding."

Water, for example, has a very low viscosity. It's easy to imagine how quickly and smoothly water flows compared to a highly viscous fluid, like maple syrup.

It's difficult to imagine, but a superfluid has zero internal friction.

This means that it slows down at an incredibly slow rate, so once the gas or liquid is set in motion, it's nearly impossible to stop. It also means that this movement of the particles doesn't lose energy the way other frictional processes do.

Slamming the brakes on your car introduces a lot of friction, and everyone knows that a lot of sound and heat is given off. That is the release of energy when friction is applied. Superfluids don't have this.

This unusual trait can be harnessed practically if an electrical current is applied to it.

"If you can get something to flow, it's like current going around in a circle," Bhattacharya said.

That means that unlike normal electrical circuits, which get incredibly hot when they are used to capacity, these atomtronic circuits with superfluid don't.

"We don't really understand the physics of this," Bhattacharya said.

Part and parcel with this lack of understanding is that the only known superfluids, like liquid helium, only reach that state when they are supercooled. Needless to say, our cell phones would be massive and unusable if they needed a supercooler to run.

Bhattacharya says that if someone can discover a superfluid that works at room temperature, they would not only win a Nobel prize, but they would revolutionize technology as we know it.

He says tests in Germany have shown that this technology -- admittedly with the supercooled superfluid -- can power entire towns in an economically feasible way.

"You'd have to ask a computer scientist," Bhattacharya said when asked how much more powerful our phones would get. "But it would be reasonable to say one hundred or one thousand times more powerful."

The challenge in creating a room-temperature superfluid is that, as Bhattacharya alluded to, physicists don't quite understand how superfluids really work, beyond the visually observable macro effects, like seeing one loop indefinitely in a closed circuit, or the creep effect of liquid helium.

But to begin to understand how superfluids work, Bhattacharya and his team decided they needed a way to measure its subatomic movement, using quantum physics.

"If you think about the particles of which the fluid is made as little balls, it is impossible to explain what it is without realizing that it also acts as a wave," he said.

So since an electrically charged superfluid circuit acts more like a wave than a particle, because of its lack of friction and its electrical charge, it becomes a quantum object. That means even the incredibly weak pressure force of a light wave will destroy the object, making it impossible to observe.

Bhattacharya and his team worked their way around this problem by calibrating their laser light source to a different wavelength than the superfluid they are observing.

This minimally destructive method allows them to observe the incredibly small effect that the laser has, and by studying that wiggle, they can begin to determine how superfluid moves.

Once they understand how it moves, they can begin to figure out how to engineer a superfluid that stays in that state at room temperature.

Interestingly, this measuring method also has that application.

Scientists have begun to encode information on a particular kind of quantum particle that wiggles in a particular way. That can not only store vast amounts of information, but also move it at the speed of light.

Bhattacharya says that fiber optics does this in some manner, but the optics are too impure to move at that speed, and even the most advanced fiber optics need signal boosters and repeaters at fairly regular intervals.

While quantum light processing may need those as well, the information would still be moving at the speed of light.

But this measuring technology can begin to more precisely measure the quantum wiggle, allowing researchers to figure out how to store the information for longer.

"It started at 1 millionth of a second," Bhattacharya said. "It's now up to 60 seconds of storage. That's seven orders of magnitude greater."

With Bhattacharya and his team on it, we may have those 1,000 times stronger cell phones in no time.


Eric Appel: Gels are changing the face of engineering … and medicine – Stanford University News

Readers of Eric Appel's academic profile will note appointments in materials science, bioengineering and pediatrics, as well as fellowship appointments in the ChEM-H institute for human health research and the Woods Institute for the Environment. While the breadth of these appointments may not leap to mind as being particularly consistent, the connections quickly emerge for those who hear Appel talk about his research.

Appel is an expert in gels, those wiggly, jiggly materials that aren't quite solid, but not quite liquid either. Gels' in-betweenness is precisely what gets engineers like Appel excited about them. Appel has used gels for everything from new-age fire retardants that can proactively prevent forest fires to improved drug and vaccine delivery mechanisms for everything from diabetes to COVID-19. Hence the appointments across engineering and medicine.

Listen in with host and bioengineer Russ Altman as Appel explains to Stanford Engineering's The Future of Everything podcast why gels could be the future of science. Listen and subscribe here.

Russ Altman, the Kenneth Fong Professor of Bioengineering, of genetics, of medicine (general medical discipline), of biomedical data science and, by courtesy, of computer science.

Eric Appel, Assistant Professor of Materials Science and Engineering and, by courtesy, of Pediatrics (Endocrinology), and Center Fellow, by courtesy, at the Woods Institute for the Environment
