
The Future of Computing: Hype, Hope, and Reality – CIOReview

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law (the exponential increase in the power of computers over the last several decades) have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that both software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors, each optimized for specialized tasks. Writing software that takes full advantage of these new chips is extremely challenging, and so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were nicely designed for doing the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.

Virtually all the progress in computing over the past 30 years has been thanks to hardware, not software

Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to desktop and laptop distributed computers. With the development of a high-speed Internet, the thinking shifted, and an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.
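As a minimal sketch of that filtering idea (a hypothetical anomaly threshold and a stand-in send_to_cloud function, not any vendor's actual edge stack), an edge node might forward only readings that deviate from the recent baseline:

```python
from statistics import mean, stdev

def is_anomalous(reading, baseline, z_threshold=3.0):
    """Flag a reading that deviates strongly from a recent baseline window."""
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and abs(reading - mu) / sigma > z_threshold

def send_to_cloud(reading):
    # Stand-in for an uplink call; in practice this would be an MQTT or HTTP publish.
    print(f"uploading anomalous reading: {reading}")

baseline = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1]   # recent "normal" sensor values
for new_reading in [20.2, 35.7, 20.1]:
    if is_anomalous(new_reading, baseline):
        send_to_cloud(new_reading)                 # only the spike leaves the device
```

Everything else stays local, which is exactly the latency, bandwidth, and storage saving described above.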

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate on analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate values. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as the imminent cyber-apocalypse where robots rebel against their human masters and take over the world, we are a long way away from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to create an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neuro-biology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very high-power computing on very small semiconductor chips.

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.


Rare magnetism found in the world’s strongest material – Live Science

Graphene, one of the world's strongest materials, isn't normally magnetic. But when stacked and twisted, graphene develops a rare form of magnetism, new research finds.

The magnetic field isn't created by the usual spin of electrons within the individual graphene layers, but instead arises from the collective swirling of electrons in all three layers of the stacked graphene structure, researchers reported Oct. 12 in the journal Nature Physics.

Graphene is a material made of a single layer (or monolayer) of carbon atoms arranged in a honeycomb pattern. It's incredibly light and strong (though it is vulnerable to cracking). It also conducts electricity, making it exciting for use in electronics and sensors.


"We wondered what would happen if we combined graphene monolayers and bilayers into a twisted three-layer system," Cory Dean, a physicist at Columbia University in New York and one of the senior authors on the new paper, said in a statement. "We found that varying the number of graphene layers endows these composite materials with some exciting new properties that had not been seen before."

Dean and his colleagues stacked two layers of graphene and then added a single layer on top, rotating the stack by 1 degree. They then studied this graphene sandwich in a variety of circumstances, including temperatures just above absolute zero (the point at which all molecular motion stops). At these low temperatures, they found that the graphene stopped conducting electricity and became an insulator instead.

They also found that they could control the properties of the twisty stack of graphene by applying an electric field. When the electric field was oriented in one direction, the system acted like a twisted double layer of graphene. When they reversed the field, the stack took on the properties of a twisted four-layer graphene structure.

Perhaps strangest of all was the rare magnetism that appeared in the three-layer structure. A study published by another group in the journal Advanced Materials found that graphene bonded with boron nitride can give rise to a strange magnetic field; that field arose from the molecular bonds of the carbon in graphene and the boron in boron nitride. The new research reveals that this same type of magnetism can occur in pure graphene alone, simply because of interactions between carbon molecules.

"Pure carbon is not magnetic," study co-author Matthew Yankowitz, a physicist at the University of Washington in Seattle, said in the statement. "Remarkably, we can engineer this property by arranging our three graphene sheets at just the right twist angles."

The structure also contains regions where the properties are undisturbed by the twisting of the layer. These unique areas in the material could be exploited for data storage or quantum computing applications, study co-author Xiaodong Xu, also at the University of Washington, said in the statement.

The researchers are now planning to delve deeper into the fundamental properties of the graphene structure. "This is really just the beginning," Yankowitz said.

Originally published on Live Science.


Room-temperature superconductivity has been achieved for the first time – MIT Technology Review

Room-temperature superconductors, materials that conduct electricity with zero resistance without needing special cooling, are the sort of technological miracle that would upend daily life. They could revolutionize the electric grid and enable levitating trains, among many other potential applications. But until now, superconductors have had to be cooled to extremely low temperatures, which has restricted them to use as a niche technology (albeit an important one). For decades it seemed that room-temperature superconductivity might be forever out of reach, but in the last five years a few research groups around the world have been engaged in a race to attain it in the lab.

One of them just won.

In a paper published today in Nature, researchers report achieving room-temperature superconductivity in a compound containing hydrogen, sulfur, and carbon at temperatures as high as 58 °F (13.3 °C, or 287.7 K). The previous highest temperature had been 260 K, or 8 °F, achieved by a rival group at George Washington University and the Carnegie Institution in Washington, DC, in 2018. (Another group at the Max Planck Institute for Chemistry in Mainz, Germany, achieved 250 K, or -9.7 °F, at around this same time.) Like the previous records, the new record was attained under extremely high pressures, roughly two and a half million times greater than that of the air we breathe.

"It's a landmark," says José Flores-Livas, a computational physicist at the Sapienza University of Rome, who creates models that explain high-temperature superconductivity and was not directly involved in the work. "In a couple of years," he says, "we went from 200 [K] to 250 and now 290. I'm pretty sure we will reach 300."

Electric currents are flowing electric charges, most commonly made up of electrons. Conductors like copper wires have lots of loosely bound electrons. When an electric field is applied, those electrons flow relatively freely. But even good conductors like copper have resistance: they heat up when carrying electricity.
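As a back-of-the-envelope reminder of why that heating matters (standard circuit physics, not a figure from the article), the power dissipated in a conductor carrying current $I$ through resistance $R$ is

$$ P_{\text{loss}} = I^{2} R, $$

so a superconductor, with $R = 0$, carries the same current with no dissipation at all.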

Superconductivity, in which electrons flow through a material without resistance, sounds impossible at first blush. It's as though one could drive at high speed through a congested city center, never hitting a traffic light. But in 1911, Dutch physicist Heike Kamerlingh Onnes found that mercury becomes a superconductor when cooled to a few degrees above absolute zero (about -460 °F, or -273 °C). He soon observed the phenomenon in other metals like tin and lead.

For many decades afterwards, superconductivity was created only at extremely low temperatures. Then, in late 1986 and early 1987, a group of researchers at IBM's Zurich laboratory found that certain ceramic oxides can be superconductors at temperatures as high as 92 K, crucially over the boiling temperature of liquid nitrogen, which is 77 K. This transformed the study of superconductivity, and its applications in things like hospital MRIs, because liquid nitrogen is cheap and easy to handle. (Liquid helium, though colder, is much more finicky and expensive.) The huge leap in the 1980s led to feverish speculation that room-temperature superconductivity might be possible. But that dream had proved elusive until the research being reported today.

One way superconductivity arises is when the electrons flowing through a material are coupled to phonons, vibrations in the lattice of atoms the material is made of. The fact that the two are in sync, theorists believe, allows electrons to flow without resistance. Low temperatures can create the circumstances for such pairs to form in a wide variety of materials. In 1968, Neil Ashcroft, of Cornell University, posited that under high pressures, hydrogen would also be a superconductor. By forcing atoms to pack closely together, high pressures change the way electrons behave and, in some circumstances, enable electron-phonon pairs to form.

Scientists have for decades sought to understand just what those circumstances are, and to figure out what other elements might be mixed in with hydrogen to achieve superconductivity at progressively higher temperatures and lower pressures.

In the work reported in today's paper, researchers from the University of Rochester and colleagues first mixed carbon and sulfur in a one-to-one ratio, milled the mixture down to tiny balls, and then squeezed those balls between two diamonds while injecting hydrogen gas. A laser was shined at the compound for several hours to break down bonds between the sulfur atoms, thus changing the chemistry of the system and the behavior of electrons in the sample. The resulting crystal is not stable at low pressures, but it is superconducting. It is also very small: under the high pressures at which it superconducts, it is about 30 millionths of a meter in diameter.

The exact details of why this compound works are not fully understood; the researchers aren't even sure exactly what compound they made. But they are developing new tools to figure out what it is and are optimistic that once they are able to do so, they will be able to tweak the composition so that the compound might remain superconducting even at lower pressures.

Getting down to 100 gigapascals, about half of the pressure used in today's Nature paper, would make it possible to begin industrializing super tiny sensors with very high resolution, Flores-Livas speculates. Precise magnetic sensors are used in mineral prospecting and also to detect the firing of neurons in the human brain, as well as in fabricating new materials for data storage. A low-cost, precise magnetic sensor is the type of technology that doesn't sound sexy on its own but makes many others possible.

And if these materials can be scaled up from tiny pressurized crystals into larger sizes that work not only at room temperature but also at ambient pressure, that would be the beginning of an even more profound technological shift. Ralph Scheicher, a computational modeler at Uppsala University in Sweden, says that he would not be surprised if this happened within the next decade.

The ways in which electricity is generated, transmitted, and distributed would be fundamentally transformed by cheap and effective room-temperature superconductors bigger than a few millionths of a meter. About 5% of the electricity generated in the United States is lost in transmission and distribution, according to the Energy Information Administration. Eliminating this loss would, for starters, save billions of dollars and have a significant climate impact. But room-temperature superconductors wouldn't just change the system we have, they'd enable a whole new system. Transformers, which are crucial to the electric grid, could be made smaller, cheaper, and more efficient. So too could electric motors and generators. Superconducting energy storage is currently used to smooth out short-term fluctuations in the electric grid, but it remains relatively niche because it takes a lot of energy to keep superconductors cold. Room-temperature superconductors, especially if they could be engineered to withstand strong magnetic fields, might serve as a very efficient way to store larger amounts of energy for longer periods of time, making renewable but intermittent energy sources like wind turbines or solar cells more effective.
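A rough sense of the scale involved (the 5% figure is from the article; the generation volume and wholesale price below are assumed round numbers, not sourced figures):

```python
annual_generation_twh = 4_000    # assumed: rough US annual electricity generation, in TWh
loss_fraction = 0.05             # from the article: ~5% lost in transmission and distribution
price_per_mwh = 40               # assumed: rough average wholesale price, USD per MWh

lost_twh = annual_generation_twh * loss_fraction     # ~200 TWh per year
lost_mwh = lost_twh * 1_000_000                      # 1 TWh = 1,000,000 MWh
print(f"~{lost_twh:.0f} TWh lost per year, worth ~${lost_mwh * price_per_mwh / 1e9:.0f} billion")
```

Even with conservative inputs the loss works out to several billion dollars a year, consistent with the "billions of dollars" claim above.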

And because flowing electricity creates magnetic fields, superconductors can also be used to create powerful magnets for applications as diverse as MRI machines and levitating trains. Superconductors are of great potential importance in the nascent field of quantum computing, too. Superconducting qubits are already the basis of some of the world's most powerful quantum computers. Being able to make such qubits without having to cool them down would not only make quantum computers simpler, smaller, and cheaper, but could lead to more rapid progress in creating systems of many qubits, depending on the exact properties of the superconductors that are created.

All these applications are in principle attainable with superconductors that need to be cooled to low temperatures in order to work. But if you have to cool them so radically, you lose manyin some cases allof the benefits you get from the lack of electrical resistance. It also makes them more complicated, expensive, and prone to failure.

It remains to be seen whether scientists can devise stable compounds that are superconducting not only at ambient temperature, but also at ambient pressure. But the researchers are optimistic. They conclude their paper with this tantalizing claim: "A robust room-temperature superconducting material that will transform the energy economy, quantum information processing and sensing may be achievable."


Global quantum computing market is projected to register a healthy CAGR of 29.5% in the forecast period of 2019 to 2026. – re:Jerusalem

An all-inclusive report suits business requirements in many ways while also assisting in informed decision making and smart working. Company profiles of the key market competitors are analysed with respect to company snapshot, geographical presence, product portfolio, and recent developments. A fine market research report is essential for understanding the market landscape, brand awareness, latest trends, possible future issues, industry trends and customer behaviour. Such a report provides myriad benefits for a prosperous business and is the best way to gain a competitive advantage in this quickly transforming marketplace.

Data Bridge Market Research recently released its Global Quantum Computing Market research, with more than 250 market data tables and figures and an easy-to-understand TOC, so you can get a variety of ways to maximize your profits. The Quantum Computing market is forecast through 2026. The report classifies the competitive spectrum of this industry in elaborate detail. The study claims that the competitive reach spans the companies of

Access Insightful Study about the Quantum Computing market! Click Here to Get a FREE PDF Sample Market Analysis: https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-quantum-computing-market&sc

Be the first to explore the potential that the Global Quantum Computing market holds. Uncover the gaps and opportunities and derive the most relevant insights from our research document to gauge market size.

Global Quantum Computing Market:

The global quantum computing market is projected to register a healthy CAGR of 29.5% in the forecast period of 2019 to 2026.
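To put that growth rate in perspective (simple compound-growth arithmetic using only the CAGR and period quoted above; no market-size figures are assumed):

```python
cagr = 0.295             # 29.5% per year, from the report
years = 2026 - 2019      # length of the forecast period

growth_multiple = (1 + cagr) ** years
print(f"A 29.5% CAGR sustained for {years} years multiplies the market by ~{growth_multiple:.1f}x")
```

Compounding at 29.5% for seven years works out to roughly a six-fold increase over the forecast period.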

If you are involved in the Quantum Computing industry or intend to be, then this study will provide you with a comprehensive outlook. It is vital that you keep your market knowledge up to date. The market is segmented By System (Single Qubit Quantum System and Multiple Qubit System), Qubits (Trapped Ion Qubits, Semiconductor Qubits and Superconducting), Deployment Model (On-Premises and Cloud), Component (Hardware, Software and Services), Application (Cryptography, Simulation, Parallelism, Machine Learning, Algorithms, Others), Logic Gates (Toffoli Gate, Hadamard Gate, Pauli Logic Gates and Others), Verticals (Banking and Finance, Healthcare & Pharmaceuticals, Defence, Automotive, Chemical, Utilities, Others) and Geography (North America, South America, Europe, Asia-Pacific, Middle East and Africa); Industry Trends and Forecast to 2026.

Unlock new opportunities with DBMR reports to gain insightful analyses of the Quantum Computing market and a comprehensive understanding of it. Learn about the market strategies being adopted by your competitors and leading organizations, as well as potential and niche segments/regions exhibiting promising growth.

New vendors in the market are facing tough competition from established international vendors as they struggle with technological innovations, reliability and quality issues. The report will answer questions about the current market developments and the scope of competition, opportunity, cost and more.

According to the regional segmentation, the Quantum Computing Market report covers the following regions:

The key countries in each region are taken into consideration as well, such as United States, Canada, Mexico, Brazil, Argentina, Colombia, Chile, South Africa, Nigeria, Tunisia, Morocco, Germany, United Kingdom (UK), the Netherlands, Spain, Italy, Belgium, Austria, Turkey, Russia, France, Poland, Israel, United Arab Emirates, Qatar, Saudi Arabia, China, Japan, Taiwan, South Korea, Singapore, India, Australia and New Zealand etc.

Market Dynamics:

A set of qualitative information that includes PESTEL Analysis, Porter's Five Forces Model, Value Chain Analysis, macroeconomic factors and the regulatory framework, along with industry background and overview.

Some of the major highlights the TOC covers:

Chapter 1: Methodology & Scope

Definition and forecast parameters

Methodology and forecast parameters

Data Sources

Chapter 2: Executive Summary

Business trends

Regional trends

Product trends

End-use trends

Chapter 3: Quantum Computing Industry Insights

Industry segmentation

Industry landscape

Vendor matrix

Technological and innovation landscape

Chapter 4: Quantum Computing Market, By Region

North America

South America

Europe

Asia-Pacific

Middle East and Africa

Chapter 5: Company Profile

Business Overview

Financial Data

Product Landscape

Strategic Outlook

SWOT Analysis

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Europe or Asia.

BROWSE FREE | TOC with selected illustrations and example pages of Global Quantum Computing Market @https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&sc

In addition, the years considered for the study are as follows:

Historical year 2014-2019 | Base year 2019 | Forecast period 2020 to 2027

Key insights that the study will provide:

A 360-degree Quantum Computing overview at the global and regional level

Market Share & Sales Revenue by Key Players & Emerging Regional Players

Competitors: In this section, various leading players in the Quantum Computing industry are studied with respect to their company profile, product portfolio, capacity, price, cost, and revenue.

A separate chapter on Market Entropy to gain insights on leaders' aggressiveness towards the market [Merger & Acquisition / Recent Investment and Key Developments]

Patent Analysis**: number of patents/trademarks filed in recent years.

A complete and useful guide for new market aspirants

Forecast information will drive strategic, innovative and profitable business plans and SWOT analysis of players will pave the way for growth opportunities, risk analysis, investment feasibility and recommendations

Supply and Consumption: In continuation of sales, this section studies supply and consumption for the Quantum Computing Market. This part also sheds light on the gap between supply and consumption. Import and export figures are also given in this part.

Production Analysis: Production of Quantum Computing systems is analyzed with respect to different regions, types and applications. Price analysis of various key players in the Quantum Computing Market is also covered here.

Sales and Revenue Analysis: Both sales and revenue are studied for the different regions of the Quantum Computing Market. Another major aspect, price, which plays an important part in revenue generation, is also assessed in this section for the various regions.

Other analyses: Apart from the above information, trade and distribution analysis for the Quantum Computing Market is also provided.

Competitive Landscape: Company profiles for listed players with SWOT Analysis, Business Overview, Product/Services Specification, Business Headquarters, Downstream Buyers and Upstream Suppliers.

May vary depending upon the availability and feasibility of data with respect to the targeted industry.

Inquire for further detailed information of Global Quantum Computing Market Report @https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&sc

Key questions answered in this report:

Research Methodology: Global Quantum Computing Market

Primary Respondents: OEMs, Manufacturers, Engineers, Industrial Professionals.

Industry Participants: CEOs, V.P.s, Marketing/Product Managers, Market Intelligence Managers and National Sales Managers.

About Data Bridge Market Research:

An absolute way to forecast what the future holds is to comprehend the trend today!

Data Bridge has set itself forth as an unconventional and neoteric market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge endeavors to provide appropriate solutions to complex business challenges and initiates an effortless decision-making process.

Data Bridge is adept at creating satisfied clients who rely on our services and our hard work with certitude. We are content with our glorious 99.9% client satisfaction rate.

Contact:

Data Bridge Market Research
US: +1 888 387 2818
UK: +44 208 089 1725
Hong Kong: +852 8192 7475
Email: Corporatesales@databridgemarketresearch.com


Max Planck and the Birth of Quantum Mechanics – SciTechDaily

From left to right: Walther Nernst, Albert Einstein, Max Planck, Robert Andrews Millikan, and Max von Laue at a dinner given by von Laue on November 12, 1931, in Berlin.

In the early evening of Sunday, October 7, 1900, 120 years ago, Max Planck found the functional form of the curve that we now know as the Planck distribution of black-body radiation. By my account, it was the birthdate of quantum mechanics.

A few hours earlier Hermann Rubens and his wife had visited the Plancks. This being a Sunday, they probably enjoyed coffee and cake together. Rubens was the experimental professor of physics at Humboldt University in Berlin where Planck was the theoretical one. Rubens and his collaborator, Ferdinand Kurlbaum, had recently managed to measure the power emitted by a black body as a function of temperature at the unusually long wavelength of 51 microns. They had used multiple reflections from rock salt to filter a narrow band of the spectrum. Working at 51 microns, they measured the low temperature limit and the highest temperatures within the experimental reach of their oven. The remarkable result was that at low frequencies, in the classical regime, the results did not fit the predictions of Wilhelm Wien. Rubens told Planck that for small frequencies the measured spectral density was linear with temperature.

Planck was intrigued. As soon as the gathering ended, he set to work. His interest in the data was profound. That evening he figured out the shape of the curve, with its peculiar denominator that in the limit of low frequency showed the appropriate experimental behavior: linear with temperature.
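For reference (standard modern notation, not part of the original note), the spectral energy density Planck arrived at is

$$ u(\nu, T) \;=\; \frac{8\pi h \nu^{3}}{c^{3}} \, \frac{1}{e^{h\nu/k_{B}T} - 1}, $$

and in the low-frequency limit $h\nu \ll k_{B}T$ the peculiar denominator reduces to $h\nu/k_{B}T$, giving $u \approx 8\pi \nu^{2} k_{B}T / c^{3}$, which is linear in temperature, exactly the behavior Rubens had reported.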

The anecdote, as related by Abraham Pais in his book "Subtle is the Lord," states that Planck mailed a postcard to Rubens with the function that very evening, so that Rubens would get it first thing in the morning (the post would have been delivered and set on his desk by the time he arrived at his office in the university). Rubens probably asked Planck that very same morning: "Why is it this shape?"

The presentation of the new data, followed by Planck's function, was on October 17. The function fit the data, both at the low-temperature and high-temperature limits. Planck had been interested in the black-body spectrum for a long time. He understood thermodynamics and classical electrodynamics. But it was the high-quality data of Rubens that drove his mind to find a solution. It took him a few months, and on Dec. 14 he presented the derivation of his theory where, in an act of desperation, he introduced the quantum of energy: the beginning of quantum mechanics.

In memory of Mario Molina.

This historical note was written by JQI Fellow Luis Orozco.


Reality Does Not Depend on the Measurer According to New Interpretation of Quantum Mechanics – SciTechDaily

For 100 years scientists have disagreed on how to interpret quantum mechanics. A recent study by Jussi Lindgren and Jukka Liukkonen supports an interpretation that is close to classical scientific principles.

Quantum mechanics arose in the 1920s, and since then scientists have disagreed on how best to interpret it. Many interpretations, including the Copenhagen interpretation presented by Niels Bohr and Werner Heisenberg and in particular the von Neumann-Wigner interpretation, state that the consciousness of the person conducting the test affects its result. On the other hand, Karl Popper and Albert Einstein thought that an objective reality exists. Erwin Schrödinger put forward the famous thought experiment involving the fate of an unfortunate cat that aimed to describe the imperfections of quantum mechanics.

Photo: Jukka Liukkonen (left) and Jussi Lindgren (right) describe Heisenberg's uncertainty principle. Credit: Aalto University

In their most recent article, Finnish civil servants Jussi Lindgren and Jukka Liukkonen, who study quantum mechanics in their free time, take a look at the uncertainty principle that was developed by Heisenberg in 1927. According to the traditional interpretation of the principle, location and momentum cannot be determined simultaneously to an arbitrary degree of precision, as the person conducting the measurement always affects the values.

However, in their study Lindgren and Liukkonen concluded that the correlation between location and momentum, i.e. their relationship, is fixed. In other words, reality is an object that does not depend on the person measuring it. Lindgren and Liukkonen utilized stochastic dynamic optimization in their study. In their theory's frame of reference, Heisenberg's uncertainty principle is a manifestation of thermodynamic equilibrium, in which correlations of random variables do not vanish.
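For context, the textbook statement of the principle being reinterpreted here (a standard result, not a formula taken from the new paper) bounds the product of the position and momentum uncertainties:

$$ \sigma_{x}\,\sigma_{p} \;\ge\; \frac{\hbar}{2}. $$

In Lindgren and Liukkonen's reading, this bound reflects a fixed, non-vanishing correlation between position and momentum at thermodynamic equilibrium rather than a disturbance introduced by whoever performs the measurement.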

"But is an explanation really an explanation, if it's a vague one?" - Jussi Lindgren

The results suggest that there is no logical reason for the results to be dependent on the person conducting the measurement. "According to our study, there is nothing that suggests that the consciousness of the person would disturb the results or create a certain result or reality," says Jussi Lindgren.

This finding supports those interpretations of quantum mechanics that uphold classical scientific principles.

"The interpretation is objective and realistic, and at the same time as simple as possible. We like clarity and prefer to remove all mysticism," says Liukkonen.

The researchers published their previous article in December 2019, which also utilized mathematical analysis as a tool to explain quantum mechanics. The method they used was stochastic optimal control theory, which has been used to solve such challenges as how to send a rocket from the Earth to the Moon.

Following Occam's razor, the law of parsimony named after William of Ockham, the researchers have now chosen the simplest explanation from those that fit.

"We study quantum mechanics as a statistical theory. The mathematical tool is clear, but some might think it is a boring one. But is an explanation really an explanation, if it's a vague one?" asks Lindgren.

In addition to the study of quantum mechanics, Lindgren and Liukkonen have many other things in common: they were both members of the same maths club at Kuopio Lyceum High School, they have both done post-graduate research, and both have careers as civil servants. Liukkonen has already finished his PhD dissertation on endoscopic ultrasound of joints and now works as an inspector at the Radiation and Nuclear Safety Authority.

"Physics is a great hobby for a civil servant. Together we have agonized over how the interpretations of quantum mechanics make no sense," says Liukkonen.

Lindgren's dissertation currently consists of various mathematical articles trying to explain quantum mechanics. He works full-time as a ministerial adviser at the Prime Minister's Office, where he has been negotiating such issues as the EU's recovery plan. A decade ago, he also participated in negotiations on Greece's loan guarantees, as a junior official.

Lindgren and Liukkonen's idea of a paradise is a festival conference that would combine short films with lectures on quantum physics.

"Physicists and artists could find new ways to work together; after all, both areas are manifestations of creativity," says Lindgren.

Reference: "The Heisenberg Uncertainty Principle as an Endogenous Equilibrium Property of Stochastic Optimal Control Systems in Quantum Mechanics" by Jussi Lindgren and Jukka Liukkonen, 17 September 2020, Symmetry. DOI: 10.3390/sym12091533


Could Schrdinger’s cat exist in real life? Our research may soon provide the answer – The Conversation AU

Have you ever been in more than one place at the same time? If you're much bigger than an atom, the answer will be no.

But atoms and particles are governed by the rules of quantum mechanics, in which several different possible situations can coexist at once.

Quantum systems are ruled by what's called a wave function: a mathematical object that describes the probabilities of these different possible situations.

And these different possibilities can coexist in the wave function as what is called a superposition of different states. For example, a particle existing in several different places at once is what we call spatial superposition.
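As a concrete illustration (standard quantum-mechanics notation, not taken from the paper itself), a particle that could be found at either of two locations, A and B, is described by a wave function of the form

$$ |\psi\rangle \;=\; \alpha\,|A\rangle \;+\; \beta\,|B\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1, $$

where $|\alpha|^{2}$ and $|\beta|^{2}$ are the probabilities of finding the particle at A or at B once a measurement collapses the superposition.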

It's only when a measurement is carried out that the wave function collapses and the system ends up in one definite state.

Generally, quantum mechanics applies to the tiny world of atoms and particles. The jury is still out on what it means for large-scale objects.

In our research, published today in Optica, we propose an experiment that may resolve this thorny question once and for all.

In the 1930s, Austrian physicist Erwin Schrödinger came up with his famous thought experiment about a cat in a box which, according to quantum mechanics, could be alive and dead at the same time.

In it, a cat is placed in a sealed box in which a random quantum event has a 50-50 chance of killing it. Until the box is opened and the cat is observed, the cat is both dead and alive at the same time.

In other words, the cat exists as a wave function (with multiple possibilities) before it's observed. When it's observed, it becomes a definite object.

After much debate, the scientific community at the time reached a consensus with the Copenhagen interpretation. This basically says quantum mechanics can only apply to atoms and molecules, but can't describe much larger objects.

Turns out they were wrong.

In the past two decades or so, physicists have created quantum states in objects made of trillions of atoms, large enough to be seen with the naked eye, although this has not yet included spatial superposition.


But how does the wave function become a real object?

This is what physicists call the quantum measurement problem. It has puzzled scientists and philosophers for about a century.

If there is a mechanism that removes the potential for quantum superposition from large-scale objects, it would require somehow disturbing the wave function and this would create heat.

If such heat is found, this implies large-scale quantum superposition is impossible. If such heat is ruled out, then it's likely nature doesn't mind being quantum at any size.

If the latter is the case, with advancing technology we could put large objects, maybe even sentient beings, into quantum states.

Physicists don't know what a mechanism preventing large-scale quantum superpositions would look like. According to some, it's an unknown cosmological field. Others suspect gravity could have something to do with it.

This year's Nobel Prize winner for physics, Roger Penrose, thinks it could be a consequence of living beings' consciousness.


Over the past decade or so, physicists have been feverishly seeking a trace amount of heat which would indicate a disturbance in the wave function.

To find this out, we'd need a method that can suppress (as perfectly as is possible) all other sources of excess heat that may get in the way of an accurate measurement.

We would also need to keep an effect called quantum backaction in check, in which the act of observing itself creates heat.

In our research, we've formulated such an experiment, which could reveal whether spatial superposition is possible for large-scale objects. The best experiments thus far have not been able to achieve this.

Our experiment would use resonators at much higher frequencies than have been used. This would remove the issue of any heat from the fridge itself.

As was the case in previous experiments, we would need to use a fridge at 0.01 kelvin above absolute zero. (Absolute zero is the lowest temperature theoretically possible.)

With this combination of very low temperatures and very high frequencies, vibrations in the resonators undergo a process called Bose condensation.

You can picture this as the resonator becoming so solidly frozen that heat from the fridge can't wiggle it, not even a bit.

We would also use a different measurement strategy that doesn't look at the resonator's movement at all, but rather the amount of energy it has. This method would strongly suppress backaction heat, too.


But how would we do this?

Single particles of light would enter the resonator and bounce back and forth a few million times, absorbing any excess energy. They would eventually leave the resonator, carrying the excess energy away.

By measuring the energy of the light particles coming out, we could determine if there was heat in the resonator.

If heat was present, this would indicate an unknown source (which we didn't control for) had disturbed the wave function. And this would mean it's impossible for superposition to happen at a large scale.

The experiment we propose is challenging. It's not the kind of thing you can casually set up on a Sunday afternoon. It may take years of development, millions of dollars and a whole bunch of skilled experimental physicists.

Nonetheless, it could answer one of the most fascinating questions about our reality: is everything quantum? And so, we certainly think it's worth the effort.

As for putting a human, or cat, into quantum superposition: there's really no way for us to know how this would affect that being.

Luckily, this is a question we don't have to think about, for now.


Bringing the promise of quantum computing to nuclear physics – MSUToday

Quantum mechanics, the physics of atoms and subatomic particles, can be strange, especially compared to the everyday physics of Isaac Newton's falling apples. But this unusual science is enabling researchers to develop new ideas and tools, including quantum computers, that can help demystify the quantum realm and solve complex everyday problems.

That's the goal behind a new U.S. Department of Energy Office of Science (DOE-SC) grant, awarded to Michigan State University (MSU) researchers, led by physicists at the Facility for Rare Isotope Beams (FRIB). Working with Los Alamos National Laboratory, the team is developing algorithms, essentially programming instructions for quantum computers, to help these machines address problems that are difficult for conventional computers. For example, problems like explaining the fundamental quantum science that keeps an atomic nucleus from falling apart.

The $750,000 award, provided by the Office of Nuclear Physics within DOE-SC, is the latest in a growing list of grants supporting MSU researchers developing new quantum theories and technology.

"The aim is to improve the efficiency and scalability of quantum simulation algorithms, thereby providing new insights on their applicability for future studies of nuclei and nuclear matter," said principal investigator Morten Hjorth-Jensen, an FRIB researcher who is also a professor in MSU's Department of Physics and Astronomy and a professor of physics at the University of Oslo in Norway.

Morten Hjorth-Jensen (Credit: Hilde Lynnebakken)

Although this grant focuses on nuclear physics, the algorithms it yields could benefit other fields looking to use quantum computing's promise to more rapidly solve complicated problems. This includes scientific disciplines such as chemistry and materials science, but also areas such as banking, logistics, and data analytics.

"There is a lot of potential for transferring what we are developing into other fields," Hjorth-Jensen said. "Hopefully, our results will lead to an increased interest in theoretical and experimental developments of quantum information technologies. All the algorithms developed as part of this work will be publicly available," he added.

What makes quantum computers attractive tools for these applications is a freedom afforded by quantum mechanics.

Classical computers are constrained to a binary system of zeros and ones with transistors that are either off or on. The restrictions on quantum computers are looser.

Instead of transistors, quantum computers use technology called qubits (pronounced q-bits) that can be both on and off at the same time. Not somewhere in between, but in both opposite states at once.

Combined with the proper algorithms, this freedom enables quantum computers to run certain calculations much faster than their classical counterparts: the type of calculations, for instance, capable of helping scientists explain precisely how swarms of elementary particles known as quarks and gluons hold atomic nuclei together.
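As a minimal illustration of what a qubit's "both at once" freedom means (a toy state-vector simulation in plain Python/NumPy, not the team's actual algorithms or hardware), the snippet below puts one simulated qubit into an equal superposition of 0 and 1 and samples measurements from it:

```python
import numpy as np

# A qubit's state is a 2-component complex vector; |0> is represented as [1, 0].
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print("P(0), P(1) =", probs)   # both ~0.5: "on" and "off" at once until measured

# Simulate 1,000 measurements; each one collapses the superposition to 0 or 1.
samples = np.random.choice([0, 1], size=1000, p=probs)
print("measured 0:", int(np.sum(samples == 0)), "| measured 1:", int(np.sum(samples == 1)))
```

Classical simulations like this blow up exponentially as qubits are added, which is precisely why real quantum hardware, programmed with the right algorithms, is attractive for the nuclear many-body problems described here.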

"It is really hard to do those problems, said Huey-Wen Lin, a co-investigator on the grant. I dont see a way to solve them any time soon with classical computers.

Huey-Wen Lin

Lin is an assistant professor in the Department of Physics and Astronomy and the Department of Computational Mathematics, Science and Engineering at MSU.

She added that quantum computers won't solve these problems immediately, either. But the timescales could be measured in years rather than careers.

Hjorth-Jensen believes this project will also help accelerate MSU's collaborations in quantum computing. Formally, this grant supports a collaboration of eight MSU researchers and staff scientist Patrick Coles at Los Alamos National Laboratory.

But Hjorth-Jensen hopes the project will spark more discussions and forge deeper connections with the growing community of quantum experts across campus and prepare the next generation of researchers. The grant will also open up new opportunities in quantum computing training for MSU students who are studying in the nation's top-ranked nuclear physics graduate program.

The grant, titled "From Quarks to Stars: A Quantum Computing Approach to the Nuclear Many-Body Problem," was awarded as part of "Quantum Horizons: Quantum Information Systems Research and Innovation for Nuclear Science," a funding opportunity issued by DOE-SC.

Hjorth-Jensen and Lin are joined on this grant by their MSU colleagues Alexei Bazavov and Matthew Hirn from the Department of Computational Mathematics, Science and Engineering; Scott Bogner, Heiko Hergert, Dean Lee and Andrea Shindler from FRIB, and the Department of Physics and Astronomy. Hirn is also an assistant professor in the Department of Mathematics.

MSU is establishing FRIB as a new user facility for the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. Under construction on campus and operated by MSU, FRIB will enable scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security, and industry.

The U.S. Department of Energy Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of today's most pressing challenges. For more information, visit energy.gov/science.


A Force From Nothing Used to Control and Manipulate Objects – SciTechDaily

Depiction of the clamping device and how it works. Credit: Jake Pate, UC Merced

A collaboration between researchers from The University of Western Australia and The University of California Merced has provided a new way to measure tiny forces and use them to control objects.

The research, published recently in Nature Physics, was jointly led by Professor Michael Tobar, from UWA's School of Physics, Mathematics and Computing and Chief Investigator at the Australian Research Council Centre of Excellence for Engineered Quantum Systems, and Dr. Jacob Pate from the University of California, Merced.

Professor Tobar said the result provided a new way to manipulate and control macroscopic objects without contact, allowing enhanced sensitivity without adding loss.

Once thought to be of only academic interest, this tiny force known as the Casimir force is now drawing interest in fields such as metrology (the science of measurement) and sensing.

"If you can measure and manipulate the Casimir force on objects, then we gain the ability to improve force sensitivity and reduce mechanical losses, with the potential to strongly impact science and technology," Professor Tobar said.

"We have now shown it's also possible to use the force to do cool things. But to do that, we need to develop precision technology that allows us to control and manipulate objects with this force." - Professor Michael Tobar

To understand this, we need to delve into the weirdness of quantum physics. In reality, a perfect vacuum does not exist: even in empty space at zero temperature, virtual particles, like photons, flicker in and out of existence.

These fluctuations interact with objects placed in vacuum and are actually enhanced in magnitude as temperature is increased, causing a measurable force from nothing, otherwise known as the Casimir force.

This is handy because we live at room temperature. We have now shown it's also possible to use the force to do cool things. But to do that, we need to develop precision technology that allows us to control and manipulate objects with this force.
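For scale, the textbook result for the idealized geometry (two parallel, perfectly conducting plates separated by a distance $d$ at zero temperature; not the cavity-and-membrane setup used in this work) gives an attractive pressure of

$$ \frac{F}{A} \;=\; \frac{\pi^{2}\hbar c}{240\, d^{4}}, $$

which grows steeply as the separation shrinks, one reason the gap in the experiment described below has to be controlled so precisely.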

Professor Tobar said researchers were able to measure the Casimir force and manipulate the objects through a precision microwave photonic cavity, known as a re-entrant cavity, at room temperature, using a setup with a thin metallic membrane separated from the re-entrant cavity, exquisitely controlled to roughly the width of a grain of dust.

"Because of the Casimir force between the objects, the metallic membrane, which flexed back and forth, had its spring-like oscillations significantly modified and was used to manipulate the properties of the membrane and re-entrant cavity system in a unique way," he said.

This allowed orders of magnitude of improvement in force sensitivity and the ability to control the mechanical state of the membrane.

Reference: "Casimir spring and dilution in macroscopic cavity optomechanics" by J. M. Pate, M. Goryachev, R. Y. Chiao, J. E. Sharping and M. E. Tobar, 3 August 2020, Nature Physics. DOI: 10.1038/s41567-020-0975-9


The Week of October 12, 2020 – FYI: Science Policy News

Research Groups Seek Rollback of Diversity Training Restrictions

A Sept. 22 executive order restricting certain kinds of diversity and inclusion training has created confusion for universities and federal contractors, spurring some institutions to suspend training programs and postpone planned events. Federal agencies have also been instructed to suspend all diversity training programs pending a review of compliance with the order. Dozens of higher education associations sent a letter to President Trump last week requesting he withdraw the order, saying it has a chilling effect on campus efforts to ensure non-discriminatory workplaces and requires an unprecedented, expansive review of internal training materials at both public and private entities. Separately, 50 scientific societies, including AIP, sent a letter to the White House last week denouncing the order, arguing it wrongfully insinuates that certain trainings are inherently anti-American and "sends a message of division, intolerance, and subjectivity that is damaging to our R&D community."

On Oct. 8, the New England Journal of Medicine published an editorial blasting the U.S. response to the COVID-19 pandemic and, while not mentioning President Trump specifically, appealed to voters to cast out current federal government leaders. Calling them dangerously incompetent, the editorial argues those leaders have undercut trust in science and in government, causing damage that will certainly outlast them. The top-tier medical journal has not previously made such an exhortation to voters in its 208-year history. On Oct. 5, the United Kingdom-based journal Nature published a news feature surveying ways the Trump administration has damaged science, touching on issues such as the pandemic response, climate change, environmental regulation, and immigration policy. Citing policy experts, the article also reports that the administration has, across agencies, undermined scientific integrity by suppressing or distorting evidence to support political decisions. The journal has not taken an editorial position on the election, but its editors also announced last week that they plan to increase coverage of global politics and publish more political science research, partly due to signs that politicians around the world are pushing back against the principle of protecting scholarly autonomy, or academic freedom. The two journals are the latest prestigious science publications to cast Trump as corrosive to science and science-informed policy. In recent weeks, the editor-in-chief of Science has excoriated Trump for lying about the pandemic, while Scientific American made its first-ever presidential endorsement, backing Democratic candidate Joe Biden. (Update: Nature has since endorsed Biden.)

On Oct. 2, former National Oceanic and Atmospheric Administration heads Conrad Lautenbacher and Jane Lubchenco wrote to the agency on behalf of an ocean policy advocacy group, expressing alarm over the recent appointments of climatologist David Legates and meteorologist Ryan Maue to high-level positions there. The appointments have attracted criticism because Legates and Maue have often dismissed mainstream views about the severity of anthropogenic climate change, and E&E News has reported the Trump administration expects its new appointees to influence the agency's work on climate and the next interagency National Climate Assessment. Lautenbacher and Lubchenco led NOAA during the administrations of Presidents George W. Bush and Barack Obama, respectively, and while Lubchenco has often protested Trump administration actions, Lautenbacher has been more reserved. Justifying their intervention, the two wrote, "We cannot be silent on this; we are concerned that the freedom of NOAA scientists to communicate honestly and openly about the impacts of climate change, the future of honest and accurate weather forecasting, objective fisheries management, disaster response, and much more will be further curtailed if these appointments go forward."

The Departments of Labor and Homeland Security issued rules last week that together increase the wages employers must offer workers seeking H-1B visas and require the applicant's degree to more closely match their job category, among other changes. Both departments cite the increased unemployment caused by COVID-19 as justification for the rules taking effect immediately without a public notice and comment period. The H-1B visa program is used by many technology companies and universities to hire workers in STEM fields, but it has come under criticism in recent years that is largely focused on alleged abuses of the program by certain information technology companies. President Trump has already suspended issuance of H-1B visas through the end of the year, though a federal judge partially blocked the policy on Oct. 1.

The House Intelligence Subcommittee on Strategic Technologies and Advanced Research released a report last week recommending ways the U.S. can maintain a leading role in developing emerging technologies such as artificial intelligence, quantum computing, and biotechnology. Among its proposals, the report calls for the federal government to expand spending on basic research and couple those investments with changes to how the intelligence community organizes, establishes relationships, and sets priorities for R&D. The report also argues that the emphasis often placed on competition with China represents an overly narrow view, and stresses that the subcommittee's recommendations are generally not calls for the hierarchy, direction, and centralized control that characterize Chinese innovation efforts [and instead] reflect the ideas of openness, flexibility and agility that gave rise to American innovative success from Los Alamos to Silicon Valley. The report recommends a number of moves to bolster the intelligence community workforce, including by creating a STEM fellowship program and reforming U.S. immigration policies. Subcommittee Chair Jim Himes (D-CT) is discussing the report at an event on Thursday.

Last week, the National Quantum Coordination Office rolled out its official logo and website quantum.gov, which collects strategy documents and updates about the National Quantum Initiative. The office also released a report summarizing frontier research areas in quantum information science and announced the inaugural meeting of the National Q-12 Education Partnership, an effort to introduce students to QIS concepts at earlier grade levels. The White House established the coordination office last year, as required by the National Quantum Initiative Act, to keep tabs on the government's growing portfolio of QIS research centers and workforce development efforts. The office is led by physicist Charles Tahan, who is on detail from the National Security Agency's Laboratory for Physical Sciences, where he is chief scientist. Tahan also serves as co-chair of the newly established National Quantum Initiative Advisory Committee, which is holding its first meeting on Oct. 27.
