
Spending on Shared Cloud Infrastructure Continues to Lead the Way in Enterprise Infrastructure Investments … – IDC

NEEDHAM, Mass., March 28, 2024 – According to the International Data Corporation (IDC) Worldwide Quarterly Enterprise Infrastructure Tracker: Buyer and Cloud Deployment, spending on compute and storage infrastructure products for cloud deployments, including dedicated and shared IT environments, increased 18.5% year over year in the fourth quarter of 2023 (4Q23) to $31.8 billion. Spending on cloud infrastructure continues to outgrow the non-cloud segment, with the latter growing 16.4% year over year in 4Q23 to $18.9 billion. The cloud infrastructure segment saw unit shipments decline 22.8% in the quarter alongside an increase in average selling prices (ASPs), mostly related to higher-than-usual GPU server shipments to hyperscalers.

“Cloud infrastructure spending continues to accelerate towards more robust configurations, mainly fueled by the explosion of AI-related investments,” said Juan Pablo Seminara, research director, Worldwide Enterprise Infrastructure Trackers at IDC. “Even though some caution remains on the socio-political side, the improvement in economic prospects contributes to a very positive spending outlook for 2024 and 2025, where cloud-based spending is expected to rebound at double-digit growth rates.”

Spending on shared cloud infrastructure reached $22.8 billion in the quarter, increasing 27.0% compared to a year ago. The shared cloud infrastructure category continues to capture the largest share of spending compared to dedicated deployments and non-cloud spending. In 4Q23, shared cloud accounted for 44.9% of total infrastructure spending. The dedicated cloud infrastructure segment saw modest growth of 1.4% year over year in 4Q23 to $9.0 billion.

For 2024, IDC forecasts cloud infrastructure spending to grow 19.3% over 2023 to $129.9 billion, while non-cloud infrastructure is expected to decline 1.4% to $57.6 billion. Shared cloud infrastructure is expected to grow 21.6% year over year to $95.3 billion for the full year, while spending on dedicated cloud infrastructure is expected to post robust growth of 13.3% to $34.6 billion. The subdued outlook for non-cloud infrastructure reflects the expectation that the market still faces some challenges. Cloud spending will remain very positive due to new and existing mission-critical workloads, which often require higher-end, performance-oriented systems.

IDC's service provider category includes cloud service providers, digital service providers, communications service providers, hyperscalers, and managed service providers. In 4Q23, service providers as a group spent $30.0 billion on compute and storage infrastructure, up 19.6% from the prior year. This spending accounted for 59.2% of the total market. Non-service providers (e.g., enterprises, government, etc.) also increased their spending to $20.7 billion, growing 15.2% year over year. IDC expects compute and storage spending by service providers to reach $124.3 billion in 2024, growing 21.8% year over year.

On a geographic basis, year-over-year spending on cloud infrastructure in 4Q23 showed mixed results. China, the Middle East & Africa, and Canada saw negative growth, led by China with a decline of 31.1%, mainly reflecting an economy still under pressure in the fourth quarter of 2023. The Middle East & Africa saw spending decline 12.2% due to a difficult year-over-year comparison resulting from large projects at the end of the prior year, and spending in Canada declined 4.4% year over year. The regions with increased spending in 4Q23 were Asia/Pacific (excluding Japan and China), the United States, Central & Eastern Europe, Japan, Western Europe, and Latin America, where cloud spending grew 48.2%, 40.6%, 11.3%, 10.5%, 2.7%, and 1.5% year over year, respectively. Most of this growth was related to large high-performance computing and AI-based projects.

Long term, IDC predicts spending on cloud infrastructure to have a compound annual growth rate (CAGR) of 12.8% over the 2023-2028 forecast period, reaching $199.1 billion in 2028 and accounting for 73.6% of total compute and storage infrastructure spend. Shared cloud infrastructure spending will account for 71.8% of the total cloud spending in 2028, growing at a 12.8% CAGR and reaching $143.0 billion. Spending on dedicated cloud infrastructure will grow at a CAGR of 12.9% to $56.1 billion. Spending on non-cloud infrastructure will also rebound with a 4.1% CAGR, reaching $71.4 billion in 2028. Spending by service providers on compute and storage infrastructure is expected to grow at a 13.1% CAGR, reaching $188.5 billion in 2028.
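To see how those forecast figures hang together, here is a minimal arithmetic sketch, assuming only the rounded numbers quoted above; the 2023 base is inferred from the 2024 forecast, so the result is an approximation rather than an IDC figure.

```python
# Rough arithmetic check of the forecast figures above; inputs are the
# article's rounded numbers, so the output is an approximation, not IDC data.

cloud_2024 = 129.9                            # $B, forecast cloud spend for 2024
growth_2024 = 0.193                           # 19.3% year-over-year growth
cloud_2023 = cloud_2024 / (1 + growth_2024)   # implied 2023 base, roughly $109B

cagr = 0.128                                  # forecast 12.8% CAGR for 2023-2028
cloud_2028 = cloud_2023 * (1 + cagr) ** 5     # compound five years of growth

print(f"Implied 2023 cloud spend:       ${cloud_2023:.1f}B")
print(f"Projected 2028 spend at 12.8%:  ${cloud_2028:.1f}B")  # close to the $199.1B forecast
```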

IDC's Worldwide Quarterly Enterprise Infrastructure Tracker: Buyer and Cloud Deployment is designed to provide clients with a better understanding of what portion of the compute and storage hardware markets are being deployed in cloud environments. The Tracker breaks out each vendor's revenue into shared and dedicated cloud environments for historical data and provides a five-year forecast. This Tracker is part of the Worldwide Quarterly Enterprise Infrastructure Tracker, which provides a holistic total addressable market view of the four key enabling infrastructure technologies for the datacenter (servers, external enterprise storage systems, and purpose-built appliances: HCI and PBBA).

Taxonomy Notes

IDC defines cloud services more formally through a checklist of key attributes that an offering must manifest to end users of the service.

Shared cloud services are shared among unrelated enterprises and consumers; open to a largely unrestricted universe of potential users; and designed for a market, not a single enterprise. The shared cloud market includes a variety of services designed to extend or, in some cases, replace IT infrastructure deployed in corporate datacenters; these services in total are called public cloud services. The shared cloud market also includes digital services such as media/content distribution, sharing and search, social media, and e-commerce.

Dedicated cloud services are shared within a single enterprise or an extended enterprise with restrictions on access and level of resource dedication and defined/controlled by the enterprise (and beyond the control available in public cloud offerings); can be onsite or offsite; and can be managed by a third-party or in-house staff. In dedicated cloud that is managed by in-house staff, "vendors (cloud service providers)" are equivalent to the IT departments/shared service departments within enterprises/groups. In this utilization model, where standardized services are jointly used within the enterprise/group, business departments, offices, and employees are the "service users."

For more information about IDC's Quarterly Enterprise Infrastructure Tracker: Buyer & Cloud Deployment, please contact Lidice Fernandez at lfernandez@idc.com.

About IDC Trackers

IDC Tracker products provide accurate and timely market size, vendor share, and forecasts for hundreds of technology markets from more than 100 countries around the globe. Using proprietary tools and research processes, IDC's Trackers are updated on a semiannual, quarterly, and monthly basis. Tracker results are delivered to clients in user-friendly Excel deliverables and on-line query tools.

Click here to learn about IDC's full suite of data products and how you can leverage them to grow your business.

About IDC

International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. With more than 1,300 analysts worldwide, IDC offers global, regional, and local expertise on technology, IT benchmarking and sourcing, and industry opportunities and trends in over 110 countries. IDC's analysis and insight helps IT professionals, business executives, and the investment community to make fact-based technology decisions and to achieve their key business objectives. Founded in 1964, IDC is a wholly owned subsidiary of International Data Group (IDG), the world's leading tech media, data, and marketing services company. To learn more about IDC, please visit http://www.idc.com. Follow IDC on Twitter at @IDC and LinkedIn. Subscribe to the IDC Blog for industry news and insights.

All product and company names may be trademarks or registered trademarks of their respective holders.


Enter the Cloud Excellence Awards 2024 – Computing

The cloud paradigm enables organisations to respond rapidly to changing market conditions and to experiment with new ideas, products and tools. It can be an incredibly efficient way to set up new infrastructure and platforms, or to share the management of parts of the IT estate the business would prefer not to keep in-house.

And like its aerial namesake, cloud computing is changing all the time, offering new opportunities to forward-looking organisations, teams and individuals.

With categories covering all aspects of the cloud from vendor, partner and customer angles, there will surely be something for every organisation. And with these awards covered in Computing itself, your success is shared not just with those present on the awards night itself but with the entirety of the brand's audience.

Our 2024 winners will be announced at an exclusive awards ceremony in London on 18th September, the perfect opportunity to celebrate your success and reward your teams for their hard work.

Entries for the awards are now open and will close on 14th June, so make sure to get your entries in on time to be celebrated as an industry-leading cloud innovator.

Click here to visit the awards website and here to start your entries.


Amazon to ‘invest $150bn in data centers’ for AI growth – ReadWrite

Amazon is reportedly gearing up to invest nearly $150 billion over the next 15 years in data centers. This substantial financial commitment will equip the cloud-computing giant with the necessary resources to manage a projected rise in demand for AI applications and various digital services.

Bloomberg reports that this investment is a strategic display of dominance, to preserve Amazon's leading position in the cloud services sector. Amazon currently holds about twice the market share of its closest competitor, Microsoft Corp.

An Amazon spokesperson confirmed to ReadWrite that the figures were based on its recent infrastructure announcements found on its website.

However, Amazon Web Services experienced its slowest sales growth on record last year, as corporate clients reduced expenses and postponed upgrades. Now, as spending begins to rebound, Amazon is eagerly securing land and energy for its energy-intensive operations.

Bloomberg said its calculations indicate that, over the last two years, Amazon has pledged $148 billion towards the building and operation of data centers globally. The company aims to expand its server farm locations in northern Virginia and Oregon, and venture into new areas such as Mississippi, Saudi Arabia, and Malaysia.

Despite the expansion, AWS saw a 2% decrease in its data center investments in 2023, marking its first reduction, even as Microsoft ramped up its expenditures by over 50%, as reported by Dell'Oro Group. However, Amazon's Chief Financial Officer announced last month that there would be an uptick in capital investments this year to fuel AWS's expansion, encompassing projects related to artificial intelligence.

"As we look forward to 2024, we anticipate capex to increase year over year, primarily driven by increased infrastructure capex to support growth of our AWS business, including additional investments in generative AI and large language models," said CFO Brian Olsavsky.

While Amazon's expansion of its data centers aims to cater to the growing need for corporate services, its focus on sophisticated, high-cost chips will provide the substantial computing power needed for the predicted increase in generative AI.

Reports suggest that Amazon is developing proprietary tools to compete with OpenAI's ChatGPT, and has developed partnerships with various companies to enhance its AI services using its servers. As a result, Amazon expects to generate AI-related revenue amounting to tens of billions of dollars.

Featured image: Canva / Web Summit Rio


Microsoft says Russian companies will be forced off its cloud services within days – TechRadar

Despite recent reports that Microsoft was all set to ban Russian companies from its suite of cloud services from March 20, it turns out this still isn't in effect, but should be by the end of March 2024 - this week - instead, after the company held discussions with IT platform Softline, one of its customers.

As a reminder, the ban isn't a political move on Microsoft's part, but several cloud storage providers' hands being forced by economic sanctions imposed by the European Union on Russian-owned companies back in December 2023 as a result of the ongoing Russia-Ukraine conflict.

The latest update on the imminent blockade, from BleepingComputer, is that the delay so far appears to only be something that Microsoft is offering, in response to correspondence with Softline, despite the latter issuing a press release (Russian language, machine-translated by us) last week in which it claimed that it has all the necessary resources to ensure a smooth transition to its own infrastructure from Microsoft and Amazon services.

Before the extension, in a letter that Softline has since published on its Telegram channel, Microsoft broke the news gently to Softline, but stated its "[commitment] to compliance with EU trade laws and regulations, as well as all other jurisdictions in which it operates."

According to Russian news agency TASS, Microsoft stands to cut off access to over 50 of its products to Russian companies, including video conferencing software behemoth Microsoft Teams and collaboration tool suite Microsoft 365.

That's not to mention the collateral damage caused by providers such as Google and Amazon withholding their own services without postponing the deadline. BleepingComputer also revealed that business customers of those companies based in Russia received notice of service termination last week.

It's too early to say whether the sanctions will be effective in applying pressure on Russia to withdraw from the conflict: they could, for instance, merely drive the popularity of local cloud and IT providers among businesses, and fuel their expansion.


But regardless of the European Union's ruling, there is one upside to all this: individuals and solo professionals based in Russia using these and similar cloud services aren't affected.


How AI Is Poised to Upend Cloud Networking – Data Center Knowledge

Much has been said about how AI will accelerate the growth of cloud platforms and enable a new generation of AI-powered tools for managing cloud environments.

But here's another facet of the cloud that AI is likely to upend: networking. As more and more AI workloads enter the cloud, the ability to deliver better cloud networking solutions will become a key priority.


Here's why, and what the future of cloud networking may look like in the age of AI.

The reason why AI will place new demands on cloud networks is simple enough: To work well at scale, AI workloads will require unprecedented levels of performance from cloud networks.


That's because the data that AI workloads need to access will reside in many cases on remote servers located either within the same cloud platform where the workloads live or in a different cloud. (In some cases, the data could also live on-prem while the workloads reside in the cloud, or vice versa.)

Cloud networks will provide the essential link that connects AI workloads to data. The volumes of data will be vast in many cases (even training a simple AI model could require many terabytes' worth of information), and models will need to access the data at low latency rates. Thus, networks will need to be able to support very high bandwidth with very high levels of performance.
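To make the bandwidth point concrete, here is a rough back-of-the-envelope sketch; the 10 TB dataset size and the link speeds are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope: time to move a training dataset over a network link.
# The 10 TB dataset size and the link speeds are illustrative assumptions.

def transfer_hours(dataset_tb: float, link_gbps: float) -> float:
    """Hours needed to move dataset_tb terabytes over a link_gbps gigabit/s link."""
    bits = dataset_tb * 8e12              # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)    # assumes ideal, sustained throughput
    return seconds / 3600

for gbps in (1, 10, 100, 400):
    print(f"{gbps:>4} Gbps link: {transfer_hours(10, gbps):6.2f} hours for 10 TB")
```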

To be sure, AI is not the only type of cloud workload that requires great network performance. The ability to deliver low-latency, high-bandwidth networking has long been important for use cases like cloud desktops and video streaming.

Cloud vendors have also long offered solutions to help meet these network performance needs. All of the major clouds provide "direct connect" networking services that can dramatically boost network speed and reliability, especially when moving data between clouds in a multicloud architecture, or between a private data center and the public cloud as part of a hybrid cloud model.

But for AI workloads with truly exceptional network performance needs, direct connect services may not suffice. Workloads may also require optimizations at the hardware level in the form of solutions such as data processing units (DPUs), which can help process network traffic hyper-efficiently. Indeed, vendors like Nvidia, which has unveiled an Ethernet platform tailored for generative AI, are already investing in this area. It says a lot that a company mostly known for selling video cards also recognizes that unlocking the full potential of AI requires networking hardware innovations, too.

For now, it remains to be seen exactly how cloud vendors, hardware vendors, and AI developers will respond to the special challenges that AI brings to the realm of cloud networking. But in general, it's likely that we'll see changes such as the following:

There's no way around it: If you want to take full advantage of the cloud to help host AI workloads, you need to optimize your cloud networking strategy, a move that requires adopting advanced networking services and hardware, while also adjusting cloud cost optimization and network performance management strategies.

For now, the solutions available to help with these goals are still evolving, but this is a space to follow closely for any business seeking to deploy AI workloads in the cloud.


Why Amazon’s multi-billion dollar AI alliance with Anthropic isn’t the game-changer it needs to remain king of the cloud – Fortune

When Amazon announced Wednesday that it had showered the hot AI startup Anthropic with an additional $2.75 billion to complete the $4 billion investment it had announced last fall, the company positioned the news as a royal win.

Amazon's AWS, the king of cloud computing (with nearly a third of global market share), was deepening its partnership with Anthropic, the number two prince (Harry, not William) of generative AI foundation models. Under the pact, Anthropic will use AWS as its primary cloud provider for mission-critical workloads, and train and deploy its future AI models on Amazon's homegrown chips, while AWS customers get access to future generations of Anthropic's AI technology.

Look more closely however, and the deal seems less like a sign of Amazon perpetuating its cloud dominance into the Gen AI era, and more like a hint at how vulnerable the company has become in a shifting landscape.

Amazon, considered a laggard in the race to deploy generative AI technology, really needs Anthropic's highly touted models, including the most recent Claude 3. At the same time though, Amazon is hitching its cart to an AI startup that, while boasting impressive technology, will not instantly distinguish or differentiate Amazon from the competition, since Google is also an Anthropic partner.

Pairing up with Anthropic is a necessary and beneficial move for Amazon. But with the AWS empire under siege, the question is whether the deal is too little too late.

AWS continues to be the reigning champion of the cloud business, with 31% global market share in 2023's fourth quarter. But the race is getting tighter. Microsoft Azure, AWS's biggest cloud competitor, has edged closer with 24%, while Google Cloud has 11%. Both Microsoft and Google have seen cloud growth thanks to their Gen AI offerings: the former with its powerhouse partnership with OpenAI, and the latter with its Bard and Gemini AI momentum.

According to Forrester principal analyst Tracy Woo, Amazon's Gen AI efforts "have not been very impressive. It took three, four months [for Amazon] to come up with any sort of generative AI-specific announcements [in 2023]," Woo told Fortune, adding that the results were "really lackluster."

The announcements at Amazon's Re:invent conference in December 2023, including Amazon Q, a generative AI work assistant, and next generation AWS-designed chips, should have been "a resounding response to show that you're firmly back as the number one cloud provider that everyone looks to," Woo said. Instead, it was underwhelming, especially given the competition with Microsoft and its alliance with OpenAI.

Amazon announced the equivalent of a shiny engine, a wheel, and a pane of glass for the windows, while Microsoft came out with a Rolls-Royce, marketing its Copilot offering for Azure and OpenAI models like a car that flies, goes in the water, and is incredible.

Microsoft's bread-and-butter has always been software packages and solutions that slot perfectly into the enterprise workflow, so it's not surprising that the company made such a strong showing. But Amazon's underwhelming announcements underscored how mismatched the cloud competitors remain when it comes to the software side of things.

The AWS strong suit has always been infrastructure. And leveraging that skillset to differentiate is one way Amazon could try to get an edge. But there's danger there too, thanks to the rise of Nvidia, whose GPUs rule the roost and are only getting better with Blackwell, the new Nvidia AI chip announced at its GTC conference.

While it's possible for Amazon to pull off an infrastructure cloud play, said Woo, the company would have to do things differently.

"Everyone builds on CUDA," she said. "So to ask everyone to rearrange their software architecture so they can cater to these AI-based TPUs that AWS has come up with is a huge ask."

At this point, AWS is arguably behind in both AI infrastructure and AI software, but no one should count out the cloud king, Woo emphasized. While Amazon missed the boat on recognizing that the cloud race was no longer about infrastructure but had moved up the stack to AI-powered software solutions, she added that with AWS, anything is possible.

"I see this as a little bit of a desperation call from [Amazon CEO Andy] Jassy responding to his shareholders," she said. "[AWS CEO] Adam Selipsky really understands the market and so I have a lot of confidence that he can steer the ship in the right direction." Of course, Amazon has never been identified as the AI cloud king, "so they have a huge uphill battle ahead of them," she added. "But I think they are resilient. They are an aggressive company that moves aggressively; they are not fat and happy."


Russian businesses get shut out from Microsoft cloud services at the end of this month new EU sanctions come into … – Tom’s Hardware

To comply with the EU regulations outlined in December 2023, Microsoft will cease the provision of cloud services to Russian organizations at the end of this month. This was originally meant to happen on March 20, but extra time has been given to impacted organizations to migrate to alternative solutions.

As a result of the above implementation of EU sanctions, organizations in Russia will no longer have access to best-in-class Microsoft products including Office 365 apps, OneDrive, Microsoft Teams, Azure, SharePoint, Visual Studio, SQL Server, as well as LinkedIn apps and Media Player development kits.

There have been no reports indicating that Microsoft will be restricting its cloud services to individuals, and hence they remain accessible to the general Russian public for now. Meanwhile, the Russian government has made efforts to promote domestic alternatives to ensure the continued smooth operations of private companies and organizations.

The Russian Ministry of Digital Transformation, Communication, and Mass Media anticipated this kind of move by foreign cloud service providers like Microsoft, Amazon, Google, and Oracle. Thus, last year it started to advise Russian organizations to make the transition to domestically made alternatives. No specific domestic alternatives were named or pointed to - so we don't know how the affected organizations will cope. The EU regulations specifically mention imposing a ban on providing software for managing a wide range of essential business operations.

This latest imposition of sanctions comes as a result of Russia's war against Ukraine, and the aggression which has continued for over two years. Sanctions are usually used against countries in such situations with some success. Typically some tech products avoid sanctions by being shipped via third-party distributors and sold in Russia, but other important sanction policies cannot be circumvented, such as the SWIFT ban preventing online transactions via banks.

On the hardware side of tech, reports suggest that China and Russia still enjoy component imports for chipmaking equipment, PCs, and servers. Interestingly, the sanctions have also provided profits for scalpers from other nations, and we have seen some smugglers get caught, with their goods confiscated and legal proceedings likely to follow. Meanwhile, it is thought some Russian entities subscribe to Microsoft cloud services using foreign accounts or other bypass methods.



A 7-minute guide to the relationship between quantum mechanics and black holes – Big Think

Physicist Brian Cox takes us into the mind-bending world where quantum mechanics, black holes, and the future of computing converge.

In this interview, Cox shares the engineering challenges behind building quantum computers and the intricate dance of storing information in their notoriously delicate memory. However, black holes have an unexpected link to quantum information storage. Cox discusses how Planck units, holography, and redundancy could shape the future of computing.

It is a mind-expanding discussion that pushes the boundaries of our understanding. Even Cox says, "You're not meant to understand what I've just said because I don't understand what I've just said because nobody understands what I've just said."

Welcome to the frontier where natures laws and technological innovation collide.

BRIAN COX:There's an engineering challenge in building quantum computers, which is how to store information in the memory of the quantum computer safely, robustly, because quantum computer memory is notoriously susceptible to any interference from the outside environment. If any of the environment in which the memory sits interacts with the memory in any way, then the information is destroyed.

And there are deep problems associated with the fact that you can't copy information in quantum mechanics, which is basically the way that your iPhone, or whatever it is, stores information and prevents errors entering into the memory of the computers that we're all familiar with; it's basically copying information. You can't do that in quantum mechanics. So it's a tremendous challenge.
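As a rough illustration of the classical side of what Cox is describing, the sketch below shows error protection by copying: a three-copy repetition code with a majority vote. It covers only the classical half of the story; the no-cloning theorem is precisely why quantum memories cannot use this trick directly.

```python
# Classical error protection by copying: a 3-bit repetition code.
# Quantum memories cannot do this directly, because the no-cloning theorem
# forbids copying an unknown quantum state; quantum error correction has to
# spread information across entangled qubits instead.

def encode(bit: int) -> list:
    return [bit, bit, bit]            # store three redundant copies

def decode(copies: list) -> int:
    return int(sum(copies) >= 2)      # majority vote recovers the original bit

stored = encode(1)
stored[0] ^= 1                        # a stray error flips one of the copies
print(decode(stored))                 # prints 1: the single error is corrected
```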

Engineers have had to develop very clever algorithms and ways of trying to store information in quantum computer memory and build the memory such that it's resilient to errors. And it turns out that the solutions that are being proposed and explored look like the solutions that nature itself uses in building space and time from the quantum theory that lives on the boundary. It's really strange.

The remarkable thing for me is an intimate relationship between the two. If we go back right to the beginning of the work on black holes in the 1970s, Jacob Bekenstein, a colleague of Stephen Hawking's actually, was one of the first researchers to really begin working on black holes alongside greats like John Wheeler.

Bekenstein noticed in a simple calculation that you can answer the question, "How much information can a black hole store?" That's a strange thing to say because the model of a black hole is pure geometry, pure spacetime. Now, how does something store any information? You need some structure. You need atoms or something that can store bits of information. Well, it turns out that you can calculate how much information a black hole stores in bits: the information content is equal to the surface area of the event horizon in square Planck units.

What's a Planck unit? It's a fundamental distance in the Universe that you can calculate by putting together things like the strength of gravity, Planck's constant, the speed of light. It's the smallest distance that we can talk about sensibly in physics as we understand it. The questions it raises: How is information stored? Why is the information content of a region of space equal to the surface area surrounding that region rather than the volume?
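Written out explicitly (standard textbook expressions, not formulas quoted in the interview), the quantities Cox refers to are the Planck length and the Bekenstein-Hawking entropy, which measures a black hole's information content in units of horizon area:

```latex
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \text{m},
\qquad
S_{\mathrm{BH}} = \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad
N_{\text{bits}} = \frac{A}{4\,\ell_P^{2}\,\ln 2}.
```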

If I asked you, how much information can you store in your room, the room that you're sitting in now, just say it's a library, then you would say, "Well, it's to do with how many books I can fit in the room." But black holes seem to be telling us that there's something about the surface surrounding a region. This is the first glimpse, I think, of an idea called holography. What is that?

So if you think about what a hologram is, at the very simplest level, it's a piece of film. But that piece of film contains all the information to make a three-dimensional image. It's the idea that there are different descriptions of our reality. There's one description, which is that we live in this space, the three dimensions of space, and time is a thing that ticks, and Einstein told us that they're kind of mixed up, but still you have this picture of space being this, right, the thing in which we exist.

There's an equivalent description, for a very specific model discovered by a physicist called Maldacena, in which a dual theory that lives purely on the boundary of the space is equivalent to the space itself in the interior of this region. So it's strongly suggestive that there's a deeper theory of our experience of the world, of space and time, that does not have space and time in it.

And that's one of the wonderful surprises that's really emerged from the study of black holes and the attempt to answer the very well-posed questions. I should say that the work done by Maldacena was purely mathematical. It wasn't framed in the study of black holes, although the questions ultimately seem to be intimately related.

So the study of black holes seems to be strongly suggesting that these ideas of holography, the holographic universe, which came from a different region of physics, from trying to understand other things, those descriptions may be valid, maybe in some sense true. And it seems that we're beginning to glimpse an answer, at least in very simplified models, and that the information is stored on the boundary redundantly, which means that you can lose a bit of it and still fully specify the physics of the interior.

And it does seem that that's akin to, or similar to, the way that we will in the future encode information in the memory of quantum computers to protect them from errors. So I'm giving you an interpretation, and there will be other people who have different interpretations, but it does seem that whatever this quantum theory is that underlies our reality, there's some redundancy in the way the information is stored in that quantum theory.

And I just emphasize, you're not meant to understand what I've just said because I don't understand what I've just said because nobody understands what I've just said, right? We're catching glimpses of this theory, and that's where the research is at the moment; it's why it's tremendously exciting.



Inside the 20-year quest to unravel the bizarre realm of ‘quantum superchemistry’ – Livescience.com

Chemistry depends on heat.

Atoms or molecules bounce around randomly, collide, and form other molecules. At higher temperatures, atoms collide more and the rate at which atoms become molecules increases. Below a certain temperature, the reaction won't happen at all.
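The conventional temperature dependence described here is the Arrhenius law. A minimal sketch, with a made-up activation energy and prefactor chosen only to show the trend, illustrates how the classical reaction rate collapses as the temperature falls:

```python
import math

# Illustrative Arrhenius rate k = A * exp(-Ea / (R * T)); the prefactor and
# activation energy are made-up values chosen only to show the trend.
A_PREFACTOR = 1e13          # attempt frequency, 1/s
EA = 50e3                   # activation energy, J/mol
R = 8.314                   # gas constant, J/(mol*K)

def rate(temperature_k: float) -> float:
    return A_PREFACTOR * math.exp(-EA / (R * temperature_k))

for T in (1000, 300, 100, 10):
    print(f"T = {T:>5} K  ->  k = {rate(T):.2e} per second")
```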

But something very weird happens at the lowest temperatures. In this extreme cold, there is essentially no heat energy, yet chemical reactions happen faster than they do at high temperatures.

The phenomenon is called quantum superchemistry. And it was finally demonstrated last year, more than 20 years after physicists first proposed it.

In that experiment, University of Chicago physicist Cheng Chin and colleagues coaxed a group of cesium atoms at just a few nanokelvin into the same quantum state. Amazingly, each atom did not interact separately. Instead, 100,000 atoms reacted as one, almost instantaneously.

The first demonstration of this weird process has opened a window for scientists to better understand how chemical reactions operate in the strange realm of quantum mechanics, which governs the behavior of subatomic particles. It also may help to simulate quantum phenomena that classic computers struggle to model accurately, such as superconductivity.

But what happens after that, as with so many advances in research, is hard to predict. Chin, for one, has no plans to stop studying this strange form of chemistry.


"No one knows how far we can go," Chin told Live Science. "It might take another 20 years. But nothing can stop us."

The term "superchemistry" was coined in 2000 to liken the phenomenon to other strange effects, like superconductivity and superfluidity, which emerge when large numbers of particles are in the same quantum state.

Unlike superconductivity or superfluidity, however, "'superchemistry' differs in that it is still barely realized, while these other phenomena have been extensively studied in experiments," Daniel Heinzen, lead author of the 2000 study and a physicist at the University of Texas at Austin, told Live Science in an email.

Heinzen and colleague Peter Drummond, who is now at the Swinburne University of Technology in Australia, were studying a special state of matter known as a Bose-Einstein condensate (BEC), in which atoms reach their lowest energy state and enter the same quantum state. In this regime, groups of atoms begin to act more like a single atom. At this small scale, particles can't be described as being in a given place or state. Rather, they have a probability of being in any given place or state, which is described by a mathematical equation known as the wave function.

In a BEC, just as Satyendra Nath Bose and Albert Einstein's work predicted, the individual wave functions of each atom become a single, collective wave function. Heinzen and Drummond realized that a group of particles with the same wave function is similar to a laser: a group of photons, or packets of light, that have the same wavelength. Unlike with other light sources, the peaks and troughs of a laser's wave are aligned. This allows its photons to stay focused in a tight beam over long distances, or to be broken up into bursts as short as millionths of a billionth of a second.
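A tiny numerical illustration of why that phase alignment matters (generic wave arithmetic, not a model of the Heinzen-Drummond system): adding many waves with identical phases gives an amplitude that grows with the number of waves, while random phases mostly cancel.

```python
import cmath
import random

# Generic illustration of coherence: N unit-amplitude waves added together.
# With identical phases the total amplitude is N; with random phases it is
# only about sqrt(N). Plain wave arithmetic, not a model of a BEC or laser.

N = 10_000
coherent = sum(cmath.exp(1j * 0.0) for _ in range(N))
incoherent = sum(cmath.exp(1j * random.uniform(0, 2 * cmath.pi)) for _ in range(N))

print(f"Aligned phases: |amplitude| = {abs(coherent):.0f}")    # about 10000
print(f"Random phases:  |amplitude| = {abs(incoherent):.0f}")  # about 100
```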


Similarly, Heinzen, Drummond and their colleagues showed mathematically that the atoms in a BEC should behave in ways other groups of atoms don't. Near absolute zero, where there is almost no heat energy, quantum superchemistry means the atoms in a BEC could convert, quickly and all together, to molecules: Atoms A would bond in a flash to form molecules of A2, and so forth.

The process would resemble a phase transition, Chin says, such as when liquid water freezes to ice. And, thanks to the quantum weirdness of these systems, the more atoms condensed in the BEC, the faster the reaction happens, Heinzen and Drummond's calculations predicted.

Heinzen and his research group tried to demonstrate the phenomenon with experiments for several years. But they never found convincing evidence that the effect was happening. "And then we kind of dropped it," Heinzen said.

While Heinzen abandoned the quest to demonstrate quantum superchemistry, others were still hunting for ways to turn the wild theory into experimental reality. One of them was Chin, who started working on quantum superchemistry almost immediately.

Chin was a doctoral student studying cesium atoms at cold temperatures when Heinzen and Drummond's superchemistry paper came out. "My research was totally derailed because of this new research," Chin told Live Science. He set out on what would become a 20-year quest to achieve quantum superchemistry in the lab.

It wasn't a straight path, and Chin sometimes took breaks from working toward quantum superchemistry. But he never abandoned his goal.

"Nobody knew if this was going to work out before it happened. But also nobody said it couldn't happen," he said.

After a decade of slow progress, in 2010, Chin and his colleagues figured out how to precisely tune magnetic fields onto a BEC to coax cesium atoms together to make Cs2 molecules.

"That provided the evidence of how to move forward," Chin said.

But to show quantum superchemistry was occurring, his team still needed better ways to cool and control ultracold molecules.


Scientists typically use two techniques to push atoms and molecules to ultracold temperatures. First, lasers cool atoms to millionths of a kelvin above absolute zero. Atoms in the sample absorb photons from a laser tuned to very specific energy, thus reducing the atoms' momentum and the sample's temperature incrementally.

Next, they use evaporative cooling. The atoms in these experiments are trapped by laser light or magnetic fields. Scientists can adjust the traps to let the fastest and, therefore, hottest atoms escape. This process further cools the atoms to billionths of a kelvin, where quantum superchemistry is possible.
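Here is a toy sketch of the evaporative-cooling step described above, under deliberately crude assumptions: a classical gas, an arbitrary 10% cut per round, and no re-thermalization physics. It is meant only to show the principle of discarding the hottest atoms to lower the average energy of what remains, not to model a real cesium trap.

```python
import random

# Toy model of evaporative cooling: repeatedly discard the most energetic atoms
# and track the mean energy of the rest as a stand-in for temperature.
# An idealized classical cartoon, not a model of a real cesium trap.

def evaporate(energies, cut_fraction=0.1, rounds=5):
    for step in range(1, rounds + 1):
        energies.sort()
        energies = energies[: int(len(energies) * (1 - cut_fraction))]  # drop hottest
        mean_e = sum(energies) / len(energies)
        print(f"round {step}: {len(energies):>6} atoms left, mean energy {mean_e:.3f}")
    return energies

atoms = [random.expovariate(1.0) for _ in range(100_000)]   # arbitrary energy spread
evaporate(atoms)
```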

It was the second step that took Chin and his collaborators the longest to get right. For years, he had used bowl-shaped traps that pushed the atoms together in the middle, which raised the samples' temperature.

Six or seven years ago, his group began using a digital micromirror device to better control the shape of the trap. The result? Flat-bottomed traps, shaped something like petri dishes, where the atoms could spread out and stay ultracold.

Around 2020, Chin's group finally made a BEC of cesium molecules. They were some of the coldest molecules ever made, about ten-billionths of a degree above absolute zero. And while the team suspected quantum superchemistry had occurred, they didn't have proof.

That proof came three years later. By then, they had collected the evidence of two hallmarks of quantum superchemistry. First, the reaction was happening collectively, meaning many cesium atoms became cesium molecules at once. And second, it was reversible, meaning the atoms would become molecules, which would become atoms, and on and on.

For Chin, last year's experiments are just the beginning. They produced two-atom molecules using superchemistry. But Chin thinks three-atom molecules are within reach, and he's excited to see what else might be possible.

As is often the case in areas of fundamental research like this one, the experiments have raised new theoretical questions. For instance, in Heinzen and Drummond's theoretical quantum superchemistry system, more than half of all the atoms in a trap would convert into molecules and then go back again. But Chin's group observed that such a conversion happened only 20% of the time. "Much is still to be understood to gain higher efficiencies," Chin said in an email.

Heinzen suspects collisions between molecules in the dense gas are to blame. Collisions could push molecules into different quantum states, knocking them out of the pool of condensed molecules. He and Drummond had not accounted for that possibility in their theory.

"It was obvious even from the beginning [that collisions were] going to be kind of a negative effect, but in 2000 we had no idea how big it would be," Heinzen said. "We just said, we're ignoring it because we don't know how big."

The experiments also revealed that three cesium atoms were frequently involved in forming a single Cs2 molecule (and leaving one Cs atom left over), which physicists call a three-body interaction. Previous predictions about quantum superchemistry did not include such interactions.

For Chin, that's a hint that he'll need to do some new experiments. If his group can design and perfect experiments to probe these many-body interactions, it could help elucidate the rules of quantum superchemistry.

Despite these open questions, many scientists view quantum superchemistry as a possible tool for better understanding chemical reactions in general. Atoms and molecules in a boiling beaker inhabit wide ranges of quantum states and interact in myriad ways that make them too complicated to study in fine detail experimentally. In contrast, atoms and very simple molecules in BECs are in precisely controlled, well-defined quantum states. So quantum superchemistry could be a way to study reactions in very fine detail.

"[It's] a very appealing regime in terms of advancing our fundamental understanding of chemistry," Waseem Bakr, a physicist at Princeton University who studies ultracold atoms and molecules, told Live Science.

Quantum superchemistry also has scientists excited because it provides precise control over molecular quantum states.

That could be useful for quantum simulation, a cousin of quantum computers. Typically, scientists simulate quantum systems on "classical" systems, such as conventional computers. But many processes, such as high-temperature superconduction, might be better modeled using quantum systems that are governed by the same quantum rules. Quantum superchemistry would give scientists a tool for producing molecules in specific quantum states that would enable those simulations, Bakr said.

Heinzen sees plenty of reasons for scientists to keep exploring the phenomenon he helped dream up more than 20 years ago. While the applications are little more than pipe dreams right now, history has shown that advances in fundamental science can sometimes lead to surprising applications down the road.

"It's not obvious right now," he said. "But it's still really worth doing."


Quantum Mechanics Hack Could Lead to Unbreakable Metals by Leveraging Weird Distortion of Atoms – The Debrief

Scientists say they have created a new method of testing materials that allows predictions to be made about their ductility, which could lead to the production of virtually unbreakable metals for use with components in a variety of applications.

Drawing from quantum mechanics principles, the new method significantly improves predictions of metals' ability to be drawn out into thinner shapes while maintaining their strength.

According to researchers involved with the discovery, the new method has proven very effective for metals used in high-temperature applications and could help industries like aerospace and other fields perform tests of various materials more rapidly.

The discovery was reported by scientists at Ames National Laboratory in cooperation with Texas A&M University.

The team's new quantum-mechanics-based approach has already proven effective on refractory multi-principal-element alloys, a group of materials that often lack the ductility required for use in the demanding conditions of fusion technology, aerospace, and other applications where metals must withstand extreme temperatures.

Problems associated with metal ductility have challenged such industries for decades, since it remains difficult to predict a metal's thresholds for deformation without compromising its toughness. This has led many industries to resort to trial and error, which presents its own issues due to the material costs of repeated testing and the amount of time it requires.

One of the hidden factors underlying such problems has to do with the fact that all materials possess atomic structures with a surprising degree of variety. Each atom possesses a different shape from one to the next, and these atoms constantly adjust to fit within the spaces they occupy, giving rise to a phenomenon known as local atomic distortion.

According to Prashant Singh, a scientist at Ames Lab who leads its theoretical design efforts, he and his colleagues, including Gaoyuan Ouyang, an Ames Lab scientist who led the team's experimental efforts, incorporated local atomic distortion into their analysis of materials to determine their strength and potential ductility.

Singh says that current approaches to performing such tests are not very efficient at distinguishing between ductile and brittle systems for small compositional changes. However, his team's method can capture such non-trivial details "because now we have added a quantum mechanical feature in the approach that was missing."

Singh says that the highly efficient new method he and his colleagues have developed can test thousands of individual materials in a very short amount of time. This allows unprecedented predictions to be made about various materials and what combinations of them are worth conducting additional experiments with.
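As a deliberately simplified stand-in for that kind of high-throughput screening (and emphatically not the Ames Lab/Texas A&M method, which relies on quantum-mechanical calculations), the sketch below ranks a few hypothetical refractory alloys by a crude local-distortion proxy based on the spread of approximate atomic radii:

```python
import statistics

# Deliberately simplified stand-in for high-throughput screening: rank a few
# hypothetical refractory alloys by a crude local-distortion proxy (the spread
# of approximate atomic radii). The real Ames Lab / Texas A&M metric is based
# on quantum-mechanical calculations; the radii below are rough values and the
# proxy is for illustration only.

ATOMIC_RADIUS_PM = {"Nb": 146, "Mo": 139, "Ta": 146, "W": 139,
                    "Ti": 147, "Zr": 160, "Hf": 159, "V": 134}

def distortion_proxy(elements):
    radii = [ATOMIC_RADIUS_PM[e] for e in elements]
    return statistics.pstdev(radii) / statistics.mean(radii)

candidates = [("Nb", "Mo", "Ta", "W"), ("Ti", "Zr", "Hf", "Nb"), ("V", "Nb", "Mo", "Ti")]
for alloy in sorted(candidates, key=distortion_proxy):
    print("-".join(alloy), f"local-distortion proxy = {distortion_proxy(alloy):.3f}")
```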

The speed and efficiency of the new process significantly reduce the time required for testing, which has hindered past efforts, and also reduce the strain placed on resources.

Tests were performed on a series of predicted materials known as refractory multi-principal-element alloys, or RMPEAs. These alloys are well suited for use in high-temperature applications, including nuclear reactors, propulsion systems, and a variety of others.

"The predicted ductile metals underwent significant deformation under high stress," Ouyang said of the team's validation tests, "while the brittle metal cracked under similar loads, confirming the robustness of the new quantum mechanical method."

The team describes their innovative new work and its potential use in creating virtually unbreakable metals in a paper titled "A ductility metric for refractory-based multi-principal-element alloys," which was recently published in the journal Acta Materialia.

Micah Hanks is the Editor-in-Chief and Co-Founder of The Debrief. He can be reached by email at micah@thedebrief.org. Follow his work at micahhanks.com and on X: @MicahHanks.
