
EXCLUSIVE Microsoft warns thousands of cloud customers of exposed databases – Reuters

SAN FRANCISCO, Aug 26 (Reuters) - Microsoft (MSFT.O) on Thursday warned thousands of its cloud computing customers, including some of the world's largest companies, that intruders could have the ability to read, change or even delete their main databases, according to a copy of the email and a cyber security researcher.

The vulnerability is in Microsoft Azure's flagship Cosmos DB database. A research team at security company Wiz discovered it was able to access keys that control access to databases held by thousands of companies. Wiz Chief Technology Officer Ami Luttwak is a former chief technology officer at Microsoft's Cloud Security Group.

Because Microsoft cannot change those keys by itself, it emailed the customers Thursday telling them to create new ones. Microsoft agreed to pay Wiz $40,000 for finding the flaw and reporting it, according to an email it sent to Wiz.
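Rotating those keys can be scripted. The sketch below is only a rough illustration of what such a rotation might look like, not Microsoft's own guidance: it assumes the Azure CLI is installed and authenticated, and the account and resource-group names are hypothetical placeholders.

```python
import subprocess

# Hypothetical names; replace with your own Cosmos DB account and resource group.
ACCOUNT = "my-cosmos-account"
RESOURCE_GROUP = "my-resource-group"

def regenerate_key(key_kind: str) -> None:
    """Ask Azure to issue a fresh Cosmos DB key of the given kind
    ("primary", "secondary", "primaryReadonly" or "secondaryReadonly")."""
    subprocess.run(
        ["az", "cosmosdb", "keys", "regenerate",
         "--name", ACCOUNT,
         "--resource-group", RESOURCE_GROUP,
         "--key-kind", key_kind],
        check=True,
    )

if __name__ == "__main__":
    # Rotate the secondary first and repoint applications to it before
    # regenerating the primary, so the rotation causes no downtime.
    for kind in ("secondary", "primary"):
        regenerate_key(kind)
```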

"We fixed this issue immediately to keep our customers safe and protected. We thank the security researchers for working under coordinated vulnerability disclosure," Microsoft told Reuters.

Microsoft's email to customers said there was no evidence the flaw had been exploited. "We have no indication that external entities outside the researcher (Wiz) had access to the primary read-write key," the email said.

"This is the worst cloud vulnerability you can imagine. It is a long-lasting secret," Luttwak told Reuters. "This is the central database of Azure, and we were able to get access to any customer database that we wanted."

Luttwak's team found the problem, dubbed ChaosDB, on Aug. 9 and notified Microsoft Aug. 12, Luttwak said.


The flaw was in a visualization tool called Jupyter Notebook, which has been available for years but was enabled by default in Cosmos beginning in February. After Reuters reported on the flaw, Wiz detailed the issue in a blog post.

Luttwak said even customers who have not been notified by Microsoft could have had their keys swiped by attackers, giving them access until those keys are changed. Microsoft only told customers whose keys were visible this month, when Wiz was working on the issue.

Microsoft told Reuters that "customers who may have been impacted received a notification from us," without elaborating.

The disclosure comes after months of bad security news for Microsoft. The company was breached by the same suspected Russian government hackers that infiltrated SolarWinds, who stole Microsoft source code. Then a wide range of hackers broke into Exchange email servers while a patch was being developed.

A recent fix for a printer flaw that allowed computer takeovers had to be redone repeatedly. Another Exchange flaw last week prompted an urgent U.S. government warning that customers need to install patches issued months ago because ransomware gangs are now exploiting it.

Problems with Azure are especially troubling, because Microsoft and outside security experts have been pushing companies to abandon most of their own infrastructure and rely on the cloud for more security.

But though cloud attacks are more rare, they can be more devastating when they occur. What's more, some are never publicized.

A federally contracted research lab tracks all known security flaws in software and rates them by severity. But there is no equivalent system for holes in cloud architecture, so many critical vulnerabilities remain undisclosed to users, Luttwak said.

Reporting by Joseph Menn; Editing by William Mallard

Our Standards: The Thomson Reuters Trust Principles.

Read the rest here:
EXCLUSIVE Microsoft warns thousands of cloud customers of exposed databases - Reuters

Read More..

Monday: Hardware & consumption boom, Bitcoin theft, cloud & T-Mobile gaps – Market Research Telecast

Processors and graphics cards continue to sell well, but in the context of the energy transition and the call for more sustainability, electric cars, new house insulation and organic shoes are also in demand. But does it all have to be new? It is also environmentally friendly to keep using your existing belongings instead of replacing them. Here is a brief overview of the most important news.

Although the second quarter is traditionally rather weak, the three big chip manufacturers Intel, Nvidia and AMD have further increased their sales figures. Despite the chip shortage, sales of CPUs and graphics cards continue to rise. Intel held on to its position as market leader, but Nvidia increased its market share in graphics cards a little.

This consumer culture is also evident in other areas. A survey on personal participation in protecting future human habitats shows what the respondents bought: new electric cars, new house insulation, new organic shoes, new e-bikes, new bamboo straws, new zinc sheet watering cans. What is missing is the downside, the garbage left behind. The Missing Link is about overconsumption and false consumption promises: don't buy an electric car!

Not bought, but allegedly stolen with malware: two years ago, British youngsters are said to have made off with bitcoins. With a civil action, an American now wants to recover these 16 bitcoins. At the time of the theft, the two alleged perpetrators were still minors and lived with their parents. Having lost the 16 bitcoins, the victim is also suing the parents of the alleged thieves.

Microsoft's cloud service Azure was not infested with malware, but it apparently contained a security breach through which unauthorized persons could gain full access to customers' cloud databases. Microsoft says it has since closed the gap, but affected customers should take action themselves to prevent unauthorized access. After the cloud database incident, Microsoft has therefore informed its Azure customers about the serious gap.

In contrast to the Azure vulnerability, which has had no known consequences so far, the most recent break-in to the servers of T-Mobile US saw data on over 50 million customers stolen. The system made it easy for him, the hacker himself explains in a letter to the press: cracking the defense mechanisms of the US telecom subsidiary cost him little effort. The hacker exploited a devastating security hole for the data breach at T-Mobile US.

A devastating development is also becoming apparent in the coronavirus pandemic, because the number of Covid-19 patients treated in intensive care units nationwide has risen above 1,000 for the first time in the fourth corona wave. In the DIVI register's daily report on Sunday, 1,008 Covid-19 patients were reported in intensive care, 485 of whom had to be ventilated. The low was 354 on July 22nd. Since then, occupancy has increased again, so that the number of Covid-19 patients in intensive care units is now back above 1,000.


(fds)


Read the original:
Monday: Hardware & consumption boom, Bitcoin theft, cloud & T-Mobile gaps - Market Research Telecast

Read More..

Rethinking Your Tool Chain When Moving Workloads to the Cloud – Virtual-Strategy Magazine

Software-driven IT organizations generally rely on a tool chain made up of commercial and home-grown solutions to develop, deploy, maintain and manage the applications and OSes that their business depends on. Most IT shops have preferred tools for needs like application monitoring, data protection, release management or provisioning and deprovisioning resources. But are those tools always the best options?

While tool chains do evolve over time, it's rare for IT organizations to conduct a full, top-to-bottom review of the tools they are comfortable using with an eye toward optimization or new capabilities. One motivator is when companies are considering moving workloads from the data center to the cloud. The inherent differences in how applications are developed and managed for on-premise vs. cloud environments is a strong reason to reassess whether the current tools in your arsenal are the best alternatives available or, just as important, whether they're well-suited to a more cloud-centric software lifecycle.

When it comes to reevaluating your tool chain, it helps to have a process. Here's one approach:

It's important to start with a full audit of your current stack, including areas such as:

Obviously, it's important to assess how well each product meets your current needs as they stand today. (Are there capabilities you wish it had or weaknesses you've become accustomed to working around?) Then consider how those needs will change as workloads move to the cloud. A good first question to ask is whether the tool is still supported by the vendor. Given how infrequently IT teams switch tools, there's a not insignificant likelihood that one or more of your tools has become an orphan. Second, does the license agreement for the tool accommodate or restrict its use in the cloud? For instance, some tools are licensed to a specific physical server and some vendors require their hardware to be owned by the same entity that holds the license. Both of these scenarios are problematic for cloud-based deployments. Third, does moving to a cloud-based tool open up new possibilities that you want to take advantage of? Removing the constraints of on-premise solutions and gaining capabilities like nearly unlimited compute and storage, dynamic workloads and multiple regions around the world can provide much needed flexibility. But the advantage of moving to a cloud-based tool (replacing, say, an on-premise application log reporting solution with Azure Log Manager) needs to be balanced against the added management the new solution requires, as well as the need to retrain teams.

There are also non-technical factors to consider when looking at new tools. Do teams enjoy using the tool? Does it make them more productive (or conversely, slow them down)? Does it meet the business needs of the organization? How much work will adopting a new tool take and will it be worth it in the long run? While these may not be the most important considerations, they shouldn't be overlooked.

There is almost always going to be an alternative to any individual tool and, potentially, one tool that can do the work of several, making it possible to consolidate. One way to get a sense of what's available is to start by asking other teams in your organization what they use in the destination cloud. There are often cloud-based tools (offered by cloud vendors or sold as separate SaaS products) that offer pay-as-you-go licensing and can be easier to scale up or down, move workloads around, and expand to other regions. Today, some legacy vendors even offer consumption-based options to better match up against cloud-based competitors, while others stick with more traditional perpetual licenses. Last, consider if a new tool will give IT teams the opportunity and motivation to expand their skill set. Offering the chance to learn and use new products could actually increase job satisfaction and improve your organization's ability to retain engineering talent.

Before you pull the trigger on a new solution it often pays to check in with the existing vendor. To keep your business they may offer more generous or flexible terms. Of course, vendors that see the cloud as a threat are probably going to be less inclined to give you a break on licensing. But even if the conversation doesn't lead to new or better terms, talking to your vendors on a regular basis can provide insights into how they see their customers and the market.

Once you've completed the previous steps, you'll have a good idea of the tools you're likely to keep and those you'd like to upgrade. At that point, it's important to create a plan for adopting each new tool. Start by separating products that need to be replaced soon from those where more research is required; it also helps to compile any other useful information learned during the process so that the larger IT team can access it. You'll want to assess whether teams will need training, whether internal documentation or playbooks need to be updated, and how new tools will plug into existing authorization/authentication solutions. Finally, you will also need a migration plan for each tool that details how and when the organization will move from the old product to the new one, what scripts will need to be rewritten, and what to do with historical data like log files from the old product.

While cloud-based tools offer meaningful benefits in terms of flexibility, cost savings and ease of scalability, they may not be the best solution for every organization. The only way to be sure is to do the kind of analysis outlined above. For companies that have already made the decision to move workloads to the cloud, the potential long-term benefits of adopting new solutions are worth the effort.

Skytap

More:
Rethinking Your Tool Chain When Moving Workloads to the Cloud - Virtual-Strategy Magazine

Read More..

GraphQL’s Emerging Role in Modernization of Monolithic Applications – IT Jungle

August 30, 2021Alex Woodie

Every now and then, a technology emerges that lifts up everything around it. GraphQL has the potential to be a technology like that, and that's good news for customers running older, established applications, such as those that run on IBM iron.

IBM's mainframe and its midrange IBM i server often are maligned as old, washed-up, legacy platforms, but people who say that are missing a key distinction: it's usually not the server that's old. In most cases, it's the application that was first deployed in the Clinton Administration (or earlier) that is the problem.

Companies typically have many reasons for why they haven't modernized applications or migrated to something newer. For starters, the applications just work. And, despite the changing technological winds around us, the fact that these applications continue to do what they were originally designed to do (process transactions reliably and securely, day in and day out, often for years on end, without much maintenance) is not a trivial thing, nor is it something to fool around with. "If it ain't broke, don't fix it" probably best encapsulates this attitude.

You were supposed to abandon these monolithic applications in favor of client-server architectures 30 years ago. In the late 1990s, you were supposed to migrate your RPG and COBOL code to Java, per IBM's request. The explosion of the World Wide Web in the early 2000s provided another logical architecture to build to, followed by the growth of smart devices after the iPhone's appearance in 2007. Today, everybody aspires to run their applications as containerized microservices in the cloud, which surely is the pinnacle of digital existence.

After all these boundary-shaking technological inflection points, it's a wonder that mainframes and IBM i servers even exist at this point. But of course, despite all the best-laid plans to accelerate their demise, they do (as you, dear IT Jungle reader, know all too well).

So what does all this have to do with GraphQL? We first wrote about the technology in March 2020, just before the COVID pandemic hit (so perhaps you missed it).

GraphQL, in short, is a query language and runtime that was originally created at Facebook in 2012 and open sourced in 2015. Facebook developers, tired of maintaining all of the repetitive and brittle REST code needed to pull data out of backend servers to feed to mobile clients, desired an abstraction layer that could insulate them from REST and accelerate development. The result was GraphQL, which continues to serve data to Facebook's mobile clients to this day.

Since it was open sourced, GraphQL adoption has grown exponentially, if downloads of the open source technology mean anything. Geoff Schmidt, the chief executive officer and co-founder of GraphQL backer Apollo, says 30 percent of the Fortune 500 have adopted Apollo tools to manage their growing GraphQL estates.

Following Apollo's recent Series D funding round, which netted the San Francisco company $130 million at a $1.5 billion valuation, Schmidt is quite excited about the potential for GraphQL to alleviate technical burdens for enterprises with lots of systems to integrate, including monolithic applications running on mainframes.

"Frankly, there are great use cases around mainframe systems or COBOL systems," Schmidt says. "You just slide this graph layer in between the mainframe and the mobile app, and you don't have to change anything. You just get that layer in there, start moving the traffic over to GraphQL and route it all through the graph."

Once the GraphQL layer is inserted between the backend systems and the front-end interfaces, front-end developers have much more freedom to develop compelling app experiences without going to the backend developer to tweak a REST API. In addition to accelerating developers' productivity, it also insulates the backend system from changes.
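As a rough illustration of the pattern Schmidt describes, the sketch below uses Python and the Ariadne GraphQL library purely as an example; the schema, field names and REST endpoint URL are invented for the purpose. It puts a small GraphQL schema in front of an existing REST service so that front-end clients query the graph rather than the backend directly.

```python
import httpx
from ariadne import QueryType, gql, make_executable_schema
from ariadne.asgi import GraphQL

# A schema the front-end teams own; the legacy backend stays untouched.
type_defs = gql("""
    type Order {
        id: ID!
        total: Float!
    }

    type Customer {
        id: ID!
        name: String!
        orders: [Order!]!
    }

    type Query {
        customer(id: ID!): Customer
    }
""")

query = QueryType()

@query.field("customer")
async def resolve_customer(_, info, id):
    # The graph layer fetches from the legacy REST API (hypothetical URL)
    # and reshapes the payload for the client.
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"https://legacy.example.com/api/customers/{id}")
        resp.raise_for_status()
        data = resp.json()
    return {
        "id": data["id"],
        "name": data["name"],
        "orders": data.get("orders", []),
    }

schema = make_executable_schema(type_defs, query)
app = GraphQL(schema)  # serve with any ASGI server, e.g. uvicorn
```

If the backend is later refactored, only the resolver changes; clients keep issuing the same GraphQL query.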

"Once you put that abstraction layer in place, not only can you combine all that stuff and get it to every platform in a very agile, fast manner," Schmidt tells IT Jungle, "but at the same time, if you want to refactor that, if you have a monolithic backend that you want to move into microservices, or you just want to change how the backend is architected, you can now do that without having to disturb all those clients that exist in the field."

Microservices and Web services that utilize the REST approach are the de facto standard in the industry at the moment. But that could change. Schmidt cites a recent survey that found 86 percent of JavaScript developers ranked GraphQL as a top interest.

"This graph layer makes sense," Schmidt says. "It's at the edge of your data center. It's the abstraction layer that you want to put around all your backend services, whether it's to support front-end teams or to support partners. And partners are even a more powerful use case, because if they need a change to the API, hey, that can be six months or a year."

One of Apollo's customers is Walmart. The Arkansas-based retailer maintains systems for managing transactions in the store and on its ecommerce website. Using GraphQL, Walmart is able to deliver "an incredible 360-degree customer experience," Schmidt says.

"Whether the customer wants to shop in store or they want to shop online, we're giving them the very best possible shopping experience," the CEO says. "The customer is going to describe how they want to shop, not the retailer, and that's what Walmart is able to deliver with a graph that brings together all the mainframes that power their brick-and-mortar stores with all of their cutting-edge ecommerce investment to serve the customer wherever they are."

Walmart, of course, has powered its share of innovation in IT. While some details of the implementation are not available, the fact that the retail giant is adopting GraphQL to address its data and application integration requirements may tell you something about the potential of this technology in an IBM environment, particularly considering the rallying cry from Rochester over the need to build next-gen IBM i apps.

The way Schmidt sees it, GraphQL lets customers think about their businesses as platforms, "as a bunch of capabilities that we can combine to meet customer needs, in any channel, anytime," he says.

"IT leaders who put a graph strategy in place now, maybe even before the business realizes the need for it, they're the ones who are going to have this platform strategy," he continues. "The IT leaders who put that in place are going to be heroes, because whatever the business asks for, they're going to be able to deliver a year faster than the competition."

So You Want To Do Containerized Microservices In the Cloud?

Public Cloud Dreams Becoming A Reality for IBM i Users

In Search Of Next Gen IBM i Apps

Modernization Trumps Migration for IBM i and Mainframe, IDC Says

COVID-19: A Great Time for Application Modernization

How GraphQL Can Improve IBM i APIs

Excerpt from:
GraphQL's Emerging Role in Modernization of Monolithic Applications - IT Jungle

Read More..

Linux is not invulnerable, here are some top Linux malware in 2021 – Technology Zimbabwe

So yesterday I wrote about the latest iteration of Ubuntu 20.04 LTS coming out in my usual glowing terms. I feel like there was nothing amiss in that article; after all, Ubuntu, especially the version in question, is a stellar operating system that is rock solid and has served me well. A few people, however, decided to call me out on my bias and asked me to publicly admit that there is no such thing as an invulnerable operating system under the sun.

So here is me doing exactly that. I think I should repeat that for emphasis: There is no such thing as an invulnerable operating system under the sun. I often say the best way to make your computer impenetrable is to shut it down and pulverise it thoroughly with a hammer. But even then, who knows? I have seen FBI nerds in movies pull information off a single surviving chip.

What makes Linux better than Windows in my opinion is not just the open-source code that is reviewed by scores of experts around the world. It's the philosophy behind it all. In Windows, ignorant users can click around and blunder their way to productivity. The system is meant to be easy and fits many use cases by default. All you need to do is boot up, enter your password or just stare at your computer to log in, get to the desktop and click on Chrome and you are watching cat videos.

In Linux, things can be, but are usually not, that easy. While you can use Windows without knowing what the registry is, in Linux you have to be hands-on with your configuration. Every action you take has to be deliberate, otherwise you risk breaking things. Often you have to set up your desktop the way you want it, Chrome is not installed by default and sometimes you cannot even play videos until you install the right codecs. Linux forces you to learn and pay attention to what you are doing. You are often forced to learn why you are doing things in addition to how to do them.

Now that we have put the explanations out of the way, it's time to look at some of the top Linux malware in 2021. One thing to note is that cloud-centric malware dominates on Linux. There are probably a couple of reasons for this, including:

Below are the top malware in Linux according to Trend Micro

One thing to note from the above is that, unlike in Windows, Linux malware is often heavily customised by attackers to target a specific vulnerability, and often each Linux system is unique. This means that it's rare to see one specific piece of malware dominate; instead you have families of related malware.

Again I am biased but I believe identifying and thwarting an attack in Linux is pretty easy. You have tools like UFW (or better yet iptables) to lock down your internet connection in ways that are unimaginable in Windows. For example, whenever I set up a new cloud server I simply block all non-Zimbabwean IPs by default. That alone removes 99.99% of the threats from the table.
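For what it is worth, that kind of allowlist can be scripted rather than typed by hand. The sketch below is a rough example only: the CIDR ranges are placeholders, not a real list of Zimbabwean networks, and it assumes ufw is installed and the script runs with root privileges.

```python
import subprocess

# Placeholder CIDR blocks; a real setup would load the current country
# allocation list from your regional internet registry of choice.
ALLOWED_CIDRS = ["196.4.64.0/19", "41.57.64.0/20"]

def run(*cmd: str) -> None:
    # Run a ufw command and fail loudly if it does not succeed.
    subprocess.run(cmd, check=True)

def lock_down(ssh_port: int = 22) -> None:
    # Deny everything inbound by default, then allow SSH only from the
    # approved address ranges.
    run("ufw", "default", "deny", "incoming")
    run("ufw", "default", "allow", "outgoing")
    for cidr in ALLOWED_CIDRS:
        run("ufw", "allow", "from", cidr, "to", "any",
            "port", str(ssh_port), "proto", "tcp")
    run("ufw", "--force", "enable")

if __name__ == "__main__":
    lock_down()
```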

Also, make it a habit to uninstall software you don't need. Better still, when installing, make sure you only install the base operating system with as little stuff as possible. You can then add just the stuff you need. Why install Apache on a Minecraft or mail server? Do you really need FTP? If not, stop and disable the service via SSH.

Above all. Always check the logs. Always. Check resource usage too and see if it tallies with what you expect.

Follow this link:
Linux is not invulnerable, here are some top Linux malware in 2021 - Technology Zimbabwe

Read More..

Here’s Why Nvidia Will Surpass Apple’s Valuation In 5 Years – Forbes


Nvidia has a market cap of roughly $550 billion compared to Apple's nearly $2.5 trillion. We believe Nvidia can surpass Apple by capitalizing on the artificial intelligence economy, which will add an estimated $15 trillion to GDP. This is compared to the mobile economy that brought us the majority of the gains in Apple, Google and Facebook, and contributes $4.4 trillion to GDP. For comparison purposes, AI contributed $2 trillion to GDP as of 2018.

While mobile was primarily consumer, and some enterprise with bring-your-own-device, artificial intelligence will touch every aspect of both industry and commerce, including consumer, enterprise, and small-to-medium sized businesses, and will do so by disrupting every vertical similar to cloud. To be more specific, AI will be similar to cloud by blazing a path that is defined by lowering costs and increasing productivity.

I have an impeccable record on Nvidia, including when I stated the sell-off in 2018 was overblown and missing the bigger picture, as Nvidia has two impenetrable moats: developer adoption and the GPU-powered cloud. This was when headlines were focused exclusively on Nvidia's gaming segment and GPU sales for crypto mining.

Although Nvidia's stock is doing very well this year, this has been a fairly contrarian stance in the past. Not only was Nvidia wearing the dunce hat in 2018, but in August of 2019, GPU data center revenue was flat to declining sequentially for three quarters, and in fiscal Q3 2020 also declined YoY (calendar Q4 2019). We established and defended our thesis on the data center as Nvidia clawed its way back in price through China tensions, supply shortages, threats of custom silicon from Big Tech, cyclical capex spending, and uncertainty over whether the Arm acquisition will be approved.

Suffice to say, three years later and Nvidia is no longer a contrarian stock as it once was during the crypto bust. Yet, the long-term durability is still being debated - it's a semiconductor company after all - best to stick with software, right? Right? Not to mention, some institutions are still holding out for Intel. Imagine being the tech analyst at those funds (if they're still employed!).

Before we review what will drive Nvidias revenue in the near-term, it bears repeating the thesis we published in November of 2018:

Nvidia is already the universal platform for development, but this won't become obvious until innovation in artificial intelligence matures. Developers are programming the future of artificial intelligence applications on Nvidia because GPUs are easier and more flexible than customized TPU chips from Google or FPGA chips used by Microsoft [from Xilinx]. Meanwhile, Intel's CPU chips will struggle to compete as artificial intelligence applications and machine learning inferencing move to the cloud. Intel is trying to catch up but Nvidia continues to release more powerful GPUs, and cloud providers such as Amazon, Microsoft and Google cannot risk losing the competitive advantage that comes with Nvidia's technology.

The Turing T4 GPU from Nvidia should start to show up in earnings soon, and the real-time ray-tracing RTX chips will keep gaming revenue strong when there is more adoption in 6-12 months. Nvidia is a company that has reported big earnings beats, with average upside potential of 33.35 percent to estimates in the last four quarters. Data center revenue stands at 24% and is rapidly growing. When artificial intelligence matures, you can expect data center revenue to be Nvidia's top revenue segment. Despite the corrections we've seen in the technology sector, and with Nvidia stock specifically, investors who remain patient will have a sizeable return in the future.

Notably, the stock is up 335% since my thesis was first published, a notable amount for a mega cap stock and nearly 2-3X more than the return of any FAAMG stock in the same period. This is important because I expect this trend to continue until Nvidia has surpassed all FAAMG valuations.


Below, we discuss the Ampere architecture and A100 GPUs, the Enterprise AI Suite and an update on the Arm acquisition. These are some of the near-term stepping stones that will help sustain Nvidia's price in the coming year. We are also bullish on the Metaverse with Nvidia specifically but will leave that for a separate analysis in the coming month.

"Nvidia's acceleration may happen one or two years earlier as they are the core piece in the stack that is required for the computing power for the front-runners referenced in the graph above. There is a chance Nvidia reflects data center growth as soon as 2020-2021." - published August 2019, Premium I/O Fund

Last year, Nvidia released the Ampere architecture and A100 GPU as an upgrade from the Volta architecture. The A100 GPUs are able to unify training and inference on a single chip, whereas in the past Nvidia's GPUs were mainly used for training. This allows Nvidia a competitive advantage by offering both training and inferencing. The result is a 20x performance boost from a multi-instance GPU that allows many GPUs to look like one GPU. The A100 offers the largest leap in performance to date over the past 8 generations.

At the onset, the A100 was deployed by the world's leading cloud service providers and system builders, including Alibaba Cloud, Amazon Web Services, Baidu Cloud, Dell Technologies, Google Cloud Platform, HPE and Microsoft Azure, among others. It is also getting adopted by several supercomputing centers, including the National Energy Research Scientific Computing Center, the Jülich Supercomputing Centre in Germany and Argonne National Laboratory.

One year later and the Ampere architecture is becoming one of the best-selling GPU architectures in the company's history. This quarter, Microsoft Azure announced the availability of the Azure ND A100 v4 cloud GPU instance, which is powered by NVIDIA A100 Tensor Core GPUs. The company claims it to be the fastest public cloud supercomputer. The news follows general-availability launches by Amazon Web Services and Google Cloud in prior quarters. The company has been extending its leadership in supercomputing. The latest Top500 list shows that Nvidia powers 342 of the world's top 500 supercomputers, including 70 percent of all new systems and eight of the top 10. This is a remarkable update from the company.

Ampere architecture-powered laptop demand has also been solid as OEMs adopted Ampere architecture GPUs in a record number of designs. It also features the third-generation Max-Q power optimization technology enabling ultrathin designs. The Ampere architecture product cycle for gaming has also been robust, driven by RTX's real-time ray tracing.

In the area of GPU acceleration, Nvidia is working with Apache Spark to bring GPU acceleration to Spark 3.0 running on Databricks. Apache Spark is the industry's largest open source data analytics platform. The results are a 7x performance improvement and 90 percent cost savings in an initial test. Databricks and Google Cloud Dataproc are the first to offer Spark with GPU acceleration, which also opens up Nvidia for data analytics.
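As a sketch of what GPU-accelerated Spark looks like from the user's side (an illustration only: it assumes the RAPIDS Accelerator plugin jar is on the classpath and a CUDA-capable GPU is available, and the data path is made up), the query code itself does not change; the plugin is switched on through configuration:

```python
from pyspark.sql import SparkSession

# Build a Spark session with the RAPIDS Accelerator enabled so that
# supported SQL operations run on the GPU instead of the CPU.
spark = (
    SparkSession.builder
    .appName("gpu-accelerated-analytics")
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")    # RAPIDS plugin class
    .config("spark.rapids.sql.enabled", "true")               # turn GPU SQL on
    .config("spark.executor.resource.gpu.amount", "1")        # one GPU per executor
    .getOrCreate()
)

# The query is ordinary Spark SQL; the plugin decides what runs on the GPU.
df = spark.read.parquet("s3://example-bucket/transactions/")  # hypothetical path
df.groupBy("store_id").sum("amount").show()
```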

Demand has been strong for the company's products and has exceeded supply. In the earnings call, Jensen Huang mentioned, "And so I would expect that we will see a supply-constrained environment for the vast majority of next year is my guess at the moment." However, he assured investors that the company has secured enough supply to meet its growth plans for the second half of this year, saying, "We expect to be able to achieve our Company's growth plans for next year."

Virtualization allows companies to use software to expand the capabilities of physical servers onto a virtual system. VMware is popular with IT departments as the platform allows companies to run many virtual machines on one server, and networks can be virtualized to allow applications to function independently from hardware or to share data between computers. The storage, network and compute offered through full-scale virtual machines and Kubernetes instances for cloud-hosted applications comes with third-party support, making VMware an unbeatable solution for enterprises.

Therefore, it makes sense Nvidia would choose VMware's vSphere as a partner on the Enterprise AI Suite, which is a cloud-native suite that plugs into VMware's existing footprint to help scale AI applications and workloads. As pointed out in the write-up by IDC, many IT organizations struggle to support AI workloads, which do not scale well because deep learning training and AI inferencing are very data-hungry and require more memory bandwidth than standard infrastructure can provide. CPUs are also not as efficient as GPUs, which offer parallel processing. Although developers and data scientists can leverage the public cloud for the more performance-demanding instances, there are latency issues depending on where the data repositories are stored (typically on-premise).

The result is that IT organizations and developers can deploy virtual machines with accelerated AI computing where previously this was only done with bare metal servers. This allows departments to scale and pay only for workloads that are accelerated, with Nvidia capitalizing on licensing and support costs. Nvidia's AI Enterprise targets customers who are starting out with new enterprise applications or deploying more enterprise applications and require a GPU. As enterprise customers of the Enterprise AI Suite mature and require larger training workloads, it's likely they will upgrade to the GPU-powered cloud.

Subscription licenses start at $2,000 per CPU socket for one year and include standard business support five days a week. The software will also be available with a perpetual license of $3,595, but support is extra. You also have the option to get 24x7 support for an additional charge. According to IDC, companies are on track to spend a combined nearly $342 billion on AI software, hardware, and services like AI Enterprise in 2021. So, the market is huge and Nvidia is expecting a significant business.

Nvidia also announced Base Command, which is a development hub to move AI projects from prototype to production. Fleet Command is a managed edge AI software SaaS offering that allows companies to deploy AI applications from a central location with real-time processing at the edge. Companies like Everseen use these products to help retailers manage inventory and for supply chain automation.

Over the past year, there have been some quarters where data center revenue exceeded gaming, while in the most recent quarter, the two segments are inching closer with gaming revenue at $3.06 billion, up 85 percent year-over-year, and data center revenue at $2.37 billion, up 35 percent year-over-year.

It was good timing for Jensen Huang to appear in a fully rendered kitchen for the GTC keynote, as the professional visualization segment was up 156% year-over-year and 40% quarter-over-quarter. Not surprisingly, automotive was down 1% sequentially although up 37% year-over-year.

Gross margins were 64.8% when compared to 58.8% for the same period last year, which per management reflected the absence of certain Mellanox acquisition-related costs. Adjusted gross margins were 66.7%, up 70 basis points, and net income increased 282% YoY to $2.4 billion or $0.94 per share compared to $0.25 for the same period last year.

Adjusted net income increased by 92% YoY to $2.6 billion or $1.04 per share compared to $0.55 for the same period last year.

The company had a record cash flow from operations of $2.7 billion and ended the quarter with cash and marketable securities of $19.7 billion and $12 billion in debt. It returned $100 million to shareholders in the form of dividends. It also completed the announced four-for-one split of its common stock.

The company is guiding for third quarter fiscal revenue of $6.8 billion with adjusted margins of 67%. This represents growth of 44%, with the lion's share of sequential growth driven by the data center.

We've covered the Arm acquisition extensively in a full-length analysis you can find here on why the Nvidia-Arm acquisition should be approved. In the analysis, we point towards why we are positive on the deal, as despite Arm's extremely valuable IP, the company makes very little revenue for powering 90% of the world's mobile processors/smartphones (therefore, it needs to be a strategic target). We also argue that the idea of Arm being neutral in a competitive industry is idealistic, and that to block innovation at its most crucial point would be counterproductive for the governments reviewing the deal. We also discuss how the Arm acquisition will help facilitate Nvidia's move towards edge devices.

In the recent earnings call, CFO Colette Kress reiterated that the Arm deal is a positive for both companies and their customers, as Nvidia can help expand Arm's IP into new markets like the data center and IoT. Specifically, the CFO stated, "We are confident in the deal and that regulators should recognize the benefits of the acquisition to Arm, its licensees, and the industry."

The conclusion to my analysis is the same as the introduction, which is that I believe Nvidia is capable of out-performing all five FAAMG stocks and will surpass even Apple's valuation in the next five years.

As stated in the article, Beth Kindig and I/O Fund currently own shares of NVDA. This is not financial advice. Please consult with your financial advisor in regards to any stocks you buy.

Please note: The I/O Fund conducts research and draws conclusions for the Fund's positions. We then share that information with our readers. This is not a guarantee of a stock's performance. Please consult your personal financial advisor before buying any stock in the companies mentioned in this analysis.

Follow me on Twitter. Check out my website or some of my other work here.

The rest is here:
Here's Why Nvidia Will Surpass Apple's Valuation In 5 Years - Forbes

Read More..

Pure breaches the hyperscaler disk wall Blocks and Files – Blocks and Files

Pure Storage's revenues grew 23 per cent year-over-year in its latest quarter and it expects a growth acceleration next quarter, with an eight-figure hyperscaler deal and the COVID recovery gathering pace.

Revenues were $498.8 million in the quarter ended 1 August 2021, with a loss of $45.3 million, compared to the year-ago loss of $65 million. It was the highest second-quarter revenue in Pure's history, as product and subscription sales both accelerated.

Chairman and CEO Charles Giancarlo sounded ecstatic in his prepared remarks: "Pure had an outstanding Q2! As a growing, share-taking company, we expect every quarter to be record breaking, but this quarter was extraordinary. Sales, revenue and profitability were well above expectations [and] we had the highest Q2 operating profit in our history."

He was keen to tell investors that Pure had made the right choices: "We predicted that Pure's growth would accelerate as businesses adjusted to the COVID environment. We believe that our growth will be even stronger as businesses return to an in-office environment. We estimated that this would start this past Q2, and we are obviously very pleased with the results."

He believes that the current environment "enables us to return to our historical double-digit growth rates, with increasing profitability", but he didn't forecast when Pure would make a profit. He expects "continued improvements quarter by quarter in our operating profit margins."

On a year-over-year basis:

Pure gained 380 new customers in the quarter, ten per cent year-over-year growth, taking its total customer count, we calculate, to 9647. Sales to large enterprises were more than 50 per cent of sales with the top ten customers spending more than $100 million in the quarter.

The outlook for Q3 is $530 million, 29 per cent higher than a year ago. This guidance includes revenue Pure expects to recognise in connection with a more than $10 million sale of the QLC flash-based FlashArray//C to one of the top ten hyper-scalers. Full fiscal 2022 year revenue is forecast to be $2.04 billion, up 21 per cent.

Pure feels comfortable it can grow revenues at more than 20 per cent for the foreseeable future.

In the earnings call Giancarlo said that the hyperscaler FlashArray//C sale was won against traditional magnetic disk "based on our high performance, small space and power footprint and superior total cost of ownership".

Asked about the prospects of this win, he said: "It's a part of their overall operations. We do feel that this is sustainable, both in the sense of continuing with this customer, as well as we think it's the beginning of seeing other similarly situated hyperscale customers starting to look at flash as a real alternative."

"As you may know, most of the hyperscalers, the vast majority of what they store, they store on disk. They may have a little bit of flash in their servers, but for the most part, all storage is on disk. And we think this is the beginning of breaking that structure. We finally have the kind of price performance that can really compete within the disk market."

He added: "The last bastion of mostly disk data centre right now is actually in the cloud. And so it represents a great opportunity for us."

CTO Rob Lee (see below) said: "FlashArray//C [is] very price competitive, up to 30 per cent price advantage in some cases. Price is one element of the equation but all of the other attributes and benefits we are able to bring from flash, such as the performance, such as power, cooling savings, footprint savings, those are all very meaningful across the board. But at the hyperscale, they become super, super meaningful, right? And so, as we look at, for example, this customer, FlashArray//C was the only product that can meet their needs, without them having to go build new data centres."

Giancarlo commented in the outlook: "We are very pleased with what we're seeing in terms of the Q3 outlook and the idea that we're driving almost 30 per cent growth next year, with the opportunity we highlighted on FlashArray//C."

This hyperscaler FlashArray//C sale was not the first to a hyperscaler customer, just a very big single sale. Giancarlo also said it is something that's "easily transferable to other hyperscalers."

Rob Lee becomes Pure's CTO, with the previous incumbent, co-founder John Colgrove, becoming Chief Visionary Officer, a full-time role, with Lee reporting to him. According to the company's leadership web page, Colgrove is responsible for "developing and executing Pure's global technical strategy" while Lee, in an apparently overlapping role, looks at "global technology strategy, and identifying new innovation (sic) and market expansion opportunities for Pure". That sounds like two people mostly doing the same thing.

Read the original here:
Pure breaches the hyperscaler disk wall Blocks and Files - Blocks and Files

Read More..

Quantum computers could read all your encrypted data. This ‘quantum-safe’ VPN aims to stop that – ZDNet

The trial successfully demonstrated, according to Verizon, that it is possible to replace current security processes with protocols that are quantum-proof.

To protect our private communications from future attacks by quantum computers, Verizon is trialing the use of next-generation cryptography keys to protect the virtual private networks (VPNs) that are used every day by companies around the world to prevent hacking.

Verizon implemented what it describes as a "quantum-safe" VPN between one of the company's labs in London in the UK and a US-based center in Ashburn, Virginia, using encryption keys that were generated thanks to post-quantum cryptography methods, meaning that they are robust enough to withstand attacks from a quantum computer.

According to Verizon, the trial successfully demonstrated that it is possible to replace current security processes with protocols that are quantum-proof.

VPNs are a common security tool used to protect connections made over the internet, by creating a private network from a public internet connection. When a user browses the web with a VPN, all of their data is redirected through a specifically configured remote server run by the VPN host, which acts as a filter that encrypts the information.

This means that the user's IP address and any of their online activities, from sending emails to paying bills, come out as gibberish to potential hackers even on insecure networks like public WiFi, where eavesdropping is much easier.

Especially in the last few months, which have seen many employees switching to full-time working from home,VPNs have become an increasingly popular tool to ensure privacy and security on the internet.

The technology, however, is based on cryptography protocols that are not un-hackable. To encrypt data, VPN hosts use encryption keys that are generated by well-established algorithms such as RSA (Rivest-Shamir-Adleman). The difficulty of cracking the key, and therefore of reading the data, is directly linked to the algorithm's ability to create as complicated a key as possible.

In other words, encryption protocols as we know them are essentially a huge math problem for hackers to solve. With existing computers, cracking the equation is extremely difficult, which is why VPNs, for now, are still a secure solution. But quantum computers are expected to bring about huge amounts of extra computing power and with that, the ability to hack any cryptography key in minutes.

"A lot of secure communications rely on algorithms which have been very successful in offering secure cryptography keys for decades," Venkata Josyula, the director of technology at Verizon, tells ZDNet. "But there is enough research out there saying that these can be broken when there is a quantum computer available at a certain capacity. When that is available, you want to be protecting your entire VPN infrastructure."

One approach that researchers are working on consists of developing algorithms that can generate keys that are too difficult to hack, even with a quantum computer. This area of research is known as post-quantum cryptography, and is particularly sought after by governments around the world.

In the US, for example, the National Institute of Standards and Technology (NIST) launched a global research effort in 2016 calling on researchers to submit ideas for algorithms that would be less susceptible to a quantum attack. A few months ago, the organization selected a group of 15 algorithms that showed the most promise.

"NIST is leading a standardization process, but we didn't want to wait for that to be complete because getting cryptography to change across the globe is a pretty daunting task," says Josyula. "It could take 10 or even 20 years, so we wanted to get into this early to figure out the implications."

Verizon has significant amounts of VPN infrastructure and the company sells VPN products, which is why the team started investigating how to start enabling post-quantum cryptography right now and in existing services, Josyula adds.

One of the 15 algorithms identified by NIST, called Saber, was selected for the test. Saber generated quantum-safe cryptography keys that were delivered to the endpoints in London and Ashburn of a typical IPsec VPN through an extra layer of infrastructure, which was provided by a third-party vendor.
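Conceptually, algorithms like Saber are key-encapsulation mechanisms: one side publishes a public key, the other encapsulates a fresh shared secret against it, and both ends then use that secret to key the tunnel. The sketch below uses the open-source liboqs Python bindings purely as an illustration; Verizon has not said what software it used, and the exact algorithm identifier depends on the liboqs build.

```python
import oqs

# Algorithm name is build-dependent; "Kyber512" is another common choice.
KEM_ALG = "Saber-KEM"

# One VPN endpoint generates a post-quantum key pair...
with oqs.KeyEncapsulation(KEM_ALG) as receiver:
    public_key = receiver.generate_keypair()

    # ...the peer encapsulates a shared secret against that public key...
    with oqs.KeyEncapsulation(KEM_ALG) as sender:
        ciphertext, secret_at_sender = sender.encap_secret(public_key)

    # ...and the receiver recovers the same secret from the ciphertext.
    secret_at_receiver = receiver.decap_secret(ciphertext)

assert secret_at_sender == secret_at_receiver
# In an IPsec VPN, this shared secret would feed the key schedule of the
# symmetric cipher that actually protects the tunnel traffic.
```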

Whether Saber makes it to the final rounds of NIST's standardization process, in this case, doesn't matter, explains Josyula. "We tried Saber here, but we will be trying others. We are able to switch from one algorithm to the other. We want to have that flexibility, to be able to adapt in line with the process of standardization."

In other words, Verizon's test has shown that it is possible to implement post-quantum cryptography candidates on infrastructure links now, with the ability to migrate as needed between different candidates for quantum-proof algorithms.

This is important because, although a large-scale quantum computer could be more than a decade away, there is still a chance that the data that is currently encrypted with existing cryptography protocols is at risk.

The threat is known as "harvest now, decrypt later" and refers to the possibility that hackers could collect huge amounts of encrypted data and sit on it while they wait for a quantum computer to come along that could read all the information.

"If it's your Amazon shopping cart, you may not care if someone gets to see it in ten years," says Josyula. "But you can extend this to your bank account, personal number, and all the way to government secrets. It's about how far into the future you see value for the data that you own and some of these have very long lifetimes."

For this type of data, it is important to start thinking about long-term security now, which includes the risk posed by quantum computers.

A quantum-safe VPN could be a good start even though, as Josyula explains, many elements still need to be smoothed out. For example, Verizon still relied on standard mechanisms in its trial to deliver quantum-proof keys to the VPN end-points. This might be a sticking point, if it turns out that this phase of the process is not invulnerable to quantum attack.

The idea, however, is to take proactive steps to prepare, instead of waiting for the worst-case scenario to happen. Connecting London to Ashburn was a first step, and Verizon is now looking at extending its quantum-safe VPN to other locations.

Continue reading here:
Quantum computers could read all your encrypted data. This 'quantum-safe' VPN aims to stop that - ZDNet

Read More..

Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium -…

TOKYO--(BUSINESS WIRE)--Sumitomo Corporation Quantum Transformation (QX) Project will present at the IEEE Quantum AI Sustainability Symposium on September 1st, 2021. The QX Project was launched in March 2021 by Sumitomo Corporation, a global Fortune 500 trading and investment company, with the intent to provide new value to society by applying quantum computing technology to the wide-ranging industries in which the company operates. This is the world's first project that defines Quantum Transformation (QX) as the next social paradigm shift, beyond Digital Transformation (DX).

The founder and head of the QX Project, Masayoshi Terabe, will present about the vision and activities of QX at the IEEE Quantum AI Sustainability Symposium. The organizer IEEE is the world's largest technical professional organization for the advancement of technology. In this talk, he will show how quantum computing can contribute to sustainability. For example, he will introduce the Quantum Sky project, which is a pilot experiment for developing flight routes for numerous air mobility vehicles by quantum computing. Also you can find other concepts like Quantum Smart City and Quantum Energy Management.

The objective of the QX Project is to create new value for society by combining the vast business fields of Sumitomo Corporation, spanning more than 900 consolidated companies from underground to space, with an extensive number of business partners around the world.

A broad and deep ecosystem is necessary to achieve QX. This is because combining a wide range of technologies, not limited to quantum, and working with a crossover of various industries, is essential. If you are interested in this project, let's take on the challenge of creating a new business, and a new society, together!

[information]

[Appendix]

The rest is here:
Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium -...

Read More..

Life, the universe and everything Physics seeks the future – The Economist

Aug 25th 2021

A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus peoples ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.


In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.

This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics's eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.

String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model, the further particles predicted by Susy, which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or sparticle, and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.

But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.

They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.

The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies favourite for unifying relativity and the Standard Model is something called entropic gravity.

Entropy is a measure of a systems disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.

In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
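
The textbook formulas behind that calculation (standard results, not spelled out in the article) are Hawking's temperature and the Bekenstein-Hawking entropy for a non-rotating black hole of mass M and horizon area A:

\[
T_{\mathrm{H}} = \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}},
\qquad
S_{\mathrm{BH}} = \frac{k_{\mathrm{B}} c^{3} A}{4 G \hbar},
\qquad
A = \frac{16\pi G^{2} M^{2}}{c^{4}}.
\]

Knowing the mass thus fixes the temperature, the horizon area and hence the entropy.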

That relationship was discovered in 2010 by Erik Verlinde of Amsterdam University. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.
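
Schematically (a sketch of the standard presentation, rather than the article's own notation), the claim is that imposing the thermodynamic relation between heat, temperature and entropy on small patches of spacetime reproduces Einstein's field equations:

\[
\delta Q = T\,\delta S
\;\;\Longrightarrow\;\;
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}.
\]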

String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time, or, to use Einsteinian terminology, spacetime. It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.

The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated (entangled) in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.

As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because, although the influence of one particle on another can be instantaneous, there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.

Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats that are difficult or impossible for a regular computer, such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time.
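
As a minimal illustration of those two properties (a sketch in plain Python with NumPy, not code from any group mentioned in the article), the following builds the classic Bell state: a Hadamard gate puts one qubit into superposition and a CNOT gate then entangles it with a second, so that measurements of the pair are perfectly correlated.

```python
import numpy as np

zero = np.array([1.0, 0.0])                  # the |0> state of one qubit
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2) # Hadamard gate: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])              # entangling gate on two qubits

# Put qubit 1 in superposition, then entangle it with qubit 2
state = CNOT @ np.kron(H @ zero, zero)       # amplitudes of |00>, |01>, |10>, |11>
probs = np.abs(state) ** 2

print(state)  # ~[0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
print(probs)  # 50% |00>, 50% |11>: the two qubits' outcomes always agree
```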

And because of their entanglement qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic-gravity theory's claims.
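
As a toy version of that entanglement-to-distance dictionary (entirely illustrative, and not Dr Swingle's or Dr Carroll's actual models), one can quantify how entangled a pair of qubits is by the entanglement entropy of either qubit on its own, and then read more entropy as shorter "distance":

```python
import numpy as np

def entanglement_entropy(theta):
    """Entropy (in bits) of one qubit in the state cos(theta)|00> + sin(theta)|11>.
    theta near 0 means little entanglement; theta = pi/4 is a maximally entangled Bell state."""
    p = np.array([np.cos(theta) ** 2, np.sin(theta) ** 2])  # eigenvalues of the reduced state
    p = p[p > 1e-12]                                        # ignore zero eigenvalues
    return float(-(p * np.log2(p)).sum())

def toy_distance(theta):
    """Illustrative mapping only: treat more entanglement as shorter separation."""
    return 1.0 / entanglement_entropy(theta)

for theta in (0.1, np.pi / 8, np.pi / 4):
    print(f"theta={theta:.3f}  entropy={entanglement_entropy(theta):.3f} bits  "
          f"toy distance={toy_distance(theta):.2f}")
```

The real models use tensor networks of many qubits, but the trade-off the sketch shows (more entanglement, less distance) is the point.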

All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.

In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements "A causes B" and "B causes A" are both true.
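
To make the geometric claim concrete, the standard interval of that four-axis geometry (a textbook formula, not the article's) between two events separated by Δt in time and Δx, Δy, Δz in space is

\[
\Delta s^{2} = -c^{2}\,\Delta t^{2} + \Delta x^{2} + \Delta y^{2} + \Delta z^{2}.
\]

When Δs² < 0 the separation is "timelike" and every observer agrees which event happened first; the causal order is fixed by the geometry. Superpose different geometries and that order need no longer be definite.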

This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.

However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.

But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.

Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity, but, rather, has a structure: either elementary loops or triangles, according to which of the two theories you support.

A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be "asymptotically safe") if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.

One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study, in Princeton, and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.

Most attempts at a theory of everything try to fit gravity, which Einstein described geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.

That space, time and even causality are emergent rather than fundamental properties of the cosmos is a radical idea. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.

No theory, though, is worth a damn without data. That, after all, is the problem with supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.

On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.
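
The quantity at issue in the CERN result is a ratio of decay rates (the standard definition, not spelled out in the article):

\[
R_{K} = \frac{\Gamma(B^{+} \to K^{+}\mu^{+}\mu^{-})}{\Gamma(B^{+} \to K^{+}e^{+}e^{-})}.
\]

The Standard Model's "lepton universality" predicts R_K very close to one; the LHCb measurement announced in March 2021 came in below one (roughly 0.85, about three standard deviations short, as reported at the time), meaning fewer decays to muons than to electrons.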

In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual rather than real particles; that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.

The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.
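
The quantity being measured is the muon's anomalous magnetic moment (the standard definition, not given in the article):

\[
a_{\mu} \equiv \frac{g_{\mu} - 2}{2},
\]

where g_μ sets how fast the muon's spin precesses, or "wobbles", in the ring's magnetic field; the quantum froth shifts g_μ slightly away from 2. The April 2021 announcement put the combined experimental value about 4.2 standard deviations above the Standard Model prediction, still short of the conventional five-sigma threshold for claiming a discovery.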

There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.

Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physics's darkest periods is about to brighten into a new morning.

This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"

Read the original:
Life, the universe and everything: Physics seeks the future - The Economist
