
Three Indiana research universities to collaborate with industry and government to develop quantum technologies in new NSF-funded center – Purdue…

WEST LAFAYETTE, Ind. – Quantum science and engineering can help save energy, speed up computation, enhance national security and defense, and innovate health care. With a grant from the National Science Foundation, researchers from Purdue University, Indiana University and the University of Notre Dame will work to develop industry- and government-relevant quantum technologies as part of the Center for Quantum Technologies. Purdue will serve as the lead site. IUPUI, a joint campus of Purdue and Indiana universities in Indianapolis, will also contribute.

"This collaboration allows us to leverage our collective research expertise to address the many challenges facing multiple industries using quantum technology," said Sabre Kais, center director and distinguished professor of chemical physics in Purdue's College of Science. "As a university with world-leading engineering and science programs, and faculty members whose work focuses on many areas of quantum research, Purdue is a natural leader for this center."

Given the wide applicability of quantum technologies, the new Center for Quantum Technologies (CQT) will team with member organizations from a variety of industries, including computing, defense, chemical, pharmaceutical, manufacturing and materials. The CQT researchers will translate foundational knowledge into industry-friendly quantum devices, systems and algorithms with enhanced functionality and performance.

"Through critical partnerships and collaboration with experts from across the state of Indiana, government and leading industries nationwide, the CQT will accelerate innovation and advance revolutionary research and technologies," said Theresa Mayer, Purdue's executive vice president for research and partnerships. "Purdue is thrilled to lead the CQT and further Indiana's efforts to cultivate the quantum ecosystem."

Committed industry and government partners include Accenture, the Air Force Research Laboratory, BASF, Cummins, D-Wave, Eli Lilly, Entanglement Inc., General Atomics, Hewlett Packard Enterprise, IBM Quantum, Intel, Northrop Grumman, NSWC Crane, Quantum Computing Inc., Qrypt and SkyWater Technology.

Additionally, the CQT will train future quantum scientists and engineers to fill the need for a robust quantum workforce. Students engaged with the center will take on many of the responsibilities of principal investigators, including drafting proposals, presenting research updates to members and planning meetings and workshops.

At Purdue, faculty from a variety of departments will participate, including Physics and Astronomy, Chemistry, Computer Science, Materials Engineering, and the Elmore Family School of Electrical and Computer Engineering. The center will also be supported by the Purdue Quantum Science and Engineering Institute.

The CQT is funded for an initial five years through the NSF's Industry-University Cooperative Research Centers (IUCRC) program, which generates breakthrough research by enabling close and sustained engagement between industry innovators, world-class academic teams and government agencies. Through the IUCRC program, center members fund and guide the direction of the center research through active involvement and mentoring. Other academic collaborators include Gerardo Ortiz, Indiana University site director, scientific director of the IU Quantum Science and Engineering Center and professor of physics; Peter Kogge, the University of Notre Dame site director and the Ted H. McCourtney Professor of Computer Science and Engineering; Ricardo Decca, IUPUI campus director, co-director of the IUPUI Nanoscale Imaging Center, and professor and department chair of physics; and David Stewart, CQT industry liaison officer and managing director of the Purdue Quantum Science and Engineering Institute.

To learn more about the CQT, including membership, please visit http://www.purdue.edu/cqt.

About Purdue University

Purdue University is a top public research institution developing practical solutions to today's toughest challenges. Ranked in each of the last four years as one of the 10 Most Innovative universities in the United States by U.S. News & World Report, Purdue delivers world-changing research and out-of-this-world discovery. Committed to hands-on and online, real-world learning, Purdue offers a transformative education to all. Committed to affordability and accessibility, Purdue has frozen tuition and most fees at 2012-13 levels, enabling more students than ever to graduate debt-free. See how Purdue never stops in the persistent pursuit of the next giant leap at https://stories.purdue.edu

Writer: Rhianna Wisniewski, rmwisnie@purdue.edu

Media contact: Mary Martialay, mmartial@purdue.edu

Source: David Stewart, davidstewart@purdue.edu


Quantum Birth of the Universe (Weekend Feature) – The Daily Galaxy – Great Discoveries Channel

"In some pockets of space, far beyond the limits of our observations," wrote cosmologist Dan Hooper at the University of Chicago in an email to The Daily Galaxy, referring to the theory of eternal inflation and the inflationary multiverse, "the laws of physics could be very different from those we find in our local universe. Different forms of matter could exist, which experience different kinds of forces. In this sense, what we call the laws of physics, instead of being a universal fact of nature, could be an environmental fact, which varies from place to place, or from time to time."

"I think I know how the universe was born," said Andrei Linde, the Russian-American theoretical physicist and Harald Trap Friis Professor of Physics at Stanford University. Linde is one of the main authors of the inflationary universe theory, as well as the theory of eternal inflation and the inflationary multiverse.

According to quantum models, galaxies like the Milky Way grew from faint wrinkles in the fabric of spacetime. The density of matter in these wrinkles was slightly greater compared to surrounding areas, and this difference was magnified during inflation, allowing them to attract even more matter. From these dense primordial seeds grew the cosmic structures we see today. "Galaxies are children of random quantum fluctuations produced during the first 10⁻³⁵ seconds after the birth of the universe," said Linde.

"As a result, the universe becomes a multiverse, an eternally growing fractal consisting of exponentially many exponentially large parts," Linde wrote. "These parts are so large that for all practical purposes they look like separate universes."

Late one summer night in 1981, while still a junior research fellow at Lebedev Physical Institute in Moscow, Andrei Linde was struck by a revelation. Unable to contain his excitement, he shook awake his wife, Renata Kallosh, and whispered to her in their native Russian, "I think I know how the universe was born."

Kallosh, a theoretical physicist herself, muttered some encouraging words and fell back asleep. "It wasn't until the next morning that I realized the full impact of what Andrei had told me," recalled Kallosh, now a professor of physics at the Stanford Institute for Theoretical Physics.

Linde's nocturnal eureka moment had to do with a problem in cosmology that he and other theorists, including Stephen Hawking, had struggled with.

A year earlier, a 32-year-old postdoc at SLAC National Accelerator Laboratory named Alan Guth shocked the physics community by proposing a bold modification to the Big Bang theory. According to Guth's idea, which he called inflation, our universe erupted from a vacuum-like state and underwent a brief period of faster-than-light expansion. In less than a billionth of a trillionth of a trillionth of a second, space-time doubled more than 60 times, from a subatomic speck to a volume many times larger than the observable universe.

Guth envisioned the powerful repulsive force fueling the universe's exponential growth as a field of energy flooding space. As the universe unfurled, this inflation field decayed, and its shed energy was transfigured into a fiery bloom of matter and radiation. This pivot, from nothing to something and timelessness to time, marked the beginning of the Big Bang. It also prompted Guth to famously quip that the inflationary universe was "the ultimate free lunch."

As theories go, inflation was a beauty. It explained in one fell swoop why the universe is so large, why it was born hot, and why its structure appears to be so flat and uniform over vast distances. There was just one problem: it didn't work.

To conclude the unpacking of space-time, Guth borrowed a trick from quantum mechanics called tunneling to allow his inflation field to randomly and instantly skip from a higher, less stable energy state to a lower one, thus bypassing a barrier that could not be scaled by classical physics.

But closer inspection revealed that quantum tunneling caused the inflation field to decay quickly and unevenly, resulting in a universe that was neither flat nor uniform. Aware of the fatal flaw in his theory, Guth wrote at the end of his paper on inflation: "I am publishing this paper in the hope that it will encourage others to find some way to avoid the undesirable features of the inflationary scenario."

Linde Answers Guth

Guth's plea was answered by Linde, who on that fateful summer night realized that inflation didn't require quantum tunneling to work. Instead, the inflation field could be modeled as a ball rolling down a hill of potential energy that had a very shallow, nearly flat slope. While the ball rolls lazily downhill, the universe is inflating, and as it nears the bottom, inflation slows further and eventually ends. This provided a graceful exit to the inflationary state that was lacking in Guth's model and produced a cosmos like the one we observe. To distinguish it from Guth's original model while still paying homage to it, Linde dubbed his model "new inflation."

Models of Inflation Theory

By the time Linde and Kallosh moved to Stanford in 1990, experiments had begun to catch up with the theory. Space missions were finding temperature variations in the energetic afterglow of the Big Bang, called the cosmic microwave background radiation, that confirmed a startling prediction made by the latest inflationary models. These updated models went by various names - chaotic inflation, eternal inflation, eternal chaotic inflation and many more - but they all shared in common the graceful exit that Linde pioneered.

Quantum Fluctuation Fingerprints

Inflation predicted that these quantum fluctuations would leave imprints on the universe's background radiation in the form of hotter and colder regions, and this is precisely what two experiments, dubbed COBE and WMAP, found. "After the COBE and WMAP experiments, inflation started to become part of the standard model of cosmology," said Stanford physicist Shamit Kachru.

Pocket Universes: New Inflating Regions in the Universe

Linde and others later realized that the same quantum fluctuations that produced galaxies can give rise to new inflating regions in the universe. Even though inflation ended in our local cosmic neighborhood 13.8 billion years ago, it can still continue in disconnected regions of space beyond the limits of our observable universe. The consequence is an ever-expanding sea of inflating space-time dotted with pocket universes like our own where inflation has ceased.


Linde took the multiverse idea even further by proposing that each pocket universe could have differing properties, a conclusion that some string theorists were also reaching independently.

"It's not that the laws of physics are different in each universe, but their realizations," Linde said. "An analogy is the relationship between liquid water and ice. They're both H2O, but realized differently."

Linde's multiverse is like a cosmic funhouse filled with reality-distorting mirrors. Some pocket universes are resplendent with life, while others were stillborn because they were cursed with too few (or too many) dimensions, or with physics incompatible with the formation of stars and galaxies. An infinite number are exact replicas of ours, but infinitely more are only near-replicas. Right now, there could be countless versions of you inhabiting worlds with histories divergent from ours in ways large and small. In an infinitely expanding multiverse, anything that can happen will happen.

"The inflationary universe is not just the ultimate free lunch, it's the only lunch where all possible dishes are served," Linde said.

While disturbing to some, this eternal aspect of inflation was just what a small group of string theorists were looking for to help explain a surprise discovery that was upending the physics world: dark energy.

The Last Word - Brian Keating and Avi Loeb

When asked whether Linde's pocket universes would be subject to the same laws of physics as our universe, Brian Keating, Distinguished Professor of Physics at the Center for Astrophysics & Space Sciences at the University of California, San Diego, told The Daily Galaxy: "No, not necessarily. It's not mandatory that the properties of space-time be consistent from universe to universe. Nor is it impossible that the laws of logic and mathematics be consistent throughout the universe. This has led some physicists, such as Paul Steinhardt, to claim that the multiverse concept is not a self-consistent or proper subject within the traditions of the scientific method."

Not so certain of the existence of Linde's free lunch, Harvard astrophysicist Avi Loeb told The Daily Galaxy: "Advances in scientific knowledge are enabled by experimental tests of theoretical ideas. Physics is a dialogue with nature, not a monologue. I am eagerly waiting for a proposed experimental test of the multiverse idea."

Avi Shporer, Research Scientist with the MIT Kavli Institute for Astrophysics and Space Research, via Dan Hooper, Brian Keating, Avi Loeb and Stanford University


Avi Shporer, Research Scientist, MIT Kavli Institute for Astrophysics and Space Research. A Google Scholar, Avi was formerly a NASA Sagan Fellow at the Jet Propulsion Laboratory (JPL). His motto, not surprisingly, is a quote from Carl Sagan: "Somewhere, something incredible is waiting to be known."


Evansville’s ties to the first detonation of the A-bomb in 1945 – Courier & Press

It's not hyperbole to suggest that there are two worlds: one before and one after the detonation of the atomic bomb.

Interestingly, there are two Southern Indiana connections to J. Robert Oppenheimer, leader of the Manhattan Project.

Joseph Fabian Mattingly, the uncle of Evansville baseball legend Don, was present on July 16, 1945, as "the gadget" was successfully tested in Alamogordo, New Mexico. The U.S. dropped the A-bomb on Hiroshima on Aug. 6 and on Nagasaki on Aug. 9, and Japan surrendered shortly thereafter, ending World War II.

"It was very bright," Joseph Fabian Mattingly told the Evansville Courier in 1995. "When it lit up the sky, the colors were beautiful: violet and purple. It was a pretty sight. We were on a mountainside about 17 miles out.

"It was bright as hell, and it was quiet. Eerie. There was no sound for a minute and a half. Then, whoom! A thunderous reverberation from the mountains occurred again and again. The light was like looking at the sun. There was a cloud layer at about 17,000 feet, and it looked like there was somebody at the end of the clouds shaking them like a bedsheet, vibrating up and down."


That was Mattingly's recollection, at age 86, of seeing the detonation in the New Mexico desert. Randy Mattingly said his uncle, who died at 91 in 2000, made for quite a conversation piece at family gatherings when he was growing up.

"Initially, I was young enough that it didn't register to me," Randy told the Courier & Press. "The A-bomb didn't really register to me. He showed us the goggles (he wore during the detonation) at our grandfather's house."

Although those goggles (welder's glasses) might bring quite a price at an auction, Randy isn't sure where they are.

Melba Newell Phillips, a female trailblazer from Hazleton in Gibson County, Indiana, worked with J. Robert Oppenheimer years before the A-bomb exploded.

Phillips, who died in 2004 at age 97, studied under and collaborated with Oppenheimer. She was part of a heroic age of physics, a time when scientists were just beginning to study quantum theory and other areas of physics that would bring the world into the atomic age, according to "American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer," the Pulitzer Prize-winning biography by Kai Bird and Martin J. Sherwin. It is the basis for the upcoming biographical film "Oppenheimer," scheduled to be released in July 2023.

Barely 16, Phillips graduated from Union High School in rural Pike County in 1923. She began her undergraduate work at Oakland City University and worked with Oppenheimer at the University of California at Berkeley in the early 1930s. During the Red Scare of 1952, she stood up to congressional bullies of Senator Joseph McCarthy, but lost her job at Brooklyn College in the process, said Oakland City University social sciences professor and area historian Randy Mills.

Still, she persevered. In fact, the American Association of Physics Teachers in 1983 recognized her commitment to education by creating the Melba Newell Phillips Award, a national honor given yearly to the individual who is judged to have made an exceptional contribution to physics education.

In 1943, while working at the U.S. Weather Bureau in Evansville, Joseph Mattingly received a call from Dr. Philemon Edwards Church, who was assigned to the Manhattan Project to study/predict weather patterns and turbulence for the project, according to the July 2006 Mattingly Family Newsletter.

Church invited Mattingly, a 1927 Memorial High School graduate, to take part in his studies at the University of Chicago. He was given special leave, with his position at the Weather Bureau protected for the duration of the war. Mattingly also received, over the objection of local military authorities, a special military deferment personally from Gen. Leslie Groves, Military Chief of the Manhattan Project.

After training in Chicago, he was sent to Hanford, Washington, assigned to Hanford Engineering Works, a division of E.I. DuPont. DuPont had erected the first full-size nuclear reactor at this site and would produce plutonium for the atomic bomb. Few of the 20,000 workers at Hanford, including Mattingly, knew what was going on or what the Hanford site mission entailed. One mile from the reactor, they built a tower several hundred feet tall that his team used to make continuous observations of barometric pressure, temperature, humidity, and cloud cover in an attempt to track the radioactive smoke from the production facility. Geiger counters were placed all over the area.


Every morning Mattingly boarded a Piper Cub and was taken up to 2,000 feet to track smoke from the stacks. The Hanford Site was 600 square miles, and the smoke was supposed to diffuse before it got off the reservation. No one knew what was really going on, other than that it was a war project involving something called "the gadget."

The Hanford area was later considered one of the most contaminated places in the world. Mattingly said at least one person died of cancer, and it was in Hanford that his wife, Adeline, became ill with Parkinson's disease.

"But there's no way to know if radiation had anything to do with it," Mattingly told the Evansville Courier.

In July of 1945, Mattingly was sent to Alamogordo. Uncle Fabian was on hand to witness the most powerful development of the century. Following are a few of the notes from his notebook made on the date of the detonation: "White hot 1 mile." The second drawing shows a mushroom with the note, "Golden glowing one-half mile." The third drawing shows a larger cloud and the note, "Violet brilliant color." Other notes from his address book: "Base precaution C, burn from ultraviolet rays, (2) prone on face, (3) eye protection, (4) evacuation, in case of disaster. One half hour after blast, stratified layers aloft, no longer distinguishable from Albuquerque road. B-29 at 24,000 feet reported light bump at altitude above shot."

When Mattingly returned to Hanford, he was the only one of the 20,000 workers who knew what the gadget was and what it could do. He didn't know how it was going to be used until Aug. 6, 1945, when the story broke that the bomb "Little Boy" had been dropped over the city of Hiroshima; three days later the bomb "Fat Man" was dropped over the city of Nagasaki.

Unlike the Trinity Site in New Mexico, the Hanford reactor site is one of the most polluted sites in the world. In their rush, they just didn't know what the consequences were for the environment. The government is spending $1 billion per year on cleanup that will go on for several more years.

In 1947, Mattingly returned to the University of Washington in a sub-faculty position in the newly formed Department of Meteorology and Climatology. He returned to the U.S. Weather Bureau in Evansville in 1949. He built his house in the summer of 1950 on St. George Road and lived there the rest of his life next door to his sister Catherine Hess.

After the U.S. dropped atomic bombs on Japan, Phillips joined other scientists organized to prevent future nuclear wars. She took a great hit to her career during the Cold War for standing up to McCarthyism. Colleagues and students noted her intellectual honesty, self-criticism, and style, and called her "a role model for principle and perseverance" in "Melba Phillips: Leader in Science and Conscience."

As she moved up the academic ranks, Phillips pursued graduate research under Oppenheimer and earned her doctorate in 1933. Within a few years she was known throughout the physics world because of her contribution to the field via the Oppenheimer-Phillips effect, according to "Women in Physics."

The 1935 Oppenheimer-Phillips effect explained what was at the time unexpected behavior of accelerated deuterons (nuclei of deuterium, or heavy hydrogen atoms) in reactions with other nuclei, according to a University of Chicago press release. When Oppenheimer died in 1967, his New York Times obituary noted his and Phillips' discovery as a basic contribution to quantum theory.

Phillips was subsequently fired from her university positions due to a law which required the termination of any New York City employee who invoked the Fifth Amendment.

Bonner explained: "McCarran was a specialist at putting people in the position in which they had to invoke the Fifth Amendment. It was a deliberate expression of the McCarthyism of the time."

In a 1977 interview, Phillips briefly discussed the incident (although she was reluctant because she was trying to keep the interviewer focused on her scientific accomplishments). She stated: "I was fired from Brooklyn College for failure to cooperate with the McCarran Committee, and I think that ought to go into the record . . . city colleges were particularly vulnerable, and the administration was particularly McCarthyite."

Phillips stated that she wasn't particularly political. Her objection to cooperating had been a matter of principle.

In 1987, Brooklyn College publicly apologized for firing Phillips, and in 1997 it created a scholarship in her name. Phillips died on Nov. 8, 2004, in Petersburg, Indiana.

The New York Times referred to Phillips in her obituary as a pioneer in science education and noted that at a time when there were few women working as scientists, Dr. Phillips was a leader among her peers.

Her accomplishments helped pave the way for other women in the sciences.

In the same 1977 interview, Phillips addressed the problems women face in aspiring to science careers, stating: "We're not going to solve them, but, as I've been saying all the time, if we make enough effort, we'll make progress; and I think progress has been made. We sometimes slip back, but we never quite slip all the way back; or we never slip back to the same place. There's a great deal of truth in saying that progress is not steady no matter how inevitable."

Contact Gordon Engelhardt by email at gordon.engelhardt@courierpress.com or on Twitter @EngGordon.


Quantum Computing Now And In The Future: Explanation, Applications, And Problems – Forbes

A new generation of computer technology is on the horizon, which many think will eventually increase the computing power available to humanity by factors of thousands or possibly even millions. If this happens, it could vastly increase the speed at which we can carry out many vital tasks, such as discovering and testing new drugs or understanding the impact of climate change.


Quantum computing is already with us in limited form. But the next five to 10 years may see it leap into the mainstream in the same way that classical computers moved from labs and large corporations to businesses of all sizes, as well as homes, in the 1970s and 1980s.

However, as well as enabling big leaps forward in what we are able to do with computers, quantum computers also require us to face up to a new set of problems, specifically around the threats they pose to security and encryption. And some people think that, in fact, quantum computers may never be useful at all, due to their complexity and the limited number of tasks at which they have been shown to be superior to classical computer technology.

So, here's my overview of where we currently are and where we're hoping to get to with quantum computing, with expert input from my most recent podcast guest, Lawrence Gasman, co-founder and president of Inside Quantum Technology and author of over 300 research reports.

What is quantum computing?

Like everything involving the quantum (sub-atomic) domain, quantum computing isn't the easiest concept to get your head around. Fundamentally, the term describes a new (or future) generation of super-fast computers that process information as qubits (quantum bits) rather than the regular bits (ones and zeroes) of classical computing.

Classical computers are really just much more sophisticated versions of pocket calculators: they are based on electrical circuits and switches that can be either on (one) or off (zero). By stringing lots of these ones and zeroes together, they can store and process any information. However, their speed is always limited by the fact that large amounts of information need a lot of ones and zeroes to represent them.

Rather than simple ones and zeroes, the qubits of quantum computing can exist in many different states. Due to the strange properties of quantum mechanics, this might mean they can exist as one and zero simultaneously (quantum superposition). They can also exist in any state between one and zero.
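To make "exist as one and zero simultaneously" slightly more concrete, here is the standard textbook way of writing a single qubit's state (generic notation, not specific to any machine mentioned in this article):

\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,
\]

where the complex amplitudes α and β set the probabilities of reading out a zero or a one. A register of n qubits is described by 2^n such amplitudes at once, which is where the "process a lot more information" intuition in the quote below comes from.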

As Gasman explains: "That means you can process a lot more information on a quantum computer, and that means you can do some problems much faster. And sometimes that really matters. Sometimes it's not 'whoopee, I can do this in two hours instead of two days'; it's 'whoopee, I can do this in two hours instead of nine million years.'"

Nine million years sounds like the sort of number that people only use when they are exaggerating, but according to some estimates, quantum computers will operate 158 million times faster than the fastest supercomputers available today.

There's one important caveat, though: currently, quantum computers are only really useful for a fairly narrow set of uses. Don't expect to simply be able to plug a quantum processor into your MacBook and do everything that you can do on it now, but millions of times quicker.

So what can quantum computing do better than classical computing?

The truth is that classical computers can solve all of the problems that quantum computers will solve; there hasn't yet been a use case discovered for quantum computers that can't already be done with classical computers.

The problem, Gasman tells me, is that it will take classical computers so long to solve them that anyone who starts looking for the answer today will be long dead!

In particular, they are potentially hugely useful for a set of problems known as optimization problems. The idea is illustrated by imagining a traveling salesman who has to visit a number of towns, in any order but without retracing their steps, and doing it while covering the shortest distance (or in the shortest amount of time) possible. Elementary mathematics can show us that as soon as there are more than a few towns, the number of possible routes becomes incredibly high: millions or billions. This means that calculating the distance and time taken for all of them in order to find the fastest can take a huge amount of processing power if we're using classical binary computing.
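As a rough sketch of why the route count explodes, here is a tiny brute-force version of the problem in Python. The towns and coordinates are invented purely for illustration; this is not any particular product's solver.

```python
# Brute-force traveling-salesman sketch: try every ordering of a handful of
# towns and keep the shortest route. Coordinates are made up for the example.
from itertools import permutations
from math import dist, factorial

towns = {"A": (0, 0), "B": (2, 1), "C": (5, 3), "D": (1, 4), "E": (6, 0)}

def route_length(route):
    # Sum of straight-line distances between consecutive towns in this order.
    return sum(dist(towns[a], towns[b]) for a, b in zip(route, route[1:]))

best = min(permutations(towns), key=route_length)
print("shortest route:", best, round(route_length(best), 2))

# The number of orderings grows factorially: 5 towns give 120 routes,
# 10 towns give 3,628,800, and 20 towns give roughly 2.4 quintillion.
print(factorial(5), factorial(10), factorial(20))
```

Checking every five-town route is instant; checking every route for a few dozen towns is what pushes classical brute force into "longer than a lifetime" territory, which is the kind of search quantum optimization approaches hope to shortcut.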

This has implications for fields as diverse as tracing and routing financial transactions across global financial networks, developing new materials by manipulating physical or genetic properties, or even understanding how changing climate patterns affect the world around us.

Gasman tells me: "The ones that have the most potential are, I'd say, in very large banks. But if you're a big corporation and you're giving Goldman Sachs a billion dollars to look after, do you really want them to put it in the hands of some newfangled technology? A certain level of trust will have to be established. But all the big banks have their own quantum teams now exploring what can be done in the next five to 10 years."

What are the challenges around quantum computing?

Firstly, there are some pressing physics challenges that need to be solved. Qubits themselves, when existing in a physical state as they need to do to represent data and allow computation to take place, are highly unstable. This means they must be held in a super-cooled environment, even to exist for just a few nanoseconds, in order to be of use. As a result, quantum computing is currently very expensive, and only the largest companies and best-funded research organizations can afford to own them.

That also makes assessing use cases an expensive and time-consuming process. Already one use - creating more efficient MRI scans - has proven to be a dead end, Gasman tells me.

It's also been suggested that cosmic rays could pose an obstacle to the widespread adoption of quantum computing. The errors caused by the phenomenon, which can affect even classical computing, could be even more impactful on the hyper-sensitive engineering needed to usefully harness qubits on a large scale.

There's also a critical shortage of people with the skills to develop and work with quantum computers. As Gasman puts it: "What you want is someone who is a computer scientist, and a physicist, and an expert on pharmaceuticals or finance. The specifics of the disciplines are so different that getting people to talk to each other is quite difficult!"

Finally, as well as the challenges around implementing quantum computing, we cant ignore the challenges that the technology will potentially create itself when it is widespread.

The one causing the biggest headaches right now is the threat it poses to encryption. Digital cryptography is used today to secure everything online, as well as all of our communications and information, such as military, commercial and national secrets. It works on the basis that encryption methods are so complex it would take classical computers millions or billions of years to crack them by brute-forcing every possible password or key. However, to quantum computers, doing so could be trivial.

"It's a huge issue," Gasman tells me. "If I have something encrypted on my machine and it's broken by somebody in nine million years, I'm not likely to care that much!

"But then it turns out that with a quantum computer, it can be decoded, like, now - this is a real problem!

"We don't have such a quantum computer, and the estimate of when it might appear is anything from five years to never. I think it will happen sooner rather than later."

The problem is currently being taken very seriously by governments as well as corporations, which are both putting resources into developing what is known as "post-quantum encryption" so that, hopefully, all of their deepest secrets won't suddenly be laid bare.
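To put the "millions or billions of years" figure above in perspective, here is a back-of-the-envelope estimate. The attack rate is an assumption chosen purely for illustration; real-world figures vary enormously.

```python
# Rough estimate of classically brute-forcing a 128-bit key, assuming an
# attacker who can test one billion keys per second (an illustrative figure).
keyspace = 2 ** 128                     # number of possible 128-bit keys
guesses_per_second = 1_000_000_000      # assumed classical brute-force rate
seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.1e} years")             # on the order of 10**22 years
```

A large, fault-tolerant quantum computer running Shor's algorithm would not guess keys one by one; it would attack the underlying public-key mathematics directly, which is why post-quantum encryption is being developed well before such a machine exists.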

What is in store for the future of quantum computing?

The first developments we can expect to see are likely to mirror those that occurred in the latter half of the 20th century, as classical computers moved on from being lab toys or something only the largest corporations could afford.

This is likely to follow the format of the transition from mainframes (filling entire buildings) to minicomputers (filling rooms) and eventually to microcomputers that could live on our desks.

This democratization of access to quantum power will lead to new use cases as businesses will be able to put it to the test against their own specific sets of challenges.

Gasman says: "A fifty-thousand-dollar computer is something that most medium-sized companies can afford; an eight-hundred-thousand-dollar computer, not so much."

Problems where quantum computers will potentially be put to use include monitoring and predicting traffic flow across complex urban environments, or even processing the huge amounts of data necessary for artificial intelligence and machine learning. If one day humans are able to model a system as complex as a biological brain, paving the way for true AI, it almost certainly won't be by using classical computing.

Gasman says: "The exciting thing for me is the breakthroughs that are likely to happen. To mix metaphors, the world is quantum computing's oyster. There are lots of good reasons to be in classical computing, but if you're looking for the massive breakthroughs, it ain't going to happen. That's the excitement of quantum computing."

You can click here to watch my webinar with Lawrence Gasman, president and co-founder of IQT Research, where we take a deeper dive into the future of quantum computing and what it means for the world.



Multiverse Computing Introduces a New Version of their Singularity Portfolio Optimization Software – Quantum Computing Report


We reported in August 2021 about a new software program from Multiverse Computing called Singularity. This program has an interesting characteristic in that it is implemented as an Excel plug-in, which makes it easy and quick for an inexperienced end user to try without requiring them to learn a lot about quantum computing. The company has now released an update, Singularity Portfolio Optimization v1.2, which supports a variety of modes, including a Multiverse Hybrid mode, a D-Wave Leap Hybrid mode, and a pure classical solver. The program can also accept a variety of constraints while performing the optimizations, including the investor's level of risk aversion, the resolution of asset allocation, the minimum and maximum allowable investment per asset, and others. The portfolio optimizer uses Multiverse's hybrid solver for its core algorithms, and the company indicates it can produce results competitive with classical solvers in a shorter period of time. The program is hardware agnostic and can be used with a variety of different quantum processors as well as quantum-inspired and classical configurations. Additional information about this new version of Singularity is available in a news release posted on the Multiverse website here.
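For readers unfamiliar with the problem class, the sketch below is a purely classical, toy mean-variance formulation of constrained portfolio optimization. It only illustrates the kinds of constraints listed above; the data, risk-aversion value, and bounds are invented, and this is not Multiverse's Singularity solver or API.

```python
# Toy mean-variance portfolio optimization with the kinds of constraints
# described above (illustrative numbers only; not Multiverse's product).
import numpy as np
from scipy.optimize import minimize

expected_returns = np.array([0.05, 0.07, 0.02, 0.10])   # per-asset returns
covariance = np.diag([0.04, 0.09, 0.01, 0.16])          # toy risk model
risk_aversion = 3.0                                      # investor's risk appetite

def objective(weights):
    # Maximize return minus a risk penalty, i.e. minimize the negative.
    return -(weights @ expected_returns
             - risk_aversion * weights @ covariance @ weights)

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bounds = [(0.0, 0.5)] * 4   # min and max allowable investment per asset

result = minimize(objective, x0=np.full(4, 0.25),
                  bounds=bounds, constraints=constraints)
print(np.round(result.x, 3))  # optimal weights under these toy assumptions
```

Quantum and quantum-inspired solvers typically recast a problem like this as a QUBO or similar formulation before handing it to hardware, but the constraints play the same role.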

August 28, 2022



Ethereum Nodes Largely Run on Cloud Services - This Host Says They're Banned – Decrypt

Popular cloud service provider Hetzner has said that its product is not for crypto users, despite the fact that it hosts roughly 10% of all Ethereum nodes.

In a Reddit post earlier this week, the company said that running just one node is a violation. A node is a computer that, together with other nodes, powers a decentralized network, like Bitcoin or Ethereum.

Nodes are needed to run software that can verify blocks and transaction data and keep a network chugging along.

Ethereum is the second largest cryptocurrency by market cap and, as it runs on a decentralized blockchain, anyone can set up a node in order to contribute and take part in running the network.

Many such participants on the Ethereum network make use of cloud computing providers to host their nodes. According to data from Ethernodes.org, almost 62% of all nodes on the Ethereum network operate via a hosting service, such as Amazon. Of those hosted nodes, 14% currently use Hetzner, which means the German company currently hosts 10% of Ethereum nodes.

The issue resurfaced this morning when W3bcloud co-founder Maggie Love lamented this fact in a tweet. Love suggested that Ethereum lacked proper decentralization because it relies so heavily on cloud services like Amazon, which itself accounts for 50% of all hosted Ethereum nodes. The tweet also name-checked Hetzner, and the company replied with a link to a Reddit post which it said customers who are using Ethereum should please read.

In the post, the company said its service should definitely not be used for things related to crypto, including Ethereum. "Using our products for any application related to mining, even remotely related, is not permitted," it said.

"This includes Ethereum. It includes proof of stake and proof of work and related applications. It includes trading. It is true for all of our products, except colocation."

Hetzner added: "We are aware that there are many Ethereum users currently at Hetzner, and we have been internally discussing how we can best address this issue."

Hetzner did not immediately respond to Decrypt's request for comment.



Liquid Web Celebrates 25 Years of Helping Businesses Grow Online – PR Newswire

A Quarter Century of Business Dotted with Technology Advancements and Industry-Leading Support

ATLANTA, Aug. 29, 2022 /PRNewswire/ -- Liquid Web, the market leader in cloud hosting and software solutions for small to medium-sized businesses (SMBs), celebrates 25 years of helping businesses grow online.

Over the past quarter century, Liquid Web has grown into a Family of Brands delivering technology, services, and support for thriving businesses and nonprofits running mission-critical websites, eCommerce stores, and applications. Liquid Web (High Performance Managed Hosting), Nexcess (Digital Commerce Cloud), StellarWP (WordPress Software and Tools), and Modern Tribe (Enterprise WordPress Agency) have more than 500,000+ sites under management and support over 175,000 software subscribers and 2.5 million+ free version software users.

The company was founded in 1997 by Matthew Hill, who passed away unexpectedly on July 13, 2022. He was in high school when he founded Liquid Web in Holt, Michigan. For 18 years, before he sold in 2015, with the help of his family and friends, he built the foundation for what the company is today.

In 2015, Madison Dearborn and the management team (Jim Geiger (CEO), Joe Oesterling (CTO), and Carrie Wheeler (COO)) invested in Liquid Web. The new leadership team set a vision to build a platform to enable small businesses - and the creators who build sites and stores for them - to make money online. The strategy was developed to innovate products and services that address market trends toward simplification and deliver the best experience in hosting - to become the Most Helpful Humans in Hosting.

"All of our efforts and strategies are focused on helping small businesses make money online," said Jim Geiger, CEO. "We've coined the phrase 'Web-Dependent SMBs' because, for our customers, their online presence is their business. They are not 'setting and forgetting' their website; it's not brochureware, it's not a hobby. Our platform(s) power their online commerce and therefore, their livelihood."

In the last 7 years, Liquid Web invested heavily in product development, sales, and marketing and expanded platform and support capabilities by adding Managed WordPress (2016); Premium Business Email (2017); the industry's first Managed WooCommerce Hosting (2018); Protection and Remediation Services (2018); VMware Private Cloud (2019); CloudFlare (2019); Acronis Cyber Backups (2020); Managed Cloud Servers (2020); VMware Private Cloud Multi-Tenant (2020); and Threat Stack (2021).

The Liquid Web Family of Brands has also strategically grown its presence in the WordPress space by adding software companies like iThemes (2018), Restrict Content Pro (2020), The Events Calendar (2021), Iconic (2021), KadenceWP (2021), GiveWP (2021), LearnDash (2021), and Modern Tribe (2021) to our portfolio. In addition, a new arm of our business was introduced to serve as the umbrella brand for all WordPress software offerings: StellarWP.

The company has honed a strategic focus around open source software and platforms and the flexibility and ownership those provide for online site and store owners. Because the customers they attract are largely within the WordPress ecosystem - the world's most dominant content and commerce management system, powering over 43% of all online properties on the Internet today - the company has a commitment to fostering ongoing innovation with new features, curated solutions, and tools to help customers realize the power that open source provides. In 2021, the company launched the first SaaS-like customer experience for buying, building, and managing an online store with a product called StoreBuilder, and in 2022, launched LearnDash Cloud. These products integrate hosting and curate a number of the company's owned software solutions to innovate online commerce solutions on WordPress.

"We continue to believe that we can lead in providing simplified, highly performant, secure, curated solutions to accelerate SMBs and creators getting, staying, and growing online. Our North Star, our strategic vision, is to continue to simplify online commerce in all its forms for new and existing online merchants by leveraging open source solutions riding on our infrastructure, our software assets, our industry expertise, and our world-class technical support," said Geiger. "We celebrate this anniversary with great pride and appreciation for our customers who trust us with their business and our employees, the Most Helpful Humans in Hosting who make us stand out as an industry leader. As we look ahead to our future, we understand the responsibility we have to the businesses we support and will continue to focus on our purpose of helping SMBs and their creators make money online."

About Liquid Web

Building on 25 years of success, our Liquid Web Family of Brands delivers software, solutions, and managed services for mission-critical sites, stores, and applications to SMBs and the designers, developers, and agencies who create for them. Liquid Web (Managed Hosting), Nexcess (Digital Commerce Cloud), and StellarWP (WordPress Software and Tools) have more than 500,000+ sites under management, over 175,000 software subscribers, and 2 million+ free version software users. Collectively, the companies have assembled a world-class team of industry experts, provide unparalleled service from solution engineers available 24/7/365, and own and manage ten global data centers. As an industry leader in customer service, the rapidly expanding brand family has been recognized among INC. Magazine's 5000 Fastest-Growing Companies for 12 years. Learn more about the Liquid Web Family of Brands.

Media Contact: Jackie Cowan[emailprotected]

SOURCE Liquid Web


How Snap rebuilt the infrastructure that now supports 347 million daily users – Protocol

In 2017, 95% of Snaps infrastructure was running on Google App Engine. Then came the Annihilate FSN project.

Snap, which launched in 2011, was built on GAE - FSN (Feelin-So-Nice) was the name for the original back-end system - and the majority of Snapchat's core functionality was running within a monolithic application on it. While the architecture initially was effective, Snap started encountering issues when it became too big for GAE to handle, according to Jerry Hunter, senior vice president of engineering at Snap, where he runs Snapchat, Spectacles and Bitmoji as well as all back-end or cloud-based infrastructure services.

"Google App Engine wasn't really designed to support really big implementations," Hunter, who joined the company in late 2016 from AWS, told Protocol. "We would find bugs or scaling challenges when we were in our high-scale periods like New Year's Eve. We would really work hard with Google to make sure that we were scaling it up appropriately, and sometimes it just would hit issues that they had not seen before, because we were scaling beyond what they had seen other customers use."

Today, less than 1.5% of Snaps infrastructure sits on GAE, a serverless platform for developing and hosting web applications, after the company broke apart its back end into microservices backed by other services inside of Google Cloud Platform (GCP) and added AWS as its second cloud computing provider. Snap now picks and chooses which workloads to place on AWS or GCP under its multicloud model, playing the competitive edge between them.

The Annihilate FSN project came with the recognition that microservices would provide a lot more reliability and control, especially from a cost and performance perspective.

"[We] basically tried to make the services be as narrow as possible and then backed by a cloud service or multiple cloud services, depending on what the service we were providing was," Hunter said.

Snapchat now has 347 million daily active users who send billions of short videos, send photos called Snaps or use its augmented-reality Lenses.

Its new architecture has resulted in a 65% reduction in compute costs, and Hunter said he has come to deeply understand the importance of having competitors in Snap's supply chain.

"I just believe that providers work better when they've got real competition," said Hunter, who left AWS as a vice president of infrastructure. "You just get better pricing, better features, better service. We're cloud-native, and we intend on staying that way, and it's a big expense for us. We save a lot of money by having two clouds."

The Annihilate FSN process wasn't without at least one failed hypothesis. Hunter mistakenly thought that Snap could write its applications on one layer and that layer would use the cloud provider that best fit a workload. That proved to be way too hard, he said.

"The clouds are different enough in most of their services and changing rapidly enough that it would have taken a giant team to build something like that," he said. "And neither of the cloud providers were interested at all in us doing that, which makes sense."

Instead, Hunter said, there are three types of services that he looks at from the cloud.

"There's one which is cloud-agnostic," he said. "It's pretty much the same, regardless of where you go, like blob storage or [content-delivery networks] or raw compute on EC2 or GCP. There's a little bit of tuning if you're doing raw compute but, by and large, those services are all pretty much equal. Then there's sort of mixed things where it's mostly the same, but it really takes some engineering work to modify a service to run on one provider versus the other. And then there's things that are very cloud-specific, where only one cloud offers it and the other doesn't. We have to do this process of understanding where we're going to spend our engineering resources to make our services work on whichever cloud that it is."

Snap's current architecture also has resulted in reduced latency for Snapchatters.

In its early days, Snap had its back-end monolith hosted in a single region in the middle of the United States - Oklahoma - which impacted performance and the ability for users to communicate instantly. If two people living a mile apart in Sydney, Australia, were sending Snaps to each other, for example, the video would have to traverse Australia's terrestrial network and an undersea cable to the United States, be deposited in a server in Oklahoma and then backtrack to Australia.

"If you and I are in a conversation with each other, and it's taking seconds or half a minute for that to happen, you're out of the conversation," Hunter said. "You might come back to it later, but you've missed that opportunity to communicate with a friend. Alternatively, if I have just the messaging stack sitting inside of the data center in Sydney, now you're traversing two miles of terrestrial cable to a data center that's practically right next to you, and the entire transaction is so much faster."


Snap wanted to regionalize its services where it made sense. The only way to do that was by using microservices and understanding which services were useful to have close to the customer and which ones weren't, Hunter said.

"Customers benefit by having data centers be physically closer to them because performance is better," he said. "CDNs can cover a lot of the broadcast content, but when doing one-on-one communications with people - people send Snaps and Snap videos - those are big chunks of data to move through the network."

That ability to switch regions is one of the benefits of using cloud providers, Hunter said.

"If I want to experiment and move something to Sydney or Singapore or Tokyo, I can just do it," he said. "I'm just going to call them up and say, 'OK, we're going to put our messaging stack in Tokyo,' and the systems are all there, and we try it. If it turns out it doesn't actually make a difference, we turn that service off and move it to a cheaper location."

Snap has built more than 100 services for very specific functions, including Delta Force.

In 2016, any time a user opened the Snapchat app, it would download or redownload everything, including stories that a user had already looked at but that hadn't yet timed out in the app.

"It was a naive deployment of just 'download everything so that you don't miss anything,'" Hunter said. "Delta Force goes and looks at the client, finds out all the things that you've already downloaded and are still on your phone, and then only downloads the things that are net-new."

This approach had other benefits.

"Of course, that turns out to make the app faster," Hunter said. "It also costs us way less, so we reduced our costs enormously by implementing that single service."
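As a minimal illustration of the delta-sync idea described above - only fetch what the client does not already hold - here is a short Python sketch. The names and data structures are hypothetical; this is not Snap's actual Delta Force implementation.

```python
# Sketch of a delta download: compare what the server has against the IDs
# the client already cached, and send only the net-new items.
from typing import Dict, Set

def compute_delta(server_items: Dict[str, bytes], client_ids: Set[str]) -> Dict[str, bytes]:
    """Return only the items whose IDs the client has not yet downloaded."""
    return {item_id: blob for item_id, blob in server_items.items()
            if item_id not in client_ids}

# Example: the client already holds stories "s1" and "s2", so only "s3" is sent.
server_items = {"s1": b"story-1", "s2": b"story-2", "s3": b"story-3"}
already_downloaded = {"s1", "s2"}
print(list(compute_delta(server_items, already_downloaded)))  # ['s3']
```

A production service would also account for expiry (stories that have timed out) and use compact content identifiers or hashes, but the bandwidth and cost saving comes from exactly this kind of set difference.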

Snap uses open-source software to create its infrastructure, including Kubernetes for service development, Spinnaker for its application team to deploy software, Spark for data processing and memcached/KeyDB for caching. "We have a process for looking at open source and making sure we're comfortable that it's safe and that it's not something that we wouldn't want to deploy in our infrastructure," Hunter said.

Snap also uses Envoy, an edge and service proxy and universal data plane designed for large, microservice service-mesh architectures.

"I actually feel like the way of the future is using a service mesh on top of your cloud to basically deploy all your security protocols and make sure that you've got the right logins and that people aren't getting access to it that shouldn't," Hunter said. "I'm happy with the Envoy implementations giving us a great way of managing load when we're moving between clouds."

Hunter prefers using primitives or simple services from AWS and Google Cloud rather than managed services. A Snap philosophy that serves it well is the ability to move very fast, Hunter said.

"I don't expect my engineers to come back with perfectly efficient systems when we're launching a new feature that has a service as a back end," he said, noting many of his team members previously worked for Google or Amazon. "Do what you have to do to get it out there; let's move fast. Be smart, but don't spend a lot of time tuning and optimizing. If that service doesn't take off, and it doesn't get a lot of use, then leave it the way it is. If that service takes off, and we start to get a lot of use on it, then let's go back and start to tune it."


It's through that tuning process of understanding how a service operates that cycles of cloud usage can be reduced, resulting in instant cost savings, according to Hunter.

"Our total compute cost is so large that little bits of tuning can have really large amounts of cost savings for us," he said. "If you're not making the sort of constant changes that we are, I think it's fine to use the managed services that Google or Amazon provide. But if you're in a world where we're constantly making changes - like daily changes, multiple-times-a-day changes - I think you want to have that technical expertise in house so that you can just really be on top of things."

Three factors figure into Snap's ability to reap cost savings: the competition between AWS and Google Cloud, Snap's ability to tweeze out costs as a result of its own work and going back to the cloud providers and looking at their new products and services.

"We're in a state of doing those three things all the time, and between those three, [we save] many tens of millions of dollars," Hunter said.

Snap holds a cost camp every year where it asks its engineers to find all the places where costs possibly could be reduced.

"We take that list and prioritize that list, and then I cut people loose to go and work on those things," he said. "On an annual basis, depending on the year, it's many tens of millions of dollars of cost savings."

Snap has considered adding a third cloud provider, and it could still happen some day, although the process is pretty challenging, according to Hunter.

"It's a big lift to move into another cloud, because you've got those three layers," he said. "The agnostic stuff is pretty straightforward, but then once you get to mixed and cloud-specific, you've got to go hire engineers that are good at that cloud, or you've got to go train your team up on the nuances of that cloud."

Enterprises considering adding another cloud provider need to make sure they have the engineering staff to pull it off: 20 to 30 dedicated cloud people as a starting point, Hunter said.

"It's not cheap, and second, that team has to be pretty sophisticated and technical," he said. "If you don't have a big deployment, it's probably not worth it. I think about a lot of the customers I used to serve when I was in AWS, and the vast majority of them, their implementations were serving their company's internal stuff, and it wasn't gigantic. If you're in that boat, it's probably not worth the extra work that it takes to do multicloud."

Here is the original post:
How Snap rebuilt the infrastructure that now supports 347 million daily users - Protocol

Cloud Computing in Higher Education Market Demand, Innovations, and Regional Outlook and Forecast 2022-2030 – Muleskinner

New York (US) - The study undertaken by Astute Analytica foresees tremendous growth in revenue for the global cloud computing in higher education market, from US$ 2,693.5 million in 2021 to US$ 15,180.1 million by 2030. The market is anticipated to grow at a CAGR of 22% during the forecast period 2022-2030.
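
As a quick sanity check on those figures, the standard CAGR formula applied to the 2021 and 2030 endpoints works out to roughly 21%, consistent with the approximately 22% the report quotes for its 2022-2030 forecast window (which starts from a later base year). A minimal calculation:

```python
# Endpoints taken from the report summary above (US$ million).
start_2021 = 2_693.5
end_2030 = 15_180.1
years = 9  # 2021 -> 2030

cagr = (end_2030 / start_2021) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 21.2%
```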

Request Sample Copy of Research Report @https://www.astuteanalytica.com/request-sample/cloud-computing-higher-education-market

Cloud computing in higher education provides an online platform for educational institutes through various applications and subscription models. In this era of technology, employing the latest IT technologies and services in higher education assists teachers, administrators and students in their education-related activities. Cloud computing in higher education centrally manages various business processes such as student and course management, helps teachers upload learning materials and students access their homework, and allows administrators to collaborate easily with each other and manage libraries, among other functions. The segment is also attracting a majority of spenders from high-income groups as well as a skilled share of people from around the world.

On the basis of institute type, technical schools are estimated to hold the highest market share in 2021 and are also expected to post the highest CAGR over the forecast period, owing to increasing demand for cloud computing in technical schools. Moreover, based on ownership, the private institutes segment is anticipated to hold the largest market share, owing to increasing funding in private institutes for the adoption of cloud computing services, whereas the public institutes segment is expected to grow at the highest CAGR over the forecast period. Furthermore, in terms of application, the administration segment held a major share of cloud computing in higher education in 2021, whereas unified communication is expected to post the highest CAGR over the forecast period due to the increasing trend of e-learning. In addition, by deployment, the hybrid cloud segment held the largest market share in 2021.

Market Dynamics and Trends

Drivers

The increasing adoption of SaaS-based cloud platforms in higher education, increasing adoption of e-learning, increasing IT spending on cloud infrastructure in education and the increasing application of quantum computing in the education sector will boost the global cloud computing in higher education market during the forecast period. Software-as-a-Service (SaaS) is a delivery model of cloud computing; in the higher education sector, SaaS applications include hosting various management systems for educational institutes and managing other activities. Moreover, the higher education industry is witnessing increased adoption of e-learning due to its easy accessibility and high effectiveness: users such as drop-outs, transfer learners and full-time employees increasingly rely on e-learning training and education to upgrade their skills. Furthermore, higher education institutes are rapidly moving toward cloud-based services to cut intensive IT infrastructure costs and boost operational efficiency.

Restraints

Cybersecurity and data protection risks, a lack of compliance with SLAs, and legal and jurisdiction issues are restraining factors that inhibit the growth of the market during the forecast period. Data privacy concerns dampen higher education institutions' willingness to migrate to the cloud, and institutions must satisfy federal regulations along with state and local laws governing information security in the education environment. Moreover, the level of complexity in the cloud is high, as deployments usually involve several service providers, which makes it hard for users to make changes or intervene. The cloud computing industry also faces various legal and jurisdiction issues that can drag on for years because of differing regional laws.

Cloud Computing in Higher Education Market: Country-Wise Insights

North America Cloud Computing in Higher Education Market

The US holds the major share in terms of revenue in the North America cloud computing in higher education market in 2021 and is also projected to grow at the highest CAGR during the forecast period. Moreover, in terms of institute type, technical schools held the largest market share in 2021.

Europe Cloud Computing in Higher Education Market

Western Europe is expected to post the highest CAGR in the Europe cloud computing in higher education market during the forecast period, while Germany held the major share of the Europe market in 2021 thanks to the region's strong focus on innovation driven by research & development and technology adoption.

Asia Pacific Cloud Computing in Higher Education Market

India held the largest share of the Asia Pacific cloud computing in higher education market in 2021 and is expected to post the highest CAGR during the forecast period, owing to potential growth opportunities as end users such as schools and universities turn toward cloud services to offer high-quality services that help users collaborate, share and track multiple versions of a document.

South America Cloud Computing in Higher Education Market

Brazil is projected to grow at the highest CAGR in the South America cloud computing in higher education market over the forecast period. Furthermore, based on ownership, the private institutes segment held the major share of the South America cloud computing in higher education market in 2021, owing to increasing funding in private institutes for the adoption of cloud computing services.

Middle East Cloud Computing in Higher Education Market

Egypt held the largest share in 2021, and the UAE is projected to grow at the highest CAGR during the forecast period. Moreover, in terms of application, administration held a major share of cloud computing in higher education in 2021, whereas unified communication is expected to post the highest CAGR over the forecast period due to the increasing trend of e-learning.

Africa Cloud Computing in Higher Education Market

South Africa held the largest share of the Africa cloud computing in higher education market in 2021. Furthermore, by deployment, the private cloud segment is expected to witness the highest CAGR during the forecast period due to the security benefits of private cloud deployments.

Competitive Insights

The global cloud computing in higher education market is highly competitive, with players working to increase their presence in the marketplace. Some of the key players operating in the global cloud computing in higher education market include Dell EMC, Oracle Corporation, Adobe Inc., Cisco Systems Inc., NEC Corporation, Microsoft Corporation, IBM Corporation, Salesforce.com, NetApp, Ellucian Company L.P., VMware Inc. and Alibaba Group, among others.

Segmentation Overview

The Global Cloud Computing in Higher Education Market is segmented based on institute type, ownership, application, deployment and region. Industry trends in the global cloud computing in higher education market are sub-divided into different categories in order to provide a holistic view of the global marketplace.

Following are the different segments of the Global Cloud Computing in Higher Education Market:

Download Sample Report, SPECIAL OFFER (avail an up to 30% discount on this report): https://www.astuteanalytica.com/industry-report/cloud-computing-higher-education-market

By institute type, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By ownership, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By application, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By deployment, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By region, the Global Cloud Computing in Higher Education Market is sub-segmented into:

North America

Europe

Western Europe

Eastern Europe

Asia Pacific

South America

Middle East

Africa

Request Full Report: https://www.astuteanalytica.com/request-sample/cloud-computing-higher-education-market

About Astute Analytica

Astute Analytica is a global analytics and advisory company that has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves on generating unparalleled, in-depth and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum of industries, including technology, healthcare, chemicals, semiconductors, FMCG and many more. These happy customers come to us from all across the globe. They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting fierce challenges, because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent and experienced team of professionals comprising business analysts, economists, consultants and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Contact us: Aamir Beg, BSI Business Park, H-15, Sector-63, Noida - 201301, India | Phone: +1-888 429 6757 (US Toll Free); +91-0120-4483891 (Rest of the World) | Email: sales@astuteanalytica.com

Website: www.astuteanalytica.com

More reports here:

Hyaluronic Acid Market, Australia Earth Observation Market, Hair Color Market, Pea Protein Ingredients Market, Lead Acid Battery Market

See the original post:
Cloud Computing in Higher Education Market Demand, Innovations, and Regional Outlook and Forecast 2022-2030 - Muleskinner

3 Reasons to Buy Snowflake Stock, and 3 Reasons to Sell – The Motley Fool

Snowflake's (SNOW -5.71%) stock surged 18% in after-hours trading on Aug. 24 following the release of the company's second-quarter report. The cloud-based data warehousing company's revenue surged 83% year over year to $497.2 million, beating analysts' estimates by $29.9 million. But its net loss still widened from $189.7 million to $222.8 million, or $0.70 per share, which missed the consensus forecast by $0.12.

Snowflake's robust revenue growth overshadowed its earnings miss, but is it still a worthwhile investment in this tough market for tech stocks? Let's review three reasons to buy Snowflake -- and three reasons to sell it -- to decide.

Image source: Getty Images.

Snowflake looks attractive because it's growing like a weed, its retention rates are high, and there's plenty of pent-up demand for its services.

Its product revenue, which accounts for most of its top line, surged 106% to $1.14 billion in fiscal 2022, which ended this January. For fiscal 2023, it expects its product revenue to grow another 67%-68%.

Snowflake expects its product revenue to reach $10 billion by fiscal 2029, which implies it can grow at a compound annual growth rate (CAGR) of 36% over the next seven fiscal years. It intends to achieve that goal by expanding its customer base and gaining more high-value customers.
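
A quick check of that arithmetic, taking the fiscal 2022 product revenue of $1.14 billion above as the base: compounding at 36% for seven years lands right around the $10 billion target.

```python
base_fy2022 = 1.14   # product revenue in $ billions, fiscal 2022
target_cagr = 0.36
years = 7            # fiscal 2022 -> fiscal 2029

projection = base_fy2022 * (1 + target_cagr) ** years
print(f"Projected FY2029 product revenue: ${projection:.2f}B")  # about $9.81B

implied_cagr = (10 / base_fy2022) ** (1 / years) - 1
print(f"CAGR implied by a $10B target: {implied_cagr:.1%}")     # about 36.4%
```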

The company ended the second quarter with 6,808 customers, representing 36% growth from a year earlier. Within that total, its number of customers that generated more than $1 million in trailing-12-month product revenue increased 112% to 246. It expects that high-value cohort to expand to about 1,400 customers by fiscal 2029.

Snowflake's long-term goals seem lofty, but the stickiness of its ecosystem supports those ambitions. It ended the second quarter with a net revenue retention rate of 171%, compared to 169% a year earlier. Its remaining performance obligations, or the revenue it expects to recognize from its existing contracts, also surged 78% year-over-year to $2.7 billion.
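
For readers unfamiliar with the metric, net revenue retention compares what an existing customer cohort spends today against what the same cohort spent a year ago. The snippet below illustrates the general definition with made-up numbers, not Snowflake's actual cohort data:

```python
# Hypothetical spend from the same customer cohort, one year apart.
cohort_spend_last_year = 100.0  # $ millions a year ago
cohort_spend_now = 171.0        # $ millions today (expansion net of churn)

net_revenue_retention = cohort_spend_now / cohort_spend_last_year
print(f"Net revenue retention: {net_revenue_retention:.0%}")  # 171%
```

Anything above 100% means existing customers are spending more than they did a year earlier, before counting any new customers, which is why a figure like 171% signals unusually sticky and expanding usage.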

Snowflake's growth rates are jaw-dropping, but its negative margins, competitive headwinds, and nosebleed valuations offset most of those strengths.

Its net loss widened from $539.1 million in fiscal 2021 to $679.9 million in fiscal 2022. Analysts expect a wider loss of $742 million this year, followed by even steeper losses in fiscal 2024 and fiscal 2025.

On a non-GAAP basis (GAAP: generally accepted accounting principles), which excludes its stock-based compensation and other one-time costs, its operating margin came in at negative 3% in fiscal 2022. The company expects its non-GAAP operating margin to rise to positive 2% this year, but it still faces a long uphill battle toward generating stable non-GAAP profits.

All that red ink leaves Snowflake exposed to competitive threats. It established an early-mover's advantage in the cloud-based data warehousing market, but its growth has convinced legacy players like Amazon (AMZN -0.73%) Web Services (AWS), Microsoft (MSFT -1.07%) Azure, and Alphabet's (GOOG -0.86%) (GOOGL -0.83%) Google Cloud to upgrade their older data warehousing solutions. All three tech giants are bundling their data warehousing services with their other cloud infrastructure services, but Snowflake actually operates its platform on top of AWS, Azure, and Google Cloud.

Therefore, Snowflake is ironically paying hefty cloud hosting fees to its largest rivals. If push comes to shove, Amazon, Microsoft, and Google could aggressively undercut Snowflake's prices while raising their cloud hosting fees. That pressure could make it impossible for Snowflake to ever turn a profit.

Lastly, Snowflake's stock has plunged more than 50% since it hit its all-time high last November, but it's still expensive. With a market cap of $60 billion, Snowflake trades at nearly 30 times this year's sales. That high valuation -- along with its lack of profits -- makes it a risky stock to own as rising interest rates drive investors away from speculative growth plays.

Snowflake's stock was already expensive when it went public at $120 nearly two years ago, and it's still pricey today. Its business is growing rapidly, but it's hard to tell if it can achieve its ambitious goals for fiscal 2029 without being derailed by its larger competitors. I'd consider buying Snowflake if its stock gets cut in half again, but it's simply too rich and risky for my blood right now.

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Leo Sun has positions in Alphabet (A shares) and Amazon. The Motley Fool has positions in and recommends Alphabet (A shares), Alphabet (C shares), Amazon, Microsoft, and Snowflake Inc. The Motley Fool has a disclosure policy.

Continued here:
3 Reasons to Buy Snowflake Stock, and 3 Reasons to Sell - The Motley Fool