
Hear how three startups are approaching quantum computing differently at TC Disrupt 2020 – TechCrunch

Quantum computing is at an interesting point. It's on the cusp of being mature enough to solve real problems. But as in the early days of personal computers, there are lots of different companies trying different approaches to solving the fundamental physics problems that underlie the technology, all while another set of startups looks ahead, thinking about how to integrate these machines with classical computers and how to write software for them.

At Disrupt 2020 on September 14-18, we will have a panel with D-Wave CEO Alan Baratz, Quantum Machines co-founder and CEO Itamar Sivan and IonQ president and CEO Peter Chapman. The leaders of these three companies are all approaching quantum computing from different angles, yet all with the same goal of making this novel technology mainstream.

D-Wave may just be the best-known quantum computing company thanks to an early start and smart marketing in its early days. Alan Baratz took over as CEO earlier this year after a few years as chief product officer and executive VP of R&D at the company. Under Baratz, D-Wave has continued to build out its technology and especially its D-Wave quantum cloud service. Leap 2, the latest version of its efforts, launched earlier this year. D-Wave's technology is also very different from that of many other efforts thanks to its focus on quantum annealing. That drew a lot of skepticism in its early days, but it's now a proven technology and the company is now advancing both its hardware and software platform.

Like Baratz, IonQ's Peter Chapman isn't a founder either. Instead, he was the engineering director for Amazon Prime before joining IonQ in 2019. Under his leadership, the company raised a $55 million funding round in late 2019, which the company extended by another $7 million last month. He is also continuing IonQ's bet on its trapped ion technology, which makes it relatively easy to create qubits and which, the company argues, allows it to focus its efforts on controlling them. This approach also has the advantage that IonQ's machines are able to run at room temperature, while many of its competitors have to cool their machines to as close to zero kelvin as possible, which is an engineering challenge in itself, especially as these companies aim to miniaturize their quantum processors.

Quantum Machines plays in a slightly different part of the ecosystem from D-Wave and IonQ. The company, which recently raised $17.5 million in a Series A round, is building a quantum orchestration platform that combines novel custom hardware for controlling quantum processors (because once quantum machines reach a bit more maturity, a standard PC won't be fast enough to control them) with a matching software platform and its own QUA language for programming quantum algorithms. Quantum Machines is Itamar Sivan's first startup, which he launched with his co-founders after getting his Ph.D. in condensed matter and material physics at the Weizmann Institute of Science.

Come to Disrupt 2020 and hear from these companies and others on September 14-18. Get a front-row seat with your Digital Pro Pass for just $245 or with a Digital Startup Alley Exhibitor Package for $445. Prices are increasing next week, so grab yours today to save up to $300.


D-Wave's quantum computing cloud comes to India – The Hindu


Canadian quantum computing company D-Wave Systems is launching its cloud service in India, giving developers and researchers in the country real-time access to its quantum computers.

Through this geographic expansion, D-Wave's 2000Q quantum computers, hybrid solvers and the application environment can be used via its cloud platform Leap to drive development of business-critical and in-production hybrid applications.

"Quantum computing is poised to fundamentally transform the way businesses solve critical problems, leading to new efficiencies and profound business value in industries like transportation, finance, pharmaceuticals and much more," Murray Thom, VP of Software and Services at D-Wave, said in a statement.

"The future of quantum computing is in the cloud. That's why we were eager to expand Leap to India and Australia, where vibrant tech scenes will have access to real-time quantum computers and the hybrid solver service for the first time, unlocking new opportunities across industries."

As part of this rollout, users in India and Australia can work on D-Wave's Leap and Leap 2 platforms.

The two cloud platforms offer updated features and tools, including a hybrid solver service that can solve large and complex problems of up to 10,000 variables, and an integrated developer environment that provides a prebuilt, ready-to-code environment in the cloud configured with the latest Ocean SDK for quantum hybrid development in Python.
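For readers curious what "quantum hybrid development in Python" looks like in practice, here is a minimal sketch of a hybrid solver call using the Ocean SDK; the toy problem is purely illustrative, and running it assumes a D-Wave Leap account with an API token configured locally.

```python
# A minimal Leap hybrid solver call with the Ocean SDK (toy problem).
# Requires the dwave-ocean-sdk package and a configured Leap API token.
import dimod
from dwave.system import LeapHybridSampler

# Toy QUBO: minimize x0 + x1 - 2*x0*x1, whose minima are at x0 == x1.
Q = {(0, 0): 1, (1, 1): 1, (0, 1): -2}
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

# The hybrid solver accepts much larger problems (up to the variable
# counts quoted above); the call pattern is the same.
sampler = LeapHybridSampler()
sampleset = sampler.sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)
```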

D-Wave's systems and software have been used in financial modelling, machine learning and route optimization.

Its latest launch in India comes about a year after the country's Department of Science and Technology (DST) chalked out plans to build its own quantum computers.

In early 2019, DST launched a programme focused on quantum computing, called Quantum-Enabled Science and Technology (QuEST). As part of QuEST, India earmarked a ₹80 crore investment to be spent over a span of three years to facilitate research in setting up quantum computers.

A year later, Finance Minister Nirmala Sitharaman, in her Union Budget 2020 speech, announced a National Mission on Quantum Technologies and Applications (NM-QTA) with an outlay of ₹8,000 crore for the next five years.

"Quantum technology is opening up new frontiers in computing, communications, cyber security with wide-spread applications," Sitharaman said in her Budget speech.

"It is expected that lots of commercial applications would emerge from theoretical constructs which are developing in this area."

NM-QTA's focus, as outlined by the minister, will be in fundamental science, translation, technology development, and human and infrastructural resource generation.

Other areas of quantum computing applications will include aerospace engineering, numerical weather prediction, simulations, securing communications and financial transactions, cyber-security, and advanced manufacturing.



Ripple Executive Says Quantum Computing Will Threaten Bitcoin, XRP and Crypto Markets – Here's When – The Daily Hodl

Ripple CTO David Schwartz says quantum computing poses a serious threat to the future of cryptocurrency.

On the Modern CTO Podcast, Schwartz says quantum computing will break the cryptographic algorithms that keep cryptocurrencies like Bitcoin (BTC) and XRP, as well as the internet at large, secure.

"From the point of view of someone who is building systems based on conventional cryptography, quantum computing is a risk. We are not solving problems that need powerful computing (like payments and liquidity); the work that the computers do is not that incredibly complicated. But because it relies on conventional cryptography, very fast computers present a risk to the security model that we use inside the ledger."

"Algorithms like SHA-2 and ECDSA (elliptic curve cryptography) are sort of esoteric things deep in the plumbing, but if they were to fail, the whole system would collapse. The system's ability to say who owns Bitcoin or who owns XRP, or whether or not a particular transaction is authorized, would be compromised."

"A lot of people in the blockchain space watch quantum computing very carefully, and what we're trying to do is have an assessment of how long before these algorithms are no longer reliable."
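For the technically curious, here is a small, illustrative sketch of the two primitives Schwartz names, using Python's built-in hashlib for SHA-2 and the third-party ecdsa package for signing over secp256k1, the curve Bitcoin uses; the transaction bytes are a made-up stand-in. Shor's algorithm on a sufficiently large quantum computer could recover the private key from the public key, which is the failure mode Schwartz describes, while SHA-2 is considered less exposed because Grover's algorithm offers only a quadratic speed-up.

```python
# The two primitives Schwartz names, in miniature.
# SHA-2 (here SHA-256) fingerprints data; ECDSA proves who authorized it.
# Requires the third-party "ecdsa" package (pip install ecdsa).
import hashlib
from ecdsa import SigningKey, SECP256k1

tx = b"example transaction data"             # stand-in for a real transaction
print("SHA-256:", hashlib.sha256(tx).hexdigest())

sk = SigningKey.generate(curve=SECP256k1)    # private key: ownership
vk = sk.get_verifying_key()                  # public key: on-ledger identity
signature = sk.sign(tx)                      # authorizes the transaction
print("valid signature:", vk.verify(signature, tx))  # True

# A large quantum computer running Shor's algorithm could derive sk from vk,
# compromising "who owns" and "what is authorized" exactly as described above.
```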

Schwartz says he thinks developers have at least eight years until the technology, which leverages the properties of quantum physics to perform fast calculations, becomes sophisticated enough to crack cryptocurrency.

"I think we have at least eight years. I have very high confidence that it's at least a decade before quantum computing presents a threat, but you never know when there could be a breakthrough. I'm a cautious and concerned observer, I would say."

Schwartz says crypto coders should closely follow the latest public developments in quantum computing, but he's also concerned about private efforts from the government.

"The other fear would be if some bad actor, some foreign government, secretly had quantum computing way ahead of what's known to the public. Depending on your threat model, you could also say, what if the NSA has quantum computing? Are you worried about the NSA breaking your payment system?"

"While some people might realistically be concerned (it depends on your threat model), if you're just an average person or an average company, you're probably not going to be a victim of this. Let's say hypothetically some bad actor had quantum computing that was powerful enough to break things; they're probably not going to go after you unless you are a target of that type of actor. As soon as it's clear that there's a problem, these systems will probably be frozen until they can be fixed or improved. So, most people don't have to worry about it."



The Hyperion-insideHPC Interviews: Dr. Michael Resch Talks about the Leap from von Neumann: ‘I Tell My PhD Candidates: Go for Quantum’ – insideHPC

Dr. Michael M. Resch of the University of Stuttgart has professorships, degrees, doctorates and honorary doctorates from around the world, and he has studied and taught in Europe and the U.S. But for all the work he has done in supercomputing for the past three-plus decades, he boils down his years in HPC to working with the same, if always improving, von Neumann architecture. He's eager for the next new thing: quantum. "Going to quantum computing, we have to throw away everything and we have to start anew," he says. "This is a great time."

In This Update. From The HPC User Forum Steering Committee

By Steve Conway and Thomas Gerard

After the global pandemic forced Hyperion Research to cancel the April 2020 HPC User Forum planned for Princeton, New Jersey, we decided to reach out to the HPC community in another way: by publishing a series of interviews with members of the HPC User Forum Steering Committee. Our hope is that these seasoned leaders' perspectives on HPC's past, present and future will be interesting and beneficial to others. To conduct the interviews, Hyperion Research engaged insideHPC Media.

We welcome comments and questions addressed to Steve Conway, sconway@hyperionres.com or Earl Joseph, ejoseph@hyperionres.com.

This interview is with Prof. Dr. Dr. h.c. mult. Michael M. Resch. He is dean of the faculty for energy, process and biotechnology of the University of Stuttgart, director of the High Performance Computing Center Stuttgart (HLRS), the Department for High Performance Computing, and the Information Center (IZUS), all at the University of Stuttgart, Germany. He was an invited plenary speaker at SC07. He chairs the board of the German Gauss Center for Supercomputing (GCS) and serves on the advisory councils for Triangle Venture Capital Group and several foundations. He is on the advisory board of the Paderborn Center for Parallel Computing (PC2). He holds a degree in technical mathematics from the Technical University of Graz, Austria and a Ph.D. in engineering from the University of Stuttgart. He was an assistant professor of computer science at the University of Houston and was awarded honorary doctorates by the National Technical University of Donetsk (Ukraine) and the Russian Academy of Sciences.

He was interviewed by Dan Olds, HPC and big data consultant at Orionx.net.

The HPC User Forum was established in 1999 to promote the health of the global HPC industry and address issues of common concern to users. More than 75 HPC User Forum meetings have been held in the Americas, Europe and the Asia-Pacific region since the organization's founding.

Olds: Hello, I'm Dan Olds on behalf of Hyperion Research and insideHPC, and today I'm talking to Michael Resch, who is a professor at the HPC center in Stuttgart, Germany. How are you, Michael?

Resch: I am fine, Dan. Thanks.

Olds: Very nice to talk to you. I guess let's start at the beginning. How did you get involved in HPC in the first place?

Resch: That started when I was a math student and I was invited to work as a student research assistant and, by accident, that was roughly the month when a new supercomputer was coming into the Technical University of Graz. So, I put my hands on that machine and I never went away again.

Olds: You sort of made that machine yours, I guess?

Resch: We were only three users. There were three user groups and I was the most important user of my user group because I did all the programming.

Olds: Fantastic, that's a way to make yourself indispensable, isn't it?

Resch: In a sense.

Olds: So, can you kind of summarize your HPC background over the years?

Resch: I started doing blood flow simulations, so at first I looked into the very traditional Navier-Stokes equations that were driving HPC for a long time. Then I moved on to groundwater flow simulations (pollution of groundwater, tunnel construction work) and everything, until after like five years I moved to the University of Stuttgart, where I started to work with supercomputers, focusing more on the programming side, the performance side, than on the hardware side. This is sort of my background in terms of experience.

In terms of education, I studied a mixture of mathematics, computer science and economics, and then did a Ph.D. in engineering, which was convenient if you're working on Navier-Stokes equations. So, I try to bring all of these things together to make an impact in HPC.

Olds: What are some of the biggest changes you've seen in HPC over your career?

Resch: Well, the biggest change is probably that when I started, as I said, there were three user groups. These were outstanding experts in their field, but supercomputing was nothing for the rest of the university. Today, everybody is using HPC. That's probably the biggest change, that we are moving from something where you had one big system and a few experts around that system, and you moved to a larger number of systems and tens of thousands of experts working with them.

Olds: And, so, the systems have to get bigger, of course.

Resch: Well, certainly, they have to get bigger. And they have to get, I would say, more usable. That's another feature, that now things are more hidden from the user, which makes it easier to use them. But at the same time, it takes away some of the performance. There is this combination of hiding things away from the user and then the massive parallelism that we saw, and that's the second most important thing that I think we saw in the last three decades. That has made it much more difficult to get high sustained performance.

Olds: Where do you see HPC headed in the future? Is there anything that has you particularly excited or concerned?

Resch: [Laughs] I'm always excited and concerned. That's just normal. That's what happens when you go into science and that's normal when you work with supercomputers. I see, basically, two things happening. The first thing is that people will merge everything that has to do with data and everything that has to do with simulation. I keep saying it's data analytics, machine learning, artificial intelligence. It's sort of a development from raw data to very intelligent handling of data. And these data-intensive things start to merge with simulation, like we see people trying to understand what they did over the last 20 years by employing artificial intelligence to work its way through the data trying to find what we have already done and what should we do next, things like that.

The second thing that is exciting is quantum computing. It's exciting because it's out of the ordinary, in a sense. You might say that over the last 32 years the only thing I did was work with improved technology and improved methods and improved algorithms or whatever, but I was still working in the same John von Neumann architecture concept. Going to quantum computing we have to throw away everything and we have to start anew. This is a great time. I keep telling my Ph.D. candidates, go for quantum computing. This is where you make an impact. This is where you have a wide-open field of things you can explore and this is what is going to make the job exciting for the next 10, 12, 15 years or so.

Olds: That's fantastic, and your enthusiasm for this really comes through. Your enthusiasm for HPC, for the new computing methods, and all that. And thank you so much for taking the time.

Resch: It was a pleasure. Thank you.

Olds: Thank you, really appreciate it.


The Computational Limits of Deep Learning Are Closer Than You Think – Discover Magazine

Deep in the bowels of the Smithsonian National Museum of American History in Washington DC sits a large metal cabinet the size of a walk-in wardrobe. The cabinet houses a remarkable computer: the front is covered in dials, switches and gauges, and the inside is filled with potentiometers controlled by small electric motors. Behind one of the cabinet doors is a 20 x 20 array of light sensitive cells, a kind of artificial eye.

This is the Perceptron Mark I, a simplified electronic version of a biological neuron. It was designed in the late 1950s by the American psychologist Frank Rosenblatt at Cornell University, who taught it to recognize simple shapes such as triangles.
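For a sense of what the Mark I computed with its motors and potentiometers, here is a minimal software sketch of Rosenblatt's learning rule in Python; the shape-recognition task is stood in for by a toy linearly separable problem (logical OR), since the machine's 20 x 20 photocell array fed the same rule, just with more inputs.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Rosenblatt's rule: nudge the weights toward each misclassified example."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            err = target - pred          # -1, 0, or +1
            w += lr * err * xi           # no update when the prediction is right
            b += lr * err
    return w, b

# Toy stand-in for shape recognition: a linearly separable rule (logical OR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w, b = train_perceptron(X, y)
print([1 if xi @ w + b > 0 else 0 for xi in X])  # [0, 1, 1, 1]
```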

Rosenblatt's work is now widely recognized as the foundation of modern artificial intelligence, but at the time it was controversial. Despite the original success, researchers were unable to build on this, not least because more complex pattern recognition required vastly more computational power than was available at the time. This insatiable appetite prevented further study of artificial neurons and the networks they create.

Today's deep learning machines also eat power, lots of it. And that raises an interesting question about how much they will need in future. Is this appetite sustainable as the goals of AI become more ambitious?

Today we get an answer thanks to the work of Neil Thompson at the Massachusetts Institute of Technology in Cambridge and several colleagues. This team has measured the improved performance of deep learning systems in recent years and shown how it depends on increases in computing power.

By extrapolating this trend, they say that future advances will soon become unfeasible. "Progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable," say Thompson and colleagues, echoing the problems that emerged for Rosenblatt in the 1960s.

The team's approach is relatively straightforward. They analyzed over 1000 papers on deep learning to understand how learning performance scales with computational power. The answer is that the correlation is clear and dramatic.

In 2009, for example, deep learning was too demanding for the computer processors of the time. "The turning point seems to have been when deep learning was ported to GPUs, initially yielding a 5-15x speed-up," they say.

This provided the horsepower for a neural network called AlexNet, which famously triumphed in a 2012 image recognition challenge where it wiped out the opposition. The victory created huge and sustained interest in deep neural networks that continues to this day.

But while deep learning performance increased by 35x between 2012 and 2019, the computational power behind it increased by an order of magnitude each year. Indeed, Thompson and co say this and other evidence suggests the computational power for deep learning has increased 9 orders of magnitude faster than the performance.

So how much computational power will be required in future? Thompson and co say that the error rate for image recognition is currently 11.5 percent, using 10^14 gigaflops of computational power at a cost of millions of dollars (ie 10^6 dollars).

They say achieving an error rate of just 1 percent will require 10^28 gigaflops. And extrapolating at the current rate, this will cost 10^20 dollars. By comparison, the total amount of money in the world right now is measured in trillions (ie 10^12 dollars).
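The arithmetic behind those headline numbers is easy to reproduce. Here is a rough sketch that fits a power law through the two figures quoted above and checks the implied dollar cost, assuming, as the article does, that cost scales linearly with gigaflops.

```python
import math

# Quoted above: 11.5% error at 10^14 gigaflops (~10^6 dollars),
# and a projected 1% error at 10^28 gigaflops. Assume error ~ compute^(-alpha).
e1, c1 = 11.5, 1e14
e2, c2 = 1.0, 1e28

alpha = math.log10(e1 / e2) / math.log10(c2 / c1)
print(f"implied alpha: {alpha:.4f}")      # ~0.0758: error falls very slowly

# Cost, assuming dollars scale linearly with gigaflops (10^14 GF ~ 10^6 USD):
dollars = c2 * (1e6 / 1e14)
print(f"cost at 1% error: 10^{math.log10(dollars):.0f} dollars")  # 10^20
```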

What's more, the environmental cost of such a calculation will be enormous: an increase in the amount of carbon produced of 14 orders of magnitude. "Progress along current lines is rapidly becoming economically, technically, and environmentally unsustainable," conclude Thompson and colleagues.

The future isn't entirely bleak, however. Thompson and co's extrapolations assume that future deep learning systems will use the same kinds of computers that are available today.

But various new approaches offer much more efficient computation. For example, in some tasks the human brain can outperform the best supercomputers while running on little more than a bowl of porridge. Neuromorphic computing attempts to copy this. And quantum computing promises orders of magnitude more computing power with relatively little increase in power consumption.

Another option is to abandon deep learning entirely and concentrate on other forms of machine learning that are less power hungry.

Of course, there is no guarantee that these new techniques and technologies will work. But if they don't, it's hard to see how artificial intelligence will get much better than it is now.

Curiously, something like this happened after the Perceptron Mark I first appeared: a period of stagnation that lasted for decades and is now known as the AI winter. The Smithsonian doesn't currently have it on display, but it surely marks a lesson worth remembering.

Ref: arxiv.org/abs/2007.05558 : The Computational Limits of Deep Learning


China’s newest technology stock exchange is thriving despite the pandemic – The Economist

But the country's answer to America's Nasdaq is not for the faint of heart

Jul 22nd 2020

SHANGHAI'S STAR market, a stock exchange for China's home-grown technology firms, celebrates its first birthday today. It has much to cheer about. Launched with an ambition to rival Nasdaq, a venue in New York where many American tech giants are listed, the toddler has surpassed the older ChiNext exchange in Shenzhen and already ranks second globally by capital raised in IPOs so far this year. And it just received a lovely present. On July 20th Ant Group, the financial-services arm of Alibaba, an e-commerce giant, said it had chosen STAR as one of two exchanges on which it is planning its long-awaited listing (the other winner is Hong Kong, which has also grown popular among fast-growing Chinese companies). Though the exact size and timing of the offering are still unknown, it could well turn out to be the largest IPO ever. Ant was last valued at $150bn in 2018; listing even a small portion of its shares could place it above Saudi Aramco's IPO last year, the largest yet at $26bn.

Two factors explain STAR's appeal to issuers. First, it enjoys rock-solid political backing. China's government sees it as a way to channel capital towards young technologies it wants to nurture, from high-tech sensors to quantum computing. To help money flow, it has loosened restrictions that apply to stock offerings elsewhere (eg, on other Chinese exchanges, an informal price cap of 23 times earnings and a 44% ceiling on first-day gains). It has also fast-tracked IPO approvals, which can take years on other exchanges. Second, the mood has soured against Chinese companies in America, where many promising companies from the mainland would have traditionally considered listing. America has threatened to impose sanctions on Chinese officials. Earlier this year, the Senate also passed legislation that could force American-listed Chinese firms to delist if they fail to show their audit work papers to American regulators for three consecutive years. That makes Asian alternatives more palatable.

It helps that investors like STAR too. Some offerings have attracted orders amounting to thousands of times the quantum of shares up for sale; some stocks have rocketed tenfold within hours of listing. But STAR is not for the faint-hearted. The prices of shares listed there are sometimes way off those of similar securities listed on more mature markets, hinting that they may be divorced from company fundamentals. The Shanghai price of Semiconductor Manufacturing International Corporation, a chipmaker that raised 53.2bn yuan ($7.6bn) in early July through a dual IPO, is more than three times its Hong Kong price, for example. Such inconsistencies can exist on the way down, as well as on the way up. As investors sell older stocks to pile into the newest and flashiest offerings, prices can slide by double-digit percentages, suggesting the market may not have the liquidity yet to absorb large IPOs in quick succession.

This creates a conundrum for China's rulers. Investors' positive reaction to STAR may prompt regulators to ease stockmarket rules on other mainland exchanges, leading to more efficient, liquid markets and allowing the government to funnel capital to strategic sectors. But untamed speculation by fickle punters makes bubbles more likely, and the political, PR and financial risks of market crashes rank among the things that keep China's masters awake at night. If it proves little else than a fashionable casino, STAR's allure could dim fast.


Cloud Computing: How Creators and Innovators are Redrawing the Landscape of the Tech Industry – TechiExpert.com

The world of technology, much like the cosmos, is constantly expanding. New technologies are being introduced with such rapidity that the technology of today becomes outdated, even obsolete, in just a few weeks' time. The latter half of the 20th century saw speedy technological advancements, headlined by the birth of computers. These machines bolstered the central human endeavor of simplifying tasks, increasing efficiency and productivity, and optimally utilizing the limited resources, including time, at our disposal. The advent of the cloud marked a watershed moment in the field of computer science and forever changed the laws governing the technological universe.

In the simplest of terms, cloud computing can be understood as instantaneous access to computing services delivered over the internet, or "the cloud." These services encompass hardware solutions such as servers and software solutions such as analytics, databases, and intelligence. Cloud-based systems offer a host of advantages to businesses.

Given its extensive benefits for businesses, it is not surprising to note that the cloud computing market is growing at breakneck speed. According to Fortune Business Insights, the global cloud computing market size stood at USD 199.01 billion in 2019 and is expected to reach USD 760.98 billion by 2027, surging at a CAGR of 18.6% during the forecast period. It will be worthwhile to check out some of the most notable strides that have been made in this industry in the past few years.
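As a quick sanity check, those endpoints are consistent with the standard compound-growth formula, CAGR = (end/start)^(1/years) - 1; the exact quoted 18.6% depends on which forecast window the report uses.

```python
# CAGR sanity check on the Fortune Business Insights figures quoted above.
start, end, years = 199.01, 760.98, 2027 - 2019
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~18.3%, in line with the report's quoted 18.6%
```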

In April 2020, BlackRock, the US-based investment corporation, entered into a strategic collaboration with Microsoft to host its investment infrastructure, Aladdin, on the Microsoft Azure cloud platform. By making Aladdin available on Azure, BlackRock will be able to utilize the cloud system's network of datacenters spread around the world, allowing the company to vastly expand its scale of financial instruments and services for its clients. Moreover, this move will also empower BlackRock to leverage Azure's superior cloud capabilities to meet its clients' localized needs, whilst maintaining the security of its system. With this strategic maneuver, BlackRock hopes to gain a leading position in the financial services industry.

Wipro, the Indian IT giant, joined forces with IBM in June 2020 to develop and deliver hybrid cloud solutions to businesses. Under the partnership, the two companies have created the Wipro IBM Novus Lounge at Wipro's campus in Bangalore, which has been designed as a state-of-the-art innovation facility. Here, holistic solutions harnessing the capabilities of Artificial Intelligence (AI), Machine Learning (ML), and Internet of Things (IoT) will be engineered to accelerate innovation among enterprises and independent developers. The focus, according to Wipro, will be to design hybrid cloud systems to facilitate migration, management, and transformation of mission-critical business operations across private and public clouds.

The US-domiciled computing innovator, Oracle Corporation, announced in July 2020 the release of the Oracle Dedicated Region Cloud@Customer, its fully-managed cloud system that brings together all of the company's second-generation cloud solutions to customer datacenters. Costing only $500,000 a month, the new system includes Oracle's flagship Oracle Cloud applications and the Autonomous Database. At a price that is only a fraction of that quoted by other cloud providers, enterprises will be able to gain access to the complete set of comprehensive cloud services offered by Oracle's public cloud regions in their own datacenters. Organizations such as the Oman Information and Communications Technology Group and the Nomura Research Institute in Japan have already adopted Oracle's innovative cloud offering to deliver high-quality services to customers with an enhanced security quotient.

Even more exciting than the industry developments listed above is the proliferation of start-ups in the booming cloud market. These nascent companies have reenergized the tech world with their innovative ideas and are making tangible breakthroughs in cloud-based technologies. The market now counts many prominent start-ups among its ranks.

These entities represent only a small fraction of the cloud start-ups that have cropped up since the turn of the century. The important thing to remember is that the cloud market is opening avenues for individuals and groups to channel their entrepreneurial zeal and provide actionable solutions to enterprises.

Enhancing operational efficiency and maximizing revenues are two central goals of any enterprise. Cloud-based systems, strengthened by the integration of automation technologies such as AI, ML, and IoT, appear to possess the properties necessary for businesses to attain these dual aims. However, cloud computing is by no means infallible. It has its shortcomings, most notably its vulnerability to hackers and cyber-criminals. Nonetheless, the potential of cloud systems is immense and seemingly boundless. Moreover, crisis situations such as the current COVID-19 pandemic, which has forced people to work from home, are further fuelling the adoption of cloud in organizations. How the future of this industry unfolds will certainly be an interesting experience for the world.


Got $3,000 to Invest? Here Are 3 No-Brainer Stocks to Buy in Cloud Computing – Motley Fool

The emergence of the COVID-19 pandemic earlier this year has changed everything, from how we live to how we work, and everything in between. Remote work and videoconferencing have combined to cause a notable acceleration in the adoption of cloud computing, a trend that was already well underway.

When the discussion turns to the cloud, Amazon (NASDAQ:AMZN), with its Amazon Web Services (AWS), invariably dominates the conversation as the pioneer and still leader in the space. There's little doubt it remains a great place for investors to cut their teeth on the cloud computing revolution, as revenue from AWS grew more than 36% in 2019.

Yet the opportunities don't stop there, as cloud computing refers to a whole range of software and services that can be provided remotely. And this massive multiyear digital transformation is just getting started.

Let's look at three areas of the cloud, and identify one no-brainer stock opportunity from each.


In its simplest terms, a platform-as-a-service company provides a cloud-based framework for developers, giving them all the resources they need to build applications. This includes servers, storage, and networking that can be managed remotely.

As stay-at-home and remote work became the order of the day, it also became more important than ever for companies to be able to communicate with their customers, particularly those using apps -- from food delivery to ride-hailing, from password resets to customer service, and everything in between.

That's where Twilio (NYSE:TWLO) comes in. The company provides the building blocks that allow developers to include the company's communication technology in their apps, allowing them to seamlessly embed messaging systems -- all of which can be accomplished in a matter of hours, where it previously took weeks.

The company has a network of 29 cloud data centers in nine geographic regions that serve developers in 180 countries. Twilio's growing list of customers, which numbered more than 190,000 at last count, grew by 23% in the first quarter and continued to expand beyond our borders. And 28% of its revenue now comes from international markets, increasing from 24% in 2018.

The proof is in the pudding. Twilio's revenue grew by 57% year over year in the first quarter, while its dollar-based net expansion rate of 143% (its highest level since late 2018) shows that once customers are on board, they not only stick around, but tend to expand their spending over time.

As the need for in-app communication continues to grow, this will no doubt continue to expand the demand for Twilio's services.


Infrastructure as a service is the industry Amazon pioneered, making data-center services (like storage, networking, computing, and security) available on an as-needed basis.
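In practice, "as-needed" means provisioning a resource with an API call rather than a purchase order. Here is a minimal sketch using AWS's boto3 SDK (AWS being the pioneer named above); the bucket name is a placeholder and the call assumes AWS credentials are already configured.

```python
# Infrastructure as a service in two calls: storage on demand, no hardware bought.
# Assumes AWS credentials are configured (e.g., via `aws configure`).
import boto3

s3 = boto3.client("s3")
# Note: outside us-east-1, create_bucket also needs a LocationConstraint.
s3.create_bucket(Bucket="my-example-bucket-20200722")  # placeholder name
s3.put_object(
    Bucket="my-example-bucket-20200722",
    Key="hello.txt",
    Body=b"storage provisioned on demand",
)
```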

Microsoft (NASDAQ:MSFT) has long trailed AWS in the space, but its Azure cloud computing offering has been closing the gap by growing at a much faster rate. As an example, in the first calendar quarter of 2020, revenue from AWS grew 33%, while Azure grew 59%.

But that's not the only tool in Microsoft's bag of tricks. The company also provides a host of other services via the cloud, like Microsoft 365, Teams videoconferencing software, Windows Virtual Desktop, and Dynamics accounting software, to name a few.

The diversity of Microsoft's business also makes it attractive. It has exposure to consumer markets and enterprise products (like Xbox gaming and its LinkedIn professional network) in addition to its business and personal software and fast-growing cloud segments.

That strength was on full display in Microsoft's fiscal fourth quarter, ended June 30. Even in the face of the pandemic, revenue grew 13% year over year, with each of its business segments contributing to the better-than-expected performance. Azure grew 47% while Xbox jumped 65%, both boosted by the remote-work and stay-at-home economy.

This wide assortment of businesses and its high-growth cloud segment make Microsoft an attractive addition to any portfolio.


As the name implies, software as a service allows businesses and consumers to rent software rather than buy it, and access it via the cloud. While the concept is commonplace today, that wasn't so in 2012 when Adobe (NASDAQ:ADBE) made the then-radical decision to switch from shrink-wrapped physical software discs to making its suite of creative software tools available via a cloud-based subscription model.

The rest, as they say, is history. No longer content to offer just its creative software, Adobe has a wide range of products including marketing services, customer relationship management, and analytics tools. Over the past couple of years, the company has made several major acquisitions, pushing it further into marketing and even e-commerce.

Adobe has produced record revenue in each of the past 21 consecutive quarters. In the second quarter, revenue grew 14% year over year, a deceleration from its recent growth, but impressive nonetheless considering the economic environment wrought by the pandemic. The bottom line grew at an even faster pace, with operating income increasing by 35%.

The rapid transition to remote work put several of Adobe's businesses front and center. The demand for digital documents surged, with the use of Adobe PDF services climbing 40% sequentially, while the number of documents shared in Acrobat jumped 50% year over year. The company also experienced accelerating adoption of Adobe Sign, its e-signature solution, which has soared 175% so far this year. Installations of Adobe Reader increased 43%, while those of Adobe Scan climbed 66%.

This illustrates the broad reach of Adobe's cloud-based offerings, and strong demand should continue as the need for remote work remains.


The global cloud computing market is expected to grow at a compound annual rate of nearly 19% over the next several years, reaching $761 billion by 2027, according to a report by Fortune Business Insights. Each of these companies is a leader in its respective category, giving investors an outstanding opportunity to profit from the accelerating shift to the cloud.

If you're looking for evidence of the market-beating potential of these cloud innovators, look no further than the results so far this year. Each company has beaten both the S&P 500 and the NASDAQ Composite, and by a wide margin.


Illinois regulators reject proposal to allow utilities cost recovery for cloud-based computing – Utility Dive

Dive Brief:

Coming changes in the electric power grid, as renewable energy production continues to increase, will require increasingly sophisticated controls.

Utilities will need the ability to predict and react to power supply as well as to new sources of demand. They will need the ability to instantly analyze enormous amounts of data, and that will require ever-increasing data processing power.

Paying for cloud computing services is an issue regulators and utilities have been considering in many states.

Renewable energy advocates more than a year ago recognized that Illinois regulators were in a cutting-edge debate and urged the ICC and Illinois lawmakers to make the change in the accounting treatment of cloud computing expenses.

"As representatives of the renewable energy industry, we are strong proponents of policies that allow utilities to modernize computing systems and employ technologies that allow them to manage large amounts of data quickly and cost effectively to foster more rapid deployment of renewables on the electrical grid," wrote the chief executives of the American Council on Renewable Energy, the American Wind Energy Association and the Solar Energy Industries Association in an April 2019 letter to a state legislative committee reviewing an early version of the commission's rule changes.

"The grid is now accommodating the two-way flow of electrons and increased interconnections across the distribution system. All of these new technologies are creating large amounts of data and require advanced data management," their letter argued.

As negotiated since then by the ICC staff, utility representatives, and renewable and clean energy advocates, the proposed accounting rule changes would have allowed utilities to treat 80% of cloud service expenses as if they were a capital investment while absorbing the remaining 20% as a normal operating expense.

In a split 3-to-2 vote reflecting at least in part the concerns of lawmakers about the impact on customer bills if utilities were able to more easily pass on the cost of cloud-based data services, the ICC decided to end further discussion and leave current accounting rules in place.

The decision was a shock to some of the parties that had participated in nearly three years of discussions about the change.

At least one utility, Commonwealth Edison, took the setback in stride.

"We appreciate the initiative taken by the Illinois Commerce Commission throughout the cloud-based computer services docket. While we are continuing to review options with the other stakeholders, we know the Commission saw this as a potential opportunity to drive innovation and value for customers, and that remains our priority. We will continue to evaluate both on-premise and cloud-based solutions to ensure that the path chosen is best suited for the needs of the utility and its customers," the company said in a statement

"We were surprised and disappointed that the ICC voted down the cloud-computing accounting rule proposal, given the unanimous vote to move forward with the proceeding last fall, the consensus reached by utilities, ICC staff and advanced energy companies on a simplified rule, and the absence of any objection in the recent record," said Danny Waggoner, director at Advanced Energy Economy, in a prepared statement.

Dissenting commission members were bitter about the development and said so before three members of the five-member commission voted to end the proposed rule changes.

Commissioner Maria Bocanegra, who voted against the majority, said the decision to walk away from making any changes meant the effort to modernize the regulations has dwindled down to nothing more than "a circular and futile exercise in failed logic."

The decision, she said, means that "we are essentially denying the technological process that our systems and our society will need and will depend on more and more in the future."

"Illinois has consistently worked to develop processes that allow utilities to experiment with new technologies and not delay innovation necessary for grid modernization. That is, until now," Commissioner Sadzi Oliva said before voting against the majority. "The majority decision today sets Illinois back as a state progressive in its approach to innovation."

Commission Chair Carrie Zalewski said the decision to leave current regulations as they are does not prevent a utility from asking the agency to consider the cost of cloud data processing services in its next rate case.

She argued that the proposed rule changes as negotiated by the commission's staff and utilities did not really level the playing field for cloud-based services compared to in-house computing. She said the changes, if adopted, could hurt consumers because the proposed language lacks a necessary "consumer protection mechanism" previously suggested by lawmakers.

"I think cloud-based solutions are the future for this industry and will transform the utilities' day-to-day operations and cost structure in a way that we cannot even predict today," Zalewski said. "We need to let technology and markets thrive by getting out of the way."

That is not how Mishal Thadani, director of market development and policy at the artificial intelligence company Urbint, sees the commission's decision.

"Utility investment in cloud-based software is critical to addressing the safety and reliability challenges of today and tomorrow," Thadani said."Adopting specific rules that clarify the capitalization of these investments would close a gap in the regulatory framework, creating cost-efficiencies and making it easier for utilities to unlock the value of emerging technologies for their consumers."

"If not for the order closing the proceeding, this would have been a groundbreaking measure for utility innovation," he said.


Future Trends in the World of Cloud Computing – Analytics Insight

Advancements in the cloud computing industry move at a rapid pace and are sometimes difficult to anticipate. Cloud computing is changing organizations in many ways. Whether it is how they store their information or how they safeguard their data, cloud computing is helping organizations in every sector.

Sharp and clever organizations are continually searching for the most inventive ways to improve and achieve their business goals. When it comes to cloud technology, an increasing number of organizations understand the advantages this technology can give them and are starting to look for more cloud computing options to run their business operations.

Today, the cloud has grown substantially and has been universally recognized by researchers and organizations alike as a significant force in fundamentally changing the whole IT landscape, from how data centers are built and how software is deployed to how upgrades are handled, and much more.

Given the crucial role that IT plays in the present business landscape, cloud computing is also changing the way that organizations work. A huge number of organizations of all sizes, in a wide range of industries, are using cloud-based software, platforms, and even infrastructure to modernize processes, lower IT complexity, gain better visibility, and reduce costs.

On the promising future of cloud computing, IT experts concur that it will be at the forefront of technologies used to address significant business challenges. According to IDC, at least half of IT spending is on cloud-based technologies, and this is anticipated to reach 60% of all IT infrastructure and 60-70% of all software, services and technology spend by 2020.

According to Forbes, approximately 83% of enterprise workloads will be in the cloud by 2020. This shows us that the future of cloud computing looks very promising. Here are some big-picture trends that will shape the cloud computing market in the years ahead.

Information theft, data breaches, and data loss are major threats even for conventional IT infrastructures. But with more organizations moving to cloud platforms, it's important to ensure that cloud service providers can build a secure framework to guarantee the safety of their customers' information.

Cloud security isn't only a trend in cloud computing, it's a necessity that is emphasized by every organization. Consequently, there is an enormous demand for cloud security providers that guarantee data practices fully abide by GDPR and other compliance norms.

The prevalence of smartphones and tablets is also having a major effect on the business world. Rather than being tied to desks in an office, workers today can use their mobile devices to carry out their jobs anytime, from pretty much anyplace.

The anytime, anyplace access that these cloud-based applications provide is perfect for people who are always on the go. Rather than stopping by the office to use their personal computers, employees can simply sign into an application with a web-enabled device like a phone or tablet and carry out their tasks in the cloud.

By encouraging access to accurate data and making communication simpler, the cloud is perfect for breaking down barriers, both internally, between departments or individual staff members, and externally, between clients and customer service employees.

When these barriers are removed, organizations lose the friction that used to slow them down. Automated supply chains and dashboards that show real-time information are just two instances of cloud-enabled tools that are on the rise and are helping to make organizations progressively frictionless.

Because cloud platforms are complex, it is important to ensure that the platform provides a fast and secure communication environment. With a service mesh, clients have a dedicated layer for service-to-service communication, making their cloud platform highly robust and secure. The service mesh is a fundamental part of a cloud platform.

As cloud ecosystems grow and are adapted to fit the changing needs of clients, a service mesh can fill the various requirements that arise, from service identity to access policy, within the cloud platform. The mesh sets up a network communication infrastructure that lets you decouple and offload most of your application's network functions from your service code.
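The decoupling idea can be sketched in miniature. This is not how a real mesh such as Istio or Linkerd is implemented (those run as network proxies alongside each service), but it shows the separation described above: the business logic carries no retry or failure-handling policy of its own.

```python
import itertools
import time

_attempts = itertools.count(1)

def flaky_service(item_id: str) -> dict:
    """Business-logic stand-in: the first two calls hit network trouble."""
    if next(_attempts) < 3:
        raise ConnectionError("transient network failure")
    return {"item": item_id, "in_stock": True}

def with_sidecar(call, *args, retries=3, backoff=0.1):
    """Sidecar-like wrapper: the retry policy lives outside the service code."""
    for attempt in range(retries):
        try:
            return call(*args)
        except ConnectionError:
            time.sleep(backoff * (2 ** attempt))  # exponential backoff
    raise RuntimeError("service unavailable after retries")

print(with_sidecar(flaky_service, "sku-42"))  # succeeds on the third try
```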

With a cloud computing platform that is open source, organizations can see various advantages. They can rapidly scale their cloud infrastructure, adding new capabilities is far more straightforward than with a closed-source platform, and there are fewer security concerns.

The tech industry is moving toward a collaborative workplace, and choosing an open-source cloud computing service is by all means the correct decision for businesses that are new or scaling. This is the reason numerous experts claim that open source is the future of cloud computing.
