Experts discuss industry response to multicloud

Article by NetEvents editor Lionel Snell

When everyone first started talking about the cloud, it looked as if the pendulum might be swinging back towards a client/server situation, with the cloud being the server to a worldwide population of relatively thin clients.

Big cloud providers encouraged that model: give us your data and we will sell you access to our services.

But that one cloud has evolved from a single entity into a broad concept, one that includes private as well as public clouds, and then the inevitable hybrid clouds incorporating both.

Now we have multicloud. Is this just a new name for a hybrid cloud?

As I understood it, the difference should be that a hybrid cloud means an interconnected combination of public and private clouds, so that they become one integrated whole, whereas a multicloud means relying on several cloud services from several vendors for business purposes.

But the two are not distinct: for example, Dell EMC's cloud solutions promise to transform IT by leveraging a multicloud approach spanning a variety of public, private and hybrid resources.

And IBM's multicloud solutions page says: "Multicloud is a cloud adoption strategy that embraces a mix of cloud models (public, dedicated, private, managed) to best meet unique business, application and workload requirements."

Wikibon chief research officer and general manager Peter Burris says: "The fundamental business objective is to use data as an asset... digital business is about how we are going to put data to work differently."

In particular, data is being used to further the current trend of transforming products into services: and that is just what cloud is already doing in the IT industry. That is an important point, because it means the way the cloud is developing now could be a pattern for the way future businesses will develop.

Instead of repeating the usual cliché about data being the new oil, he pointed out what a lousy analogy that was: "Data is easily copied. It's easily shared. It's easily corrupted. It does not follow the laws of scarcity, and that has enormous implications, certainly for all the vendors on the panel and virtually every enterprise on the planet."

Seeing cloud development as a roadmap for broader, longer-term tech-industry trends does make this a vital topic, and it emphasises the point that the cloud is not about centralising computing on a massive scale, but about creating simpler, more powerful distributed computing.

Rather than pass our data up into some provider's cloud, we would rather keep the data in place: where it is gathered, where it is most secure, where intellectual property is easiest to protect, and where the actual business takes place.

This is not about moving data into the cloud. This is about moving the cloud and cloud services to the data. Within 10 years the cloud is going to reflect a natural organisation of data, whether it's at the edge, whether it's in the core or whether it's in public cloud attributes.

Cisco cloud platforms and solutions group product management VP Jean-Luc Valente points out that it is one thing to upload a terabyte of data to a cloud, but as the surge in data and applications rises towards exabytes, he says it would cost $30 million to upload just one of those to a public cloud.
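That figure is easy to sanity-check with rough arithmetic. As a minimal sketch, assuming a nominal transfer cost of around three cents per gigabyte (an illustrative assumption; real per-GB pricing varies by provider and route), moving a single exabyte lands in the tens of millions of dollars:

# Back-of-the-envelope estimate of bulk data transfer cost; not a provider quote.
GB_PER_EXABYTE = 1_000_000_000        # 10^9 GB in one exabyte (decimal units)
ASSUMED_COST_PER_GB_USD = 0.03        # hypothetical per-GB network transfer rate

def upload_cost_usd(exabytes: float) -> float:
    """Estimate the cost of moving `exabytes` of data at the assumed per-GB rate."""
    return exabytes * GB_PER_EXABYTE * ASSUMED_COST_PER_GB_USD

print(f"${upload_cost_usd(1):,.0f}")  # roughly $30,000,000 for one exabyte

At that assumed rate, one exabyte comes to roughly $30 million, the same order of magnitude Valente cites.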

This explosion of data at the edge is very serious from a networking and security angle.

Over the decades, networking has evolved from being a means to connect devices, to connecting sites, then connecting pages and individuals on social media. So is it now moving towards primarily connecting data and services?

According to NetFoundry CEO Galeal Zino: "Now that the application is the new edge, and data is everywhere, we actually need to reinvent networking and the ecosystems around networking to match that new reality."

NetScout strategic alliances area VP Michael Segal referenced recent discussions about AI and machine learning using data to train automatic processes.

A lot of this requires analysing data in real time, so edge computing becomes very important: the data needs to be close to where it is analysed and where it provides insight in real time.

Burris emphasises the increasingly critical role of the network: the actual training algorithms in use date back well before 2000; it was just that, until recently, there wasn't the parallel computing capability to put them to work effectively.

Apstra CEO and founder Mansour Karam is another who sees this as an exciting time to be in networking.

He says: "Managing networks like before no longer works. You can't manage networks manually by configuring devices by hand. It has to be done through software. It has to be done through powerful automation. You have to have the ability to abstract out all of those network services across all of those domains, and you have to have the ability to operate these networks, enforce those policies, set these configurations and verify them remotely in every location where data resides."

So the importance of the multicloud is not where the data lies, but how it is managed in an agile manner: by leveraging service mesh technology, applying containers, and adopting DevOps or DevSecOps. Once we can manage the edge with that same level of agility and automation, the data and the applications will exist wherever they are best placed.
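As a loose illustration of the abstraction Karam describes, declaring a policy once and then rendering and verifying it in every location where data resides, here is a minimal Python sketch; the data structures and function names are hypothetical and not taken from any vendor's product:

from dataclasses import dataclass

@dataclass
class Intent:
    """A single declarative policy, expressed once, independently of any device."""
    name: str
    allow_from: str
    allow_to: str

# The same intent applies everywhere the data lives: edge, private and public clouds.
INTENTS = [Intent("analytics-access", "edge-sensors", "analytics-service")]
LOCATIONS = ["edge-site-1", "private-dc", "public-cloud-region"]

def render_config(intent: Intent, location: str) -> dict:
    """Translate the abstract intent into a location-specific configuration stanza."""
    return {"location": location, "rule": f"permit {intent.allow_from} -> {intent.allow_to}"}

def verify(config: dict, observed_rules: set) -> bool:
    """Check that the deployed state actually matches the declared intent."""
    return config["rule"] in observed_rules

for location in LOCATIONS:
    for intent in INTENTS:
        cfg = render_config(intent, location)
        # In a real system the rendered config would be pushed by automation and
        # observed_rules would come from telemetry; here the check is simulated.
        ok = verify(cfg, observed_rules={cfg["rule"]})
        print(location, intent.name, "verified" if ok else "drifted")

The point of the sketch is the shape of the workflow, define once, render per location, verify continuously, rather than any particular syntax.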

Segal compares this spread to the architecture of the modern data centre: "A lot of server farms and a lot of east-west traffic and containers and virtualised environments in the data centre itself."

"Then you extend it, not necessarily immediately to the public cloud; in some cases to private clouds such as Equinix."

"Then you can have several different public cloud providers - Oracle, AWS, Microsoft Azure - and think about the complexities associated with connecting everything; many of them are edge computing environments."

Another point Burris made is that there has been a lot of emphasis on the data explosion, but what about the attendant software explosion as we move into a realm where these distributed services are accessed as both applications and data? Automation and abstraction require software; entities will be defined and policies enforced in software.

"There's going to be an enormous explosion in the amount of software that's being generated over the next few years."

But is that the real business issue?

Oracle Vice President Jon Mittelhauser works mostly with Fortune 1000 companies and government departments, where "a lot of our value add is the fast connection between customer data centres. The data can live in either place, but I agree that it's the key asset."

"For most companies, their data is their asset, not the software. Here in Silicon Valley, the software is highly valued, but outside of Silicon Valley it's the data, or what they do with the data, which software helps you with."

Mansour Karam sees a transition from the days when one began by partnering with hardware vendors.

Once the hardware was agreed, then one decided what software to use.

But that approach limited you to the software offerings that the particular hardware vendor supported. In the new, software-first world, companies start by partnering strategically with software vendors to define the software layer, the service layer, first. Once they've done that, they can go on and shop for hardware that specifically meets their needs.

To sum up, Peter Burris emphasises three key points:
