
Which [r]evolution lies ahead for cloud computing in Southeast Asia? – DatacenterDynamics

Hybrid cloud will continue to appeal to businesses

Business appetite for hybrid cloud grew significantly in 2019. The challenges businesses face in terms of new skills, new application needs, legacy IT management and more keep mounting as they realize that cloud computing is no panacea. At stake are the significant costs associated with extensive use of public cloud services and the ever more critical need for data control and security. Against this backdrop, businesses are turning away from exclusive public cloud offerings to move part of their data back to a private cloud. At the same time, they are abandoning on-premises cloud computing in favor of a hosted private cloud service that combines the best of both worlds - greater cost control and a higher level of security, all with the elasticity and scalability of the cloud.

According to the Nutanix Enterprise Cloud Index, 92 percent of IT decision-makers say this type of infrastructure best meets their needs. The same report also revealed that Singapore is the leading nation in hybrid cloud adoption because of the country's superior connectivity, which gives organizations based here a solid foundation to capitalize on the technology and remain competitive in our digital economy.

Following in the footsteps of hybrid cloud but going one step further, there comes multi-cloud - a combination of cloud environments ranging from on-premises cloud to hosted private cloud to public cloud, each dedicated to different use cases. Given that no single cloud today can competitively provide for all solutions, the most mature businesses find in multi-cloud the promise of excellence - selecting the best solutions from the entire cloud offering available to build a single application environment, in which all components are interdependent.

A business can choose to host its database with one provider, turn to another provider for its compute needs, store its data in yet another location, and orchestrate everything in a multi-cloud architecture. As applications become less and less monolithic and their components communicate in an increasingly standardized way, it is a safe bet that multi-cloud has a bright future ahead of it.

Previously, data security solutions focused on storage or networking capabilities. For example, if you wanted to store encryption keys securely, you had to rely on an HSM (Hardware Security Module), a monolithic solution that was poorly aligned with the cloud concept. The ability to secure data in use, called Confidential Computing, is a big leap forward. More processors will embed this capability, which will therefore be increasingly available in infrastructures.

Organizations are now able to store and run all or part of software programs that require end-to-end security, greatly improving the security of data encryption and, in turn, of entire systems. Data encryption will be more readily available, whether for data in transit or at rest, to enhance data security and give businesses much-needed peace of mind at a time when cyber breaches are becoming increasingly costly. Last year, the average organizational cost of a breach in ASEAN was estimated at US$2.62 million.
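The envelope-encryption pattern behind HSM- and KMS-backed storage can be illustrated in a few lines. This is a deliberately toy sketch: the XOR "cipher" is for illustration only and is not secure, and real systems use authenticated ciphers such as AES-GCM with the master key held inside the HSM:

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy cipher: XOR with a repeating key. Illustrative only -- NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def envelope_encrypt(plaintext: bytes, master_key: bytes):
    # 1. Generate a fresh data key for this object.
    data_key = os.urandom(32)
    # 2. Encrypt the payload with the data key.
    ciphertext = xor_bytes(plaintext, data_key)
    # 3. Wrap (encrypt) the data key with the master key, which in a real
    #    system never leaves the HSM or KMS.
    wrapped_key = xor_bytes(data_key, master_key)
    # Store ciphertext and wrapped key together; the master key stays remote.
    return ciphertext, wrapped_key

def envelope_decrypt(ciphertext: bytes, wrapped_key: bytes, master_key: bytes) -> bytes:
    # Unwrap the data key with the master key, then decrypt the payload.
    data_key = xor_bytes(wrapped_key, master_key)
    return xor_bytes(ciphertext, data_key)

master = os.urandom(32)
ct, wk = envelope_encrypt(b"customer record", master)
assert envelope_decrypt(ct, wk, master) == b"customer record"
```

The point of the pattern is that only the small wrapped key ever needs the master key, so bulk data can be encrypted and stored anywhere while the root of trust stays in dedicated hardware.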

With the introduction of data protection regulations and increased public awareness of this issue, businesses have realized the strategic nature of data sovereignty for themselves. The issue of the legal framework for data goes beyond the scope of cloud providers alone and also affects businesses that use cloud solutions. Local initiatives are multiplying to set the rules for a trusted cloud that meets everyone's expectations in terms of data sovereignty. Taking the recent French-German Gaia-X project as an example, it would not be surprising if, in 2020, private as well as public organizations favored their regional ecosystem over the American-Chinese duopoly. We should see the development of new collaborative projects allowing the implementation of more local alternatives, made possible by a collective awareness among European vendors of their ability to provide a relevant cloud offering for the Southeast Asia market.

Many other topics could have been addressed here, such as open source, blockchain, AI and machine learning, but also applications related to smart cities, autonomous cars and connected health. These technologies and fields of application involve the storage, exchange, and processing of a large - sometimes very large - amount of data, and are still in their infancy. In any case, one thing is for sure: society is evolving and cloud computing will continue to evolve as well, in order to better support it. ASEAN, the fastest-growing Internet market in the world, offers numerous opportunities for businesses here. To capitalize on them, however, there is a need to not only understand the current state of cloud computing but also pay close attention to its evolution in order to stay ahead of competitors.


4 things you need to understand about edge computing – VentureBeat

Edge computing has claimed a spot in the technology zeitgeist as one of the topics that signals novelty and cutting-edge thinking. For a few years now, it has been assumed that this way of doing computing is, one way or another, the future. But until recently the discussion has been mostly hypothetical, because the infrastructure required to support edge computing has not been available.

That is now changing as a variety of edge computing resources, from micro data centers to specialized processors to necessary software abstractions, are making their way into the hands of application developers, entrepreneurs, and large enterprises. We can now look beyond the theoretical when answering questions about edge computing's usefulness and implications. So, what does the real-world evidence tell us about this trend? In particular, is the hype around edge computing deserved, or is it misplaced?

Below, I'll outline the current state of the edge computing market. Distilled down, the evidence shows that edge computing is a real phenomenon born of a burgeoning need to decentralize applications for cost and performance reasons. Some aspects of edge computing have been over-hyped, while others have gone under the radar. The following four takeaways attempt to give decision makers a pragmatic view of the edge's capabilities now and in the future.

Edge computing is a paradigm that brings computation and data storage closer to where it is needed. It stands in contrast to the traditional cloud computing model, in which computation is centralized in a handful of hyperscale data centers. For the purposes of this article, the edge can be anywhere that is closer to the end user or device than a traditional cloud data center. It could be 100 miles away, one mile away, on-premises, or on-device. Whatever the approach, the traditional edge computing narrative has emphasized that the power of the edge is to minimize latency, either to improve user experience or to enable new latency-sensitive applications. This does edge computing a disservice. While latency mitigation is an important use case, it is probably not the most valuable one. Another use case for edge computing is to minimize network traffic going to and from the cloud, or what some are calling cloud offload, and this will probably deliver at least as much economic value as latency mitigation.

The underlying driver of cloud offload is immense growth in the amount of data being generated, be it by users, devices, or sensors. "Fundamentally, the edge is a data problem," Chetan Venkatesh, CEO of Macrometa, a startup tackling data challenges in edge computing, told me late last year. Cloud offload has arisen because it costs money to move all this data, and many would rather not move it if they don't have to. Edge computing provides a way to extract value from data where it is generated, never moving it beyond the edge. If necessary, the data can be pruned down to a subset that is more economical to send to the cloud for storage or further analysis.
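The pruning step can be sketched in a few lines: an edge node aggregates raw sensor readings locally and ships only summaries upstream. This is a minimal illustration; the window size and the choice of statistics are arbitrary, not drawn from any particular product.

```python
from statistics import mean

def prune_at_edge(readings, window=60):
    """Summarize raw sensor readings locally, keeping only the
    aggregates worth paying to ship to the cloud."""
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": round(mean(chunk), 2),
            "samples": len(chunk),
        })
    return summaries

# One reading per second for an hour: 3,600 raw values...
raw = [20.0 + (i % 10) / 10 for i in range(3600)]
summary = prune_at_edge(raw)
# ...reduced to 60 one-minute aggregates before leaving the edge.
assert len(summary) == 60
```

The same shape applies to video or audio: run the expensive analysis where the data is born and transmit only the results.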

A very typical use for cloud offload is to process video or audio data, two of the most bandwidth-hungry data types. A retailer in Asia with 10,000+ locations is processing both, using edge computing for video surveillance and in-store language translation services, according to a contact I spoke to recently who was involved in the deployment. But there are other sources of data that are similarly expensive to transmit to the cloud. According to another contact, a large IT software vendor is analyzing real-time data from its customers' on-premises IT infrastructure to preempt problems and optimize performance. It uses edge computing to avoid backhauling all this data to AWS. Industrial equipment also generates an immense amount of data and is a prime candidate for cloud offload.

Despite early proclamations that the edge would displace the cloud, it is more accurate to say that the edge expands the reach of the cloud. It will not put a dent in the ongoing trend of workloads migrating to the cloud. But there is a flurry of activity underway to extend the cloud formula of on-demand resource availability and abstraction of physical infrastructure to locations increasingly distant from traditional cloud data centers. These edge locations will be managed using tools and approaches evolved from the cloud, and over time the line between cloud and edge will blur.

The fact that the edge and the cloud are part of the same continuum is evident in the edge computing initiatives of public cloud providers like AWS and Microsoft Azure. If you are an enterprise looking to do on-premises edge computing, Amazon will now send you an AWS Outpost: a fully assembled rack of compute and storage that mimics the hardware design of Amazon's own data centers. It is installed in a customer's own data center and monitored, maintained, and upgraded by Amazon. Importantly, Outposts run many of the same services AWS users have come to rely on, like the EC2 compute service, making the edge operationally similar to the cloud. Microsoft has a similar aim with its Azure Stack Edge product. These offerings send a clear signal that the cloud providers envision cloud and edge infrastructure unified under one umbrella.

While some applications are best run on-premises, in many cases application owners would like to reap the benefits of edge computing without having to support any on-premises footprint. This requires access to a new kind of infrastructure, something that looks a lot like the cloud but is much more geographically distributed than the few dozen hyperscale data centers that comprise the cloud today. This kind of infrastructure is just now becoming available, and it's likely to evolve in three phases, with each phase extending the edge's reach by means of a wider and wider geographic footprint.

Phase 1: Multi-Region and Multi-Cloud

The first step toward edge computing for a large swath of applications will be something that many might not consider edge computing, but which can be seen as one end of a spectrum that includes all the edge computing approaches. This step is to leverage multiple regions offered by the public cloud providers. For example, AWS has data centers in 22 geographic regions, with four more announced. An AWS customer serving users in both North America and Europe might run its application in both the Northern California region and the Frankfurt region, for instance. Going from one region to multiple regions can drive a big reduction in latency, and for a large set of applications, this will be all that's needed to deliver a good user experience.
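As a toy illustration of the multi-region idea, routing each user to the geographically nearest region can be as simple as a distance comparison. The region names and coordinates below are illustrative, not an actual provider API:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical subset of provider regions with rough coordinates.
REGIONS = {
    "us-west (N. California)": (37.8, -122.4),
    "eu-central (Frankfurt)": (50.1, 8.7),
    "ap-southeast (Singapore)": (1.35, 103.8),
}

def haversine_km(a, b):
    # Great-circle distance in km between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearest_region(user_location):
    # Pick the region with the smallest great-circle distance to the user.
    return min(REGIONS, key=lambda r: haversine_km(user_location, REGIONS[r]))

# A user in Paris is routed to Frankfurt, not Northern California.
assert nearest_region((48.9, 2.35)) == "eu-central (Frankfurt)"
```

In practice this routing is done by DNS or anycast rather than application code, but the payoff is the same: shorter network paths and lower latency.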

At the same time, there is a trend toward multi-cloud approaches, driven by an array of considerations including cost efficiencies, risk mitigation, avoidance of vendor lock-in, and desire to access best-of-breed services offered by different providers. "Doing multi-cloud and getting it right is a very important strategy and architecture today," Mark Weiner, CMO at distributed cloud startup Volterra, told me. A multi-cloud approach, like a multi-region approach, marks an initial step toward distributed workloads on a spectrum that progresses toward more and more decentralized edge computing approaches.

Phase 2: The Regional Edge

The second phase in the edge's evolution extends the edge a layer deeper, leveraging infrastructure in hundreds or thousands of locations instead of hyperscale data centers in just a few dozen cities. It turns out there is a set of players who already have an infrastructure footprint like this: Content Delivery Networks. CDNs have been engaged in a precursor to edge computing for two decades now, caching static content closer to end users in order to improve performance. While AWS has 22 regions, a typical CDN like Cloudflare has 194.

What's different now is that these CDNs have begun to open up their infrastructure to general-purpose workloads, not just static content caching. CDNs like Cloudflare, Fastly, Limelight, StackPath, and Zenlayer all offer some combination of container-as-a-service, VM-as-a-service, bare-metal-as-a-service, and serverless functions today. In other words, they are starting to look more like cloud providers. Forward-thinking cloud providers like Packet and Ridge are also offering up this kind of infrastructure, and in turn AWS has taken an initial step toward offering more regionalized infrastructure, introducing the first of what it calls Local Zones in Los Angeles, with additional ones promised.

Phase 3: The Access Edge

The third phase of the edge's evolution drives the edge even further outward, to the point where it is just one or two network hops away from the end user or device. In traditional telecommunications terminology this is called the Access portion of the network, so this type of architecture has been labeled the Access Edge. The typical form factor for the Access Edge is a micro data center, which could range in size from a single rack to roughly that of a semi trailer, and could be deployed on the side of the road or at the base of a cellular network tower, for example. Behind the scenes, innovations in things like power and cooling are enabling higher and higher densities of infrastructure to be deployed in these small-footprint data centers.

New entrants such as Vapor IO, EdgeMicro, and EdgePresence have begun to build these micro data centers in a handful of US cities. 2019 was the first major buildout year, and 2020 and 2021 will see continued heavy investment in these buildouts. By 2022, edge data center returns will be in focus for those who made the capital investments in them, and ultimately these returns will reflect the answer to the question: are there enough killer apps for bringing the edge this close to the end user or device?

We are very early in the process of getting an answer to this question. A number of practitioners I've spoken to recently have been skeptical that the micro data centers in the Access Edge are justified by enough marginal benefit over the regional data centers of the Regional Edge. The Regional Edge is already being leveraged in many ways by early adopters, including for a variety of cloud offload use cases as well as latency mitigation in user-experience-sensitive domains like online gaming, ad serving, and e-commerce. By contrast, the applications that need the super-low latencies and very short network routes of the Access Edge tend to sound further off: autonomous vehicles, drones, AR/VR, smart cities, remote-guided surgery. More crucially, these applications must weigh the benefits of the Access Edge against doing the computation locally with an on-premises or on-device approach. However, a killer application for the Access Edge could certainly emerge, perhaps one that is not in the spotlight today. We will know more in a few years.

I've outlined above how edge computing describes a variety of architectures and that the edge can be located in many places. However, the ultimate direction of the industry is one of unification, toward a world in which the same tools and processes can be used to manage cloud and edge workloads regardless of where the edge resides. This will require the evolution of the software used to deploy, scale, and manage applications in the cloud, which has historically been architected with a single data center in mind.

Startups such as Ori, Rafay Systems, and Volterra, and big company initiatives like Google's Anthos, Microsoft's Azure Arc, and VMware's Tanzu are evolving cloud infrastructure software in this way. Virtually all of these products have a common denominator: They are based on Kubernetes, which has emerged as the dominant approach to managing containerized applications. But these products move beyond the initial design of Kubernetes to support a new world of distributed fleets of Kubernetes clusters. These clusters may sit atop heterogeneous pools of infrastructure comprising the edge, on-premises environments, and public clouds, but thanks to these products they can all be managed uniformly.
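Conceptually, fleet-wide management boils down to applying one desired-state manifest across many clusters and tracking per-cluster results. Here is a minimal sketch, with clusters mocked as plain dictionaries rather than any real Kubernetes or vendor API:

```python
def apply_to_fleet(clusters, manifest):
    """Apply one desired-state manifest across a fleet of clusters,
    recording per-cluster results instead of stopping at the first failure."""
    results = {}
    for name, cluster in clusters.items():
        if not cluster["reachable"]:
            # An edge site may be temporarily offline; note it and move on.
            results[name] = "skipped: unreachable"
            continue
        cluster["workloads"][manifest["name"]] = manifest["image"]
        results[name] = "applied"
    return results

# A fleet spanning cloud, on-premises, and edge locations.
fleet = {
    "cloud-us-east":  {"reachable": True,  "workloads": {}},
    "onprem-factory": {"reachable": True,  "workloads": {}},
    "edge-site-17":   {"reachable": False, "workloads": {}},
}
status = apply_to_fleet(fleet, {"name": "api", "image": "api:v2"})
assert status["edge-site-17"] == "skipped: unreachable"
assert fleet["onprem-factory"]["workloads"]["api"] == "api:v2"
```

Real multi-cluster tools add reconciliation loops, placement policies, and drift detection on top of this basic fan-out pattern.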

Initially, the biggest opportunity for these offerings will be in supporting Phase 1 of the edge's evolution, i.e. moderately distributed deployments that leverage a handful of regions across one or more clouds. But this puts them in a good position to support the evolution to the more distributed edge computing architectures beginning to appear on the horizon. "Solve the multi-cluster management and operations problem today and you're in a good position to address the broader edge computing use cases as they mature," Haseeb Budhani, CEO of Rafay Systems, told me recently.

Now that the resources to support edge computing are emerging, edge-oriented thinking will become more prevalent among those who design and support applications. Following an era in which the defining trend was centralization in a small number of cloud data centers, there is now a countervailing force in favor of increased decentralization. Edge computing is still in the very early stages, but it has moved beyond the theoretical and into the practical. And one thing we know is this industry moves quickly. The cloud as we know it is only 14 years old. In the grand scheme of things, it will not be long before the edge has left a big mark on the computing landscape.

James Falkoff is an investor with Boston-based venture capital firm Converge.


4 Reasons Why Cloud Technology Is Inseparable from Our Daily Lives – IT News Africa

As we head deeper into 2020, speculation, predictions and technology foresight are everywhere. But when it comes to cloud computing, no other technology has seen such rapid adoption across both the business and consumer sphere.

Cloud has become so inseparable from our everyday existence that speaking about its future as a standalone technology doesn't always make sense. Canon SA suggests four reasons why cloud technology is so important for our lives, including:

1. Transforming society

What is the future of our society? What does it look like, and what does the cloud have to do with it? Cloud will be a huge driving force in the futurization of towns, workplaces, institutions, and people because it will provide the digital infrastructure of tomorrow's smart cities (where an estimated 68% of the world's population will live by 2050).

Our cities are becoming smarter, but each new connected device creates data that has to be stored and analyzed. It will only be possible to support connected technology at this scale with a combination of edge technology and cloud computing. With this digital infrastructure in place, smart elevators and parking lots, driverless cars and drone taxis, trains and subways, farms and power plants will all be part of the functioning fabric of everyday life.

The cloud will also support emerging technologies such as artificial intelligence and help them to adapt to new platforms such as mobile. For example, while AI has already found its way onto mobile phones, these devices contain a lot of unstructured data such as emails, text messages, and photos. Analyzing unstructured data takes time and processing power that most smartphones don't have locally. With cloud powering the computing, we can expect our phones to become even smarter.

2. Transforming business

Cloud has become indispensable for businesses. One of the key reasons for this is how it has impacted innovation. Without the need for physical infrastructure and the operational and labour costs that come with it, cloud technology removes the typical financial barriers to innovation and digital transformation. Smaller businesses that would traditionally struggle to come up with the upfront investment required for on-premises implementations can access new technology through cloud delivery models.

Meanwhile, cloud has also reduced the risks associated with investment. Expensive, rigid contracts can be a barrier for many smaller companies. The scalability of cloud computing means offerings can grow or shrink back depending on the needs of the company, helping to manage costs and financial risk. These flexible, cloud-based models will continue to grow in popularity, with predictions stating that by the end of 2020, all new companies and 80% of historical vendors will offer subscription-based business models.
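The cost logic behind that elasticity is simple arithmetic. The following is a hedged sketch with made-up numbers, not actual provider pricing:

```python
def monthly_cost(hours_used, rate_per_hour, fixed_cost=0.0):
    """Monthly cost of capacity under a given pricing model."""
    return fixed_cost + hours_used * rate_per_hour

# Hypothetical workload that only needs heavy capacity 100 hours a month.
on_prem = monthly_cost(0, 0.0, fixed_cost=2000.0)  # amortized hardware, paid regardless of use
cloud = monthly_cost(100, 3.0)                     # pay only for the hours used

assert cloud < on_prem  # 300 < 2000: elasticity wins for bursty demand

# Run the same capacity flat out (720 hours) and the picture can flip:
assert monthly_cost(720, 3.0) > on_prem  # 2160 > 2000
```

The crossover point is why scalable, subscription-style pricing suits variable demand, while steady, saturated workloads can still justify owned infrastructure.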

Cloud computing also encourages innovation because its speed makes it easy to experiment with new ideas, as feedback can be gathered quickly. If a strategy isn't working, it can be corrected quickly rather than waiting until it has failed to take stock and draw lessons. This allows businesses to innovate more freely. Meanwhile, if businesses spot opportunities within the market, a cloud infrastructure allows them to respond and harness these opportunities more rapidly.

With so much opportunity in the cloud, businesses will continue to transition, with the worldwide public cloud services market projected to grow 17.5% in 2019 alone, with no sign of abating.

3. Impacting information

At the heart of every business is information. How we access it, harness it and share it is closely tied to how successful any organization can be. And this is where cloud has truly made an impact.

Thanks to cloud, business workers are no longer tied to the office and can access information and collaborate on projects from anywhere in the world, in real-time. This has revolutionized business models. For example, European company OpenDesk uploads furniture designs to the cloud and lets customers download the designs and commission a local manufacturer to build it in their region. This lowers shipping and inventory costs while reducing the companys carbon footprint.

Cloud-based platforms have also enabled businesses to be more efficient. As organizations grow they tend to become ever more siloed, with teams evolving idiosyncratic ways of working and ways of sharing information. Cloud helps companies bridge the gap, allowing all workers to access one place for everything they need. This also makes it more straightforward to create cross-company workflows, where before workers might have doubled up or lost sight of documents as they progressed.

4. Data deluge

An irony of all of this is that the more we use cloud solutions, the more we need them. The so-called data deluge can be attributed to how much we access cloud-based services in our daily lives. Using online systems, social networking, sharing videos, capturing traffic flow, collaborating with colleagues: these all add up to the vast quantities of data each person and each business generates, and that doesn't even touch on the data created by healthcare, education, science, and the military. All this information needs computing power to manage, store, and analyze it, for which we need cloud.

This is why talking about the future of cloud in isolation is so misleading; what we really need to be talking about is the future of society and the future of business. The future of our cities, our workplaces, and our institutions is heavily dependent on cloud technology, and this will only become truer as more services go serverless and digital infrastructures become more advanced.

It is not enough to say that the future of the cloud is heading in a certain direction. Rather, it would be more correct to make predictions about the future of humanity, society and business and how cloud technology will rise to meet those challenges.

Edited by Luis Monzon


D-Wave gives anyone working on responses to the COVID-19 free cloud access to its quantum computers – TechCrunch

D-Wave, the Canadian quantum computing company, today announced that it is giving anyone who is working on responses to COVID-19 free access to its Leap 2 quantum computing cloud service. The offer isn't limited to those focusing on new drugs; it is open to any researcher or team working on any aspect of how to solve the current crisis, be that logistics, modeling the spread of the virus, or working on novel diagnostics.

One thing that makes the D-Wave program unique is that the company also managed to pull in a number of partners that are already working with it on other projects. These include Volkswagen, DENSO, Jülich Supercomputing Centre, MDR, Menten AI, Sigma-i, Tohoku University, Ludwig Maximilian University and OTI Lumionics. These partners will provide engineering expertise to teams that are using Leap 2 to develop solutions to the COVID-19 crisis.

As D-Wave CEO Alan Baratz told me, this project started taking shape about a week and a half ago. In our conversation, he stressed that teams working with Leap 2 will get a commercial license, so there is no need to open source their solutions, and won't have a one-minute-per-month limit, which are typically the standard restrictions for using D-Wave's cloud service.

"When we launched Leap 2 on February 26th with our hybrid solver service, we launched a quantum computing capability that is now able to solve fairly large problems, at the scale of real-world production problems," Baratz told me. "And so we said: look, if nothing else, this could be another tool that could be useful to those working on trying to come up with solutions to the pandemic. And so we should make it available."

He acknowledged that there is no guarantee that the teams that will get access to its systems will come up with any workable solutions. "But what we do know is that we would be remiss if we didn't make this tool available," he said.

Leap is currently available in the U.S., Canada, Japan and 32 countries in Europe. That's also where D-Wave's partners are active and where researchers will be able to make free use of its systems.


Microsoft ‘might be the best tech stock in this market,’ Jim Cramer says – CNBC

Demand for cloud computing services has spiked during the coronavirus outbreak and Microsoft has been a "huge beneficiary from the lockdowns," CNBC's Jim Cramer said Monday.

"Microsoft's stock is a buy. Of course, I'd like it to come down after this gigantic rally, but Microsoft it might be the best tech stock in this market," the "Mad Money" host said.

Since reaching a low of $132.52 during Wall Street's slide into a bear market, shares of Microsoft have rallied more than 20% to $160.23 as of Monday's close.

On Saturday, the software behemoth's Azure cloud business revealed that cloud usage spiked triple digits on a weekly basis in communities under stay-at-home and social distancing mandates. Cloud services demand soared 775% in those areas, the company said in a blog post. Elsewhere, Microsoft Teams saw a "very significant spike" in usage, with about 44 million users logging more than 900 million meeting and calling minutes on the collaboration platform each day.

Windows Virtual Desktop and Power BI traffic usage were also up. The data was released amid the exodus of employees from offices and students from classrooms to remote work and learning environments in efforts to help slow the spread of COVID-19.

"The coronavirus has created a magnificent bull market in cloud computing," Cramer said. "These [Microsoft] numbers are astounding."

Still, Microsoft is not clear of the impact of the fast-spreading virus on the broader economy. The software company, along with other big technology names, is prone to headwinds in a global economy that's been knocked off base. As nonessential businesses across the country have been forced to close, unemployment claims are up significantly, with millions out of work.

As many as 47 million people could lose their jobs, sending the unemployment rate in the United States soaring to 32.1%, according to projections offered Monday by the St. Louis Federal Reserve. About 67 million Americans are believed to be working in jobs that are most at risk of being cut.

"I'm expecting a major worldwide slowdown, so there's a real possibility that Microsoft's booming cloud business could get cut back as millions are laid off," the host said, though he thinks tailwinds remain.

"I think the rapid-fire adoption of Microsoft's cloud platform overrides these macro concerns."

Disclosure: Cramer's charitable trust owns shares of Microsoft.


Creating the optimal cloud defense strategy – ITP.net

Back in the day, the theft and loss of backup tapes and laptops were a primary cause of data breaches. That all changed when systems were redesigned and data at rest was encrypted on portable devices. Not only did we use technology to mitigate a predictable human problem, we also increased the tolerance of failure. A single lapse, such as leaving a laptop in a car, doesn't have to compromise an organisation's data. We need the same level of failure tolerance, with access controls and IT security, in the cloud.

In the cloud, all infrastructure is virtualised and runs as software. Services and servers are not fixed but can shrink, grow, appear, disappear, and transform in the blink of an eye. Cloud services aren't the same as those anchored on-premises. For example, AWS S3 buckets have characteristics of both file shares and web servers, but they are something else entirely.

Practices differ too. You don't patch cloud servers; they are replaced with new software versions. There is also a distinction between the credentials used by an operational instance (like a virtual computer) and those that are accessible by that instance (the services it can call).

Cloud computing requires a distinct way of thinking about IT infrastructure.

A recent study by the Cyentia Institute shows that organisations using four different cloud providers have one-quarter the security exposure rate. Organisations with eight clouds have one-eighth the exposure. Both data points could speak to cloud maturity, operational competence, and the ability to manage complexity. Compare this to "lift and shift" cloud strategies, which result in over-provisioned deployments and expensive exercises in wastefulness.

So how do you determine your optimal cloud defense strategy?

Before choosing your deployment model, it is important to note that there isn't one definitive type of cloud out there. The National Institute of Standards and Technology's (NIST) definition of cloud computing lists three cloud service models: infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). It also lists four deployment models: private, community, public, and hybrid.
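One security consequence of the three NIST service models is how the shared-responsibility line shifts. A small sketch of that split, under a simplified and assumed layering (the layer names and the exact customer/provider boundary vary by provider and are illustrative here):

```python
# Assumed, simplified stack from the data down to the hardware.
STACK = ["data", "application", "runtime", "os", "virtualisation", "hardware"]

# Illustrative shared-responsibility split: which layers the
# *customer* still secures under each NIST service model.
CUSTOMER_MANAGED = {
    "iaas": {"data", "application", "runtime", "os"},
    "paas": {"data", "application"},
    "saas": {"data"},
}

def provider_managed(model):
    """Layers the cloud provider secures under the given service model."""
    return [layer for layer in STACK if layer not in CUSTOMER_MANAGED[model]]

print(provider_managed("paas"))  # → ['runtime', 'os', 'virtualisation', 'hardware']
```

The point of the exercise: under every model the customer keeps responsibility for the data itself, which is why "Am I using the cloud securely?" is the right question regardless of deployment choice.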


Here's a quick summary of how it all works through a security lens:

If you have a hybrid cloud deployment, you'll have to mix and match these threats and defenses. In that case, an additional challenge is to unify your security strategy without having to monitor and configure different controls across different models and environments.

Strategy and priority decisions should come before technological ones. Don't go to the cloud for the sake of it. A desired goal and a robust accompanying strategy will show the way and illuminate where deeper training and tooling are needed.


InfiniteIO Introduces Native File Support for Cloud Storage to Accelerate Access, Analytics and Collaboration for Cloud-Native Applications – Business…

AUSTIN, Texas--(BUSINESS WIRE)--InfiniteIO, which offers the world's fastest metadata platform to accelerate applications, today announced new Hybrid Cloud Tiering software that provides native file access for traditional and cloud-native applications. Powered by the InfiniteIO File Metadata Engine, InfiniteIO allows IT teams to access and manage all data migrated to object storage in native file format. By leveraging InfiniteIO's continuous data placement policies with native file format, customers and partners can utilize cloud-native services such as analytics, machine learning and serverless computing while maintaining internal security and governance, without changing the existing infrastructure or user experience.

"Organizations are facing increased complexity managing and moving data across diverse hybrid cloud infrastructure environments," said Scott Sinclair, senior analyst at Enterprise Systems Group. "InfiniteIO's metadata-first approach for enabling cloud-native data services can help enterprises standardize and automate data movement activities, which can accelerate cloud adoption, increase data mobility, and unlock the potential of the vast amounts of data they own."

"The public health, critical infrastructure and economic crises that we're seeing on a global scale have underscored the strategic value of data," said Mark Cree, CEO of InfiniteIO. "InfiniteIO's innovations in hybrid cloud computing will speed an organization's ability to extract insights, share information and collaborate at the lowest cost, which will mean a world of difference for today's fast-evolving scientific, medical and commercial applications."

Extending Cloud-Native Workflows for Post-production Processing

InfiniteIO's platform-agnostic data placement policies supporting native file format will provide new opportunities to extend and scale on-premises file workloads to public cloud providers such as Amazon Web Services, Google Cloud Platform, IBM Cloud, Microsoft Azure and Oracle Cloud Platform. Entire files placed by InfiniteIO from any primary NAS system to any S3-based object storage can be stored in cloud-native format. Users and applications will continue to securely access the data as needed, whether the data is on-premises or in the public cloud.

InfiniteIO also today announced extended metadata management support for Hitachi HCP S3 and Pure FlashBlade environments. Customers and partners building out hybrid cloud infrastructure continue to have a broad choice of primary and secondary storage partners for their workflows and archiving requirements. Currently supported S3-based private cloud platforms include Cloudian, Dell/EMC, HPE, NetApp, Scality and Quantum, among others.

The InfiniteIO File Metadata Engine (IFME) architecture processes metadata to reduce application latency from seconds to microseconds and enable high-performance hybrid clouds. Built on the IFME, the InfiniteIO Application Accelerator enables both on-premises and cloud-migrated file workloads to run faster by responding to metadata requests directly from the network. InfiniteIO Hybrid Cloud Tiering enables IT managers to optimize their IT budgets using the economics of cloud storage with high-performance data tiering and seamless file accessibility.
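The core idea described above, serving metadata requests from a fast index so applications never wait on the backing store, can be illustrated with a toy model. This is a conceptual sketch only, not InfiniteIO's actual engine or API; the class names, latency figure, and file layout are all assumptions:

```python
import time

class SlowStore:
    """Stand-in for a NAS or object store with per-request network latency."""
    def __init__(self, files):
        self.files = files  # path -> bytes
    def stat(self, path):
        time.sleep(0.05)  # simulated network round trip
        return {"path": path, "size": len(self.files[path])}
    def read(self, path):
        time.sleep(0.05)
        return self.files[path]

class MetadataAccelerator:
    """Answer metadata (stat) requests from an in-memory index;
    only full data reads fall through to the slow backing store."""
    def __init__(self, store):
        self.store = store
        # One-time index build; a real system keeps this in sync continuously.
        self.index = {p: {"path": p, "size": len(d)}
                      for p, d in store.files.items()}
    def stat(self, path):
        return self.index[path]       # served from memory, no round trip
    def read(self, path):
        return self.store.read(path)  # data itself still comes from the store

store = SlowStore({"/data/a.bin": b"x" * 1024})
fast = MetadataAccelerator(store)
info = fast.stat("/data/a.bin")  # microseconds, vs ~50 ms against the store
print(info["size"])  # → 1024
```

Since metadata operations typically dominate file-workload request counts, answering them from memory is where the latency win comes from, while tiered data can live on cheaper object storage.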

InfiniteIO plans to offer native file format support in Q2 as part of software release 2.5 for InfiniteIO Hybrid Cloud Tiering. Existing Hybrid Cloud Tiering customers with current maintenance agreements will be able to upgrade their software at no additional cost.

Additional Information

About InfiniteIO

InfiniteIO provides the lowest possible latency for file metadata, enabling applications to run faster, reduce development cycles, and increase data productivity. Based in Austin, Texas, InfiniteIO independently processes file metadata to simultaneously accelerate application performance and hybrid-cloud data tiering for global enterprises, research organizations and media companies. Learn more at http://www.infinite.io or follow the company on Twitter @infiniteio and LinkedIn.


Five trends affecting the adoption of cloud today – ITProPortal

Cloud computing is firmly established as the new normal for enterprise IT. Across industries such as manufacturing and automotive, cloud continues to be one of the fastest-growing segments of IT spend. With greater spend, however, comes greater responsibility for CIOs to invest budgets wisely, and a bigger impact if things go wrong.

CIOs looking to prepare their organisations to succeed in the turns ahead must take a differentiated approach to cloud computing. It will be essential for CIOs to develop a formal strategy that helps put individual cloud decisions in the context of the enterprise's strategic goals.

In the new era of cloud, cost optimisation will be crucial. Multicloud strategies will deliver provider independence and address concentration risk. The presence of in-house cloud skills will be a key indicator of enterprise agility, including the ability to distribute cloud services where customers want to consume them, on-premises and at the edge, while staying secure.

Below are five factors that will impact cloud adoption throughout this year, along with the steps CIOs can take to thrive in a cloud-first world.

For an organisation using cloud services across multiple geographies, finding just one public cloud infrastructure provider to meet its needs is a struggle. In organisations like this, the decision to use a multicloud strategy is clear.

In a recent Gartner survey of public cloud users, 81 per cent of respondents said they are working with two or more providers, and Gartner predicts that by 2024 multicloud strategies will reduce vendor dependency for two-thirds of organisations. However, this will primarily happen in ways other than application portability.

Application portability is the ability to migrate an application across platforms without change, and it is often seen as a benefit of a multicloud strategy. The reality of business practice, though, is that few applications ever move once they have been deployed in production and adopted by the business. Multicloud computing decisions usually rest on three considerations:

Through 2022, insufficient cloud infrastructure-as-a-service (IaaS) skills will delay half of enterprise IT organisations' migration to the cloud by two years or more. Today's cloud migration strategies tend more toward lift-and-shift than toward modernisation or refactoring. However, lift-and-shift projects do not develop cloud-native skills. This is creating a market where service providers cannot train and certify people quickly enough to satisfy the need for skilled cloud professionals.

As consulting companies struggle to find a bench of talented people with relevant cloud skills, clients are falling short of their cloud adoption objectives. System integrators (SIs) are the fallback, but clients often do not trust them because many SIs are also still learning and struggle to scale their operations to meet demand.

To overcome the challenges of this workforce shortage, enterprises looking to migrate workloads to the cloud should work with managed service providers and SIs that have a proven track record of successful migrations within the target industry. These partners must also be willing to quantify and commit to expected costs and potential savings.

Cloud security breaches consistently make news headlines. Yet the stories of these breaches are often framed with vague explanations: a "misconfigured database" or mismanagement by an unnamed third party.

The ambiguity that surrounds cloud computing can make securing the enterprise seem daunting. Concerns about security have led some CIOs to limit their organisational use of public cloud services.

However, the challenge exists not in the security of the cloud itself, but in the policies and technologies for security and control of the technology. In nearly all cases, it is the user, not the cloud provider, who fails to manage the controls used to protect an organisation's data.

CIOs must change their line of questioning from "Is the cloud secure?" to "Am I using the cloud securely?" They can then focus on implementing vendor-specific training for their staff and applying risk management practices to support cloud decisions.

Through 2024, nearly all legacy applications migrated to public cloud IaaS will require optimisation to become more cost-effective. Cloud providers will continue to strengthen their native optimisation capabilities to help organisations select the most cost-effective architecture that can deliver the required performance.

The market for third-party cost optimisation tools will also expand, particularly in multicloud environments. Their value will concentrate on higher-quality analytics that can maximise savings without compromising performance, provide independence from cloud providers and offer multicloud management consistency.

Teams need to recognise optimisation as an integral part of cloud migration projects, and develop skills and processes early so they can use tools to analyse operational data and find cost optimisation opportunities. Leveraging what cloud providers offer natively and augmenting it with third-party solutions is key to maximising savings.
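Analysing operational data for optimisation opportunities often starts with a simple rightsizing pass. A minimal sketch of the idea; the instance sizes, hourly prices, and the 40% peak-CPU threshold are all assumptions for illustration, not any provider's pricing:

```python
# Assumed hourly prices and downsize paths for hypothetical instance sizes.
HOURLY_PRICE = {"large": 0.20, "medium": 0.10, "small": 0.05}
DOWNSIZE = {"large": "medium", "medium": "small"}
HOURS_PER_MONTH = 730

def rightsize(instances, peak_cpu_threshold=40.0):
    """Flag instances whose peak CPU stays under the threshold and
    estimate monthly savings from moving each to the next size down.
    `instances` is a list of (name, size, peak_cpu_percent) tuples."""
    recs, savings = [], 0.0
    for name, size, peak_cpu in instances:
        smaller = DOWNSIZE.get(size)
        if smaller and peak_cpu < peak_cpu_threshold:
            saved = (HOURLY_PRICE[size] - HOURLY_PRICE[smaller]) * HOURS_PER_MONTH
            recs.append((name, size, smaller))
            savings += saved
    return recs, round(savings, 2)

# Hypothetical fleet with peak CPU utilisation from monitoring data.
fleet = [("web-1", "large", 22.0), ("db-1", "large", 85.0), ("batch-1", "medium", 10.0)]
recs, savings = rightsize(fleet)
print(recs, savings)  # → [('web-1', 'large', 'medium'), ('batch-1', 'medium', 'small')] 109.5
```

Real optimisation tools layer far more onto this (memory and I/O profiles, reserved-capacity commitments, performance headroom), but the pattern of driving recommendations from measured utilisation rather than provisioned capacity is the same.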

By 2023, the leading cloud service providers will have a distributed ATM-like presence to serve a subset of their services for low-latency application requirements. Many cloud service providers are already investing in ways to make their services available closer to the users that need to access them.

This trend will continue as the granularity of the regions covered by these cloud service providers increases. Micro data centres will be located in areas where a high population of users congregates, while pop-up cloud service points will support temporary requirements like sporting events and concerts.

Equipment supporting an appropriate subset of public cloud services will be housed in locations close enough to the point of need to support the low-latency requirements of the applications that use them. This will enable applications with such requirements to run directly from the cloud provider's native services without having to build infrastructure. The introduction and spread of ATM-like cloud service points can be thought of as a specific implementation of edge computing, which continues to grow rapidly.

In 2020, CIOs should consider how these trends will influence their cloud adoption and migration plans for years to come, taking steps now to prepare their IT infrastructure for the future of cloud.

Gregor Petri, Vice President and Analyst, Research and Advisory team, Gartner


JD to Snatch Share of China’s Cloud Market Dominated by the Big Three – CapitalWatch

Author: Anna Vodopyanova

After e-commerce and delivery, JD.com Inc. (Nasdaq: JD) is now looking to rival Alibaba Group (NYSE: BABA; HKEX: 9988) in cloud services.

China's major online retailer JD, which also operates JD Logistics and JD Digits, has started shifting its focus to technology this month through its subsidiary, JD Cloud & AI. Bowen Zhou, chairman of the company's tech committee, discussed the move in a recent CNBC interview. He said the ultimate goal is to deliver cloud computing services to Fortune 500 companies, as well as U.S. and European tech firms looking to expand in China.

Thus, JD is entering a highly competitive market dominated by conglomerates Alibaba, Tencent Holdings (HKEX: 0700), and Baidu (Nasdaq: BIDU). Alibaba's cloud computing segment has been its fastest-growing. For the December quarter, the giant posted 62% year-over-year revenue growth in cloud, at $1.5 billion, while its core commerce sector, excluding logistics and local commerce, grew 36%, according to its quarterly report posted in mid-February.

Baidu posted 35% y-o-y growth in "other revenues," which include cloud services, at $1.2 billion for the same quarter. China's top search engine operator and developer of the self-driving Apollo program, Baidu already has Tesla Inc. (Nasdaq: TSLA) on its side, providing map cloud services for the American automaker in China.

Tencent, which operates in gaming, payments, and social networking, reported $2.4 billion in cloud revenue for the fourth quarter and reached 1 million paying customers. The company said it especially increased market share in the internet services, municipal services, tourism, and industrial sectors.

All three have also been investing in blockchain and AI, as well as in chip development.

China's Growing Cloud

Canalys estimated in a March 2020 report that China's cloud market grew nearly 67% in the fourth quarter and took up 10.8% of the world total, making it the second-largest. Alibaba held 46.4% of the cloud computing spend, Tencent followed with 18%, and Baidu had 8.8% in China, according to the market analytics firm.

The coronavirus outbreak, which led to massive lockdowns of China's businesses nationwide in the first quarter, will undoubtedly accelerate digitization and cloud computing development.

In a corporate blog post last week, JD said it is "now one of the highest investors in R&D among Chinese Internet companies." Starting with the story of how an IBM computer defeated Garry Kasparov in a 1997 chess tournament, and describing Bowen Zhou's fascination with technology and AI through his service at IBM Watson and now JD, the article arrived at JD's trusted (a key word for Chinese businesses) technology services, with Zhou as their leader.

The blog cited Zhou as saying, "It is quite natural for JD to open its accumulated mature technology services to external partners, from e-commerce to logistics to finance, to enable partners down and up the supply chain to improve their own efficiency."

On Monday in New York, the stock in JD shifted 2% higher early, but slipped later in the day to $39.83 per American depositary share, 32 cents below Friday's close.


The European Web Hoster IKOULA Supports All Companies to Continue Operating during the Covid-19 crisis, by Offering Them Its Collaborative Work,…

To help companies facing a temporary downturn in business because of Covid-19, IKOULA is rallying its teams and providing several tools to support companies and help them pursue their projects.

"The current crisis in Europe is unprecedented, both in terms of health and economics," said Jules-Henri GAVETTI, President and co-founder of IKOULA. "As an SME, we know how important it is to stand together in such circumstances. Our baseline 'we host with care' makes sense more than ever. That's why it was important for us to provide companies with several tools, and thus help them continue their business as smoothly as possible while anticipating the recovery."

- Tools to back up your workstations: In the current context, remote work is shaking up habits and behaviors. This can lead employees to forget to back up their files, which is a real risk for the company. To address this, IKOULA provides 5 GB of backup via its BaaS (Backup as a Service) solution, IKOULA Cloud Backup by Acronis. Complete, flexible and above all very quick to set up, it allows everyone to back up and restore their data without constraints of time or place. Offer available via the following link with the code backupme

- Tools to urgently host a website or create a VPN: In these times of crisis, a company may need to urgently set up a VPN or host a short-lived website. In this case, a micro-server can be the ideal ally. IKOULA offers a Raspberry Pi 4 with IPv4 & IPv6 free for one month. Offer available via the following link with the code IKRASP

- Cloud computing tools to promote collaborative work despite the distance: With remote work, daily activities, as well as inter-service and inter-employee communication, can be difficult. Using cloud computing to deploy instances hosting open-source applications such as RocketChat for instant messaging, OpenOffice for office productivity, or Jitsi and Jami for videoconferencing will restore links inside the company and increase efficiency. During the entire confinement period, IKOULA provides its Cloud and One Click applications for free.

- A Synology server to store files & applications: In order to share files in the best possible way, to stream video and to use many other applications in complete security, IKOULA is offering this month a Synology DS115j NAS server. With this server in hosted mode, it is possible to add a multitude of applications, such as LogicalDoc for document management or Synology Drive for file sharing. Offer available via the following link with the code IKOULASYNO

About IKOULA: A pioneer of the French cloud since 1998, IKOULA owns its own datacenters in France (Reims and Laon), as well as two subsidiaries in Spain and the Netherlands. Because the human element is part of its DNA, IKOULA maintains a close relationship with its customers and puts at their disposal reactive teams of experts, available 24/7, able to advise and accompany them in their activities. IKOULA's teams are multilingual, in order to meet the internationalization challenges of its customers, spread over more than 60 countries on 4 continents.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200401005302/en/

Contacts

PRESS: IKOULA: Laurane VASSOR ARCARO, +33 01 84 01 02 69, lvassorarcaro@ikoula.com
