Google Cloud: How to Reduce Data Storage Costs and Maximize Performance – Enterprise Storage Forum

The Google Cloud Platform (GCP) has gained serious momentum in recent months. Data storage companies such as NetApp, Veritas, Cohesity, MapR, Cloudian and Nutanix are partnering with Google in an effort to broaden the appeal of their offerings. But the Google Cloud itself is a vast universe of services, storage tiers, speeds, feeds and price points.

Google Cloud Platform offers a diverse portfolio that ranges from flexible and unified storage classes to scalable and secure databases, said Chris Talbott, head of cloud storage product marketing at Google. It's designed to handle mission-critical workloads and connect data to any application.

As such, it consists of compute, storage, databases, machine learning, analytics, networking, big data, internet of things, developer tools, management tools and security features. With so many facets to understand, how can organizations use GCP to reduce storage costs, maximize performance and gain competitive advantage?

Some storage and IT managers have carte blanche from management to implement cloud-first strategies. They are under no pressure to get it right the first time. They are blessed with all the time in the world to learn from their mistakes and eventually arrive at the right cloud architecture for their organizations. But they represent the lucky few.

In most cases, storage managers are under the gun to show some immediate return. Within a few months, they know they will be called on the carpet to show tangible results in terms of lower storage costs and smaller budgets.

The best way to achieve that, suggested Talbott, is to look for the low-hanging fruit. One likely area, for example, is tape backup. Anyone who is about to invest yet again in tape hardware or an upgrade to the latest tape platform should consider the cloud. As well as offering the potential for cheaper storage, the cloud opens the door to doing something with the data (such as analytics) rather than letting it gather dust in a vault.

Think about underutilized data sets and easy wins, said Talbott. Many organizations are currently backing up and archiving data to tape, which requires costly infrastructure to maintain and provides little value outside of a recovery event. Not only can you reduce the effort and cost of maintaining that on-premises infrastructure, you can expose those datasets to other technologies in GCP, like data analytics and machine learning.

Talbott said that most users are smart enough to assess current needs. They take the time to figure out how they can use cloud storage to fulfill ongoing requirements. But not enough companies look ahead to gain some idea of how their needs may evolve in the years to come. Although it is impossible to predict such things accurately, it is wise to attempt some kind of projection of storage needs at least a couple of years into the future. That might save some embarrassment when you discover in a year's time that your plan for cloud storage was hopelessly inadequate.

Storage managers are advised not to rush headlong into cloud storage decisions. They should consult other areas of IT as well as line of business heads before making any firm commitments. After all, storage is just one part of a much larger IT ecosystem. It has to be implemented sensibly in full alignment with other components and enterprise objectives.

While the cloud can reduce the cost of storage and increase the accessibility of data, choosing the right combination of systems is essential in realizing the benefits of cloud storage, said Talbott.

In addition, Talbott recommended using the cloud's storage tiers to make information lifecycle policies work for you. As users begin to move data to the cloud, they should try to understand what data needs to be accessed, and when, so they can take advantage of different storage tiers. For example, GCP offers Nearline and Coldline archival storage. Nearline is best for data that is accessed a few times a year, but if you have data you don't need even on a yearly basis, you could move it to Coldline to reduce costs.
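That tier choice can be sketched as a simple rule of thumb: map expected access frequency to a storage class. The thresholds below are illustrative assumptions loosely following the article's framing (hot data stays multi-regional, data touched a few times a year fits Nearline, data not needed even yearly fits Coldline), not official GCP guidance.

```python
def pick_storage_class(accesses_per_year: float) -> str:
    """Suggest a GCS storage class from expected access frequency.

    Thresholds are illustrative assumptions, not official GCP sizing advice.
    """
    if accesses_per_year >= 12:   # roughly monthly or more: keep it hot
        return "MULTI_REGIONAL"
    if accesses_per_year >= 1:    # a few times a year: Nearline territory
        return "NEARLINE"
    return "COLDLINE"             # less than yearly: Coldline to cut costs
```

In practice, the decision should also weigh retrieval fees and minimum storage durations, which differ by class.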

To optimize storage costs throughout an object's life, it will likely spend time in each tier of storage, from multi-regional to cold, said Talbott. Object lifecycle policies can automate that cascade.
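The cascade Talbott describes can be expressed as a GCS bucket lifecycle configuration: rules that change an object's storage class as it ages. The sketch below builds such a config as plain JSON; the age thresholds (30 and 365 days) and the bucket name are assumptions for illustration only.

```python
import json

# Cascade: Multi-Regional/Standard -> Nearline at 30 days -> Coldline at 365.
# Thresholds are illustrative assumptions, not a recommendation.
lifecycle_config = {
    "lifecycle": {
        "rule": [
            {
                "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
                "condition": {"age": 30},
            },
            {
                "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
                "condition": {"age": 365},
            },
        ]
    }
}

# Write the config so it could be applied with, e.g.:
#   gsutil lifecycle set lifecycle.json gs://my-bucket
# (gs://my-bucket is a hypothetical bucket name)
with open("lifecycle.json", "w") as f:
    json.dump(lifecycle_config, f, indent=2)
```

Once applied, the transitions run automatically, so no one has to remember to demote aging data by hand.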

It's one thing to dump data into a cloud repository as a means of reducing costs. But the real value comes in how that data is managed strategically.

Various data and storage management solutions are available from vendors such as StorEdge, Red Hat and NetApp. While functionality varies markedly from one tool to another, in general they are designed to help organizations maintain control over their data: where it resides, how quickly they can access it and how they can harness it in their business.

For organizations that are looking to understand how their GCP cloud investments are meeting business expectations, NetApp's data management solutions provide clear visibility and insight into cost, performance and data placement to better understand the impact of IT decisions, said Michael Elliott, cloud advocate at NetApp. Organizations can also address the challenges associated with regulatory, data security and sovereignty requirements by maintaining control of their data across its entire lifecycle.

Part of the reason so many storage businesses are partnering with Google is that it provides easy access to innovation and expanded markets.

Collaborate closely with Google, as they are very responsive when it comes to integrating with their services, said Patrick Rogers, head of marketing and product at Cohesity. APIs may work differently across different cloud providers, so what is possible with one provider may not exactly be the same with another.

Greed is good, said Gordon Gekko in the movie Wall Street. Google has paraphrased that to: green is good. Accordingly, the company believes those journeying to the cloud should look beyond cost savings, which are regarded almost universally as the typical measure of success for those adopting cloud storage.

In many cases, cost savings are just a start. This year Google will reach 100 percent renewable energy for all its operations, data centers included. This means users can reduce the environmental impact of their data storage and operations by taking advantage of data centers that use 50 percent less energy than a typical data center.

With the exponential increase in data being stored, it becomes increasingly important to consider how green the electrons are that power its storage, said Talbott.
