Category Archives: Cloud Storage

Here’s how iCloud’s free storage and upgrades compare to the competition – 9to5Mac

Apple first introduced iCloud at WWDC 2011, with Steve Jobs touting it as the best way to store documents, mail, backups, and more in the cloud. One thing that has infamously stayed the same since that 2011 launch: Apple gives you just 5GB of iCloud storage for free.

Nearly 13 years later, how do iCloud's free storage offering and paid upgrade plans compare to the competition?

My research was inspired by John Gruber at Daring Fireball, who last week published his call for Apple to offer more with iCloud, cost-to-Apple be damned.

There are other companies beyond these four that offer cloud storage. pCloud, for instance, is an increasingly popular choice that offers both subscription and lifetime storage options: you can get 2TB of storage from pCloud for $99/year or $399 for a lifetime license. I'm not sure how sustainable that business model is, but it's an interesting proposition.

But while it's easy to do a comparison based purely on storage amounts, there are other factors to consider, especially for Microsoft and Apple.

Microsoft's 100GB plan also includes access to other features and services, including mobile and web versions of Word, Excel, PowerPoint, and more. The 1TB plan includes access to those services on the desktop, web, and mobile.

Microsoft doesn't offer a way to subscribe to just OneDrive storage. It used to, but it removed that option in 2023. You can, however, add up to 1TB of additional storage to a Microsoft 365 plan at $0.01 per GB.
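At that per-gigabyte rate, the add-on cost scales linearly. A quick sketch of the arithmetic; note that billing the rate monthly is my assumption, as the article doesn't state the cadence:

```python
# Cost sketch for Microsoft 365 add-on storage at $0.01 per GB.
# Integer cents avoid floating-point rounding; a monthly billing
# cadence is an ASSUMPTION, not stated in the article.
RATE_CENTS_PER_GB = 1  # $0.01 per GB

def addon_cost_cents(extra_gb: int) -> int:
    """Cost in cents for `extra_gb` of additional storage."""
    return extra_gb * RATE_CENTS_PER_GB

print(f"${addon_cost_cents(200) / 100:.2f}")   # 200GB extra  -> $2.00
print(f"${addon_cost_cents(1000) / 100:.2f}")  # full 1TB cap -> $10.00
```

So maxing out the 1TB add-on would run about $10, putting it in the same ballpark as the storage tiers discussed above.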

Another thing in Microsoft's favor: its free tier includes 5GB of cloud storage and 15GB of mailbox storage. In Apple's case, you get 5GB for free, and everything (including iCloud Mail) counts toward that limit.

Apple's iCloud+ storage plans, meanwhile, include various other premium features:

Dropbox's and Google's plans also include some benefits beyond just storage, but does anyone really care?

I generally found myself in agreement with Gruber's conclusion. Sure, Apple's pricing and plans are competitive with those of its rivals. Even the free tier at 5GB isn't significantly out of line with the broader market. Still, after 13 years, something needs to change:

So on the one hand, it's not like Apple's iCloud storage pricing is out of line with its competitors. But on the other hand, the free tier of iCloud has been stuck at 5 GB since the day iCloud was announced, which was so long ago that Steve Jobs announced it at his final WWDC keynote in 2011. iCloud's $1/month 50 GB and $3/month 200 GB tiers have been unchanged since 2015. Like the stingy U.S. minimum wage (which was last increased, to $7.25/hour, in 2009), these tiers ought to be adjusted for inflation periodically, but aren't.
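For a sense of what an inflation adjustment would actually look like, here is a rough sketch. The roughly 30% cumulative U.S. CPI inflation from 2015 to 2024 is my own approximate assumption, not a figure from Gruber or the article:

```python
# Rough inflation adjustment for iCloud's $1/month 50GB tier (unchanged since 2015).
# ASSUMPTION: ~30% cumulative U.S. CPI inflation 2015-2024 (approximate figure,
# not from the article).
PRICE_2015 = 1.00            # dollars per month
CUMULATIVE_INFLATION = 0.30  # assumed cumulative rate

adjusted = PRICE_2015 * (1 + CUMULATIVE_INFLATION)
print(f"${adjusted:.2f}/month in 2024 dollars")  # -> $1.30/month in 2024 dollars
```

In other words, holding the nominal price flat for nine years amounts to a quiet real-terms price cut of roughly a quarter, which is Gruber's point: the tiers get cheaper for Apple to justify, not more generous for users.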

Another takeaway I had: all of this is confusing. Chances are, if you're an iPhone user looking for more cloud storage, your best bet is iCloud+.

Check out Gruber's full write-up for more.




Datafy raises $6 million Seed round led by Insight Partners to optimize cloud storage – CTech

Datafy, which has developed a cloud storage management platform, announced on Wednesday the completion of a $6 million Seed funding round led by global software investor Insight Partners.

The cloud storage sector is experiencing rapid growth, driven by an exponential increase in data generation from rising AI adoption and the broader shift to cloud technologies. The global cloud storage market is projected to grow from $132 billion in 2024 to $665 billion by 2032. Datafy offers up to 50% savings on storage costs and provides a self-optimizing, developer-independent solution.

Datafy's flagship product, focusing on EBS (Elastic Block Store) on the AWS cloud, simplifies cloud storage management by auto-scaling data storage usage to ensure optimum management at a minimal cost.

Datafy was founded by Zivan Ori (CEO), Yoav Ilovich (CPO), and Ziv Serlin (COO). Ori and Serlin previously founded E8 Storage, which was sold to Amazon in 2019. After the acquisition, the two led a development group in the field of cloud storage at Amazon's R&D center in Israel. Ilovich, a graduate of the IDF's Talpiot unit where he also met Ori, has led product teams for more than 15 years, including VP Product positions at Taboola and at Pagaya.

"Our mission is clear: to give FinOps and DevOps teams the control they deserve with no effort or big changes to the system," said Ori. "With Datafy, we're not just saving money; we're transforming how businesses manage their data in the cloud. Today's funding news is the next step in our journey as we continue to grow."


DigitalGlue to highlight new creative.space //CLOUD and //EDGE-X storage solutions at NAB 2024 – NewscastStudio

DigitalGlue will debut its latest storage solutions, the creative.space //CLOUD and the //EDGE-X storage server, at the NAB Show 2024. Tailored for the content creation industry, these offerings are designed to provide scalable, secure, and cost-effective data management for businesses and creative professionals. Attendees can experience these solutions firsthand at booth SL9081 and apply for the chance to win 10 TB of //CLOUD storage.

creative.space //CLOUD: Scalable and Affordable Storage for Creative Teams DigitalGlue is introducing a cloud-hosted option for the award-winning creative.space platform as a compelling choice for creative teams needing an off-site collaboration solution. By leveraging patented UltraIO technology, creative.space's //CLOUD storage servers provide unprecedented performance, data protection, and efficiency through the ability to offload CPU tasks to GPUs. //CLOUD customers get a dedicated node that provides the same features and experience as DigitalGlue's on-premises systems, including desktop mounting, link sharing, HTTPS transfers, and more. While users have the option to stream data over the internet for remote editing, DigitalGlue can also host Mac Studio workstations for screen-sharing access, networked to the cloud storage with 10 GbE or higher connectivity. This separates the user from the data for added security, while also leveraging the new high-performance mode in macOS Sonoma for remote editing at the highest quality over low-bandwidth internet connections. Offered at only $195/month for 10 TB, this solution stands out for its affordability and scalability, making it an ideal choice for creative teams.

//EDGE-X: Compact and Efficient Storage Server The //EDGE-X server is an all-flash SSD-based storage server featuring all of the functionality of the creative.space platform. Its compact form factor and lack of spinning disks make it the ideal solution for on-set storage, including being mounted directly to a tripod. Productions can ingest directly from cameras from vendors such as RED and Blackmagic Design over a network connection using the creative.space web app, instead of having to shuttle capture cards. The //EDGE-X is adaptable for many use cases, easily integrating with the creative.space //CLOUD. The //EDGE-X is available for $250/month for 15TB, under a 5-year contract paid annually, offering an efficient solution for creative professionals.

Combined Offering for Comprehensive Data Management DigitalGlue also provides a bundled solution that includes 15TB of creative.space //CLOUD storage and the //EDGE-X server for a total of $445/month, based on a 5-year contract paid annually. This package is crafted to offer creative teams a comprehensive set of tools for efficient digital asset management, enhancing their ability to collaborate and produce content effectively.

No Hidden Fees and a Unified User Experience The creative.space platform delivers a consistent user experience across desktop and web applications, with features such as desktop mounting, media browsing, and file transfers. This uniform approach ensures a transparent pricing model, with fixed monthly or annual rates and no additional user access or task-specific fees.

Launching at NAB 2024: A New Era of Content Creation Collaboration DigitalGlue is proud to introduce the creative.space //CLOUD and //EDGE-X server at NAB 2024 in booth SL9081 and offer attendees the chance to win 10TB of //CLOUD storage. These products aim to transform the way creative teams manage and collaborate on digital assets. By offering a mix of on-premises and //CLOUD storage solutions, they are set to streamline content creation workflows, addressing the industry's need for secure, accessible, and cost-effective data storage.


A 30,000TB tower powered by a 70-year-old technology: Spectra Logic proves that data tape still has a place in an AI … – TechRadar

Spectra Logic has introduced the Spectra Cube tape library, a cloud-optimized system for on-premise, hybrid cloud, and IaaS environments that is designed to be quickly deployed, dynamically scaled, and easily serviced without tools or downtime.

The Spectra Cube library is managed by the company's recently announced LumOS library management software, which provides secure local and remote management and monitoring.

The tower is compatible with LTO-6, LTO-7, LTO-8, and LTO-9 technology generations and will reportedly support LTO-10 when it becomes available. LTO-6 support allows users to read old tapes all the way back to LTO-4 with an LTO-6 tape drive. The solution features high tape cartridge exchange performance, a TeraPack Access Port for easy tape handling, and drive interfaces including Fibre Channel and SAS.

With a capacity-on-demand expansion model, the Spectra Cube allows for additional tape slots and drives to be enabled via software without downtime. The library offers up to 30PB of native capacity and supports up to 16 partitions for shared or multi-tenant environments.
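To put 30PB of native capacity in perspective, here is a back-of-the-envelope sketch. The 18TB native capacity per LTO-9 cartridge is the published LTO spec, not a figure from this article, and decimal units are assumed throughout:

```python
# Back-of-the-envelope: how many cartridges hold the Spectra Cube's 30PB?
# ASSUMPTION: LTO-9 media at 18TB native (uncompressed) per cartridge,
# per the published LTO spec; decimal units (1PB = 1,000TB).
import math

LIBRARY_CAPACITY_TB = 30_000  # 30PB native
LTO9_NATIVE_TB = 18           # per cartridge

cartridges = math.ceil(LIBRARY_CAPACITY_TB / LTO9_NATIVE_TB)
print(cartridges)  # -> 1667
```

Roughly 1,700 cartridges for the full library, which helps explain the wide price range quoted below: media count alone is a large fraction of the configuration.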

"As cloud data continues to grow rapidly, the escalating costs of public cloud storage have forced a reckoning, leading to significant interest in moving data to more economical locations, including on-prem clouds and hybrid clouds," said Matt Ninesling, senior director of tape portfolio management at Spectra Logic.

"Compared to typical public cloud options, Spectra Cube solutions can cut the costs of cold storage by half or more, while providing better data control and protection from existential threats like ransomware."

The price of a fully-fledged Spectra Cube library ranges from under $60,000 to over $500,000 depending on configuration, number of tape drives, amount of media, and other additions to the base library.



Google Cloud’s AI Hypercomputer cloud infrastructure gets new GPUs, TPUs, optimized storage and more – SiliconANGLE News

Google Cloud is revamping its AI Hypercomputer architecture with significant enhancements across the board to support rising demand for generative artificial intelligence applications that are becoming increasingly pervasive in enterprise workloads.

At Google Cloud Next 24 today, the company announced updates to almost every layer of the AI Hypercomputer cloud architecture, with new virtual machines powered by Nvidia Corp.'s most advanced graphics processing units being one of the most significant revelations. In addition, it unveiled enhancements to its storage infrastructure for AI workloads, plus the underlying software for running AI models, and more flexible consumption options with its Dynamic Workload Scheduler service.

The updates were announced by Mark Lohmeyer, vice president and general manager of Compute and ML Infrastructure at Google Cloud. He explained that generative AI has gone from almost nowhere just a couple of years ago to becoming widespread across a wide range of enterprise applications encompassing text, code, videos, images, voice, music and more, placing incredible strains on the underlying compute, networking and storage infrastructure that supports it.

To support the increasingly powerful generative AI models being adopted across the enterprise today, Google Cloud has announced the general availability of what it says is its most powerful and scalable tensor processing unit to date. It's called the TPU v5p, and it has been designed with a single purpose in mind: to train and run the most demanding generative AI models.

TPU v5p is built to deliver enormous computing power, with a single pod containing 8,960 chips running in unison, which is more than twice as many as in a TPU v4 pod. According to Lohmeyer, the TPU v5p delivers some impressive performance gains, with twice as many floating point operations per second and three times more high-bandwidth memory on a per-chip basis, resulting in vastly improved overall throughput.
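The "more than twice as many chips" claim checks out against the TPU v4 pod size; note that the 4,096-chip figure comes from Google's published TPU v4 specs, not from this article:

```python
# Pod-size comparison: TPU v5p vs TPU v4.
# ASSUMPTION: 4,096 chips per TPU v4 pod (Google's published spec,
# not a figure from this article).
TPU_V5P_POD_CHIPS = 8_960
TPU_V4_POD_CHIPS = 4_096

ratio = TPU_V5P_POD_CHIPS / TPU_V4_POD_CHIPS
print(ratio)  # -> 2.1875, i.e. more than twice as many chips per pod
```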

To enable customers to train and serve AI models running on large-scale TPU clusters, Google is adding support for the TPU v5p virtual machines on Google Kubernetes Engine, its cloud-hosted service for running software containers.

As an alternative, customers can also use the latest hardware from Nvidia to train their generative AI models on Google Cloud. Besides its TPU family, it's also providing access to Nvidia's H100 GPUs through its new A3 family of VMs. The A3 Mega VM will become generally available next month, and one of its main advantages will be support for confidential computing, which refers to techniques that can protect the most sensitive data from unauthorized access even while it's being processed. This is a key development, Lohmeyer said, as it will provide a way for generative AI models to access data that was previously deemed too risky for them to process.

"Character.AI is using Google Cloud's Tensor Processor Units and A3 VMs running on Nvidia's H100 Tensor Core GPUs to train and infer LLMs faster and more efficiently," said Character Technologies Inc. Chief Executive Noam Shazeer. "The optionality of GPUs and TPUs running on the powerful AI-first infrastructure makes Google Cloud our obvious choice as we scale to deliver new features and capabilities to millions of users."

More exciting, perhaps, is what Google Cloud has in store for later in the year. Though it hasn't said when, the company confirmed that it's planning to bring Nvidia's recently announced but not yet released Blackwell GPUs to its AI Hypercomputer architecture. Lohmeyer said the Blackwell GPUs will be made available in two configurations, with VMs powered by both the HGX B200 and GB200 NVL72 GPUs. The former is designed for the most demanding AI workloads, while the latter is expected to support a new era of real-time large language model inference and massive-scale training for trillion-parameter models.

More powerful compute is just one part of the infrastructure equation when it comes to supporting advanced generative AI workloads. In addition, enterprises also need access to more capable storage systems that keep their data as close as possible to the compute instances that power them. The idea is that this reduces latency to train models faster, and with today's updates, Google Cloud claims its storage systems are now among the best in the business, with improvements that maximize GPU and TPU utilization, resulting in superior energy efficiency and cost optimization.

Today's updates include the general availability of Cloud Storage FUSE, a file-based interface that gives AI and machine learning applications file-based access to Google Cloud Storage resources. According to Google Cloud, GCS FUSE delivers a 2.9-times increase in training throughput compared with its existing storage systems, with model serving performance showing a 2.2-times improvement.

Other enhancements include support for caching in preview within Parallelstore, a high-performance parallel file system thats optimized for AI and high-performance computing workloads. With its caching capabilities, Parallelstore enables up to 3.9 times faster training times and 3.7 times superior training throughput, compared to traditional data loaders.

The company also announced AI-focused optimizations to the Filestore service, which is a network file system that enables entire clusters of GPUs and TPUs to simultaneously access the same data.

Lastly, there's the new Hyperdisk ML service, which delivers block storage, available now in preview. With this, Google Cloud claims it can accelerate model load times by up to 12 times compared to alternative services.

A third part of the generative AI equation is the open-source software that's used to support many of these models, and Google Cloud hasn't ignored this either. It's offering a range of updates across its software stack that it says will help simplify developer experiences and improve performance and cost efficiencies.

The software updates include the debut of MaxDiffusion, a new high-performance and scalable reference implementation for diffusion models that generate images. In addition, the company announced a range of new open models available now in MaxText, such as Gemma, GPT3, Llama 2 and Mistral.

The MaxDiffusion and MaxText models are built on a high-performance numerical computing framework called JAX, which is integrated with the OpenXLA compiler to optimize numerical functions and improve model performance. The idea is that these components ensure the most effective implementation of these models, so developers can focus on the math.

In addition, Google announced support for the latest version of the popular PyTorch AI framework, PyTorch/XLA 2.3, which will debut later this month.

Lastly, the company unveiled a new LLM inference engine called JetStream. It's an open-source offering that's throughput- and memory-optimized for AI accelerators such as Google Cloud's TPUs. According to Lohmeyer, it will provide three times higher performance per dollar on Gemma 7B and other open AI models.

"As customers bring their AI workloads to production, there's an increasing demand for a cost-efficient inference stack that delivers high performance," he explained. "JetStream helps with this need and offers support for models trained with both JAX and PyTorch/XLA, and includes optimizations for popular open models such as Llama 2 and Gemma."

The final ingredient for running generative AI on Google's cloud stack is the Dynamic Workload Scheduler, which delivers resource management and job scheduling capabilities to developers. The main idea is that it improves access to AI computing capacity while providing tools to optimize spending on these resources.

With today's update, Dynamic Workload Scheduler now provides two starting modes: flex start mode, for enhanced obtainability with optimized economics, and calendar mode, for more predictable job start times and durations. Both modes are now available in preview.

According to Lohmeyer, flex start jobs will be queued to run as soon as possible, based on resource availability. This will make it easier for developers to access the TPU and GPU resources they need for workloads with more flexible start times. As for calendar mode, it provides short-term reserved access to AI compute resources including TPUs and GPUs. Users will be able to reserve co-located GPUs for a period of up to 14 days, up to eight weeks in advance. Reservations will be confirmed, and the capacity will become available on the requested start date.

"Dynamic Workload Scheduler improved on-demand GPU obtainability by 80%, accelerating experiment iteration for our researchers," said Alex Hays, a software engineer at Two Sigma Inc. "Leveraging the built-in Kueue and GKE integration, we were able to take advantage of new GPU capacity in Dynamic Workload Scheduler quickly and save months of development work."



Why won’t Google increase its free 15GB cloud storage? – Pocket-lint


It seems like everyone and their dog has a Google account nowadays. It's the most popular email service around, with over a billion daily users, but its usefulness doesn't end there. It's used as a hub for all of Google's services, allows easy syncing of Google Chrome between devices, and enables hundreds of other quality-of-life features.

One of the handiest perks of getting a Google account is 15GB of free cloud storage available on Google Drive. Sure, that storage is shared between your Gmail, Google Drive, and Google Photos, but it's still useful for keeping your backup, email attachments, and a few documents around and ready to share online.

The 15GB limit shared across all the Google services was introduced back in 2013, and the bar has not been raised since. On the contrary, over the years, the company opted to remove some of the advantages that its cloud storage offered, such as unlimited photo backup for Google Pixel users, essentially making it a worse deal than it was all those years ago.

That begs the question: Why doesn't the free storage tier change? Over the years, prices of storage have gone down significantly, so Google should -- at least theoretically -- be able to offer much more storage to Gmail users. Unfortunately, it's not as simple as that, and there are a few quite good reasons why the company is sticking to its 15GB limit.

Let's talk about the expenses first. It's true that storage prices have come down significantly in the last few years, with both hard drives and SSDs dropping in price per gigabyte. However, this doesn't take into account the growth of Google itself or the rising prices of electricity and server space, all of which contribute to the significantly increasing cost of maintaining the cloud storage the company offers.

In the blog post from 2021 in which Google announced the end of unlimited photo storage, the company mentioned that users add more than 4.3 million GB to Google servers every day. This number increases significantly every year even without making the free storage tier bigger, so the operating costs for Google are tremendous. So, the biggest and most obvious reason the company doesn't make its free storage tier bigger is cost.

Plus, 15GB is still one of the bigger allowances around, so Google doesn't see the need to compete in this space anymore, and doing the bare minimum is usually preferable for giant companies looking to minimize their costs.

Speaking of doing the bare minimum: Most users really do not need more than 15GB of free storage.

For tech enthusiasts, 15GB of storage might feel like a pittance, but for a casual user who's only backing up some photos from their Android phone and getting a few emails a day, 15GB is really much more than enough. That's especially true if you only use your Google account for Gmail. Seeing as the maximum attachment size is 25MB, you could easily store 600 emails with the biggest attachment possible before running out of space.
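That 600-email figure follows directly from the quota arithmetic; a quick sketch, assuming decimal units (15GB treated as 15,000MB) and ignoring per-message overhead:

```python
# How many maximum-size (25MB) attachments fit in Gmail's free 15GB quota?
# ASSUMPTION: decimal units (1GB = 1,000MB) and no per-message overhead,
# a simplification for illustration.
QUOTA_MB = 15 * 1000        # 15GB free tier
MAX_ATTACHMENT_MB = 25      # Gmail's per-message attachment cap

emails_that_fit = QUOTA_MB // MAX_ATTACHMENT_MB
print(emails_that_fit)  # -> 600
```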

That's quite an unrealistic scenario, though, so let's look at something more day-to-day.

I got my personal Gmail account around 2010, and since then I have probably never deleted more than 50 emails. I use this account for almost everything, with tens of emails every day that end up simply rotting in the inbox -- a terrible habit, I know, but who has the time to take care of their inbox? What's the result? Over all these years, with more than 10,000 unread emails and probably more read than that, my Gmail has grown to 1.74GB. I could be as disorganized as I want for the rest of my life, and my Gmail account wouldn't touch the free 15GB limit anyway.

Of course, that's different if you want to use Google Photos as your backup or Google Drive to share and store some files, but for the most basic uses, 15GB of free cloud storage really is enough for most people.

Ultimately, though, the reason Google doesn't want to give you more free cloud storage is really simple: It wants to make money selling you this service. Especially now that cloud storage is getting even more popular and widespread, it's difficult to imagine Google taking a step back and offering more free storage, considering the push toward using Google One.

Of course, it's not all bad in the paid cloud storage world. I know because I've been using Google One for a while now. The cheapest tier is quite affordable at $1.99 per month and gets you not only 100GB of cloud storage across Google services, but some additional goodies as well. We're talking about the ability to share your storage space with up to five people, as well as more editing tools in Google Photos.

However, the real fun starts when you choose the highest-priced Google One plan, called AI Premium. Not only does it include 2TB of cloud storage, but more importantly, it also lets you use Google Gemini Advanced. It's an improved Gemini AI model that works as a standalone chatbot but is also available in Google Docs, Gmail, and other Google services if you buy the highest tier of the Google One subscription.

So, ultimately, you shouldn't expect Google to offer more free cloud storage any time soon, as it would significantly harm the company's business and discourage users from buying the services that Google wants to push.

You really shouldn't worry that much about the lack of free cloud storage. Ultimately, using Google's (or anyone else's, for that matter) cloud solution is not only not very safe, but also not the best practice if you value the security of your data. Instead, if you feel like 15GB is not enough for you, you should look into getting your own Network-Attached Storage, or maybe even setting up your own cloud storage solution. It would let you create a cloud storage service that's not only much more spacious than the ones offered by Google or other companies, but also, ultimately, much more affordable in the long run.


Google Cloud NEXT 2024: The hottest news, in brief – The Stack

Google Clouds first Arm-based CPU for the data centre, a host of new compute and storage services that dramatically improve generative AI performance, a security-centric Chrome offering, and a flurry of enterprise-focused Workspace updates that take the fight to Microsoft 365.

Also, AI in everything, including Gemini and Vertex AI in data warehouse BigQuery (with fine tuning) in public preview, for "seamless preparation and analysis of multimodal data such as documents, audio and video files." (nb: Vector search came to Big Query in preview in February.)

Those were among the updates set to get serious airtime at Google Cloud NEXT in Las Vegas this week. The Stack will share more considered analysis of some of the news in the coming days, along with interviews with executives and customers, but here's an early sample from a blockbuster set of press releases, GitHub repositories, and blogs...

"Unlike traditional email and productivity solutions, Gmail and Workspace were built from the very beginning on a cloud-native architecture, rooted in zero-trust principles, and augmented with AI-powered threat defenses."

So said Google, pointedly, in the wake of the CSRB's blistering indictment of Microsoft's security, which noted that Redmond had designed its consumer MSA identity infrastructure more than 20 years ago.

Workspace, Google's suite of collaboration and productivity applications, has approximately 10 million paying users. That makes it a minnow compared to the 300 million+ paid seats Office 365 boasted back in 2022.

It could be more of a threat to Microsoft.

A series of new features unveiled today may make it one. They include a new $10/user AI Security add-on that will let Workspace admins "automatically classify and protect sensitive files and data using privacy-preserving AI models and Data Loss Prevention [DLP] controls trained for their organization." A Google spokesperson told The Stack that "we're extending DLP controls and classification labels to Gmail in beta."

Pressed for detail, they told us that these will include:

Also coming soon: "Experimental support for post-quantum cryptography (PQC) in client-side encryption" with partners Thales and Fortanix.

A new generative AI service called Google Vids, baked into Google Workspace, may get more headlines. That's a video writing, production, and editing assistant that will work in-browser and sit alongside Docs, Sheets, and Slides from June. It is less a serious competitor for Premiere Pro and more a templating assistant that pieces together your first draft with suggested scenes from stock videos, images, and background music. (The Stack has clarified that users can also upload their own video, not just use stock...)

Other Workspace updates today:

Chat: Increased member capacity of up to 500,000 in Spaces for bigger enterprise customers. Also new: GA messaging interoperability with Slack and Teams through Google-funded Mio, plus various AI integrations and enhancements across Docs, Sheets, etc.

NVIDIA CEO Jensen Huang anticipates over $1 trillion in data center spending over the next four years as infrastructure is heavily upgraded for more generative AI-centric workloads. This isn't just a case of plumbing in more GPUs; Google Cloud is showcasing some real innovations here.

It boasted "significant enhancements at every layer of our AI Hypercomputer architecture [including] performance-optimized hardware, open software and frameworks."

Top of the list and hot off the press:

Various other promises of faster, cheaper compute also abound. But it's storage and caching where GCP's R&D work really shines. (Important for generative AI because storage is a HUGE bottleneck for most models.)

A standout is the preview release of Hyperdisk, a block storage service optimised for AI inference/serving workloads that Google Cloud says accelerates model load times up to 12X compared to common alternatives, with read-only, multi-attach, and thin provisioning.

Hyperdisk lets users spin up 2,500 instances to access the same volume and delivers up to 1.2 TiB/s of aggregate throughput per volume: over 100X greater performance than Microsoft Azure Ultra SSD and Amazon EBS io2 Block Express. In short, its volumes are heavily optimised, managed network storage devices located independently from VMs, so users can detach or move Hyperdisk volumes to keep data, even after deleting VMs.
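To get a feel for those numbers, here is a quick sketch of the per-instance share if that aggregate throughput were divided evenly; the even split is a hypothetical simplification, since real workloads won't divide so neatly:

```python
# If 2,500 instances share a Hyperdisk volume's 1.2 TiB/s aggregate
# throughput evenly, what does each instance get?
# ASSUMPTION: a perfectly even split (hypothetical illustration).
AGGREGATE_GIB_S = 1.2 * 1024   # 1.2 TiB/s expressed in GiB/s
INSTANCES = 2_500

per_instance_gib_s = AGGREGATE_GIB_S / INSTANCES
print(round(per_instance_gib_s, 3))  # -> 0.492 GiB/s per instance
```

Roughly half a GiB/s per reader even at the full 2,500-instance fan-out, which is the point of the multi-attach design: one optimised volume can feed a large inference fleet.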

"Hyperdisk performance is decoupled from size, so you can dynamically update the performance, resize your existing Hyperdisk volumes, or add more Hyperdisk volumes to a VM to meet your performance and storage space requirements," Google boasts, although there are some limitations.

Other storage/caching updates:

Chrome Enterprise Premium is a turbocharged version of Chrome Enterprise with new....

Yes, we agree, this sounds rather good too.

More details and pricing in a standalone piece soon.


Google Photos on Android seems primed to pick up a ‘recover storage’ option – Android Central

A new option hidden within the code for the Google Photos app teases a familiar space-saving function.

According to PiunikaWeb, courtesy of AssembleDebug, the latest version of Photos (6.78) contains information regarding a coming "Recover Storage" option. The feature was discovered within the "Account Storage" section, under "Manage Storage." Upon tapping, the Android app showed an addition to the page that would let users "convert photos to Storage saver."

Google's description says the option will "recover some storage" by reducing the quality of previously cloud-saved items. The conversion applies to all photos and videos a user has saved via the cloud.

A subsequent page states Photos will not touch the original quality of items stored in Gmail, Drive, or YouTube. Additionally, other items on a user's Pixel device may not be roped into this either.

The publication states that Google's continued development of Recover Storage has surfaced more details about photo/video compression. The company will seemingly warn users in-app that compressing older items to a reduced quality "can't be reversed."

Users should also be prepared to wait a while as the app does its thing, which could take a few days.


If this feature sounds familiar, it's because the web-based version of Photos already offers this space-saving option. The good thing is that compressing your older media won't affect your future uploads, as stated on its support page. So, if you're running out of space (again), you can always try to compress your files again.


There's speculation that Google could roll out the Recover Storage option to Android users soon, as its functionality appears nearly complete. It also seems set to arrive on iOS in conjunction with Android.

Yesterday (Apr. 10), the company announced that a few powerful AI editing tools will soon arrive in Photos for free. Beginning May 15, all users can utilize Magic Eraser, Photo Unblur, Portrait Light, and a few more without a subscription. Eligible devices include those running Android 8 and above, Chromebook Plus devices, and iOS 15 and above.


Go here to see the original:
Google Photos on Android seems primed to pick up a 'recover storage' option - Android Central

HYCU Wins Google Cloud Technology Partner of the Year Award for Backup and Disaster Recovery – GlobeNewswire

Boston, Massachusetts, April 09, 2024 (GLOBE NEWSWIRE) -- HYCU, Inc., a leader in data protection as a service and one of the fastest growing companies in the industry, today announced that it has received the 2024 Google Cloud Technology Partner of the Year award for Backup and DR. HYCU is being recognized for its achievements in the Google Cloud ecosystem, helping joint customers do more with less by leveraging HYCU's R-Cloud platform, which runs natively with Google Cloud to provide core data protection services, including enterprise-class automated backup and granular recovery, across Google Cloud and other IaaS, DBaaS, PaaS, and SaaS services.

"Google Cloud's Partner Awards celebrate the transformative impact and value that partners have delivered for customers," said Kevin Ichhpurani, Corporate Vice President, Global Ecosystem and Channels at Google Cloud. "We're proud to announce HYCU as a 2024 Google Partner Award winner and recognize their achievements enabling customer success from the past year."

HYCU currently provides backup and recovery for the broadest range of IaaS, DBaaS, PaaS, and SaaS services on Google Cloud. This support includes Google Workspace, BigQuery, Cloud SQL, AlloyDB, Cloud Functions, Cloud Run, and App Engine, with enhanced capabilities for GKE, in addition to Google Cloud services including Google Compute Engine, Google Cloud Storage, Google Cloud VMware Engine, and SAP on Google Cloud. With the HYCU R-Cloud platform, HYCU can now help customers protect more Google Cloud services than any other provider in the industry. HYCU recently announced it has passed the 70-SaaS-integration milestone.

"In a year when the threat landscape evolved to put companies at an even higher risk of data loss due to cyber threats, HYCU built an industry-leading solution on Google Cloud to help customers extend purpose-built data protection to more of the Google Cloud services and SaaS applications that their businesses rely on," said Simon Taylor, Founder and CEO, HYCU, Inc. "HYCU's innovation has also helped drive more growth for Google through double-digit year-over-year Google Marketplace GTV. And more HYCU customers recognized the value of R-Cloud for data protection across Google Cloud, on-prem, and SaaS, with all data backups stored securely using Google Cloud Storage. All of us at HYCU are both excited and proud to be named a Partner of the Year. It is yet another milestone as we look to solve the world's modern data protection challenges."

Since the HYCU R-Cloud platform was released on Google Cloud, customers have been able to benefit from R-Graph, the first visualization tool designed to map a company's entire data estate, including on-premises, Google Cloud, and SaaS data. As the industry's first cloud-native platform for data protection, HYCU R-Cloud enables enterprise-grade data protection for new data sources to be built and released quickly and efficiently. This has enabled HYCU to extend data protection to dozens of new Google Cloud services and SaaS applications in the past twelve months, and to leverage Google Cloud Storage to store backups securely.

For more information on HYCU R-Cloud, visit: https://www.hycu.com/r-cloud, follow us on X (formerly Twitter), connect with us on LinkedIn, Facebook, Instagram, and YouTube.

HYCU is showcasing its solution during Google Cloud Next from April 9th through the 11th in Las Vegas at booth #552. Attendees can learn more about HYCU's modern data protection approach firsthand.

# # #

About HYCU

HYCU is the fastest-growing leader in the multi-cloud and SaaS data protection as a service industry. By bringing true SaaS-based data backup and recovery to on-premises, cloud-native and SaaS environments, the company provides unparalleled data protection, migration, disaster recovery, and ransomware protection to thousands of companies worldwide. As an award-winning and recognized visionary in the industry, HYCU solutions eliminate complexity, risk, and the high cost of legacy-based solutions, providing data protection simplicity to make the world safer. With an industry-leading NPS score of 91, customers experience frictionless, cost-effective data protection, anywhere, everywhere. HYCU has raised $140M in VC funding to date and is based in Boston, Mass. Learn more at http://www.hycu.com.

The rest is here:
HYCU Wins Google Cloud Technology Partner of the Year Award for Backup and Disaster Recovery - GlobeNewswire

Podcast: What is distributed cloud storage and what are its benefits? – ComputerWeekly.com

In this podcast, we look at distributed cloud storage with Enrico Signoretti, vice-president of product and partnerships at Cubbit.

We talk about how storage has shifted to hybrid and multicloud modes and how distributed cloud storage separates the control plane from data to provide data retention in multiple locations, on-site and in multiple clouds.

Signoretti also talks about how organisations that need to retain control over data (over costs and location, for example) can achieve that with distributed cloud, as well as the workloads to which it is best suited.

Enrico Signoretti: So, I can start with why it is important right now and then delve into what it is and what it does.

It is important because we live in a moment where companies are shifting from traditional models, at the beginning, [to] just cloud, and then we discovered hybrid cloud, so keeping some of your IT stuff on-premise and some in the public cloud.

Then we were talking more and more about multicloud; most large enterprises have multiple clouds and multiple applications running in different environments.

So, from this point of view, a distributed cloud is a model that's totally different to what we're used to seeing in the market. The big hyperscalers do everything in single datacentres. So yes, you see the cloud, but everything is running in one datacentre or a small set of closely located datacentres.

With the model of distributed cloud, you separate the control plane from the data plane; something that happened in the past when we were talking about software-defined [storage].

So, the service provider keeps control of this control plane . . . but resources can be used and deployed everywhere. They could be in the same public cloud environment that I mentioned before, or in your datacentre. So, you are building this distributed cloud.

Moreover, when it comes to storage, when we talk about geo-distributed cloud, it means these resources are really distributed geographically, meaning that you can have some of your data in France, maybe, and other segments of the data in Italy or Germany, or even more widely distributed than that.

This is the main concept, and it's really important for everybody because it removes a lot of obstacles when it is time to work with multicloud.
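The separation Signoretti describes can be sketched in a few lines of Python. The class names, regions, and chunking scheme below are purely illustrative, not Cubbit's actual design: the point is that the control plane holds only metadata, while the bytes live in independently located stores.

```python
# Toy illustration of the control-plane/data-plane split described above.
# All names and regions are illustrative, not any vendor's API.

class ControlPlane:
    """Holds only metadata: which region stores which chunk of an object."""
    def __init__(self):
        self.index = {}  # object name -> list of (region, chunk_id)

class DataPlane:
    """The actual bytes, spread across independently located regional stores."""
    def __init__(self, regions):
        self.stores = {region: {} for region in regions}

def store_object(ctrl, data, name, payload):
    """Split payload into one chunk per region and record placements."""
    regions = list(data.stores)
    size = -(-len(payload) // len(regions))  # ceiling division
    placements = []
    for i, region in enumerate(regions):
        chunk_id = f"{name}-{i}"
        data.stores[region][chunk_id] = payload[i * size:(i + 1) * size]
        placements.append((region, chunk_id))
    ctrl.index[name] = placements

def fetch_object(ctrl, data, name):
    """Reassemble an object by consulting the control plane's metadata."""
    return b"".join(data.stores[r][cid] for r, cid in ctrl.index[name])

ctrl = ControlPlane()
data = DataPlane(["france", "italy", "germany"])
store_object(ctrl, data, "report", b"segments spread across three countries")
assert fetch_object(ctrl, data, "report") == b"segments spread across three countries"
```

Note that no single region ever holds the whole object, and the metadata index could move to a different provider without touching the data: that is the lock-in argument made below.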

Signoretti: The main benefit of distributed cloud is control. You can have control at several levels. When you start thinking about distributed cloud there is no lock-in because you have the possibility to choose where you put your data.

There is data sovereignty as well; we can call it data independence. It's not only data sovereignty that you achieve: you achieve control over all the layers and all aspects of data management.

And this is very important because even though most of the hyperscalers are very quick to respond to the new regulations popping up here in Europe, and also in the US, it's still a complex world, and for many organisations in Europe, giving your data to this kind of organisation is not feasible.

The idea here is that with distributed cloud you have the level of sovereignty that you need, but also control over cost and over the policies applied to data management.

Maybe if we think about a comparison between the three models (on-premises, public cloud and distributed cloud), you can see that distributed cloud sits in the middle between the others. On the one hand, you keep control of the entire stack; on the other hand, you have the flexibility of the public cloud.

So, matching these two, you can have a very efficient infrastructure that is deployed and managed by your organisation while still keeping all the advantages of public cloud.

Signoretti: You have to think of distributed cloud still as cloud. So, if you have a low latency, high-performance workload for which you usually need the CPU [central processing unit] very close to the storage, thats not for distributed cloud.

In that case, its way better to choose something that is on-premise or in the same cloud.

From my point of view, all other workloads are fine from backup, disaster recovery, collaboration and even big data lakes to store huge amounts of data for AI [artificial intelligence] and ML [machine learning].

In most cases you can have good throughput; it's just the low latency that's not there, but the same goes for the public cloud. This is probably the set of use cases best suited to distributed cloud.

Read more from the original source:
Podcast: What is distributed cloud storage and what are its benefits? - ComputerWeekly.com