Category Archives: Cloud Storage
New Report: Child Sexual Abuse Content and Online Risks to … – The Hacker News
Oct 10, 2023 | The Hacker News | Cybersecurity / Online Security
Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults.
"In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content."
These trends and others shared in the Emerging Online Trends report align with what other child safety organizations are reporting. The National Center for Missing and Exploited Children's (NCMEC) CyberTipline has seen a 329% increase in child sexual abuse material (CSAM) files reported in the last five years. In 2022 alone, NCMEC received more than 88.3 million CSAM files.
Several factors may be contributing to the increase in reports.
This content is a potential risk for every platform that hosts user-generated content, whether a profile picture or an expansive cloud storage space.
Hashing and matching is one of the most important technologies companies can use to protect users and platforms from the risks of hosting this content, while also helping to disrupt the viral spread of CSAM and the cycles of revictimization.
Millions of CSAM files are shared online every year. A large portion of these files are of previously reported and verified CSAM. Because the content is known and has been previously added to an NGO hash list, it can be detected using hashing and matching.
Put simply, hashing and matching is a programmatic way to detect CSAM and disrupt its spread online. Two types of hashing are commonly used: perceptual and cryptographic hashing. Both technologies convert a file into a unique string of numbers called a hash value. It's like a digital fingerprint for each piece of content.
To detect CSAM, content is hashed, and the resulting hash values are compared against hash lists of known CSAM. This methodology allows tech companies to identify, block, or remove this illicit content from their platforms.
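The cryptographic side of hash-and-match can be sketched in a few lines. This is an illustrative sketch, not Safer's implementation: the hash list contents and helper names are hypothetical, and production systems also use perceptual hashes to catch visually similar, non-identical files, which a cryptographic hash cannot do.

```python
import hashlib

# Hypothetical hash list of previously reported and verified content
# (the value below is simply sha256 of the bytes b"hello", for illustration).
KNOWN_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def file_sha256(path: str) -> str:
    """Cryptographic hash: an exact-match digital fingerprint of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # stream in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

def is_known(path: str) -> bool:
    """Match an uploaded file's fingerprint against the known-content hash list."""
    return file_sha256(path) in KNOWN_HASHES
```

A platform would run a check like this at upload time and block, remove, or report any file whose hash appears in the list.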
Hashing and matching is the foundation of CSAM detection. Because it relies upon matching against hash lists of previously reported and verified content, the number of known CSAM hash values in the database that a company matches against is critical.
Safer, a tool for proactive CSAM detection built by Thorn, offers access to a large database aggregating 29+ million known CSAM hash values. Safer also enables technology companies to share hash lists with each other (either named or anonymously), further expanding the corpus of known CSAM, which helps to disrupt its viral spread.
To eliminate CSAM from the internet, tech companies and NGOs each have a role to play. "Content-hosting platforms are key partners, and Thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale," Starr added. "This is about safeguarding our children. It's also about helping tech platforms protect their users and themselves from the risks of hosting this content. With the right tools, the internet can be safer."
In 2022, Safer hashed more than 42.1 billion images and videos for their customers. That resulted in 520,000 files of known CSAM being found on their platforms. To date, Safer has helped its customers identify more than two million pieces of CSAM on their platforms.
The more platforms that utilize CSAM detection tools, the better chance there is that the alarming rise of child sexual abuse material online can be reversed.
Ransomware: All the ways you can protect storage and backup – ComputerWeekly.com
Ransomware is a big threat to organisations of all sizes. According to one piece of research, around two-thirds of disaster recovery incidents are a result of ransomware. Meanwhile, firms take an average of 21 days to recover to normal operations.
The growth of ransomware has put data storage and backup on the frontline of cyber defences, and as firms have bolstered their anti-ransomware measures, attackers have become more sophisticated and dangerous.
Attackers have moved from encrypting production data to targeting backups and backup systems. Their goal is to make it harder for organisations to recover, and so more likely that they will pay a ransom. Also, double- and triple-extortion attacks, where criminal groups threaten to expose sensitive data or even use it to target individuals, have raised the stakes still further.
In response, chief information security officers (CISOs) and chief information officers (CIOs) have looked to harden systems against ransomware attack, with use of immutable snapshots, air-gapped backups and artificial intelligence (AI)-based threat detection. Suppliers have also bolstered anti-ransomware tools. Some are even offering ransomware recovery guarantees that offer financial compensation if an attack does happen.
Ransomware attacks work by spreading malware that disables access to data. The malware usually enters the organisation through phishing, infected documents, or compromised or malicious websites. It acts to encrypt data, then attackers demand a ransom for the decryption key.
The first line of defence is to detect and block phishing attacks, through antivirus and malware detection on client devices and on the network, and through user awareness and training.
Much of this is standard cyber hygiene. Most methods that work against malware and phishing will work equally against ransomware. Security researchers point out that the malware component of ransomware attacks is often not very sophisticated.
However, although cyber hygiene measures will reduce risks, they are not fool-proof. Therefore, firms also look at deeper levels of data protection against encryption, as well as detecting and blocking suspicious activity on the network.
Good backups remain an important defence against ransomware. If a firm can recover its data from a clean backup, it has a good chance of returning to normal operations without the need to pay a ransom. And, as security advisors such as the UK's NCSC point out, paying the ransom is no guarantee of being able to recover data.
Off-site backups, or data that is air-gapped and separated either physically or logically from production systems, provide a good level of protection, but recovery from off-site backups can be slow.
A clean recovery also requires users to spot an attack early enough to prevent backups being infected by malware. Also, attackers now actively target backup systems, with a view to disabling them or corrupting backup files.
This has led storage suppliers to build ransomware protection directly into storage and backup technologies, providing additional layers of defence.
One of the most common measures deployed by suppliers to counter ransomware is immutable backups. Often these are snapshots, which are usually immutable anyway. Snapshots have the added advantage of quick restore times, and they can be stored locally, off-site or in the public cloud. Their disadvantage is that the capacity they occupy can grow rapidly, so snapshot retention periods are often quite short.
A wide range of suppliers now offer immutable data copies, either in backup or directly on production storage.
Examples include Wasabi's Object Lock immutability feature for object storage, and Pure's SafeMode snapshots on its FlashBlade and FlashArray systems, as well as object locking in Portworx.
Vast Data is another supplier that provides immutable backups, using a feature it calls Indestructibility. Firms that use Amazon S3 can also apply Object Lock to buckets. A further approach is to harden the operating system; this is what Scality has done with Linux on its Artesca appliances. By hardening the OS, the supplier restricts admin tools an attacker could use to destroy or encrypt data.
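On S3, applying Object Lock looks roughly like the following. This is a hedged CLI sketch: the bucket name and retention period are placeholders, and note that Object Lock can only be enabled when the bucket is created.

```shell
# Create a bucket with Object Lock enabled (required at creation time;
# bucket name is illustrative).
aws s3api create-bucket --bucket my-backup-bucket \
    --object-lock-enabled-for-bucket

# Set a default retention rule: in COMPLIANCE mode, objects cannot be
# deleted or overwritten by anyone, including root, for 30 days.
aws s3api put-object-lock-configuration --bucket my-backup-bucket \
    --object-lock-configuration \
    '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}}'
```

COMPLIANCE mode is the stricter option; GOVERNANCE mode allows specially privileged users to shorten or remove retention, which weakens the guarantee against an attacker with stolen admin credentials.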
There are, however, different levels of immutability. As James Watts, managing director at Databarracks, points out, the effectiveness of immutability depends on how systems are configured. A tool set for immutability at the backup level will not, for example, prevent an attacker from deleting underlying storage volumes. For ultimate protection, he recommends that even backup copies and the storage target be kept off-domain.
The majority of backup suppliers now support air-gapped copies of data, and a growing number will work directly with public cloud storage to make it easier and less capital-intensive to store immutable backups offsite.
Chief information officers and data storage managers should check the capabilities of their backup and recovery tools, such as whether they can upload copies to the cloud or be used to create air-gapped datasets.
Immutable backups are not, however, foolproof. They will not protect an organisation if malware infects the snapshot.
This has prompted suppliers to add anomaly detection at the storage device and network level to help spot ransomware infections before they are triggered. Suppliers have increasingly made use of AI tools to spot anomalies across vast quantities of data, at speeds that are hopefully fast enough to prevent malware from spreading, and from encrypting or deleting data.
Such anomalies might include recognising abnormally large numbers of changes to files in a dataset, or increased levels of randomness in filenames or content, both of which could occur as ransomware begins to encrypt data.
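The randomness signal can be approximated with Shannon entropy: encrypted or compressed data looks close to uniformly random, approaching 8 bits of entropy per byte, while ordinary documents sit much lower. A minimal sketch (the threshold is illustrative; real products combine many signals rather than relying on entropy alone):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; values near 8.0 suggest encrypted or compressed content."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Illustrative cut-off; plain text typically measures well below 5 bits/byte."""
    return shannon_entropy(data) > threshold
```

A backup or storage system can compute this over files as they change; a sudden jump in the share of high-entropy files, combined with a spike in file modification rates, is a strong early indicator that encryption is underway.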
Suppliers that offer this type of detection include Cohesity and NetApp, while Pure has AIOps-based anomaly detection in its Pure1 management platform. Commvault also has early warning features in its technology. Firms have in addition built ransomware detection into production data storage, not just backups, as they try to stay ahead of attacks.
Some suppliers have taken a further step by offering financial guarantees to support their data protection measures.
Veeam and NetApp are among the suppliers that offer ransomware warranties; Pure has a ransomware recovery service-level agreement which includes supplying hardware, and a technician, to recover data.
Firms should take their own steps to ensure any ransomware protection measures are suitable for their operations. Warranties, even those that offer seven- or eight-figure payouts, will only apply in tightly defined circumstances, and cash will only go so far to help an organisation if data has been put beyond reach. "There's no blanket policy or simple answer for every organisation; these decisions all need to balance cost and risk for what works for you," says Databarracks' Watts.
How To Download From The Cloud – Robots.net
Welcome to the digital age, where storing and accessing files has become seamless with the help of cloud storage. Gone are the days of carrying around physical hard drives or relying on limited storage space on our devices. With cloud storage, we can now easily upload, store, and share our files from anywhere in the world.
But what exactly is the cloud, and how does it work? In this article, we will explore the concept of cloud storage, the benefits it offers, and how you can download files from the cloud with ease.
The cloud refers to a network of interconnected servers hosted in remote locations. These servers are maintained by cloud storage providers, who offer various plans and services for individuals and businesses to store and manage their files securely.
One of the primary benefits of using cloud storage is the flexibility it provides. Instead of being limited by the storage capacity of your physical device, you can store your files in the cloud and access them from any device with an internet connection. Whether you're on your smartphone, tablet, or desktop computer, all your files are just a few clicks away.
Another advantage of cloud storage is the convenience it offers for file sharing and collaboration. You can easily share a file or folder with others, whether it's for personal or professional purposes. This eliminates the need to send large attachments via email or physically hand over a USB drive.
Cloud storage also provides a layer of security for your files. Most cloud storage providers encrypt your data, ensuring that only authorized individuals can access it. This is particularly useful in cases where your device gets lost or stolen, as your files remain safe and can be easily recovered.
Now that we have a better understanding of the cloud and its benefits, let's explore the different cloud storage providers available and the options they offer for downloading files.
The cloud is a term used to describe a network of remote servers that are accessed over the internet to store, manage, and process data. Instead of relying on local storage devices, such as hard drives or USB flash drives, cloud storage allows users to upload and access their files from anywhere with an internet connection.
Cloud storage providers offer various plans and services to individuals and businesses, allowing them to store and manage their data securely in the cloud. These providers maintain multiple servers in data centers located in different geographical locations, ensuring redundancy and data availability even in the event of hardware failures or natural disasters.
When you upload a file to the cloud, it is stored on one or more of these remote servers. This eliminates the need for physical storage devices and reduces the risk of data loss due to device failure or damage. Additionally, cloud storage providers offer features like automatic backups and version control, giving users the peace of mind that their data is safe and easily recoverable.
One of the key advantages of cloud storage is its scalability. Traditional storage devices have a finite capacity, but with the cloud, you can easily increase or decrease your storage space based on your needs. Whether you're a small business that needs extra storage for your growing data or an individual who wants to store a large collection of photos and videos, the cloud can accommodate your requirements.
Furthermore, the cloud enables seamless collaboration and file sharing. Instead of sending files as email attachments or copying them to physical media, you can simply share a link with others, granting them access to the files stored in your cloud storage. This promotes efficiency and productivity, particularly in work environments where team members need to collaborate on projects.
It's important to note that while the cloud offers numerous benefits, it's essential to choose a reliable and trustworthy cloud storage provider. Factors to consider include security measures, pricing plans, ease of use, and customer support. By making an informed choice, you can ensure that your data is protected and that you have access to the features and benefits that best suit your needs.
Now that you have a clearer understanding of what the cloud is and its advantages, let's explore the different cloud storage providers available and the benefits they offer.
Cloud storage has revolutionized the way we store, access, and share our files. It offers a wide range of benefits that make it an attractive option for individuals and businesses alike. Let's explore some of the key advantages of using cloud storage:
- Easy accessibility: your files are available from any device with an internet connection.
- Scalability: storage space can grow or shrink with your needs, with no new hardware to buy.
- Cost savings: you pay for the capacity you use instead of purchasing and maintaining physical drives.
- Collaboration: files and folders can be shared with a link rather than emailed as attachments.
- Data security: most providers encrypt your data, and files remain recoverable if a device is lost or stolen.
Overall, cloud storage offers convenience, flexibility, and peace of mind. By moving your files to the cloud, you can enjoy easy accessibility, seamless collaboration, and enhanced data security. It's no wonder that cloud storage has become the go-to solution for individuals and businesses alike.
Now that we've explored the benefits of cloud storage, let's delve into the different cloud storage providers available and the features they offer for downloading files.
With the increasing popularity of cloud storage, numerous providers have emerged, offering a wide range of options for individuals and businesses. Let's take a look at some of the most popular cloud storage providers and the unique features they offer:
- Google Drive: a generous free tier and tight integration with Google Workspace apps.
- Dropbox: known for reliable sync and simple link-based sharing.
- Microsoft OneDrive: built into Windows and bundled with Microsoft 365.
- Apple iCloud: seamless backup and sync across Apple devices.
- Amazon Drive: storage tied to an Amazon account, with photo-focused features.
- Box: geared towards business collaboration and granular access controls.
These are just a few examples of the many cloud storage providers available today. When choosing a provider, it's essential to consider factors such as storage capacity, ease of use, collaboration features, security measures, and pricing options. Take your time to evaluate your needs and compare the features offered by different providers to find the one that aligns best with your requirements.
Now that we've explored the different cloud storage providers, let's move on to understanding the download options available when using cloud storage.
When it comes to downloading files from the cloud, various options are available depending on the cloud storage provider and the device you're using. Let's explore some of the common download options and their features:
- Direct download: retrieve a file on demand through a download button or link.
- Syncing and offline access: a desktop client keeps a local copy of your cloud folders up to date.
- Selective sync: choose which folders are kept locally to save disk space.
- Mobile applications: provider apps let you download or save files for offline use on phones and tablets.
- Web browser access: download files from the provider's website without installing anything.
- External applications and integrations: third-party tools and APIs that connect to your storage.
It's important to note that download options may vary depending on the cloud storage provider and the device you're using. Some providers may have additional features specific to their platform, such as offline access on mobile devices or collaboration options within their applications. Consider your specific needs and preferences when choosing a cloud storage provider to ensure that their download options align with your requirements.
Now that we've understood the different download options available, let's dive into a step-by-step guide on how to download files from the cloud.
Downloading files from the cloud is a straightforward process that can vary slightly depending on the cloud storage provider you're using. However, the general steps outlined below will give you a good understanding of how to download files from the cloud:
1. Sign in to your cloud storage account on the web or in the provider's app.
2. Navigate to the file or folder you want to download.
3. Select the item (or multiple items) and choose the download option from the menu or toolbar.
4. Pick a destination on your device, if prompted, and wait for the transfer to complete.
5. Verify the downloaded file opens correctly before removing any cloud copies.
Following these steps will allow you to download files from the cloud, whether it's a single document, a folder containing multiple files, or a selection of specific files.
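Under the hood, the download itself is just a streamed HTTP GET against the file's direct-download or share-link URL. A minimal, provider-agnostic Python sketch (the function name and chunk size are illustrative; real provider SDKs add authentication tokens and resumable transfers on top of this):

```python
import shutil
import urllib.request

def download(url: str, dest: str, chunk_size: int = 1 << 16) -> None:
    """Stream the response to disk so large files never sit fully in memory."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out, chunk_size)
```

Streaming in fixed-size chunks is why cloud clients can pull multi-gigabyte files on machines with modest RAM: only one chunk is held in memory at a time.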
It's worth noting that some cloud storage providers offer additional options during the download process, such as choosing the download format for certain file types or setting download preferences. Familiarize yourself with the features and settings of your specific cloud storage provider to make the most out of your download experience.
Now that you know how to download files from the cloud, let's explore some additional tips for efficient cloud downloading.
Downloading files from the cloud can be a seamless and efficient process with the right approach. Here are some additional tips to help you optimize your cloud downloading experience:
- Organize your files into clearly named folders so they are easy to find and download in batches.
- Use sync and selective sync features to keep frequently needed files available locally.
- Check the download options your provider offers, such as zipped folder downloads or format choices.
- Monitor your storage space, both in the cloud and on the device you are downloading to.
- Use a download manager for very large files so interrupted transfers can resume.
- Optimize your internet connection, for example by downloading large files over a wired or strong Wi-Fi link.
By following these tips, you can make your cloud downloading process more efficient and seamless. Take advantage of the features and options provided by your cloud storage provider, and leverage additional tools and software to enhance your experience.
Now that we've explored tips for efficient cloud downloading, let's wrap up this article by summarizing what we've learned and highlighting the benefits of using cloud storage.
In this article, we have delved into the world of cloud storage and explored the various aspects of downloading files from the cloud. We learned that the cloud is a network of remote servers that provides convenient and secure storage for our files.
We discussed the benefits of using cloud storage, including easy accessibility, scalability, cost savings, and enhanced collaboration. We also explored different cloud storage providers such as Google Drive, Dropbox, Microsoft OneDrive, Apple iCloud, Amazon Drive, and Box, each with its own unique features and advantages.
We then examined the different download options available when retrieving files from the cloud, such as direct download, syncing and offline access, selective sync, mobile applications, web browser access, and external applications and integrations.
To help you navigate the process, we provided a step-by-step guide for downloading files from the cloud. We highlighted the importance of organizing your files, utilizing sync and selective sync features, checking available download options, monitoring storage space, using download managers, and optimizing your internet connection.
In conclusion, cloud storage has revolutionized the way we store, access, and share our files. It offers numerous benefits, including easy accessibility, scalability, cost savings, and enhanced collaboration. Downloading files from the cloud is a straightforward process, and by following the tips provided, you can make it more efficient and seamless.
Remember to choose a reliable and trustworthy cloud storage provider that aligns with your needs and preferences. Always prioritize the security and privacy of your files by utilizing encryption features, backup options, and strong passwords.
Embrace the power of the cloud and enjoy the convenience and flexibility it offers for your file storage and management needs. With cloud storage, your files are always within reach, no matter where you are or which device you're using.
Cloud NAS, what is it good for? – ComputerWeekly.com
Network attached storage (NAS) is one of the fastest-growing sectors of the data storage market. For enterprises and other large organisations, NAS provides an efficient way to store ever-larger volumes of unstructured data, from media content and documents to AI training materials.
For smaller outfits, NAS offers a step up from internal storage in servers, providing a familiar files-and-folders arrangement and largely similar management tools. NAS systems also run over the same standard Ethernet networks as existing PCs and servers. At the micro-business end of the market, a NAS box might even take the place of a server altogether, thanks to its simplicity and low cost.
NAS systems, however, have their limitations: access times are slower than direct-attached or server-based storage, and they will often be slower than a storage area network (SAN), not least because they use Ethernet rather than faster transports, such as Fibre Channel.
But the main drawback of traditional NAS is that it's harder to scale. Once a NAS box is full, IT departments have to upgrade disks or buy an entirely new NAS. This leads to high costs or, if they keep the old NAS, silos of storage.
This prompted vendors to create scale-out NAS systems, where users can build a mesh of NAS nodes to form a single pool of storage, and it prompted vendors to create NAS functionality in the cloud.
NAS is file access storage that can be made available to any device on a network. At its most basic, a NAS box is a standalone, integrated device with a CPU, networking, memory and a number of bays for hard drives, SSDs or M.2 drives. It has its own operating system, and does not need a connection to a server or storage controller to function. Most admin tasks can be carried out via a web browser.
The heart of NAS is its file system. The file system allows the NAS to appear as a shared volume to any PC user or server on the network. Typically, NAS systems support Network File System (NFS) and Server Message Block (SMB) protocols, for maximum application and OS compatibility.
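In practice, a client attaches a NAS share with a one-line mount over either protocol. The hostnames, export paths, mount points and credentials below are placeholders:

```shell
# NFS export (Linux client with nfs-common/nfs-utils installed)
sudo mount -t nfs nas.example.com:/export/projects /mnt/projects

# SMB share (cifs-utils installed; forcing the SMB 3.0 dialect)
sudo mount -t cifs //nas.example.com/projects /mnt/projects \
    -o username=alice,vers=3.0
```

Once mounted, the share appears to applications as an ordinary local directory, which is exactly the transparency that makes NAS a drop-in step up from server-internal storage.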
A NAS is optimised for unstructured data, files and documents, whereas direct-attached storage and SANs are optimised for block access, such as for databases.
It's the integration of the file system that defines NAS, in fact. When using a SAN, the file system resides elsewhere, such as on servers or PCs, with data held on the SAN in blocks accessed from those externally held file systems.
NAS devices start small, but can be very large indeed. Small NAS boxes are popular for home offices and micro businesses. At the other end of the scale, enterprise vendors provide NAS units with eye-watering capacities. Chinese vendor Huawei, for example, recently launched its OceanStor Dorado V6, which runs to 32 controllers, 6,400 drives and almost 300PB of capacity.
The first enterprise NAS systems were filers, designed to deal with the growing volume of unstructured data coming into organisations as they moved to digital processes.
Filers from vendors such as NetApp and EMC (now owned by Dell) mopped up vast numbers of files in areas from medical imaging to media and document storage. Without filers, these documents would have overwhelmed server-based storage.
But the current landscape is far more complex. More applications use unstructured data and the performance of NAS hardware has increased with faster versions of Ethernet and the move to flash storage.
Scale-out NAS systems, with multiple nodes operating as a single, very large (virtual) volume, or single namespace, have also overcome some of the limitations of traditional standalone NAS. It is now possible to add storage quickly to a NAS system without applications or users being aware. IT teams can add nodes, and storage, as needed. A global namespace also allows enterprises to distribute storage across locations, including the public cloud.
Two other versions of NAS technology are also worth considering. First, for enterprises, hyper-converged infrastructure (HCI) brings together compute, storage and networking. In some use cases, HCI can replace NAS and SAN systems. At the other end of the spectrum, open-source utilities such as FreeNAS can turn redundant servers, or even desktop PCs, into cost-effective NAS boxes for smaller offices or home use.
The focus of filers and early NAS systems was capacity, not performance. Today, there is still a case for NAS for very large capacities. Few other systems can offer the same low cost per gigabyte of data. But conventional, capacity-driven use cases, such as backups, archiving (including for compliance reasons) and handling relatively low-value documents, have been joined by more demanding applications such as storage for AI/ML training data and advanced analytics datasets.
This trend has also pushed NAS towards convergence with object storage capability, and HCI.
A cloud NAS uses public cloud storage to provide file access, usually via SMB or NFS. NetApp's ONTAP, which includes NFS, is also widely supported.
Connecting a user device, or more likely a server, to a cloud NAS should be as simple as connecting it to local NAS hardware. The differences will largely be around capacity, cost, performance and the level of intelligence offered by cloud systems.
Cloud NAS has no real capacity limits although vendors may set an upper limit for practical administrative purposes. Google Cloud Platform, for example, limits a single namespace to 50PB.
The main performance limit of cloud NAS is the WAN connection to the service provider. If applications need high speeds and low latency, firms will need to invest in high-speed fibre links. Alternatively, they could relocate applications to the cloud, if the provider supports compute. But on-premises systems are, for now, faster and much less costly.
When it comes to cost comparisons, customers need to consider that on-premises NAS hardware requires up-front capital investment. Firms also often have to buy more than they need and over-provision to minimise the cost and disruption of physical upgrades.
This is not an issue with the cloud, as it is charged by usage, but CIOs will want to control this. Long-term cloud storage can be expensive if users "file and forget" documents.
For this reason, the large cloud vendors, and vendors such as NetApp and IBM, sell products aimed specifically at the backup and archive markets. These have more favourable cost structures than regular cloud NAS volumes.
Vendors also offer different tiers of cloud NAS, based on performance. Less important workloads or less frequently used data can move down to cheaper volumes. But, in all cases, buyers need to be aware of bandwidth and egress costs to access or download data.
However, there are use cases where cloud NAS is the best solution. These include where a business needs to scale up storage quickly without rewriting applications or porting them to the cloud; or where cost is less of an issue, such as for a short-term project; or to provide remote storage, collaboration and backup to small offices or remote workers.
CoreWeave GPU-as-a-Service cloud farm using VAST storage … – Blocks and Files
GPU cloud service provider CoreWeave is using VAST Data for its customers' storage needs.
CoreWeave rents out Nvidia A100 and H100 GPUs for use by its customers through its CoreWeave Cloud. This has been built for large-scale workloads needing GPUs, with more than 3,500 H100 Tensor Core GPUs available in one of the largest HGX clusters installed anywhere. Its GPU infrastructure is supported by x86 compute, storage and networking resources, and is actively used to run generative AI workloads.
CEO and co-founder Michael Intrator revealed in a statement that CoreWeave is going to "partner with VAST Data to deliver a multi-tenant and zero-trust environment purpose-built for accelerated compute use cases like machine learning, VFX and rendering, Pixel Streaming and batch processing, that's up to 35 times faster and 80 percent less expensive than legacy cloud providers. This partnership is rooted in a deep technical collaboration that will push the boundaries of data-driven GPU computing to deliver the world's most optimized AI cloud platform."
VAST co-founder Jeff Denworth said: "The deep learning gold rush is on, and CoreWeave is at the epicenter of the action."
Up until now CoreWeave Cloud Storage Volumes have been built on top of Ceph, with triple replication, and distributed across multiple servers and datacenter racks. It features disk drive and all-flash NVMe tiers, and both file and S3 object access protocol support.
This is a significant order for VAST, and CoreWeave will deploy the VAST arrays to store, manage and secure hundreds of petabytes of data for generative AI, high-performance computing (HPC) and visual effects (VFX) workloads.
CoreWeave was founded in 2017 as an Ethereum mining company, and then pivoted in 2019 to AI. It raised $421 million in a B-round last April, another $200 million in May, plus $2.3 billion in debt financing in August to expand its infrastructure. The debt was secured by CoreWeave using its GPUs as collateral. It reckons it will have 14 datacenters in place across the USA by the end of the year and earn $500 million in revenue, with $2 billion already contracted for 2024.
Nvidia is an investor in CoreWeave and also in VAST, which is Nvidia SuperPOD-certified.
A statement from Renen Hallak, founder and CEO of VAST Data, was complimentary to CoreWeave: "Since our earliest days, VAST Data has had a single vision of building an architecture that could power the needs of the most demanding cloud-scale AI applications. We could not imagine a better cloud platform to realize this vision than what we're creating with CoreWeave. We are humbled and honored to partner with the CoreWeave team to push the boundaries of modern AI computing and to build the infrastructure that will serve as the foundation of tomorrow's AI-powered discoveries."
CoreWeave said its supercomputing-class infrastructure trained the new MLPerf GPT-3 175B large language model (LLM) in under 11 minutes, more than 29x faster, and on a 4x larger cluster, than the next best competitor. A VAST Data blog, soon to be live on its site, provides more background.
See the article here:
CoreWeave GPU-as-a-Service cloud farm using VAST storage ... - Blocks and Files
Lightbits switches on block storage in Azure Marketplace – Blocks and Files
As promised in June, the Lightbits Cloud Data Platform has been made available in the Azure Marketplace as a managed application.
Lightbits software uses public cloud ephemeral storage instances to provide block storage that is faster and more affordable than Azure's own block storage instances, namely Azure Disk Storage with its Ultra Disk Storage, Premium SSD v2, Premium SSD, Standard SSD, and Standard HDD options. Lightbits creates a fast SAN in the cloud by clustering virtual machines connected by NVMe over TCP. It is capable of delivering up to 1 million IOPS per volume and consistent latency down to 190 microseconds, and performance scales linearly as the cluster size increases.
This means tight SLAs for important apps that otherwise could not be moved to the Azure cloud can now be met. Kam Eshghi, co-founder and chief strategy officer at Lightbits, said in a statement that customers get the highest performance possible on Azure at a lower cost than native cloud storage, plus hands-off operations as a managed service. Now they can migrate all their workloads to the cloud and be confident that they won't have to repatriate them back on-premises due to skyrocketing costs.
Lightbits Cloud Data Platform software is already available in the AWS Marketplace and is now in preview in the eu-west, us-east, and us-west Azure regions. A Lightbits on Azure launch event is scheduled for September 29, 2023.
There are four suppliers providing block storage in the public cloud based on ephemeral storage instances: Dell (PowerFlex), Lightbits, Silk, and Volumez. They compete against the cloud providers' native block storage instances and against block storage array providers, such as Pure Storage with its Cloud Block Store, which have moved their array controller software to the cloud. NetApp's Cloud Volumes ONTAP provides block, file, and object storage in AWS, Azure, and the Google Cloud.
The advantage for NetApp and Pure is that their customers get the same block storage environment on-premises and in the public clouds. Their cloud block stores are based on the cloud providers' native block storage and don't use ephemeral storage instances. That should give the four suppliers above both performance and cost advantages for their products, but they lack the on-premises/public cloud environment consistency that NetApp and Pure provide.
Read the rest here:
Lightbits switches on block storage in Azure Marketplace Blocks ... - Blocks and Files
Cloud Storage Market to Exhibit Impressive Growth of CAGR during … – Digital Journal
Publish Date: February 2023
Cloud Storage Market Scope:
Global Cloud Storage Market size was valued at USD 70.19 billion in 2021 and is poised to grow from USD 87.36 billion in 2022 to USD 486.5 billion by 2030, at a CAGR of 24.18% during the forecast period (2023-2030).
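The quoted growth rate can be sanity-checked with the standard CAGR formula, CAGR = (end/start)^(1/years) - 1. A minimal Python sketch using the report's own figures, assuming 2022 as the base year and an eight-year run to 2030:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly growth
    that takes start_value to end_value over the given period."""
    return (end_value / start_value) ** (1 / years) - 1

# Report figures: USD 87.36 billion (2022) growing to USD 486.5 billion (2030)
rate = cagr(87.36, 486.5, years=8)
print(f"{rate:.2%}")  # roughly 24 percent, broadly consistent with the quoted 24.18% CAGR
```

The small gap to the quoted 24.18 percent likely comes down to how the report counts the forecast window (2023-2030 versus 2022-2030).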
The study of the global Cloud Storage Market is presented in the report, which is a thoroughly researched presentation of the data. The analysis delves into some of the key facets of the global Cloud Storage market and shows how drivers like pricing, competition, market dynamics, regional growth, gross margin, and consumption will affect the market's performance. A thorough analysis of the competitive landscape and in-depth company profiles of the top players in the Cloud Storage Market are included in the study. It provides a summary of precise market data, including production, revenue, market value, volume, market share, and growth rate.
Request for Sample Copy of this Global Cloud Storage Market: https://www.skyquestt.com/sample-request/cloud-storage-market
Insights into the most prominent market trends are the best investment markers, helping potential participants make decisions more easily. The research aims to identify the numerous growth opportunities that readers may consider and take advantage of, using all the necessary information. Market growth over the coming years can be predicted with greater accuracy by carefully examining the important growth-influencing aspects, including pricing, production, profit margins, and value chain analyses.
Major Players Covered in Global Cloud Storage Market Report:
Report Inclusions:
The research study can answer the following Key questions:
(1) What is the estimated size of the global Cloud Storage market at the end of the forecast period?
(2) Is the segment leading the global Cloud Storage market anticipated to retain its leadership?
(3) Which regions demonstrate the maximum growth potential?
(4) Does any player dominate the global Cloud Storage market?
(5) What are the main drivers and restraints in the global Cloud Storage market?
Would you like to ask a question? Ask Our Expert: https://www.skyquestt.com/speak-with-analyst/cloud-storage-market
Table of Contents
Chapter 1Industry Overview
1.1 Definition
1.2 Assumptions
1.3 Research Scope
1.4 Market Analysis by Regions
1.5 Market Size Analysis from 2023 to 2030
1.6 COVID-19 Outbreak: Medical Computer Cart Industry Impact
Chapter 2Competition by Types, Applications, and Top Regions and Countries
2.1 Market (Volume and Value) by Type
2.3 Market (Volume and Value) by Regions
Chapter 3Production Market Analysis
3.1 Worldwide Production Market Analysis
3.2 Regional Production Market Analysis
Chapter 4 Medical Computer Cart Sales, Consumption, Export, Import by Regions (2023-2023)
Chapter 5 North America Market Analysis
Chapter 6 East Asia Market Analysis
Chapter 7 Europe Market Analysis
Chapter 8 South Asia Market Analysis
Chapter 9 Southeast Asia Market Analysis
Chapter 10 Middle East Market Analysis
Chapter 11 Africa Market Analysis
Chapter 12 Oceania Market Analysis
Chapter 13 Latin America Market Analysis
Chapter 14 Company Profiles and Key Figures in Medical Computer Cart Business
Chapter 15 Market Forecast (2023-2030)
Chapter 16 Conclusions
About Us:
SkyQuest Technology is a leading growth consulting firm providing market intelligence, commercialization, and technology services. It has 450+ happy clients globally.
Address:
1 Apache Way, Westford, Massachusetts 01886
Phone:
USA (+1) 617-230-0741
Email:[emailprotected]
See original here:
Cloud Storage Market to Exhibit Impressive Growth of CAGR during ... - Digital Journal
Storage news ticker 29 Sep 2023 Blocks and Files – Blocks and Files
Backblaze, which uses SSDs as boot drives for its disk-based storage servers, has produced its latest SSD annualized failure rate (AFR) report. Restricting its stats to drives with more than 10,000 days of use and to statistics with a confidence interval of 1.0 percent or less gets it a table with three entries:
Its overall HDD AFR is 1.64 percent, measured across 226,309 drives and 20,201,091 drive days, which makes for considerably more reliable statistics. Its SSDs are more reliable, with an overall 0.60 percent AFR. These relatively early SSD AFR numbers are tending to show a so-called bathtub curve, similar to disk drives, with new drives failing more often than young and middle-aged drives, and the failure rate rising again as drives become older:
Report author Andy Klein writes: "While the actual curve (blue line) produced by the SSD failures over each quarter is a bit lumpy, the trend line (second order polynomial) does have a definite bathtub curve look to it. The trend line is about a 70% match to the data, so we can't be too confident of the curve at this point."
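Backblaze's AFR metric is failures per drive year, where a drive year is 365 drive days. A minimal Python sketch of the calculation; note the failure count here is an assumption, back-calculated from the article's reported 1.64 percent AFR and 20,201,091 drive days, not a figure from the report itself:

```python
def annualized_failure_rate(failures: int, drive_days: int) -> float:
    """AFR as a percentage: failures divided by drive years
    (drive_days / 365), expressed per 100 drive years."""
    drive_years = drive_days / 365
    return failures / drive_years * 100

# ~907 failures over 20,201,091 drive days reproduces the reported
# overall HDD AFR of 1.64 percent (the failure count is illustrative).
print(round(annualized_failure_rate(907, 20_201_091), 2))
```

Normalizing by drive days rather than drive count is what lets Backblaze compare fleets where drives enter and leave service mid-year.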
Connectivity cloud company Cloudflare, building on its collaboration with Databricks, is bringing MLflow capabilities to developers building on Cloudflare's serverless developer platform. Cloudflare is joining the open source MLflow project as an active contributor to bridge the gap between training models and deploying them to Cloudflare's global network, where AI models can run close to end users for a low-latency experience. MLflow is an open source platform for managing the machine learning (ML) lifecycle, created by Databricks. Cloudflare's R2 is a zero-egress, distributed object storage offering, allowing data teams to share live datasets and AI models with Databricks. The MLflow deal means developers will be able to train models using Databricks' AI platform, then deploy those models to Cloudflare's developer platform and global network, where hyper-local inference runs at the edge, completing the AI lifecycle.
Data security management and protection company Cohesity says Tata Consultancy Services (TCS) has joined its Data Security Alliance ecosystem. Cohesity's Modern Data Security and Management platform, bundled with the TCS Cyber Defense Suite portfolio of security services and platforms, will bring customers a unified offering. The pair claim this is designed to improve visibility across the threat landscape, secure cloud activities, and enhance cyber resilience. They say that joint customers will benefit from TCS's domain knowledge as well as security solutions contextualized for specific industries such as finance, manufacturing, HLS, retail, utility, and more.
The Cohesity Data Security Alliance was founded in November 2022 and contains 15 members, including BigID, Cisco, CyberArk, Mandiant, Netskope, Okta, Palo Alto Networks, PwC UK, Qualys, Securonix, ServiceNow, Splunk, TCS, and Zscaler. It will go down to 14 when Cisco completes buying Splunk.
Marco Fanizzi, Commvault's SVP and GM International and former VP EMEA, is leaving for an unknown destination after four years at Commvault.
Micron's Crucial consumer products unit has an X9 portable SSD in 1TB, 2TB, and 4TB versions using Micron 176-layer QLC NAND, with a read speed of up to 1,050MBps across its USB-C interface. It has a three-year warranty and comes in a 65 x 50mm plastic case. It is shock, vibration, and drop proof up to 7.5 feet onto a carpeted floor. The X9 works with Windows File History, Apple Time Machine, and Acronis True Image.
Italy-based web3 decentralized storage startup Cubbit has won Leonardo, one of the world's largest cybersecurity and defense companies with more than $14 billion in revenues, as a customer. For Leonardo, Cubbit storage means each file is encrypted, fragmented, and replicated across multiple geographical locations and, in the event of an attack, it will always be fully reconstructable. The deal means Leonardo will reduce its data traffic, enabling a reduction in production and CO2 emissions. It says the distributed storage enables the construction of more efficient digital twins, and it is now ready for the storage of more archival data. That's expected to grow threefold between now and 2026.
Gartner has a new Magic Quadrant: the Distributed Hybrid Infrastructure MQ. It deals with a set of suppliers who provide a standardized infrastructure stack that can run both in the public cloud and on-premises, either in datacenters or at the edge. The suppliers include public clouds with versions of their software environment running on-premises, meaning Alibaba, AWS, IBM, Microsoft (Azure), Oracle, and Tencent Cloud, but not Google. A second supplier grouping consists of on-premises suppliers who have migrated their software to run in the public cloud: Huawei, Nutanix, and VMware. More info here for Gartner customers. Here's the MQ diagram:
A Beyond Big Data: Hyperscale Takes Flight report from Ocient says that two-thirds of IT leaders (67 percent) plan to replace their data warehouse provider this year. That's a 12 percent increase from last year (59 percent). 58 percent say that database and data warehouse modernization is the top data and analytics-related IT budget priority over the next 12 to 18 months. 67 percent are actively looking to switch their organization's data warehouse infrastructure. 90 percent are currently planning, or will in the next six to 12 months plan, to remove or replace existing big data and analytics technologies. Ocient says that, for the public sector to adapt to a hyperscale data landscape, it must adopt a strategic and holistic approach with technological upgrades. Translation: convert to Ocient.
Event analytics supplier Mixpanel has released a new native connector for Google's cloud data warehouse BigQuery, making it easier for users to explore and gain insights from data. Event analytics captures every action (or event) that each user performs within a digital product, like an e-commerce site or a ride-hailing app. This granular view helps companies understand how different groups of users behave at various points during their experience. Mixpanel says this approach is faster and easier than traditional Business Intelligence (BI) tools, which require data to be prepared and tabulated, with BI queries coded in SQL.
European CSP OVHcloud has announced the integration of Nvidia A100 and H100 Tensor Core GPUs into its AI offering. That will allow it to offer instances for complex GPU workloads, large language model ML, and HPC workloads.
According to Tom's Hardware, reporting on a German Computerbase forum post, Samsung has a coming T9 portable SSD offering 2GBps read/write speeds through its USB 3.2 Gen 2x2 interface. Samsung's existing T7 portable drive runs on a 10Gbps USB 3.2 Gen 2 interface. The T9 will come in 1TB, 2TB, and 4TB variants with a five-year warranty. Prices at French retailer PC21 are reportedly $133 for a 1TB T9 and $226 for the 2TB version, although we couldn't find the drives on the PC21 website.
Swissbit has announced an N5200 Enterprise SSD in three Enterprise and Data Center Standard Form Factor (EDSFF) E1.S variants (5.9 mm, 9.5 mm, and 15 mm) in addition to U.2. Capacities range from 1.92 to 7.68 TB. It has a 4-lane PCIe and an NVMe 1.4 interface, with sequential data rates of up to 7,000 MB/s read and 4,200 MB/s write. Random reads and writes reach up to 1.35 million IOPS and 450,000 IOPS, respectively. Its endurance is at least 1 drive write per day over 5 years. The drive features TCG OPAL 2.01 and AES-256 encryption, Secure Boot, and Crypto Erase, plus error correction and data protection mechanisms. It complies with the OCP Cloud Specification 1.0. Swissbit doesn't identify the NAND type and supplier. For project and sales inquiries, contact the Swissbit Datacenter team.
Cloud-based block storage startup Volumez has hired its first CRO, Jason McKinney, who most recently spent 3.5 years as worldwide VP of public cloud sales at NetApp. There he helped launch three first-party services: Azure NetApp Files, AWS FSXN, and Google Cloud Volumes Service. He was at Salesforce and VMware before NetApp. Volumez recently completed a $20 million A-round and hired John Blumenthal as its chief product officer; previously he was VP of Data Services in HPE's storage organization. It's clear that Volumez has a sellable product, even though it is only at the A-round funding stage, and it is setting up an exec team of a kind more usually seen at C-round startups.
Destini Nova is joining WANdisco (the soon-to-be Cirata) as senior director of Alliances and Business Development, focused on building strategic partnerships and driving business growth that delivers value to customers. WANdisco says she is thrilled to be working for an organization under the dynamic leadership of Stephen Kelly again, grateful for the opportunity, and looking forward to this new journey. She comes from being director of Global Alliances at Sage and is based in Seattle.
View post:
Storage news ticker 29 Sep 2023 Blocks and Files - Blocks and Files
HPE Restructuring: 5 Things You Need To Know – CRN
Cloud News Steven Burke September 28, 2023, 10:00 AM EDT
A Hewlett Packard Enterprise restructuring, which goes into effect at the start of the company's new fiscal year on November 1, includes a new Hybrid Cloud business unit, executive reassignments, and the departure of two top executives.
Providing A True Cloud Experience For Customers And Partners
Hewlett Packard Enterprise CEO Antonio Neri said the company is implementing a restructuring, effective at the start of the company's new fiscal year on November 1, in order to accelerate the execution of its HPE GreenLake edge-to-cloud strategy.
"Having laid the foundation for success, we are now evolving our operating model to accelerate our execution, deliver a superior customer and partner experience, and drive growth for the company and value for our shareholders," said Neri in a blog post. "These changes to HPE's organizational structure and executive leadership, which take effect at the start of our 2024 fiscal year on November 1, 2023, will further unify our portfolio and enhance the delivery of a true cloud experience for our customers and partners. They will enable tighter integrations that benefit our customers and partners and spark innovation. And they position us to continue to win in the market."

The restructuring, which aligns business segment financial reporting with the new business structure, moves the HPE Storage business, which accounted for $4.7 billion in sales in the last full fiscal year, into the new Hybrid Cloud business unit.
In the blog post, Neri said the vision he laid out six years ago of a future that would be edge-centric, cloud-enabled and data-driven has turned into strong momentum for customers and partners in every industry.
HPE GreenLake now supports 27,000 unique customer logos and 3.4 million connected devices. "HPE GreenLake is at the center of how we implement our edge-to-cloud strategy," he said.
In the most recent quarter, HPE reported a 48 percent increase in the annualized revenue run rate for HPE GreenLake, to $1.3 billion. HPE's GreenLake total contract value is now nearly $12 billion.
HPE's Intelligent Edge business, meanwhile, was up 50 percent in the most recent quarter to $1.4 billion. That marks the company's fifth consecutive record quarter for intelligent edge sales.
The Intelligent Edge business with Aruba networking is now the largest segment of HPE's operating profit, accounting for 49 percent of the total.
"As I've said many times, I'm confident that we can accomplish anything with the right team guided by the right leaders. I believe we have both," said Neri. "I am proud of the progress we have made in our transformation to be the edge-to-cloud company. These new strategic alignments will enhance the integrated customer and partner experience, strengthen our market position, and accelerate our execution. I look forward to continuing to deliver in ways that only HPE can."
Steve Burke has been reporting on the technology industry and sales channel for over 30 years. He is passionate about the role of partners using technology to solve business problems and has spoken at conferences on channel sales issues. He can be reached at sburke@thechannelcompany.com.
Follow this link:
HPE Restructuring: 5 Things You Need To Know - CRN
Ex – WW NetApp Cloud Sales Leader, Salesforce, and VMware … – PR Newswire
SANTA CLARA, Calif., Sept. 27, 2023 /PRNewswire/ -- Jason McKinney, worldwide sales executive with extensive experience in data, cloud, virtualization software, and infrastructure sales, has joined Volumez as the chief revenue officer.
McKinney most recently was worldwide vice president of public cloud sales at NetApp, driving growth over the last 3.5 years, where he worked with customers to manage global cloud deployments in the cloud marketplaces. During his tenure he successfully launched three first-party services, Azure NetApp Files, AWS FSXN, and Google Cloud Volumes Service, and defined the solutions and GTM for Private Offers for all three cloud marketplaces.
His rich experience also includes sales and leadership roles at Salesforce, launching and running the Social Travel ISV Experience with "Concurforce," combining travel and expense management for CFOs on the Salesforce force.com platform. At VMware, McKinney was part of the launch and movement of datacenter server consolidation core to virtualization, accelerating the transformation of hardware to enterprise software, leading initiatives in Financial Services, Public Sector, and Healthcare Life Sciences with ISVs such as MEDITECH, SAP, and the application ecosystem. Jason scaled these relationships with partners and system integrators, driving large migrations and datacenter consolidations around VMware enterprise license agreements.
"Jason's deep and varied expertise in cloud services highly complements and adds to our innovative approach to next-generation data platforms and cloud storage connecting compute with data," said Volumez CEO Amir Faintuch. "We are excited to join forces and welcome Jason to our growing executive team of high-profile cloud industry leaders who will accelerate our GTM and sales with cloud hyperscalers, customers, and ISVs leveraging these marketplaces."
"I am looking forward to helping customers optimize their cloud commitments, leverage the marketplace, and co-author new use cases with the Volumez core value add of running applications with scale, resilience, and predictability in the cloud," said McKinney. "My proven experience in growing significant sales and leading large, globally diverse teams gives me a unique perspective to build momentum and revenue at Volumez, mapping to customer success."
Volumez, which supports AWS and Azure cloud services, recently completed a $20 million Series A financing round, led by Koch Disruptive Technologies with previous investors Viola Ventures and Pitango, and earlier this year announced John Blumenthal (also a VMware alumnus) as the chief product and business officer.
About Volumez
Volumez innovates next-generation cloud-based data platforms and storage that help companies realize the true potential of their data. With its innovative controller-less architecture, Volumez tackles latency and scalability challenges by establishing direct Linux data paths, ensuring exceptional performance and resiliency. Through cutting-edge technology and a customer-centric approach, Volumez offers comprehensive solutions that streamline data workflows, enhance data quality, and drive informed decision-making. Discover more at Volumez.com.
Video: We are Volumez!
Photo: https://mma.prnewswire.com/media/2222753/Jason_McKinney_Volumez__CRO.jpg
Logo: https://mma.prnewswire.com/media/2105345/4094708/Volumez_Logo.jpg
SOURCE Volumez
Excerpt from:
Ex - WW NetApp Cloud Sales Leader, Salesforce, and VMware ... - PR Newswire