
From terabytes to exabytes: Supporting AI and ML with object storage – GCN.com

INDUSTRY INSIGHT

From agriculture to defense, federal agencies are increasingly using artificial intelligence and machine learning to enhance mission-critical capabilities, accelerate research breakthroughs and free up staff resources.

The byproduct of this adoption is a rapidly increasing store of unstructured data in the form of images and video footage. The amount of unstructured data produced globally is growing by up to 60% per year and is projected to comprise 80% of all data on the planet by 2025, according to IDC.

All this data must be processed, analyzed, moved and stored. Currently, many organizations do this work using public cloud services. However, as the federal government continues to implement AI and ML technologies, many IT leaders are looking for a solution that better suits their needs in terms of cost, convenience and security.

Object storage -- which allows organizations to build their own private cloud storage environment on-premises, as well as unlocking edge computing capabilities -- is quickly emerging as a viable alternative.

So how does object storage work? How do different object stores compare to each other and the public cloud? And more importantly, how easy is it to implement and use? Read on to find out.

First things first

Object storage is a completely different approach to storage, where data is managed and manipulated into individual units called objects.

To create an object, data is combined with relevant metadata, and a custom identifier is attached. Since each object has comprehensive metadata, object storage removes the need for a tiered structure like the one used in file storage. It's therefore possible to consolidate vast amounts of unstructured data into a single, flat, easily managed data lake.
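The object model is easy to picture in code. The following toy sketch (illustrative Python only, not any vendor's API) combines data with metadata, attaches a content-derived identifier, and stores everything in one flat namespace:

```python
import hashlib

# Toy model of an "object": data + metadata + a custom identifier,
# kept in a single flat namespace with no directory hierarchy.
def make_object(data: bytes, metadata: dict):
    key = hashlib.sha256(data).hexdigest()  # content-derived identifier
    return key, {"data": data, "metadata": metadata}

bucket = {}  # the flat "data lake": identifier -> object
key, obj = make_object(
    b"frame-0001 raw pixels",
    {"sensor": "drone-cam-3", "content-type": "image/raw"},
)
bucket[key] = obj

# Retrieval is a single key lookup; rich metadata travels with the data,
# so there is no tiered file/folder structure to traverse.
print(bucket[key]["metadata"]["sensor"])  # drone-cam-3
```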

Object storage is a common solution for cold storage archiving. However, with recent technological advances, data can now be accessed much faster, making it ideal for applications like AI and ML, which require higher performance storage.

Object storage vs. public cloud

The emergence of edge computing goes hand in hand with the rise of AI and ML. Using public cloud services to analyze and store data captured by internet-of-things devices and sensors works brilliantly in urban centers. However, from agricultural drones to bomb disposal robots, connectivity to a central cloud repository is likely to be significantly slower in areas with less-dense network infrastructure.

Object stores solve this problem with low-cost, remote storage that enables computing to happen at the edge. Processing data at the point of collection is significantly faster than sending everything into the cloud, where it must be processed and returned.

Additionally, much of the data used to train AI algorithms has to be stored long term for auditing purposes, another area in which object storage excels. Capabilities including versioning, end-to-end encryption, object locking, and ongoing monitoring and repair enable data to be preserved for decades at a much lower cost than in the public cloud.

Comparing different object stores

When weighing object storage options, it's important to scrutinize the technical features of various products. For instance, some object stores make multiple copies of each object to protect against data loss, which can eat up storage very quickly.

On the other hand, more advanced object stores take advantage of erasure coding, which breaks up a unit of data and stores the fragments across various physical drives. If data is wiped or becomes corrupted -- whether by accident or because of malicious activity -- it can be reconstructed from the fragments stored on the other drives. This lowers storage costs, as it doesn't require organizations to keep multiple copies of each object.

Plus, erasure-coded platforms can achieve incredible data durability, keep disk overheads low and enhance the overall performance of the system. Of course, not all vendors implement erasure coding the same way. Different products will likely have differing scalability, as well as varying rebuild and rebalance times.
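The reconstruction idea is easy to demonstrate with a toy scheme. Real products use Reed-Solomon-style codes that can tolerate several simultaneous drive failures; this hypothetical sketch uses a single XOR parity fragment, so it survives the loss of any one fragment:

```python
from functools import reduce

def encode(data: bytes, k: int = 4) -> list:
    """Split data into k fragments plus one XOR parity fragment."""
    size = -(-len(data) // k)  # ceiling division
    frags = [data[i * size:(i + 1) * size].ljust(size, b"\0") for i in range(k)]
    parity = bytes(reduce(lambda x, y: x ^ y, col) for col in zip(*frags))
    return frags + [parity]

def reconstruct(frags: list) -> list:
    """Rebuild the single missing fragment (None) from the survivors."""
    missing = frags.index(None)
    survivors = [f for f in frags if f is not None]
    frags[missing] = bytes(
        reduce(lambda x, y: x ^ y, col) for col in zip(*survivors)
    )
    return frags

pieces = encode(b"training-set shard 42: drone imagery", k=4)
pieces[2] = None                         # simulate one failed drive
restored = reconstruct(pieces)[:4]       # drop the parity fragment
print(b"".join(restored).rstrip(b"\0"))  # original data, rebuilt
```

Note the overhead difference this buys: here five fragments hold four fragments' worth of data (25% overhead), versus 200% overhead for keeping three full copies of each object.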

Another important feature to examine is the data consistency model used by different object stores. Strong consistency is preferable for AI and ML applications. In short, this means that after a successful write, overwrite or deletion, any subsequent read request immediately receives the latest version of the object. Some object stores still use eventual consistency, where there's a lag until read operations return the updated data. This means that the application will occasionally operate off older versions of the objects.
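The difference can be modelled with a toy two-replica store (purely illustrative; real object stores manage replication internally):

```python
# Toy model: a write lands on the primary replica first; an eventually
# consistent read may hit a replica that has not caught up yet.
class TwoReplicaStore:
    def __init__(self):
        self.primary = {}
        self.replica = {}   # lags behind until replication runs

    def put(self, key, value):
        self.primary[key] = value          # acknowledged before replication

    def replicate(self):
        self.replica.update(self.primary)  # "eventual" catch-up

    def get(self, key):
        return self.replica.get(key)       # reads served from the replica

store = TwoReplicaStore()
store.put("weights.ckpt", b"v2")
print(store.get("weights.ckpt"))  # None -- a stale read: the lag in action
store.replicate()
print(store.get("weights.ckpt"))  # b'v2' -- now up to date
```

A strongly consistent store would update every replica (or route reads through the primary) before acknowledging the write, so the first read could never be stale.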

How easy is it to implement and use?

Ease of use is subjective, of course. However, object storage does have several advantages. For instance, it requires less day-to-day attention than a traditional storage-area network, since the resilience of the system allows multiple disks to fail without incurring data loss. This means over 200 petabytes can be managed by a single administrator.

There's no doubt that managing data captured by AI and ML applications will continue to challenge government IT teams. Object storage is not a panacea, but it does address cost, speed and security issues. Looking forward, agencies that adopt object storage should focus on implementing modular end-to-end data management solutions. These enable elements to be swapped out for more advanced technologies when they become available.

About the Author

Robert Renzoni is director, technical sales Americas, at Quantum.



Cloud-native security benefits and use cases – TechTarget

Alongside the growth in cloud services use, the industry has given rise to the term cloud native. Unfortunately, it's often ambiguous as to what cloud native means and how it applies to security controls and best practices.

No current industry standard definition for cloud native exists that encompasses all use cases and cloud services. It's generally meant to indicate software objects, controls and capabilities offered as a service delivered by a cloud provider and not on premises.

There are several security challenges driving organizations to use cloud-native security services and controls. First, some security tools and products haven't been adapted to cloud environments or haven't been ported adequately. This can be a major driver of cloud-native security adoption, especially when companies must meet compliance and regulation requirements.

Another driver to use cloud-native services is depth of integration with a cloud provider's fabric. For services and tools that require a significant effort to set up, enabling a cloud-native security platform that is already integrated is worth considering. Additional reasons to use cloud-native services include enhanced or unique capabilities that are difficult or impossible to come by elsewhere, and reduced costs versus third-party tools from vendors with expensive licensing models.

Cloud-native security tools and services don't make sense in some scenarios, however. First, many types of cloud-native security controls and services offered by cloud providers aren't considered best-in-class offerings. The AWS Inspector vulnerability scanner, for example, offers minimal configuration and far fewer in-depth vulnerability checks compared to leading scanning engines from third-party providers.

Second, cloud-native tools increase vendor lock-in, which can significantly inhibit a centralized and streamlined security operations function for multi-cloud deployments. AWS Security Hub, for example, doesn't apply to Microsoft Azure or Google Cloud Platform, and Azure Security Center doesn't apply to GCP or AWS.

Numerous categories of cloud-native security can enhance or improve security programs and capabilities. For most organizations, using some cloud-native tools will make a great deal of sense; popular use cases include built-in monitoring, controls for serverless and container workloads, and cloud-based posture management and identity services.

Increasingly, organizations also use cloud-native monitoring tools such as AWS CloudWatch, AWS Security Hub, AWS GuardDuty and similar tools in Azure and GCP to act as built-in guardrails for alerts on suspicious behaviors.

Cloud-native controls for completely cloud-centric services, such as serverless functions, also make sense in many cases. These controls are built in, well integrated and often less expensive than third-party tools. Some cloud-native tools and services are also highly advanced in terms of performance, scalability and capabilities for more modern workload deployment methods such as containers and orchestration services such as Kubernetes.

Additionally, an entire new set of cloud-native tools and services are now becoming more commonplace for cloud security posture management, cloud access security brokers, and identity federation and single sign-on. Sometimes called security as a service, these offerings are completely cloud-based, focused toward cloud services and their use. These services will likely continue to add to the cloud-native security space in coming years, as well.



Etoro takes the Silk road to Azure – Blocks and Files

Silk's accelerated storage IO in the public cloud can make lifting and shifting non-cloud-native databases and other workloads to the cloud straightforward, and have them operate much faster, with sub-millisecond latency.

Online trader Etoro, which started up in 2007, has grown to more than 20 million customers hitting its on-premises data centres with trading requests. These data centres couldn't keep up, and Etoro decided to move its databases and applications across to the Azure cloud and take advantage of its ability to scale quickly.

It considered doing so using PaaS (Platform-as-a-Service), but that would have required Etoro to change its database and application code, adding middleware, to use the underlying Azure platform facilities. This was unrealistic.

The alternative was IaaS (Infrastructure-as-a-Service), with Azure server, network, operating system, and storage facilities presented virtually. This was much faster to implement, but the resulting performance, particularly of SQL Server, wasn't good enough.

Israel Kalush, VP of Engineering at eToro, said: "While some applications can be migrated easily into the cloud, others, especially ones that require high-throughput IO with very low latency, are more complicated and require significant redesign." Significant redesign is expensive and, therefore, less likely to take place.

Silk provides virtual storage array services in the Azure cloud, developed from its on-premises Kaminario all-flash array code base. The cloud-native software delivers high-speed storage IO by using, and protecting, Azure's fast and unprotected ephemeral OS disks, which incur no storage cost. In effect, it is a database acceleration software layer.

The Silk code provides compression, zero-footprint clones, and inline deduplication. It says its users are then able to reduce the amount of cloud resources they need, and cut cloud costs by around 30 per cent.

Etoro decided to use this Silk storage layer between its software and the Azure facilities, and found its software ran up to ten times faster on Azure than without Silk. Using Silk made its Azure incarnation capable of supporting hundreds of thousands of database transactions a second at low latency.

Kalush said: "We have some extremely IO-intensive databases. Silk was the only provider that actually promised and delivered on sub-millisecond latency for those database applications." Etoro has found that this low response time is maintained under heavy loads.

It also says its Azure adoption time was cut in half by using Silk, obviating the need to refactor its code. Check out a video to find out more.



Camera to Cloud: everything you need to know | Industry Trends | IBC – IBC365

It's a workflow that enables filmmakers to export original footage into a post environment as soon as it is recorded. Rushes can be reviewed, edited or otherwise manipulated, and fed back to the set in minutes, saving time, and therefore money, and enhancing creative decision making.

C2C completes the move from a physical workflow to one that's completely digital. It's been technologically possible for several years but has remained dependent on the speed of internet connections and hampered by a general reluctance to change.

Internet access still constrains adoption but the need to cater for Covid-19 safety protocols has shaken the industrys inertia. In recent months, remote workflows have become a staple of editorial for editing, VFX and colour grading, review and approval, in which craft talent is located out of fixed premises with the freedom to work from anywhere.

Now the industry can go a step further and open up collaborative connected workspaces live from location.

"Camera to Cloud breaks down the barriers of time and distance," says Michael Cioni, Global SVP of Innovation at Frame.io. "What was once a linear process of shooting and waiting for footage to be processed is now a parallel process. With Camera to Cloud, your creative team can work together collaboratively without waiting to exchange any physical media."

Chuck Parker, CEO of Sohonet, says: "The concept of a cloud-based platform which can enable even an iPhone to operate as an editing suite has gained even greater urgency with Covid-19. Camera to Cloud can keep remote teams truly connected and in sync, whether they're shooting on a second unit down the road or sitting in an edit suite thousands of miles away."

What's Camera to Cloud good for?

There are several benefits being discussed in the Camera to Cloud mantra. Perhaps the most important element is the ability to get shots into the hands of editorial without waiting for the dailies process. This typically implies a DNX 36 and a DNX 115 or similar - something that the editor can slot into their workflow.

"Dailies have been the quickest route to on-set creative decision making for decades, but it is called dailies for a reason," says Parker. "Footage is processed, often overnight, and returned to set; sometimes you're waiting 24 hours. That's no longer efficient for the pressures of modern production, especially when there is an alternative on tap. The cost of reshoots and additional travel to and from location can easily be 15-20% on the bottom line. Anything that helps reduce the cost of pick-up shots by enabling instant creative decision making is of tremendous value."

During production of 2020 action feature Unhinged starring Russell Crowe, director Derrick Borte and DP Brendan Galvin used a C2C solution to slash the shoot time allocated for a major car chase sequence in half.

"Normally, the crew would have to go back to the video village to get notes then reset the scene," notes Hugh Calveley, Co-founder, Moxion. "Using Moxion they were able to reference the footage of their run as they were resetting, along with notes, and cut the schedule in half."


He continues: "A key part of keeping the art of filmmaking fluid and creative is having the ability to receive immediate feedback, to make a decision, refine judgements and to keep going. The value will show on the screen with better pictures and better stories."

Similar workflows can be applied to any key production head. Art Directors or Executive Producers, for example, unable to get onto set because of limitations in the number of people permitted, can review progress remotely. C2C further enables them to work at a location of their choosing, or on the go, maximising their own time while reducing travel costs.

Off-set creatives can be in direct contact with those on-site, providing live feedback that negates the need to go back and forth later down the line, says Parker. So your teams can feel connected to life on-set with an over-the-shoulder collaboration experience.

What local bandwidth connection speeds are necessary?

According to Cioni, if you can make a phone call from where you're shooting, you can probably shoot C2C. He explains that the way Frame.io's system works is that you can throttle the quality of files up or down based on network bandwidth availability.

"At 2Mbps, 1080/24p, one hour of content is about 1GB of total footage. Since crews typically shoot between two and four hours of footage a day (or 2-4 gigabytes of C2C proxies), they can easily upload all the media spread across the shoot day," he says.

Ideally, having anything more than 10Mbps upload will result in offsite collaborators having access to clips within a minute or two of the take. When higher bandwidth is available, takes are available within seconds. Even a one-hour interview with 10-20Mbps of upload bandwidth can be fully transmitted in less than six minutes, so the post-production and transcription processes can begin while the crew is still wrapping.
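The arithmetic behind those figures is easy to verify:

```python
# Sanity-check the quoted numbers: a 2 Mbps proxy stream for one hour,
# then uploading that hour over a 20 Mbps link.
bitrate_mbps = 2
hour_s = 60 * 60

footage_mb = bitrate_mbps * hour_s / 8         # megabits -> megabytes
print(f"One hour of 2 Mbps proxies: {footage_mb:.0f} MB")   # ~900 MB, about 1 GB

upload_min = (footage_mb * 8 / 20) / 60        # over a 20 Mbps upload link
print(f"Upload time at 20 Mbps: {upload_min:.0f} minutes")  # 6 minutes
```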

For higher quality files like H.265 4K, the same principles apply. "10-20Mbps is going to be enough for C2C to work extremely efficiently, enough to eliminate any sense of delay," Cioni says.

Sohonet is on the same page, recommending a minimum of 1Gbps both up and down to accommodate small and medium-sized productions, with larger tent pole shows often deploying 10Gbps and beyond for their workflows off set.

While there is a lot of buzz around the camera itself pushing directly to the cloud, it's not an efficient model. According to Parker, this is more likely to occur in small-budget productions, B units off-lot, or production of commercials where there are only one or two cameras.

"Most meaningful productions for episodic and features have multiple cameras and are typically using WiFi spectrum to push the data from the camera to the video village where the footage is cached," he says. "The director, producer, DoP, DIT, etc, all want to have a look at the take(s) before agreeing to move forward with the next scene or shot."

Additionally, large shoots employ video assist software to provide for better shot management relative to metadata, multiple camera feeds, scene in/out, etc. Pushing all of the video through a common shot manager is the most common practice for medium and large productions.

RAW workflows - not so fast

This is all fine for the vast majority of workflows which work with proxy video, but anyone wanting to push camera RAW (Original Camera Files/OCF) faces an uphill task today.

OCF requires more like 1000Mbps before it's reliable enough to move. OCFs are not only the largest data payload but the least time-sensitive. Today, OCFs do not come directly from the cameras, but rather are pushed to a local staging environment (on-set or near-set storage as part of the video village).

"When it comes to transmission, the ability to get OCF directly to the cloud is limited today, but we are starting to see it happen," Cioni says. "Companies like Sohonet are working with studios to install ultra-high-speed network connections that enable DITs to upload OCF right from set. Currently, those transmissions can't be done wirelessly because wireless networks still lack the appropriate bandwidth."

"But there is one more thing we need to solve before the wireless networks catch up: today's cameras are not yet designed to get OCF up to the cloud without first downloading it." Cioni predicts that cameras will be developed to allow access to OCFs so that they can be uploaded from the camera itself.

"The first step is that camera manufacturers will have to create that technology, and as they make headway, the telecom solutions will continue to increase bandwidth (hardline or wireless) to allow for connections that move OCF right to the cloud," he says.

Both Parker and Cioni agree that we're 5-7 years away from the average bandwidth utilised on set being suitable for RAW transfers to the cloud, with shooting OCF to the cloud becoming the norm by 2031.

The impact of 5G

"Camera to Cloud is not predicated on the rollout of 5G. Today, LTE and general WiFi hotspots allow ample bandwidth to move files compressed as H.264 up to the cloud," Cioni says.

5G will, however, help boost adoption of C2C by decreasing bandwidth dead zones and increasing internet access points. Satellite internet will also further widen the network reach to reduce dead zones in more remote locations.

"5G will be great for reality TV shows where the camera(s) are following actors in a major city," Parker says. "This happens today with 4G/LTE, and 5G will make this experience much better for filmmakers on the move in major metropolitan areas."

"But 5G only has a range of 1000-2000ft and the millimetre wave technology is disrupted quite easily - even the metal walls of a sound stage create a problem," he says.

For 90% of productions, WiFi-6 is a more likely development, with all of the throughput promises of 5G without a telco in the middle of the business model, Parker thinks.

The increased bandwidth of 5G, though, increases the data, and therefore the quality, of C2C workflows. With 5G, 4K 10-bit files become possible (and eventually OCF). Using this same network availability with a higher-bandwidth signal means that recipients can receive higher-quality files.

C2C vendors

Camera to Cloud workflows are not the domain of any one vendor but a suite of interlocking technologies.

In March this year, Frame.io launched Frame.io C2C, which certifies a number of products to work with its central asset management software. It also claims to have invented C2C, which seems a marketing sleight of hand. Certainly, the product Frame.io C2C is its own, but other vendors market similar technology and claim to have got there first.

One of these is Sohonet, which offers ClearView Flex for streaming live, encrypted video with sub-100ms latency from camera, via HDMI or SDI, to up to 30 viewers. It presents this as a C2C tool along with Immediates from Auckland-based developer Moxion. Immediates offers a way to view HDR and Dolby Vision footage off-set as a non-real-time solution.

Cloud storage developer LucidLink is also promoting its ability to transfer OCF, not proxies, direct to the cloud, where the media is available to users with a LucidLink client installed on their machine.

The number of technology partners providing product to plug into these workflows is growing rapidly. It includes wireless camera encoders/transmitters from Teradek, video assist tools like QTake, portable mixer-recorders from Sound Devices and grading/edit software from Blackmagic Design, Adobe and Colorfront.

Whos handling the network on set?

Even when the internet and network are provided by the location, whether that's a stage, office, or practical location, someone on the crew needs to make sure that the on-set devices stay online.

Equipment like modems, routers, meshpoints and antennas must be set up and maintained. Security needs to be established and monitored. Client devices like computers, streaming boxes and cameras need to be connected and provisioned.


Managing that network is a big job. All of these responsibilities would normally fall under the role of an IT department, but there's no dedicated IT department on set.

Frame.io technology specialist Robert Loughlin discards the two obvious choices, the DIT in the camera department and VTR in the video department, as being too busy to add on more responsibilities.

"That's why it might be time to think about having a dedicated production network manager on the set," he says. "They could be involved with the production from the scouting stage to advise on what the possibilities, limitations and requirements are for a given shooting location, and can then put together the right package of tools to ensure the production has what it needs to reliably work on the internet."

Then, once production starts, they can maintain and monitor the network, making sure the right devices get connected and stay connected.

New on-set gear

Regardless of whether that on-set network manager falls into an existing role or becomes a new one, they'll need the right tools to get the job done.

"Production needs its own versions of modems and routers that fit the unique and specific needs of the set," says Loughlin.

Heat management, battery power (based on standards like Gold Mount and V-Lock), mountability, portability, durability and reliability are all important factors.

"This also opens up new opportunities for production gear manufacturers to grow and develop a new segment of the market, and is something rental houses could explore," he says.

Who is using C2C already?

The first notable production to use Frame.io C2C was Catchlight Studios' Songbird, the first union-crewed film to go into production during Covid, in July 2020. Use cases also include red carpet coverage of the 63rd Grammy Awards transmitted from Los Angeles to London, and documentary filmmakers using C2C to quickly derive transcripts from the set so the editing process can begin immediately.

Sohonet claims up to 20 projects have used ClearView Flex on set with overall adoption of Camera to Cloud workflows slow but gaining speed.

Gaps in the workflow

Nonetheless, there are gaps in the workflow. Perhaps the trickiest is the ability for a production to access media assets regardless of which cloud they are stored on. This would fulfil MovieLabs' principle of creative applications moving to the data and not the other way around. Right now, though, different facility and technology vendors have a preferred cloud partner.

"It is very likely that productions will utilise multiple cloud service providers," Parker says. "The VFX team might use Google, editorial might use Azure and the dailies platform might live in AWS. Each of the major CSPs charges a similar egress fee."

The reality is that until CSPs soften their approach commercially (by not charging major players), this will continue to slow cloud adoption because of both the absolute cost and the unpredictable (budget blowing) nature of the egress fees.

Cioni believes the CSP market will have to shift from exclusivity to malleability. He explains: "The current problem is that the user wants to leverage their own storage deals (cloud service provider A) at the same time as leveraging a cloud processing service (cloud service provider B)."

Getting A storage on B processing means both parties have to integrate, and both parties will want to earn on the exchange. This is similar to how ATMs work: there's no fee when you withdraw money from your bank's ATM, but if you go to a competitor's ATM and withdraw money from your bank there's a fee on the exchange.

The hope is that, as more enterprises use the cloud, they will push cloud companies to produce an experience that is essentially storage-agnostic. This means processing can happen with the service and the media can be stored wherever it's most convenient for the customer. But what remains to be seen is what the costs will be to leverage that kind of flexibility.

Cloud costs and data movement

Sohonet, Frame.io and others have an "all you can use" business model with cloud egress fees included, which may ease the cost problem. For example, if you use ClearView to stream from set, you pay no per-minute or per-GB fee, whereas other on-set providers charge $0.10 per GB streamed, according to Sohonet. The Frame.io model is based on a flat rate, in which there's a monthly set cost per terabyte with no added ingress/egress fees, regardless of use.

An additional hazard is potential loss of data, delays and even security when moving content between cloud providers. Its an issue that MovieLabs, which represents the big Hollywood studios, is keen to see solved.

"We expect workflow to continue to span multiple cloud infrastructures, and we encourage that choice, but the crucial point is that sharing work across various clouds should be seamless so it acts like one big cloud," says Mark Turner, Program Director, Production Technology, MovieLabs.

As per MovieLabs' 2030 Vision, this means building a Camera to Cloud strategy that moves the video modification process (editing, VFX, colour, sound) to the cloud where the video data sits. Then the only egress happens when it is streamed out for a live review and approval session, which is 1/100th to 1/1000th of what is being stored in the cloud at that point.

"This is a big opportunity for cloud companies and I expect there will be a major breakthrough over the next four years, given that cloud storage prices have been relatively flat for the last three years," Cioni says. "It's likely that this will change by 2025, when costs per terabyte will begin to decrease."



IDrive vs OneDrive – ITProPortal

There are a plethora of options when it comes to storing your files and data in the cloud. That's a good thing, as having various choices allows users to find the best cloud storage that caters to their needs. It can, however, make the process of selection a little daunting, which is where we come in.

Microsoft, a household name in tech hardware and software, joined the cloud-wagon in 2007, offering OneDrive to the world. Since then, it has come a long way, and what was once a good solution for personal users is now also an attractive option for businesses. Our comprehensive OneDrive review gives further insights.

For its part, IDrive came to life way back in 1995. Originally, it was designed for the tech-savvy business user who needed a robust backup solution for their system. Today, however, IDrive offers features such as file sharing and file synchronization across multiple devices, making it more appealing to the commercial market. Learn more about the platform in detail in our full IDrive review.

So, with both OneDrive and IDrive commonly appearing amongst the leading options available, which one is right for you? Looking at everything from features to affordability, we'll help you decide in this IDrive vs OneDrive head-to-head.

Here, we break down the key features offered by both services, analyzing which one does what better, and suggesting how important this should be in your decision-making. IDrive and OneDrive both come with an integrated desktop app that makes backing up your files to the cloud effortless. There is, however, a distinct difference between the two in this area.

IDrive can perform a sector-level backup, which allows you to back up your files, system settings, application data, and operating system. You can do this across multiple computers, so if one of your machines malfunctions, you can restore your full system on the replacement computer.

OneDrive, however, only allows users to create a file-level backup, meaning you can't back up your full system. Should one of your computers break down, you'll need to reinstall data such as applications and system settings to get up and running on your new device.

Through its Files On-Demand feature, OneDrive also enables you to free up space on your computer by only storing your files in the cloud. You can access and work on these files pretty much as normal, but you'll need to be connected to the internet to do so. Unfortunately, this isn't a feature that IDrive offers.

To finish, for personal use, OneDrive ticks many boxes, but for businesses looking for an in-depth backup solution, IDrive has to be the go-to choice.

File versioning allows users to access previous versions of edited files. OneDrive and IDrive both offer this feature, but IDrive's is more generous: it retains up to 30 older versions of a file indefinitely, while OneDrive keeps 25 older versions for only 30 days.

Creating and sharing files inside a cloud storage provider allows for better productivity and more seamless collaboration between users. When it comes to integrated applications such as a word processor and spreadsheet creator, OneDrive begins to push ahead of IDrive, although it's hardly a fair race at this point.

Because Microsoft already has its own productivity tools, it's able to easily integrate them into its OneDrive software. That means multiple users can collaborate simultaneously on files created in Word, Excel, or PowerPoint, for example, leading to a more efficient workflow. There are also integrations with other Microsoft apps and tools, depending on the selected plan.

In contrast, IDrive doesn't offer any native productivity tools, meaning users have to create documents outside of the IDrive platform. This makes it harder for multiple users to collaborate on and edit documents together.

It's essential, however, that your files and data are protected by strict privacy and robust security. On the basics, the two are pretty much even: both offer industry-standard AES 256-bit encryption for your files, in transit and at rest.

Other security measures such as two-factor authentication and the option to create your own encryption key are available on both services. But this is where IDrive begins to move ahead.


IDrive is a zero-knowledge provider, meaning it doesn't store your encryption key on its servers. Only the person who created the key will have access to it, as well as to the data it's protecting.

On the other hand, OneDrive stores encryption keys on its servers, opening up the opportunity for potential access by third parties. This means that if there were ever a data breach, a hacker might be able to read your files. This would be impossible with IDrive, as even if a hacker did gain access to its servers, they wouldn't be able to decrypt and thus read your files.
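To make the distinction concrete, here is a minimal sketch of how a zero-knowledge scheme keeps the key off the provider's servers, using only Python's standard library. The passphrase and iteration count are purely illustrative, not IDrive's actual implementation; the point is that the key is derived locally from the user's passphrase, so the provider only ever holds the ciphertext and a non-secret salt.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # The key is computed locally on the user's machine; only the salt
    # (which is not secret) travels to the provider with the encrypted data.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)

# Holding the salt without the passphrase is not enough to recompute the key:
assert derive_key("correct horse battery staple", salt) == key
assert derive_key("wrong passphrase", salt) != key
```

A real client would feed a key like this into AES-256 before upload; what matters is that nothing stored server-side is sufficient to reconstruct it.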

Both services allow you to add specific files and folders to the cloud through their synchronization feature. By syncing files, users can access their data across multiple devices. If you're using a different computer, for example, you can log into each application through your browser to access your files.

OneDrive and IDrive have mobile applications, allowing you to upload, download, and access files from your smartphone or tablet. Both services also let you create a link for specific files and share it with others, a common feature in cloud storage providers.

Like OneDrive, IDrive claims it offers collaborative access to files. However, during IDrive testing we were only able to control the amount of time the link remained active, and how many times the shared file could be downloaded. There was no option to set editing permissions for other users, a feature that is available with OneDrive.

Although IDrive has made strides with its cloud service, it still has some catching up to do. Another limiting feature is that it only backs up files 4GB or smaller (if you need to back up large files, IDrive sends out a physical hard drive). OneDrive, however, can back up files of up to 250GB in size.

Both IDrive and OneDrive have impressive performance in terms of usability and speed, featuring a simple and intuitive user interface.

OneDrive is exceptionally stripped down, existing simply as a folder on your desktop rather than a standalone application. IDrive acts very much in the same way, but also offers a separate desktop application (here you can program automatic daily and weekly backups). Uploading files to the cloud from our desktop was extremely easy. We simply had to drag and drop the files we wanted to back up into each cloud service's designated folder.

We also tested both services' web-browser applications. OneDrive's application was far more responsive, and navigating through it was straightforward. At times, we found IDrive's web-browser app unresponsive, often experiencing crashes or longer loading times when navigating each tab.

In terms of speed, we put both programs to the test with an internet connection offering 100Mbps download speeds and 15Mbps upload speeds. We backed up 7.2GB of data (including photos, documents, music, and video) to each service. IDrive did okay, taking one hour and 43 minutes to complete the upload, while OneDrive took quite a bit longer at two hours and 29 minutes.

IDrive won the race again when downloading the same files, taking only 16 minutes to complete, while OneDrive's download took 24 minutes. This isn't the worst performance, but remember to factor in differing times depending on the speed of your internet connection.
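As a back-of-the-envelope sanity check (treating the quoted 7.2GB as decimal gigabytes), the 15Mbps uplink imposes a hard floor of roughly an hour on any upload of that size, which puts the measured times in context:

```python
# Theoretical minimum upload time for the test payload on the test link.
size_gb = 7.2        # data backed up in the test
upload_mbps = 15     # upload bandwidth of the test connection

size_megabits = size_gb * 1000 * 8            # decimal GB -> megabits
floor_minutes = size_megabits / upload_mbps / 60

idrive_minutes = 103    # 1h 43m measured
onedrive_minutes = 149  # 2h 29m measured

print(f"theoretical floor: {floor_minutes:.0f} min")
print(f"IDrive used {floor_minutes / idrive_minutes:.0%} of the link")
print(f"OneDrive used {floor_minutes / onedrive_minutes:.0%} of the link")
```

By this estimate IDrive sustained roughly 60 per cent of the available bandwidth and OneDrive just over 40 per cent; protocol overhead, encryption, and throttling account for the gap.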


With a company as large as Microsoft, we'd expect it to offer the best-in-class support for OneDrive. Sadly, that's not the case. Users do have access to the community forums, which cover many of the more commonly asked questions. If you want to speak to a member of the specialized support team, however, your options become rather limited.

There's no telephone support, which is frustrating, nor is there an option for live chat. Users instead have to make do with sending an email. When testing the response times, we received support within nine hours of our original query.

On the flipside, IDrive does all of the above, and does it exceptionally well. You can call its technical support team and its billing department from inside and outside the United States (offices are open Monday to Friday, 6:00AM to 11:30PM PST). We received immediate assistance when using the live chat, and waited only three hours for a response to our email query (wait times will differ depending on your time zone).

Both services offer 5GB of free storage, as well as a good range of plans for both personal and business use. For personal use, IDrive offers two plans: Personal 5TB ($79.50 a year) and Personal 10TB ($99.50 a year). That's extremely affordable, and should appeal to anyone working with large files.

Business users have more options when it comes to plans, six in total. They start from 250GB of storage space ($99.50 a year), and go up to 12.5TB of storage ($2,999.50 a year). You can also add multiple users to your selected plan, making it much easier to manage backups and share files.

Microsoft uses its range of productivity tools as leverage when it comes to its plans. Its basic personal plan is $1.99 a month, offering 100GB of cloud storage. Move up to the next option, at $69.99 a year for 1TB of cloud storage, and you'll also be able to use office apps such as Word and Excel.

For business use, if you want a fully inclusive Microsoft experience, you'll need to pay $12.50 a month per user (paid annually). With that, you get 1TB of storage, access to Microsoft 365, and the use of apps such as Sharepoint, Teams, and Yammer. On the budget end, you can pay $5 a month per user for 1TB of OneDrive storage, but won't have access to any of the other features.

It's a shame neither provider offers an unlimited storage option. Still, both services remain affordable and competitive in price. If you don't need a fully integrated experience with a range of productivity tools, then we feel IDrive edges it here in terms of finding the overall balance between cost and features.

In many ways, this head-to-head is like comparing apples and oranges. Both OneDrive and IDrive exist in the same family (that is, cloud storage), but they're also extremely different. IDrive offers a complete backup solution, while OneDrive is instead focused on storage, productivity, and collaboration.

For the personal user, we'd suggest going with OneDrive, especially if you use a Windows laptop or PC, where it comes preinstalled. It's easy to use, and many of your contacts will use it too. That's not to say IDrive is a bad option: it's just not quite on the same level as its rival when it comes to collaboration, syncing, and sharing.

But for the business user, including those who are self-employed, it becomes a little trickier to suggest the best option. Without a doubt, IDrive takes the crown for better privacy and security, as well as more backup options for your system. If, however, you want to use integrated productivity tools, collaborate on a project, and back up your most important files, OneDrive is the winner.

Make sure to read some of our other comparison features, including IDrive vs Backblaze and Google Drive vs OneDrive, to get an insight into how these two platforms shape up against other competitors in cloud storage.

Read the rest here:
IDrive vs OneDrive - ITProPortal

Read More..

Walgreens finishes its trip to the cloud and its retail journey is just beginning – RetailWire

Aug 13, 2021

Many retailers made plans to move their information technology systems to the cloud prior to the start of the novel coronavirus pandemic last year. But, as with many other elements of their business, this too was accelerated when they found themselves having to deal with changing consumer behavior and other developments that quickly arose as a direct result of COVID-19. Walgreens Boots Alliance (WBA) is a case in point. The drugstore giant announced that it completed moving 122 enterprise resource planning (ERP) apps to the cloud in May, according to a Wall Street Journal article.

Francesco Tinto, global chief information officer at WBA, who joined the company in 2019 from Kraft Heinz, said that the move to the cloud will speed processes across Walgreens' business, including inventory management, point-of-sale transactions, accounting and more. The company's migration to the cloud concluded a five-year effort to overhaul the drugstore's tech systems, specifically related to its retail operations.

The need for speed became particularly apparent in the early months of the pandemic and, according to the Journal's reporting, Walgreens sees migration to the cloud as a key element in addressing it. The article provided an example: setting up a server to run in the cloud now takes a few hours, compared to up to a week before.

The new system will integrate functions across the Walgreens organization, giving associates the ability to check store inventory with handheld devices. Employees can now act more quickly than ever before on pandemic-specific challenges, such as access to real-time sales and inventory data on such items as masks and sanitizers.

A forecast from International Data Corp., sourced in the Journal piece, estimates public cloud spending will reach $385.3 billion this year, up from $312.4 billion in 2020, the previous record year for cloud migration.

A KPMG report published last year identified the speed with which many companies were moving their systems to the cloud. Fifty-six percent of technology executives surveyed said that full migration had become an absolute necessity for their businesses and that previous piecemeal migrations were being abandoned in direct response to facts on the ground resulting from the pandemic.

DISCUSSION QUESTIONS: Has moving enterprise resource planning to the cloud become a business necessity for retailers and consumer-direct brands? What do you see as the associated upside and downside for companies as they begin moving systems to the cloud and once they are established on it?

"With the pandemic and with impending weather crises, the cloud becomes even more important as employees work from home or from other remote locations. "

"Now perhaps they should focus on the front of house and how their stores look and website works. Both need optimizing!"

"Theres no question that the future of retail lies in the cloud whether were talking about digital, online sales, or physical stores."

See original here:
Walgreens finishes its trip to the cloud and its retail journey is just beginning RetailWire - RetailWire

Read More..

MinIO: the smartphone camera of object storage Blocks and Files – Blocks and Files

MinIO wants to be the smartphone camera of object storage, in the sense that millions of smartphone cameras destroyed the traditional camera industry.

Consumers have largely stopped buying cameras because smartphones can take pictures more easily, cheaply and quickly. Likewise, when virtually every organisation on the planet has access to open-source MinIO object storage software, MinIO hopes to become the default choice for object storage projects.

How widespread is MinIO's software? It has had 622 million Docker Pulls, up 90 per cent year-on-year, with nearly a million occurring each day. A Docker Pull is a MinIO software download from a registry. Thirty per cent of the Pulls are in North America, 30 per cent in Europe, the Middle East and Africa, 20 per cent in the Asia Pacific region, and ten per cent in Latin America.

The true number of downloads will be even higher. Co-founder and CEO Anand Periasamy told us in a briefing: "That only represents public adoption. Private ones are hidden."

We think MinIO sees itself penetrating the object storage market from the bottom up. Its software is simple to obtain, clean to use, capable, cloud-native, S3-compliant, reliable, scalable and very fast. Many storage vendors use it to provide an object storage add-on or gateway to their products. For example, VMware, Seagate with Lyve Cloud, and Pavilion Data.
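That S3 compliance is what makes such drop-in reuse possible: any client that can produce an AWS Signature Version 4 request can talk to a MinIO endpoint. As a rough sketch of what that entails (the hostname, bucket, and credentials below are hypothetical, and in practice you would use an SDK such as boto3 or MinIO's own client rather than hand-rolling signatures), here is a SigV4 presigned GET URL built with only the Python standard library:

```python
import hashlib
import hmac
from urllib.parse import quote

def presign_get(host, bucket, key, access_key, secret_key,
                region="us-east-1", amzdate="20210813T000000Z", expires=3600):
    """Build an S3 SigV4 presigned GET URL (path-style, as MinIO serves it)."""
    datestamp = amzdate[:8]
    scope = f"{datestamp}/{region}/s3/aws4_request"
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amzdate,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    query = "&".join(f"{quote(k, safe='')}={quote(v, safe='')}"
                     for k, v in sorted(params.items()))
    # Canonical request: method, path, query, headers, signed headers, payload.
    canonical = "\n".join(["GET", f"/{bucket}/{key}", query,
                           f"host:{host}\n", "host", "UNSIGNED-PAYLOAD"])
    string_to_sign = "\n".join(["AWS4-HMAC-SHA256", amzdate, scope,
                                hashlib.sha256(canonical.encode()).hexdigest()])
    # Derive the signing key by chaining HMACs over the credential scope.
    sig_key = f"AWS4{secret_key}".encode()
    for part in (datestamp, region, "s3", "aws4_request"):
        sig_key = hmac.new(sig_key, part.encode(), hashlib.sha256).digest()
    signature = hmac.new(sig_key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return f"https://{host}/{bucket}/{key}?{query}&X-Amz-Signature={signature}"

url = presign_get("minio.example.com", "mybucket", "report.csv",
                  "MYACCESSKEY", "MYSECRETKEY")
```

Because the signing process is standardized, the same logic works unchanged whether the host is an AWS S3 endpoint or a self-hosted MinIO deployment.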

In effect VMware, Seagate and Pavilion have validated MinIO's software.

Object storage is assuming a more important role in IT, and that rising tide lifts the MinIO boat as well as those of Cloudian, DataCore, Dell EMC, IBM, NetApp (StorageGRID), Scality and other suppliers.

Periasamy told us: "Object storage is now competitive for database work. Kubernetes primarily looks for object storage. The AWS S3 giant ($45 billion of it) is squeezing out the object players, with only a handful of pure-play object vendors left. Object is changing everything and we're a major player in this space going forward."

"You have to be available everywhere, on-premises and in the public cloud. It's what differentiates us. We have more than 500,000 hosts across AWS, Azure and GCP. Cloud customers adopt us because we have multi-cloud portability."

"We're the only object storage company present everywhere. We're available with Kubernetes, SUSE, HPE's Ezmeral, VMware Tanzu, OpenShift and in the Alibaba cloud."

The idea would seem to be that Cloudian, Scality and the others have sold comparatively large orders to enterprises, while MinIO has made its software freely available to enterprises and everybody else. It has penetrated the pores of the whole global IT ecosystem and is flooding around the large enterprise customers using the other object storage suppliers' products.

And they are reacting. Scality is rewiring its object storage to be cloud-native in the form of Artesca, following MinIO's lead, as Periasamy might see it.

Periasamy said: "Private cloud is our engine as we grow." MinIO's binary is small, less than 100MB. It fits on a Raspberry Pi. It's great for the edge and powerful enough for the middle and the data centre.

Enterprises have a problem, as they need, Periasamy says, "a consistent object storage system to run everywhere; edge, data centre, public cloud, and private cloud. That's what's driving our growth today and I don't see that changing. We've been enterprise ready for years and we're already running storage at pretty major companies ... We are well-positioned to win."

MinIO has 140 paying customers for Subnet, its subscription network commercial license, which was launched in August last year. On the one hand, this is an absurdly low number for a supplier with well over 600 million Docker Pulls. On the other hand, it will be enterprises that pay for MinIO support (think Red Hat and Linux), and this is a reasonable total after only twelve months.

Our view is that MinIO has comparatively few enterprise-capable channel partners, compared to an outfit like Red Hat with more than 2000. This is changing.

Periasamy says MinIO has a strong OEM business: "Lyve Cloud for Seagate is powered fully by MinIO ... There's been an acceleration of paying customers since we changed to the GNU Affero General Public Licence (AGPL) v3 in May. If you ship under AGPL v3 then you can use MinIO freely. If not, buy a commercial licence."

"Our software is simple to support and customers get 24/7 direct access to level 4 engineering support. That's unique to MinIO."

He then outlined a strength which is possibly also becoming a weakness: "We actually don't have sales people. All our business is in-bound. There are no sales engineers. We have marketing, engineering and documentation. That is a very different model, a digital one. It played well in the pandemic."

In Periasamy's view: "NAS shifting to object storage is a clear trend. Modern software stacks are designed to use S3-only object storage. They can't run on file, and SAN is shifting to NVMe-oF."

"GCP and Azure Blob is okay for mono-cloud, but you need S3 for AWS mono-cloud, for multi-cloud and for hybrid on-premises-cloud."

We think the progress of S3 can be seen as a bottom-up capture of the object storage market. Periasamy draws a parallel with Microsoft Windows: "See how Microsoft's desktop GUI destroyed the Unix desktop. Without Windows, Microsoft wouldn't have won the server market."

"People can't simply download other object storage, but they can pull MinIO straight from Docker. Land-grabbing is important to us. We are building a long-term sustainable brand. MinIO is Hotel California without strings."

MinIO was founded in 2014 and has 42 employees. It is still using cash from its 2017 $20 million Series A funding. Periasamy said: "We'll raise a Series B in the not too distant future. We have a lot of flexibility [and] we're not talking about profitability." There is a lot of preliminary investor interest, and he also said: "The next round will be a big one."

Our view is that a slice of that cash will be pointed towards the partner network and MinIO's partner infrastructure, and used to grow MinIO's overall enterprise selling capability. The company has a massive groundswell of support and use, but comparatively (we stress, comparatively) little enterprise usage, at least not compared to its Docker Pull numbers. Cloudian, Scality, NetApp and IBM surely each have more enterprise customers than MinIO has paying customers.

That's MinIO's challenge: enterprise catch-up with its competitors. It may be possible, given its ability to provide a consistent object storage experience across all the computing platforms an enterprise uses, its extraordinary presence in the wider object storage community, and its software's performance.

Did anyone say MinIO equals MaxIO? It's about time they did.

Read the rest here:
MinIO: the smartphone camera of object storage Blocks and Files - Blocks and Files

Read More..

5 Companies That Came To Win This Week – CRN

The Week Ending Aug. 13

Topping this week's Came to Win list are NortonLifeLock and Avast for a planned merger that will create a consumer cybersecurity superpower.

Also making the list this week are machine learning company Dataiku for an impressive funding round, chipmaker Marvell for an acquisition that continues its push into data center technologies for hyperscalers, Alkira and Telarus for a master agent alliance around Networking-as-a-Service, and Arcserve for launching a new data protection appliance that incorporates technology from allies Nutanix and Sophos.

NortonLifeLock To Acquire Avast In Cybersecurity Deal Potentially Worth $8 Billion

Cybersecurity vendors NortonLifeLock and Avast are merging in an acquisition deal that will create a global powerhouse in consumer security.

The deal, disclosed this week, brings together NortonLifeLock, the spin-off from security giant Symantec, and Prague, Czech Republic-based Avast, which has its U.S. headquarters in Redwood City, Calif.

Together NortonLifeLock and Avast will have greater scale in threat visibility, a geographically distributed cloud data platform, and advanced AI-based automation and classification capabilities, the companies said. There are more than 500 million users of the two companies' complementary product portfolios.

The deal is valued at around $8 billion; the final price depends on which mix of cash and NortonLifeLock stock Avast shareholders choose to accept. That would make the deal the cybersecurity industry's third-largest acquisition, after Thoma Bravo's $12.3 billion proposed buy of Proofpoint and Broadcom's $10.7 billion acquisition of Symantec's enterprise business.

AI And Machine Learning Company Dataiku Raises $400 Million

Dataiku, a rising star in the data science, AI and machine learning space, raised an impressive $400 million in a Series E funding round this week that puts the company's valuation at $4.6 billion.

The company plans to apply the funding to grow its employee roster, both to accelerate product development and expand its go-to-market activities.

The Dataiku platform is used by data scientists and data engineers to design, deploy and manage data-intensive AI and analytical applications.

Marvell's $11-Billion Data Center Bet Continues With Innovium Buy

Fast-growing chipmaker Marvell Technology struck a deal this week to buy data center networking chip startup Innovium for $1.1 billion in a move to boost its hyperscale cloud data center market presence.

The Innovium acquisition comes just a few months after Marvell's $10 billion acquisition of semiconductor maker Inphi, whose chips connect data center switches to fiber optic cables.

With the two acquisitions, Marvell is making a big bet on winning a share of the capex spending by hyperscale operators like Amazon Web Services, Microsoft and Google. Data center capex spending by hyperscale operators reached $38 billion in just the first quarter of 2021.

Alkira, Telarus Team To Bring Cloud Networking To Agent Channel

Multi-cloud networking startup Alkira inked its first master agent deal this week, paving the way for the company to tap into the vibrant agent partner community.

Under the deal, master agent Telarus now counts Alkira as one of its suppliers, providing its agent partners with access to a cloud Networking-as-a-Service specialist. That's a boost for Telarus partners as the networking needs of their customers change and demand increases for hybrid solutions.

Telarus launched its cloud practice last year and has continued to build out its portfolio of cloud telecom and network services. That provides agent partners with access to the cloud vendors and technologies they need to meet customers' changing requirements.

Arcserve N-Series Integrates Data Protection, Nutanix HCI And Sophos Security

Data protection software developer Arcserve wins kudos for taking a best-of-breed approach to developing its latest product. The company debuted a new line of integrated data backup, recovery and ransomware protection appliances this week that incorporate hyper-converged infrastructure technology from Nutanix and endpoint protection technology from Sophos.

The new N-series line of appliances, to be sold by Arcserve through its channels, combines Arcserve's UDP data protection technology with the Nutanix Mine platform for private cloud storage and Sophos Intercept X Advanced for Server malware protection software.

The goal of the N-series appliances is to provide secure data protection, backup and recovery to enterprise customers with hyper-converged infrastructure environments.

Link:
5 Companies That Came To Win This Week - CRN

Read More..

Global Multi Cloud Storage Market Analysis Trends, Growth Opportunities, Size, Type, Dynamic Demand and Drives with Forecast to 2027 The Manomet…

Global Multi Cloud Storage Market Report Exhibits Key Drivers, Industrial Analysis, And Competitive Study

The report on Global Multi Cloud Storage Market sketches out all the essential market data for illuminating the readers and investors with the market facts that can help expand their own businesses. The report covers information including market drivers, validated figures, market growth rate, and other changing market dynamics.

It also embeds the latest market opportunities and challenges, threats, sales analysis, and changing consumer behavior. The Global Multi Cloud Storage Market report also covers market growth driving factors, various market segmentations, regional analysis, and a competitive study. An overview of the Multi Cloud Storage industry is obtained from this report through market analysis conducted using various research methodologies and primary or secondary resources.

Get Exclusive Sample Report + To Know the Impact of COVID-19 on this Industry:https://www.marketdataanalytics.biz/global-multi-cloud-storage-market-2021-by-product-by-96025.html#request-sample

The Market Data Analytics report on the Global Multi Cloud Storage Market also clarifies regional market attractiveness, sales and demand, financial growth, and socio-physiological status using regional market analysis. The regional study shows regions such as North America (United States, Canada, and Mexico), Europe (Germany, UK, France, Italy, Russia, Spain, and Benelux), Asia Pacific (China, Japan, India, Southeast Asia, and Australia), Latin America (Brazil, Argentina, and Colombia), and the Middle East and Africa holding a dominating position in the Multi Cloud Storage Market. Similarly, the competitive landscape helps enlighten investors and readers with information such as sales analysis, changing consumer behavior, new product launches, and manufacturing changes.

Global Multi Cloud Storage Market Size & Share, by Product Types:: Public, Private, Hybrid

Global Multi Cloud Storage Market Size & Share, Applications:: BFSI, Retail, Energy and Utility, Health Care and Life science, Government, Other

Moreover, the Multi Cloud Storage market research report also mentions the historical and future development trends and figures to better understand the market position on both the global and regional platform. Some of the key players dominating the Global Multi Cloud Storage Market include AWS, HPE, SAP SE, Red Hat, Gosun Technology, Oracle, Nasuni, EMC, Google, Rubrik, Rackspace, Zadara Storage, Microsoft, IBM, Qumulo, VMware.

Request Pre and Post COVID-19 Impact Analysis on Multi Cloud Storage:https://www.marketdataanalytics.biz/global-multi-cloud-storage-market-2021-by-product-by-96025.html#request-sample

What Are The Important Points Covered In The Report?

Geographical distribution to understand regional market attractiveness, consumer demand, market revenue, and growth
Changing competitive dynamics to better explain the market analytics and market valuation
Key business strategies adopted by the key players to help investors in their decision-making
Market analysis of the market drivers, opportunities and challenges, and threats
The market share, revenue, and size/volume, providing more knowledge on future scope and the expected market growth rate during the forecast period

Queries Answered In The Report:

What is the expected market growth rate over the forecast period?
Which is the primary reason expected to propel the Multi Cloud Storage Market growth rate?
Which are the dominating key market players in the Global Multi Cloud Storage Market?
Which region exhibits rapid revenue generation during the forecast period?
What are the latest trends expected to fuel the market growth?
What are the business strategies adopted by the key players to survive the competitive dynamic changes?

Inquire To Know Additional List of Market Players Included, Request Here::https://www.marketdataanalytics.biz/global-multi-cloud-storage-market-2021-by-product-by-96025.html#inquiry-for-buying

About Us

Market Data Analytics is a leading global market research and consulting firm. We focus on business consulting, industrial chain research, and consumer research to help customers provide non-linear revenue models. We believe that quality is the soul of the business and that is why we always strive for high quality products. Over the years, with our efforts and support from customers, we have collected inventive design methods in various high-quality market research and research teams with extensive experience.

Read the original post:
Global Multi Cloud Storage Market Analysis Trends, Growth Opportunities, Size, Type, Dynamic Demand and Drives with Forecast to 2027 The Manomet...

Read More..

Gartner’s Top 13 Backup And Recovery Software Leaders – CRN

The 13 World-Leading Enterprise Backup And Recovery Software Companies

With the move towards public cloud and ransomware attacks grabbing global headlines in 2021, concerns with backup and data management complexities are leading many businesses to rearchitect their backup infrastructure inside data centers and public clouds.

There are 13 vendors leading the backup and recovery software market on a global basis that are helping enterprises transform and improve their IT environment in data centers, according to Gartner's new 2021 Magic Quadrant for Enterprise Backup and Recovery Software Solutions. The IT research firm says the enterprise backup and recovery software market underwent significant transformation over the past two years, as ransomware solutions, support for public cloud backup, and SaaS-based application recovery take center stage in the new hybrid cloud world.

The world's leading vendors are providing the backup and recovery software to capture a point-in-time copy of an enterprise workload and write the data out to a secondary storage device for the purpose of recovering the data in case of loss.

CRN breaks down the 13 vendors that made Gartner's 2021 Magic Quadrant for Enterprise Backup and Recovery Software Solutions, leading the market on a worldwide basis.

Read the original:
Gartner's Top 13 Backup And Recovery Software Leaders - CRN

Read More..