
Explained: How the changes in Google free cloud storage policy will impact you – The Indian Express

Written by Shruti Dhapola, Edited by Explained Desk | New Delhi | Updated: December 9, 2020 9:49:31 am

Starting June 1, 2021, Google Photos will no longer offer free unlimited storage.

Google's online cloud storage policy will undergo a major change from June 1, 2021. The reason: the search giant will no longer offer unlimited free storage for uploading photos and videos to its Google Photos service. But that's not all: Google will also start deleting content from inactive accounts. We explain everything you need to know about Google's new online cloud storage policy and how it impacts you, the user.

What is the existing policy from Google?

Users on a regular Google Account get 15GB of storage space free. This is considerably more than Microsoft, which offers 5GB of free space on OneDrive, and Apple's iCloud, which also offers 5GB.

This 15GB of space counts towards the user's Gmail, Drive and Photos. Drive includes all files, spreadsheets, etc created in Google's suite of apps such as Google Docs and Google Sheets. However, photos uploaded to the Google Photos app did not count against this free space. This applied to all high-resolution photos and videos and express-resolution photos and videos.

According to Google's definition, when you upload photos at high resolution, they are compressed to save space. Photos larger than 16MP are resized to 16MP. Videos at more than 1080p resolution are also resized down to high-definition 1080p. In express resolution, photos are capped at 3MP, and videos at 480p resolution.
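The resize rules above can be sketched as a tiny lookup. This is an illustrative model of the behaviour described in the article, not Google's actual code; the tier names and caps come straight from the text.

```python
# Illustrative model of the resize rules described above -- not
# Google's actual code. "High" caps photos at 16MP and videos at
# 1080p; "express" caps them at 3MP and 480p.

LIMITS = {
    "high":    {"photo_mp": 16, "video_p": 1080},
    "express": {"photo_mp": 3,  "video_p": 480},
}

def stored_resolution(kind, value, quality):
    """Return the resolution actually stored: megapixels for a
    photo, vertical lines for a video. Uploads below the cap are
    kept as-is; anything above is resized down to the cap."""
    key = "photo_mp" if kind == "photo" else "video_p"
    return min(value, LIMITS[quality][key])
```

So a 24MP photo uploaded at high resolution is stored at 16MP, while a 12MP photo is kept untouched; a 4K (2160p) video drops to 1080p at high resolution and to 480p at express resolution.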

Those who were uploading photos and videos at original resolution, meaning no compression was applied, will not be affected by the change at all. That's because if you uploaded photos or videos at original resolution, Google already counted these against the online storage available in your account.


What is the major change to the policy?

Starting June 1, 2021, Google Photos will no longer be free as it is now. All photos and videos uploaded after June 1, 2021 will count towards the account storage. Under the earlier policy it was technically possible to keep uploading photos and videos without running out of space on a free account, because they did not count. But all that will change next year.

So why is Google making this change?

It is not surprising to see Google make the change: it has more than 1 billion users for each of its products, and providing so much free cloud storage does not seem like a feasible option.

Google itself explained in a blog post that people are uploading more content than ever before; in fact, more than 4.3 million GB are added across Gmail, Drive, and Photos every day. The company says it needs to make these changes to keep pace with the growing demand.

Further, Google's move will likely push users who were on the fence into paying for its cloud service, especially those who have become dependent on Google Photos for uploading and storing their phone's photo gallery.

The company offers paid options for extra storage under the Google One program. In India, it starts at 200GB for Rs 210 per month, with 2TB for Rs 650 per month (or Rs 6,500 per year), 10TB at Rs 3,250 per month and 20TB at Rs 6,500 per month.
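A quick back-of-the-envelope comparison shows how the per-GB price falls at higher capacities. This is a simple sketch using the India prices quoted above, and it assumes 1TB = 1,000GB (the convention storage vendors typically market with).

```python
# Per-GB monthly cost of the Google One tiers listed above
# (India pricing from the article; assumes 1TB = 1,000GB).
plans = [
    ("200GB", 200,    210),    # (name, capacity in GB, Rs per month)
    ("2TB",   2_000,  650),
    ("10TB",  10_000, 3_250),
    ("20TB",  20_000, 6_500),
]

for name, gb, rs_per_month in plans:
    print(f"{name}: Rs {rs_per_month / gb:.3f} per GB per month")
```

The 200GB tier works out to about Rs 1.05 per GB per month, while every tier from 2TB upwards comes to roughly Rs 0.33 per GB; the annual 2TB price of Rs 6,500 shaves a further ~17% off twelve months of the monthly rate.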

I don't want to pay and I have a lot of photos on Google. Does it mean I will have to delete all my earlier photos?

Google is giving some concessions. All photos and videos uploaded before June 1, 2021 will continue to remain free and will not count against the storage. So if you have somehow managed to upload more than 15GB of photos and videos to your Google Account to date, don't worry. You don't have to delete them.

But all photos and videos uploaded after June 1, 2021 will be counted against the free space Google gives you. And if you plan to keep uploading more photos and videos, it might make sense to pay for the service, given that the 15GB is divided across Gmail, Photos and Drive.

For users who have a Google One account, which is a paid account, there's nothing to worry about and nothing really changes.


Why is Google deleting content from inactive accounts?

As part of the new policy, Google will delete content from inactive accounts. Any account that has been inactive for more than two years (24 months) might find content deleted from the products where the user is inactive. So if you have not used Google Photos on your account for two years, the company will delete content from that particular product.

But Google One members who are within their storage quota and in good standing will not be impacted by this new inactivity policy, the company says. Thankfully, Google will warn you plenty of times before deleting the data, and you will have the option of downloading the content as well.

You can keep the account active by periodically visiting Gmail, Google Photos and Google Drive on the web or through a Google app, according to the company. If you have exceeded the storage limit for two years, you might also find your content being deleted. Again, Google will warn you before it deletes data from accounts that cross the storage limit.
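The two deletion triggers described above, prolonged inactivity in a product and a prolonged over-quota state, can be sketched as a simple check. This is an illustrative model of the stated policy, not Google's implementation; the 24-month window is approximated in days, and the function name is ours.

```python
from datetime import date, timedelta

# Roughly 24 months, per the stated policy.
LIMIT = timedelta(days=2 * 365)

def deletion_risks(last_product_use, over_quota_since, today):
    """Return which of the two stated triggers apply: 'inactive' if
    the product has been unused for ~2 years, 'over_quota' if the
    account has exceeded its storage limit for ~2 years.
    over_quota_since is None when the account is within quota."""
    risks = []
    if today - last_product_use > LIMIT:
        risks.append("inactive")
    if over_quota_since is not None and today - over_quota_since > LIMIT:
        risks.append("over_quota")
    return risks
```

For example, an account whose Photos library was last touched in mid-2018 and which is within quota would, as of mid-2021, trip only the inactivity trigger, and per the article it would receive several warnings before anything is deleted.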





Here are the Top 5 Encrypted Cloud Storage Services – The Mac Observer

To accompany my roundup of encrypted DNS services, I wanted to write a roundup of encrypted cloud storage services. These can be used in place of, or in addition to, services like iCloud, Google Drive, Dropbox, and OneDrive.

NextCloud is among the top private services to use because you can host it on your own server, putting control of your data in your hands instead of a company's. But you can still choose to have a company host your data for you. NextCloud is also open source.

NextCloud

Crypt.ee can store your files, photos, and notes on servers based in Estonia. It's open source, and you only need a username to get started; no personal information like a phone number or even an email address is needed to create an account. Crypt.ee is unique in that it doesn't have apps. Instead, it's a web service that you access via a browser. The company says this helps give users deniability.

Crypt.ee

Tresorit is a zero-knowledge, end-to-end encrypted service that uses Microsoft Azure data centers in Ireland and the Netherlands. Depending on the selected plan, each user gets from 200 GB up to 1 TB. One notable feature is version history. Tresorit tracks all file changes and it can restore previous versions to correct mistakes or revert changes.

Tresorit

Sync's notable feature is the ability to send files of any size to any person, even if they don't use Sync. The service also backs up your files in real time, and it's easy to recover deleted files and previous versions of files. The company has impressive, affordable storage plans. Its basic plan gives you 2TB for US$8/month (billed annually).

Sync

I'm adding Proton Drive for the sake of future-proofing. It's a service from ProtonMail that is currently in closed beta for paid users. Like Sync, the company is developing end-to-end encrypted link sharing so you can send your files to non-Proton users. I don't know when it will exit the beta, but it will likely become public in 2021. I'll be keeping an eye out.

Proton Drive


Object Matrix Joins the Active Archive Alliance – insideHPC

Boulder, Colo. December 9, 2020 The Active Archive Alliance today announced that Object Matrix has joined the organization, which promotes modern strategies to solve data growth challenges. Object Matrix joins a growing number of industry-leading storage and IT vendors that support the use of active archive solutions for data lifecycle management.

"We're pleased to welcome Object Matrix to the Active Archive Alliance," said Peter Faulhaber, Chairman of the Active Archive Alliance and President and CEO of FUJIFILM Recording Media U.S.A., Inc. "The Alliance is dedicated to bringing best-of-breed technologies together to provide users with active archive solutions for intelligent data management. Companies across the globe are embracing active archive technologies to manage the complexities of vast data volumes and realize the value of their data."

Rising data volumes are intensifying the need for efficient, cost-effective active archives, which manage data for rapid search, retrieval, and analytics. Active archiving enables organizations to gain value from retained data quickly. By leveraging an intelligent data management layer, active archives allow fast access to archival data across a virtual file system and manage data between storage systems and media types.

Object Matrix is the award-winning software company that pioneered object storage and the modernization of media archives. It enables global collaboration, increases operational efficiencies, and empowers creativity through the deployment of MatrixStore, an on-prem, hybrid and cloud storage platform. Object Matrix's unified deployment approach ensures content spans on-prem and cloud storage, while its focus on the media industry gives it a deep understanding of the challenges organizations face when protecting, processing and sharing video content.

"Now more than ever, media organizations require solutions that enable creative and production teams to self-serve access to content from work or remotely from anywhere," said Jonathan Morgan, CEO, Object Matrix. "Our MatrixStore solution delivers the inherent scalability and interoperability of object-based storage, as well as instant access to data and metadata, and our customers include some of the world's largest media organizations. With MatrixStore, organizations have protection and governance for the lifetime of any content. We are excited to join the Active Archive Alliance and help promote active archive strategies to solve today's data growth challenges."


What Is Public Cloud and Is It Right for Your Business? – G2

Cloud platforms are a necessity for any business trying to make a mark in the digital world.

The ever-increasing demand for cloud platforms stems from organizations' need to be updated with the latest data, insights and software at all times to stay relevant against the competition.

While cloud platforms offer innumerable advantages and make our lives easier, they also add complexity for a company trying to figure out its cloud adoption strategy. The sheer number of cloud service types can be quite confusing.

In this article, we'll focus on public cloud, the simplest and most convenient cloud platform available, and learn more about the advantages it has to offer and the challenges that come with it.

Public cloud computing refers to cloud migration and storage services provided by third-party cloud vendors over the internet. It gives organizations the ability to choose their resources on demand and provides them with a scalable platform and flexible payment plans.

A private cloud platform is a cloud service provided and managed internally by an organization. Private cloud solutions are highly secure and offer optimized performance benefits since they are hosted on dedicated servers and have robust firewall access protocols.

A public cloud platform, on the other hand, is a cloud service provided by external vendors where common computing and storage resources are shared among the customers with appropriate security protocols, and the resources are managed by the vendor for a fee.

On an enterprise level, both private and public cloud platforms are viable for various applications, but sticking to just one of them may hinder an optimal cloud experience. Organizations are adopting hybrid cloud solutions, which merge private and public platforms, and multicloud solutions, depending on their security and infrastructure requirements.

A hybrid cloud strategy uses public cloud infrastructure-as-a-service platforms alongside private cloud data centers for a more secure cloud experience. A multicloud strategy, on the other hand, employs a variety of cloud platforms for different applications in a single network infrastructure.

Public cloud vendors provide shared computing resources to multiple users and give them the liberty to choose the resources they deem fit for their business. It is called a multi-tenant architecture since multiple tenants (organizations) use the same resources provided by the vendor. Each user is given secure storage for their data, and the same servers may host multiple applications belonging to different users.

Public cloud architecture can be categorized according to the three most common service models, explained as follows:

Public cloud vendors provide infrastructure such as servers, storage and networking hardware to users on demand. IaaS customers also have access to various other services and functionalities from the vendors, such as maintenance, load balancing and process monitoring.

Best known examples of IaaS service providers include Amazon Web Services (AWS), Google Cloud Platform (GCP), Oracle Cloud, IBM, DigitalOcean and Microsoft Azure.

PaaS is a cloud service where the third-party vendor provides the hardware and software components users need to build and run application platforms on the cloud. PaaS users don't need to replace their entire IT infrastructure; rather, they just use the vendor's hosted infrastructure services over a web browser.

The best known examples of vendors providing PaaS services include AWS Elastic Beanstalk and Google App Engine.

SaaS is a software-on-demand cloud model, where the cloud service provider gives users access to a fully developed application created specifically for distribution. Software updates are rolled out to all users uniformly, and organizations can use their own tools with the vendor-provided application programming interfaces (APIs).

Best known examples of SaaS providers include NetSuite, Salesforce, and Concur.

Public cloud platforms have been adopted by a huge number of organizations, owing to the reduced hassle and optimization benefits they bring once a strategy is in place. Let's take a detailed look at the multiple advantages public cloud offers and gain a better understanding of its popularity in the market.

The public cloud model uses shared resources, which allows cloud vendors to provide their services at a lower price. The lack of on-premises IT infrastructure and the virtualization of existing operating systems also let users cut spending on hardware and networking assets, allowing them to save more in the long run. Cloud services have become affordable for businesses small and big since they eliminate the need for operational and maintenance costs.

Public cloud services enable users to monitor their network usage, performance capabilities and computing power in real time and plan their further cloud strategy accordingly. They need not depend on the service provider for these insights on a regular basis.

Public cloud platforms offer greater scalability to their users and give them the liberty to use more or fewer resources for every application, as needed. Both infrastructure and traffic scalability (the ability to devote adequate compute power to an unexpected increase in business traffic) are achieved quickly, owing to the huge number of computing services and resources vendors have at hand.

Cloud data analytics is a huge advantage of public cloud computing. Organizations can gather usage metrics of their cloud resources and derive business insights for a better future strategy.

In the public cloud computing model, physical resources like networking hardware and storage units, and virtual resources like virtual machines and disk snapshots, are available to all users on a pay-per-use basis. Hence, if the situation demands, all resources can be pooled to serve a single user with high performance, lower latency and higher storage needs.

Organizations employing public cloud services are not responsible for the maintenance of the resources provided by the vendors. Both operational and maintenance costs and trouble are taken care of by the vendor as part of the agreement, leading to a stress-free cloud experience for users. Since public cloud deployments are done through a web facility, disaster recovery in case of an unforeseen calamity is also handled by the cloud management provider.

Since public cloud services are externally sourced, they come with their own share of security implications and control issues. Let us now dive into the cons of public cloud and what companies need to take into consideration while employing this strategy.

Opting for a public cloud service means the organization will have less control over physical hardware standards, automation, access management, IT management, and technical support. Since all of these services are outsourced to the external vendor, you might have to adjust quite a few business processes to ensure a smooth transition to the cloud.

Since multiple cloud users are hosted on the same platform by the cloud service provider, data vulnerability remains a pressing issue with public cloud providers.

While vendors are getting better with their cloud security practices and promises, that does not hide the fact that organizations are trusting an external vendor with their data and processes, and remain more susceptible to a cyber attack than they would be on a private network.

Since public cloud platforms are hosted over the web, with no dependency whatsoever on on-premises infrastructure, a connectivity issue can spell trouble for the entire cloud platform.

One way to mitigate this is to ensure that you have strong network connectivity at all times, irrespective of physical conditions.

For organizations with complex network dependencies between applications or an intertwined business model, public clouds may pose a headache because of the rigidity of the multi-tenant architecture.

Businesses cannot customize and optimize the use of resources on demand; they can only make the best of what the service provider has to offer. This one-size-fits-all methodology might be an issue for companies prone to rapid changes in their business and cloud strategy.

When the GDPR rolled in, every organization doing business with the EU pushed harder than ever to get its compliance checked. If an organization using public cloud services needs to clear a compliance hurdle for a particular client or partner, vendors might not be fast enough to make all their resources compliant with every incoming change.

Cloud uptime is a critical consideration when adopting a cloud strategy. In case of a cloud outage on the vendor's side, organizations will have to endure unplanned downtime affecting their business hours.

While public cloud may not be the ideal solution for every business migrating to the cloud, it is certainly a strong contender for the many applications with simpler cloud storage and security needs.

Migrating to and running your business on the cloud can be made easier by employing multiple cloud environments for different applications as per their individual requirements.


Looking into the crystal ball: Tech predictions for 2021 – ITProPortal

2020 was a year of evolution for many sectors of the tech industry due to the reality of the Covid-19 pandemic. Many tech leaders had to pivot their strategies by shifting to a largely remote-work model and assessing new markets affected by unforeseen economic shifts.

The tech industry has seen major changes this year, and because of that many experts are predicting new or evolved trends in 2021. Below, multiple tech experts highlight their top predictions for the technology industry in the new year.

Krishna Subramanian, president and COO at Komprise:

In 2021, cloud storage costs will begin to overtake compute costs. For the past three years, cloud cost optimization has been a key priority for businesses. In fact, Gartner predicted that 80 percent of businesses would outspend their cloud budgets in 2020. The bulk of these costs so far has been in compute, since cloud object storage is relatively cost effective. But this is changing, since cloud file storage is typically ten times more expensive than S3, and file data is far more voluminous than block data, all of which underscores the importance of using cloud file storage only when you need it. In 2021, enterprise IT organizations will begin adopting cloud data management solutions to understand how cloud data is growing and manage its lifecycle efficiently across the various cloud file and object storage options.

Anshu Sharma, CEO and co-founder, Skyflow:

Every large company is on a long-term transition to digital and cloud, so they can effectively compete against the likes of Amazon and the Silicon Valley startups who are aiming for them. Companies like Nike, JP Morgan Chase, and Walgreens have been trying to transform, and in 2020 they all got an unintended boost in their push to the cloud, because they had to. The Fortune 500, having seen relative success with cloud and digital, are not going back. They are all doubling down. 2021 will be the year of the digital double down.

Patrick Harr, CEO, SlashNext:

Over the last 30 days, 10 percent of company users were phished, according to live data we compiled across more than 100 large and mid-sized enterprises. Every day, SlashNext Threat Labs detects 21,000 new phishing attacks, almost double the number of threats from a year ago, and SlashNext Threat Labs is seeing an alarming 50-75 percent of attacks getting past conventional phishing defenses to compromise enterprise networks. So, if you think your current defenses will keep you safe, think again. And in 2021, we anticipate this problem will get much, much worse.

Stowe Boyd, analyst, Gigaom:

The pandemic has accelerated the adoption of technologies that were popular before, but which are now essential. One example has been the combination of work chat tools and video conferencing, as typified by Microsoft Teams and Slack. Microsoft has seen a dramatic uptick in usage, and the release of Google's new take on the former G Suite, now known as Google Workspace, which also integrates work chat and video conferencing, represents another challenge for Slack. As the two leaders in what we might think of as 'business operating systems', Google and Microsoft present a difficult challenge for Slack, since companies will not want to pay extra for functionality they already have access to in their communications and file storage platforms.

Saad Siddiqui, principal, Telstra Ventures:

The unrelenting pace of open source innovation will continue in 2021, particularly in data analytics and data infrastructure, leaving many Fortune 1000 businesses and other SMBs struggling to keep pace with, and integrate, open source innovations. Unlike large tech players, who can afford to hire an army of engineers to constantly change things and add their own herbs and spices to improve their operations, most businesses and newer vendors don't have as many talented engineers or can't hire more to keep pace with change. To address this talent wall, we're seeing the rise of a hybrid open source business model, whereby open source data analytics companies like Incorta and infrastructure companies like Rancher Labs monetize closed source, out-of-the-box capabilities that deliver open source innovations while requiring less time and fewer resources for enterprises to derive value.

Cornelia Davis, CTO, Weaveworks:

2021 will see the emergence of common distributed operational patterns widely implemented across all industries, and with the coming of 5G and the edge, this could not be timelier. There are several signals that foreshadow this, such as the adoption of GitOps as a de facto standard and best practice for operating Kubernetes and its workloads. IT systems will not only enjoy greater resilience and security, but having GitOps in place also lays the foundation for massive scalability and growth.

Yiannis Antoniou, analyst, Gigaom:

Responsible AI/ML will become the hottest topic in the cloud ML industry. Given society's increased emphasis on combating unfairness and bias, and the overall interest in better interpretability and explainability of machine learning models, cloud providers will invest in and enhance their ML offerings to deliver a full suite of responsible AI/ML capabilities that aim to satisfy and reassure regulators, modelers, management and the market on the fair use of ML. Meanwhile, AI/ML will continue to see explosive growth and usage across the whole industry, with significant enhancements in ease of use and UX combining with a responsible AI/ML framework to drive the next growth spurt of this sector.



How can the cloud industry adapt to a post-COVID world? – IT PRO

One of the unexpected silver linings of the global coronavirus crisis has been the rapid growth the cloud industry has enjoyed. The shift to remote working during the various lockdowns that took place over the course of 2020 was largely, if not entirely, facilitated by cloud services. This has meant that while other sectors have struggled and there has been an overall economic downturn, cloud companies have performed relatively well financially.

Although they wouldn't want to characterise the past few months as profiting from the pandemic, the likes of Zoom and Microsoft Teams have surged in usage and revenue, with the latter surpassing 44 million users as early as March. This period has also accelerated many digital transformation projects, with engineers proving more than capable of carrying out projects at pace and scale, including in the traditionally lethargic public sector. This success, however, has been driven entirely by the effects of the pandemic, forcing the industry to question whether, and how, it can adapt once its services are no longer as highly sought after.

While we all rejoiced at the news that a potential COVID-19 vaccine may be available for distribution before the end of the year, shares in a handful of companies dropped sharply in response, including a reduction of at least 15% in the valuation of Zoom.

Whether things go back to the way they were, or cloud companies continue to play a more pivotal role than ever, is yet to be determined. For independent cloud consultant Danielle Royston, the goal of going back to normality in 2021 is misplaced. "There's no point wasting time and energy trying to return to the halcyon days of pre-COVID," she says. "Let's focus instead on some of the positive disruptions we've seen this year. In all the companies I've been at, I've promoted and in some cases fully converted to remote working. I saw this as the inevitable direction that work and society were going, as the cloud computing tools were already there. And it makes sense: a better quality of life for employees, ease of collaboration, cutting the costs of business travel."

This is a trend that Tom Wrenn, cloud investment expert and partner at private equity firm ECI Partners, predicts will continue well into next year, telling ITPro that COVID-19 forced many companies into rapidly adopting cloud-based operations. These, driven by government-enforced lockdowns, allowed them to continue operating remotely. "Now, having made a basic shift to cloud-based systems," he adds, "2021 will be the year of full cloud adoption, with businesses starting to optimise all of its benefits; for example, data analytics and AI. If rapid investment was needed in 2020, next year businesses will want to see a return on that investment and will expect more from their cloud computing providers."

Although the recent transition to remote working is a trend sparked by COVID-19, the consensus is that it's the beginning of a wider cultural shift. Former IBM boss Ginni Rometty is among the latest to suggest as much, claiming mass remote working will continue in some form as part of a broader hybrid model in future. This may involve companies keeping some physical presence while establishing the infrastructure and equipment to allow workers to work remotely as and when desired.

Cisco CTO for UK and Ireland, Chintan Patel, agrees, telling IT Pro that remote working gained widespread acceptance during COVID-19, even in organisations where it was unthinkable before. This means cloud and software as a service (SaaS) tools will remain a crucial part of many setups, even though businesses will mostly return to a form of hybrid model. "For remote working, cloud plays a central role; think secure cloud-based collaboration, accessing cloud-based business applications, and extending the security perimeter to thousands of devices," he explains. "It's important to note, though, that cloud-based consumption models are not limited to remote working only. As for those returning to the offices, we see technology can help make the workplace more secure and efficient. As and when companies prepare for a return to the office, they also need to optimise their space, address worker concerns about sanitation and social distancing, and plan how to communicate policies and information clearly."

Technology will play a major part in instigating the changes needed in future, with a key role to play for many of the firms that have enjoyed success during the pandemic. While demand for software such as video conferencing platforms may not be as sky-high as it was at the beginning of the pandemic, Wrenn argues the next big step is how cloud companies can eat further into the market share enjoyed by the traditional telephone industry. "More and more businesses are using Microsoft Teams or Zoom to interact," he explains, "when previously they would have used conference lines or even called a person directly due to it being more convenient. Cloud providers need to think about how they can make the most of this opportunity as the way in which people interact changes."

To some extent, we should all consider ourselves lucky the global pandemic happened when it did, given that cloud computing has only recently become as advanced as it is now. Thus, rather than profiting from the pandemic, this period has been the making of the industry. After all, cloud storage, processing, and compute facilities are already set up, and ready to expand easily and automatically as and when enterprises need, according to Royston, who claims this wouldn't have been the case ten to 15 years ago. "It would've been an epic failure and caused even more disruption and long-term damage to global economies. This year, white-collar workers being able to quickly adapt to working from home in their millions is part of what's helped many sectors stay afloat. And it's because of the investment and ongoing work of hyperscalers over the past few years that businesses can support workers in doing this."

Connectivity, too, will continue to grow as organisations' reliance on SaaS tools increases, Patel adds, with firms expecting more from these companies beyond provision. With cloud infrastructures becoming increasingly diverse, especially as applications add more layers of complexity, businesses will be looking to strengthen their infrastructure. This will be achieved by gaining deeper visibility across their IT estates, ensuring workloads have continuous access to required resources, and running systems that connect and protect at scale - from on-prem to hybrid cloud configurations. This is in addition to using technologies such as machine learning to give customers tools to manage their ever-growing data lakes. This is where providers can step in to guide customers on their migration journeys.

As such, the greatest challenge facing cloud providers, in light of the above, will largely be customer retention, according to Tom Wrenn. "If we take online meeting services as an example, historically businesses would have had to invest in a service, such as [Cisco] WebEx, which is often costly and comes with a lot of equipment," he says. "Today, however, businesses are using Zoom and Teams for this and can just turn services on and off with little upfront investment." This means that customers aren't locked into providers in the way they once were. As a result, cloud computing providers will need to over-deliver for their clients, retaining a high level of customer service as well as ensuring that service levels don't decline as they undergo a huge period of growth.

See more here:
How can the cloud industry adapt to a post-COVID world? - IT PRO

The National Institute for Health Research on connecting through cloud to fight Covid-19 – ComputerWeekly.com

Working in partnership with the National Health Service, and funded by the Department of Health and Social Care (DHSC), the National Institute for Health Research (NIHR) collaborates with universities, local governments, research teams and the general public to carry out life-changing medical research projects.

"The NIHR is one of those hidden gems inside the NHS," Justin Riordan-Jones, head of systems and information at the DHSC, tells Computer Weekly. "And our job usually means people's lives get better in about 10 years' time, as a result of the work we do."

This year, though, the organisation has been actively working to improve the lives of the nation in a much shorter timeframe, as the NIHR and its stakeholders have worked tirelessly to help bring the Covid-19 coronavirus pandemic under control and help save lives.

"The past 10 months have really highlighted just how important and vital research is to the health and wealth of the nation, and we were delighted to be able to be part of rolling out the vaccines, rolling out research and all that good stuff," he says.

The onset of the pandemic in early 2020 prompted a rapid shift in priorities in the NIHR, as it set about coordinating 50 urgent public health studies (including two focusing on possible vaccines) into the effects of Covid-19.

The overarching aim of this work was to gather as much clinical and epidemiological evidence as possible to inform the UK governments response to the pandemic, while supporting efforts to create new diagnostic systems, treatments and vaccines to curb the spread of the novel coronavirus.

Research initiatives on this kind of scale typically take months, even years, to get off the ground, but the NIHR was able to do so this time around in a matter of weeks, with the help of its Google Cloud-based Digital Hub.

Described by Riordan-Jones as the "fundamental backbone" to the NIHR's operations, the Digital Hub provides the organisation's 8,000 employees with access to Google's portfolio of cloud-based communication, collaboration and productivity tools, formerly known as G Suite.

The setup was originally devised in 2014 to bring a little more order to the way the NIHR worked, both in-house and with its external research partners and stakeholders, by replacing the patchwork of data repositories and collaboration tools they relied on to work together.

"We wanted a solution that would empower us to operate as a single corporate entity over multiple locations, multiple platforms and multiple scenarios," says Riordan-Jones. "Our [previous] system worked in the beginning, but the lack of consistency was starting to slow down our progress."

Working with technology consultancy PA Consulting, the NIHR embarked on finding a suitable replacement for this patchwork of productivity tools, before deciding to press ahead with the deployment of Google G Suite, which has since been rebranded as Google Workspace.

"Fundamentally, choosing Google came down to its development pathway," he says, following a market evaluation of four similar products, which resulted in Google Workspace emerging as a front runner along with one other.

"At that stage, there was very little functionality difference between them. What was hugely different was people's acceptance of how well [the two systems] worked, and the ease of operation, so they could migrate swiftly [to Google]," he adds.

"But it was the fact we could already see a development pathway for two to three years at that point, which we knew would be beneficial to us. So it was the vision and clarity of where Google was going with the product that made us choose it."

The Google Workspace deployment provided the organisations employees with access to their own corporate email addresses for the first time via Gmail, as well as video-conferencing tools in the form of Google Meet, and cloud-based document storage and collaboration through Google Drive.

"Rather than having to host multiple systems in different places and locations, and a [heterogeneous] technology stack that we're forever running around trying to keep up to date, we have something in place now that is much more compartmentalised and much easier to maintain," says Riordan-Jones.

The setup has also served to give its employees a greater sense of corporate identity, as well as make the productivity portion of its IT estate easier to manage and control.

"We have seen a much greater corporate approach than we ever had before, so people now act as if they are the NIHR and not body x inside the NIHR, and we have reassurance and know that the hub is behaving the way we want it to," he continues.

"We also know we are benefiting from the levels of security that Google wraps around it, and we have greater control and visibility over how it's working [compared with the previous system]."

Relying on a single system to fulfil its collaboration and productivity requirements has unlocked sizeable cost savings for the NIHR as well.

"When we rolled this out in 2014-15, we were the biggest public sector implementation of G Suite technology at that stage, although we've since been overtaken because we've proved how well it works," says Riordan-Jones.

"Over the years, [that has unlocked] £10m to £15m worth of savings, which has gone into research rather than technology endeavours."

Even so, six years is a long time in tech, and the Digital Hub has been subject to numerous tweaks to its functionality in that time, based on user feedback and the NIHRs wider organisational goals, says Riordan-Jones.

"We don't sit there and say 'this is the definitive version'," he says. "We are constantly looking at how it needs to be improved, based on the feedback we get from the users and [our] interpretation of where we need to be going for our digital strategy."

The latest iteration of the Digital Hub has been in place since March 2020, with its original functionality bolstered by the inclusion of Google Cloud Search, which integrates with Google Workspace to make it faster and easier for users to locate data stored within its entire infrastructure.

The NIHR's workforce is distributed across various offices, universities and hospitals in the UK, which had led to some data being unintentionally siloed and isolated within these locations, making it inaccessible to some employees who needed access to it.

Over the course of 16 weeks, this data sharing pain point was addressed through a Google Cloud Search-focused redesign of the Digital Hub, which has been instrumental in enabling the NIHR to rapidly refocus and coordinate its Covid-19 research efforts.

The long-standing productivity and collaboration functionality of the Digital Hub has also come into its own during the pandemic, as employees have grappled with working remotely.

In fact, its employees have taken to this new way of working like ducks to water, with the NIHR reporting a 379% uptick in the use of Googles cloud-based video-conferencing service Meet during the first two months of the first UK lockdown. Furthermore, use of Google Drive, the search giants online storage and document collaboration offering, was up 198%.

With the reworked Digital Hub now firmly embedded in the NIHR, Riordan-Jones says the scene is set for the organisation to push its digital ambitions further than ever before, following the appointment of John Nother as its chief digital officer in March 2020.

One of Nother's top priorities since joining the organisation has been to set out a five-year digital strategy for the NIHR, which is geared towards streamlining data sharing processes within the organisation through the adoption of what Riordan-Jones describes as a "do once and share" approach.

As an example, Riordan-Jones cites the process research organisations have to go through to secure ethical approval from the Health Research Authority before they can start work, which requires them to submit details about the nature of the project they want to embark on.

"So you tell this organisation all that information, and then you come to the NIHR, which helps you run that research inside the NHS, and we ask for all that same information again," he says.

"It is very simple to say we should be sharing that data, but unfortunately, due to legal considerations and other factors, it is not easy to do so at the moment. But we're working our way through it, so that somebody who arrives at the beginning of the research journey only gets asked the absolute questions that need to be asked, rather than having to repeat themselves over and over," he adds.

All this repetition slows down the pace at which the NIHR and its stakeholders are able to work, but, as the organisation's rapid response to the pandemic has shown, it is possible to modify such processes to get where it needs to be faster.

"Because of the way the response to the pandemic was organised, we were able to modify processes in some respects to do that. And we have learned a lot as a result about what we can do, how we can do it, and where we can do it," says Riordan-Jones.

"Hopefully, we're going to take some of the lessons learned from this exercise now and roll those forward, because we have proof, even in unfortunate circumstances, that it works and we can roll that forward."

View original post here:
The National Institute for Health Research on connecting through cloud to fight Covid-19 - ComputerWeekly.com

Zoom recordings will be deleted after six months – The Daily Evergreen

Users can turn on a setting to notify them seven days before a recording is permanently deleted

LAUREN PETTIT

Before this change, the recordings were stored for nine months. Now, they are saved for six months to decrease the number of Zoom recordings stored in the WSU cloud.

WSU Zoom recordings stored in the cloud will be deleted if they are more than six months old.

Prior to this new policy, Zoom recordings were deleted after nine months, said Corey Oglesby, communications specialist and trainer for WSU Information Technology Services. The new policy went into effect Dec. 8.

Oglesby said the six-month timeline was necessary to help decrease the number of Zoom recordings stored in the WSU cloud storage space. The university was approaching the maximum number of recordings that can be stored in the cloud.

"Zoom recordings dramatically increased in response to COVID-19, when we all went online in March," he said.

In March, the number of newly registered WSU Zoom users increased from 2,500 to 31,000, Oglesby said.

"That's a crazy overnight increase," he said. "Suddenly there were 30,000-40,000 meetings taking place every week, and a lot of times there were more than 600 meetings happening at the same time."

Oglesby said after recordings are deleted from cloud storage, they are moved to the user's trash bin in their Zoom account. Users then have 30 days to retrieve and download those recordings to their computers before they are permanently deleted.

Users can turn on a setting in Zoom that will notify them seven days before their cloud recordings are deleted from the trash bin, he said.

Zoom users can choose to be notified in the Email Notifications section in the settings for meetings, according to the WSU Zoom support team's self-help article.
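The policy's moving parts — six-month retention, a 30-day trash window, and a seven-day warning — reduce to simple date arithmetic. The sketch below is illustrative only (it is not a Zoom API; the function name and the 182-day approximation of "six months" are assumptions made for the example):

```python
from datetime import date, timedelta

# Hypothetical sketch of the retention timeline described in the article:
# a recording older than roughly six months is moved to the trash bin,
# the user can be notified 7 days before permanent deletion, and the
# recording is purged 30 days after it enters the trash.
RETENTION_DAYS = 182   # approximation of "six months"
TRASH_DAYS = 30
NOTICE_DAYS = 7

def retention_timeline(recorded_on: date) -> dict:
    moved_to_trash = recorded_on + timedelta(days=RETENTION_DAYS)
    purged = moved_to_trash + timedelta(days=TRASH_DAYS)
    notified = purged - timedelta(days=NOTICE_DAYS)
    return {"trash": moved_to_trash, "notice": notified, "purged": purged}

# Example: a recording made the day the policy took effect
timeline = retention_timeline(date(2020, 12, 8))
```

Under these assumptions, a user effectively has about seven months from the recording date before the file is gone for good.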

Continued here:
Zoom recordings will be deleted after six months - The Daily Evergreen

Global Trend Expected to Guide Private Cloud Storage Market from 2020-2026: Growth Analysis by Manufacturers, Regions, Type and Application – Murphy’s…

The Private Cloud Storage Market report comprises a competitive analysis with a focus on key players and participants of the Private Cloud Storage industry, covering in-depth data related to the competitive landscape, positioning, company profiles, key strategies adopted, and product profiling, with a focus on market growth and potential.

The report includes a concise presentation of the product or service, along with the various trends and factors affecting the Private Cloud Storage market. These variables have helped determine the behaviour of the market during the forecast period and enabled our specialists to make effective and precise predictions about the market's future.

The report covers a forecast and an analysis of the Private Cloud Storage Market on a global and regional level. Historic data is given for 2013-2019 and the estimated period is 2020-2026, based on revenue.

The Private Cloud Storage market was valued at US$ 6,603.3 Mn in 2018 and is expected to reach US$ XX Mn by 2026, at a CAGR of 16% over 2020-2026.
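As a sanity check on what a 16% compound annual growth rate implies, the standard CAGR formula (value_end = value_start × (1 + r)^n) can be applied to the stated 2018 base. The 2026 figure is redacted as "US$ XX Mn" in the report, so the projection below only illustrates the arithmetic; it is not the report's own estimate:

```python
# Illustrative compounding of the stated 2018 base (US$ 6,603.3 Mn) at the
# report's 16% CAGR. The report's actual 2026 figure is not disclosed in
# this excerpt, so this number is an assumption-driven sketch, not a quote.
def project(base: float, cagr: float, years: int) -> float:
    """Compound `base` at annual rate `cagr` for `years` periods."""
    return base * (1 + cagr) ** years

base_2018 = 6603.3                       # US$ Mn, per the report
projected_2026 = project(base_2018, 0.16, 2026 - 2018)  # ~21,648 US$ Mn
```

Compounding from 2018 rather than from the 2020 forecast start would overstate the report's number slightly, which is another reason to treat this only as a ballpark check.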

Request for a Sample Copy of Private Cloud Storage Market Report @ https://www.alltheresearch.com/sample-request/412

Competitive Landscape Covered in Private Cloud Storage Market Report:

This report includes a study of the marketing and development strategies, along with the product portfolios of the leading companies. The Private Cloud Storage market report elaborates insights on the Market Diversification (Exhaustive information about new products, untapped regions, and recent developments), Competitive Assessment (In-depth assessment of market shares, strategies, products, and manufacturing capabilities of leading players in the Private Cloud Storage market).

Top Players Covered in Private Cloud Storage Market Study are:

Private Cloud Storage Market Segmentation

Private Cloud Storage market is split by Type and by Application. For the period 2020-2026, the growth among segments provides accurate calculations and forecasts for sales by Type and by Application in terms of volume and value. This analysis can help you expand your business by targeting qualified niche markets.

Market Segmentation by Type:

Market Segmentation by Applications:

For more Customization in Private Cloud Storage Market Report: https://www.alltheresearch.com/customization/412

Global Private Cloud Storage Market: Regional Segmentation

Impact of COVID-19 on Private Cloud Storage Market:

The report also covers the effect of the ongoing worldwide pandemic, i.e. COVID-19, on the Private Cloud Storage Market and what the future holds for it. It offers an analysis of the pandemic's impact on the international market. The pandemic has abruptly disrupted demand and supply chains. The Private Cloud Storage Market report also assesses the economic effect on firms and financial markets. Futuristic Reports has gathered advice from several delegates of the business and has engaged in secondary and primary research to provide customers with strategies and data to combat industry struggles during and after the COVID-19 pandemic.

For More Details on Impact of COVID-19 on Private Cloud Storage Market: https://www.alltheresearch.com/impactC19-request/412

Research Objectives of Private Cloud Storage Market Research:

The report is useful in providing answers to several critical questions that are important for the industry stakeholders such as manufacturers and partners, end-users, etc., besides allowing them in strategizing investments and capitalizing on market opportunities.

Key target audience:

Buy Full Report on Private Cloud Storage Market @ https://www.alltheresearch.com/buy-now/412

About AllTheResearch:

AllTheResearch was formed with the aim of making market research a significant tool for managing breakthroughs in the industry. As a leading market research provider, the firm empowers its global clients with business-critical research solutions. Our study of numerous companies that rely on market research and consulting data for their decision-making made us realise that it's not just sheer data points, but the right analysis, that creates a difference. While some clients were unhappy with the inconsistencies and inaccuracies of data, others expressed concerns over their experience in dealing with the research firm. Also, a same-data-for-all-business-roles approach was making research redundant. We identified these gaps and built AllTheResearch to raise the standards of research support.

For All Your Research Needs, Reach Out to Us:

Contact Name: Rohan S.

Email: [emailprotected]

Phone: +1 (407) 768-2028

Go here to read the rest:
Global Trend Expected to Guide Private Cloud Storage Market from 2020-2026: Growth Analysis by Manufacturers, Regions, Type and Application - Murphy's...

What is Artificial Intelligence (AI)? | IBM

Artificial intelligence enables computers and machines to mimic the perception, learning, problem-solving, and decision-making capabilities of the human mind.

In computer science, the term artificial intelligence (AI) refers to any human-like intelligence exhibited by a computer, robot, or other machine. In popular usage, artificial intelligence refers to the ability of a computer or machine to mimic the capabilities of the human mind (learning from examples and experience, recognizing objects, understanding and responding to language, making decisions, solving problems) and to combine these and other capabilities to perform functions a human might perform, such as greeting a hotel guest or driving a car.

After decades of being relegated to science fiction, today, AI is part of our everyday lives. The surge in AI development is made possible by the sudden availability of large amounts of data and the corresponding development and wide availability of computer systems that can process all that data faster and more accurately than humans can. AI is completing our words as we type them, providing driving directions when we ask, vacuuming our floors, and recommending what we should buy or binge-watch next. And it's driving applications, such as medical image analysis, that help skilled professionals do important work faster and with greater success.

As common as artificial intelligence is today, understanding AI and AI terminology can be difficult because many of the terms are used interchangeably; and while they are actually interchangeable in some cases, they aren't in others. What's the difference between artificial intelligence and machine learning? Between machine learning and deep learning? Between speech recognition and natural language processing? Between weak AI and strong AI? This article will try to help you sort through these and other terms and understand the basics of how AI works.

The easiest way to understand the relationship between artificial intelligence (AI), machine learning, and deep learning is as follows:

Let's take a closer look at machine learning and deep learning, and how they differ.

Machine learning applications (also called machine learning models) are based on a neural network, which is a network of algorithmic calculations that attempts to mimic the perception and thought process of the human brain. At its most basic, a neural network consists of the following:

Machine learning models that aren't deep learning models are based on artificial neural networks with just one hidden layer. These models are fed labeled data: data enhanced with tags that identify its features in a way that helps the model identify and understand the data. They are capable of supervised learning (i.e., learning that requires human supervision), such as periodic adjustment of the algorithms in the model.

Deep learning models are based on deep neural networks: neural networks with multiple hidden layers, each of which further refines the conclusions of the previous layer. This movement of calculations through the hidden layers to the output layer is called forward propagation. Another process, called backpropagation, identifies errors in calculations, assigns them weights, and pushes them back to previous layers to refine or train the model.

While some deep learning models work with labeled data, many can work with unlabeled data, and lots of it. Deep learning models are also capable of unsupervised learning: detecting features and patterns in data with the barest minimum of human supervision.
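The forward propagation and backpropagation loop described above can be sketched in a few lines of NumPy. This is a toy, single-hidden-layer illustration on made-up data; the layer sizes, learning rate, and labels are arbitrary assumptions for the example, not any production model:

```python
import numpy as np

# Toy supervised example: learn whether the sum of 3 features is positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                             # 8 samples, 3 features
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)    # labels (tags)

W1 = rng.normal(scale=0.5, size=(3, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):
    # Forward propagation: calculations flow through the hidden layer
    # to the output layer.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backpropagation: the output error is pushed back through the
    # layers to adjust (train) the weights.
    err_out = (out - y) * out * (1 - out)
    err_hid = (err_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ err_out)
    W1 -= lr * (X.T @ err_hid)

loss = float(np.mean((out - y) ** 2))   # mean squared error after training
```

A deep learning version of the same sketch would simply stack additional hidden layers, with the error propagated back through each one in turn.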

A simple illustration of the difference between deep learning and other machine learning is the difference between Apples Siri or Amazons Alexa (which recognize your voice commands without training) and the voice-to-type applications of a decade ago, which required users to train the program (and label the data) by speaking scores of words to the system before use. But deep learning models power far more sophisticated applications, including image recognition systems that can identify everyday objects more quickly and accurately than humans.

For a deeper dive into the nuanced differences between these technologies, read "AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What's the Difference?"

Weak AI, also called Narrow AI or Artificial Narrow Intelligence (ANI), is AI trained and focused to perform specific tasks. Weak AI drives most of the AI that surrounds us today. "Narrow" is a more accurate descriptor for this AI, because it is anything but weak; it enables some very impressive applications, including Apple's Siri and Amazon's Alexa, the IBM Watson computer that vanquished human competitors on Jeopardy, and self-driving cars.

Strong AI, also called Artificial General Intelligence (AGI), is AI that more fully replicates the autonomy of the human brain: AI that can solve many types or classes of problems and even choose the problems it wants to solve without human intervention. Strong AI is still entirely theoretical, with no practical examples in use today. But that doesn't mean AI researchers aren't also exploring (warily) artificial super intelligence (ASI), which is artificial intelligence superior to human intelligence or ability. An example of ASI might be HAL, the superhuman (and eventually rogue) computer assistant in 2001: A Space Odyssey.

As noted earlier, artificial intelligence is everywhere today, but some of it has been around for longer than you think. Here are just a few of the most common examples:

The idea of 'a machine that thinks' dates back to ancient Greece. But since the advent of electronic computing (and relative to some of the topics discussed in this article) important events and milestones in the evolution of artificial intelligence include the following:

IBM has been a leader in advancing AI-driven technologies for enterprises and has pioneered the future of machine learning systems for multiple industries. Based on decades of AI research, years of experience working with organizations of all sizes, and on learnings from over 30,000 IBM Watson engagements, IBM has developed the AI Ladder for successful artificial intelligence deployments:

IBM Watson products and solutions give enterprises the AI tools they need to transform their business systems and workflows, while significantly improving automation and efficiency. For more information on how IBM can help you complete your AI journey, explore IBM's portfolio of managed services and solutions.

Sign up for an IBMid and create your IBM Cloud account.

View original post here:
What is Artificial Intelligence (AI)? | IBM
