
Amazon, Google, Microsoft: Who had the best year in cloud in 2019? – The Register

Cloud computing in 2019 continued to dominate IT

Analysis Three technology giants continued to dominate cloud computing in 2019, with each bringing in some interesting tools to play with as they sprawled over smaller players.

The big picture did not change radically in 2019. All of the big three cloud providers (or big four, if you count Alibaba's growth in East Asia) continued to grow at a dramatic pace. In the quarter ending September 30, 2019, AWS reported 34 per cent revenue growth and Microsoft Azure 59 per cent growth compared to the same quarter a year earlier. Google did not offer an exact figure for the quarter but said (PDF) that "Other revenues for Google were $6.4bn, up 39 per cent year-over-year, once again fuelled by cloud."

AWS remains the largest IaaS (Infrastructure as a Service) provider by some distance. How much though? Gartner said in July that AWS had a 47.8 per cent share, ahead of Azure at 15.5 per cent and Google at 4.0 per cent. Canalys in October put AWS at 32.6 per cent, Azure at 16.9 per cent and Google at 6.9 per cent. Most seem to agree that while AWS is still growing fast, its market share overall is slipping just a little.

Analyst Canalys estimates cloud market share, October 2019

Measurement is difficult for several reasons. One is that IaaS is a vague term. A cloud-hosted virtual machine (VM) is undoubtedly IaaS, but the cloud vendors are keen to sell their premium services, which stray towards PaaS (Platform as a Service) or even SaaS (Software as a Service). Microsoft's share leaps ahead if you chuck in everything Office 365, since AWS has little in the way of SaaS, and this matters because it is the synergy between Office 365, Azure and on-premises that drives Microsoft's growth overall. Another issue is that neither Microsoft nor Google breaks out IaaS revenue, preferring to state percentage growth (when it suits them).

What happened in 2019? There is a constant outpouring of product announcements from all the cloud vendors, most of them not very interesting, but still a regular flow of significant news. If you could pick just one thing though, what would it be?

For AWS, that one thing is probably Outposts, even though it was first previewed in late 2018. Outposts is the AWS experience but physically on-premises, solving both latency and compliance issues, while also being an expensive way to run your servers. Canalys estimates that AWS will be a top four on-premises server vendor in the US and Western Europe within three years. In the past it has been Microsoft which led the way in hybrid cloud, with technology like Azure Stack and Azure AD Connect. Outposts is not quite the same thing, but it is hybrid cloud, and looks like a tidy solution. See here for our chat with an AWS architect about its significance.

Over at Google Cloud, the biggest thing must be Anthos, announced in April, or perhaps the technology behind it: Kubernetes (K8s), the container orchestration platform which is open source but was invented by Google. Anthos wraps K8s into a hybrid cloud solution which lets you run applications on premises, on GCP, or even on other public clouds. K8s has huge momentum and represents both GCP's biggest technical advantage and an awkward technology for AWS to navigate, even though AWS claims to be the biggest K8s cloud provider. No surprise that AWS CEO Andy Jassy told us, regarding K8s, that "I don't believe in one tool to rule the world."

What about Microsoft? It was in late 2018 that the company completed its acquisition of GitHub, but this was perhaps the key point of interest in 2019. In particular, Microsoft has done interesting work with devops, and the general availability of both GitHub Actions and GitHub Actions for Azure, which let developers automate everything from checking in code all the way to deploying containerised applications, web applications or serverless applications, is potentially a strong driver for Azure cloud adoption.
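
To make the check-in-to-deployment pipeline concrete, here is a minimal sketch of what such a workflow can look like. The app name, secret name and trigger branch are illustrative assumptions, not details from the article; the `actions/checkout`, `azure/login` and `azure/webapps-deploy` actions are the published ones.

```yaml
# Hypothetical minimal workflow: on every push, deploy the repository
# contents to an Azure Web App. Names below are placeholders.
name: build-and-deploy
on:
  push:
    branches: [master]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Log in to Azure
        uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}   # service principal JSON
      - name: Deploy to Azure Web App
        uses: azure/webapps-deploy@v2
        with:
          app-name: my-sample-app   # hypothetical app name
          package: .
```

A real pipeline would typically add build and test steps before the deploy job, but the shape is the same: a push event triggers the run, and the Azure actions handle authentication and deployment.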

We visited cloud events from all of the big three this year. At GCP Cloud Next in London, the company impressed with its engineering excellence, though the platform feels small in scope compared to its rivals. Microsoft Ignite showed the company's skill in packaging its clouds into services that are easy to adopt, though there are always awkward corners, like what happened when we asked about Azure and Intel speculative execution bugs. And AWS re:Invent was monstrously large, with the impressive Outposts announcement, but also showing signs of agitation, not only with respect to K8s, but also indignation at Microsoft winning the US Department of Defense JEDI contract.

Who had the best 2019 in cloud? It is hard to pick a winner: maybe Microsoft, although AWS seems more promising for 2020, and Google is likely to continue nibbling at the market share of both.

More of the same in 2020? Probably, but some are sounding a note of scepticism. Blogger Cal Paterson remarked on the "Amazon premium", noting that all of the big three are much more expensive than smaller hosting providers, and referencing what happened when PHP inventor Rasmus Lerdorf compared VMs from numerous providers. What the big three public clouds offer is amazing in its scope, but for sheer value, look elsewhere.


Recovery In The Cloud: An Online Alternative To Face-To-Face AA Meetings – Forbes

I'd like to start the year and the decade on a positive note. My beat at Forbes is the cloud, social media, and where the two intersect. What I've seen over the last ten years, before and during my tenure here: the story of people-driven online communication has transformed from one of hope to one of despair. With today's wild-west atmosphere in the social media sphere, it's easy to miss what's working, particularly in well-moderated forums designed to leverage some of the original promises and benefits of social media.

Today, on the first of January, when many people are waking late with hangovers and preparing to commit to New Year's resolutions, I am thinking of one type of forum: online alternatives to face-to-face Alcoholics Anonymous (AA) meetings.

While there are other online recovery programs (AA itself has spawned and inspired many), the program I want to focus on today is the SMART Recovery online community. Two reasons why: I have attended close to 90 SMART online meetings in the last few months (the subject of another article I plan to write), and I can see how this young organization came of age in the digital era (SMART was founded in 1994, the year the Netscape web browser was born; AA was founded in 1935). There are many other differences between the two. SMART does not ask its members to commune with a higher power, and it purports to base its curriculum on evidence, not faith (SMART advises members to use tools modeled on the learnings of cognitive behavioral therapy, or CBT). But while I have also attended face-to-face AA meetings, I will not go into the merits of either approach. I will simply point out the merits of online recovery, based on my experience. (Note: SMART also has face-to-face meetings, and I've attended one, but they are not nearly as plentiful as AA's.)

Anonymity and the size of the group

In recent times, we've come to regard disinhibition (the tendency for many people online to communicate more aggressively, if not just more assertively) as one of the causes of the hostile environments that prevail in social media. But not long ago, social media watchers looked at how the rules of some forums might inhibit some people from participating. One such rule today: revealing your identity on Facebook (assuming you are not a political operative or a bot). SMART took a cue from AA by encouraging anonymity. While everyone has the option to reveal more, most people go by handles, often humorous, sometimes telling (fictional example: TheAncientMariner). Of course, anonymity doesn't guarantee participation. But here is where the numbers in private, moderated online meetings like SMART's might help. Some SMART online meetings attract up to 200 people. At the beginning of my first meeting (attendance: about 200), my first thought was that this was going to be a trainwreck. Instead, what I got was a well-structured conversation where at least thirty people weighed in meaningfully using video, voice, or chat (the meetings are conducted on the Zoom cloud conferencing platform). This was better than most of the face-to-face meetings I've attended. Perhaps anonymity plus scale can facilitate better, livelier meetings.

Access

Another benefit is access for people who cannot attend face-to-face meetings, either because they are ill or too far away. Many SMART online attendees are required to attend meetings by the courts or by medical facilities, so access can be a great challenge. The judicial and medical communities are beginning to wake up, though not uniformly, allowing for recovery regimes that include a mix of face-to-face and online meetings. The greater reach to people otherwise excluded might benefit all attendees. Typically, SMART meetings have a very large geographical footprint, within and outside the U.S., providing attendees with a wide range of perspectives on both personal issues and local policies (SMART also welcomes anyone with a dependence on other drugs of choice or other maladaptive behaviors). This breadth can give people a better grasp of the evolving world of recovery, which can aid their growth.

The future of Cloud Computing

And that depth plays well to the potential tech future of online recovery: advances in cloud computing. In the past decade, we've witnessed the evolution of the cloud from (a) a system that serves the hosting and storage needs of many organizations to (b) a vast complex of platforms (with Amazon, Microsoft, Google, and others competing) that provides advanced computational power. In other parts of the medical world, that power is being harnessed to predict patient outcomes and to personalize care securely. This is not science fiction but science fact, and the providers of recovery services, along with the legal and medical bodies that feed those services, might sit up and pay attention. Face-to-face is here to stay (in fact, there are a number of innovations colliding with that world, the subject of my next article), but for recovery in the cloud, it's only the beginning.


Managed Hybrid Cloud Hosting Market Future Investment Initiatives, Growing with Technology Development, New Innovations, Competitive Analysis and…

The main aim of the Global Managed Hybrid Cloud Hosting Market 2020 report is to provide up-to-date information on the market and to pinpoint the opportunities for Managed Hybrid Cloud Hosting market growth. The report begins with a market outlook and offers a basic introduction to and definition of the worldwide Managed Hybrid Cloud Hosting industry. The overview part of the report covers market dynamics, including growth drivers, restraining factors, opportunities and current trends, along with a value chain analysis and a pricing structure study.

The global research report on the Managed Hybrid Cloud Hosting market offers an extensive analysis of market size, shares, supply-demand dynamics, and the sales value and volume of various companies, together with a segmentation study across important geographical regions. The report covers recent advancements in the worldwide industry and the major factors that influence the overall growth of the Managed Hybrid Cloud Hosting market.

Request a sample report here: https://www.orbisreports.com/global-managed-hybrid-cloud-hosting-market/?tab=reqform

The Managed Hybrid Cloud Hosting market has also been classified on the basis of various segments. The important segments are further divided into sub-segments, which gives a better understanding of the market's complete growth and helps in making decisive judgments about the Managed Hybrid Cloud Hosting business.

Amazon Web Services (AWS), Microsoft, Tata Communications, Rackspace, Datapipe, Sify, NTT Communications, NxtGen, BT, CtrlS Datacenters, CenturyLink, Dimension Data (NTT Communications), Fujitsu, Singtel, Telstra

The report analyzes the Managed Hybrid Cloud Hosting market size in terms of value and volume over the forecast period 2020-2024. It studies the growth drivers, latest trends and industry opportunities in the market worldwide, through past study and future projections based on thorough research. The report broadly offers the market's size, share, growth, trends, and forecasts for the period 2020-2024.

Managed Hybrid Cloud Hosting Industry Type Segmentation

Cloud-based, On-premises

The Managed Hybrid Cloud Hosting Industry Application Segmentation

Manufacturing, Retail, Financial, Government, Others

Enquire here before buying: https://www.orbisreports.com/global-managed-hybrid-cloud-hosting-market/?tab=discount

This report also analyses the global Managed Hybrid Cloud Hosting competitive landscape, market drivers and trending factors. Moreover, it highlights opportunities, risks and challenges, threats and entry barriers. Sales channels, suppliers/distributors, and SWOT and PESTEL analyses are also incorporated in the report.

A synopsis of the Managed Hybrid Cloud Hosting market's key players, covering end-user demand, constraining elements, size, share, and sales.

Worldwide characteristics of the Managed Hybrid Cloud Hosting market, including growth and constraining factors, technological developments, foreseen growth opportunities, and rising segments.

Other factors, such as cost, supply/demand, profit/loss, and growth elements, extensively described in the report.

Market size and share growth studied with respect to region-wise and country-wise segments.

Global Managed Hybrid Cloud Hosting market trends, drivers, restraints, development opportunities, threats, risks, challenges, and recommendations.

Managed Hybrid Cloud Hosting Market Study Objectives:

1) This report offers a precise study of changing Managed Hybrid Cloud Hosting competitive dynamics.

2) It serves forward-looking prospects on the various factors driving or constraining market growth.

3) It provides a 2020-2024 forecast, evaluated on the basis of how the market is estimated to grow.

4) It gives a better understanding of the major product segments and their future.

5) The study offers an accurate analysis of changing competitive dynamics and keeps you ahead of competitors.

6) It guides you in making decisive business decisions by providing complete, updated information on the industry and an in-depth study of market segments.

Click here to see the full TOC: https://www.orbisreports.com/global-managed-hybrid-cloud-hosting-market/?tab=toc

The knowledge gained from the study not only helps research analysts compile an overall market report but also assists them in comprehending the ongoing competitive landscape of the Managed Hybrid Cloud Hosting market. Finally, the report presents the findings of the market research, along with an appendix, information sources, and a conclusion.


Five Things To Do in St. Cloud, Jan. 6-10 – SC Times

From Staff Reports Published 5:00 p.m. CT Dec. 31, 2019

Monday, Jan. 6 is National Take Down the Christmas Tree Day. So whether you've been trying to hold onto the holiday spirit, or you've just been procrastinating, now's the time to get rid of your tree and officially enter 2020.

Stearns History Museum(Photo: Stearns History Museum)

Retired St. Cloud special education teacher Rick Rentz will be talking about the history of the Duncan yo-yo, which just turned 90 years old, as part of the Stearns History Museum's Breakfast Club series.

The event is $7 and free for members. It runs from 9-10 a.m. at Stearns History Museum, 235 33rd Ave. S.

Adam Hammer(Photo: Youth Chorale of Central Minnesota)

Pantown Brewing Company will be highlighting local musicians at their Ales and Artists showcase. Musicians performing will include TJ Larum, Aksel Krafnick and Adam Hammer.

The showcase runs from 5-8:30 p.m. at Pantown Brewing Company, 408 37th Ave. N.

United Way of Central Minnesota will be demonstrating the plight of Central Minnesotans living in poverty in the interactive immersion experience "This is Life: The Cost of Community Crisis."

The event runs from 8:30-11 a.m. at Capital One, 30 Seventh Ave. S. For more information and to register for the event, visit https://unitedwayhelps.org/thisislife/.

The Chamber Music Society of St. Cloud will be hosting classical music ensemble Consortium Carissimi at 7:30 p.m. at St. Mary's Cathedral, 25 Eighth Ave. S.

Tickets are $25 for adults, $20 for seniors and $5 for students and will be available at the door and online at https://www.brownpapertickets.com/.



Wyze data leak: Key takeaways from server mistake that exposed information from 2.4M customers – GeekWire

Seattle-area startup Wyze offers low-cost video security cameras and other IoT devices. (Wyze Photo)

Post updated at 6 p.m. on Dec. 29.

Seattle-area startup Wyze, a provider of home video cameras and other Internet of Things (IoT) devices, announced on Dec. 26 that it had been informed of a data leak that reportedly exposed the personal information of 2.4 million of its customers.

The problem arose from "a new internal project to find better ways to measure basic business metrics like device activations, failed connection rates, etc.," writes Dongsheng Song, Wyze co-founder and chief product officer, in the company's post.

"We copied some data from our main production servers and put it into a more flexible database that is easier to query," he explains. "This new data table was protected when it was originally created. However, a mistake was made by a Wyze employee on December 4th when they were using this database and the previous security protocols for this data were removed."

Founded in 2017 by a group of Amazon veterans, Wyze offers a series of low-priced cameras, plugs, bulbs and other smart-home devices. The company, based in Kirkland, Wash., has raised $20 million in venture capital. GeekWire has contacted Wyze for additional comment.

To Wyze's credit, it has been very detailed in describing what happened, when, why, how, and what the company is doing about it.

A post by Twelve Security claimed that the leaked data included the following:

Wyze quoted that list in its original post but added, "We don't collect information about bone density and daily protein intake even from the products that are currently in beta testing."

In looking over this event, there are ten key security and privacy takeaways.

Wyze has been upfront about the manner in which it was informed of the leak, with little or no time to mitigate the problem before it was made public. ZDNet's Catalin Cimpanu summed up the feelings of many (likely including Wyze) about whether this disclosure was responsible or not.

These are valid and reasonable concerns. As is often the case in the disclosure wars, there likely won't be any resolution, but instead a renewed airing of both sides of the argument. Those supporting the disclosure can and will say the information was public for a number of days and that holding it back prolongs the risk. Those against it will say this just wasn't enough time for the vendor to take action. Either way, this situation shows that the disclosure wars will continue so long as there's no collective agreement on how to handle these situations.

One thing to Wyze's credit: the company clearly jumped on this fast once it broke. Its post states: "Immediately upon hearing about a potential breach, Wyze mobilized the appropriate developers and executives (CEO and CPO) to address the allegations."

It adds later, "This means that all Wyze user accounts were logged out and forced to log in again (as a precaution in case user tokens were compromised as alleged in the blog post). Users will also need to relink integrations with The Google Assistant, Alexa, and IFTTT."

This level of response and these steps are reasonable to address the risks around potentially lost authentication tokens. These are also actions that will impose a burden on users.

Going back to our first point, people can and will argue how much of this response is due to the nature of the disclosure. But these are good, concrete steps, which put security ahead of ease-of-use: Wyze is risking user frustration for better security.
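
Forcing every account to log in again, as Wyze did, is commonly implemented with a token-version scheme: each user record carries a version number, and bumping it invalidates every previously issued token at once, without touching passwords. This is a minimal, hypothetical sketch of the idea, not Wyze's actual code:

```python
# Hypothetical sketch of a global logout via token versioning. A real
# system would use signed tokens (e.g. JWTs) and a persistent user store;
# plain dicts keep the mechanism visible.

users = {"alice": {"token_version": 1}}

def issue_token(user_id):
    # Embed the user's current token version in the token at login time.
    return {"user": user_id, "version": users[user_id]["token_version"]}

def is_valid(token):
    # A token is valid only if its version matches the user's current one.
    return token["version"] == users[token["user"]]["token_version"]

def force_global_logout():
    # Bumping every version invalidates all outstanding tokens at once.
    for record in users.values():
        record["token_version"] += 1

token = issue_token("alice")
print(is_valid(token))   # True: token matches current version
force_global_logout()
print(is_valid(token))   # False: user must log in again
```

The design choice worth noting is that revocation is a single counter update per user rather than a hunt for every issued token, which is why a service can log out millions of accounts quickly.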

One thing that Wyze isn't doing, however, is forcing password resets on users. While Wyze has said that passwords weren't stolen, it's often hard to be certain. And if the current situation involving Amazon's Ring has taught us anything, it's that people regularly reuse passwords, especially where IoT devices are concerned. Not forcing a password reset is a missed opportunity to be thorough in the response and improve overall customer security.

Ring has been in the news a lot lately for being hacked. As I've noted, the nature of those hacks boils down to the inherent weakness of relying on passwords. This situation is different because it's a leak of data held by Wyze. In fact, it appears that password information wasn't even involved.

In this case, even if you've used two-factor authentication (2FA), you are still at risk from this data breach.

If the Ring situation has reminded us of the risks of password reuse and the overall weakness of passwords as a security measure for IoT, this breach helps show us the risks inherent in losing the kind of data used by IoT and health-related devices in the home.

By their very nature, IoT devices are integrated into our most intimate spaces. Cameras in particular represent a major window into our most protected personal spaces, as we've seen in the reactions to the Ring situation.

Looking at the information that's potentially lost in this breach, we get a more concrete sense of what IoT data breaches can mean in real terms.

In particular, Wyze notes that the data loss includes: "List of all cameras in the home, nicknames for each camera, device model and firmware. WiFi SSID, internal subnet layout, last on time for cameras, last login time from app, last logout time from app."

This data is troubling because it can give very specific information that can be useful for real-world crime. People regularly name devices in ways that are descriptive for themselves, not expecting the names to become publicly known. For example, people might name a camera in a child's room "Betty's Room." Information like this can tell an attacker who is in the house, where they might be and where cameras are placed. All of this can be useful to people who want to enter the home for malicious purposes.

One thing that Wyze has not recommended, which I would recommend, is that users rename their internal WiFi SSIDs, rename their cameras and potentially reposition those cameras. All these steps can mitigate the risks of that information now being publicly accessible.

Another piece of the exposed data is this: "Height, Weight, Gender, Bone Density, Bone Mass, Daily Protein Intake, and other health information for a subset of users."

Wyze goes to some length to point out that this lost information affects only a very small subset of its users, specifically 140 external beta testers. Yes, that is a very small number of people. But the information that was exposed is very sensitive and very personal health information. It's a reminder of the nature of the data that's being handled by IoT and health devices.

The similarities to the Capital One data breach are striking. In this case, as Wyze says: "a mistake was made by a Wyze employee on December 4th when they were using this database and the previous security protocols for this data were removed."

While this isn't exactly the same thing that happened with Capital One, in both cases you have data that was accessible in the cloud without appropriate security protections due to human error. It's also notable that in both cases, auditing and monitoring failed to catch the misconfiguration.
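
The kind of automated audit that failed here can be simple in principle: enumerate data stores and flag any that are reachable without authentication. The sketch below is purely illustrative (the inventory records and field names are invented; a real audit would query the cloud provider's or database's management APIs):

```python
# Hypothetical configuration audit: flag internal databases whose access
# controls have been dropped. Records here are invented for illustration.

def find_unprotected(databases):
    """Return names of databases reachable without authentication."""
    return [
        db["name"]
        for db in databases
        if not db.get("auth_required", False)
    ]

inventory = [
    {"name": "prod-users", "auth_required": True},
    {"name": "analytics-copy", "auth_required": False},  # the risky copy
]

print(find_unprotected(inventory))  # ['analytics-copy']
```

Run on a schedule and wired to an alert, even a check this crude would surface a table whose "security protocols were removed" within hours rather than weeks.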

Both of these cases are a reminder that, unfortunately, when things are deployed to the cloud, the risks of exposure and breach are frequently greater. And in terms of IT operations and practice, the controls and countermeasures often aren't as robust and mature for cloud deployments as they are for traditional on-premises deployments.

For startups, there are two lessons, as well. One is cautionary and the other potentially positive.

First the cautionary tale: speed kills.

Once again, to its credit, Wyze is open about what happened, and there's a very clear message for startups. From the company's post: "To help manage the extremely fast growth of Wyze, we recently initiated a new internal project to find better ways to measure basic business metrics like device activations, failed connection rates, etc. We copied some data from our main production servers and put it into a more flexible database that is easier to query."

Two things happened here that are common for startups. First, the company experienced sudden, fast growth. Second, it moved quickly to address the implications of the growth.

As noted above, it was during this fast move that, at some point, the security that had protected the data was removed by an employee.

It's great that Wyze was able to move fast to address issues related to its fast growth. But this is also a reminder that speed can kill. Mistakes happen when things move fast and there's little checking. This is a risk that all startups face and should be conscious of.

Of course, the speed that can kill you as a startup can also save you. The fast response that we see from Wyze is an example of the speed startups can achieve. Another positive aspect of this speed is shown in the statement that the company is going to bump up the priority of user-requested security features beyond two-factor authentication.

If we compare and contrast this with Ring's response to its current situation, the difference is stark. Ring has announced no major plans to improve security capabilities in the wake of stories of Ring devices being hacked. By contrast, Wyze has committed early and openly to reworking its prioritization of new user-requested security features.

Here too is another lesson for startups: use the speed and agility that being a startup gives you to move quickly to turn disadvantage into advantage.

In its post, Wyze very clearly refuted the claim that it is sending data to Alibaba's cloud in China. A question and answer in the post speaks directly to this:

Is there validity to the claim that Wyze is sending user data to China?

Wyze does not use Alibaba Cloud. The claim made in the article that we do is false.

It goes on to note that the company has employees and manufacturers in China, but "Wyze does not share user data with any government agencies in China or any other country."

The fact that this claim was made, and that Wyze feels a need to refute it, points to another takeaway: there is an emerging, almost McCarthyite trend to imply or allege that tech companies with ties to China are storing data in China and/or sharing data with the Chinese government. We've seen similar insinuations with regard to TikTok as well.

Partly, this represents the sort of speculation that can fill a vacuum when companies don't provide clear information about where they store their data. A few years ago, people, especially in Europe, were concerned about data being stored in the United States and possibly being subject to seizure under the Patriot Act. Now, people are concerned about data being stored in China and accessible by the government there.

One thing companies can do to mitigate this concern is to be open about where they store data.

Beyond that, though, there is clearly heightened concern now about data being stored and shared with China, and that concern is manifesting in claims and insinuations about data being stored or shipped there.

The Wyze breach is a serious one. And Wyze deserves credit for doing a lot of things right, quickly, in response. But as we dig into it more, we can see that this situation raises a number of issues around IoT devices, data storage, security and incident response.

We can all learn from this, which is one reason why it's so good that the Wyze team has been open and up front about the situation: it helps the industry learn and grow collectively. And because Wyze is a startup, its experience and response hold particular lessons for other up-and-coming companies in the IoT space.

Update: Wyze disclosed an additional issue in a Dec. 29 update to its post.

"We have been auditing all of our servers and databases since then and have discovered an additional database that was left unprotected. This was not a production database, and we can confirm that passwords and personal financial data were not included in this database. We are still working through what additional information was leaked as well as the circumstances that caused that leak."

We've also clarified our post above to note that Wyze says it doesn't collect information about protein intake or bone density, contrary to a report that said such data was included in the leak.


Conquering the Cyber Security Challenges of the Cloud – CPO Magazine

Cloud computing has become a prevalent force, bringing economies of scale and breakthrough technological advances to modern organizations, but it is more than just a trend. Cloud computing has evolved at an incredible speed and, in many organizations, is now entwined with the complex technological landscape that supports critical daily operations.

This ever-expanding cloud environment gives rise to new types of risk. Business and security leaders already face many challenges in protecting their existing IT environment. They must now also find ways to securely use multiple cloud services, supported applications and underlying technical infrastructure.

The surge in business processes supported by cloud services is well evidenced by organizations using cloud services to store confidential data in the cloud environment. But when using cloud services, organizations are still unsure whether to entrust cloud service providers (CSPs) with their data. CSPs generally provide a certain level of security, as substantiated by multiple surveys, but cloud-related security incidents do occur.

CSPs cannot be solely responsible for the security of their customers' critical information assets. Cloud security relies equally on the customer's ability to implement the right level of information security controls. Nevertheless, the cloud environment is complex and diverse, which hinders a consistent approach to deploying and maintaining core security controls. It is vital that organizations are aware of and fulfill their share of the responsibility for securing cloud services to successfully address the cyber threats that increasingly target the cloud environment.

As organizations acquire new cloud services, they typically choose these from a selection of multiple CSPs and therefore need to deal with a multi-cloud environment, which is characterized by the use of two or more CSPs.


Organizations favor a multi-cloud environment because it allows them to pick and choose their preferred cloud services across different CSPs (e.g. AWS, Microsoft Azure, Google Cloud, Salesforce). However, each individual CSP adopts its own jargon, its own specific technologies and approaches to security management. The cloud customer therefore needs to acquire a wide range of skills and knowledge to use different cloud services from multiple CSPs securely.

Organizations need a range of different users to be able to access cloud services securely from within the organization's network perimeter through secure network connections (e.g. via a gateway). However, organizations also need their cloud services to be accessible from outside the internal perimeter, by business partners and by users travelling off-site or working remotely, all connecting through secure network connections as dictated by the organization.

While CSPs provide a certain level of security for their cloud services, organizations need to be aware of their security obligations and deploy the necessary security controls. This requires organizations to understand and address the many security challenges presented by the complex and heterogeneous aspects of the cloud environment.

Our ISF members have identified several obstacles to operating securely in the cloud environment.

The rapid explosion of cloud usage has accentuated these challenges and, in some instances, left organizations insufficiently prepared to tackle the security concerns associated with using cloud services.

Securing the use of cloud services is a shared responsibility between the CSP and the cloud customer. The security obligations incumbent on the CSP are to protect the multi-tenant cloud environment, including the backend services and physical infrastructure, as well as to prevent the commingling of data between different customers.

While the CSP maintains much of the underlying cloud infrastructure, the cloud customer is responsible for securing its data and user management. Whether the customers responsibility extends to performing security configurations for applications, operating systems and networking will depend on the cloud service model selected.
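How that split shifts with the service model can be expressed as a simple lookup. This is an illustrative sketch only: the mappings below are generic assumptions for IaaS, PaaS and SaaS, not any particular CSP's official responsibility matrix.

```python
# Illustrative sketch of the shared responsibility model described above.
# The layer assignments are generic assumptions, not a CSP's official matrix.

CUSTOMER_RESPONSIBILITIES = {
    # The customer always secures its own data and user/identity management;
    # the further down the stack the service model sits, the more it also owns.
    "IaaS": {"data", "identity", "applications", "operating_system", "network_config"},
    "PaaS": {"data", "identity", "applications"},
    "SaaS": {"data", "identity"},
}

def customer_must_secure(model: str, layer: str) -> bool:
    """Return True if, under this sketch, the cloud customer (not the CSP)
    is responsible for securing the given layer."""
    return layer in CUSTOMER_RESPONSIBILITIES[model]

# Under IaaS the customer still patches the OS; under SaaS the CSP does.
print(customer_must_secure("IaaS", "operating_system"))  # True
print(customer_must_secure("SaaS", "operating_system"))  # False
```

The point of writing it down this explicitly is the one the article makes: whatever is not in the customer's set does not disappear, it is the CSP's job, and confusion about which set a layer falls into is where over-reliance starts.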

This shared responsibility for security can create confusion and lead to over-reliance on the CSP to mitigate threats and prevent security incidents. It is essential that the cloud customer does not depend wholly on the CSP to deploy the appropriate security measures, but clearly understands how responsibility for security is shared with each CSP in order to identify and deploy the requisite security controls to protect the cloud environment.

An organization using an on-premises IT data center knows exactly where its critical and sensitive data resides and can exert full control over the movement of that data, which helps considerably when implementing security controls. In the cloud environment, by contrast, data moves in and out of an organization's perimeter more freely. This can obscure where critical and sensitive data is located and how it can be protected, hindering an organization's ability to effectively enforce the requisite security controls across all of its cloud services in line with compliance requirements.

While it is the cloud customer's responsibility to ensure the security of its data in the cloud environment, the customer's control over that data is intrinsically limited, since the data is stored by an external party (the CSP) in an off-site location, often in a different country. Moreover, CSPs will often leverage several data centers in geographically distinct locations to ensure the organization's data is stored on more than one server for reasons of resilience. This creates additional complexity in terms of managing data across borders, understanding where it is located at a given moment in time, determining the applicable legal jurisdiction and ensuring compliance with relevant laws and regulations, an obligation that rests fully with the cloud customer, not the CSP.

Modern organizations must operate at a fast pace, delivering new products and services to stay ahead of the competition. Many are therefore choosing to move ever further towards cloud computing, as the elasticity and scalability offered by cloud services provide the desired flexibility needed to compete. For an organization to have confidence that it can move to the cloud whilst ensuring that vital technological infrastructure is secure, a robust strategy is required.

The cloud environment has become an attractive target for cyber attackers, highlighting the pressing need for organizations to enhance their existing security practices. Yet consistently implementing the fundamentals of cloud security can be a complicated task due to the diverse and expanding nature of the cloud environment.

This is but one of many challenges that organizations need to overcome to use cloud services securely. Organizations cannot rely purely on CSPs to secure their critical information assets but must accept their own share of responsibility. This responsibility calls for a combination of good governance, deployment of core controls and adoption of effective security products and services. Controls that cover network security, access management, data protection, secure configuration and security monitoring are not new to information security practitioners, but they are critical to using cloud services securely.

Moving forward, organizations can select from a variety of trends and technologies that will enable them to use cloud services securely, from the adoption of new products to the embedding of improved processes, such as a focus on secure containers, where security is given greater emphasis during development.

Assuring that services are used securely will provide business leaders with the confidence they need to fully embrace the cloud, maximizing its potential and driving the organization forward into the future.

The rest is here:
Conquering the Cyber Security Challenges of the Cloud - CPO Magazine

Read More..

City of Cottonwood now a certified sustainable business | Sedona.Biz – The Internet Voice of Sedona and The Verde Valley – Sedona.biz

City of Cottonwood is now a Certified Sustainable Business at the Conservationist/Bronze level.

Sedona AZ (January 1, 2020) The City of Cottonwood has worked diligently on behalf of its citizens and the Verde Valley to reduce water consumption. In 2005, it became the largest water provider in the Verde Valley after purchasing the six private water companies that had previously served the Cottonwood area. The city immediately began making repairs and upgrades to the water production and distribution system to improve its reliability and reduce the lost and unaccounted-for water.

They also installed 20 arsenic treatment systems to comply with new water quality standards for arsenic that went into effect in 2006. As a result of these efforts, the city is pumping about 30 percent less water today than the six private water companies were pumping in 2000, and its total gallons per capita per day (GPCD) use has decreased from around 171 to 87, one of the lowest total GPCDs for a municipality in the State of Arizona.
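The per-capita figure above is just a ratio, and the quoted drop can be sanity-checked with simple arithmetic. The formula is the standard per-capita calculation; the population and supply numbers in the usage example are hypothetical placeholders, since the article does not provide them.

```python
# Gallons per capita per day (GPCD) = total gallons supplied per day / population served.
def gpcd(gallons_per_day: float, population: int) -> float:
    """Per-capita daily water use for a utility's service area."""
    return gallons_per_day / population

# Hypothetical example: a utility supplying 870,000 gallons/day to 10,000 people.
print(gpcd(870_000, 10_000))  # 87.0 GPCD

# The drop the article reports, from 171 to 87 GPCD:
reduction = (171 - 87) / 171
print(f"{reduction:.0%}")  # roughly a 49% cut in per-capita use
```

Note that the per-capita reduction (about 49%) is larger than the roughly 30 percent drop in total pumping, which is consistent with the city's population having grown since 2000.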

The city is also diligently working to expand its use of reclaimed water by strategically installing purple pipe to accommodate the use of reclaimed water for irrigation at the large turf areas throughout the city. The city currently requires the use of reclaimed water for all construction use and delivers reclaimed water for irrigation to Cottonwood Ranch, plus the community garden, airport, and viticulture garden at Yavapai College. In the Spring of 2020, the city will begin delivering reclaimed water for irrigation to Riverfront Park, with plans to expand its use to the cemetery.

Cottonwood is also addressing energy use throughout the city. They installed solar systems for the Riverfront Reclamation Facility and Recreation Center pool, and now have LED lights in Old Town lamp posts, city hall, tennis courts, and the recreation center. The city is moving data storage from conventional servers to cloud servers and they are looking to replace conventional pool pumps with variable speed pumps. These combined efforts reduce costs and the consumption of energy from non-renewable resources.

The city has also taken steps to reduce the potential discharge and disposal of chemicals and pharmaceuticals into the environment by hosting household hazardous waste and pharmaceutical collection events. These events remove toxic chemicals and pharmaceuticals that may otherwise be flushed into the water system or end up in the wrong hands.

Cottonwood also has good employment practices. Employees have access to exercise facilities for physical health and to an Employee Assistance Program for mental health. The city also does a lot to support the community, including sponsoring Food for Finds, the Neighborhood Officer Program, National Night Out, City Selfie Day, Toys for Tots, the Thanksgiving Turkey Drive, and Steps to Recovery.

The City is currently working with the Sustainability Alliance to complete a 5-year sustainability plan.

Go here to see the original:
City of Cottonwood now a certified sustainable business | Sedona.Biz - The Internet Voice of Sedona and The Verde Valley - Sedona.biz

Read More..

A restaurant server got a massive tip and it is helping to change her life – KFOR Oklahoma City

(CNN) It's never a bad idea to start the New Year on a generous note.

Danielle Franzoni, a server in Alpena, Michigan, started hers on the receiving end of that generosity. She waited on a couple at the restaurant where she works during the final days of 2019.

Their bill was $23. They tipped a festive $2,020.

"Happy New Year," the anonymous couple wrote on the bill. "2020 Tip Challenge."

Franzoni couldn't believe it. She asked her boss whether it was too good to be true, but the tip was legit and seasonally appropriate.

"Things like this don't happen to people like me," she told the Alpena Times.

It had been a difficult year for Franzoni. She moved to Alpena to start over, she said, as a recovering addict who'd lived in a homeless shelter.

But with her customers' generosity, she could see the clouds starting to clear. She even moved into her own home the same week.

"I'm gonna build a future because of this," she told CNN affiliate WXYZ. "My kids have a future, and I have a home. It's a really big deal."

Tipping servers for the New Year

The kind act that landed Franzoni $2,020 is similar to another tipping challenge, "Tip the Bill," which took off in 2018. Customers were encouraged to tip 100% and surprise their servers.

It seems the only stipulation of the 2020 tipping challenge is to keep the year in the total.

If you partake in New Year tipping, you don't have to go big: Franzoni told the Alpena Times she later tipped a server $20.20 on her own dinner bill.

"That was my pay-it-forward," she said. "I couldn't do the other one."

Read more from the original source:
A restaurant server got a massive tip and it is helping to change her life - KFOR Oklahoma City

Read More..

Cloud Hopper Attacks Far More Extensive than First Thought – CloudWedge


Chinese hacker group APT10 has been plundering the cloud installations of dozens of businesses for over three years, but a news report by Reuters made its actions public knowledge. Now, further digging into the scandal has revealed that the group's impact was far more extensive than initially suspected. Several major cloud providers have fallen prey to the group, yet many companies have failed to inform their clients that they may be victims of this particular hack. Providers, hoping to protect their reputations, had simply told their clientele that the issue was dealt with when it wasn't.

A report issued by the Wall Street Journal on December 30, 2019 notes that at least a dozen cloud providers were caught in the breach, including massive brands like IBM and Canada's CGI Group. Managed service providers are the ideal target for these hackers, since once they breach the initial security, they have access to any of the data that the companies which use the service have stored on the server.

The WSJ report comes on the heels of a Reuters scoop last year, which initially broke the news about APT10 and Cloud Hopper. The newest findings mention that over 10,000 records of US Navy personnel were taken. The impact on company reputations has made it difficult for service providers to disclose details about the attack. However, the lack of knowledge about the events makes it even more difficult for cybersecurity firms and departments to work out what happened. The UK's National Cyber Security Centre issued warnings to companies that they should be extremely wary of cloud providers that are unwilling to share information about security breaches.

Over the last year since the story broke, APT10 has gone mostly silent. The US Justice Department has arrested two individuals it thinks took an active part in the campaign. However, certain security companies still report software within the cloud pinging known APT10 IPs, making it likely that the group is still operational in some way.

See the rest here:
Cloud Hopper Attacks Far More Extensive than First Thought - CloudWedge

Read More..

2019: The Year in Hyperscale – Data Center Frontier

Data is the foundation of the new economy, and many of the most popular stories at Data Center Frontier in 2019 focused on the role of the hyperscale data center, a super-sized version of the mission-critical facilities that house the servers powering the Internet.

Here's a look back at Data Center Frontier's coverage of the hyperscale sector in 2019, including our special reports and features on the latest noteworthy trends and projects.

Data Center Frontier's special report series works to highlight leading trends and markets in the data center and colocation space. Hyperscale insight and news on the future of the market dotted reports throughout 2019, including the top posts below:

The Hyperscale Data Center Drives the Global Cloud Revolution: The hyperscale data center is reshaping the global IT landscape, shifting data from on-premises computer rooms and IT closets to massive centralized data center hubs. Explore further how cloud campuses will continue to enable operators to rapidly add server capacity and electric power.

The Future of Hyperscale Computing: At Data Center Frontier, we believe the next 10 to 15 years will be an era of continuous advancement, as next-generation technologies accelerate the digitization of the global economy. Here's our take on the road ahead for hyperscale.

Reshaping the Global IT Landscape: The Impact of Hyperscale Data Centers. A new report explores how hyperscale computing has changed the supply chain for digital infrastructure, and how this computing shift will evolve. This excerpt dives into the impact of hyperscale computing.

Hyperscale Computing Poised to Make Dramatic Impact on Service Providers: The growth of hyperscale computing has major implications for data center service providers, driving segmentation within the service provider universe. This special report excerpt highlights the implications of that growth for data service providers, as well as the future of hyperscale.

How Hyperscale Customers & Data Centers Are Unique: This special report excerpt from a Data Center Frontier report series explores the influence of hyperscale customers, the geography of hyperscale data centers and what sets them apart.

Who Are the Data Center Industry's Hyperscale Players? Hyperscale computing has changed the supply chain for digital infrastructure. This special report excerpt dives into the impact of hyperscale computing.

Find out where data center development is ebbing and flowing, the state of data center supply on a large scale, new epicenters of the cloud market and more:

Northern Virginia: Less Hyper, Still Plenty of Scale. After years of rocket-ship growth, the Northern Virginia data center market is still flying high, but in a slightly lower orbit than in 2018. Industry CEOs, executives and analysts discuss trends in Data Center Alley.

HyperSpeed: How the Data Center Supply Chain Keeps Pace (Roundtable). Hyperscale deployments are moving faster than ever. Our DCF Executive Roundtable outlines ways the data center supply chain keeps pace with the rapid growth of digital infrastructure.

Amazon Web Services data centers in Arcola in Loudoun County, Virginia. (Photo: Rich Miller)

A New Cloud Corridor Emerges South of Ashburn: A new cloud corridor is emerging in Northern Virginia, as data center developers lock down development parcels just west of Dulles Airport. Recent deals value data center sites at more than $1 million an acre.

The State of the Hyperscale Data Center (Roundtable): Our Data Center Frontier Executive Roundtable looks at the hyperscale data center market with insights from Randy Rowland of Cyxtera, James Leach of RagingWire, Rick Crutchley of Iron Mountain, David Richards of Chatsworth, Phillip Marangella from EdgeConneX and Intel Software's Jeff Klaus.

Hyperscale Leasing is Booming: Can The Supply Chain Keep Pace? Wholesale data center leasing doubled during 2018, spurred by a proliferation of huge hyperscale deals. That rapid pace of growth is testing the industry's supply chain and construction staffing.

Facebook, Amazon, and Microsoft all made splashes in the hyperscale and large data center market in 2019. Find out where these tech giants and more are scaling up growth as the data boom continues:

Google Building More Data Centers for Massive Future Clouds: Google is building more data centers in more places than ever before, and appears likely to make good on its pledge to invest $13 billion in new data center campuses in 2019.

Amazon Continues Its Cloud Expansion in Northern Virginia: Amazon Web Services continues to lease large amounts of new data center space in Northern Virginia, and may be boosting the size of its data centers.

Compass Scales Up for Growth in Top Hyperscale Markets: With a new round of funding, Compass Datacenters is building big in the three hottest hyperscale data center markets. CEO Chris Crosby provides an update on the company's hyperscale ambitions.

The Equinix LD10 Data Center in Slough, England. (Photo: Equinix)

Going xScale: Equinix, GIC Partner on Hyperscale Data Centers in Europe. Equinix will form a joint venture with Singapore's sovereign wealth fund GIC to build six new hyperscale data centers in Europe. The JV will target the FLAP markets of Frankfurt, London, Amsterdam and Paris.

Microsoft Plans Zero-Carbon Data Center in Sweden: Microsoft says its new data centers in Sweden will be its most advanced and sustainable to date, integrating renewable energy sources and a new data center design.

Facebook Will Begin Selling Wholesale Fiber Capacity: As Facebook builds more fiber routes to move data traffic between its data centers, it will begin selling unused capacity to other companies, effectively entering the wholesale fiber business.

Where Fields Become Clouds: The CyrusOne Approach to Speed and Scale. The growth of the CyrusOne Phoenix campus demonstrates why the company has been a pioneer in the rapid growth of the data center industry, advancing new ways of building at speed and scale. Its next phase of growth will target European markets.

QTS Partners to Fund Hyperscale Data Centers, in Manassas and Beyond: QTS Data Centers has formed a joint venture with global infrastructure investor Alinda Capital Partners that could provide up to $1 billion in funding for future data center construction, starting with a campus in Manassas, Virginia.

Keep pace with the fast-moving world of data centers by following us on Twitter and Facebook, connecting with Data Center Frontier editor Rich Miller on LinkedIn, and signing up for our weekly newsletter.

Read more here:
2019: The Year in Hyperscale - Data Center Frontier

Read More..