
Getting serious about research ethics: Security and Internet Measurement – Freedom to Tinker

[This blog post is a continuation of our series about research ethics in computer science that we started last week]

Research projects in the information security and Internet measurement sub-disciplines typically interact with third-party systems or devices to collect large amounts of data. Scholars in these fields are interested in collecting data about technical phenomena. Because the Internet is so widely used, their experiments can interfere with human use of devices and reveal all sorts of private information, such as browsing behaviour. As awareness of the unintended impact on Internet users grew, these communities have spent considerable time debating their ethical standards at conferences, dedicated workshops, and in journal publications. Their efforts have culminated in guidelines for topics such as vulnerability disclosure and privacy, with the aim of protecting unsuspecting Internet users and the humans implicated in technical research.

Prof. Nick Feamster, Prof. Prateek Mittal, moderator Prof. Elana Zeide, and I discussed some important considerations for research ethics in a panel dedicated to these sub-disciplines at the recent CITP conference on research ethics in computer science communities. We started by explaining that gathering empirical data is crucial to infer the state of values such as privacy and trust in communication systems. However, as methodological choices in computer science will often have ethical impacts, researchers need to be empowered to reflect on their experimental setup meaningfully.

Prof. Feamster discussed several cases where he had sought advice from ethical oversight bodies, but was left with unsatisfying guidance. For example, when his team conducted Internet censorship measurements (pdf), they were aware that they were initiating requests and creating data flows from devices owned by unsuspecting Internet users. These new information flows were created in realms where adversaries were also operating, for example in the form of government censors. This may pose a risk to the owners of devices that were implicated in the experimentation and data collection. The ethics board, however, concluded that such measurements did not meet the strict definition of human subjects research, which thereby excluded the need for formal review. Prof. Feamster suggests computer scientists reassess how they think about their technologies or newly initiated data flows that can be misused by adversaries, and take that into account in ethical review procedures.

Ethical tensions and dilemmas in technical Internet research could be seen as interesting research problems for scholars, argued Prof. Mittal. For example, to reason about privacy and trust in the anonymous Tor network, researchers need to understand to what extent adversaries can exploit vulnerabilities and thus observe Internet traffic of individual users. The obvious, relatively easy, and ethically dubious measurement would be to attack existing Tor nodes and attempt to collect real-time traffic of identifiable users. However, Prof. Mittal gave an insight into his own critical engagement with alternative design choices, which led his team to create a new node within Princeton's university network that they subsequently attacked. This more lab-based approach eliminated risks for unsuspecting Internet users while allowing the same inferences to be drawn.

I concluded the panel, suggesting that ethics review boards at universities, academic conferences, and scholarly journals engage actively with computer scientists to collect valuable data whilst respecting human values. Currently, a panel of non-experts in either computer science or research ethics is given a single moment to judge the full methodology of a research proposal or the resulting paper. When a thumbs-down is issued, researchers have limited or no opportunity to remedy their ethical shortcomings. I argued that a better approach would be an iterative process with in-person meetings and more in-depth consideration of design alternatives, as demonstrated in a recent paper about Advertising as a Platform for Internet measurements (pdf). This is the approach advocated in the Networked Systems Ethics Guidelines. Cross-disciplinary conversation, rather than one-time decisions, allows for a mutual understanding between the gatekeepers of ethical standards and designers of useful computer science research.

See the video of the panel here.

View original post here:
Getting serious about research ethics: Security and Internet Measurement - Freedom to Tinker


Enterprises Encounter Cloud Storage Cost and Management Challenges – Enterprise Storage Forum

Although it seems that enterprises are flocking to the cloud for their IT needs, data storage in particular, a new survey from DataCore Software suggests that a good number of organizations are running into trouble during the transition.

"Challenges and false starts with technologies have introduced reluctance in the industry to fully commit to software-defined, hyperconverged or a hybrid data storage infrastructure," wrote Paul Nashawaty, product evangelist and director of Technical Marketing at DataCore Software, in a blog post. "Until recently, the promise of cloud, ease of use, and faster application performance have fallen short of expectations."

Some of those expectations include storage services that don't break the budget.

Despite claims by vendors that cloud storage is cheaper than on-premises solutions, the opposite is often true. Nearly a third (31 percent) of the 426 IT professionals quizzed for the company's State of Software-Defined Storage, Hyperconverged and Cloud Storage survey (registration required) said that instead of slashing costs, their move to cloud storage increased them.

Storage management was a key factor in why many organizations were having a tough time containing cloud costs. Twenty-nine percent of respondents said management proved to be more difficult on the cloud.

Continuing the theme of data storage disappointments, DataCore discovered that speedy flash storage also fell short for some organizations. Sixteen percent said the technology did little to accelerate their applications.

In terms of hyperconverged infrastructure (HCI), systems that typically integrate storage, compute, networking and virtualization, more than a third (34 percent) of respondents are strongly considering the technology. Forty-one percent define HCI as a hardware-agnostic solution that is tightly integrated with a hypervisor, while 27 percent expect an integrated appliance in which hardware and software are tightly interlocked.

Generally, enterprises are considering HCI for their database workloads (34 percent) and their data center consolidation projects (28 percent). Another 28 percent of respondents are eyeing HCI to power their enterprise applications.

Meanwhile, IT executives remain wary of hybrid-cloud deployments.

More than half of the survey takers cited sensitive data and security as main reasons to avoid the cloud. Forty-seven percent said that they had no plans to move any type of application to the cloud, of either the public or hybrid variety.

A third (33 percent) said they expected to move some enterprise applications to the cloud, while 22 percent said the same of data analytics workloads. Cloud databases and virtual desktop infrastructure (VDI) services were appealing solutions for 21 percent and 16 percent of respondents, respectively.

Pedro Hernandez is a contributing editor at Enterprise Storage Forum. Follow him on Twitter @ecoINSITE.

Excerpt from:
Enterprises Encounter Cloud Storage Cost and Management Challenges - Enterprise Storage Forum


Google Cloud: How to Reduce Data Storage Costs and Maximize Performance – Enterprise Storage Forum

The Google Cloud Platform (GCP) has gained serious momentum in recent months. Data storage companies such as NetApp, Veritas, Cohesity, MapR, Cloudian and Nutanix are partnering with Google in an effort to broaden the appeal of their offerings. But the Google Cloud itself is a vast universe of services, storage tiers, speeds, feeds and price points.

"Google Cloud Platform offers a diverse portfolio that ranges from flexible and unified storage classes to scalable and secure databases," said Chris Talbott, head of cloud storage product marketing at Google. "It's designed to handle mission-critical workloads and connect data to any application."

As such, it consists of compute, storage, databases, machine learning, analytics, networking, big data, internet of things, developer tools, management tools and security features. With so many facets to understand, how can organizations use GCP to reduce storage costs, maximize performance and gain competitive advantage?

Some storage and IT managers have carte blanche from management to implement cloud-first strategies. They are under no pressure to get it right the first time. They are blessed with all the time in the world to figure out the best way to learn from their mistakes and eventually arrive at the right cloud architecture for their organizations. But they represent the lucky few.

In most cases, storage managers are under the gun to show some immediate return. Within a few months, they know they will be called on the carpet to show tangible results in terms of lower storage costs and smaller budgets.

The best way to achieve that, suggested Talbott, is to look for the low-hanging fruit. One likely area, for example, is tape backup. Anyone who is going to have to invest yet again in tape hardware or an upgrade to the latest tape platform should consider the cloud. As well as offering the potential for cheaper storage, the cloud opens the door to doing something with the data (such as analytics) rather than letting it gather dust in a vault.

"Think about underutilized data sets and easy wins," said Talbott. "Many organizations are currently backing up and archiving data to tape, requiring costly infrastructure to maintain and providing little value outside of a recovery event. Not only can you reduce the effort and costs of maintaining that on-premises, you can expose those datasets to other technologies in GCP like data analytics and machine learning."

Talbott said that most users are smart enough to assess current needs. They take the time to figure out how they can use cloud storage to fulfill ongoing requirements. But not enough companies look ahead to gain some idea of how their needs may evolve in the years to come. Although it is impossible to predict such things accurately, it is wise to attempt some kind of projection of storage needs at least a couple of years into the future. That might save some embarrassment when you discover in a year's time that your plan for cloud storage was hopelessly inadequate.

Storage managers are advised not to rush headlong into cloud storage decisions. They should consult other areas of IT as well as line of business heads before making any firm commitments. After all, storage is just one part of a much larger IT ecosystem. It has to be implemented sensibly in full alignment with other components and enterprise objectives.

"While the cloud can reduce the cost of storage and increase the accessibility of data, choosing the right combination of systems is essential in realizing the benefits of cloud storage," said Talbott.

In addition, Talbott recommended using the cloud to take advantage of storage tiers to make information lifecycle policies work for you. As users begin to move data to the cloud, they should try to understand what data needs to be accessed and when, so they can take advantage of different storage tiers. For example, GCP offers Nearline and Coldline archival storage. Nearline is best for data that is accessed a few times a year, but if you have data you don't need on a yearly basis, you could move it to Coldline to reduce costs.

"To optimize the storage of an object throughout its life, it will likely spend time in each tier of storage from multi-regional to cold," said Talbott. "Object lifecycle policies can automate that cascade."
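To make that concrete, here is a minimal sketch of such a rule using the google-cloud-storage Python client. The bucket name is made up, and the helper method assumes a reasonably recent version of the library:

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-archive-bucket")  # hypothetical bucket name

# Demote objects that have sat in Nearline for a year to cheaper Coldline.
bucket.add_lifecycle_set_storage_class_rule(
    "COLDLINE", age=365, matches_storage_class=["NEARLINE"]
)
bucket.patch()  # push the updated lifecycle configuration to GCS

Once such a rule is in place, the cascade Talbott describes happens server-side; no application code has to move objects between tiers.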

It's one thing to dump data into a cloud repository as a means of reducing costs. But the real value comes in how that data is managed strategically.

Various data and storage management solutions are available from vendors such as StorEdge, Red Hat, NetApp and others. While functionality varies markedly from one tool to another, in general, they are designed to help organizations maintain control over their data, where it resides, how quickly they can access it and how they can harness it in their business.

"For organizations who are looking to understand how their GCP cloud investments are meeting business expectations, NetApp's data management solutions provide them with clear visibility and insight into cost, performance and data placement to better understand the impact of IT decisions," said Michael Elliott, cloud advocate at NetApp. "Organizations can also address the challenges associated with regulatory, data security and sovereignty requirements by maintaining control of their data across its entire lifecycle."

Part of the reason so many storage businesses are partnering with Google is that it provides easy access to innovation and expanded markets.

"Collaborate closely with Google, as they are very responsive when it comes to integrating with their services," said Patrick Rogers, head of marketing and product at Cohesity. "APIs may work differently across different cloud providers, so what is possible with one provider may not exactly be the same with another."

"Greed is good," said Gordon Gekko in the movie Wall Street. Google has paraphrased that to "Green is good." Accordingly, the company believes those journeying to the cloud should look beyond cost savings, which are regarded almost universally as the typical measure of success for those adopting cloud storage.

In many cases, cost savings is just a start. This year Google will reach 100 percent renewable energy for all its operations, data centers included. This means users can reduce the environmental impact of their data storage and operations by taking advantage of data centers which use 50 percent less energy than a typical data center.

"With the exponential increase in data being stored, it becomes increasingly important to consider how green the electrons are that power its storage," said Talbott.

Photo courtesy of Shutterstock.

See the rest here:
Google Cloud: How to Reduce Data Storage Costs and Maximize Performance - Enterprise Storage Forum


Cisco Validates SwiftStack’s hybrid cloud storage software – Read IT Quik

SwiftStack, a leading provider of hybrid cloud storage solutions for enterprises, has announced that Cisco has validated its hybrid cloud storage software for the Cisco UCS S3260 storage server and integrated UCS Manager. It is the only object storage solution with solution bundle part numbers on Cisco's global price list, and is the only such solution with the option of Cisco Solution Support. The combination of ordering efficiency, end-to-end solution support, and validated designs provides for the ideal consumption experience for enterprises.

This Cisco Validated Design (CVD) status is the result of successful completion of comprehensive testing and documentation of SwiftStack by Cisco engineers to facilitate faster, more reliable and more predictable deployments for customers. CVDs provide the foundation for systems design based on common use cases or current engineering system priorities. They incorporate a broad set of technologies, features and applications to address customer needs. With SwiftStack software running on Cisco UCS Storage Servers, organizations can leverage the benefits of the public cloud while retaining the control and level of protection typically associated with private data centers running behind enterprise firewalls and security.

SwiftStack innovations power hybrid cloud storage for enterprises. Enabling freedom to move workloads between private data centers and public clouds like Amazon or Google, SwiftStack software delivers universal access to petabytes of unstructured data in a single namespace. Corporate data remains under the management control of internal IT, served by infrastructure that starts small and seamlessly scales huge within and across geographic regions. The result: pay-as-you-grow consumption of IT-managed resources on-premises and in the cloud.

"Enterprises are looking for the benefits of public cloud storage as they modernize their business workflows and leverage hybrid cloud solutions," said Don Jaworski, CEO of SwiftStack. "As a Cisco Preferred Solution Partner, we have collaborated to establish a joint solution that leverages the high density and industry-leading networking capabilities of Cisco UCS Storage Servers in a solution that enterprises can easily and confidently consume."

SwiftStack brings the fundamental attributes of public cloud resources into the enterprise infrastructure: scalability, agility, and pricing based on consumption. Legacy applications can access and consume storage via file services, and cloud-native applications via object. Users gain the freedom to move workloads between private data centers and public clouds like Amazon or Google, and from cloud to cloud, according to administrative policies. Whether on-premise or in public cloud, data remains under the management control of internal IT, residing wherever it is needed by users and applications.
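The announcement does not show the object interface itself, but the access pattern is easy to sketch. The snippet below assumes the deployment exposes an S3-compatible API, which SwiftStack offers alongside the native Swift API; the endpoint URL and credentials are hypothetical:

import boto3

# Hypothetical endpoint and credentials for an on-premises object store
# that speaks the S3 protocol.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# The same calls work unchanged against Amazon S3, which is what lets
# workloads move between private data centers and public clouds.
s3.upload_file("report.pdf", "corporate-data", "reports/report.pdf")
for obj in s3.list_objects_v2(Bucket="corporate-data").get("Contents", []):
    print(obj["Key"], obj["Size"])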

See the original post here:
Cisco Validates SwiftStack's hybrid cloud storage software - Read IT Quik


To Protect Genetic Privacy, Encrypt Your DNA – WIRED

In 2007, DNA pioneer James Watson became the first person to have his entire genome sequenced, making all of his 6 billion base pairs publicly available for research. Well, almost all of them. He left one spot blank, on the long arm of chromosome 19, where a gene called APOE lives. Certain variations in APOE increase your chances of developing Alzheimer's, and Watson wanted to keep that information private.

Except it wasn't. Researchers quickly pointed out you could predict Watson's APOE variant based on signatures in the surrounding DNA. They didn't actually do it, but database managers wasted no time in redacting another two million base pairs surrounding the APOE gene.

This is the dilemma at the heart of precision medicine: It requires people to give up some of their privacy in service of the greater scientific good. To completely eliminate the risk of outing an individual based on their DNA records, you'd have to strip it of the same identifying details that make it scientifically useful. But now, computer scientists and mathematicians are working toward an alternative solution. Instead of stripping genomic data, they're encrypting it.

Gill Bejerano leads a developmental biology lab at Stanford that investigates the genetic roots of human disease. In 2013, when he realized he needed more genomic data, his lab joined Stanford Hospital's Pediatrics Department, an arduous process that required extensive vetting and training of all his staff and equipment. This is how most institutions solve the privacy perils of data sharing. They limit who can access all the genomes in their possession to a trusted few, and only share obfuscated summary statistics more widely.

So when Bejerano found himself sitting in on a faculty talk given by Dan Boneh, head of the applied cryptography group at Stanford, he was struck with an idea. He scribbled down a mathematical formula for one of the genetic computations he uses often in his work. Afterward, he approached Boneh and showed it to him. "Could you compute these outputs without knowing the inputs?" he asked. "Sure," said Boneh.

Last week, Bejerano and Boneh published a paper in Science that did just that. Using a cryptographic genome cloaking method, the scientists were able to do things like identify responsible mutations in groups of patients with rare diseases and compare groups of patients at two medical centers to find shared mutations associated with shared symptoms, all while keeping 97 percent of each participant's unique genetic information completely hidden. They accomplished this by converting variations in each genome into a linear series of values. That allowed them to conduct any analyses they needed while only revealing genes relevant to that particular investigation.

"Just like programs have bugs, people have bugs," says Bejerano. Finding disease-causing genetic traits is a lot like spotting flaws in computer code. You have to compare code that works to code that doesn't. But genetic data is much more sensitive, and people (rightly) worry that it might be used against them by insurers, or even stolen by hackers. If a patient held the cryptographic key to their data, they could get a valuable medical diagnosis while not exposing the rest of their genome to outside threats. "You can make rules about not discriminating on the basis of genetics, or you can provide technology where you can't discriminate against people even if you wanted to," says Bejerano. "That's a much stronger statement."

The National Institutes of Health have been working toward such a technology since reidentification researchers first began connecting the dots in anonymous genomics data. In 2010, the agency founded a national center for Integrating Data for Analysis, Anonymization and Sharing housed on the campus of UC San Diego. And since 2015, iDash has been funding annual competitions to develop privacy-preserving genomics protocols. Another promising approach iDash has supported is something called fully homomorphic encryption, which allows users to run any computation they want on totally encrypted data without losing years of computing time.

Kristen Lauter, head of cryptography research at Microsoft, focuses on this form of encryption, and her team has taken home the iDash prize two years running. Critically, the method encodes the data in such a way that scientists don't lose the flexibility to perform medically useful genetic tests. Unlike previous encryption schemes, Lauter's tool preserves the underlying mathematical structure of the data. That allows computers to do the math that delivers genetic diagnoses, for example, on totally encrypted data. Scientists get a key to decode the final results, but they never see the source.
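The principle is easiest to see with a toy example. The sketch below uses the open-source python-paillier library (phe); Paillier is only additively homomorphic, not the fully homomorphic construction Lauter's team works on, but it shows the core trick of computing on data you cannot read. The numbers are invented for illustration:

# pip install phe  (python-paillier)
from phe import paillier

# The data owner generates the keypair and keeps the private key.
public_key, private_key = paillier.generate_paillier_keypair()

# Toy numeric measurements, encrypted before they leave the owner's hands.
enc_a = public_key.encrypt(12)
enc_b = public_key.encrypt(30)

# A researcher or cloud server computes on ciphertexts alone:
enc_sum = enc_a + enc_b   # homomorphic addition of two encrypted values
enc_scaled = enc_a * 3    # multiplication by a plaintext constant

# Only the key holder can decrypt the results.
assert private_key.decrypt(enc_sum) == 42
assert private_key.decrypt(enc_scaled) == 36

A fully homomorphic scheme extends this to multiplying two ciphertexts together, which is what makes arbitrary computations, genetic tests included, possible on fully encrypted genomes.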

This is extra important as more and more genetic data moves off local servers and into the cloud. The NIH lets users download human genomic data from its repositories, and in 2014, the agency started letting people store and analyze that data in private or commercial cloud environments. But under NIH's policy, it's the scientists using the data, not the cloud service provider, who are responsible for ensuring its security. Cloud providers can get hacked, or subpoenaed by law enforcement, something researchers have no control over. That is, unless there's viable encryption for data stored in the cloud.

"If we don't think about it now, in five to 10 years a lot of people's genomic information will be used in ways they did not intend," says Lauter. But encryption is a funny technology to work with, she says, one that requires building trust between researchers and consumers. "You can propose any crazy encryption you want and say it's secure. Why should anyone believe you?"

That's where federal review comes in. In July, Lauter's group, along with researchers from IBM and academic institutions around the world, launched a process to standardize homomorphic encryption protocols. The National Institute for Standards and Technology will now begin reviewing draft standards and collecting public comments. If all goes well, genomics researchers and privacy advocates might finally have something they can agree on.

Read the original:
To Protect Genetic Privacy, Encrypt Your DNA - WIRED


Cloud Encryption Market Worth 2401.9 Million USD by 2022 – Markets Insider

PUNE, India, August 23, 2017 /PRNewswire/ --

According to a new market research report "Cloud Encryption Market by Component (Solution and Service), Service Model (Infrastructure-as-a-Service, Software-as-a-Service, and Platform-as-a-Service), Organization Size, Vertical, and Region - Global Forecast to 2022", published by MarketsandMarkets, the market size is expected to grow from USD 645.4 Million in 2017 to USD 2,401.9 Million by 2022, at a Compound Annual Growth Rate (CAGR) of 30.1%.


Browse 64 Market Data Tables and 45 Figures spread through 184 Pages and in-depth TOC on "Cloud Encryption Market"

http://www.marketsandmarkets.com/Market-Reports/cloud-encryption-market-158713019.html

Early buyers will receive 10% customization on this report

The demand for cloud encryption is driven mainly by stringent government regulations and the need to protect mission-critical data residing in the cloud. With the rising demand for cloud and virtualization across different industry verticals, the adoption of cloud encryption among enterprises is expected to gain major traction during the forecast period.

The Infrastructure-as-a-Service (IaaS) model is expected to hold the largest market share

The IaaS segment includes offerings such as servers, storage, and networking infrastructure for on-premises private clouds. This infrastructure is used to run applications on the public cloud. It enables organizations to reduce the total cost of ownership, as the infrastructure is provided by third-party vendors in the form of cloud-based data centers. However, virtualization introduces new security challenges. Thus, enterprises are adopting cloud encryption solutions and services to run business-critical functions securely.
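The report does not describe any particular vendor's mechanism, but the basic client-side pattern is straightforward to sketch in Python with the cryptography and boto3 libraries. The bucket name is hypothetical, and a real deployment would keep the key in a KMS or HSM rather than generating it inline:

import boto3
from cryptography.fernet import Fernet

# Hypothetical key handling: generated locally, never sent to the cloud.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the record locally so only ciphertext ever reaches the provider.
plaintext = b"business-critical record"
ciphertext = cipher.encrypt(plaintext)

s3 = boto3.client("s3")
s3.put_object(Bucket="encrypted-data", Key="record.bin", Body=ciphertext)

# Later: download the blob and decrypt it with the locally held key.
blob = s3.get_object(Bucket="encrypted-data", Key="record.bin")["Body"].read()
assert cipher.decrypt(blob) == plaintext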

Ask for PDF Brochure @http://www.marketsandmarkets.com/pdfdownload.asp?id=158713019

The telecom and IT vertical is expected to grow at the fastest rate

The telecom and IT vertical involves high usage of cloud-based applications for business operations and is thus frequently attacked by cybercriminals. Companies in this sector are adopting cloud encryption solutions so as to provide their customers with risk-free services. The usage of cloud encryption has allowed users to save important information on their mobile devices and use that information through the cloud without any risk. Therefore, cloud encryption solutions are helping telecom and IT companies enhance their services and provide secure information to customers while complying with regulations.

North America is expected to contribute to the largest market share; Asia Pacific to grow the fastest during the forecast period

North America is expected to have the largest market share and dominate the Cloud Encryption Market from 2017 to 2022, owing to the early adoption of new and emerging technologies and the presence of a large number of players in this region. APAC offers extensive growth avenues in the Cloud Encryption Market, owing to a widespread presence of SMEs that are extensively adopting cloud technology.

The major vendors providing cloud encryption solutions and services are Thales e-Security (La Defense, France), Gemalto N.V. (Amsterdam, Netherlands), Sophos Group plc (Abingdon, UK), Symantec Corporation (California, US), Skyhigh Networks (California, US), Netskope Inc. (California, US), CipherCloud (California, US), HyTrust, Inc. (California, US), Trend Micro Incorporated (Tokyo, Japan), Vaultive, Inc. (Massachusetts, US), and TWD Industries AG (Unteriberg, Switzerland).

Enquiry Before Buying @http://www.marketsandmarkets.com/Enquiry_Before_Buying.asp?id=158713019

Browse Related Reports

Cloud Security Market by Service Type (IAM, DLP, IDS/IPS, SIEM, and Encryption), Security Type, Service Model (IaaS, PaaS, and SaaS), Deployment Type (Public, Private, and Hybrid), Organization Size, Vertical, and Region - Global Forecast to 2022
http://www.marketsandmarkets.com/Market-Reports/cloud-security-market-100018098.html

Mobile Encryption Market by Component (Solution and Services), Application (Disk Encryption, File/Folder Encryption, Communication Encryption, and Cloud Encryption), End-User Type, Deployment Type, Vertical, and Region - Global Forecast to 2022
http://www.marketsandmarkets.com/Market-Reports/mobile-encryption-market-120317676.html

Know More About our Knowledge Store @http://www.marketsandmarkets.com/Knowledgestore.asp

About MarketsandMarkets

MarketsandMarkets provides quantified B2B research on 30,000 high-growth niche opportunities/threats that will impact 70% to 80% of worldwide companies' revenues. It currently serves 5,000 customers worldwide, including 80% of global Fortune 1000 companies. Almost 75,000 top officers across eight industries worldwide approach MarketsandMarkets for their pain points around revenue decisions.

Our 850 full-time analysts and SMEs at MarketsandMarkets are tracking global high-growth markets following the "Growth Engagement Model - GEM". The GEM aims at proactive collaboration with clients to identify new opportunities, identify the most important customers, write "Attack, avoid and defend" strategies, and identify sources of incremental revenue for both the company and its competitors. MarketsandMarkets is now coming up with 1,500 MicroQuadrants (positioning top players across leaders, emerging companies, innovators, and strategic players) annually in high-growth emerging segments. MarketsandMarkets is determined to benefit more than 10,000 companies this year for their revenue planning and help them take their innovations/disruptions early to the market by providing them research ahead of the curve.

MarketsandMarkets' flagship competitive intelligence and market research platform, "RT" connects over 200,000 markets and entire value chains for deeper understanding of the unmet insights along with market sizing and forecasts of niche markets.

Contact: Mr. Rohan
MarketsandMarkets
701 Pike Street, Suite 2175
Seattle, WA 98101, United States
Tel: +1-888-600-6441
Email: sales@marketsandmarkets.com

Visit Our Blog @ http://www.marketsandmarketsblog.com/market-reports/telecom-it
Connect with us on LinkedIn @ http://www.linkedin.com/company/marketsandmarkets

Read the original post:
Cloud Encryption Market Worth 2401.9 Million USD by 2022 - Markets Insider


Why 2017 Is The Year To Understand Cloud Computing – Nasdaq

The Cloud has become a major buzzword in business for very good reason. Small businesses and large enterprises alike can take advantage of cloud computing to build and expand the computer-based infrastructure behind the scenes. Follow this guide to better understand what cloud computing is, how it works, and how you can take advantage.

In the old world of web servers and internet infrastructure, websites and other online assets were typically limited to one main server, or a few linked servers using tools called load balancers, to process and send data, whether it be a customer-facing website or an internal-facing application. The advent of content delivery networks (CDNs) powered up those servers to host and serve data from the edge of the network for faster serving and sometimes lower costs.

As computing demand exploded with the rise of the smartphone and high-speed internet, consumer and business needs downstream of those servers continue to creep upward. Cloud computing has emerged as the best option to handle an array of computing needs for startups and small businesses due to the ability to start at a low cost and scale, almost infinitely, as demand grows. Advances in cloud technology at Amazon, Google, Microsoft, IBM, Oracle, and other major cloud providers are making cloud computing more desirable for all businesses.

When cloud computing first emerged, large enterprises were the only businesses able to afford the cost of elastic, flexible computing power. Now, however, those costs are more likely a drop in the bucket for small businesses.

For example, I use the cloud to store and serve videos for Denver Flash Mob, a side hustle business I run with my wife. Our monthly bill is typically around a dollar or two, and heavy months lead to a bill around five bucks. No big deal! My lending startup Money Mola is also cloud based, with costs to run both a development server and public-facing server running us around $30 per month.

The first time I logged into Amazon Web Services (AWS) it seemed like I needed a computer science degree to use it! I had a hard time doing even basic tasks outside of uploading and sharing videos. Thankfully Amazon has made using AWS much easier, though it is not without its challenges.

I'm a pretty techy guy, so my skillset is a bit more advanced than the average computer user's. I have set up AWS to send outgoing transactional emails, automatically back up websites, and more on my own. If you are willing and able to hire a cloud expert, the possibilities of the cloud are endless. Anything from web hosting to artificial intelligence and big data analysis can run in the cloud.

The most basic way to get started with cloud computing is website and computer backups. If you use WordPress for your website, setting up cloud backups is simple with one of a handful of plugins like UpdraftPlus. If you can use the WordPress dashboard, you can set up cloud backups with UpdraftPlus. It is quick and easy and includes out-of-the-box support for services from companies like AWS, Dropbox, Google Drive, Rackspace Cloud, and others. The paid plugin version adds access to Microsoft OneDrive and Azure, Google Cloud Storage, and other options.

I run several backups of both my laptop and my web-based assets. If my home were to be burglarized or burned down, the cloud has me covered. If my laptop is stolen, I have a backup at home and in the cloud. Redundant backups are not optional; they are a must in 2017.
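For readers who want to go beyond WordPress plugins, here is a minimal do-it-yourself sketch using the boto3 AWS SDK for Python; the bucket name and backup folder are hypothetical:

import pathlib
import boto3

s3 = boto3.client("s3")  # uses your locally configured AWS credentials
backup_root = pathlib.Path.home() / "Documents"  # hypothetical folder

# Mirror every file under the folder into S3, keyed by its relative path.
for path in backup_root.rglob("*"):
    if path.is_file():
        key = f"laptop-backup/{path.relative_to(backup_root)}"
        s3.upload_file(str(path), "my-backup-bucket", key)

Schedule that with cron or Task Scheduler and you have the redundant, offsite copy described above.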

In addition to safe, secure backups, the cloud can reach far corners of the planet. Utilizing cloud-based CDNs, you know your customers will get every video and web page they want at near-instant speeds.

Let's say your business has a popular video you want to share around the world. With a cloud CDN, you upload your video once to the web. Then the CDN takes over and creates copies of that video file in data centers around the world. Whenever a customer clicks to view that video, they are served a copy from the closest data center to their location.

Thanks to the power of a CDN, you don't have to send viewers in Australia, London, Bangkok, and Buenos Aires a video from your web server in Texas. Each one gets a local copy so they get their video even faster, offering a better customer experience. App-based businesses can even run multiple versions of their app in data centers around the world. This will ensure every user has the same great experience.

It doesn't matter what your business does; there is some way the cloud can help you achieve better results. The cloud is only going to grow and become more prominent in business. Older computing methods will go the way of the fax machine. If you want serious computing success with scalability and flexibility, the cloud is your best option.

This article was originally published on Due.com.

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

Read the original post:
Why 2017 Is The Year To Understand Cloud Computing - Nasdaq


The Benefits of Multi-Cloud Computing Architectures for MSPs – MSPmentor

Multi-cloud computing architectures are the next step up from cloud computing.

If you're an MSP, it may no longer be enough to have just one cloud.

Here's why a multi-cloud strategy can help managed services providers.

As the term implies, multi-cloud computing refers to the use of more than one cloud.

A multi-cloud architecture could involve multiple public clouds -- such as AWS, Azure and Google Cloud Platform.

Multi-cloud could also take the form of a mixture of different types of clouds -- a public cloud, a private cloud and a managed cloud, for example.

In the latter sense, there is some overlap between multi-cloud architectures and hybrid architectures, which mix public and private clouds together.

Think of hybrid cloud as one form of multi-cloud computing.

Multi-cloud is a broader category, because it involves mixing clouds of many different types.

What do businesses -- and MSPs in particular -- have to gain from a multi-cloud strategy?

Consider the following advantages of a multi-cloud architecture:

Go here to see the original:
The Benefits of Multi-Cloud Computing Architectures for MSPs - MSPmentor


VMware shares to surge more than 20% because the Amazon cloud threat is overblown: Analyst – CNBC

Wall Street rarely talks about its mistakes, but Deutsche Bank admitted it overestimated the Amazon Web Services threat to VMware's business.

The firm raised its rating for VMware shares on Monday to buy from hold, saying the company's server virtualization software can continue to thrive in a cloud-computing world.

"We've spent much of the last two years worried about VMware's on-premise core server business given its maturity and the threat from AWS/Cloud adoption [Amazon Web Services]," analyst Karl Keirstead wrote in a note to clients entitled "Overcoming our AWS fears."

"This upgrade should be seen in the context of growing evidence that large enterprises are embracing a hybrid model, materially lowering the out-year risk profile of VMware shares."

The hybrid model is defined by companies using both local servers on-site and cloud-computing servers off-site. Keirstead said he realized the staying power of VMware's on-site server market was more "durable" than he originally forecast.

"We believe that large enterprises are migrating IT workloads to the public cloud model at a slower-than-expected pace and are electing to ramp spending to modernize their on-premise IT infrastructures," he wrote. "Our recent checks agree that VMware technology is proving to be more durable than they would have thought 12-18 months ago."

As a result, Keirstead increased his VMware price target to $120, which is 24 percent higher than Monday's close. His previous price target was $110.

VMware shares are outperforming the market this year. Shares have risen 23.2 percent year to date through Monday compared with the S&P 500's 8.5 percent gain.

The analyst said he is also cautiously optimistic about the VMware and Amazon AWS strategic partnership announced in October, which enables access to AWS computing power for the company's customers.

"We are positive on the deal for both parties. It is hard to imagine how this could end up being a net negative for either party," he wrote. "We conclude that the stock can still work even if the initial lift from VMware Cloud on AWS is modest."

VMware will report second-quarter earnings on Thursday after the market close. Its stock traded up 1.8 percent shortly after Tuesday's market open.

CNBC's Michael Bloom contributed to this story.

Read this article:
VMware shares to surge more than 20% because the Amazon cloud threat is overblown: Analyst - CNBC


Microsoft acquires cloud computing firm Cycle Computing to boost … – The News Minute


To accelerate big computing in the Cloud, Microsoft has acquired Cycle Computing, a leader in cloud computing orchestration, for an undisclosed sum. With this ...

Related coverage: "Cycle Computing will make Microsoft Azure more appealing to more enterprises" (TechRepublic); "Microsoft acquires cycle computing for users to accelerate to cloud" (ETCIO.com); "Microsoft acquires Cloud firm Cycle Computing" (Business Standard).

Here is the original post:
Microsoft acquires cloud computing firm Cycle Computing to boost ... - The News Minute
