by Chris Woodford. Last updated: March 19, 2017.
History has a funny way of repeating itself, or so they say. But it may come as some surprise to find this old cliché applies just as much to the history of computers as to wars, revolutions, and kings and queens. For the last three decades, one trend in computing has been loud and clear: big, centralized, mainframe systems have been "out"; personalized, power-to-the-people, do-it-yourself PCs have been "in." Before personal computers took off in the early 1980s, if your company needed sales or payroll figures calculating in a hurry, you'd most likely have bought in "data-processing" services from another company, with its own expensive computer systems, that specialized in number crunching; these days, you can do the job just as easily on your desktop with off-the-shelf software. Or can you? In a striking throwback to the 1970s, many companies are finding, once again, that buying in computer services makes more business sense than do-it-yourself. This new trend is called cloud computing and, not surprisingly, it's linked to the Internet's inexorable rise. What is cloud computing? How does it work? Let's take a closer look!
Photo: Cloud computing: the hardware, software, and applications you're using may be anywhere up in the "cloud." As long as it all does what you want, you don't need to worry where it is or how it works.
Cloud computing means that instead of all the computer hardware and software you're using sitting on your desktop, or somewhere inside your company's network, it's provided for you as a service by another company and accessed over the Internet, usually in a completely seamless way. Exactly where the hardware and software is located and how it all works doesn't matter to you, the user: it's just somewhere up in the nebulous "cloud" that the Internet represents.
Cloud computing is a buzzword that means different things to different people. For some, it's just another way of describing IT (information technology) "outsourcing"; others use it to mean any computing service provided over the Internet or a similar network; and some define it as any bought-in computer service you use that sits outside your firewall. However we define cloud computing, there's no doubt it makes most sense when we stop talking about abstract definitions and look at some simple, real examples, so let's do just that.
Screenshot: SoundCloud, one of my favorite examples of a website (and mobile app) that uses cloud computing to good effect. Musicians and DJs upload their music, which "followers" can listen to (or preview) for free through real-time streaming. You can build up a personal collection of tracks you like and access them from any device, anytime, anywhere. The music you listen to stays up in the cloud: in theory, there is only ever one copy of every music file that's uploaded. Where is the music stored? No-one but SoundCloud needs to know, or care.
Most of us use cloud computing all day long without realizing it. When you sit at your PC and type a query into Google, the computer on your desk isn't playing much part in finding the answers you need: it's no more than a messenger. The words you type are swiftly shuttled over the Net to one of Google's hundreds of thousands of clustered PCs, which dig out your results and send them promptly back to you. When you do a Google search, the real work in finding your answers might be done by a computer sitting in California, Dublin, Tokyo, or Beijing; you don't know, and most likely you don't care!
The same applies to Web-based email. Once upon a time, email was something you could only send and receive using a program running on your PC (sometimes called a mail client). But then Web-based services such as Hotmail came along and carried email off into the cloud. Now we're all used to the idea that emails can be stored and processed through a server in some remote part of the world, easily accessible from a Web browser, wherever we happen to be. Pushing email off into the cloud makes it supremely convenient for busy people, constantly on the move.
Preparing documents over the Net is a newer example of cloud computing. Simply log on to a web-based service such as Google Documents and you can create a document, spreadsheet, presentation, or whatever you like using Web-based software. Instead of typing your words into a program like Microsoft Word or OpenOffice, running on your computer, you're using similar software running on a PC at one of Google's world-wide data centers. Like an email drafted on Hotmail, the document you produce is stored remotely, on a Web server, so you can access it from any Internet-connected computer, anywhere in the world, any time you like. Do you know where it's stored? No! Do you care where it's stored? Again, no! Using a Web-based service like this means you're "contracting out" or "outsourcing" some of your computing needs to a company such as Google: they pay the cost of developing the software and keeping it up-to-date, and they earn back the money to do this through advertising and other paid-for services.
“You don’t generate your own electricity. Why generate your own computing?”
Jeff Bezos, Amazon.
Most importantly, the service you use is provided by someone else and managed on your behalf. If you're using Google Documents, you don't have to worry about buying umpteen licenses for word-processing software or keeping them up-to-date. Nor do you have to worry about viruses that might affect your computer or about backing up the files you create. Google does all that for you. One basic principle of cloud computing is that you no longer need to worry how the service you're buying is provided: with Web-based services, you simply concentrate on whatever your job is and leave the problem of providing dependable computing to someone else.
Cloud services are available on demand and often bought on a "pay-as-you-go" or subscription basis. So you typically buy cloud computing the same way you'd buy electricity, telephone services, or Internet access from a utility company. Sometimes cloud computing is free or paid for in other ways (Hotmail is subsidized by advertising, for example). Just like electricity, you can buy as much or as little of a cloud computing service as you need from one day to the next. That's great if your needs vary unpredictably: it means you don't have to buy your own gigantic computer system and risk having it sitting there doing nothing.
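To make the "pay-as-you-go" idea concrete, here's a minimal sketch in Python. The hourly rate and usage figures are invented for illustration, not taken from any real provider's price list.

```python
HOURLY_RATE = 0.10  # hypothetical cost per server-hour, in dollars

def monthly_cloud_bill(daily_server_hours):
    """Total the bill for a month of on-demand usage: you pay only
    for the server-hours you actually consume, day by day."""
    return sum(hours * HOURLY_RATE for hours in daily_server_hours)

# A business with spiky demand: one server ticking over for 29 quiet
# days (24 hours each), then ten servers for one big sale day.
usage = [24] * 29 + [10 * 24]
print(monthly_cloud_bill(usage))  # pay for 936 hours used, not for 10 idle servers
```

The point of the sketch: with your own hardware you'd have to buy ten servers to survive the sale day and leave nine of them idle the rest of the month; with the utility model, the quiet days cost almost nothing.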
Now that we all have PCs on our desks, we're used to having complete control over our computer systems, and complete responsibility for them as well. Cloud computing changes all that. It comes in two basic flavors, public and private, which are the cloud equivalents of the Internet and Intranets. Web-based email and free services like the ones Google provides are the most familiar examples of public clouds. The world's biggest online retailer, Amazon, became the world's largest provider of public cloud computing in early 2006. When it found it was using only a fraction of its huge, global computing power, it started renting out its spare capacity over the Net through a new entity called Amazon Web Services (AWS). Private cloud computing works in much the same way, but you access the resources you use through secure network connections, much like an Intranet. Companies such as Amazon also let you use their publicly accessible cloud to make your own secure private cloud, known as a Virtual Private Cloud (VPC), using virtual private network (VPN) connections.
IT people talk about three different kinds of cloud computing, where different services are being provided for you: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Note that there's a certain amount of vagueness about how these things are defined and some overlap between them.
What’s good and bad about cloud computing?
The pros of cloud computing are obvious and compelling. If your business is selling books or repairing shoes, why get involved in the nitty-gritty of buying and maintaining a complex computer system? If you run an insurance office, do you really want your sales agents wasting time running anti-virus software, upgrading word-processors, or worrying about hard-drive crashes? Do you really want them cluttering your expensive computers with their personal emails, illegally shared MP3 files, and naughty YouTube videos, when you could leave that responsibility to someone else? Cloud computing allows you to buy in only the services you want, when you want them, cutting the upfront capital costs of computers and peripherals. You avoid equipment going out of date and other familiar IT problems like ensuring system security and reliability. You can add extra services (or take them away) at a moment's notice as your business needs change. It's really quick and easy to add new applications or services to your business without waiting weeks or months for the new computer (and its software) to arrive.
Photos: Cloud computing: forward to the future… or back to the past? In the 1970s, the Apple ][ became the world's first best-selling small business computer, thanks to a killer application called VisiCalc, the first widely available computer spreadsheet. It revolutionized business computing, giving middle managers the power to crunch business data on their desktops, all by themselves, without relying on slow, centralized computer departments or bought-in data processing. Critics are concerned that cloud computing could be disempowering: a throwback to the 1970s world of centralized, proprietary computing.
Instant convenience comes at a price. Instead of purchasing computers and software, cloud computing means you buy services, so one-off, upfront capital costs become ongoing operating costs instead. That might work out much more expensive in the long term.
If you’re using software as a service (for example, writing a report using an online word processor or sending emailsthrough webmail), you need a reliable, high-speed, broadband Internet connectionfunctioning the whole time you’re working. That’s something we take for granted in countries such as the United States, but it’s much more of an issue in developing countries or rural areas where broadband is unavailable.
If you’re buying in services, you can buy only what people are providing, so you may be restrictedto off-the-peg solutions rather than ones that precisely meet your needs.Not only that, but you’re completely at the mercy of your suppliers if they suddenlydecide to stop supporting a product you’ve come to depend on.(Google, for example, upset many users when it announced in September 2012 that its cloud-based Google Docs would drop support for oldbut de facto standard Microsoft Office file formats such as .DOC, .XLS, and .PPT, giving a mere one week’s noticeof the changealthough, after public pressure, it later extended the deadlineby three months.) Critics charge that cloud-computing is a return to the bad-old days of mainframes andproprietary systems, where businesses are locked into unsuitable, long-termarrangements with big, inflexible companies. Instead of using”generative” systems (ones that can be added to and extended in exciting waysthe developers never envisaged), you’re effectively using “dumb terminals” whose uses are severelylimited by the supplier. Good for convenience and security, perhaps, but whatwill you lose in flexibility? And is such a restrained approach good for the futureof the Internet as a whole? (To see why it may not be, take a look at Jonathan Zittrain’s eloquent bookThe Future of the InternetAnd How to Stop It.)
Think of cloud computing as renting a fully serviced flat instead of buying a home of your own. Clearly there are advantages in terms of convenience, but there are huge restrictions on how you can live and what you can alter. Will it automatically work out better and cheaper for you in the long term?
We’ve just had a quick and simple sketch of cloud computingand if that’s allyou need, you can stop reading now. This section fills in some of the details, asks some deeper questions,looks at current trends, such as the shift to mobile devices, and explores challenging issues like privacy and security.
The figures speak for themselves: in every IT survey, news report, and pundit's op-ed, cloud computing seems the only show in town. Back in 2008, almost a decade ago, the Pew Internet project reported that 69 percent of all Internet users had "either stored data online or used a web-based software application" (in other words, by their definition, used some form of cloud computing). In 2009, Gartner priced the value of cloud computing at $58.6 billion, in 2010 at $68.3 billion, and in 2012 at over $102 billion. In 2013, management consultants McKinsey and Company forecast that cloud computing (and related trends like big data, growing mobilization, and the Internet of Things) could have a "collective economic impact" of between $10 trillion and $20 trillion by 2025. In 2016, Amazon revealed that its AWS offshoot, the world's biggest provider of cloud computing, is now a $10 billion-a-year business; the Microsoft Cloud isn't far behind.
So the numbers keep on creeping up and it's an exciting trend, to be sure. But there's one important word of caution: how you measure and forecast something as vague as "the cloud" depends on how you define it: if the definition keeps expanding, perhaps that's one reason why the market keeps expanding too? Way back in the 1990s, no-one described Yahoo! Mail or Hotmail as examples of cloud computing, Geocities was simply a community of amateur websites, and Amazon and eBay were just new ways of finding and buying old stuff. In 2010, in its breathless eagerness to talk up cloud computing, the Pew Internet project had rounded up every web-based service and application it could think of and fired it to the sky. WordPress and Twitter were examples of cloud blogging, Google Docs and Gmail were cloud-based, and suddenly so too were Yahoo! Mail, buying things from eBay and Amazon, and even (bizarrely) RSS feeds (which date back to the late 1990s). Using "the cloud" as a loose synonym for "the Web," then expressing astonishment that it's growing so fast, seems tautologous at best, since we know the Internet and Web have grown simply by virtue of having more connected users and more (especially more mobile) devices. According to Pew, what these users prized were things like easy access to services from absolutely anywhere and simple data storing or sharing. This is a circular argument as well: one reason we like "the cloud" is because we've defined it as a bunch of likeable websites: Facebook, Twitter, Gmail, and all the rest.
Businesses have shrewder and more interesting reasons for liking the cloud. Instead of depending on Microsoft Office, to give one very concrete example, they can use free, cloud-based alternatives such as Google Docs. So there are obvious cost and practical advantages: you don't have to worry about expensive software licenses or security updates, and your staff can simply and securely share documents across business locations (and work on them just as easily from home). Using cloud computing to run applications has a similarly compelling business case: you can buy in as much (or as little) computing resource as you need at any given moment, so there's no problem of having to fund expensive infrastructure upfront. If you run something like an ecommerce website on cloud hosting, you can scale it up or down for the holiday season or the sales, just as you need to. Best of all, you don't need a geeky IT department because, beyond commodity computers running open-source web browsers, you don't need IT.
When we say cloud computing is growing, do we simply mean that more people (and more businesses) are using the Web (and using it to do more) than they used to? Actually we do, and that's why it's important not to be too loose with our definitions. Cloud web hosting is much more sophisticated than ordinary web hosting, for example, even though, from the viewpoint of the webmaster and the person accessing a website, both work in almost exactly the same way. This web page is coming to you courtesy of cloud hosting where, a decade ago, it ran on a simple, standalone server. It's running on the same open-source Apache server software that it used then and you can access it in exactly the same way (with HTTP and HTML). The difference is that it can cope with a sudden spike in traffic in the way it couldn't back then: if everyone in the United States accessed this web page at the same time, the grid of servers hosting it would simply scale and manage the demand intelligently. The photos and graphics on the page (and some of the other technical stuff that happens behind the scenes) are served from a cloud-based Content Delivery Network (CDN): each file comes from a server in Washington, DC, Singapore, London, or Mumbai, or a bunch of other "edge locations," depending on where in the world you (the browser) happen to be.
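As a rough sketch of how an edge location might be chosen, here's some illustrative Python. Real CDNs route requests with DNS, anycast, and live network measurements rather than raw geography, and the edge list and coordinates below are invented approximations for the example.

```python
import math

# Hypothetical edge locations, as (latitude, longitude) pairs.
EDGES = {
    "Washington DC": (38.9, -77.0),
    "London": (51.5, -0.1),
    "Singapore": (1.35, 103.8),
    "Mumbai": (19.1, 72.9),
}

def haversine(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(client_latlon):
    """Serve the file from whichever edge is physically closest."""
    return min(EDGES, key=lambda name: haversine(client_latlon, EDGES[name]))

print(nearest_edge((48.8, 2.3)))  # a browser in Paris is served from "London"
```

The user never sees any of this: the same URL quietly resolves to a different server depending on where the request comes from, which is exactly the seamlessness described above.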
This example illustrates three key points of difference between cloud-based services and applications and similar ones accessed over the web. One is the concept of elasticity (which is a similar idea to scalability): a cloud service or application isn't limited to what a particular server can cope with; it can automatically expand or contract its capacity as needed. Another is the dynamic nature of cloud services: they're not provided from a single, static server. A third, related idea is that cloud services are seamless: whether you're a developer or an end user, everything looks the same, however, wherever, and with whatever device you use it.
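The elasticity idea can be sketched in a few lines of Python. The per-server capacity figure is invented, and a real autoscaler would also smooth out brief spikes and respect minimum and maximum pool sizes; this is just the core "capacity follows demand" loop.

```python
REQUESTS_PER_SERVER = 100  # hypothetical capacity of a single server

def servers_needed(requests_per_second):
    """Elastically size the server pool: enough servers to cover the
    current load (rounded up), but never fewer than one."""
    return max(1, -(-requests_per_second // REQUESTS_PER_SERVER))  # ceiling division

# Capacity expands for the spike, then contracts again afterward.
for load in (50, 250, 1000, 30):
    print(load, "req/s ->", servers_needed(load), "server(s)")
```

Contrast this with a fixed server: it has one capacity, chosen in advance, and either sits underused or falls over when demand exceeds it.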
Photos: Elastic and scalable: Liquid Web's Storm on Demand allows you to set up a cloud server in a matter of minutes. With a couple of mouse clicks, you can resize your server (upgrade or downgrade the memory, for example) to cope with changes in demand, for example in the run-up to a Black Friday sale. Every aspect of the service is pay-as-you-go. It's easy to use even if you have little or no experience of setting up or managing dedicated servers.
One of the biggest single drivers of cloud computing is the huge shift away from desktop computers to mobile devices, which (historically, at least) had much less processing power onboard. Web-connected smartphones, tablets, Kindles, and other mobile devices are examples of what used to be called "thin clients" or "network computers" because they rely on the servers they're connected to, via the network (usually the Internet), to do most of the work. A related trend, referred to as bring your own device (BYOD), reflects the way that many companies now allow their employees to log on to corporate networks or websites using their own laptops, tablets, and smartphones.
From the smartphone in your pocket to the mythical fridge that orders your milk, the number and range of devices connected to the Internet is increasing all the time. A new trend called the Internet of Things anticipates a massive increase in connected devices as everyday objects and things with built-in sensors (home heating controllers, home security webcams, and even parcels in transit) get their own IP addresses and become capable of sending and receiving data to anything or anyone else that's online. That will fuel the demand for cloud computing even more.
Photo: Mobile cloud: The shift to mobile devices and the growth of cloud computing are mutually reinforcing trends. Mobile devices are much more useful thanks to cloud-based apps like these, provided by Google. In other words, one reason for buying a mobile is because of the extra (cloud-based) things you can do with it. But these services are also thriving because they have ever-increasing numbers of users, many of them on smartphones.
How significant is the shift to mobile? By any measurement, phenomenal and dramatic. Bearing in mind that there was only one functioning mobile phone in 1973, when Martin Cooper made the first cellphone call, it's staggering to find that there are now an estimated 8 billion mobile subscriptions (more than one for every person on the planet). By 2012, Goldman Sachs was telling us that 66 percent of Net-connected devices were mobiles, compared to just 29 percent desktops. Mobile Internet traffic finally overtook desktop traffic in 2014/15, according to Comscore, and, in response, Google rolled out a "mobile-friendly" algorithm in 2015 to encourage webmasters to optimize their sites so they worked equally well on smartphones.
Cloud computing makes it possible for cellphones to be smartphones and for tablets to do the sorts of things that we used to do on desktops, but it also encourages us to do more things with those devices, and so on, in a virtuous circle. For example, if you buy a smartphone, you don't simply do things on your phone that you used to do on your PC: you spend more time online overall, using apps and services that you previously wouldn't have used at all. Cloud computing made mobile devices feasible, so people bought them in large numbers, driving the development of more mobile apps and better mobile devices, and so on.
Stare high to the sky and you can watch clouds drift by or, if you're more scientific and nuanced, start to tease out the differences between cumulus, cirrus, and stratus. In much the same way, computing aficionados draw a distinction between different types of cloud. Public clouds are provided by people such as Amazon, Google, and IBM: in theory, all users share space and time on the same cloud and access it the same way. Many companies, for example, use Gmail to power their Internet mail and share documents using Google Drive, in pretty much the same way that you or I might do so as individuals. Private clouds work technically the same way but service a single company and are either managed exclusively by that company or by one of the big cloud providers on their behalf. They're fully integrated with the company's existing networks, Intranet, databases, and infrastructure, and span countries or continents in much the same way. Increasingly, companies find neither of these bald alternatives quite fits the bill (they need elements of each), so they opt for hybrid clouds that combine the best of both worlds, hooking up their existing IT infrastructure to a public cloud system provided by someone like Amazon or Google. Other trends to watch include the development of personal clouds, where you configure your own home network to work like a mini-cloud (so, for example, all your mobile devices can store and access files seamlessly), and peer-to-peer cloud computing, in which the dynamic, scalable power of a cloud computing system is provided not by giant data centers but by many individual, geographically dispersed computers arriving on the network, temporarily contributing to it, and then leaving again (as already happens with collaborative science projects like SETI@home and ClimatePrediction.net).
Security has always been an obvious concern for people who use cloud computing: if your data is remote and traveling back and forth over the Internet, what's to keep it safe? Perhaps surprisingly, many IT professionals think cloud-based systems are actually more secure than conventional ones. If you're buying into Google's, Amazon's, or Microsoft's cloud-based services, you're also buying into world-class expertise at keeping data safe; could you, or your IT team, manage security as well? Security can therefore be seen as a compelling reason to migrate to cloud-based systems rather than a reason to avoid them.
Privacy is a more nuanced and complex issue. While we all understand what we mean by keeping data secure, what do we mean by keeping it private in a world where users of cloud-based services like Twitter, Instagram, and Snapchat happily share anything and everything online? One of the complications is so-called big data, the statistical ("analytic") information that companies like Google and Facebook gather about the way we use their cloud-based services (and other websites that use those services). Google, for example, collects huge amounts of data through its advertising platforms and no-one knows exactly what happens to it afterward. Facebook knows an enormous amount about what people say they "like," which means it can compile detailed profiles of all its users. And Twitter knows what you tweet, retweet, and favorite, so it has similar data to Facebook. The quid pro quo for "free" web-based services and apps is that you pay for what you use with a loss of privacy, typically to power targeted advertisements.
Another complication is that privacy means different things in different parts of the world. In Europe, for example, the European Union has strict restrictions on how data can be moved in bulk from one country to another or shared by companies like Google that have multiple subsidiaries operating across countries and continents. While Internet-based cloud computing makes national boundaries obsolete, real-world laws still operate according to old-fashioned geography, and that could act as a serious brake on the aspirations of many big cloud providers.
When it comes to the everyday web services we all enjoy, there may be different kinds of clouds on the horizon. As web-based advertising dwindles in effectiveness, one future concern must be how companies like Google, Facebook, and Twitter will continue to fund their ever-growing, (essentially) cloud-based services without using our data in increasingly dubious ways. Part of the reason for the huge growth in popularity of services like this is simply that they're free. Would Facebook be so popular if we had to pay for it through a monthly subscription? If Google Docs cost money, would we slink back to our desktop PCs and Microsoft Word? Can advertising continue to sustain an ever-growing field of cloud-based services and apps as the number of Internet users and Net-connected devices continues to grow? Watch this space!
In theory, cloud computing is environmentally friendly because it uses fewer resources (servers, cooling systems, and all the rest) and less energy if 10 people share an efficiently run, centralized, cloud-based system than if each of them runs their own inefficient local system. One hosting provider in the UK told me that his company has embraced cloud systems because it means they can handle more customers on far fewer physical servers, with big savings in equipment, maintenance, and energy costs. In theory, cloud computing should be a big win for the environment; in practice, it's not quite so simple.
Ironically, given the way we've defined cloud computing, it matters where your cloud servers are located and how they're powered. If they're in data centers powered by coal, instead of cleaner fuels such as natural gas or (better still) renewable energy, the overall environmental impact could be worse than your current setup. There's been a lot of debate about the energy use of huge data centers, partly thanks to Greenpeace highlighting the issue once a year since 2009. In its 2011 report [PDF] How Dirty is Your Data Center: A Look at the Energy Choices that Power Cloud Computing, it ranked cloud computing providers like Akamai and Amazon on eco-friendliness, alongside companies like Facebook, Google, and Twitter whose services are underpinned by a massive global network of data centers. By 2017, in a report called Clicking Clean, Greenpeace was congratulating around 20 of the biggest data center operators (including Apple, Facebook, and Google) for starting on the path toward 100 percent renewable energy. In the United States in particular, quite a few cloud (and web hosting) providers explicitly state whether their servers are powered by conventional or green energy, and it's relatively easy to find carbon-neutral service providers if that's an important factor for your business and its CSR (corporate social responsibility) objectives.
Chart: Growth in energy use in data centers from 2007 to 2013. Drawn by us using data from the 2012 study by DatacenterDynamics (DCD) Intelligence published in Computer Weekly, October 8, 2012. I’ve struggled to find figures for the years from 2014 onward; as soon as I do, I’ll update the chart!
When it comes to overall impact on the planet, there's another issue to consider. If cloud services simply move things you would do in your own office or home to the cloud, that's one thing; the environmental impact merely transfers elsewhere. But a lot of cloud- and Internet-based services are encouraging us to use more computers and gadgets like iPads and iPhones for longer, spending more time online, and doing more things that we didn't previously do at all. In that sense, cloud computing is helping to increase global energy use and greenhouse gas emissions, so describing it as environmentally friendly is highly misleading. That was evident from a 2012 study by DatacenterDynamics (DCD) Intelligence, the British Computer Society, and partners (reported in Computer Weekly), which showed that global energy use from data centers grew from 12 gigawatts (GW) in 2007 to 24GW in 2011 and predicted it would reach 43GW some time in 2013. However, a follow-up study revealed a significant slowing down of the rate of growth in cloud power consumption, from 19 percent in 2011/12 to around 7 percent in 2013. Growing concerns about the impact of cloud computing have also prompted imaginative new solutions. Later in 2013, for example, researchers at Trinity College Dublin and IBM announced they'd found a way to reduce cloud emissions by over 20 percent by using smart load-balancing algorithms to spread out data processing between different data centers. Even so, with cloud computing predicted to become a $5 trillion business by 2020, power consumption seems certain to go on increasing. Ultimately, for the global environment, the bottom-line trend (ever-increasing energy consumption) is the one that matters. It's no good congratulating yourself on switching to diet cola if you're drinking four times more of it than you used to.
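In its simplest form, a carbon-aware load balancer of the kind researchers have described could just route flexible batch work to whichever data center currently has the cleanest electricity. This toy Python version uses invented carbon-intensity figures and ignores latency, data locality, and cost, all of which a real scheduler would weigh up too.

```python
def greenest_center(carbon_intensity):
    """Pick the data center whose grid electricity currently has the
    lowest carbon intensity (grams of CO2 per kWh)."""
    return min(carbon_intensity, key=carbon_intensity.get)

# Hypothetical snapshot of grid carbon intensity at three sites.
snapshot = {"Virginia": 420, "Dublin": 300, "Oregon": 120}
print(greenest_center(snapshot))  # the batch job goes to "Oregon"
```

Because grid carbon intensity changes hour by hour (as wind and solar output varies), re-running the same decision on a fresh snapshot can shift work around the world to follow clean power.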
In 2016, Peter Judge of DatacenterDynamics summed things up pithily: “No one talks much about total energy used by data centers because the figures you get for that are annoying, depressing and frustrating…. The truth is: data center power is out of control.”
From Google searches to Facebook updates and super-convenient Hotmail, most of us value the benefits of cloud computing very highly, so the energy consumption of data centers is bound to increase, and ensuring those big, power-hungry servers are fueled by green energy will become increasingly important in the years to come.
Today’s corporate officers and IT technologists are literally deluged with marketing hype over cloud, and the economic advantages of migrating business and IT functions and applications to the cloud.
We have seen corporate giants like Netflix and Sabre (the global reservation system used by major airline carriers) experience cloud-based system outages, causing customer delays and significantly impacting business, not just in the US but globally in the case of Sabre.
There are hundreds of cloud providers to choose from in the market today. Not all are enterprise-ready, despite their claims. And let's face it, many have paid-for or biased positioning in leading independent studies released by highly respected IT research companies.
How does a CIO or CTO, or today's CEO, tasked to identify the essential drivers of a business, navigate? Leading independent think tanks (Gartner, Aberdeen, IDC, etc.) are not fully equipped to provide battle-tested, deep, and thorough analysis of the critical issues of cloud security, network throughput, reliability, and ROI when it comes to enterprise-grade customers. Researchers can only convey what they read and disseminate their opinions, not facts. Gartner (NYSE:IT) even states in its disclaimer that all analysis is "opinion and not based on facts."
It’s time that we get real answers and share actual experiences from companies who are using enterprise grade cloud and from true technologists that has the experience and credentials to provide an unbiased roadmap.
GOAL: We create leaner, more efficient companies that are secure, reliable, and have the ROI to start training our current employees and bringing jobs back to the US workforce.
Continue reading here:
Doug H. – Boston Cloud Computing Meetup (Boston, MA) | Meetup
In the second session on cloud computing of the III RNP Forum, representatives from hospitals and Federal Institutes of Higher Education (Institutos Federais de Ensino Superior – IFES) and of Education, Science and Technology (IFs) spoke about what they expect from cloud computing resources.
Gabriel Pereira da Silva, professor at the Federal University of Rio de Janeiro (UFRJ) and its Superintendent of Information Technology and Communication, expressed the universities’ desire: “We want a cloud that will solve all our problems, where we can put all our systems.”
One of the cloud applications that UFRJ offers to its community is OJS, the electronic journal service, which serves the research area. For Gabriel, the Capes (Coordination for the Improvement of Higher Education Personnel) Journals Portal is an important element for the community, but it overlooks national research output, which is increasingly published electronically. “Therefore, we offer the OJS,” he said.
Carlos Thiago Garantizado, from the Federal Institute of Amazonas (IFAM), presented the IFs’ perspective on deploying cloud services and applications. “We work with infrastructure, platform, and service. The biggest challenge is to provide security. Not only to provide it, but to transmit this security to the user,” he affirmed.
Presenting a SWOT matrix of cloud deployment in the IFs, he highlighted the Federated Academic Community (CAFe) service and the teams’ technical expertise as aids to the institutes’ integration.
Moderated by Adenilson Raniery Pontes, from the Emílio Goeldi Museum of Pará (MPEG), the panel also included Marco Antonio Gutierrez, who heads the Computer Service and the Medical Informatics Laboratory of the Heart Institute (Instituto do Coração – Incor).
Gutierrez noted that health information systems must be made available very quickly. According to him, open cloud solutions cannot be used in the hospital area: “We need to ensure the confidentiality and secrecy of the information.”
At the end of his speech, the officer explained the economic constraints of the healthcare industry regarding cloud computing. “The investment in technology within hospitals is still seen as a cost and not as an investment. Therefore, we cannot evolve into private cloud solutions due to financial issues,” he stated.
Read the original here:
Cloud computing at Ifes, IFs, and hospitals | RNP
Cisco has also faced stiffening competition from rivals like the software maker VMware, which announced a partnership with Amazon last year.
Cisco and Google executives vowed to offer something different. They said companies have been struggling with the fact that they need separate tools to manage software on their own premises and software running in the cloud, a situation that sometimes causes security problems. By combining Google programming technology with Cisco networking and security software, they said, tech managers can create and manage software that can run securely in or outside their companies’ data centers.
The idea, said Urs Hölzle, Google’s senior vice president for technical infrastructure, is to close those security gaps.
Cloud computing has been roiling the strategies of older tech companies for much of the past decade. The concept, besides letting customers sidestep the costs of buying hardware and software, can let companies deploy computing resources more quickly and flexibly.
Amazon Web Services pioneered the concept. Synergy Research Group, a market research firm, said in July that A.W.S. accounted for 34 percent of the roughly $11 billion spent on such cloud services in the second quarter, compared with 11 percent for Microsoft, 8 percent for IBM and 5 percent for Google. Amazon and Microsoft are expected to highlight progress in their cloud businesses when they report quarterly earnings on Thursday.
Google has moved aggressively to catch up. In late 2015, the company gave the job of running its cloud business to Diane Greene, a widely respected Silicon Valley entrepreneur who helped make VMware’s technology a mainstay at many corporations.
She made a series of organizational changes, recruited new talent and introduced new technology features. In one important move, Google in September 2016 bought the start-up Apigee Corporation for $625 million, adding capabilities to help customers connect their operations with online services operated by others.
More mature technology companies have taken different tacks to try to hold on to customers. Some, like IBM and Oracle, offer their own cloud services. Others, like Hewlett Packard Enterprise and Dell Technologies, have shied away from engaging in a spending war in data centers against deep-pocketed internet giants.
So has Cisco. The company, based in San Jose, Calif., promoted a concept called “intercloud” that amounted to coordinating a federation of cloud services operated by partners.
But Cisco dropped that approach last year, choosing instead to help customers manage “hybrid cloud” arrangements, industry parlance for using a blend of operations in a company’s own data centers and those run by a growing number of cloud services.
“We think we are one of the few companies that can navigate this multi-cloud world,” said David Goeckeler, executive vice president and general manager of Cisco’s networking and security business.
The company has broadly signaled plans to rely more on software and services than on sales of networking hardware, aided frequently by acquisitions. On Monday, for example, Cisco said it would pay $1.9 billion for BroadSoft, which sells online communications services.
Other companies also have embraced the hybrid cloud concept. Microsoft, for example, has longtime ties with corporate software buyers and has come up with ways to run new cloud applications in its data centers or on customers’ premises, said Al Gillen, an analyst at the research firm IDC.
“We see other vendors doing things to compete since what we have is so strong and so unique,” said Julia White, a corporate vice president with Microsoft’s Azure cloud business.
VMware, a subsidiary of Dell, was first known for software technology called virtualization, which allows more efficient use of servers, but now competes with Cisco in networking software. Russ Currie, a vice president of enterprise strategy at the network monitoring specialist NetScout Systems, said VMware was effectively using its cloud alliance with A.W.S. to court customers. Pat Gelsinger, VMware’s chief executive, called the announcement from Google and Cisco a validation of his own company’s vision.
Cisco also cooperates in various ways with A.W.S. and Microsoft in cloud computing. But Mr. Goeckeler said that the Google relationship was particularly potent because of the technological specialties of each company.
“We are both users of each other’s products,” said Mr. Hölzle of Google. “But in this case, this is about working together to give their customers the technology they want,” he said.
Read this article:
Cisco and Google Find Mutual Interest in Cloud Computing …
There are several angles you can take if you want to add a few cloud computing stocks to your portfolio. But if you want to bet on the leaders in the business, you should start with Amazon.com, Microsoft, and Alphabet.
Research firm Gartner predicts that the public cloud services market will be worth $383.3 billion by 2020, a massive opportunity that makes it crystal clear why these three key players are in the middle of a fierce battle for dominance. There are plenty of smaller players in this segment, all offering varying types of services, but let’s focus on the giants and how they plan to compete in this space.
Table: Share of the public cloud market held by each company. Data source: Yahoo Finance and Business Insider.
Don’t feel bad if you’re not exactly sure what cloud computing is. The term can actually refer to several different types of cloud services, including software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS).
Microsoft’s Office 365 is one of the best examples of SaaS, while PaaS includes things like Google’s App Engine tools, which allow developers to build their own products and services. Meanwhile, IaaS involves things like networking features, virtual machines, and actual data storage — which Amazon, Microsoft, and Google all offer.
In this article, we’ll use the general “cloud computing” term to refer to a combination of all of these services, because each of the big three players offers SaaS, PaaS, and IaaS.
Amazon Web Services (AWS) is the clear leader right now with 40% of the public cloud computing market, according to Synergy Research Group. So investors accustomed to thinking about it as an e-commerce play should look closer. Sure, sales from the company’s retail platform in North America accounted for nearly 59% of Amazon’s revenue in the second quarter, but AWS generates far more of the profit. Amazon’s North American e-commerce sales brought in $436 million in operating income in the second quarter of this year, while AWS hit $916 million.
AWS is the real breadwinner because of its hefty 25% margins and its stellar growth. Its revenue jumped by 70% in 2015 and 55% in 2016.
Amazon isn’t a pure play in the segment, but it’s already proved that it can do both e-commerce and cloud computing very well — and at the same time. AWS revenue grew by 58% year over year in Q2, and it’s likely to continue growing as a key contributor to the company’s top and bottom lines.
Microsoft may be known for its software prowess, but its Azure Cloud Services unit currently holds the No. 2 spot in public cloud computing with an 11% market share.
The cloud computing segment brought in $18.9 billion in fiscal 2017 — a 56% year-over-year increase — and it’s on track to hit $20 billion annually by the end of fiscal 2018. It’s worth pointing out that Microsoft lumps all of its cloud computing revenue (including Office 365 software) into one segment, which pushes its cloud revenue higher than Amazon’s.
Azure is important to Microsoft because it helps lock the company’s enterprise customers into its other services like Windows Server. Without Azure, Microsoft could easily lose Windows Server clients and forfeit significant yearly sales. For example, Microsoft said in its fiscal fourth quarter that server products and cloud services revenue jumped by 15%, which was “driven by Azure revenue growth.”
Microsoft CEO Satya Nadella said back in 2014 that the company would take a “mobile first, cloud first” approach, and so far it has implemented the last part of that strategy quite well. Microsoft may trail Amazon, but it’s still far ahead of its next closest competitor, Alphabet. And the company’s long enterprise experience should help keep it solidly in that No. 2 spot.
It’s a bit unusual to have occasion to describe Alphabet as anything other than a leader, but in cloud computing, it’s still a relatively small player. Google started selling its first cloud services back in 2008 and currently holds about 6% of the market.
But just because it’s the smallest player on this list doesn’t mean it should be counted out. In typical Google fashion, it’s building out its cloud presence by offering software like its TensorFlow machine learning algorithms for free to developers.
TensorFlow is the magic behind the Google Photos function that automatically categorizes images for you, for example, and its machine learning makes the results of your Google searches more relevant. The company started giving TensorFlow away to developers a couple of years ago as part of its efforts to woo them to the company’s cloud platform.
A recentMIT Technology Review article said that TensorFlow is “becoming the clear leader among programmers building new things with machine learning” and that “the software’s popularity is helping Google fight for a bigger share” of the cloud market because it’s easier to use on Google’s cloud than AWS or Azure.
That strategy may seem like an odd way to build a cloud business, but TensorFlow has made Amazon and Microsoft nervous enough that they’ve teamed up to release a competing machine learning software product.
The head of Google Cloud, Urs Hölzle, has said that he wants revenue from Google Cloud services to overtake the company’s advertising sales by 2020. Considering that the tech giant earns nearly 90% of its top line from advertising, that timeline sounds optimistic, to say the least.
Google doesn’t break out its cloud platform revenue (nor any other segment’s sales, for that matter) but that doesn’t mean investors should overlook the potential for Alphabet here. It became a key cloud services player in a short period of time, and now offers important software that developers find very useful. And Microsoft and Amazon appear to be a little worried about Google Cloud’s rapid rise, which by itself should make it clear that it’s shaping up to be a formidable player in the space.
The cloud computing sector is like any other when it comes to investing: You need to have a long-term perspective when buying shares in any of these companies. None of these tech giants are cloud computing pure plays, which means that many other factors — like Amazon’s retail sales, Google’s advertising dollars, and Microsoft’s software sales — will affect their share prices.
But investors should keep in mind that we’re not that far into our cloud computing journey, and the gains these companies could make from these technologies in the coming years could be far greater. So be patient, consider the other businesses these companies are in when building your investment thesis, and then sit back and wait for this market to mature.
How to Invest in Cloud Computing — The Motley Fool
The “cloud” in cloud computing originated from the habit of drawing the internet as a fluffy cloud in network diagrams. No wonder the most popular meaning of cloud computing refers to running workloads remotely over the internet in a commercial provider’s data center: the so-called public cloud model. AWS (Amazon Web Services), Salesforce’s CRM system, and Google Cloud Platform all exemplify this popular notion of cloud computing.
But there’s another, more precise meaning of cloud computing: the virtualization and central management of data center resources as software-defined pools. This technical definition of cloud computing describes how public cloud service providers run their operations. The key advantage is agility: the ability to apply abstracted compute, storage, and network resources to workloads as needed and tap into an abundance of pre-built services.
From a customer perspective, the public cloud offers a way to gain new capabilities on demand without investing in new hardware or software. Instead, customers pay their cloud provider a subscription fee or pay for only the resources they use. Simply by filling in web forms, users can set up accounts and spin up virtual machines or provision new applications. More users or computing resources can be added on the fly; the latter are added in real time as workloads demand them, thanks to a feature known as auto-scaling.
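The auto-scaling idea mentioned above can be pictured as a simple control loop: watch a load metric, add capacity when it runs hot, remove capacity when it idles. The sketch below is a toy illustration in Python; the class name and thresholds are invented for the example, and real providers expose this as a managed service rather than a loop you write yourself.

```python
# Toy sketch of auto-scaling: a worker pool grows or shrinks as an observed
# load metric crosses (invented) thresholds, bounded by min/max capacity.

class ScalingPool:
    def __init__(self, min_workers=1, max_workers=10):
        self.min_workers = min_workers
        self.max_workers = max_workers
        self.workers = min_workers

    def observe(self, cpu_utilization):
        """Adjust worker count from an observed CPU utilization (0.0 to 1.0)."""
        if cpu_utilization > 0.8 and self.workers < self.max_workers:
            self.workers += 1  # scale out under heavy load
        elif cpu_utilization < 0.2 and self.workers > self.min_workers:
            self.workers -= 1  # scale in when idle
        return self.workers
```

The point of the sketch is the customer-facing effect: capacity tracks demand, and with pay-per-use pricing the bill tracks capacity.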
The array of available cloud computing services is vast, but most fall into one of the following categories:
This type of public cloud computing delivers applications over the internet through the browser. The most popular SaaS applications for business can be found in Google’s G Suite and Microsoft’s Office 365; among enterprise applications, Salesforce leads the pack. But virtually all enterprise applications, including ERP suites from Oracle and SAP, have adopted the SaaS model. Typically, SaaS applications offer extensive configuration options as well as development environments that enable customers to code their own modifications and additions.
At a basic level, IaaS public cloud providers offer storage and compute services on a pay-per-use basis. But the full array of services offered by all major public cloud providers is staggering: highly scalable databases, virtual private networks, big data analytics, developer tools, machine learning, application monitoring, and so on. Amazon Web Services was the first IaaS provider and remains the leader, followed by Microsoft Azure, Google Cloud Platform, and IBM Cloud.
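The pay-per-use model at the heart of IaaS can be made concrete with a tiny metering sketch. The resource names and hourly rates below are invented for illustration and do not reflect any provider’s actual pricing:

```python
# Minimal sketch of IaaS pay-per-use metering: the bill is simply the sum of
# (rate * hours consumed) over each resource. Rates here are made up.

RATES_PER_HOUR = {
    "vm.small": 0.05,      # hypothetical small virtual machine
    "storage.gb": 0.0001,  # hypothetical per-GB-hour of storage
}

def monthly_cost(usage):
    """usage: list of (resource_type, hours) tuples; returns cost in dollars."""
    return round(sum(RATES_PER_HOUR[r] * hours for r, hours in usage), 2)
```

For example, one small VM running a full 720-hour month plus 100 GB of storage for that month would cost `monthly_cost([("vm.small", 720), ("storage.gb", 720 * 100)])` under these made-up rates, with no up-front hardware purchase.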
PaaS provides sets of services and workflows that specifically target developers, who can use shared tools, processes, and APIs to accelerate the development, test, and deployment of applications. Salesforce’s Heroku and Force.com are popular public cloud PaaS offerings; Pivotal’s Cloud Foundry and Red Hat’s OpenShift can be deployed on premises or accessed through the major public clouds. For enterprises, PaaS can ensure that developers have ready access to resources, follow certain processes, and use only a specific array of services, while operators maintain the underlying infrastructure.
Note that a variety of PaaS tailored for developers of mobile applications generally goes by the name of MBaaS (mobile back end as a service), or sometimes just BaaS (back end as a service).
FaaS, the cloud instantiation of serverless computing, adds another layer of abstraction to PaaS, so that developers are completely insulated from everything in the stack below their code. Instead of futzing with virtual servers, containers, and application runtimes, they upload narrowly functional blocks of code, and set them to be triggered by a certain event (e.g. a form submission or uploaded file). All the major clouds offer FaaS on top of IaaS: AWS Lambda, Azure Functions, Google Cloud Functions, and IBM OpenWhisk. A special benefit of FaaS applications is that they consume no IaaS resources until an event occurs, reducing pay-per-use fees.
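The unit of deployment in FaaS is just such a narrowly functional block of code. The sketch below follows the AWS Lambda Python convention of a `handler(event, context)` function; the form-submission event fields (`name`, `email`) are invented for the example, and actually deploying and wiring up the trigger is done through the provider’s tooling, not shown here.

```python
# A FaaS-style handler: the platform invokes it once per event (here, a
# hypothetical form submission), and nothing runs between invocations.

def handler(event, context):
    """Validate a form-submission event and return an HTTP-style response."""
    name = event.get("name", "anonymous")
    email = event.get("email")
    if not email:
        return {"statusCode": 400, "body": "email is required"}
    return {"statusCode": 200, "body": f"registered {name} <{email}>"}
```

Because the function only exists while an event is being processed, this style directly yields the pay-only-when-triggered billing the article describes.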
The private cloud downsizes the technologies used to run IaaS public clouds into software that can be deployed and operated in a customer’s data center. As with a public cloud, internal customers can provision their own virtual resources in order to build, test, and run applications, with metering to charge back departments for resource consumption. For administrators, the private cloud amounts to the ultimate in data center automation, minimizing manual provisioning and management. VMware’s Software Defined Data Center stack is the most popular commercial private cloud software, while OpenStack is the open source leader.
A hybrid cloud is the integration of a private cloud with a public cloud. At its most developed, the hybrid cloud involves creating parallel environments in which applications can move easily between private and public clouds. In other instances, databases may stay in the customer data center and integrate with public cloud applications, or virtualized data center workloads may be replicated to the cloud during times of peak demand. The types of integrations between private and public cloud vary widely, but they must be extensive to earn a “hybrid cloud” designation.
Just as SaaS delivers applications to users over the internet, public APIs offer developers application functionality that can be accessed programmatically. For example, in building web applications, developers often tap into Google Maps API to provide driving directions; to integrate with social media, developers may call upon APIs maintained by Twitter, Facebook, or LinkedIn. Twilio has built a successful business dedicated to delivering telephony and messaging services via public APIs. Ultimately, any business can provision its own public APIs to enable customers to consume data or access application functionality.
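Consuming a public API from code typically amounts to building an HTTP request against a documented endpoint and parsing the response. The sketch below uses only Python’s standard library to construct such a request; the endpoint and query parameters are hypothetical stand-ins, not the real Google Maps API, whose actual parameters and authentication differ.

```python
from urllib.parse import urlencode

# Build a request URL for a hypothetical driving-directions API. The pattern
# (base endpoint + URL-encoded query + API key) is common across real public
# APIs, though each defines its own parameters and auth scheme.

def directions_url(origin, destination, api_key):
    base = "https://api.example.com/v1/directions"  # hypothetical endpoint
    query = urlencode({"origin": origin, "destination": destination, "key": api_key})
    return f"{base}?{query}"
```

An application would then fetch the URL (for example with `urllib.request.urlopen`) and decode the JSON body; a business exposing its own public API publishes exactly this kind of endpoint for its customers to call.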
Data integration is a key issue for any sizeable company, but particularly for those that adopt SaaS at scale. iPaaS providers typically offer prebuilt connectors for sharing data among popular SaaS applications and on-premises enterprise applications, though providers may focus more or less on B-to-B and ecommerce integrations, cloud integrations, or traditional SOA-style integrations. iPaaS offerings in the cloud from such providers as Dell Boomi, Informatica, MuleSoft, and SnapLogic also enable users to implement data mapping, transformations, and workflows as part of the integration-building process.
The most difficult security issue related to cloud computing is the management of user identity and its associated rights and permissions across private data centers and public cloud sites. IDaaS providers maintain cloud-based user profiles that authenticate users and enable access to resources or applications based on security policies, user groups, and individual privileges. The ability to integrate with various directory services (Active Directory, LDAP, etc.) is essential. Okta is the clear leader in cloud-based IDaaS; CA, Centrify, IBM, Microsoft, Oracle, and Ping provide both on-premises and cloud solutions.
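At its core, the policy evaluation an IDaaS provider performs resembles the toy check below: a user’s group memberships and individual grants are matched against a resource’s policy, with a default of deny. The policy data model here is invented for illustration; real services such as Okta or directory integrations are far richer.

```python
# Toy identity/access check in the spirit of IDaaS: access is granted if the
# user is individually allowed on the resource, or belongs to an allowed
# group. Policies, users, and groups below are invented for the example.

POLICIES = {
    "payroll-db": {"groups": {"finance"}, "users": {"alice"}},
    "build-server": {"groups": {"engineering"}, "users": set()},
}

def can_access(user, groups, resource):
    policy = POLICIES.get(resource)
    if policy is None:
        return False  # unknown resource: default deny
    return user in policy["users"] or bool(set(groups) & policy["groups"])
```

Centralizing this check in one service is precisely what lets the same identity and the same policy govern both on-premises and public cloud resources.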
Collaboration solutions such as Slack, Microsoft Teams, and HipChat have become vital messaging platforms that enable groups to communicate and work together effectively. Basically, these solutions are relatively simple SaaS applications that support chat-style messaging along with file sharing and audio or video communication. Most offer APIs to facilitate integrations with other systems and enable third-party developers to create and share add-ins that augment functionality.
Key players in such industries as financial services, healthcare, retail, life sciences, and manufacturing provide PaaS clouds to enable customers to build vertical applications that tap into industry-specific, API-accessible services. Vertical clouds can dramatically reduce the time to market for vertical applications and accelerate domain-specific B-to-B integrations. Most vertical clouds are built with the intent of nurturing partner ecosystems.
The cloud’s main appeal is to reduce the time to market of applications that need to scale dynamically. Increasingly, however, developers are drawn to the cloud by the abundance of advanced new services that can be incorporated into applications, from machine learning to internet-of-things connectivity.
Although businesses sometimes migrate legacy applications to the cloud to reduce data center resource requirements, the real benefits accrue to new applications that take advantage of cloud services and cloud native attributes. The latter include microservices architecture, Linux containers to enhance application portability, and container management solutions such as Kubernetes that orchestrate container-based services. Cloud-native approaches and solutions can be part of either public or private clouds and help enable highly efficient devops-style workflows.
Objections to the public cloud generally begin with cloud security, although the major public clouds have proven themselves much less susceptible to attack than the average enterprise data center. Of greater concern is the integration of security policy and identity management between customers and public cloud providers. In addition, government regulation may forbid customers from allowing sensitive data off premises. Other concerns include the risk of outages and the long-term operational costs of public cloud services.
Yet cloud computing, public or private, has become the platform of choice for large applications, particularly customer-facing ones that need to change frequently or scale dynamically. More significantly, the major public clouds now lead the way in enterprise technology development, debuting new advances before they appear anywhere else. Workload by workload, enterprises are opting for the cloud, where an endless parade of exciting new technologies invites innovative use.
Read more from the original source:
What is cloud computing? Everything you need to know now …
The Volkswagen Group, the world’s second-largest car manufacturer, is planning to use open-source cloud-computing platforms to build a private cloud that will host websites for its brands Audi, VW, and Porsche. The company is also looking at a comprehensive platform for innovative automotive technology. In fact, VW officials debated for a long time over how to leverage the technology. The Group employs over 600,000 people globally, of whom 11,000 are internal IT experts.
They first plan to build a private cloud spanning thousands of physical nodes across multiple data centers in the USA, Europe, and Asia. The automotive giant then eventually plans to add a public cloud in order to create a hybrid cloud. The company has evolved from a car manufacturer into a global mobility provider and realizes the need to move away from traditional application development processes toward agile ones that can sustain rapid development. For new mobility services, Volkswagen will collect, analyze, and store data to make better products for its customers. These are real cases of digital transformation. Volkswagen is in fact one of many automotive companies leveraging transformational technologies for a digital future.
Massive expansion predicted for cloud services globally
Cloud computing is one of the most disruptive forces facing the industry. According to the Bain & Company research report The Changing Faces of the Cloud, globally, the cloud IT market revenue is projected to increase to $390 billion in 2020 from $180 billion, translating into a compound annual growth rate (CAGR) of around 17%. The scale of change is mind-boggling.
“The overall global public cloud market will mature, and its growth rate will slightly slow down from 17.2% in 2016 to a 15.2% increase in 2020,” says Sid Nag, research director at Gartner. “While Brexit and other growth challenges exist, some segments such as financial SaaS applications and the PaaS user markets will still see strong growth through 2020. As buyers intensify and increase IaaS activity, they will be getting more for their investment: ongoing enhancement of performance, more memory, more storage for the same money (which will drive increases in consumption) and increased automation in traditional IT outsourcing (ITO) delivery,” added Nag.
Cloud computing is radically changing the face of the automobile industry
The change is not cosmetic but radical in all aspects and could be truly transformational as it will power and define business processes and supply chains. These are companies genuinely trying to change everything: from the way their structure is managed to the products they sell.
“We live in a world where innovation is the only constant. The world is witnessing unprecedented change driven by the digital revolution. Everything is changing, from how organizations function to how people work. Digital transformation is the buzzword across industries, and cloud-based tech is leading that digitalization of processes and supply chains,” said Shashank Dixit, CEO of Deskera, a global cloud provider.
Automotive companies are leveraging modern cloud-computing platforms to create cloud-native applications, operating systems, and Internet of Things (IoT) services, and to devise comprehensive software development methodologies, all of which have the potential to be truly transformative. As Volkswagen strives to explore new markets, it is overhauling everything that defines the core of its business and moving toward being a software services company, away from its hallmark of being a leading automaker. The bold move will perhaps lay the blueprint for how automobile enterprises of the future keep reinventing themselves.
Read the rest here:
How The Automotive Industry Is Leveraging Cloud Computing – CXOToday.com
Huawei and Microsoft executives sign an expanded cloud computing partnership. (Huawei Photo)
Huawei’s bid to be a player in China’s cloud computing scene got a little stronger Tuesday with the signing of a deal with Microsoft to host more of the software giant’s apps on its cloud.
Just five months into Huawei’s public cloud strategy, the two companies signed a deal that will see more of Microsoft’s enterprise software become available on Huawei’s public cloud. Huawei launched its public cloud service in April with support for Windows Server and RDS (relational database service) for SQL Server, but customers running other Microsoft apps on-premises will now be able to take advantage of a cloud option for those apps through the new partnership.
Cloud computing in China is the domain of home-grown companies, with Alibaba as the country’s leading provider of cloud computing services. Baidu and Tencent are also going after cloud customers, while U.S. companies like Amazon Web Services and Microsoft operate their cloud services through a local subsidiary. The overall market is a little behind where the global cloud computing market is at the moment, but demand for cloud services in China is expected to surge over the next decade.
Microsoft and Huawei will cooperate on bringing new services to Huawei’s customers, the two companies said in a statement. The announcement kicks off the Huawei Connect 2017 conference, which is being held in Shanghai.
Read the original here:
Huawei ups its bet on cloud computing with broader support for Microsoft apps – GeekWire
A board sports apparel retailer is taking steps to blend its physical and digital retail channels.
Billabong is leveraging the Aptos Singular Commerce platform to support omnichannel retailing across its global enterprise. The cloud-based solution will merge the retailer’s physical and digital retail channels, and create a single view of customers, inventory and orders, among other operations.
In addition to managing point-of-sale, the solution also supports customer relationship management (CRM), order management, merchandising, and auditing functions. By integrating these functions, Billabong is positioned to deliver “truly seamless customer experiences regardless of where, when or how its customers shop,” the company said.
Transitioning to a cloud-based platform also helps Billabong consolidate its retail technology stack and accelerate the implementation of new solutions, goals that required a seasoned partner.
“Aptos’ global presence, leading cloud-based technology, and professional services and implementation team were important considerations in our selection process,” said Michael Yerkes, senior VP of global operations at Billabong International Limited.
Billabong operates 372 retail stores, as well as e-commerce sites for each of its key brands: Billabong, RVCA, Element, Von Zipper, Honolua Surf Company, Kustom, Palmers Surf, and Xcel.
Sept. 5: Cloud computing lies behind many of today’s most popular technologies, from streaming video and music to e-mail and chat services to storing and sharing family photos. Since 2015, the Chameleon testbed has helped researchers push the potential of cloud computing even further, finding novel scientific applications and improving security and privacy.
A new grant from the National Science Foundation will extend Chameleon’s mission for another three years, allowing the project, led by the University of Chicago with partners at the Texas Advanced Computing Center (TACC), the Renaissance Computing Institute (RENCI), and Northwestern University, to enter its next phase of cloud computing innovation. Upgrades to hardware and services, as well as new features, will help scientists rigorously test new cloud computing platforms and networking protocols.
The $10 million renewal will be officially announced at the inaugural Chameleon User Meeting, taking place September 13-14 at Argonne National Laboratory.
“In phase one we built a testbed, but in phase two we’re going to transform this testbed into a scientific instrument,” said Kate Keahey, Argonne computer scientist, Computation Institute fellow, and Chameleon project PI. “We’re going to extend the capabilities that allow users to keep a record of their experiments in Chameleon and provide new services that allow them to build more repeatable experiments.”
The new features build upon the project's original philosophies of flexibility and transparency, which provided users with a large-scale, ~600-node cloud infrastructure with bare-metal reconfiguration privileges. This unique level of access allows researchers to go beyond limited development on existing commercial or scientific clouds, offering a customizable platform to create and test entirely new cloud computing architectures.
In its first phase, this powerful resource supported advanced work in computer science areas such as cybersecurity, OS design, and power management. With Chameleon, scientists could realistically simulate cyberattacks upon cloud computing systems to improve their defenses, train students to search high-resolution telescope images for undiscovered exoplanets, and develop machine learning algorithms that automatically determine the most energy-efficient task assignment schemes for large data centers.
Many of these projects benefited from Chameleon features that let them extract detailed, precise data about system performance and status during usage. To further support reproducible science, the Chameleon team will make it even easier for scientists to gather and use this information.
"Everything in the testbed is a recorded event, but right now the information about those events is in various different places," Keahey said. "We're going to make it very easy for users to have a record of everything that was happening on the testbed resources that they used, and we'll also provide services to replay those experiments."
Additional phase two milestones include new hardware: additional racks at UChicago and TACC, an infusion of highly contested resources such as GPUs, and Corsa network switches. The new Corsa switches enable experimentation with software-defined networking (SDN) within a Chameleon site, as well as extending individual SDN experiments across the wide area to include resources from either Chameleon site, or even from other compatible testbeds such as NSF GENI.
New hardware will be complemented by new capabilities allowing users to define entirely new classes of experiments. For the second phase, new team members from RENCI, with significant expertise in developing such capabilities, will join the existing Chameleon team based at UChicago, TACC, and Northwestern University.
On the software side, the Chameleon team will package CHI (CHameleon Infrastructure), the software operating Chameleon, which is based primarily on the open-source OpenStack project, to which the University of Chicago team made substantial contributions. Packaging the Chameleon operational model will allow others to create their own experimental clouds easily.
"Whether somebody wants to provide a Chameleon resource or create their own experimental testbed, CHI will make it very easy for them," Keahey said. "It is based on a widely used open-source system that is increasingly popular in scientific data centers and thus easy to adopt. Ultimately, we would like to make testbeds for computer science research cost-effective to operate."
The project will also look to expand its community through outreach events, including workshops, online tutorials, and September's user meeting. In addition to training scientists in the use of Chameleon and gathering feedback for future improvements, these events will be an opportunity for defining the future of cloud computing science, Keahey said.
"We would like to go beyond simply providing resources, and give the community the opportunity to focus on experimental methodology in computer science: how to improve it, how to control it, and how to make experimental computer science less about logistics and more about the science."
Source: Computation Institute & The University of Chicago
See the original post:
Cloud Computing Testbed Chameleon Renewed for Second Phase – HPCwire