
Euro Zone Eyes Cloud Computing to Kick Start Economy

With a four-year debt crisis and recession affecting many of its member countries, the European Union (EU) is turning to cloud computing to create 2.5 million new jobs and boost the region's economy.

Cloud computing is where files are stored in massive data centers rather than on office servers, and computer programs and functions run via the Internet.

"Consumers don't have to get bogged down with the complexity of computing, and they don't have to make a huge capital investment; they just plug in and run their applications," the founder of Oracle, the world's largest database software company, told CNBC Tuesday.

It may not be a new concept, but it certainly caught the eye of the European Commission, which says the industry could increase the region's gross domestic product (GDP) by 583 billion euros ($760 billion) between 2015 and 2020 and create millions of new jobs.

An EMC executive struck a similar note. "Anything that governments and the EU can do to clarify and simplify is always a good thing," he said. "Any time you have a framework, that can only foster more investment."

The EU wants to focus on four key aims to help cloud computing realize its full potential: letting users move easily between providers, certifying trustworthy companies, contracts that would simplify regulations, and clear communication between providers and the public sector, so work doesn't drift overseas to the U.S.

Katherine Thompson, analyst at Edison Investment Research, is not entirely convinced, however.

"I'm not sure I strictly agree that it will give such a boost to the economy, as the move to the cloud is often a shift from one form of expenditure to another, as opposed to incremental spend, and in many cases will be deflationary," she told CNBC. "The EU's thinking behind this is that it would help new types of companies and new business models get started. I do agree with this myself, but this is already happening without EU involvement."


ZapThink Announces Expansion of Cloud Computing for Architects Course

Includes brand new, original content on Cloud Standards, Orchestration Platforms, and Cloud Assurance.

McLean, VA (PRWEB) October 08, 2012

The two-day Cloud Computing for Architects course is the only architect-focused course on Cloud Computing available on the market today. ZapThink aggressively updates the curriculum to address the rapidly emerging Cloud Computing marketplace.

ZapThink has successfully run earlier versions of this course in McLean, VA; London; Singapore; Australia; and India. Cloud Computing for Architects will be offered October 18-19, 2012 in London (in partnership with IRM UK), December 3-4, 2012 in McLean, VA, and December 10-11, 2012 in San Diego, CA.

The instructor for the course is Jason Bloomberg, President of ZapThink. Mr. Bloomberg is a globally recognized SOA and Cloud thought leader, and creator of the popular Licensed ZapThink Architect SOA course and certification.

The Cloud Computing for Architects course covers virtualization, workloads, Cloud service models (IaaS, PaaS, and SaaS), enterprise architecture and the Cloud, Cloud configuration, RESTful Clouds, Cloud Standards, Cloud security and governance, and big data and the Cloud. The course has no prerequisites. It is designed for architects, but is appropriate for people with different roles and levels of expertise. This course is valuable for anyone who wants in-depth knowledge about how to succeed with Cloud Computing.

"Cloud computing is far more than simply outsourcing your data center," said Jason Bloomberg, President of ZapThink. "In many organizations, architecture is the missing piece of their Cloud strategy. To take advantage of the promise of the Cloud, getting the architecture right is critically important."

The Cloud Computing for Architects course is an intensive, two-day fire hose of information that prepares you to leverage the Cloud to achieve real business value. We cut through the hype and separate what really works from the noise. For more information and to register for an upcoming course, visit http://www.zapthink.com/cca.

About ZapThink

As a recognized authority and master of Service-Oriented Architecture (SOA), Enterprise Architecture, and architectural approaches to Cloud Computing, ZapThink (http://www.zapthink.com) provides its global audience of public and private sector enterprises with practical advice, guidance, and education, to assist in creating an architecture that meets business needs. ZapThink offers a clear roadmap for standards-based, loosely coupled distributed computing: a vision of IT meeting the needs of the agile business.


SpiderOak Set to Announce Industry-First for the Cloud: 'Zero-Knowledge' Collaboration

SAN FRANCISCO, CA--(Marketwire - Oct 8, 2012) - An inherent lack of privacy has emerged as a key issue for large enterprises and SMBs that rely on third-party cloud providers. While Box CEO Aaron Levie is set to announce a new collaboration platform, SpiderOak asks that companies consider a critical question: is your data really safe when cloud providers have plaintext access to all documents, spreadsheets, and PowerPoint presentations stored on their servers?

While Box announced its solution today, later this year SpiderOak will announce an industry-first 'Zero-Knowledge' collaboration tool -- allowing enterprises to use cloud technologies without the risk of exposing data outside of the organization. This is an extension of SpiderOak's 'Zero-Knowledge' Privacy Environment, whereby no one -- including SpiderOak employees -- ever has plaintext access to customer data.
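SpiderOak has not published the mechanics of the new tool, but the general 'Zero-Knowledge' pattern is well understood: keys are derived and data is encrypted on the client, so the provider only ever stores ciphertext. Below is a minimal illustrative sketch of that pattern in Python, using the third-party cryptography package; it is not SpiderOak's implementation, and the provider-upload step is left hypothetical.

```python
# Minimal sketch of client-side ("zero-knowledge") encryption:
# the key never leaves the client, so the provider stores only ciphertext.
# Uses the third-party 'cryptography' package; not SpiderOak's actual code.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(password: bytes, salt: bytes) -> bytes:
    """Derive a symmetric key from the user's password; the server never sees it."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(password))

def encrypt_for_upload(plaintext: bytes, password: bytes) -> tuple[bytes, bytes]:
    """Encrypt locally before upload; the provider can store but never read this."""
    salt = os.urandom(16)
    token = Fernet(derive_key(password, salt)).encrypt(plaintext)
    return salt, token  # hand both to the (hypothetical) upload call
```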

Nowhere is the tension between the convenience of the cloud and end-user ignorance more apparent than at this year's conference hosted by Box -- BoxWorks -- and its theme, "Business Without Boundaries." Like all other cloud providers, Box advertises security and asks for trust while remaining fundamentally at risk from the more critical issue -- privacy. Box and other cloud providers require access to this data in order to perform their basic functions and service. As such, anyone with the proper access controls (or those who illegally obtain them) can view enterprise data at any time.

"Cloud technology companies, including Box, demand and expect enterprises to inherently trust them with their most valuable possession -- their data. However and as we have seen time and time again, this proposition is fraught with risks in security and data privacy from both internal and external threats. Ultimately business does in fact need boundaries," said Ethan Oberman, CEO of SpiderOak. "SpiderOak is built around 'Zero-Knowledge' privacy. As such, trust no longer sits at the center of our sales pitch because we are never in a position where we can betray the trust of our customers. Why? Simple. We don't have access to the plaintext data."

SpiderOak: Built for Privacy

SpiderOak's centralized and fully protected solution enables enterprises to take full control of the cloud in a managed environment while not relinquishing privacy to a third-party provider. In contrast, Box.net and other cloud companies are forced to advertise and push security, as they cannot make the same 100% privacy guarantee.

"Today's enterprise environment demands a higher standard of privacy in the cloud," said Richard Stiennon, Chief Research Analyst at IT-Harvest. "Sensitive data is often distributed around the world, with employees collaborating and sharing information on a host of disparate, mobile and intelligent devices. A majority of enterprises in today's business environment don't retain full control of their data in the cloud nor do they have the option of fully controlling and managing it, either on-site or off-site. SpiderOak is designed with complete privacy in mind, representing a much-needed advance in the world of enterprise cloud environments."

The SpiderOak Blue line of enterprise products empowers companies to maintain the industry's most confidential cloud environments. It is impossible for anyone outside of a company's IT department to see data, unless the IT department has shared it. Available in on-premise and SpiderOak-hosted solutions, SpiderOak Blue enables companies to comply with the most stringent privacy, info-security and auditing requirements in a fully confidential cloud technology environment.

For more information on all SpiderOak: Blue products, please visit https://spideroak.com/business_learn_more/.

About SpiderOak

SpiderOak provides a cloud backup, sync, and sharing environment that is 100% private. Our 'Zero-Knowledge' Privacy Standard ensures absolute confidentiality between you and your data, everywhere, every time and from every device. With SpiderOak, you maintain full and complete control of your data in a centralized, managed and fully protected environment. SpiderOak: we've got your back(up).


Cloud management platforms key for cloud security

You did it: You managed to carefully shepherd your organization through the minefields of cloud computing. You selected a security-friendly provider, carefully planned your architecture and migration, and even implemented a nice set of cloud-specific security controls, with a mix of public and private clouds. Excellent job.

Then you were smashed by the freight train of reality as developers, administrators and even business units shattered your well-laid plans by, you know, actually using the darn cloud. Instances began spinning up left and right, quickly falling out of security and compliance because of old patch levels, improperly configured security groups, and all the little, tiny changes introduced by maintaining state through day-to-day usage.

We struggle to manage these issues with our traditional infrastructure, but at least in those circumstances we have a modicum of physical control. It isn't like business units are sneaking into the data center to add new 1U servers to the racks. But in the cloud? Assuming you set it up properly to actually leverage the advantages of cloud computing, you will have new servers and applications appearing all the time.

Managing basic operations under these conditions is extremely challenging -- even outside of security -- especially when you dig into the technical issues of managing your entire infrastructure through network connections and APIs. For example, one friend once had to launch disaster recovery plans because an administrator used the wrong command line. Instead of shutting down three key servers on Amazon Web Services, he "terminated" them. If you don't know, terminate on AWS means immediately stop this server and erase all associated storage, irrecoverably.
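The stop-versus-terminate distinction is visible directly in the API. Here is a short sketch using boto3, the AWS SDK for Python (the instance ID is made up): stopping an EBS-backed instance preserves its volumes, while terminating it erases any volume marked delete-on-termination.

```python
# Illustrative sketch with boto3 (AWS SDK for Python); the instance ID is hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Stop: the instance can be started again later; EBS volumes survive.
ec2.stop_instances(InstanceIds=["i-0123456789abcdef0"])

# Terminate: the instance is gone for good, and volumes with
# DeleteOnTermination set (the default for root volumes) are erased with it.
# ec2.terminate_instances(InstanceIds=["i-0123456789abcdef0"])
```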

However, a new breed of tools and services is emerging to help with the complexities of managing cloud infrastructures. Companies like RightScale Inc. and enStratus Networks Inc. insert a proxy in front of the management plane to provide greater compatibility, control and policy-based management across heterogeneous cloud deployments.

Although the primary goal of these cloud management platforms is operations, when you get down to it, a large percentage of security is really just operations. Keeping systems patched, positioning instances in the right parts of the network, and controlling which administrators can manage which resources are all critical security functions that don't necessarily need to be owned by the security team.

Let's look a little deeper into how these tools work (although keep in mind that different vendors have different implementations, and this is a broad generality). Normally we manage the cloud through a mix of direct API calls, command-line tools and Web interfaces. Administrators (and users) have access to all or some of these resources across different cloud platforms, which requires some complex entitlement and user management. And even when you can restrict their activities, the restriction is either so granular as to be incredibly complex or so broad that it's worthless.

Plus, there are other, extensive operational functions like patching that must be managed with a patchwork of tools.

Cloud management platforms are usually a proxy between the users and the cloud management plane. The proxy has access to the entire cloud infrastructure, and users run through the proxy instead of making direct API calls; they don't even have access rights to the cloud's management plane.
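In other words, the proxy reduces to an entitlement check in front of every management-plane call. Here is a deliberately simplified Python sketch of the idea; the user names, actions and forwarding function are all hypothetical, and real products layer far more policy on top.

```python
# Simplified sketch of a cloud-management-plane proxy: users call the proxy,
# the proxy checks policy, and only the proxy holds credentials for the real
# cloud API. All names here are hypothetical.
ENTITLEMENTS = {
    "alice": {"start_instance", "stop_instance"},
    "bob":   {"start_instance"},  # bob cannot stop (or terminate) anything
}

def forward_to_cloud_api(action: str, resource_id: str) -> str:
    # Stand-in for the real call made with the proxy's own credentials;
    # individual users never hold direct API keys.
    return f"{action} executed on {resource_id}"

def proxy_call(user: str, action: str, resource_id: str) -> str:
    if action not in ENTITLEMENTS.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {action}")
    return forward_to_cloud_api(action, resource_id)

print(proxy_call("alice", "stop_instance", "i-042"))  # allowed
# proxy_call("bob", "stop_instance", "i-042")          # raises PermissionError
```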


How Will Rackspace Hosting OpenStack Cloud Computing Certification Affect DevOps?

Rackspace Hosting has initiated a new effort to help the cloud community earn some attractive certifications, starting with OpenStack Fundamentals. The offering is based on teaching materials the hosting company developed internally for its OpenStack Fundamentals course, which has been available for the past year. It is a four-day course with six hours of personal instruction that has already been given to hundreds of students in 50 countries.

The course costs $2,500 per student if you come to a Rackspace class, and a whopping $45,000 if you want the course taught privately at your facility. The private class size is limited to 18 students, which works out to the same $2,500 per person.

Introducing this cloud computing certification is a step toward achieving the mission of making OpenStack the de facto cloud standard. The company also expects certification holders to be absorbed into good positions in the industry.

And some of those positions are expected to focus on DevOps:

"We are seeing more cloud-specific jobs. Cloud is such a hot topic, you'll see a lot of companies looking for cloud technologists, although cloud may not be in the title. There are a lot of people looking for DevOps engineers, for example. DevOps and cloud walk together a lot. We see an uptake in companies using cloud in general, and they're going to need IT professionals to power and run those clouds," said Tony Campbell, director of training and certification at Rackspace.

"We plan to offer an application developer certification for software developers who are writing cloud-enabled apps. The cloud brings a new way and new capabilities to software development, and it's important that engineers are up to speed on how [to] leverage cloud in their software. Up until now there has been a hard line between software and hardware; with cloud, that line becomes blurry, to where the software can interact with the hardware and can expand and contract hardware resources directly from the software. We want to teach people how to do that."

Once you are done with your coursework, you'll have to take a test that involves working with a running OpenStack cluster, and passing the test makes you an OpenStack Certified Technician. The test will be available in December; pricing has not yet been announced. Rackspace has also launched a partner program that will allow interested third parties to use its coursework to provide training and thereby help drive up certifications. That program will also start in December.
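Rackspace has not published the exam tasks, but hands-on work with a running OpenStack cluster in this era generally meant driving the Nova compute API. As a rough, hypothetical taste (credentials and endpoint are invented), using the python-novaclient library:

```python
# Hypothetical sketch of the kind of hands-on OpenStack work the exam implies;
# credentials and endpoint are made up. Assumes python-novaclient is installed.
from novaclient import client

nova = client.Client("2", "demo", "secret", "demo-project",
                     "http://controller:5000/v2.0")

# List the servers running in the cluster and their states.
for server in nova.servers.list():
    print(server.name, server.status)
```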


44 Percent Of US Execs To Tackle IT Challenges Through Cloud

DETROIT - Forty-four percent of U.S. executives aim to tackle current IT challenges by leveraging cloud solutions, and they are planning to invest more in cloud computing in the future. That is the finding of an IDC survey commissioned by T-Systems.

Corporations expect cloud computing to deliver lower IT costs (26 percent) and to enable them to replace legacy systems (21 percent) and adopt new applications more flexibly (14 percent).

"As the U.S. cloud services market continues to mature, enterprises find that overall business impact and productivity gains from the cloud are as significant as achieving cost reductions," said David Tapper, IDC VP of Outsourcing and Offshore Services Market Research. Cloud computing is seen as most likely to deliver solutions for Customer Relationship Management (31 percent); productivity tools like email, collaboration or Office packages (28 percent); and online stores and Enterprise Relationship Management (26 percent each).

Corporations continue to have reservations about security, but security is no longer the decisive criterion against the cloud. The concept of security now extends to issues such as how cloud computing will impact compliance requirements or data availability, which is prompting corporations to consider the right cloud type and cloud service for their needs. Enterprises see an opportunity in the private cloud for providers to fulfill their security requirements and agree on service-level agreements. Forty percent of U.S. respondents have implemented a private cloud strategy, while only 13 percent are relying on public cloud and 16 percent on hybrid cloud solutions.

In the course of adopting cloud computing, enterprises are increasingly considering new service providers, including providers whose services they have not previously used; in ERP, more than half are considering providers with whom they have had no previous experience. CEOs, Tapper said, are ranked as most significant in the decision-making process on using clouds. The result is that buyers view the cloud as strategic to achieving critical business objectives, and CIOs and IT vendors must ensure that their cloud solutions help achieve these objectives and the associated business benefits.

"The survey results validate that one of the greatest needs in deploying cloud-based solutions is to find the right partner who can assist with the question of cloud readiness and bring forward a clear plan on how to migrate to the cloud," said T-Systems North America Managing Director Heike Auerbach. "T-Systems has been migrating and managing complex applications to the cloud for more than seven years, longer than any other IT service provider. It was gratifying for us to see that customers profoundly value an experienced partner as they make the journey to the cloud."

For the cloud survey commissioned by T-Systems, IDC asked CIOs and other top IT managers of 104 U.S. corporations in the summer of 2012 how they now rated cloud computing. IDC conducted the same interviews in the Netherlands, Spain, Switzerland, the UK and Brazil.

IDC analysts and T-Systems cloud experts are presenting the survey findings and the latest cloud solutions in free webcasts. The live webcast for the U.S. market will take place at 2 p.m. Eastern Time on October 18.

To register, click on T-Systems.Com


Enterprise computing IS the cloud

Summary: Cloud strategy is now indivisible from enterprise computing - can Oracle retain its vast customer base in this new era?

The Oracle occupation of downtown San Francisco buildings is over, but the sea and air invasion continues. With Oracle having painted the town red with Open World conference events, the city's busiest events weekend (thanks to the fog-free Indian summer we San Franciscans enjoy this time of year) includes lots of America's Cup yacht racing action on the bay.

The defending cup holder, the Oracle team, capsized on the bay but subsequently went on to win Saturday's event, which is a good analogy for how many enterprise software customers see all the big incumbent tech vendors in the race for future relevance: they're heavily invested in past technologies and watching the race for any errors and for where the future high ground is.

The 'cloud' air war now effectively IS enterprise computing: any differentiation from the old networked on-premise data center world of vast extranets was rendered moot by Oracle's wholesale adoption of cloud rhetoric and perceived future client needs last week.

Like the yachting America's Cup, which Team USA/Oracle currently holds, the vast installed Oracle client base is Oracle's to lose. SAP may be the largest provider of discrete business applications, but Oracle's customers are used to a soup-to-nuts relationship from bare metal up to business process, and based on my conversations with various random Oracle users last week, they appear confident about their future with the Oracle juggernaut.

Where Oracle are vulnerable in a down economy is the relative lack of huge global companies left to supply: small and medium businesses have headroom to grow, typically at the expense of incumbents. Many of the success stories in the SMB sector have the rapidity of cloud strategy, along with SaaS's lower costs and agility to change, at their heart. Larger companies are fostering small business units and encouraging startup-style innovation outside of the mothership infrastructure as they look for growth opportunities.

The cultural identity problems for the large IT players - Microsoft, IBM, SAP and Oracle (MISO) - are massive: their DNA is rooted in the last century and try as they might to reinvent themselves, their rank and file integrators and users are set in their ways.

Fusion is a very sophisticated and solid, although complex, set of offerings, and Oracle continue to do a terrific job rolling out their 21st-century solutions. The 'but' is that the client user base is small, and the vast majority of Oracle customers didn't appear particularly interested. The fact that the Fusion apps were all showcased in the smaller Moscone West conference venue, while the huge hall in the main conference center had two sets of showcase areas for everything else, spoke volumes about where the business is today.

For fiscal Q1, which ended Aug. 31, Oracle claims $222 million in cloud-based revenue. That's a fraction of $8.2 billion in total sales, but it is the first time Oracle has disclosed cloud revenue. Company President Mark Hurd stated that Oracle's cloud business this quarter is at a $1 billion run rate.

Servicing yesteryear's portals and existing plumbing is understandably why most attendees were in San Francisco, and tech conferences are just as much about existing users as showcasing what's next for them and impressing Wall Street analysts. The reality, however, is that no one knows what the future holds for big tech in a world of month-by-month Software as a Service subscriptions and business disruption at all levels. The big financial companies are experimenting with Google, Amazon and Facebook-style massive datacenter design models; Goldman Sachs, for example, are experimenting with Facebook's Open Compute project as an alternative to the proprietary world of Oracle.


Dell supercomputer effort spawns new line of servers

Dell has developed a line of servers based on designs the company is using in an upcoming 10-petaflop supercomputer called Stampede, which will be fully deployed at the University of Texas, Austin, starting next year.

The PowerEdge C8000 servers are built with standard Intel x86 CPUs and can be equipped with graphics processors or additional storage to improve performance on database tasks, high-performance computing operations and cloud workloads.

Users will be able to mix and match graphics processors, storage, memory and other elements inside the servers, said Armando Acosta, a product manager at Dell.

For its part, the Stampede supercomputer includes thousands of C8000 servers with a total of 272TB of memory and 14 petabytes of storage. Dell and the Texas Advanced Computing Center at the University of Texas worked together on Stampede. The design for the C8000 servers blossomed as the supercomputer came to fruition, Acosta said.

The supercomputer will use eight-core Intel Xeon E5-2600 processors and co-processors code-named Knights Corner, which Dell said will speed up scientific and math calculations.

As for the new servers, the basic C8220 chassis can have up to eight blade servers; each server can contain two CPUs with up to 16 processing cores, two internal hard drives and additional storage and networking options. For instance, the servers can be hooked up to the new C8000XD storage box for expandable hard drive or SSD options.

The C8220X, a more advanced model in the new lineup, has more RAM and storage and can be equipped with graphics processors. All of the servers are designed for use in highly parallel computing environments, Acosta said.

Pricing starts at $35,000 for the C8220, $42,000 for the C8220X and about $25,000 for the C8000XD storage box.

This version of this story was originally published in Computerworld's print edition. It was adapted from an article that appeared earlier on Computerworld.com.



Data Center Servers Suck — But Nobody Knows How Much

If the computer industry's dirty little secret is that data centers are woefully inefficient, the secret behind the secret is that nobody knows how bad things really are.

On its surface, the issue is simple. Inside the massive data centers that drive today's businesses, technical staffers have a tendency to just throw extra servers at a computing problem. They hope that by piling on the processors, they can keep things from grinding to a halt and not get fired. But they don't think much about how efficient those servers are.

The industry talks a lot about the power efficiency of data centers as a whole (i.e., how much of the data center's total power is used for computing), but it doesn't look as closely at the efficiency of the servers inside these facilities (how much of the time they're actually doing work). And it turns out that getting a fix on this is pretty hard.
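The whole-facility number the industry does track is Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment. Server efficiency is the second, separate ratio, the share of time servers spend doing useful work. A small Python sketch contrasts the two; the sample figures are invented.

```python
# The two ratios the article contrasts; the sample figures are invented.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 is ideal; typical enterprise sites run higher."""
    return total_facility_kw / it_equipment_kw

def utilization(busy_hours: float, powered_on_hours: float) -> float:
    """Fraction of the time a server is actually doing work."""
    return busy_hours / powered_on_hours

print(pue(1500.0, 1000.0))      # 1.5: a third of the power never reaches a server
print(utilization(1.44, 24.0))  # 0.06: the 6 percent figure cited below
```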

The folks who run the most efficient data centers in the world (the Amazons and Googles and Microsofts) view this information as a competitive secret, so they won't share it. And in the less-efficient enterprise data centers, staffers may not welcome any rigorous measurement of their server efficiency. "Think about it: who would want their boss to know how poorly utilized that incredibly expensive asset was?" said David Cappuccio, a Gartner analyst, speaking in an email interview.

But that keeps the industry from getting a proper fix on things, says Amy Spellmann, a global practice principal with the Uptime Institute. "I think there are good reasons for getting the benchmarks and the analysis out there," she says. "We should be tracking these things and how we are doing as an industry."

When The New York Times ran its recent investigative exposé on data center waste, it had to peg the story on a four-year-old data center report by McKinsey & Co. and a whole lot of anecdotal evidence.

That seems to be the current state of research on data center utilization rates: one report based on data from 20,000 servers that was compiled in 2008. Back then, Amazon's EC2 cloud service was in beta; nowadays, EC2 and its sister Amazon Web Services run as much as one percent of the internet. The industry has changed, but the research has not.

McKinsey spokesman Charles Barthold says that the only systematic study McKinsey has ever done was this 2008 analysis. Back in 2008, it pegged server utilization at 6 percent, meaning servers in the data center do useful work only 6 percent of the time. The firm guesses that the rate is now between 6 and 12 percent, based on anecdotal information from customers, Barthold says. McKinsey declined to talk in depth about the report.

And that's too bad. It's not even clear whether this is the best way to measure the efficiency of our data centers.

Over at Mozilla, Datacenter Operations Manager Derek Moore says he probably averages around 6 to 10 percent CPU utilization from his server processors, but he doesn't see that as a problem, because he cares about memory and networking. "The majority of our applications are RAM or storage constrained, not CPU. It doesn't really bother us if the CPU is idle, as long as the RAM, storage, or network IO [input-output] is being well-utilized," he says. "CPU isn't the only resource when it comes to determining the effectiveness of a server."
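Moore's point suggests a broader measurement: sample every resource before declaring a server idle. Here is a quick illustrative sketch with the third-party psutil library; this is not how Mozilla measures, just the general idea.

```python
# Quick sketch: sample several resources, not just CPU, before calling a server idle.
# Assumes the third-party 'psutil' package is installed.
import psutil

cpu = psutil.cpu_percent(interval=1)   # percent of CPU busy over one second
mem = psutil.virtual_memory().percent  # percent of RAM in use
disk = psutil.disk_io_counters()       # cumulative disk read/write activity
net = psutil.net_io_counters()         # cumulative bytes sent/received

print(f"cpu={cpu}% mem={mem}% "
      f"disk_rw={disk.read_bytes + disk.write_bytes} "
      f"net_io={net.bytes_sent + net.bytes_recv}")
```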


How to share documents with iCloud

The new dream in computing is keeping all of your files in the cloud, on remote servers that you can access from anywhere at any time. Apple's cloud-based syncing and storage service, iCloud, debuted in June 2011, but only since the release of OS X Mountain Lion have enough applications supported iCloud document syncing for the feature to be useful. Working with iCloud is fairly simple, but you need to know the ground rules if you plan to start storing your documents in the cloud.

If you don't have a free iCloud account, or if you're just starting out with it, this article will give you an overview of how to set up a new iCloud account. To store documents in the cloud (no matter which application puts its files there) you also need to activate the Documents & Data setting in the iCloud pane in System Preferences, as well as in the Settings of any iOS device you plan to use (to do so, select Settings > iCloud). Once you've done this, any iCloud-compatible app can store files in iCloud.

For now, only a limited number of applications can store files in iCloud. By files, I mean documents that you create, not data that an application such as Calendar stores in the cloud. On the Mac, many of Apple's apps do support iCloud, including Preview, TextEdit, the iWork '09 suite (Pages, Numbers, and Keynote), and GarageBand.

Third-party apps that store documents in the cloud include text editors such as iA Writer, Byword, and Smultron; the PDF editor PDFpen; the graphics editor Pixelmator; and some others. At this point, compatible programs can produce files in Microsoft Office formats, but Microsoft Office itself doesn't support iCloud.

Note that Apple lets only apps sold through the Mac App Store use iCloud to store documents. If your favorite productivity app is sold only directly by the developer, you're out of luck.

If you're using an application that can save documents in the cloud, doing so is fairly simple. Say you're using TextEdit. After you've created a new document, press Command-S, and make sure the Where menu shows iCloud. Name the file and click Save, and the document will be sent to the cloud.

Once you've saved a file to the cloud, you can access it from multiple devices. Say you have a desktop Mac and a laptop; you can save any files you need on the road in iCloud and access them from either computer, as long as you use the same app.

To open files you've saved to iCloud, press Command-O in an iCloud-savvy application, then click on the iCloud button to see the documents and folders you have stored there.

To create a folder, just drag one file on top of another, as you would with icons on an iPhone or iPad. Name the folder, and it'll be saved on iCloud.

You may have a number of files on your Mac that you'd like to put in the cloud; this is straightforward. Just open a file with an application that can put documents on iCloud, choose File > Move To, then choose iCloud from the Where menu. If you want to move a file from the cloud to your Mac, click on the Where menu and find the folder where you want to place the file. If the folder where you want to move the file isn't in the menu, choose Other from the bottom of the menu, and navigate to the location you want.
