
Google: Use the Cloud, Save the Planet

Organizations generally switch to cloud-based services to save money, but there are environmental benefits as well. Cloud computing reduces energy use and carbon emissions, according to Google, which claims that an average enterprise can lower its energy usage by 65 percent to 85 percent by switching to online productivity tools such as Google Apps.

“Lower energy use results in less carbon pollution and more energy saved for organizations,” writes Google’s Urs Hoelzle, senior vice president for technical infrastructure, in a Monday post on the Google Green Blog.

A typical organization has more servers than it needs for backup, failures, and spikes in demand: an inefficient system that wastes energy and money, Hoelzle writes. Cloud-based services, by comparison, aggregate demand across thousands of users and are engineered to minimize the energy needed to run and cool servers.

[Chart: Energy Impact of the Cloud. Source: Google]

How much energy and money can organizations save by switching to the cloud? According to Google, the U.S. General Services Administration (GSA) cut its server energy use by nearly 90 percent and its carbon emissions by 85 percent when it recently switched 17,000 users to Google Apps for Government. As a result, the GSA will slash its annual energy bill by about $285,000.

There are security issues with cloud computing, of course, and organizations must weigh the pros and cons before making the switch from in-house solutions. But the potential energy savings of using the cloud are crystal clear.

Contact Jeff Bertolucci at Today@PCWorld or on Twitter (@jbertolucci).

Blitz.io and CopperEgg Partner to Deliver Integrated Real-Time Performance Testing & Cloud Monitoring

AUSTIN, TX--(Marketwire - 06/18/12)- CopperEgg, Corp., a cloud analytics and monitoring company, today announced a partnership with Blitz.io, a new approach to Web performance testing for apps, websites and cloud services, to deliver integrated real-time Web performance testing and monitoring for cloud infrastructures. The integration delivers real-time insight into cloud capacity and performance to help better test, scale, and optimize cloud application delivery.

“Blitz.io is committed to working with best-in-class solutions to help our customers meet the new challenges of rolling out applications in a dynamic cloud infrastructure,” said Tamer Abbas, Blitz.io’s head of business development. “Combining Blitz.io’s outside-in metrics, such as response times, rates and number of users, with CopperEgg’s inside-out metrics, such as CPU, disk I/O and memory utilization, enables end-to-end visibility into your app or website performance.”

Connecting CopperEgg’s real-time system performance measurements to Blitz.io’s native interface via the CopperEgg API enables users to correlate Application Performance Management (APM) metrics with system capacity and performance statistics on the same graph. This allows application developers and DevOps engineers to see the effect of a code or system change within seconds, creating a much tighter, higher-fidelity testing loop.
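The correlation workflow described above can be sketched in a few lines: align an outside-in series (response times) with an inside-out series (CPU utilization) by timestamp so both land on the same x-axis. The sample data and function names below are hypothetical, not actual Blitz.io or CopperEgg API output.

```python
from bisect import bisect_left

# Hypothetical samples -- illustrative only, not real API output.
# Outside-in: (unix_ts, response_ms); inside-out: (unix_ts, cpu_percent).
response_times = [(1000, 120), (1005, 135), (1010, 480), (1015, 130)]
cpu_samples = [(999, 35), (1004, 40), (1009, 97), (1014, 38)]

def nearest(samples, ts):
    """Return the sample value whose timestamp is closest to ts."""
    keys = [t for t, _ in samples]
    i = bisect_left(keys, ts)
    candidates = samples[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - ts))[1]

# Align both series on the response-time timestamps so APM and
# system metrics can be plotted on a shared time axis.
correlated = [(ts, ms, nearest(cpu_samples, ts)) for ts, ms in response_times]

for ts, ms, cpu in correlated:
    print(f"t={ts}  response={ms}ms  cpu={cpu}%")
```

On the hypothetical data, the 480 ms response spike at t=1010 lines up with the 97% CPU sample, which is exactly the kind of outside-in/inside-out correlation the integration is meant to surface.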

“Integration between Blitz.io and CopperEgg not only delivers instant system performance and capacity feedback to customers, it also demonstrates the power and ease-of-use of the CopperEgg API,” said Mike Raab, V.P. Business Development at CopperEgg. “Integration with Blitz.io followed the CopperEgg mantra of simple, smart, and fast. We look forward to working with Blitz.io in taking APM for DevOps to the next level.”

About Blitz.io

Blitz.io is a simple yet powerful cloud-based service that enables developers creating apps, websites or cloud services to immediately and cost-effectively test the performance of their solutions under real-world conditions. Either on its own or as an integrated part of its large ecosystem of partners, Blitz.io helps application and website developers throughout the DevOps lifecycle with continuous monitoring and performance testing, with no scripting required. Blitz.io supports APIs for development languages such as Ruby, Java, Maven, Node.js, Python, Perl, PHP and more. To learn more, follow us on Twitter @blitz_io.

About CopperEgg

CopperEgg’s next-generation cloud monitoring provides simple, smart, and fast insight into the performance, quality, and availability of servers, applications and services deployed on cloud, virtual and physical infrastructures. Our SaaS-based, real-time cloud monitoring and cloud analytics deliver immediate intelligence into critical cloud performance problems, correlated visibility into developing trends, and split-second decision support for organizations of all sizes. CopperEgg products are simple to try, install, use, and grow. CopperEgg is backed by Silverton Partners and based in Austin, Texas.

For more information, follow CopperEgg on Twitter: @CopperEgg.

Mellanox Announces Connect-IB, World’s Leading Scalable Server and Storage Interconnect Adapter


ISC'12: Mellanox Technologies, Ltd. (MLNX) (MLNX.TA), a leading supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, today announced Connect-IB, the world's leading scalable server and storage adapter solution for High-Performance Computing (HPC), Web 2.0, cloud, Big Data, financial services, virtualized data centers and storage environments. Connect-IB adapters deliver the highest throughput of 100Gb/s utilizing PCI Express 3.0 x16, unmatched scaling with innovative transport services, sub-microsecond latency, and 130 million messages per second, a 4X higher message rate than competing solutions.

Connect-IB is the new foundation for scalable computing. HPC, Web 2.0 and cloud environments are challenging today's interconnect technologies with their demand for infrastructures utilizing tens of thousands of servers and hundreds of virtual machines per server. New applications such as Big Data analytics and in-memory computing depend on parallel execution and RDMA (Remote Direct Memory Access). RDMA has also become critical for storage solutions. The new Connect-IB interconnect architecture delivers the performance and capabilities required by compute- and storage-intensive applications and enables IT managers to build the most efficient, extreme-scale data centers.

"In the high-performance computing server and storage markets, the explosion of data volumes is significantly increasing the demand for network throughput," said Steve Conway, IDC research vice president for HPC. "The introduction of interconnect technology at 100Gb/s is an important step towards meeting these demands. With the rollout of powerful next-generation compute servers, including Intel's Romley, we expect growing demand from a variety of HPC markets for highly scalable, high-bandwidth, low-latency interconnect solutions such as those being offered by Mellanox."

"Mellanox is the first company to deliver 100Gb/s interconnect throughput, a significant breakthrough to take our customers to the next level of scalable computing," said Eyal Waldman, chairman, president and CEO of Mellanox Technologies. "Connect-IB delivers the industry's highest-performing server and storage interconnect, with maximum bandwidth, low latency and the highest application efficiency."

The Connect-IB product line consists of single- and dual-port adapters for PCI Express 3.0 with options for x8 and x16 host bus interfaces, as well as a single-port adapter for PCI Express 2.0 x16. Each port supports FDR 56Gb/s InfiniBand with MPI ping latency of less than 1µs. All Mellanox HCAs support CPU offload of transport operations and RDMA for efficient computing. New in Connect-IB is Dynamic Transport operation support for unlimited scalability, and end-to-end data protection for unmatched data reliability. Adapter cards are sampling today.
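A back-of-envelope check shows why a dual-port FDR adapter calls for a PCI Express 3.0 x16 host interface. The figures below are nominal published line rates for FDR InfiniBand and PCIe generations, not Mellanox measurements for this card:

```python
# Why dual-port FDR needs PCIe 3.0 x16: nominal rates, not measurements.

fdr_port_gbps = 56.0            # FDR InfiniBand, 4x port, nominal
dual_port_gbps = 2 * fdr_port_gbps

pcie3_gtps_per_lane = 8.0       # PCIe 3.0 raw signaling rate (GT/s per lane)
encoding3 = 128 / 130           # 128b/130b line encoding overhead
pcie3_x16_gbps = pcie3_gtps_per_lane * encoding3 * 16   # per direction

pcie2_gtps_per_lane = 5.0       # PCIe 2.0 raw rate (GT/s per lane)
encoding2 = 8 / 10              # 8b/10b encoding
pcie2_x16_gbps = pcie2_gtps_per_lane * encoding2 * 16

print(f"dual-port FDR: {dual_port_gbps:.0f} Gb/s")
print(f"PCIe 3.0 x16:  {pcie3_x16_gbps:.1f} Gb/s per direction")
print(f"PCIe 2.0 x16:  {pcie2_x16_gbps:.1f} Gb/s per direction")
```

PCIe 3.0 x16 (about 126 Gb/s per direction) has headroom above two FDR ports (112 Gb/s), while PCIe 2.0 x16 (64 Gb/s) does not, which is consistent with the PCIe 2.0 variant being offered as a single-port card.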

About Mellanox

Mellanox Technologies is a leading supplier of end-to-end InfiniBand and Ethernet interconnect solutions and services for servers and storage. Mellanox interconnect solutions increase data center efficiency by providing the highest throughput and lowest latency, delivering data faster to applications and unlocking system performance capability. Mellanox offers a choice of fast interconnect products: adapters, switches, software and silicon that accelerate application runtime and maximize business results for a wide range of markets including high-performance computing, enterprise data centers, Web 2.0, cloud, storage and financial services.

Mellanox, BridgeX, ConnectX, CORE-Direct, InfiniBridge, InfiniHost, InfiniScale, PhyX, SwitchX, Virtual Protocol Interconnect and Voltaire are registered trademarks of Mellanox Technologies, Ltd. Connect-IB, FabricIT, MLNX-OS, Unbreakable-Link, UFM and Unified Fabric Manager are trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.

Dynamic Data Centers and Efficient Operations in the Cloud. Patent for Auction. ICAP Patent Brokerage Announces for …

SAN FRANCISCO, June 18, 2012 /PRNewswire/ -- ICAP Patent Brokerage, a division of ICAP plc and the world's largest intellectual property brokerage firm and organizer of the ICAP Ocean Tomo Auctions, is offering for auction a patent portfolio of seventeen (17) issued U.S. patents and associated pending applications regarding enterprise "rack servers" and their provision and management. The lot will be included in the 16th ICAP Ocean Tomo IP Auction on July 26, 2012, at the Julia Morgan Ballroom in San Francisco, CA.


"We are excited to be offering this patent technology lot for auction to our global buyer base," said Dean Becker, CEO of ICAP Patent Brokerage and ICAP Ocean Tomo Auctions.


Virtual data centers are composed of enterprise servers that store data remotely; the location of these enterprise servers is sometimes referred to as "the cloud," and their remote access and use as "cloud computing." Because these data centers (or "clouds") need to be scaled up or down based on use or demand, significant addition or relocation of servers, hardware reconfiguration, and re-cabling is often required, which can take significant time and be very expensive. A cost-effective and reliable solution was needed to make data centers more dynamic and efficient without requiring the physical relocation and rewiring of servers.

Key Characteristics & Benefits

With priority dates from 2004, the patents in this portfolio disclose an architecture for enterprise servers (ES) with varying arrangements of pluggable modules called an enterprise fabric (EF), with associated benefits.

Market Potential

This patented technology will be important to all data center providers and operators as well as manufacturers of enterprise servers and networking equipment.

Companies that have cited this patent portfolio include Cisco, Oracle, IBM, Hewlett-Packard, Intel, Microsoft and Toshiba.

Google promotes energy savings with Apps

Companies that turn off their local servers for e-mail, productivity and collaboration applications and switch to the cloud-hosted Google Apps suite can save significant amounts of money in energy costs, Google said on Monday.

The savings typically range between 65 percent and 85 percent, and involve reductions in consumption of energy for powering and cooling the local servers, Google said, citing results of an internal study.

Google offered as an example the U.S. General Services Administration (GSA), which signed up for Google Apps for Government this year and moved about 17,000 employees to it from on-premise systems. The GSA cut energy consumption by servers related to email and collaboration applications by almost 90 percent, which will allow it to save about $285,000 per year on energy costs in that area, a reduction of 93 percent, according to Google.
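The GSA figures imply a rough prior energy bill for email and collaboration. A back-of-envelope reconstruction, assuming the $285,000 in annual savings corresponds to the stated 93 percent cost reduction:

```python
annual_savings = 285_000        # USD per year, per Google's figure
cost_reduction = 0.93           # stated 93 percent reduction

# If $285,000 is 93% of the old bill, the implied prior and new bills are:
prior_bill = annual_savings / cost_reduction
new_bill = prior_bill - annual_savings

print(f"implied prior annual bill: ${prior_bill:,.0f}")
print(f"implied new annual bill:   ${new_bill:,.0f}")
```

That puts the implied prior bill at roughly $306,000 and the post-migration bill near $21,000 per year, a derived estimate rather than a figure Google published.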

When it takes on customers' workloads, Google uses much less energy than they do, because servers in its data centers make more efficient use of their computing and storage capacity, and its maintenance and administration costs are lower, Google said.

“A typical organization has a lot more servers than it needs–for backup, failures and spikes in demand for computing. Cloud-based service providers like Google aggregate demand across thousands of people, substantially increasing how much servers are utilized. And our data centers use equipment and software specially designed to minimize energy use. The cloud can do the same work much more efficiently than locally hosted servers,” wrote Urs Hoelzle, Google’s Senior Vice President for Technical Infrastructure, in the blog post.

Google has been promoting the benefits of using Google Apps and cloud-hosted software for years, as acceptance of this model has been steadily increasing. Google and other cloud computing proponents constantly highlight that by using cloud-hosted applications, companies can save on hardware purchases and software maintenance, while improving and simplifying the way users collaborate on server-based documents.

However, the benefits of cloud-based software still have to be weighed against potential pitfalls, like application outages and latency, lack of compliance with data protection regulations in certain industries and countries, and software subscription models that may be economically inconvenient for certain companies and in certain scenarios.

Juan Carlos Perez covers enterprise communication/collaboration suites, operating systems, browsers and general technology breaking news for The IDG News Service. Follow Juan on Twitter at @JuanCPerezIDG.

Security of personal data in the cloud more important than where it is stored, EU official says

Megan Richards, acting deputy director-general of the Information Society and Media Directorate-General at the European Commission, said that personal data should not have to be located within the EU in order for EU rules governing its processing and storage to apply.

"The cloud does not stop at national boundaries," Richards said at a cloud computing conference in London last week, according to a report by Techworld.

"You shouldn't care where the data is as long as it is secure and meets regulatory requirements, so now the question is how to make sure that when we use cloud resources, personal data does meet those requirements," Richards said.

Cloud computing refers to the use of computers and software on an internet-based network for information processing, rather than the use of local computing resources. It allows internet users to access or store information without owning the software to do it; many online companies, such as Google, operate huge servers that store the data and deliver it to users.

In January the European Commission published draft legislation aimed at reforming the EU data protection framework. Its proposed General Data Protection Regulation would introduce a single data protection law across all 27 EU member states which companies based outside the EU borders would be subject to if they process personal data of EU citizens.

Richards said that the European Parliament is currently assessing the plans and that “it usually takes a year” for legislation to be passed by the Parliament, according to a report by The Register news website.

However, the current data protection framework in Europe is creating problems for researchers who store information in the cloud, an IT expert at the European Organisation for Nuclear Research (CERN) has said.

CERN’s openlab project sees private sector firms invest in research by scientists working on the Large Hadron Collider. The physics experiments produce a mass of data that CERN shares with its private sector investors.

Bob Jones, head of openlab at CERN, said, though, that capacity issues were a problem. He said the body is to conduct a pilot scheme that would see cloud computing utilised to move data produced by the experiments between CERN's own systems and cloud 'data centres' operated by its commercial partners.

However, current EU data protection laws are a hindrance to CERN’s collaboration plans, Jones said.

Alpha Networks to offer switches matched with data center servers

Irene Chen, Taipei; Adam Hwang, DIGITIMES[Monday 18 June 2012]

Alpha Networks, a Taiwan-based maker of networking/communication devices, will cooperate with suppliers of servers used in cloud computing data centers to offer switches to be integrated with such servers and related solutions in the second half of 2012, according to the company.

Alpha Networks currently ships switches used in cloud computing data centers to international server vendors on an OEM basis, with such switches accounting for 20-30% of all its switch shipments. Because some operators of cloud computing data centers now procure servers and solutions directly from makers instead of via vendors, Alpha Networks has decided to adjust its operational model, the company indicated.

In view of increasing competition for OTT (over-the-top) set-top boxes (STBs) in the retail market, Alpha Networks will offer dual-mode OTT STBs, models integrating OTT with terrestrial TV, and begin shipments in the third quarter of 2012, the company pointed out. The company has finished development of a Smart TV Box equipped with an open platform that enables clients to build application services based on their needs.

In addition, Alpha Networks has been cooperating with telecom carriers and pay-TV operators to develop STBs integrating OTT with DVB-S satellite TV, as well as IP STBs, with shipments to begin at the end of the third quarter of 2012, the company indicated.

Can your cloud balance supply and demand?

Private cloud involves a multi-layered approach to architecting IT systems and delivering services to the business, making the most of virtualisation to provide a separation between the two.

There is no clever magic involved, nor does there have to be anything particularly special about the servers, storage and networking equipment used. Rather, it is a question of arranging resources in the right way and using management software to make the most of them.

To understand how things need to look, we can consider private cloud in terms of supply-side capabilities that consider physical assets and demand-side capabilities that focus on how services are delivered.

The trick, if there is one, is to ensure that supply-side capabilities are adequate, and to use software automation to enable these to be provisioned and managed in the most efficient way.

Underlying both supply and demand is the requirement for resilience to be built into the infrastructure.

Private cloud adoption involves consolidating multiple applications onto a reduced hardware footprint, which increases the impact of any single failure.

If power goes down to a single server running a single application, for example, only one application is lost. But if a server is running multiple virtual machines, the damage could be far greater.

So it is important to ensure adequate provision is made for redundancy and failover across the architecture.
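The blast-radius argument above can be made concrete with an illustrative calculation; all of the numbers here are hypothetical, chosen only to show the shape of the trade-off:

```python
import math

# Hypothetical consolidation scenario -- illustrative numbers only.
apps = 120                 # applications to host
vms_per_host = 12          # consolidation ratio after virtualisation

hosts = math.ceil(apps / vms_per_host)
blast_radius = vms_per_host      # apps lost if one consolidated host fails

# N+1 redundancy: keep enough spare capacity to absorb one host failure.
hosts_with_failover = hosts + 1

print(f"hosts needed:            {hosts}")
print(f"apps lost per host loss: {blast_radius} (vs 1 before consolidation)")
print(f"hosts with N+1 headroom: {hosts_with_failover}")
```

The point of the sketch is that consolidation turns a one-application outage into a twelve-application outage, so the headroom for failover has to be planned into the supply side from the start.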

Operational management of supply-side capabilities requires a clear understanding of the server, storage and networking infrastructure and how it is packaged into logical units.

To an extent this equates to good, honest, traditional IT operations: configuration management of equipment, components and firmware versions, patching status and so on.

Application Hosting

16-06-2012 17:51 - Cloud9 Real Time's Small Business and Professional Practice Cloud Solutions offer application hosting to fit your needs. With over 250 applications currently on the systems and growing daily, the possibilities are endless. Outsource your server and get all of your apps in the cloud in one central location with a single login. Cloud9 hosts the full suite of products from popular manufacturers such as Thomson Reuters, Intuit, Microsoft, CCH and Sage, as well as smaller industry-specific or proprietary software. Application hosting is open because Cloud9 Real Time builds customized servers for your business.

Teradata offers new private cloud service

Published: Monday, June 18, 2012 | Written by: TECH TIMES

Singapore: Analytic data solutions provider Teradata recently announced the availability of Teradata Active Data Warehouse (ADW) Private Cloud, which is poised to arm companies with tools for making more informed decisions and establishing competitive advantages in their respective markets.

Unveiled during Teradata Universe 2012 in the island city-state, Teradata ADW Private Cloud enables a more fully utilized data management process by consolidating data warehouse servers onto it.

Furthermore, aside from reducing the number of servers and storage, this private cloud can also deliver both capital and operating expense savings, including labor, data center space, power and cooling.

"Leading companies have consolidated under-utilized servers and storage onto a Teradata ADW Private Cloud, reducing costs while increasing utilization of their IT resources," explained Scott Gnau, president at Teradata Labs.

"By eliminating data marts, many with only 10-20 percent utilization, companies can consolidate onto a Teradata ADW Private Cloud running at 90 to 100 percent utilization," Gnau added.
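Gnau's utilization figures imply the consolidation arithmetic directly. A sketch with a hypothetical mart count (only the before/after utilization percentages come from the article):

```python
import math

# Mart count is hypothetical; utilisation figures (10-20% before,
# 90-100% after) are the ones quoted in the article.
marts = 20
avg_utilisation_before = 0.15    # midpoint of the quoted 10-20%
target_utilisation = 0.90

# Total useful work, expressed in "fully busy server" equivalents:
busy_equivalents = marts * avg_utilisation_before

# Servers needed if each runs at the target utilisation:
consolidated = math.ceil(busy_equivalents / target_utilisation)

print(f"servers before: {marts}, after: {consolidated}")
print(f"reduction: {1 - consolidated / marts:.0%}")
```

Under these assumptions, twenty lightly loaded data marts collapse onto four well-utilized servers, an 80 percent reduction in hardware for the same useful work.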

Business users can benefit from this private cloud through better control of their computing resources, with self-service capabilities.

Users can also provision and manage a fully functioning data lab for on-demand analytics processing in less than five minutes.
