
Top 5: Risks of encryption backdoors – TechRepublic

There's a lot of talk about mandating a backdoor in encrypted services so that law enforcement can access them under warrant. The need is real, and there are some reasonable compromises that could keep our data safe and still help catch bad guys.

But a backdoor for the good guys is potentially a backdoor for the bad guys too. Here are five reasons a backdoor in encryption is a bad idea:

1. Strong encryption protects dissidents and democracy advocates in repressive regimes as well. Putting in backdoors limits their options and weakens their protections.

2. The backdoor goes beyond the phone. IoT devices are becoming more and more frequent, meaning any device with a connection could have a backdoor. If someone gets the keys or figures out how the backdoor works, they could get inside lights, door locks and more.

3. Dual key systems are inherently less secure. Having a single key that only you, the user, can access is the only way to ensure that you are the only weak point. Storing a duplicate key with a government agency gives attackers more targets for social engineering and other attacks.

SEE: Ethical Password Hacking and Security (TechRepublic Academy)

4. Criminals can choose not to use the services with backdoors. Open source encryption tools are available that nobody controls, and large enough organizations can create their own. So you're weakening security for law abiding citizens more than criminals.

5. You can't make math illegal. The solution to our last point is to make any encryption without a backdoor against the law. Except that encryption is generally just multiplying two prime numbers. It would be hard to make that against the law.
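The "multiplying two prime numbers" line understates it slightly: multiplying two primes is the key-generation step of RSA, and the rest is equally ordinary modular arithmetic. A toy sketch with tiny textbook primes (real keys use primes hundreds of digits long) shows how little there is to outlaw:

```python
# Toy RSA with tiny primes, purely to illustrate that the core of
# public-key encryption is ordinary arithmetic anyone can perform.
# These are the classic textbook values, not realistic key sizes.

p, q = 61, 53            # two small primes (real RSA: ~1024+ bits each)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient: 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e (2753)

m = 65                   # a "message" (just a number below n)
c = pow(m, e, n)         # encrypt: c = m^e mod n
assert pow(c, d, n) == m # decrypt: c^d mod n recovers the message
```

Anyone with a few lines of code, or pencil and paper, can do this; banning backdoor-free encryption means banning arithmetic.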

Now, there is more tech companies could do to assist law enforcement. Creative solutions being proposed include pushing updates that, say, surreptitiously turn on logging in an app like WhatsApp for a suspect who is the target of a court-approved warrant.

That may or may not be the right answer, of course, but that's where a productive discussion can be had: the kind of measures that lessen a criminal's security without breaking encryption for everyone.


Here is the original post:
Top 5: Risks of encryption backdoors - TechRepublic

Read More..

Indian IT firms value scaling encryption, lag in adoption: Study – Economic Times

NEW DELHI: Indian IT firms value scaling of data encryption but lag in adoption of the technology compared to the global average, says a study commissioned by French security technology firm Thales.

"95 per cent of organisations in India valued scalability for encryption solutions, which was much higher than any other country, global average of 29 per cent," the Global Encryption Trend study said.

However, it found that 82 per cent of the organisations in India covered in the study embrace some type of encryption strategy, while the global average is 86 per cent.

The survey is based on responses from more than 5,000 IT security decision makers across multiple industry sectors in the United States, United Kingdom, Germany, France, Australia, Japan, Brazil, the Russian Federation, Mexico, Saudi Arabia and the United Arab Emirates and includes responses from 548 individuals in India.

"This study is part of a global initiative by Thales to educate leaders from the private and public sectors on the privacy and data protection practices companies can follow today," said Emmanuel de Roquefeuil, Thales country director for India.

The company operates in the strategic electronics and IT space with a focus on high-end security. It is setting up a manufacturing unit in India in partnership with Reliance Defence to make radar and electronic warfare display systems for the Rafale jet.

The study found that Indian firms led globally in adoption of cloud technology with 75 per cent of organisations transferring sensitive or confidential information to the cloud - whether encrypted or not - compared to global average of 53 per cent.

The top drivers for encryption in India are protection against specific, identified threats and protection of customer information.

"This is in contrast to the global data where compliance is, and historically always has been, the top driver for encryption. In India, compliance ranked third on the list at 55 per cent," the study said.

As per the study, 62 per cent of respondents in India feel hardware security modules (HSMs) will be important in the next 12 months for their encryption or key management strategy, which is almost in line with the global average of 61 per cent.

"This study is a call to action for organisations in India to strengthen their security position with strong data security and encryption plans in order to secure sensitive data and adhere to risk and compliance best practices and regulations," Thales e-Security, director for sales in South Asia, James Cook said.

Most Indian IT firms are of the view that the top threat to sensitive data comes from employee mistakes, followed by hackers and temporary contract workers, the study said.

"Top threat to sensitive data continues to be employee mistakes (55 per cent of respondents), followed by hackers (36 per cent) and temporary or contract workers (31 per cent)," the study said.

Continued here:
Indian IT firms value scaling encryption, lag in adoption: Study - Economic Times

Read More..

IBM deprecates its first cloud storage platform – www.computing.co.uk

IBM launched Bluemix less than two years ago

IBM is killing off its first attempt at cloud object storage - after less than two years.

The cloud is a useful tool, offering high speeds for fast development and storage - but the downside of that speed is evident in IBM's latest decision: turning off a service that it launched in December 2015, because it is obsolete.

Bluemix's Object Store v1 is a SaaS solution, designed to store unstructured data. It didn't last long: IBM refers to it as a 'tech preview connector', and it was made a private service in February 2016.

In an attempt to get users to upgrade to the newest v3 (a simple case of copying data and updating an app), the firm has said, 'We will now be deleting all existing instances after 30 days i.e. on August 24, 2017. We recommend users to unprovision the Object Storage v1 service and switching to v3, before August 24, 2017. Please copy over any data and point Analytics for Apache Spark applications to use the Object Storage v3 service.'
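The migration IBM prescribes is, at heart, "copy every object to the new store, verify, then repoint the application." A minimal sketch of that logic, using plain dicts as stand-ins for the v1 and v3 object stores (a real migration would use each service's own API and credentials, which are not shown here):

```python
# Sketch of the copy-and-repoint migration IBM describes.
# Plain dicts stand in for the Object Storage v1 and v3 services.

def migrate(source, destination):
    """Copy all objects from source to destination, then verify."""
    for key, blob in source.items():
        destination[key] = blob          # upload to the new store
    missing = [k for k in source if k not in destination]
    if missing:
        raise RuntimeError(f"objects failed to copy: {missing}")
    return len(source)

v1_store = {"logs/2017-07.csv": b"...", "models/spark.bin": b"..."}
v3_store = {}
copied = migrate(v1_store, v3_store)
# After verifying, point the app (e.g. Analytics for Apache Spark)
# at the v3 endpoint and unprovision the v1 service.
```

The verification pass matters: once the 30-day deadline hits, anything that failed to copy is gone.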

The fact that IBM is shutting down the service after only 20 months may concern some users, especially as they are being forced to manually move their data: platform upgrades for SaaS are usually non-disruptive.

V3 of IBM's Object Store needs to be superior to its predecessors; Gartner's recent Magic Quadrant for Cloud Infrastructure as a Service placed IBM in the 'Visionaries' category, marking a firm with a relatively complete vision but a low ability to execute on that vision. The research firm said the cloud feature-set 'has not improved significantly since the IBM acquisition [of SoftLayer] in mid-2013; it is SMB-centric, hosting-oriented and missing many cloud IaaS capabilities required by midmarket and enterprise customers.'

Gartner added, 'IBM has, throughout its history in the cloud IaaS business, repeatedly encountered engineering challenges that have negatively impacted its time to market. It has discontinued a previous attempt at a new cloud IaaS offering, an OpenStack-based infrastructure that was offered via the Bluemix portal in a 2016 beta. Customers must thus absorb the risk of an uncertain roadmap. This uncertainty also impacts partners, and therefore the potential ecosystem.'

Go here to read the rest:
IBM deprecates its first cloud storage platform - www.computing.co.uk

Read More..

Senate Resurrects Cloud Storage Protections Bill | Broadcasting … – Broadcasting & Cable

A bipartisan bill, the ECPA Modernization Act, has been introduced that would update communications privacy law to protect cloud storage. It is the latest effort by the Senate to address the issue after the House voted overwhelmingly to protect older data.

In the previous Congress, Senate Judiciary Committee chairman Charles Grassley (R-Iowa) pulled an Electronic Communications Privacy Act update bill from the committee's markup agenda after "poison pill" amendments threatened to expand the bill into areas where neither of its co-sponsors wanted it to go.

That baseline bill, which passed the House 419 to zero, would have updated the Electronic Communications Privacy Act to provide protections for cloud storage by requiring a probable cause warrant for accessing information in the cloud and extending the protections to emails and other content stored over 180 days (currently no warrant is required to access those).

"Americans don't believe the federal government should have warrantless access to their emails just because they are 180 days old," Lee said in a statement on the new bill. "They don't believe the government should be able to always know where you are just because you are carrying a cell phone. It is long past time that Congress updated our federal laws to better protect Americans' privacy."

"Americans expect and deserve strong, meaningful protections for their emails, texts, photos, location information and documents stored in the cloud," said Leahy.

Leahy helped write the original ECPA law and has said no one anticipated the way communications would be transmitted and stored.

"In the digital world, Americans deserve the same privacy protections that we have for our papers and personal information in the physical world," said Adam Brandon, president of the free market, small government group FreedomWorks. "Senator Lee's efforts to reform ECPA's outdated standards will restore the protections that our founders enshrined in the Constitution. I'm glad to see Sens. Lee and Leahy's continued leadership on this important issue."

"Internet-era privacy reforms are long overdue and we commend Senators Lee and Leahy for their bill to clearly extend Fourth Amendment protections to emails and geolocation information stored in the cloud," said Ed Black, president of the Computer & Communications Industry Association. "As most individuals' communications are now stored online, law enforcement should obtain a warrant before demanding access. This principle is equally true for the intimate information contained in users' digitally stored location data. The Lee-Leahy bill will ensure that the Constitution's protections for individual privacy are reflected in how information is stored and accessed in the 21st Century."

"ECPA was enacted long before many of us knew what email was, let alone used it, and over 30 years later it is woefully out of step with our everyday world of communicating through connected devices and cloud computing," said Andy Halataei, SVP for government affairs for tech trade group ITI. "Electronic communications contain the most sensitive details about our lives, but unlike a filing cabinet or desk drawer in our homes, the government can access emails and other online content without a warrant after 180 days. Like ECPA reforms unanimously passed by the House earlier this year, Sens. Lee and Leahy's bill reflects how we use cloud services to communicate by granting our electronic communications the same Constitutional protections enjoyed by the papers and effects we keep in our homes."

Follow this link:
Senate Resurrects Cloud Storage Protections Bill | Broadcasting ... - Broadcasting & Cable

Read More..

Geared up to make best use of cloud storage, AI: Naidu – The Hindu

Chief Minister N. Chandrababu Naidu said Andhra Pradesh was far ahead of other States in harnessing the power of Information Technology (IT) and that it was geared up to make the best use of the fourth industrial revolution driven by Artificial Intelligence (AI) and Internet-of-Things (IoT).

Inaugurating Pi Data Centre (PDC), the State's first such facility, at the A.P. Industrial Infrastructure Corporation's IT Park here on Friday, Mr. Naidu said he had been associated with IT and IT-enabled services for over three decades and had played an instrumental role in the growth of Hyderabad (Cyberabad) into a global IT hub.

He said that A.P. had been a pioneer in deploying IT in the implementation of a variety of welfare schemes, and rued the fact that he lost the 2004 elections owing to the perception that he over-emphasised advanced technologies, which are now helping A.P. grow at a faster clip with a greater degree of transparency and accountability.

He exuded confidence that the establishment of PDC would encourage more companies in the IT space to enter Amaravati, the upcoming capital city of Andhra Pradesh. He claimed that the real-time governance architecture rolled out by the Government of A.P. (GoAP) facilitated speedy decisions, and corruption could be checked to a large extent.

Mr. Naidu said the GoAP had taken 100 racks in the PDC and would seek more storage and backup space as smartphones and other new technologies reach the next levels of growth, bringing a manifold increase in the generation of data and in the need to process and analyse it for varied applications.

PDC Chairman and CEO Kalyan Muppaneni said he was thankful to the GoAP for its support in setting up the data centre with an initial capacity of 500 racks that is scalable to 5,000. He observed that the loss of data would have serious repercussions for companies, which brings data centres like PDC into the picture.

Mr. Kalyan said a sum of ₹600 crore is being invested in phases in the PDC at Mangalagiri, which is Tier-IV compliant and is the company's global headquarters.

Minister of IT and Panchayat Raj Nara Lokesh, Unique Identification Authority of India Chairman J. Satyanarayana, A.P. Principal Secretary (IT) K. Vijayanand, venture capitalists B.V. Jagadeesh and Sudheer Kuppam and Sridhar Vembu (Chairman, Zoho Corporation) were present.

View original post here:
Geared up to make best use of cloud storage, AI: Naidu - The Hindu

Read More..

Verizon data of 6 million users leaked online – CNNMoney

The security issue, uncovered by research from cybersecurity firm UpGuard, was caused by a misconfigured security setting on a cloud server due to "human error."

The error made customer phone numbers, names, and some PIN codes publicly available online. PIN codes are used to confirm the identity of people who call for customer service.

No loss or theft of customer information occurred, Verizon told CNN Tech.

UpGuard -- the same company that discovered leaked voter data in June -- initially said the error could impact up to 14 million accounts.

Chris Vickery, a researcher at UpGuard, discovered the Verizon data was exposed by NICE Systems, an Israel-based company Verizon was working with to facilitate customer service calls. The data was collected over the last six months.

Vickery alerted Verizon to the leak on June 13. The security hole was closed on June 22.

The incident stemmed from NICE security measures that were not set up properly. The company made a security setting public, instead of private, on an Amazon S3 storage server -- a common technology used by businesses to keep data in the cloud. This means Verizon data stored in the cloud was temporarily visible to anyone who had the public link.
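The mechanism is that simple: one grant in the bucket's access control list to Amazon's predefined "AllUsers" group makes the contents readable by anyone with the link. A sketch of auditing for that grant, operating on a dict shaped like what boto3's `get_bucket_acl()` returns (in practice you would fetch it with `boto3.client("s3").get_bucket_acl(Bucket=name)`; the ACL below is a made-up example):

```python
# Flag S3 ACL grants that open a bucket to the whole internet.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_grants(acl):
    """Return the permissions this ACL grants to the AllUsers group."""
    return [
        g["Permission"]
        for g in acl.get("Grants", [])
        if g.get("Grantee", {}).get("URI") == ALL_USERS
    ]

# Example ACL: the owner has full control, but someone also opened
# READ to AllUsers -- the kind of one-setting slip behind this leak.
acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS},
         "Permission": "READ"},
    ]
}
assert public_grants(acl) == ["READ"]   # bucket is world-readable
```

Running a check like this across all buckets is cheap insurance against exactly the "public instead of private" error described here.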

ZDNet first reported the breach.

Related: Data of almost 200 million voters leaked online by GOP analytics firm

The security firm analyzed a sample of the data and found some PIN codes were hidden but others were visible next to phone numbers.

UpGuard declined to disclose how the leaked data was discovered.

Dan O'Sullivan, a Cyber Resilience Analyst with UpGuard, said exposed PIN codes are a concern because they allow scammers to take over someone's phone service if they can convince a customer service agent they're the account holder.

"A scammer could receive a two-factor authentication message and potentially change it or alter [the authentication] to his liking," O'Sullivan said. "Or they could cut off access to the real account holder."

Verizon customers should update their PIN codes and not use the same one twice, O'Sullivan advises.

This is the latest leak to surface from a misconfigured Amazon S3 storage unit. In June, an analytics firm exposed the data of almost 200 million voters, and earlier this month an insecure server leaked 3 million WWE fans' data.

Why does this keep happening? Amazon secures these servers by default. This means the errors that occur are due to changes someone makes with a security setting -- typically by accident, O'Sullivan said.

O'Sullivan says the Verizon case highlights how many third-parties have access to our personal data.

"Cyber risk is a fact of life for any digital service," O'Sullivan said. "As data becomes more powerful and more accessible, the potential consequences for it to be misused also becomes more dangerous."

CNNMoney (New York) First published July 12, 2017: 4:14 PM ET

Visit link:
Verizon data of 6 million users leaked online - CNNMoney

Read More..

Cisco Launches New UCS Servers, Hybrid Cloud Management … – SDxCentral

Cisco today debuted new servers and software, which includes a hybrid cloud management tool.

The new cloud tool is called the Workload Optimization Manager, and it's powered by cloud management software from Turbonomic. The product uses intent-based analytics to match workload demand to infrastructure supply across on-premises and multi-cloud environments.

It also compares costs of moving workloads from public clouds, older Cisco servers, and non-Cisco machines to the latest Unified Computing System (UCS) M5 servers announced today.

"Cisco's own IT department started using Turbonomic for data center management, and they said, 'you need to check this out,'" said Joann Starke, senior manager of Cisco Data Center Solutions.

The company used the software to manage 30 million watts of raised data center floor space. Eighteen months after installing the product, the IT department optimized half of their data center environments and downsized the data center footprint by one-quarter. This saved the company $17 million in equipment costs over the same time frame, and $2.8 million every month in rental costs for space.

The hybrid cloud management software also integrates with UCS infrastructure, which enables Cisco customers to identify idle machines and increase workload density.

Cisco also updated its UCS Director, the software that manages the UCS hardware.

UCS Director 6.5 extends automation capabilities beyond infrastructure by automating native PowerShell functions and virtual machine mobility across vCenter data centers, and by adding support for the VMware VMRC console.

It also integrates with Workload Optimization Manager, which enables the automatic creation of a new virtual machine or configuration of a physical server by UCS Director. Workload Optimization Manager then reallocates resources. This ensures application performance and cost efficiency, Starke said.

"It is our plan and our vision to expand this across Cisco's entire hybrid cloud stack," she added.

The software updates and new workload management tool help IT departments modernize their data centers with automation, Starke said.

"Workloads are increasing by 26 percent, year over year, but IT budgets are only increasing by 3 percent," she explained. "Clearly we have a gap, and you can't hire enough humans to fill it. You need automation. You're letting software manage software."

In addition to the software, Cisco today launched new UCS M5 servers. They are built on the Intel Xeon processors, also announced today. Of the five new Cisco machines, three are rack servers and two are blade servers.

The servers include up to double the memory capacity of previous systems and deliver up to 86 percent higher performance compared to the previous generation of UCS, Cisco claims.

"Our customers are telling us they want faster applications with fewer complications," said Todd Brannon, director of product marketing, unified computing at Cisco. "The demand for real-time analytics: the trend there is certainly pointing upward."

Customers want servers with more memory and more graphics processing units (GPUs), which accelerate machine learning algorithms, Brannon added. To this end, one of the new blade servers includes a half-width blade form factor, which allows it to support two GPUs. Cisco says this is an industry first.

Additionally, one of the new rack servers tripled its GPU support, so it now can support six.

The hardware's key differentiator is really the software, Brannon said. "Where others wrap their servers up in sheet metal, we're wrapping them up in software," he said. "It's definitely all about the software for us in UCS."

When asked about the new servers and software, analyst Patrick Moorhead, president of Moor Insights & Strategy, said, "I like what I see, particularly for current UCS customers. Their new hardware and software are focused on solving real problems, and the automation is differentiated."

But, he added, he'd like to hear more about Cisco's server security. "The new attack point is server firmware, less so on the network and client device," he explained.

The rest is here:
Cisco Launches New UCS Servers, Hybrid Cloud Management ... - SDxCentral

Read More..

New Azure servers to pack Intel FPGAs as Microsoft ARM-lessly embraces Xeon – The Register

Microsoft may have said ARM servers provide the most value for its cloud services back in March, but today it's given Intel's new Xeons a big ARM-less hug by revealing the hyperscale servers it uses in Azure are ready to roll with Chipzilla's latest silicon and will all use Chipzilla's field programmable gate arrays.

Those servers are dubbed Project Olympus and Microsoft has released their designs to the Open Compute Project. In a post doubtless timed to coincide with the release of the new Xeons, Microsoft reveals it "worked closely with Intel to engineer Arria-10 FPGAs, which are deployed on every single Project Olympus server, to create a 'Configurable Cloud' that can be flexibly provisioned and optimized to support a diverse set of applications and functions."

Redmond also praises the Xeon Scalable Processors as being jolly powerful and all that, which will help Azure to scale and handle different workloads. But it's the news that Redmond's all-in with Intel Arria FPGAs that must be warming cockles down Chipzilla way, as using Xeons as the main engine and tweaking them for different roles with FPGAs is Intel's strategy brought to life.

IBM's also embraced the new Xeons, gushing that it will be the first to offer them on bare metal cloud servers. But not, in all likelihood, the first to use them at all: Google claims to have been running them since June 1st, 2017.

The deal that gave Google early access to Skylake Xeons was thought to be one reason Microsoft let its excitement about ARM servers emerge into public view.

But The Register does not believe that ardour and today's kind words for Xeon are mutually exclusive: Redmond is surely contemplating future Azure architectures, so while Wintel looks strong today, there's still plenty of time in which the alliance could splinter.

Read more from the original source:
New Azure servers to pack Intel FPGAs as Microsoft ARM-lessly embraces Xeon - The Register

Read More..

Server vendors board the Xeon SP party bus – The Register

As expected when Intel processors power virtually all x86-class servers, the vendors all hopped on the Skylake Xeon SP party bus.

They hope to ride the server update into the market better than each other, and get every last upgrade penny they can pocket.

Cray's XC50 supercomputers and CS line of cluster supercomputers will be available with Xeon Scalable Processors and should run their jobs faster than before.

Cray XC50 supercomputers with Xeon SPs are available now. The Cray CS500 cluster supercomputers and CS-Storm accelerated cluster supercomputers with Xeon SP will be available in the third quarter.

Fujitsu is announcing its new, refreshed range of dual and quad-socket PRIMERGY servers and octo-socket PRIMEQUEST business critical servers using Xeon SPs.

The line includes the multi-node and modular PRIMERGY CX400 M4, which has blade servers inside a rack chassis.

Technical features include DDR4 memory modules and up to 6TB capacity in quad-socket PRIMERGY servers, flexible configuration options to support mix-and-match of storage drive bays and graphics processing units (GPUs) to accelerate high-performance computing, hyperscale, and enterprise data centre workloads.

Fujitsu says its PRIMEQUEST server pushes the performance envelope of SAP HANA up to 12TB.

These server lines are available worldwide from Fujitsu and its distribution partners. Prices vary by region, model and configuration.

Taipei-based Gigabyte's Xeon SP line will initially offer four new 1U-form factor and four new 2U-form factor systems, as well as two motherboard SKUs that support the Scalable series, and have a range of options for storage and expansion slots.

Gigabyte R281 rack server

Check them out here. No specific Xeon SP processors are detailed.

HPE's social media whizz, Calvin Zito, has two videos available showing HPE product people talking about Xeon SP use in the DL380 and DL560 ProLiant gen 10 server line. Gen 10 signifies Xeon SP use.


IBM? Launching Xeon SP servers? Didn't it sell off its x86 server line to Lenovo way back when? Yes, it did, but this time around the x86 block it's launching bare metal Xeon SP servers in the IBM Cloud. They'll use Xeon Silver 4110 or Xeon Gold 5120 processors, Big Blue burbles, for faster insights from big data workloads.

These join its POWER servers in the cloud, and will be available in IBM Cloud data centres in the US, UK, Germany and Australia from Q3 2017. We don't know who is building these servers for IBM.

Lenovo is refreshing its blade, rack, tower, dense, mission-critical and hyperscale servers. The SN550 and SN850 blades support Xeon SP Platinum CPUs, as do the SR530, SR550, SR630 and SR650 rack servers. So far its website doesn't specify Xeon SP support for the SR570 and SR590 rack servers.

The ST550 tower and SD530 dense servers support Xeon SP Platinums, but Lenovo doesn't say whether the liquid-cooled SD650 does.

Both mission-critical servers, SR850 and SR950, fly the Xeon SP Platinum colours.

Lenovo has announced 42 new world-record benchmarks for its ThinkSystem server portfolio integrated with Xeon SP processors.

The TPC-E benchmark uses a database to model a brokerage firm with customers who generate online transactions related to trades, account inquiries, and market research.

The TPC Benchmark H (TPC-H) is a decision support benchmark for systems that examine large volumes of data, execute complex queries, and return answers.

Lenovo said it is:

The STAC-M3 benchmarks measure workloads in time-series analytics.

Specific Xeon SP CPU models used in these servers are not yet listed.

White box server king Supermicro has a new X11 generation server and storage motherboard and chassis line optimised for the Xeon SPs.

There are X11 dual-processor (DP) and uniprocessor (UP) Serverboards and SuperWorkstation boards, with single, dual and quad-socket motherboards. Xeon SPs up to the Platinum models with 28 cores are supported.

Supermicro claims it offers the most extensive range of computing products with this Xeon SP line for data centre, enterprise, cloud, HPC, Hadoop/Big Data, AI/deep learning, storage, and embedded environments.

Charles Liang, Supermicro president and CEO, said: "Our Server Building Block Solutions are designed to not only take full advantage of Xeon Scalable Processors' new features such as three UPI, faster DIMMs and more core count per socket, but also fully support NVMe through unique non-blocking architectures that achieve the best data bandwidth and IOPS."

Find out more here.

Hyperconverged infrastructure appliance software supplier Maxta says it has immediate support for any server designed for Xeon SP CPUs.

Benchmark testing of a Maxta HCI cluster configured with Xeon Platinum 8168 processors and Intel data centre SSDs with NVMe delivered a storage performance gain of 120 per cent in IOPS, with less than half the storage latency compared to previous Xeon technology.

Using QuickAssist Technology to offload and accelerate real-time data compression operations, the platform offered a further performance gain of 25 per cent and a 13 per cent decrease in latency. Nice.

There's no need, it purrs, to wait months for hardware-based hyperconvergence products to integrate Xeon SP technology.

See more here:
Server vendors board the Xeon SP party bus - The Register

Read More..

Microsoft’s data centre investment to boost SA cloud adoption – IT News Africa

July 12, 2017 General, Opinion, Southern Africa

Robert Marston, Global Head of Product at SEACOM.

That's the word from Robert Marston, Global Head of Product at SEACOM, who says that Microsoft's decision to locally host cloud services such as Azure, Office 365 and Dynamics 365 will not only enable it to offer better performance and lower latencies, thereby providing a better end-user experience, but will critically lower the cost barrier for adoption of these services for enterprise customers.

"This represents a significant step forward for South Africa's IT industry because it means organisations can access Microsoft's rich selection of cloud services from a local data centre," he adds. "This will not only give them more reliability, faster speeds and lower latencies than they can get when accessing cloud services from data centres in Europe or the US, but will also cut out international connectivity costs, which have typically been a barrier to entry for the move to the cloud."

"Up until now, many enterprises have chosen private clouds, or to work with local public cloud providers that lack the capabilities of the world's top cloud computing providers, in an effort to ensure reliable, fast access to cloud applications and services," says Marston. "When the Microsoft data centres go live in 2018, enterprises will be able to enjoy the full Azure experience with no performance compromises or hefty data charges."

"Many CIOs will also be reassured that their data is hosted in a local data centre that complies in full with South African data protection laws," he adds. Knowing that the data centres are in Cape Town and Johannesburg rather than Ireland or Germany will give many organisations the confidence to migrate more aggressively to the cloud, Marston says.

"With Microsoft hosting its cloud services locally, it is only a matter of time before the other cloud providers also start to set up data centres in Africa," says Marston. "The investment from Microsoft is a signal that the cloud market in South Africa and the rest of Africa has come of age. This is a great time for companies who have not yet made extensive use of cloud services and applications to jump in," he adds.

The cloud business model simply makes sense for Africa, where skills and budgets are often scarcer than in more developed markets. Using cloud computing for solutions such as infrastructure-as-a-service and software-as-a-service promises African organisations massive savings over operating and maintaining their own data centres and on-site IT infrastructure. It also enables them to ramp computing capacity up and down in response to changing business needs, and gives them the flexibility they need to innovate and grow with little risk. This is all being made possible by the growing availability and falling cost of high-speed fibre Internet access.

Until the Microsoft African data centres go live, African businesses can fast-track the digital transformation of their organisations by tapping into affordable, dedicated Ethernet links on SEACOM's resilient network between their own IT environments and the world's leading providers of cloud computing services, such as Google, Microsoft, Amazon and IBM. SEACOM offers a high-performance, secure alternative to accessing cloud services across the public Internet.

By Robert Marston, Global Head of Product at SEACOM




See the rest here:
Microsoft's data centre investment to boost SA cloud adoption - IT News Africa

Read More..