
Number of Bitcoin Addresses Holding at Least 1 BTC Hits New ATH – Ethereum World News

Quick take:

September has been a turbulent month for Bitcoin in the crypto markets. It started with BTC breaking the $12k ceiling, only to drop below $10k on numerous occasions before leveling off at current levels above $10,200.

However, the constant volatility of Bitcoin has not swayed investors from owning more BTC. According to the team at Glassnode, the number of Bitcoin addresses holding at least 1 BTC has grown continuously over the years. Their holders, known as wholecoiners, recently hit a new all-time high of 823,000. The team at Glassnode shared the observation via the following tweet.

Connecting the dots, the growing number of bitcoin addresses holding at least one Bitcoin is evidence that investors remain confident in BTC as a store of value. The concept of Bitcoin as a store of value, or digital gold, has been explored numerous times by crypto enthusiasts and analysts, most recently by Tyler Winklevoss, who explained why this is so.

Bitcoin is not just a scarce commodity, it's the only known commodity in the universe that has a deterministic and fixed supply. As a result, bitcoin is not subject to any of the potential positive supply shocks that gold (or any commodity for that matter) may face in the future.

Bitcoin is the first commodity in the universe where supply does not follow demand. Demand for bitcoin does not, and cannot, expand its supply.

Beyond superior supply attributes, bitcoin possesses all of the other characteristics that make gold valuable and actually performs better on a side-by-side comparison.
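That fixed supply follows directly from Bitcoin's halving schedule. As a quick illustration, here is a minimal Python sketch, using the well-known consensus parameters (a 50 BTC initial block subsidy that halves every 210,000 blocks), showing total issuance converging to just under 21 million BTC:

```python
# Sum Bitcoin's block subsidies across halving epochs.
# Consensus parameters: 50 BTC initial subsidy, halving every 210,000 blocks.
HALVING_INTERVAL = 210_000           # blocks per subsidy epoch
INITIAL_SUBSIDY = 50 * 100_000_000   # first-epoch subsidy, in satoshis

def total_supply() -> float:
    """Total BTC ever issued, assuming the subsidy halves each epoch
    until integer division rounds it to zero (as Bitcoin's code does)."""
    supply_sats = 0
    subsidy = INITIAL_SUBSIDY
    while subsidy > 0:
        supply_sats += subsidy * HALVING_INTERVAL
        subsidy //= 2  # integer halving, mirroring the consensus rule
    return supply_sats / 100_000_000

print(f"Asymptotic supply: {total_supply():,.8f} BTC")  # ~20,999,999.9769 BTC
```

No demand shock changes any term in that loop, which is the substance of the "supply does not follow demand" point.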

The concept of owning at least one Bitcoin was also discussed in a recent episode of the Joe Rogan Experience featuring Adam Curry. This episode of the famous podcast has been shared widely in the crypto-verse and can be found on YouTube. The particular section of the podcast discussing owning at least one Bitcoin can be found in the following tweet.

Link:
Number of Bitcoin Addresses Holding at Least 1 BTC Hits New ATH - Ethereum World News

Read More..

The adjusted on-chain volume of Bitcoin and Ethereum hit a 30-month high in August – Yahoo Finance

The total adjusted on-chain volume of Bitcoin and Ethereum reached a 30-month high during the month of August.

Source: Coin Metrics, The Block Research

Combined, the total adjusted on-chain volume for the two networks grew 38.3% month-over-month, as noted in a by-the-numbers breakdown for August produced by The Block Research.

Bitcoin's total adjusted on-chain volume grew by 22.5%, from $66.1 billion in July to $80.9 billion in August, while Ethereum's jumped 81.7%, from $24 billion in July to $43.5 billion in August.

Bitcoin's on-chain volume was 1.85 times that of Ethereum's last month, according to the report.

© 2020 The Block Crypto, Inc. All Rights Reserved. This article is provided for informational purposes only. It is not offered or intended to be used as legal, tax, investment, financial, or other advice.

Continue reading here:
The adjusted on-chain volume of Bitcoin and Ethereum hit a 30-month high in August - Yahoo Finance

Read More..

‘Massive moment’ as first ever DLC smart contract deployed on the Bitcoin mainnet – Cointelegraph

Two crypto enthusiasts just made a $10,000 Bitcoin bet on the next US president using a smart contract.

According to a Sept. 8 post from podcaster Marty Bent, the Bitcoin (BTC) mainnet has seen its first ever discreet log contract (DLC), which he calls "a very important moment in Bitcoin history".

Fitting in with the general theme of 2020, the smart contract is highly political. Nicolas Dorier, the BTCPay developer who claims to have made BitPay obsolete, used a DLC to bet on the winner of the 2020 Presidential Election in the United States: Donald Trump or Joe Biden.

Dorier's DLC offer, which can be viewed on GitHub, is between himself and Suredbits founder Chris Stewart, with Outcome Observer acting as a third-party oracle.

"When the election is finalized, the Outcome Observer will broadcast a signature that either I or Nicolas can use to settle the bet," said Stewart. "If Trump wins, Nicolas will receive 1 BTC. If Biden wins, I will receive 1 BTC."

The Bitcoin engineer seemed to have all possibilities covered, including full refunds for both himself and Dorier if a third-party candidate were to win, or even if the oracle were to somehow disappear.

DLCs allow two or more parties to enter into a smart contract agreement regarding a future event. Until now, such smart contract technology has been more at home on platforms like Ethereum and EOS, as Bitcoin has eschewed complexity in favor of security. But work on adding smart contract technology to Bitcoin has been ongoing thanks to Suredbits.
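To make the mechanics concrete, here is a simplified Python sketch of the settlement flow described above. It is not the actual DLC construction, which uses oracle signatures and adaptor signatures over pre-signed Bitcoin transactions; the HMAC-based "attestation" and all names below are stand-ins for illustration:

```python
# Simplified sketch of DLC-style settlement: the parties pre-agree on
# payout branches, and the oracle's later attestation unlocks exactly
# one branch. The HMAC here stands in for the oracle's real signature.
import hmac
import hashlib

ORACLE_KEY = b"outcome-observer-demo-key"  # hypothetical oracle secret

# Payout branches agreed up front (in BTC). Each party funds 1 BTC, so
# the winner takes the 2 BTC pot; a third-party win refunds both sides.
PAYOUTS = {
    "trump": {"dorier": 2.0, "stewart": 0.0},
    "biden": {"dorier": 0.0, "stewart": 2.0},
    "other": {"dorier": 1.0, "stewart": 1.0},
}

def attest(outcome: str) -> bytes:
    """Oracle publishes an attestation to the observed outcome."""
    return hmac.new(ORACLE_KEY, outcome.encode(), hashlib.sha256).digest()

def settle(outcome: str, attestation: bytes) -> dict:
    """Either party can close the matching branch with a valid attestation."""
    if not hmac.compare_digest(attestation, attest(outcome)):
        raise ValueError("attestation does not match the claimed outcome")
    return PAYOUTS[outcome]

print(settle("biden", attest("biden")))  # {'dorier': 0.0, 'stewart': 2.0}
```

The key property the sketch preserves is that neither party can settle a branch the oracle did not attest to, and the oracle never needs to know the contract's terms.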

"I don't think [Bitcoiners] need to concede that territory to Ethereum," said Stewart. "There are plenty of powerful primitives in Bitcoin that allow you to do advanced applications with Bitcoin."

Bent was similarly enthusiastic about the news, calling the smart contract bet a "massive" moment in the history of Bitcoin.

"DLCs may be the least appreciated killer application of Bitcoin out there right now," said the podcaster. "It will be very interesting to see the types of applications that begin to proliferate as more people come around to this type of smart contract."

In an apparent rebuke to those who criticize Bitcoin's slow pace of development, Bent said the deliberate approach had clearly been paying dividends.

Read the rest here:
'Massive moment' as first ever DLC smart contract deployed on the Bitcoin mainnet - Cointelegraph

Read More..

Crypto Borrowing: Here Are Seven of the Best Interest Rates on the Market | Finance – Bitcoin News

Cryptocurrency is sizing up traditional finance on its legacy turf of lending and borrowing with competitive interest rates (currently as low as 0.44% per year for ethereum and 4.50% per year for bitcoin) as well as less cumbersome verification procedures. Crypto holders present their virtual assets as collateral to get loans paid out in fiat or stablecoin. The option allows one to keep an immediate financial need separate from a long-term crypto investment, as well as to avoid a taxable sale of crypto funds.

Investors are also able to lend their digital assets and earn passive income of as much as 12% on their deposits, significantly higher than conventional institutions generally offer. Whereas bank customers may currently be recording negative interest on their money due to the Covid-19-induced global economic recession, crypto lenders put their money to work for them.

Risks in the growing market include the theoretical vulnerability of smart contracts to hackers and a lower level of regulation for the exchanges, including decentralized ones, and wallets offering the service.

News.Bitcoin.com briefly profiled the platforms that offer the best virtual asset borrowing rates. Services are ranked for BTC and ETH according to data provided by Coinmarketcap. The ethereum space is dominated by decentralized finance (DeFi) protocols, while bitcoin borrowing is dominated by centralized wallets and exchanges. All featured services also support lending.

Dydx offers the best borrowing rate for ether at 0.44% per annum. The decentralized exchange's interest rates fluctuate based on the supply and demand of loans and deposits of the particular crypto-asset. Dydx allows users to leverage positions up to 4x, and users can borrow directly to a wallet. The minimum starting account collateralization is 125% and must be maintained above 115% to avoid liquidation of the account.
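For a sense of how those thresholds work in practice, here is a minimal sketch of the collateralization arithmetic, assuming the 125%/115% figures quoted above; the function name is invented for illustration and is not Dydx's API:

```python
# Illustrative collateralization math for a margin account under the
# thresholds quoted in the article: 125% to open, liquidation below 115%.
OPEN_MINIMUM = 1.25
LIQUIDATION_FLOOR = 1.15

def account_status(collateral_usd: float, borrowed_usd: float) -> str:
    ratio = collateral_usd / borrowed_usd  # collateral value / debt value
    if ratio < LIQUIDATION_FLOOR:
        return f"liquidatable (ratio {ratio:.0%})"
    if ratio < OPEN_MINIMUM:
        return f"open, but too thin to borrow more (ratio {ratio:.0%})"
    return f"healthy (ratio {ratio:.0%})"

print(account_status(1_250, 1_000))  # healthy (125%)
print(account_status(1_200, 1_000))  # open, but too thin to borrow more (120%)
print(account_status(1_100, 1_000))  # liquidatable (110%)
```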

Nuo offers a rate of 2.33%. Like Dydx, the decentralized platform allows users to margin trade cryptocurrency in addition to lending and borrowing. Rates similarly fluctuate depending on supply and demand. Users can leverage trade up to 3x and borrow up to 0.7x of the collateral amount.

Compound Finance is also a decentralized exchange. It currently offers a borrowing rate of 3.06%. Users can deposit one crypto-asset and request a loan of other digital tokens. Rates fluctuate based on supply and demand. The collateral factor for ETH is 75%: for example, a user with ETH collateral worth $100 can borrow up to $75.
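The collateral-factor math generalizes to any deposit size; a tiny sketch, assuming the 75% figure above:

```python
# Borrow limit under a Compound-style collateral factor; the article
# quotes 75% for ETH, so $100 of collateral caps borrowing at $75.
def borrow_limit(collateral_usd: float, collateral_factor: float = 0.75) -> float:
    return collateral_usd * collateral_factor

print(borrow_limit(100.0))  # 75.0
```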

Sitting atop the BTC list for best borrowing rates with 4.50%, Celsius is a wallet that allows customers to deposit and loan virtual currencies. The centralized service fixes all interest rates for its users, and incentivizes use of its CEL token with better rates for deposits. Celsius started in 2018 with a minimum loan of $10,000, which has since been lowered several times to the current minimum of $1,000.

Coinloan is tied with Celsius for the top spot with a 4.50% borrowing rate. Depositors can monitor interest on their crypto, stablecoin, or fiat investments in real time and withdraw funds any time on demand. To get 100,000 euros ($118,000) with a loan-to-value ratio of 60%, a user needs to deposit 26 BTC.

Bitrue offers an interest rate of 5.85%. The centralized exchange sets the asset type, capacity, and yield for each deposit product. It also offers loans of crypto-assets, backed by the user's deposits.

Nexo has a remarkably low minimum loan of $10, at an interest rate of 5.9% per year. Like most wallets and exchanges in the business, no credit checks are involved. The credit line limit is calculated according to the value of the assets. Nexo fixes interest rates for its users and offers a variety of currencies including stablecoins, the U.S. dollar, the British pound, and the euro.

What do you think of the prevailing interest rates for borrowing cryptocurrency? Let us know in the comments section below.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

See the original post:
Crypto Borrowing: Here Are Seven of the Best Interest Rates on the Market | Finance - Bitcoin News

Read More..

Argentina Government Refuses to Pay $4M Bitcoin Ransom to Hackers Who Paralyzed Its Borders | News – Bitcoin News

The government of Argentina has reportedly refused to pay a $4 million bitcoin ransom demanded by hackers who hijacked the country's immigration systems, temporarily crippling cross-border movement.

On Aug. 27, the cybercriminals, now identified as a group calling itself Netwalker, hacked Argentina's immigration agency, Dirección Nacional de Migraciones, in an attack that halted border crossings in and out of the Latin American country for up to four hours.

The thieves allegedly stole sensitive information and are demanding millions of dollars in bitcoin (BTC) to decrypt the files, according to a Sept. 6 report by Bleeping Computer. Initially, Netwalker wanted $2 million worth of bitcoin but later doubled the ransom to about 356 BTC (or $4 million at the time).

However, the Argentinian government is refusing to negotiate with the hackers and will not pay the demanded ransom. As reported by local publication Infobae, officials say the cybercriminals did not attack the critical infrastructure of the immigration agency and did not steal anything sensitive, whether personal or corporate.

Authorities are adamant that they will not negotiate with the hackers, and neither are they too concerned with getting the data back, Infobae reported, quoting María Eugenia Lachalde, a lawyer who represents the agency. Lachalde detailed that the attack affected normal operations serving the public, both in administrative offices and at immigration control posts.

In response, the government shut down the entire computer system of the immigration department to prevent the malware from spreading to other networks. As a result, all border crossings throughout Argentina stopped for four hours. When immigration officials first noticed the attack on Aug. 27, they made an SOS call to higher offices:

(The team) realized that it was not an ordinary situation and evaluated the Central Data and Distributed Servers infrastructure, noting the activity of a virus that had affected the system's MS Windows-based files (mainly Adad Sysvol and System Center DPM) and Microsoft Office files (Word, Excel, etc.) on users' workstations and in shared folders.

What do you think about the Argentina ransomware attack? Let us know in the comments section below.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

See original here:
Argentina Government Refuses to Pay $4M Bitcoin Ransom to Hackers Who Paralyzed Its Borders | News - Bitcoin News

Read More..

Square Aims to Stop Patent Trolls From Killing Bitcoin Adoption – Decrypt


American financial services firm Square has been a major proponent of cryptocurrency, from its Cash App, a popular place to buy Bitcoin, to its Square Crypto division, which has awarded several grants to date to developers of free, open-source projects.

Today, Square Crypto unveiled the next step in its push for the continued growth and wellbeing of the industry, launching the Cryptocurrency Open Patent Alliance, or COPA.

COPA continues the focus on open-source technologies that Square Crypto started with its grants, but broadens its horizons by inviting other crypto industry companies to join. To do so, companies must pledge never to assert patents on what Square calls "foundational cryptocurrency technology," unless it's to defend those technologies.

Furthermore, members will contribute to a shared patent library that allows fellow members to use those patents defensively against outside patent trolls, which Square says will "[give] even small companies a shield with which to protect themselves against patent aggressors."

"As you know, from Square Crypto to Cash App, Square is in the fight to keep Bitcoin and crypto free and open. The way to do this is to make sure that the tech driving both is available to everyone," Square Crypto tweeted today.

"The success of cryptocurrencies, as with any new technology, depends on people being able to build what they want, which is not possible when every new idea gets tied up by patent litigation."

"There is growing concern that patent lockup could stifle innovation and adoption, from Bitcoin to the most obscure cryptocurrency," it added. "In order to tackle this problem, the crypto community will need to once again do what it is so famous for and come together for the greater good."

Square Crypto is the only announced member thus far, and it has made a good-faith gesture by opening up all of its crypto patents for future COPA members to use as needed. Additionally, the company will fund all membership dues for the first year, after which fees will be set to cover operational expenses.

According to the COPA website, the organization will be run separately from Square by a board of nine people: three from the crypto and open-source community, three from the founding companies, and three more chosen from additional firms that join.

Companies that wish to join must agree to stay in the alliance for a minimum of three years, and any patents that they pledged not to assert (except defensively) during their time in COPA will still be subject to the Patent Pledge after they leave.

Read this article:
Square Aims to Stop Patent Trolls From Killing Bitcoin Adoption - Decrypt

Read More..

Qovery lets you deploy your application without managing your cloud infrastructure – TechCrunch

Remember how Heroku was a big breakthrough when it was first released? Qovery wants to do it again by building an abstraction layer between your code and your cloud infrastructure. You push code in a git repository and Qovery manages your services for you.

"It's a container-as-a-service platform for developers. Like on Heroku, you just have to put a .qovery.yml file to describe the dependencies you need," co-founder and CEO Romaric Philogène told me.
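For flavor, here is a purely hypothetical sketch of what such a manifest could look like; the keys below are invented for illustration and are not Qovery's documented schema:

```yaml
# Hypothetical .qovery.yml sketch. These keys are invented for
# illustration and are NOT Qovery's documented schema; they only
# convey the idea of declaring dependencies next to the code.
application:
  name: my-api
  project: demo
databases:
  - type: postgresql
    version: "11"
    name: my-api-db
brokers:
  - type: rabbitmq
    name: my-api-events
routers:
  - name: main
    routes:
      - application_name: my-api
        paths: ["/"]
```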

Essentially, Qovery sits in between your git repository and your cloud infrastructure account; the company doesn't take care of cloud hosting itself. You can connect your Qovery account with your GitHub, GitLab or Bitbucket account so that it automatically gets triggered when you push new code.

After that, Qovery automatically spins up new servers, managed databases and brokers (Kafka, RabbitMQ) for you. There are already ways to automate your deployment with Terraform and continuous integration/continuous delivery software, but Qovery makes it easy to get started.

More importantly, Qovery is building integrations with multiple cloud providers. It already works with Amazon Web Services and the team is currently working on DigitalOcean and Scaleway support. Next up, Google Cloud and Microsoft Azure are on the road map.

Interestingly, you can design your own infrastructure for each branch. For instance, if you have a development branch to try out new features or a staging branch, you can spin up new servers for this branch without having to recreate your production environment from the start.

And that's arguably Qovery's most important feature. According to the startup, cloud hosting will become commoditized: each provider will offer managed databases, message brokers, etc., and it will come down to reliability, pricing and support level. You can imagine having a production application on AWS and a development branch running on another cloud provider.

Behind the scenes, Qovery relies heavily on Terraform and Kubernetes, with an additional layer on top of them. Compared with Heroku's monolithic philosophy, it scales more efficiently, as it has been designed around micro-services from the ground up.

Qovery costs $15 per application per month. Many companies have dozens of applications running at the same time to handle different parts of a service, so if you switch everything over to Qovery, you'll pay $15 for each of them.

If you already have a CI tool that works with your development team, you can use it instead of Qovery's built-in CI service. And there's no lock-in effect: you can stop using Qovery once you have your own DevOps team.

The company has raised $1 million from Techstars and a long list of business angels.

Image Credits: Qovery

The rest is here:
Qovery lets you deploy your application without managing your cloud infrastructure - TechCrunch

Read More..

Actifio pushes cloud DR that accelerates slow object storage to near SSD speed – Blocks and Files

ESG has validated Actifio's claim that it can provide public cloud disaster recovery at near-SSD speed using slower, low-cost object storage in the Google Cloud Platform (GCP).

A yet-to-be-published ESG technical review, "Microsoft SQL Server Recovery and Performance with Actifio on Google Cloud," sponsored by Actifio, examined the company's claims.

ESG's test bed comprised a 967GB SQL Server production database instance running in GCP, backed up with Actifio to SSD-class storage, which GCP calls Persistent Disk. The backup was then replicated to a different GCP region and stored there on Google Cloud Nearline (object-class) storage.

It took Actifio 90 minutes to fully back up the database and replicate it to the second, DR region's Nearline storage. A policy was then set to have Actifio back up and replicate in an incremental-forever manner with application consistency.

Then ESG initiated an on-demand Actifio Sky appliance in the DR region and used Actifio Global Manager to spin up an on-demand SQL Server host there. This host mounted the backup image in the Nearline storage through the Sky appliance as a virtual block device.

This makes the Nearline objects look like block storage to the SQL Server host, which can only operate with block storage. Without this spoofing capability, the Nearline objects would have to be rehydrated to GCP's Persistent Disk storage class.

The Actifio appliance then brought the DR SQL Server instance online, 5.5 minutes after starting the DR process. Its performance was then compared to that of the production database.

With a 50:50 read/write mix, the Actifio DR instance running off Nearline storage, and with a Persistent Disk cache, provided 95 per cent of the performance of the production instance running on Persistent Disk. It provided 97.6 per cent of the production instances performance with an 80/20 read/write workload mix.

The ESG analysts then looked at the costs of this kind of DR, modelling the expected costs for a company that needs to support a 1TB SQL Server production environment over a three-year period, without taking into consideration capacity and performance growth requirements or soft costs like administration. They compared the Actifio DR costs to a more traditional model of running backups and replicating full copies of the database for each DR copy.

They stated in the paper: "The cost of storage for Actifio over three years is $76,080 compared to $648,000 in a traditional model. This represents a cost savings of $571,920, an 88 per cent cost reduction."

Actifio's SVP for product marketing, Chandra Reddy, said that Actifio is doing for DR what Snowflake does for data warehousing: "One of the basic characteristics of data warehousing is to grow from 10s of TB to 100s of TB or even PB very quickly and shrink back. This requires the use of a scalable storage layer. Snowflake chose to use cloud-based object storage such as AWS S3 or Google Cloud Storage or Azure Blob Storage."

Snowflake gets faster speed than raw cloud object storage can deliver by using a scale-out design in which each scale-out compute node delivers parallel execution of parts of queries. Also: "Each compute instance, known as a virtual warehouse (VW), caches data upon reading from object storage in its flash storage and memory. This ensures fast local data access with high IOPs for query execution."

The on-demand Actifio Sky appliances can be scaled out too, meaning Actifio cloud DR is essentially using the same cloud model as Snowflake.

Read more from the original source:
Actifio pushes cloud DR that accelerates slow object storage to near SSD speed - Blocks and Files

Read More..

Not Just in the Cloud: Serverless in Your Own Data Center – Data Center Knowledge

If you follow conversations about trendy DevOps technologies, you have probably heard of serverless functions. But you may not realize that serverless functions aren't just something available from public cloud providers. They can run in on-premises or colocation data centers, using hybrid or private cloud architectures.

If you've wanted to explore serverless functions without having to depend on a public cloud provider, keep reading for an overview of how and why to deploy serverless functions in your own data center or colocation facility.

Related: Explaining Knative, the Project to Liberate Serverless from Cloud Giants

A serverless function is an application or part of an application that runs as part of serverless architecture. Developers can simply load serverless functions into a serverless hosting environment, then configure the conditions that should trigger the functions to execute.

There is no need to configure entire operating system environments or install software in the traditional sense -- hence the "serverless" label, which is somewhat of a misnomer, because the functions are still hosted on servers, even though the server environment is abstracted from end users.
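In practice, a serverless function can be as small as a single handler with a conventional signature. A rough Python example in the style of Kubeless (one of the Kubernetes-based frameworks discussed below), where the platform injects the triggering event and runtime context:

```python
# A Kubeless-style Python handler: the platform calls this function
# when a trigger fires; there is no server process or OS image for
# the developer to manage.
def handler(event, context):
    # 'event' carries the trigger payload (HTTP body, queue message, ...);
    # 'context' describes the runtime. Exact fields vary by framework.
    name = event.get("data") or "world"
    return f"Hello, {name}!"
```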

Related: Cloudflare Wants to Eat AWSs Serverless Lunch

The serverless platforms that get the most attention, like Azure Functions and AWS Lambda, are public cloud services. The solutions are sometimes referred to as Functions-as-a-Service, or FaaS, because they enable users to deploy and execute serverless code using a cloud-based architecture that is similar to SaaS.

Although public cloud vendors have dominated the serverless market, there is nothing inherent in the serverless model that requires functions to be hosted in a public cloud. You can just as easily set up an environment within your own data center that allows your developers to deploy functions in a pain-free serverless way and execute them using an event-driven framework.

There are a number of reasons you may want to run serverless functions in your own data center. One is cost. Public cloud vendors charge you each time a serverless function executes, so you have a continuous ongoing expense when you use their services. If you run functions on your own hardware, most of your investment occurs upfront, when you set up the serverless environment. There is no direct cost for each function execution. Your total cost of ownership over the long term may end up being lower than it would be for an equivalent service in a public cloud.

Security is another consideration. By keeping serverless functions in your data center, you can keep all of your data and application code out of the cloud, which could help avoid certain security and compliance challenges.

Performance, too, may be better in certain situations for serverless functions that run in your own data center. For example, if the functions need to access data that is stored in your data center, running the functions in the same data center would eliminate the network bottlenecks you may face if your functions ran in the cloud but had to send or receive data from a private facility.

A final key reason to consider serverless solutions other than those available in the public cloud is that the latter services offer native support only for functions written in certain languages. Functions developed with other languages can typically be executed, but only by using wrappers, which create a performance hit. When you deploy your own serverless solution, you have a greater ability to configure how it operates and which languages it will support.

That said, the various serverless frameworks that are available for data centers have their own limitations in this respect, so you should evaluate which languages and packaging formats they support before selecting an option.

Deploying serverless functions in your own data center (or a colocation data center) is not much more complicated than running them in the public cloud. There are two main approaches to setting up a serverless architecture outside the public cloud.

The first is to run a private cloud within the data center, then deploy a serverless framework on top of it. In an OpenStack cloud, you can do this using Qinling. Kubernetes (which is not exactly a private cloud framework but is similar in that it lets you consolidate a pool of servers into a single software environment) supports Knative, Kubeless, and OpenWhisk, among other serverless frameworks.
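As a sketch of the first approach, a minimal Knative Service manifest looks roughly like the following; it assumes a Knative-enabled Kubernetes cluster, and the container image shown is Knative's published sample:

```yaml
# Minimal Knative Service: Knative builds routing and autoscaling
# (including scale-to-zero between requests) around this one object.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: helloworld
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go  # Knative's sample image
          env:
            - name: TARGET
              value: "serverless on-prem"
```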

The second approach is to use a hybrid cloud framework that allows you to run a public cloud vendor's serverless framework in your own data center. Azure Stack, Microsoft's hybrid cloud solution, supports the Azure serverless platform, and Google Anthos has a serverless integration via Cloud Run. (As for Amazon's cloud, AWS Outposts, its hybrid cloud framework, does not currently offer a serverless option.)

The first approach will require more effort to set up, but it yields greater control over which serverless framework you use and how it's configured. It may also better position you to achieve lower costs, because many of the serverless solutions for private clouds are open source and free to use.

On the other hand, the second approach, using a hybrid cloud solution from a public cloud vendor, will be simpler to deploy for most teams, because it does not require setting up a private cloud. It also offers the advantage of being able to deploy the same serverless functions in your data center or directly in the public cloud. A serverless function deployed via Azure Stack can be lifted and shifted with minimal effort to run on Azure Functions.

Serverless functions in the public cloud are very easy to deploy, but they do not offer the best cost, performance, or security for all types of workloads. For situations where the public cloud vendors' serverless solutions come up short, consider deploying serverless functions in your own data center or colocation facility.

Read more:
Not Just in the Cloud: Serverless in Your Own Data Center - Data Center Knowledge

Read More..

Why Public Cloud Data Center Spending Is At An All-Time High – CRN: Technology news for channel partners and solution providers

With data center spending hitting a near all-time high of $41.4 billion in the second quarter of 2020, the industry's performance during the COVID-19 pandemic tells two different stories. One is the rapidly growing spending of public cloud titans like AWS, Google and Microsoft on data center hardware and software; the other is enterprise data center spending, which is being hit by COVID-19 and related issues.

"While cloud service providers continue to go from strength to strength, elements of the enterprise market are being dogged by COVID-19 and related issues," said John Dinsdale, a chief analyst at Synergy Research Group, in an email to CRN.

Worldwide spending on data center hardware and software hit $41.4 billion in the second quarter of 2020, up 7 percent year over year, according to new data by IT market research firm Synergy Research Group.

Spending on public cloud data center infrastructure jumped 25 percent year over year to nearly $17 billion in the second quarter, hitting an all-time high. However, enterprise spending dropped 3 percent year over year to roughly $24.4 billion.

[Related: Nutanix On Microsoft Azure: 5 Big Things You Should Know]

Although enterprises are slowing down their investment in data centers amidst the coronavirus pandemic, cloud providers are shrugging off the impact of the pandemic and are continuing to invest heavily in their data centers.

"In the middle of a global pandemic, spending on data center infrastructure was almost at an all-time high, second only to the fourth quarter of 2019. That speaks volumes about the continued robust growth in both enterprise and consumer cloud services," said Dinsdale.

The coronavirus pandemic isn't slowing down cloud providers like AWS, Google, Oracle and Microsoft, which are spending billions on building and equipping new hyperscale data centers. The total number of large data centers operated by hyperscale providers jumped to 541 at the end of the second quarter of 2020, according to Synergy Research Group. AWS and Google opened the most new data centers in the last 12 months, accounting for over half of the total, followed by Microsoft and Oracle.

Dinsdale said there is also a geographic story behind the second-quarter public cloud spending growth. "The US market grew at a good pace in the quarter, but among the larger markets it was China that was the standout performer, jumping almost 35 percent from the second quarter of last year," he said.

China-based Inspur, which is now the fastest-growing server company in the world, was the market-leading vendor in terms of public cloud spending on data center hardware and software, according to Synergy. Dell Technologies was the second market leader for public cloud spending on data center infrastructure, followed by Microsoft and Huawei.

On the enterprise side, Microsoft led data center spending in the second quarter, followed by Dell Technologies, Hewlett Packard Enterprise (HPE), Cisco and VMware. Microsoft features heavily in the rankings due to its position in server operating system (OS) and virtualization applications.

The main hardware-oriented segments of servers, storage and networking in aggregate accounted for 75 percent of the data center infrastructure market. Virtualization software, cloud management, OS and network security account for the balance.

Dell was the leader in server and storage revenues, while Cisco was dominant in the networking segment, according to Synergy.

Here is the original post:
Why Public Cloud Data Center Spending Is At An All-Time High - CRN: Technology news for channel partners and solution providers

Read More..