
The future of serverless cloud looks a lot like physical servers – TechRepublic

In the cloud, hardware no longer matters, especially as the world goes gaga for serverless. At least, that would be the case but for one inconvenient truth: Serverless is powered by… servers.

Even if one accepts that the cloud increasingly allows developers to focus on writing code and not bother with how it's run, the hardware that powers the cloud looks set to matter for a long, long time, something Google's Kelsey Hightower humorously points out. In fact, in areas like edge computing, hardware has never mattered more, as former Goldman Sachs top technologist Don Duet told me in an interview. By Duet's reckoning, "The 'land grab' for the next generation of computing literally involves physical assets like land and fiber."

Whaaaaaat?

Duet believes this so strongly that he dumped his impressive Wall Street position to join an edge computing startup, Vapor IO, in Texas. Despite having architected the technology strategy for the world's preeminent investment banking firm, Duet needed to break away to solve a serious problem, that of the speed of light:

"In such a world, the cloud becomes an n-tier fabric that stretches from the centralized data center to the edge of the wireless network. The most interesting aspects of edge computing will emerge as the full capabilities of cloud computing get pushed to the edge."

Getting to the edge, however, means getting into the muck of physical infrastructure. To wit, Vapor IO signed a partnership with shared wireless infrastructure player Crown Castle to get access to over 40,000 cell towers and more than 60,000 miles of metro fiber, not to mention a growing small cell footprint. "A great deal of [edge computing success] depends on real estate and urban infrastructure," Duet says; in this case, that means a network of fully managed, programmable data centers across a nationwide footprint of edge locations.

Of course, Vapor IO is hardly unique in trying to deliver on edge computing. AWS announced Greengrass, and Microsoft has its Azure IoT. In Duet’s mind, however, these don’t go far enough, because they’re still too centralized, primarily focused on edge gateways and devices, and bringing only a small subset of cloud functionality to the edge.

For a true edge cloud, he argues, "[W]orkloads must run on cloud servers at the physical edge, adjacent to the devices, and directly cross-connected to the wireless network. IP addresses are presented at the edge nodes and handed off seamlessly at the edge, not resolved back in a central office location."

A true edge cloud, in other words, provides all the key attributes of a centralized cloud, only in the edge location, including elastic scalability on automatically provisioned equipment. It also delivers a direct connection to regional and centralized data centers, as well as the internet at large, providing the fast and seamless tiers of service that mission-critical computing demands. The key component is infrastructure: much of the attention goes to use cases, whether autonomous cars, virtual reality, and so on, but every one of them needs infrastructure that is reliable, secure, and highly distributed.

Not surprisingly, this "full cloud at the edge" requires a reconceptualization of data centers. Instead of billion-dollar behemoths stacked with servers, Vapor IO builds so-called "Vapor Chambers" that are nine feet in diameter, house 130 to 160 kW of compute power, and are completely self-contained and remotely operable. They're micro data centers, if you will, and serve as a tangible reminder that as much as we may want to eliminate our concern for hardware in the cloud, it is the nuts-and-bolts of physical servers that ultimately deliver this "serverless" reality, particularly at the edge.

Has your organization made progress toward implementing serverless computing? Share your experiences and advice with fellow TechRepublic members.



Tachyum bets on flash storage to re-architect the cloud data center – ZDNet


Cloud datacenters rely on acres of disk drives to store data, and startup Tachyum aims to change that with an all-flash cloud. The secret sauce is a combination of transistor physics and advanced data encoding. How will it work?

Tachyum’s founder and CEO, Dr. Radoslav Danilak, is an experienced chip designer, architect, and entrepreneur. His earlier startups, SandForce and Skyera, focused on flash storage.

Tachyum includes flash storage in its value proposition, but doesn’t stop there. Tachyum is developing a “Cloud Chip” that is optimized for low-power performance, combined with a software layer that enables current applications to run on their new architecture.

You’ve likely noticed that while transistors continue to get smaller, chip speeds have not improved. Why is that?

Smaller chip feature sizes are great for building fast transistors, but the resistance of the on-chip interconnecting wires increases as they shrink. That makes data harder and slower to move, limiting performance.
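To make that intuition concrete, here is a back-of-the-envelope sketch in Python. It is my own illustration, not Tachyum's numbers: wire resistance scales as R = ρL/(W·H), so halving a wire's width and height while its length stays fixed roughly quadruples its resistance.

```python
def wire_resistance(rho, length, width, height):
    """Resistance of a rectangular wire: R = rho * L / (W * H)."""
    return rho * length / (width * height)

RHO_CU = 1.68e-8  # bulk copper resistivity, ohm-metres (real nanoscale wires are worse)

# A 1 mm on-chip wire at two feature sizes: shrinking the cross-section
# raises resistance even as the transistors themselves get faster.
for w in (100e-9, 50e-9):
    r = wire_resistance(RHO_CU, 1e-3, w, w)
    print(f"{w * 1e9:.0f} nm wide wire: {r / 1e3:.2f} kOhm")
```

Running this prints roughly 1.68 kOhm for the 100 nm wire and 6.72 kOhm for the 50 nm one: same length, four times the resistance.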

Tachyum’s solution: dramatically decrease data movement by performing operations in storage, not CPU registers. Tachyum’s software layer enables compatibility for hyperscale data apps.

Because data movement is reduced, so are power and heat. Tachyum expects to put 100 servers in a 1U rackmount box, using a fraction of the power that x86 servers need.

Another major part of Tachyum's savings comes from using advanced erasure coding to eliminate the standard 3x data copies that hyperscale storage systems typically require. These erasure codes are widely used today in large-scale active archives, but their computational and network requirements make them uneconomic in cloud datacenters.

Tachyum's cloud chip overcomes these problems by including many 100Gb Ethernet links and hardware that accelerates the erasure coding process. Instead of 3 copies of each file, they claim a 1 percent increase in file size with better than RAID 6 data resilience, cutting required storage capacity by two-thirds and making all-flash affordable.
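The capacity arithmetic is easy to check. Tachyum hasn't disclosed its exact code parameters, so the numbers below are hypothetical, but they show how a wide erasure code beats triple replication:

```python
def overhead(data_blocks, parity_blocks):
    """Extra capacity beyond the raw data, as a fraction of the data size."""
    return parity_blocks / data_blocks

# Triple replication: every file is stored three times, i.e. 200% overhead.
print(f"3x replication: {2.0:.0%} overhead")

# A hypothetical wide code with 100 data blocks and 2 parity blocks
# survives any two block failures (RAID 6-class resilience) for ~2% overhead;
# wider codes get closer to the 1 percent figure Tachyum claims.
print(f"100+2 erasure code: {overhead(100, 2):.0%} overhead")
```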

With massive reductions in power consumption, storage footprint, and server hardware cost, Tachyum expects its cloud chip-based systems to come in at 1/4 the cost of current cloud systems. At the scale the cloud giants are operating, giving Tachyum a fraction of their hardware spend would save them billions annually.

Bravo to Tachyum for architecting a clean sheet design for hyperscale computing. They say they have an FPGA prototype of their cloud chip today, and they plan to ship their ASIC version next year.

In the meantime they’re showing the cloud vendors what they have. Given the economics, I don’t doubt that they are getting serious attention.

What I find most interesting, though, is their in-storage processing. Scale changes everything, and it may be that our standard von Neumann CPU architectures need overhauling for the age of Big Data.

It may never come to your laptop, but as more and more computing resides in data centers, an approach like Tachyum’s is needed to keep scaling the cloud.

Courteous comments welcome, of course.


Juniper adding microsegmentation to Contrail cloud – TechTarget

Juniper Networks Inc. has added tools for network microsegmentation in Contrail — an important feature for users of the software-defined networking controller, but a capability that’s unlikely to reverse Juniper’s decline in security revenues.

Juniper introduced the capability this week, along with other security features the company labeled as Juniper Contrail Security. In general, Juniper is focusing its latest stab at strengthening its security portfolio on companies with multiple data center environments in a Contrail cloud.

Microsegmentation tools, which have become a popular way to contain malware in the data center, allow corporate IT staff to build a zero-trust security zone around a set of resources, such as network segments and workloads. In network virtualization within SDN, microsegmentation adds firewall capabilities to east-west traffic.

VMware and Cisco have had microsegmentation capabilities in their SDN products, NSX and Application Centric Infrastructure (ACI), respectively, for several years. NSX has outpaced ACI deployments in the data center, primarily because microsegmentation has become its leading use case for protecting applications that run on top of VMware’s ubiquitous server virtualization products.

Companies use Juniper Contrail and vRouter — the vendor’s virtualized router software — to create a network overlay that extends across cloud-based environments in multiple data centers. The core users of Contrail and Juniper switches include cloud companies that provide infrastructure, platform or software as a service. Others include large financial institutions.

With the latest release, companies can use the Contrail cloud console to carve up their data center LAN and intradata-center WAN, and then create and distribute policies that establish restrictions on communications between network microsegments. Also, Juniper is providing tools that give companies the option of using third-party firewalls for policy enforcement.
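Conceptually, a microsegmentation policy is a default-deny rule set evaluated on east-west traffic. The sketch below is a hypothetical Python illustration of that idea, not Contrail's actual policy model or API:

```python
from typing import NamedTuple

class Workload(NamedTuple):
    name: str
    segment: str  # microsegment label, e.g. "web", "app", "db"

# Zero trust: no flow is permitted unless a policy explicitly allows it.
ALLOWED_FLOWS = {
    ("web", "app"),  # web tier may call the application tier
    ("app", "db"),   # application tier may call the database tier
}

def is_allowed(src: Workload, dst: Workload) -> bool:
    """Default-deny check on east-west traffic between microsegments."""
    return (src.segment, dst.segment) in ALLOWED_FLOWS

frontend = Workload("frontend-1", "web")
database = Workload("postgres-1", "db")
print(is_allowed(frontend, database))  # False: web may not reach the db directly
```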

The capability is available for cloud environments using bare-metal servers, Linux containers built and managed through the Kubernetes system, and OpenStack — the modular architecture for creating and managing large groups of virtual private servers. Kubernetes and OpenStack are open source technologies.

Juniper has contributed Contrail’s source code to the open source community through an initiative called OpenContrail. Contrail is a Juniper-supported binary version of OpenContrail, which is available under the Apache 2.0 license.

Juniper has contributed the source code of its latest security features to the OpenContrail community, said Pratik Roychowdhury, the product manager for Contrail. OpenContrail's code is hosted on GitHub.

“Everything that I’m talking about in Contrail Security is out there [on GitHub],” Roychowdhury said. “Anyone can essentially go and take a look at the source code.”

Besides microsegmentation, Juniper has added other features to the Contrail console. They include a visual depiction of interactions between applications in hybrid cloud environments and analytics that detect anomalies and suggest corrective actions.

The latest features are useful to companies using Juniper switches or its SRX firewalls running alongside other vendors' switches, said Lee Doyle, an analyst at Doyle Research and a TechTarget contributor. Either scenario would help Contrail adoption.

“Contrail is one of many SDN controllers that has struggled to break through [a competitive market],” Doyle said. “It’s not contributing a huge amount of revenue.”

What is contributing a growing share of Juniper's revenue is switching. In the quarter ended June 30, switching revenue grew nearly 32% year over year to $276 million. However, the company's overall market share is small at 3.4%, according to stock research firm Trefis.

Security, on the other hand, remains a weak spot in Juniper's portfolio. Security revenue has fallen from $670 million in 2012 to $318 million last year, according to Trefis. In the June quarter, it fell 12% to $68.7 million.

“Quite frankly, the focus right now on security has been on achieving stability and returning to growth,” Juniper CEO Rami Rahim said in an online transcript of the July earnings call with financial analysts. The transcript is available on the financial site Seeking Alpha.


Demand for server specialists increases, but talent pool is small – Network World

Almost two-thirds of organizations surveyed say recruiting for jobs in data center and server management is becoming increasingly difficult because of the skills needed, both in traditional servers and converged infrastructure.

The findings come from a worldwide survey by 451 Research for its Voice of the Enterprise: Servers and Converged Infrastructure, Organizational Dynamics study (registration required). It found that IT shops have concerns about the long-term costs of using public cloud, which is causing many of them to pull back on their cloud migrations and even expand their on-premises infrastructure.

Because of that, many organizations are looking to hire more server-based IT staff rather than reduce it, as would be expected in a move to the public cloud. But the fact remains that despite moving many workloads to the cloud, most organizations still need a data center and still have on-premises requirements.

Two-thirds of companies, 67.7%, said the key driver for increasing server-related employees in the next 12 months is overall business growth, a good problem to have, followed by IT organizational changes at 42.4%.

"Most IT managers are closely scrutinizing their deployment options instead of blindly following the pack to IaaS and other off-premises cloud services," said Christian Perry, research manager and lead analyst of the survey, in a statement.

"When determining the optimal mix of on- and off-premises compute resources, there is no doubt this is hampered by the availability of specialist skills and regional availability. Whether organizations will realize their expected server staff expansion remains to be seen due to hiring difficulties," he added.

451 Research expects the worldwide pool of available full-time employees dedicated to server administration will decline due to difficulties in finding the right candidates. Almost 70% of respondents said current candidates lack skills and experience. A lack of candidates by region and high salaries are also cited as causes.

The makeup of IT teams is also evolving and having an impact on available personnel. The survey found a nearly even split between the need for generalists and specialists: 40.4% of managers choose specialists, and 39.4% choose IT generalists. Over the past two years, 451 Research has noted the trend veering toward generalists, particularly as automation, orchestration and software-defined technologies take hold.

"The time and resource savings from these new technologies result in a slightly reduced need for server specialists," Perry said. "The good news is that there remains a need for specialists across both standalone servers and converged and hyper-converged infrastructures. This is especially true within LOBs or remote divisions or departments."

However, there is also a need for specialists as converged and hyper-converged infrastructure (HCI) takes hold. As adoption of software-defined infrastructure technologies such as HCI increases, organizations can gain new staffing efficiencies that fall outside traditional staffing policy and practice.

This is where vendors such as Dell EMC and Hewlett Packard Enterprise have to take the lead in educating customers on the benefits of proper staffing levels through a deeper understanding of optimal infrastructure use and resource distribution. Customers need to know not only what boxes they are getting but also which skills are best suited to manage them.


Why 2017 is the Year to Understand Cloud Computing – Business 2 Community

The Cloud has become a major buzzword in business for very good reason. Small businesses and large enterprises alike can take advantage of cloud computing to build and expand the computer-based infrastructure behind the scenes. Follow this guide to better understand what cloud computing is, how it works, and how you can take advantage of it.

In the old world of web servers and internet infrastructure, websites and other online assets were typically limited to one main server, or a few linked servers behind tools called load balancers, to process and send data, whether for a customer-facing website or an internal-facing application. The advent of content delivery networks (CDNs) powered up those servers to host and serve data from the edge of the network for faster serving and sometimes lower costs.

As computing demand exploded with the rise of the smartphone and high-speed internet, consumer and business needs downstream of those servers have continued to creep upward. Cloud computing has emerged as the best option to handle an array of computing needs for startups and small businesses, thanks to the ability to start at a low cost and scale, almost infinitely, as demand grows. Advances in cloud technology at Amazon, Google, Microsoft, IBM, Oracle, and other major cloud providers are making cloud computing more desirable for all businesses.

When cloud computing first emerged, large enterprises were the only businesses able to afford the cost of elastic, flexible computing power. Now, however, those costs are more likely a drop in the bucket for small businesses.

For example, I use the cloud to store and serve videos for Denver Flash Mob, a side-hustle business I run with my wife. Our monthly bill is typically around a dollar or two, and heavy months lead to a bill around five bucks. No big deal! My lending startup Money Mola is also cloud-based, with the costs of running both a development server and a public-facing server coming to around $30 per month.

The first time I logged into Amazon Web Services (AWS), it seemed like I needed a computer science degree to use it! I had a hard time doing even basic tasks outside of uploading and sharing videos. Thankfully, Amazon has made using AWS much easier, though it is not without its challenges.

I'm a pretty techy guy, so my skill set is a bit more advanced than the average computer user's. I have set up AWS to send outgoing transactional emails, automatically back up websites, and more on my own. If you are willing and able to hire a cloud expert, the possibilities of the cloud are endless. Anything from web hosting to artificial intelligence and big data analysis can run in the cloud.


The most basic way to get started with cloud computing is website and computer backups. If you use WordPress for your website, setting up cloud backups is simple with one of a handful of plugins like UpdraftPlus. If you can use the WordPress dashboard, you can set up cloud backups with UpdraftPlus. It is quick and easy, with out-of-the-box support for services like AWS, Dropbox, Google Drive, Rackspace Cloud, and others. The paid plugin version adds access to Microsoft OneDrive and Azure, Google Cloud Storage, and other options.

I run several backups of both my laptop and my web-based assets. If my home were to be burglarized or burned down, the cloud has me covered. If my laptop is stolen, I have a backup at home and in the cloud. Redundant backups are not optional; they are a must in 2017.
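As a minimal sketch of what a scripted cloud backup can look like, here is a Python example using AWS's boto3 library. The bucket and file names are hypothetical, and it assumes AWS credentials are already configured on the machine:

```python
import boto3  # AWS SDK for Python; assumes credentials are configured locally

def backup_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload a single backup archive to an S3 bucket."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)

# Hypothetical names: point these at your own archive and bucket.
backup_to_s3(
    "site-backup-2017-08-29.tar.gz",
    "my-backup-bucket",
    "backups/site-backup-2017-08-29.tar.gz",
)
```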

In addition to safe, secure backups, the cloud can reach far corners of the planet. Using cloud-based CDNs, you know your customers will get every video and web page they want at near-instant speeds.

Let's say your business has a popular video you want to share around the world. With a cloud CDN, you upload your video once to the web. Then the CDN takes over and creates copies of that video file in data centers around the world. Whenever a customer clicks to view that video, they are served a copy from the data center closest to their location.

Thanks to the power of a CDN, you don't have to send viewers in Australia, London, Bangkok, and Buenos Aires a video from your web server in Texas. Each one gets a local copy, so they get their video even faster, offering a better customer experience. App-based businesses can even run multiple versions of their app in data centers around the world to ensure every user has the same great experience.
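Under the hood, routing a viewer to the closest copy is just a nearest-location lookup. Here is a simplified Python sketch with hypothetical edge locations; real CDNs steer traffic with DNS or anycast rather than explicit coordinates:

```python
import math

# Hypothetical edge locations: (latitude, longitude) in degrees.
EDGE_POPS = {
    "texas":        (32.8, -96.8),
    "london":       (51.5,  -0.1),
    "sydney":       (-33.9, 151.2),
    "buenos_aires": (-34.6, -58.4),
}

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points via haversine."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_pop(viewer):
    """Route the viewer to whichever edge location is closest."""
    return min(EDGE_POPS, key=lambda pop: distance_km(viewer, EDGE_POPS[pop]))

print(nearest_pop((-33.5, 151.0)))  # a viewer near Sydney -> "sydney"
```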

It doesn't matter what your business does; there is some way the cloud can help you achieve better results. The cloud is only going to grow and become more prominent in business. Older computing methods will go the way of the fax machine. If you want serious computing success with scalability and flexibility, the cloud is your best option.


President Trump Could Cost US Cloud Computing Providers More Than $10 billion by 2020 – The Data Center Journal

The U.S. cloud computing industry stands to lose more than $10 billion by 2020 as a result of President Trump's increasingly shaky reputation on data privacy, according to the latest research from secure data center experts Artmotion.

Growth for U.S. cloud computing providers is already thought to be slowing. Although IDC's latest Worldwide Public Cloud Services Spending Guide suggests that the U.S. will generate more than 60% of total worldwide cloud revenues through 2020, the country is expected to experience the slowest growth rate of the eight regions in the analysis.

However, this forecast slowdown does not factor in the effect that President Trump's controversial record on data privacy has had on business confidence in the U.S. as a data hosting location. This coincides with a rapid increase in people expressing unfavorable opinions about the U.S. more generally. In fact, the latest study from the Pew Research Center highlights that just 22% of people have confidence in President Trump to do the right thing when it comes to international affairs.

As a result of this growing uncertainty, Artmotion's new analysis suggests that U.S. cloud providers will experience further slowing of growth in the next three years, creating estimated losses of $10.1 billion for the industry between 2017 and 2020.

Mateo Meier, CEO of Artmotion, commented: "In a market that is still expected to grow significantly in the next few years, it is vital that U.S. service providers continue to attract new customers in order to retain market share. Despite the U.S.'s current dominance of the global cloud computing market, there is no certainty that the status quo will be maintained. Perhaps the key reason for U.S. cloud providers to be fearful is that this isn't the first time we've been here."

Edward Snowden's revelations about PRISM and the NSA's mass surveillance techniques were hugely damaging to U.S. cloud companies. They also encouraged many businesses to completely rethink their data strategies, rather than continuing to trust that U.S. cloud providers would guarantee the levels of data security and privacy they need. The impact that President Trump could have needs to be understood in that context.

Artmotion's full analysis is available as a free download here.



Is Bitcoin the New Gold? – TheStreet.com

Gold has always been considered a safe haven asset.

Now Bitcoin is appearing to exhibit gold-like properties. Will Bitcoin replace gold as the new safe haven asset?

Anti Danilevski, CEO of KICKICO, a Russian blockchain platform for initial coin offerings, explains that during financial difficulties, Bitcoin has already been performing in ways analogous to gold, representing a calm harbor for investors.

"During the last year the S&P 500 index decreased, gold (GLD) increased by 14.4%, whereas Bitcoin increased by 74.9%," Danilevski said. "During the last five years, the S&P 500 increased by 68.8%, gold decreased by 26.5%, whilst bitcoin grew by an impressive 24.9%."

Danilevski believes cryptocurrency is not only a good backup plan; compared to gold, its price also grows considerably more in a growing market.

Consider recent events such as the clash between North Korea and President Trump: we saw gold rise on the back of genuine fear of conflict.

“Despite this latest episode of hair pulling which was regarded as the closest threat of war between North Korea and the U.S. since 1994, this spike in gold, although adequate, wasn’t mouthwatering,” said James Trescothick, senior global strategist at easyMarkets. “Instead it was another asset that appealed to the masses looking for safety.”

Trescothick said in the past there has been the belief that investors should "put 10% of your wealth into gold and hope it goes down." This logic underlines the well-known fact that when gold rises in value, it indicates that other assets are performing terribly and fear is gripping the market.

Now Trescothick says it is time to put 10% of your wealth into Bitcoin and hope it doesn't crash. "In the middle of July Bitcoin was trading around $1,985 per coin before continuing its rise, and around the beginning of August it traded at $2,789.58 per coin," he said. "And then as hostilities increased between North Korea and the U.S., it skyrocketed. First it broke the $3,000 mark before slicing through the $4,000 level with ease."

We saw gold react in a similar way, moving up $31 from $1,258.80 an ounce at the beginning of August as threats from both North Korea and Trump gripped the media. Both reacted in safe-haven style, but it was the size of the move and the speed of the rise in Bitcoin's value that was impressive.



This Swedish guy bet all his life savings on bitcoin and it made him 100 times richer – Business Insider Nordic

Four years ago, Swedish expert programmer Alexander Bottema went from deep skepticism to considering bitcoin an innovation with implications similar to those of the internet itself. In an exclusive interview, he reveals what happened after he went all in on bitcoin, and shares how other investors can walk in his footsteps.

So did it make him a billionaire?

"No, I'm not a billionaire yet [in terms of Swedish crowns]," says Alexander Bottema, laughing on the phone.

"But my capital has grown by a factor of more than one hundred since I sold all my stocks and liquidated my savings in order to buy bitcoins in 2013."

At that time, one bitcoin was worth 30 dollars. Today, almost four years later, the virtual currency is valued at 4,000 dollars.

How much savings he had at the time Bottema doesn't want to reveal, but it's clear he's content with the investment: his holdings have remained untouched since that day in early 2013.

"I consider it retirement insurance. I'm not thinking about buying any more, since I can never get the same return on investment again," he quips.

Alexander Bottema grew up in a small community near Stockholm. He started programming at nine years of age, using the family's Apple II computer.

The booming personal computer market made it an exciting time to grow up.

But programming wasn't enough for Alexander Bottema. He wanted to learn more about the theories behind computing. In 1991, he started studying computer science at the renowned Uppsala University, where he would later continue as PhD faculty.

Uppsala turned into Stockholm when he started working on data security and encryption for consultancy Upec Industriteknik. When the company was acquired, Alexander Bottema and his two colleagues started their own company.

"Frontec wasn't interested in product development, which we were into at the time, so we decided to start Polytrust. We got financial support from two venture capital firms: Telia Business Innovation and IT Provider."

After Stockholm and Polytrust, Alexander Bottema moved to the U.S. to work for MathWorks, where he is still employed. The Massachusetts-based company provides data analysis and simulation software for industrial purposes.

"I rejected it as something uninteresting. Seeing that I had a long track record in data security, I was certain that it wouldn't be possible to build safe servers that are open, and envisioned a crash. The following year, I was sitting on the subway and read in the Metro newspaper how bitcoin had recovered after a crash. I couldn't understand how a currency that is built on trust could recover. That piqued my interest."

He downloaded the technical description and the program code for the currency, and used all his knowledge and experience in studying the material.

Alexander Bottema's jaw dropped. His deep skepticism was gone with the wind.

Once he was convinced of bitcoin's excellence, he started calculating how much the currency could one day be worth. Like many others, Bottema used the gold market as a comparison.

"The allocated value of the gold market, where rich people put their money to avoid devaluations, is roughly 8,000 billion dollars. If you divide it by 21 million, which is the number of bitcoins that will be available from the year 2140 onwards, you get 380,000 dollars per bitcoin. I ended up on values ranging between 50,000 and 100,000 dollars per bitcoin. I panicked, and bet all of my savings."
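Bottema's back-of-the-envelope math is easy to reproduce; note that the gold-market figure is his estimate, not a verified number:

```python
GOLD_MARKET_USD = 8_000e9  # Bottema's rough estimate: $8,000 billion
MAX_BITCOINS = 21e6        # hard supply cap, fully issued around the year 2140

ceiling = GOLD_MARKET_USD / MAX_BITCOINS
print(f"Implied ceiling: ${ceiling:,.0f} per bitcoin")  # ~$380,952
```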

What is driving the value of bitcoin, according to Bottema, is the combination of a growing number of use cases, like micropayments, and a limited supply of the currency.

At the outset, he worried most about a ban on the currency, but he doesn't consider it a threat any longer.

Isn't there a threat from competing cryptocurrencies?

"The first one is the biggest. This is quite similar to the war between VHS and Betamax. Technically speaking, Betamax was better, but VHS was bigger. So even though I could theoretically invent an entirely new internet, it would be very hard to make it grow."

Do you not get worried whenever the currency’s value plummets?

"No, I know how the system works and I know what I've invested in. Sure, the currency dips, but it has always recovered after some time. Bitcoin's future looks bright."

Here's how you invest in bitcoin:

Create an account on an exchange, not an intermediary. This will give you the best price. The most accessible one for Swedes is Bitstamp.net.

All bitcoin exchanges are strictly supervised, which means you need to send copies of your ID so as to prevent money laundering.

Send over money to your account using a (European) SEPA transfer.

Now you are eligible to buy bitcoins.

Once you've bought the currency, it's important that you withdraw your bitcoins, in case the exchange goes bankrupt.

Transfer them to your own bitcoin wallet, for instance a Ledger Wallet or Blockchain.info.

It's important to save your wallet's recovery code on a printed piece of paper or a USB drive in case your computer crashes or disappears.

When you want to sell, just do the opposite.

Read the original article in Swedish on VA Finans.


Why Motherboard Is Capitalizing ‘Bitcoin’ Again – Motherboard

This is a post about capitalization standards, but it’s also about how we decide to normalize technology and when.

If you’ve been following Motherboard’s cryptocurrency coverage over the years, you may have noticed some changes. Back in 2011 and 2012, we would alternately capitalize Bitcoin or write it all in lowercase, depending on the whim of whoever was doing the story. In 2015, we elected to capitalize Bitcoin when talking about the system or protocol, and lowercase denominations of the currency: So, “I love Bitcoin, and I own many bitcoins.” In 2016, we decided to write about cryptocurrencies all in lowercase.

As of today, we're going back to our previous rule. From here on out, we will capitalize cryptocurrencies when referring to the protocol or system (Bitcoin, Ethereum, Monero, etc.) but lowercase the denominations: "I love Ethereum and own a lot of ether."

Our decision here is partly practical and partly philosophical. The main reason we decided to lowercase all cryptocurrencies was to replicate how the normalization of other technologies is reflected in writing. For a long time after the internet became popular, for example, publications would capitalize it as “Internet,” and many still do. Now that the internet is an integral part of most of our personal and work lives, it just seems a bit silly to see it capitalized, doesn’t it? It’s as if the writer is talking about something alien and unfamiliar instead of a system we’re all deeply embedded within. So, in a move that now seems wildly optimistic, we decided to get ahead of the curve with cryptocurrencies and de-capitalize them.

Recent events have played out in ways we couldn’t have predicted. Bitcoin, for example, recently split off into two separate versions with nearly identical code and, most importantly, an identical transaction history up until the time of the split. With this newly introduced confusion, our previous capitalization policy thrust us into the realm of value judgements: Should the new version of Bitcoin, called Bitcoin Cash, be capitalized when Bitcoin proper is not? What kind of message does that send to our readers? Does one version “deserve” to be capitalized while the other does not? On the other hand, which implementation “deserves” to be treated with the kind of familiarity that the internet does, and why?

Leaving these questions aside, on a purely practical level, it just makes more sense to readers, in our estimation, to parse a sentence like, “Bitcoin Cash is an offshoot of Bitcoin,” instead of, “bitcoin cash is an offshoot of bitcoin.” So, that’s one point on the side of capitalization.

The previously raised philosophical concerns also put us on the side of capitalization. With yet another Bitcoin split on the way in November, it’s clear that the Bitcoin protocol (and community) is not as monolithic as we had presumed, and at the moment it’s not as resilient as other technologies we de-capitalize, like the internet. The battle between Bitcoin and Bitcoin Cash is largely a battle between brands, and another entrant with a new name (no word on whether it will also try to claim the ‘Bitcoin’ moniker) is about to enter the arena.

We need to keep our articles readable. We also need to avoid the illusion of preference. Above all, we need to be careful about which technologies we normalize. For all these reasons, we're going to capitalize all cryptocurrencies-as-systems and de-capitalize their denominations: Litecoin, Dogecoin, Bitcoin, Bitcoin Cash, Ethereum, etc.

In the bonkers world of cryptocurrencies, there have to be some rules.


Value of all digital currencies hits record around $160 billion as … – MarketWatch

Bitcoin and Ether cryptocurrencies rose Monday, helping digital currencies broadly to an all-time valuation record.

The total value of the digital-currency universe tracked by data research site Coinmarketcap.com, including those linked to the Bitcoin, Ethereum, and Ripple blockchain networks, reached around $160 billion on the session, surpassing a previous record of $156.4 billion on Aug. 25.

Most recently, one bitcoin token was buying $4,397.32, off about 2.7% from its 52-week high of $4,522.13, according to MarketWatch data and research site Coindesk.com.

A single Ether token bought $342.71, about $40 off its mid-June all-time high, while Bitcoin Cash, a spinoff from bitcoin, was down 3.6% at $599.16.

So-called cryptocurrencies tend to see both big intraday and interday price swings.

Growing valuations for cybercurrencies come as the decentralized platforms garner increased attention from average folks and businesses. The Chicago Board Options Exchange, which operates the largest options exchange, has teamed up with twins Tyler and Cameron Winklevoss to create bitcoin derivatives.

Read: Opinion: Stay away from bitcoin and ethereum, they are complete garbage

Moreover, beyond the concept of a digitally based currency, the blockchain has drawn attention as a utility from Wall Street investors. The blockchain is the digital ledger that tracks each bitcoin transaction and underpins the currency, and is considered by companies as a promising way to quickly document things like trading and other transactions.

Chris Burniske, a blockchain analyst at ARK Invest, said in a tweet on Monday that increased activity on blockchain networks has been part of the reason for recent gains in Ether, bitcoin, and other digital units.
