
Bitcoin Cash Statistics Confirm BCH is Just Another Altcoin – Live Bitcoin News

With almost 6,000 blocks mined on the Bitcoin Cash network, now is a good time to check some statistics. It is still less profitable to mine than Bitcoin, which is not surprising. Bitcoin Cash has around 8% of the value of Bitcoin itself right now. The blockchain operates at 13% of the original chain's difficulty as well. Moreover, BCH is no longer the longest blockchain either. All of these developments are pretty interesting and somewhat surprising.

One could argue Bitcoin Cash has not achieved all that much. That would be a rather shortsighted statement, though. After all, no one expected this project to stick around for as long as it has. It still has a fair bit of support from mining pools and miners alike, which is good to see. Moreover, the BCH value has somewhat stabilized around $550 as well. Unfortunately, it is inferior to Bitcoin in all other aspects one can think of right now.

First of all, the Bitcoin Cash mining difficulty is still extremely low. The mining difficulty adjustment has always been subject to some wild speculation. Some people feel miners can effectively trick the EDA into allowing them to mine more coins accordingly. Whether or not this is effectively the case will always remain a bit of a mystery. It is certainly true the mining difficulty adjustment algorithm shakes things up a bit.

Despite this lower difficulty, Bitcoin Cash is still pretty unprofitable to mine. It has been more profitable than BTC mining on two occasions so far. For the majority of the time, however, it won't net you any major income. Unless you believe BCH will double or triple in value, that is. Should that be the case, the people mining BCH right now will have a big payday to look forward to. It is unclear what the future holds for this altcoin in this regard.

Additionally, it appears the BCH blockchain is no longer the longest chain. Although it is only a minor title to obtain, it would give Bitcoin Cash some advantage over Bitcoin. With this factor out of reach as well, there is no reason to consider BCH superior in any regard. It has all of the traits of an altcoin and not much is changing in this regard. That doesn't mean it's not worth paying attention to. However, it will not rival Bitcoin in any significant manner any time soon.

Header image courtesy of Shutterstock

About JP Buntinx



We’re About to Cross The ‘Quantum Supremacy’ Limit in Computing – ScienceAlert

The 4th International Conference on Quantum Technologies held in Moscow last month was supposed to put the spotlight on Google, who were preparing to give a lecture on a 49-qubit quantum computer they have in the works.

A morning talk presented by Harvard University's Mikhail Lukin, however, upstaged that evening's event with a small announcement of his own: his team of American and Russian researchers had successfully tested a 51-qubit device, setting a landmark in the race for quantum supremacy.

Quantum computers are considered to be part of the next generation in revolutionary technology; devices that make use of the odd 'in-between' states of quantum particles to accelerate the processing power of digital machines.

The truth is both fascinating and disappointing. It's unlikely we'll be playing Grand Theft Auto VR8K-3000 on a quantum-souped Playstation 7 any time soon. Sorry, folks.

Quantum computing isn't all about swapping one kind of chip for a faster one.

What it does do is give us a third kind of bit where typical computers have only two. In quantum computing, we apply quantum superposition (that odd cloud of 'maybes' a particle occupies before observation cements its existence as one of two different states) to solving highly complex computational problems.

While those kinds of problems are a long, tedious process that taxes even our best supercomputers, a quantum computer's "qubit" mix of 1s, 0s, and that extra space in between can make exercises such as simulating quantum systems in molecules or factorising large numbers into primes vastly easier to crunch.
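The exponential growth behind that power can be sketched in a few lines of plain Python: an n-qubit register takes 2^n amplitudes to describe, and a uniform superposition spreads probability equally across all of them. This is a toy statevector model for intuition only, not a real quantum simulator:

```python
# Toy statevector sketch: n classical bits hold ONE of 2**n values,
# while an n-qubit register's state is described by 2**n amplitudes.

def hadamard_all(n):
    """Statevector after applying a Hadamard gate to each of n qubits
    starting from |00...0>: a uniform superposition of all 2**n states."""
    dim = 2 ** n
    amp = 1 / (dim ** 0.5)  # equal amplitude for every basis state
    return [amp] * dim

state = hadamard_all(3)
print(len(state))                            # 8 amplitudes for 3 qubits
print(round(sum(a * a for a in state), 6))   # probabilities sum to 1.0
```

At 50-odd qubits the statevector has roughly 10^15 amplitudes, which is why classical machines struggle to simulate such devices.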

That's not to say quantum computing could never be a useful addition for your home desktop. But to even begin dreaming of the possibilities, there are a whole number of problems to solve first.

One of them is to ramp up a measly handful of qubits from less than 20 to something that can begin to rival our best classical supercomputers on those trickier tasks.

That number? About 50-odd, a figure that's often referred to in rather rapturous terms as quantum supremacy.

The Harvard device was based on an array of super-cooled atoms of rubidium held in a trap of magnets and laser 'tweezers' that were then excited in a fashion that allowed their quantum states to be used as a single system.

The researchers were able to control 51 of these trapped atoms in such a way that they could model some pretty complex quantum mechanics, something well out of reach of your everyday desktop computer.

While the modelling was mostly used to test the limits of this kind of set-up, the researchers gained useful insights into the quantum dynamics associated with what's called many-body phenomena.

Fortunately they were still able to test their relatively simpler discoveries using classical computers, finding their technique was right on the money.

The research is currently on the preprint server arXiv.org, awaiting peer review. But the announcement certainly has the quantum computing community talking about the possibilities and consequences of achieving such limits.

The magical number of 50 qubits is more like a relative horizon than a true landmark. Not much has changed in the world of quantum computing with the Harvard announcement, and we still have a long way to go before this kind of technology will be useful in making any significant discoveries.

Google's own plan for a 49-qubit device uses a completely different process to Lukin's, relying on multiple-qubit quantum chips that employ a solid-state superconducting structure called a Josephson junction.

They've proven their technology with a simpler 9-qubit version, and plan to gradually step up to their goal.

Without going into detail, each of the technologies has its pros and cons when it comes to scaling and reliability.

A significant problem with quantum computing will be how to make the system as reliable and error-free as possible. While classical computing can duplicate processes to reduce the risk of mistakes, the probabilistic nature of qubits makes this impossible for quantum calculations.

There's also the question of how to connect a number of units together to form ever larger processors.

Which methods will address these concerns best in the long run is anybody's guess.

"There are several platforms that are very promising, and they are all entering the regime where it is getting interesting, you know, system sizes you cannot simulate with classical computers," Lukin said to Himanshu Goenka from International Business Times.

"But I think it is way premature to pick a winner among them. Moreover, if we are thinking about truly large scales, hundreds of thousands of qubits, systems which will be needed for some algorithms, to be honest, I don't think anyone knows how to go there."

It's a small step on the road to a hundred thousand qubits, but it doesn't make passing this milestone any less significant.

Happy 51, Harvard!


Explaining the Most Recent Record for Quantum Computing: A 51-Qubit Quantum Computer Array – All About Circuits

Last month, a team of Russian and American scientists unveiled a quantum computer array with 51 qubits at the International Conference on Quantum Technologies in Moscow. Here's a look at how they accomplished this new milestone with the use of cold atoms and lasers.

If you're already familiar with quantum computing, I recommend skipping to the next section. If you're not, quantum computing is aptly named for its quantum properties: in quantum physics, particles do not have a defined location until they are observed. In classical computing, by contrast, digital data is read in bits, the 1s and 0s (ON and OFF states) we know as binary, which can be manipulated into different arrangements using various logic gates.

Quantum computing combines concepts from classical computing and quantum mechanics to make qubits (a shortened nickname for "quantum bits"). Unlike classical bits, qubits can be a 1 and a 0 at the same time, much like Schrodinger's cat, which is in an undetermined state until observed. So, four bits hold one of 16 possible combinations (2^4), whereas four qubits can be in every possible combination at the same time until they are observed. This allows a quantum computer to explore many possible calculations at once. A quantum search algorithm such as Grover's reduces the time required for a large search to roughly the square root of the number of entries being searched.
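That square-root speedup is easy to put in concrete numbers: Grover's algorithm needs roughly (pi/4) * sqrt(N) iterations to find one marked entry among N, where a classical scan needs on the order of N probes. A back-of-envelope sketch:

```python
import math

def grover_iterations(n_entries):
    """Approximate Grover iterations to find one marked entry among
    n_entries: about (pi/4) * sqrt(N), versus ~N/2 classical probes."""
    return math.ceil((math.pi / 4) * math.sqrt(n_entries))

for n in (16, 1_000_000):
    # entries, expected classical probes, Grover iterations
    print(n, n // 2, grover_iterations(n))
```

For a million entries that is roughly 786 quantum iterations against around 500,000 classical probes, which is why database search is a favourite example.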

Quantum computers are not practical for most tasks handled by personal computers, but they excel at large-scale calculations such as searching databases, running simulations, and even breaking encryptions. The video below is the simplest explanation of quantum computing I have seen so far.

It seems like every few months, quantum computing reaches a new milestone. Last month, at the International Conference on Quantum Technologies in Moscow, attendees and reporters gathered en masse for Professor John Martinis' presentation of a chip embedded with 49 qubits. Instead, in a fashion that reminds me of Steve Harvey announcing the Miss Universe pageant, Mikhail Lukin, a Harvard professor and co-founder of the Russian Quantum Center, made his own announcement and stole the show.

Lukin's team had successfully created the world's most powerful, functional quantum computer to date, which runs on 51 qubits. The device was tested successfully at Harvard, where it solved physics problems that silicon chip-based supercomputers were struggling with.

Most quantum computers have been designed using superconductors and even semiconductors. Martinis' 49-qubit chip was constructed in this fashion. Since traditional semiconductor materials are reaching their limits, Lukin's team took a different approach.

The 51-qubit machine uses "cold atoms" locked in laser cells as its qubits. Cold atom physics is the discipline of studying atoms at incredibly low temperatures (0.0000001 kelvin) in order to recreate quantum conditions. Cooling atoms to temperatures near absolute zero slows their movement down and makes them easier to observe. The video below gives an introduction to cold atom physics (starting at 1:35). After that, we'll get into the biggest question I had about all of this:

How the heck do super-cooled atoms with lasers shining through them make a computer?

Lukin's team wrote a research paper (PDF) explaining the experiment they set up. After sifting through the equations, I arrived at the data-reading mechanism. The setup consists of a linear array of 101 evenly spaced "optical tweezers", which are generated by feeding a multi-tone RF signal into an acousto-optic deflector.

In simpler terms, they shine a laser beam through a vacuum tube and take fluorescence images (a type of laser scanning microscopy) of the atoms as they change between positions. The "traps" that control the position of the atoms are programmable, which allows this super-cooled vacuum tube with a laser shooting through it to function like a quantum computer.

As computing devices become ever smaller, engineers have been teaming up with scientists from other disciplines like physics and biology to make some outside-the-box computing devices. Although it's unlikely that any of these will end up in personal devices anytime soon (or ever), it always reminds me that a computer is just a device that calculates problems, and what our concept of a "computer" will look like in 100 years might just be beyond our current levels of comprehension.

If you'd like to learn more about quantum computing, I've compiled some resources below along with some of my favorite outlandish non-silicon computers!

Featured image used courtesy of Kurzgesagt


Searching for the future of currency, major companies try Bitcoin technology – PBS NewsHour

HARI SREENIVASAN: I'm walking on Wall Street with author Don Tapscott. He's written a dozen books on technology and sees one that could change everything around us. He's not the only believer. While the Dow Jones Industrial Average is up about 20 percent in the past year, Bitcoin, a digital currency, is up more than 700 percent, with a total value of nearly $80 billion. That's more than American Express. The surge has people wondering whether Bitcoin is in a bubble.

For Tapscott, that question is missing the real story.

DON TAPSCOTT, AUTHOR, BLOCKCHAIN REVOLUTION: The real pony here is the underlying technology called the blockchain.

HARI SREENIVASAN: Tapscott and his son co-wrote a book called Blockchain Revolution, named after the technology that supports bitcoin and other so-called cryptocurrencies. They're called that because of the cryptography, or computer code, that makes them secure.

Tapscott says the technology is the key to creating trust in peer-to-peer transactions, like sending or receiving money without a bank or a credit card company in between.

DON TAPSCOTT: Trust is achieved not by a big intermediary; it's achieved by cryptography, by collaboration and by some clever code.

HARI SREENIVASAN: Here's how the blockchain works: when you send or receive an asset, the transaction is recorded in a global, public ledger. A network of millions of computers store copies of that ledger and work to validate new transactions in blocks. When each block is verified, it's sealed and connected to the preceding block, which in turn is connected to every block that has ever been validated, creating a secure blockchain.

DON TAPSCOTT: There is now an immutable record of that transaction. And if I wanted to go and hack that transaction, say to use that money to pay somebody else, I'd have to hack that block, plus the previous block, and the entire history of commerce on that blockchain, not just on one computer, but across millions of computers simultaneously, all using the highest level of cryptography, while the most powerful computing resource in the world is watching me. The way I like to think of it is that a blockchain is a highly processed thing, sort of like a chicken nugget, and if you wanted to hack it, it'd be like turning a chicken nugget back into a chicken. Now someday someone will be able to do that. But for now, it's going to be tough.
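The chained-hash idea Tapscott describes can be sketched as a toy ledger in Python: each block's hash covers the previous block's hash, so editing any historical transaction invalidates every later block. This is a conceptual sketch only; real Bitcoin blocks also carry Merkle roots, timestamps and proof-of-work:

```python
import hashlib

def block_hash(prev_hash, data):
    """Seal a block by hashing its data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(transactions):
    chain, prev = [], "0" * 64           # genesis placeholder hash
    for tx in transactions:
        h = block_hash(prev, tx)
        chain.append({"tx": tx, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["tx"]):
            return False                  # tampering detected
        prev = block["hash"]
    return True

chain = build_chain(["alice->bob:5", "bob->carol:2"])
print(verify(chain))                      # True
chain[0]["tx"] = "alice->mallory:5"       # rewrite history...
print(verify(chain))                      # False: every later hash mismatches
```

The distributed part, millions of nodes each holding a copy and comparing hashes, is what the toy version leaves out, and it is what makes the rewrite Tapscott describes so impractical.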

HARI SREENIVASAN: Tapscott predicts these global ledgers, or blockchains, could affect several parts of the economy during the next decade, in particular, the financial industry.

HARI SREENIVASAN: In a blockchain future, what happens to the New York Stock Exchange?

DON TAPSCOTT: Well, a likely scenario is it becomes a fabulous museum, and it is a beautiful building when you think about it. But buying and selling a stock can be done peer-to-peer now using new blockchain platforms.

HARI SREENIVASAN: He says routine transactions, like using a credit card or making online payments with PayPal or Venmo, could be replaced with instant, peer-to-peer blockchain transactions, speeding up how long it takes and shrinking the costs.

DON TAPSCOTT: Think about something like you tap your card in a Starbucks and a bunch of messages go through different companies. Some of them using, you know, 30-year-old technology, and three days later, a settlement occurs. Well, if all of that were on a blockchain there would be no three-day delay. The payment and the settlement is the same activity. So it would happen instantly and in a secure way. So that's either going to disintermediate those players, or if those players are smart, they'll embrace this technology to speed up the whole metabolism of the financial industry.

HARI SREENIVASAN: Beyond upending financial transactions, Tapscott imagines a future where a blockchain could be used to transfer any kind of asset, from a user's personal data to intellectual property.

Some of that has already begun. This is Consensys, a technology start-up in Brooklyn, New York. Joseph Lubin founded Consensys and helped develop the Ethereum blockchain, the second biggest blockchain in the world after Bitcoin. Ethereum launched in 2015.

JOSEPH LUBIN, CONSENSYS: Ethereum is by far the most powerful blockchain platform out there. It has the most expressive programming language.

HARI SREENIVASAN: Meaning Ethereum can do something pretty radical: it allows for what are known as smart contracts to be built into the code. So it can also transfer a set of instructions or conditions.

DON TAPSCOTT: It's kind of like what it sounds like: it's a contract that self-executes, and it has a payment system built into it. Sort of like a contract that has built-in lawyers and governments and a bank account.
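The self-executing idea can be modelled as a toy escrow in plain Python. To be clear, real smart contracts run on-chain (Ethereum contracts are typically written in Solidity and executed by the EVM); this class and all its names are purely illustrative:

```python
# Conceptual sketch of a self-executing escrow "contract".
# Hypothetical names throughout; no real blockchain is involved.

class EscrowContract:
    def __init__(self, buyer, seller, price):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.deposited = 0
        self.balances = {buyer: price, seller: 0}

    def deposit(self):
        """Buyer locks the payment inside the contract."""
        self.balances[self.buyer] -= self.price
        self.deposited = self.price

    def confirm_delivery(self):
        """Once the agreed condition is met, payment releases automatically;
        no lawyer or bank mediates the transfer."""
        if self.deposited:
            self.balances[self.seller] += self.deposited
            self.deposited = 0

deal = EscrowContract("alice", "bob", 100)
deal.deposit()
deal.confirm_delivery()
print(deal.balances)   # {'alice': 0, 'bob': 100}
```

The on-chain version adds the crucial property that nobody, including the parties, can alter the rules once the contract is deployed.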

HARI SREENIVASAN: At Consensys, one project applies this idea to music.

JESSE GRUSHACK: Click "buy album"

HARI SREENIVASAN: Jesse Grushack is the founder of Ujo, a music platform for artists to distribute their music through the blockchain. Artists decide what price to sell their music at and pocket more from their intellectual property.

JESSE GRUSHACK, UJO MUSIC: We're looking at how to make the music industry more efficient, but at the end of the day, our top-level goal is getting artists paid more for their work and all their creative content.

HARI SREENIVASAN: But ujo is not yet easy to use. There is only one album on the platform, and it requires users to buy music with ether, the cryptocurrency used on the Ethereum blockchain.

JESSE GRUSHACK: The blockchain is still in its infancy right now. It's still kind of in the Netscape phase, really, of the internet. You don't have that AOL, you don't have that landing page that opens the world up to you. It's still a little nerdy, it's still a little technical, but we're working really hard to kind of make it usable, make the user experience seamless, because really we want this technology to be in the hands of everyone.

HARI SREENIVASAN: When he said "a little nerdy," he wasn't kidding. In order to get an idea, I went out and bought some cryptocurrencies online, and the process was not easy. Certainly not as easy as going to the bank to get cash or calling a stockbroker to buy a stock. But then, using my first email account in the early '90s, that wasn't easy either.

DON TAPSCOTT: I think we're in 1994. And in '94, we had the internet and most people were using it for a single application: email. And that's kind of like Bitcoin is today. The application is called a currency. But we're starting to see the rise of the web as we did in '94: a general-purpose platform for building applications that changed many, many industries.

HARI SREENIVASAN: You've literally written the book on the blockchain. How do you know that this is actually working, that people are believing in this, investing in this, understanding the potential in this?

DON TAPSCOTT: In every single industry now, companies are starting to implement pilots to explore how this technology can change their operations.

HARI SREENIVASAN: Tapscott points to retailer Walmart, which has done a pilot using a blockchain to track food safety, and manufacturer Foxconn, which is experimenting with using a blockchain to track its supply chain.

Still, this blockchain believer acknowledges it has a lot left to prove.

HARI SREENIVASAN: There are several critics out there that kind of look at this and say, "This is like tulip mania. This cryptocurrency stuff, this is a bubble, bigger than I've ever seen before. There's a bunch of people that don't know a thing about what's going on that just want to see something go up."

DON TAPSCOTT: Well, for sure there's a hype cycle that we're into now. But the biggest impact will be that blockchain itself is going to change the fundamental operations of banks, of retail companies, of supply chains, of manufacturing companies, of governments, and of every institution in society.


Bitcoin $5,000: Currency Hits New Record High | Fortune.com – Fortune

The price of the world's best known digital currency briefly crossed the $5,000 mark on a major index for the first time on Friday evening ET, before retreating about 5% in subsequent hours.

The idea of Bitcoin breaking the symbolic milestone of $5,000 would have been unthinkable to most people at the start of 2017, when the price topped $1,000 for the first time. If you're keeping track, the digital currency is up roughly 400% this year, and nearly 2,200% since mid-2015, when it was in the doldrums at around $220.

There appears to be no single reason for the recent run-up. Instead, it can likely be explained by the same factors driving this year's cryptocurrency bull run: publicity-driven speculation; new financial products creating unprecedented liquidity; trading surges in Asian markets; and institutional investors treating digital currency as a permanent new asset class.

Meanwhile, the $5,000 milestone is likely to trigger a new round of chatter that Bitcoin and other cryptocurrencies are in a bubble and vulnerable to a major price collapse. Bitcoin has experienced a series of spectacular crashes in the past (most recently in 2014 when it dropped around 75%) but has always recovered.


Finally, it should be noted that Bitcoin crossed $5,000 on an index used by the trade publication Coindesk, but not by other major indexes. This is significant because Coindesk's BPI index includes prices from several Asian exchanges, where prices are typically higher than US or European ones. Here's a screenshot showing the milestone (the time shown is GMT):
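The effect of including higher-priced Asian exchanges in an averaged index is easy to see with hypothetical numbers (the exchange names and quotes below are invented for illustration, not actual market data):

```python
def index_price(quotes):
    """Simple equal-weight price index: the mean of each exchange's last price.
    Real indexes like the BPI use their own weighting and filtering rules."""
    return sum(quotes.values()) / len(quotes)

us_eu_only = {"exchA": 4950.0, "exchB": 4970.0}   # hypothetical Western quotes
with_asia = {**us_eu_only, "exchC": 5120.0}       # one higher Asian quote

print(index_price(us_eu_only))   # 4960.0 -> stays below $5,000
print(index_price(with_asia))    # ~5013.33 -> the average crosses $5,000
```

The same underlying trades can thus put one index above a milestone while another stays below it, purely because of which exchanges feed the average.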

A more conservative price estimate can be found on an index created by the Winklevoss twins, who are among the world's biggest holders of Bitcoin. Known as the Winkdex, the index only draws data from U.S.-dollar denominated exchanges. As you can see, the index's calculation shows how Bitcoin prices approached the $5,000 mark, but did not break it:

Bitcoin is not the only cryptocurrency to achieve new highs this week. Ethereum nearly crossed the $400 mark for the first time while another smaller rival, Litecoin, briefly broke through the $90 mark.

As of Saturday late morning ET, Bitcoin was trading between $4,500 and $4,600.

This is part of Fortune's new initiative, The Ledger, a trusted news source at the intersection of tech and finance. For more on The Ledger, click here.


Bitcoin Drops Below $5000 as Crypto Markets See $13 Billion Sell-Off – CoinDesk

It's the biggest sell-off since mid-July.

At press time, the total value of all publicly traded cryptocurrencies was $166 billion, a figure that was down more than 7 percent from a high of nearly $180 billion last night.

That's when bitcoin, surging on technical improvements and growing investor optimism, topped $5,000 on the CoinDesk Bitcoin Price Index for the first time.

A similar decline was observed in bitcoin itself, with average global prices falling from a high of $5,013.91 to a low of $4,619.97, a drop of nearly $400.

Overall, it was the largest sell-off in the cryptocurrency markets since July 15, when the total value of the asset class plunged roughly 12 percent from $72 billion to $63 billion. However, that decline was part of a multi-day sell-off that saw prices drop more than 25 percent on what was then concern over bitcoin's technical roadmap.

At press time, market observers seemed split on how to read the market movement.

In remarks to CoinDesk, some stated it might be too early to say the market has peaked given the recent upswell in institutional interest and the finite nature of new cryptocurrency creation.

On the latter point, some went so far as to speculate the decline could be a "bear trap," one that quickly opens the door for larger gains.

"Since bitcoin is getting a lot of media attention lately a lot of people are looking for a moment to enter the market," Bram Ceelen, founder of cryptocurrency brokerage AnyCoin, told CoinDesk.

Others pointed to the declines in July and May as evidence that the market has retracted before, even during its 2017 rally, and that further declines were possible.

Money in mousetrap image via Shutterstock

The leader in blockchain news, CoinDesk is an independent media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. Have breaking news or a story tip to send to our journalists? Contact us at [emailprotected].


Bitcoin Mempool is Empty and High Transaction Fees are no Longer Required – newsBTC

It appears as if the Bitcoin mempool is virtually empty once again. This is a positive development for anyone looking to move BTC on the network. Up until this point, there have been several issues with transaction delays. However, it appears those problems are finally coming to an end as we speak. Users should adjust their transaction fees accordingly to avoid overpaying. There's no reason to pay too much, after all.

It is good to see the Bitcoin mempool clear itself out. More specifically, now is the best time to move Bitcoin on the network. An empty mempool means fees can be kept to an absolute minimum. That is good news for the network as a whole. Bitcoin miners may not like this development too much, though. Their earnings will decrease slightly when network fees drop off. Then again, we have seen multiple incidents involving an overly full mempool these past few months.

The positive side to all of this is that there's no need for high transaction fees. Using the default setting in one's wallet should be just fine. However, it appears some wallets are still charging too much as we speak. It is worth the effort to manually adjust fees whenever possible. Wallet fee-estimation tools are still pretty awful to this very day. It is unclear if this situation will change anytime soon, though. Using the default Bitcoin Core client may be one's best option in this regard for the time being.
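Since a Bitcoin fee is simply a fee rate multiplied by transaction size, the savings from matching the rate to an empty mempool are easy to sketch. The rates and the 250-byte size below are illustrative assumptions, not recommendations:

```python
def tx_fee_satoshis(fee_rate_sat_per_byte, tx_size_bytes):
    """A transaction's fee is just rate x size. Wallets that hard-code a
    congestion-era rate overpay badly once the mempool clears."""
    return fee_rate_sat_per_byte * tx_size_bytes

TX_SIZE = 250  # roughly a one-input, two-output transaction, in bytes

print(tx_fee_satoshis(200, TX_SIZE))  # 50000 sat at a congestion-level rate
print(tx_fee_satoshis(5, TX_SIZE))    # 1250 sat at an empty-mempool rate
```

A forty-fold difference for the same transaction, which is why manually lowering the rate when the backlog is gone is worth the effort.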

This also shows how there is no spam attack against the Bitcoin network right now. Many people had feared that would happen eventually, but it turns out things are just fine for now. This situation may change at any given moment, though. Spam transactions can be launched in mere minutes and flood the network pretty quickly as a result. For now, the mempool is still empty and continues to be for some time to come, hopefully. It has been around 30 days since we last had an empty Bitcoin transaction backlog.

With Bitcoin Cash effectively splitting off, it seems those high fees are a thing of the past now. Everyone who wants bigger block sizes can effectively switch over to BCH. Everyone else hopes for either SegWit or SegWit2x to make a big difference in the coming months. It will be interesting to see how this situation will play out in the long run. It is now up to wallet developers to properly lower their network fees. Whether or not that will happen remains a big mystery for the time being.

Header image courtesy of Shutterstock


VMware-on-AWS is live, and Virtzilla is now a proper SaaS player – The Register

VMworld 2017 VMware CEO Pat Gelsinger last week introduced the company's second quarter results by saying the company has embarked on a "multi-year journey from a compute virtualization company to offer a broad portfolio of products driving efficiency and digital transformation".

And today at VMworld the company began to explain what that mouthful of jargon meant: a strategy to put the company at the center of multi-cloud management.

The clearest expression of Gelsinger's words is its half-dozen new software-as-a-service offerings, namely:

All six are subscription services, accessible through existing VMware accounts. And all six are new stuff for your VMware account manager, or channel partner, to suggest. If you're one of the few who resisted the company's No Naked vSphere push, VMware's going to come at you again, this time as a software-as-a-service vendor.

The Register expects the company will come hardest with AppDefense, because it's created a new business unit to back a product it feels is genuinely new to offer. "Most security is about finding bad; we are about ensuring good," says Tom Corn, senior veep of the Security Product group at VMware.

The Register revealed the basics of AppDefense well before its announcement. We had to wait for today to learn that it can build its whitelist of acceptable VM behaviour by interacting with either vCenter or automated provisioning tools like Jenkins or Maven. Linking with those tools is an effort to make AppDefense offer something to DevOps practitioners. It's also trying to impress line-of-business types by offering them a mobile app that alerts them when applications misbehave, so that all stakeholders can participate in decisions about how to respond.

AppDefense will be sold as SaaS or on-premises software. Either way, it should do well: security types The Register's virtualization desk have spoken to feel Virtzilla is onto something here!

VMware's favourite news from this year's event is that the company's deal with Amazon Web Services has come to fruition. AWS now hosts servers running Cloud Foundation, the bundle of vSphere, VSAN, NSX and vCenter that is intended to mirror on-premises implementations.

It's all available as of today, to run in AWS alone or in a hybrid cloud spanning an on-premises implementation.

For now, it's only in one AWS Region, US West, and you can only buy by the hour. One- and three-year subscriptions are due soon, as is a global rollout that will start soon and continue deep into 2018. There's just one server type, too, and while vSphere lets you slice and dice that as it would any other server, there's no hint of the varied instance types AWS and other clouds offer.

At least the server is pleasingly grunty. Each host has a pair of CPUs, 36 cores, 72 hyper-threads, 512GB of RAM, local flash storage (3.6TB cache, 10.7TB raw capacity tier). But you'll need four hosts to build a cluster!
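Multiplying out the quoted per-host specs for the minimum four-host cluster gives a rough sense of scale (back-of-envelope arithmetic only; usable capacity after VSAN replication and overheads will be lower):

```python
# Per-host figures quoted above: 2 CPUs, 36 cores, 512GB RAM,
# 10.7TB raw capacity tier; four hosts needed for a cluster.

HOSTS = 4
CORES_PER_HOST = 36
RAM_GB_PER_HOST = 512
RAW_TB_PER_HOST = 10.7

print(HOSTS * CORES_PER_HOST)              # 144 physical cores
print(HOSTS * RAM_GB_PER_HOST)             # 2048 GB of RAM
print(round(HOSTS * RAW_TB_PER_HOST, 1))   # 42.8 TB raw capacity
```

That is a chunky minimum footprint, which matters when reading the per-host pricing below.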

There is integration between VMware-on-AWS and some AWS services.

VMware will run and support the service, in contrast to the arrangement it has with IBM and the other ~4,300 vCloud Air Network partners that run vSphere-based clouds. Those partners get a new version of vCloud Director, plus more hardware partners ready to sell them servers ready to roll with Cloud Foundation. And perhaps some worry beads, for stress relief and/or prayer as VMware challenges them like never before, because the new service integrates with some AWS services. We're told that the VMware service lives in the same data centres as services like Lambda, so piping them into apps will be low-latency.

In the past VMware partners have told El Reg they feel VMware's cloud partnerships aren't bad for business, because they get users talking about vSphere-powered clouds. Now we hear some are re-thinking that position, but the pricing for VMware on Amazon may well crimp their concerns, because it isn't super-cheap.

Here's the pricing scheme.

Remember: you'll probably need at least four hosts, so actual costs will be rather more than the single-host cost.

VMware justifies these prices by saying they stack up well on total cost of ownership against either on-prem or public clouds.

Here's the company's math.

That calculation excludes bandwidth and IP address charges, and assumes VMs have a pair of vCPUs, 8GB RAM and 150GB of storage.
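As a back-of-envelope check on that math, the article's host spec and reference VM profile imply the following packing. This is a sketch only: it assumes no vCPU overcommit, and the hourly host price below is a placeholder assumption, not VMware's published figure.

```python
# How many of the reference VMs (2 vCPUs, 8 GB RAM, 150 GB storage) fit on
# one VMware-on-AWS host (72 hyper-threads, 512 GB RAM, 10.7 TB raw storage)?
HOST_THREADS = 72
HOST_RAM_GB = 512
HOST_STORAGE_GB = 10_700

VM_VCPUS = 2
VM_RAM_GB = 8
VM_STORAGE_GB = 150

by_cpu = HOST_THREADS // VM_VCPUS               # 36, with no vCPU overcommit
by_ram = HOST_RAM_GB // VM_RAM_GB               # 64
by_storage = HOST_STORAGE_GB // VM_STORAGE_GB   # 71

vms_per_host = min(by_cpu, by_ram, by_storage)  # CPU is the binding constraint
print(vms_per_host)  # 36

# Hypothetical hourly host price, purely for illustration.
host_price_per_hour = 8.37
print(host_price_per_hour / vms_per_host)  # rough cost per VM-hour
```

With CPU as the limit, RAM and storage sit roughly half and two-thirds empty, which is why real-world sizing (and any vCPU overcommit policy) moves the per-VM number considerably.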

VMware's attempt to build a public cloud failed, as did its early SaaS forays.

The company has now turned that around, because the AWS deal gives it unrivalled scale, albeit at a perhaps-unsettling price.

The new SaaS offerings do two things:

VMware has done astoundingly well to keep Hyper-V's market share small. But anyone who needs new servers or storage now has to consider either hyperconverged infrastructure or Azure Stack, because both offer strong alternatives to traditional infrastructure. Azure Stack also makes hypervisors irrelevant, which leaves the idea of Windows-on-VMware looking a bit archaic.

Starting with last week's earnings call and continuing in pre-VMworld briefings, VMware's counter-argument is that it's happy for you to use Azure in any form, so long as you don't needlessly rip and replace perfectly good vSphere in order to buy into Microsoft's hybrid vision.

The new SaaS tools give you reasons not to ditch vSphere, by making multi-cloud wrangling easier and making vCenter the place you'll do it. AppDefense helps, too, because it looks like a useful tool that won't hurt even if only deployed as one layer of a defense-in-depth strategy. It needs vCenter, too. And if vCenter is the place to do some security, and do multi-cloud management, it's a lot harder to contemplate ejecting it. That the VMware/AWS tie-up has quickly gone beyond IaaS and into AWS' services also suggests Virtzilla has found its way into a position of cloudy strength.

For now, anyway. Clouds move fast, and so do strategies to catch them.


Here is the original post:
VMware-on-AWS is live, and Virtzilla is now a proper SaaS player - The Register


Socionext Partners with Advantech to Offer High-Density, Low-Cost … – Design and Reuse (press release)

Scalable, Robust, Low-power, and Easily Deployable Solutions for service providers and other video intensive applications

SUNNYVALE, Calif. and MILPITAS, Calif., Aug. 31, 2017 -- Socionext Inc., a world leader in hardware HEVC encoding, and Advantech, creator of innovative video acceleration solutions, today announced a strategic partnership to provide live hardware transcoding solutions for the data center supporting MPEG2, AVC (H.264), and HEVC (H.265).

Socionext's real-time dense transcode solution, also known as the "Media Cloud", delivers advanced HEVC compression technology and real-time transcoding capabilities for OTT applications. Socionext's extended partnership with Advantech includes the integration of Socionext's Media Cloud technology into Advantech's VEGA 7000 Family of High Density Video Servers to enable agile and cost-effective live UHD cloud services for the new video-centric era.

"We are seeing an increasing need to lower the cost of ownership by media, telecom and internet companies that are seeking to address the ever-increasing mass consumption of streaming high-quality video," said David Lin, VP of Video Solutions at Advantech. "Socionext, as our valued partner, is able to solve the power, density, and performance technical design requirements we are looking for in order for us to develop a cost-competitive, highly-efficient transcoding solution with adaptive bitrate (ABR) streaming capabilities for live cloud media service providers."

The Advantech VEGA 7000 is a family of accelerated video processing servers which combine best video and IT practices within an off-the-shelf platform that has been optimized to efficiently scale throughput of high-density transcoding applications in live OTT and cloud workflows. Up to four VEGA-3318 accelerators can be integrated into a 1U server to deliver up to 32 x 4Kp60 live HEVC profiles per rack unit, the highest density available in the market. This allows for large-scale, energy- and cost-efficient data center deployments that benefit from a 20X rack space and power reduction when compared to non-accelerated solutions. Advantech VEGA solutions for the data center minimize development efforts by providing a comprehensive software package that features Linux and Windows SDKs, an FFmpeg plug-in and virtualization-friendly drivers supporting OpenStack. Advantech also offers hardware and software design and customization services for maximum deployment flexibility.
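The density claims above reduce to simple arithmetic. A quick sketch (the 40U of usable rack space is an assumption for illustration, not a figure from the release):

```python
# Density figures from the press release: up to four VEGA-3318 accelerators
# per 1U server, delivering up to 32 x 4Kp60 live HEVC channels per rack unit.
ACCELERATORS_PER_RU = 4
CHANNELS_PER_RU = 32
channels_per_accelerator = CHANNELS_PER_RU // ACCELERATORS_PER_RU  # 8

# Assume 40U of usable server space per rack (placeholder assumption).
usable_ru_per_rack = 40
channels_per_rack = CHANNELS_PER_RU * usable_ru_per_rack  # 1280

# The claimed 20X rack-space reduction implies a non-accelerated deployment
# would need 20 rack units for every accelerated one at the same channel count.
non_accelerated_ru = channels_per_rack * 20 // CHANNELS_PER_RU  # 800
print(channels_per_accelerator, channels_per_rack, non_accelerated_ru)
```

So under these assumptions one accelerated rack replaces roughly 800 rack units of software-only transcoding, which is where the energy and space savings come from.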

"Advantech offers decades of expertise in complex hardware and software system integration and design services," said Yasuhiro Wakimoto, VP of the Enterprise Solution Business Unit at Socionext. "Socionext and Advantech have a long history together providing solutions for "Live" transcode broadcasting and processing large volume of media data for video systems. This partnership further extends the close relationship."

Advantech will demonstrate its VEGA 7000 Series of High Density Video Servers for the Media Cloud at IBC 2017 in Hall 11, Booth C32, taking place at RAI, Amsterdam, from September 15-19, 2017. For more information, visit http://www.advantech.com/nc/spotlight/IBC2017 or email video.solutions@advantech.com.

About Advantech

Founded in 1983, Advantech is a leader in providing trusted, innovative products, services, and solutions. Advantech VEGA Video Platforms and PCIe Adapters are designed to boost video infrastructure performance from acquisition to distribution at the lowest power budget while fully complying with the media industry's needs. By providing access to the latest 4K/8K UHD video processing and IP media technologies on commercial-off-the-shelf IT platforms, we accelerate the deployment of next-generation, open and more efficient video solutions across a wide range of applications, from broadcast encoding and high-density OTT transcoding to cloud, mobile and 360-degree video. Advantech's standard portfolio can be tailored to meet a range of system requirements, significantly reducing time-to-market effort for our customers. For more information, visit http://www.video-acceleration.com.

About Socionext Inc.

Socionext is a new, innovative enterprise that designs, develops and delivers System-on-Chip products to customers worldwide. The company is focused on imaging, networking, computing and other dynamic technologies that drive today's leading-edge applications. Socionext combines world-class expertise, experience, and an extensive IP portfolio to provide exceptional solutions and ensure a better quality of experience for customers. Founded in 2015, Socionext Inc. is headquartered in Yokohama, and has offices in Japan, Asia, the United States and Europe to lead its product development and sales activities.


High-Dimensional Quantum Encryption Performed in Real-World … – Futurism

Quantum Encryption

For the first time, researchers have sent a quantum-secured message containing more than one bit of information per photon through the air above a city. The demonstration showed that it could one day be practical to use high-capacity, free-space quantum communication to create a highly secure link between ground-based networks and satellites, a requirement for creating a global quantum encryption network.

Quantum encryption uses photons to encode information in the form of quantum bits. In its simplest form, known as 2D encryption, each photon encodes one bit: either a one or a zero. Scientists have shown that a single photon can encode even more information, a concept known as high-dimensional quantum encryption, but until now this has never been demonstrated with free-space optical communication in real-world conditions. With eight bits necessary to encode just one letter, for example, packing more information into each photon would significantly speed up data transmission.

"Our work is the first to send messages in a secure manner using high-dimensional quantum encryption in realistic city conditions, including turbulence," said research team lead Ebrahim Karimi of the University of Ottawa, Canada. "The secure, free-space communication scheme we demonstrated could potentially link Earth with satellites, securely connect places where it is too expensive to install fiber, or be used for encrypted communication with a moving object, such as an airplane."

As detailed in Optica, The Optical Society's journal for high-impact research, the researchers demonstrated 4D quantum encryption over a free-space optical network spanning two buildings 0.3 kilometers apart at the University of Ottawa. This high-dimensional encryption scheme is referred to as 4D because each photon encodes two bits of information, which provides the four possibilities of 00, 01, 10, or 11.
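The dimension-to-capacity relationship described here is just the base-2 logarithm of the number of states each photon can occupy. A minimal sketch:

```python
import math

def bits_per_photon(d: int) -> float:
    """A photon prepared in one of d distinguishable states carries log2(d) bits."""
    return math.log2(d)

print(bits_per_photon(2))  # 1.0 -- ordinary 2D encryption: one bit per photon
print(bits_per_photon(4))  # 2.0 -- the 4D scheme: two bits (00, 01, 10, 11)

# With eight bits per character, as the article notes, the photons needed
# per letter drop as the dimension rises:
print(math.ceil(8 / bits_per_photon(2)))  # 8 photons per letter in 2D
print(math.ceil(8 / bits_per_photon(4)))  # 4 photons per letter in 4D
```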

In addition to sending more information per photon, high-dimensional quantum encryption can also tolerate more signal-obscuring noise before the transmission becomes insecure. Noise can arise from turbulent air, failed electronics, detectors that don't work properly, and from attempts to intercept the data. "This higher noise threshold means that when 2D quantum encryption fails, you can try to implement 4D because it, in principle, is more secure and more noise-resistant," said Karimi.

Today, mathematical algorithms are used to encrypt text messages, banking transactions and health information. Intercepting these encrypted messages requires figuring out the exact algorithm used to encrypt a given piece of data, a feat that is difficult now but that is expected to become easier in the next decade or so as computers become more powerful.

Given the expectation that current algorithms may not work as well in the future, more attention is being given to stronger encryption techniques such as quantum key distribution, which uses properties of light particles known as quantum states to encode and send the key needed to decrypt encoded data.

Although wired and free-space quantum encryption has been deployed on some small, local networks, implementing it globally will require sending encrypted messages between ground-based stations and the satellite-based quantum communication networks that would link cities and countries. Horizontal tests through the air can be used to simulate sending signals to satellites, with about three horizontal kilometers being roughly equal to sending the signal through the Earth's atmosphere to a satellite.

Before trying a three-kilometer test, the researchers wanted to see if it was even possible to perform 4D quantum encryption outside. This was thought to be so challenging that some other scientists in the field said that the experiment would not work. One of the primary problems faced during any free-space experiment is dealing with air turbulence, which distorts the optical signal.

For the tests, the researchers brought their laboratory optical setups to two different rooftops and covered them with wooden boxes to provide some protection from the elements. After much trial and error, they successfully sent messages secured with 4D quantum encryption over their intracity link. The messages exhibited an error rate of 11 percent, below the 19 percent threshold needed to maintain a secure connection. They also compared 4D encryption with 2D, finding that, after error correction, they could transmit 1.6 times more information per photon with 4D quantum encryption, even with turbulence.

"After bringing equipment that would normally be used in a clean, isolated lab environment to a rooftop that is exposed to the elements and has no vibration isolation, it was very rewarding to see results showing that we could transmit secure data," said Alicia Sit, an undergraduate student in Karimi's lab.

As a next step, the researchers are planning to implement their scheme into a network that includes three links that are about 5.6 kilometers apart and that uses a technology known as adaptive optics to compensate for the turbulence. Eventually, they want to link this network to one that exists now in the city. "Our long-term goal is to implement a quantum communication network with multiple links but using more than four dimensions while trying to get around the turbulence," said Sit.

This article was provided by the Optical Society of America. Materials may have been edited for clarity and brevity.
