
Microsoft Azure: 8 Ways to Save on Cloud Storage – Enterprise Storage Forum

Data storage vendors are pairing up with cloud providers such as Microsoft Azure as a way to broaden their offerings and lower storage costs for their customers. After all, the economies of scale offered by the likes of Google, Microsoft and Amazon make it hard for others to compete on raw cloud storage. But these providers offer further services to add additional value. That can be a blessing or a curse depending on how smartly you deploy them.

Here are eight tips from the experts on how to maximize the value of the cloud storage services provided by Microsoft Azure.

Augie Gonzalez, director of product marketing at DataCore Software, advises taking small steps before throwing all your storage eggs into an Azure basket. There are so many factors to take into account that a mere reading of the pricing sheet won't suffice. You have to experience how it all works and see how that translates into a monthly bill before you can really appreciate the nuances of cloud pricing.

"Explore the variables that you can rapidly adjust to meet variations in the capacity and performance of the Azure cloud storage by starting with a modest pilot program," said Gonzalez.

Plenty of enterprises are evolving all-cloud strategies at the top management level. But it's up to those on the ground floor of storage to inject some reality into the equation. Those at the top are attracted by the huge potential cost savings of the cloud. They have lived through years of on-prem IT and storage cost overruns and budgetary bickering. They are keen to simplify, cut costs and move storage over to more of a utility model.

But caution is advised. It's up to storage managers to figure out tactically how to achieve actual savings. In most cases, that means avoiding an "all-cloud now" approach. That doesn't mean it's an undesirable long-term goal. But in the short term, the best way forward is to find the low-hanging fruit, learn the ropes and add more cloud from there.

"Identify spot uses for Azure cloud storage where its flexibility and convenience bring immediate payoff, without having to deliberate on long-term strategic decisions that tend to bog down the initial taste," said Gonzalez.

Many in IT have experience using colocation facilities. They know how colos work and how to weigh the different elements to determine what is worth colocating and what is not. So for those less familiar with Azure pricing, yet who are being urged by management to head for the cloud in a big way, Gonzalez's advice is to think of the cloud in terms similar to colocation economics.

"Look at Azure the way you might assess a colocation facility, but with someone else taking care of the day-to-day chores necessary to keep the servers and storage infrastructure running well," he said. "Tap the services of experienced hybrid cloud solution providers to expedite the process."

The above points all add up to gaining an understanding of all the real-world costs of cloud storage. "This goes far beyond storage capacity, and delves into bandwidth, compute fees, the cost of API calls, event monitoring, and the oft-forgotten inter-region, intra-region and intra-cloud communication charges," said Greg Schulz, an analyst at StorageIO Group.

"Likewise, understand the difference between ephemeral storage local to the instance and persistent storage, as well as other cloud storage options," he said. "It's not just about blobs, objects, containers and buckets."

Schulz added that flexibility is key, and that means finding the right balance. A blinkered look only at very low storage costs in the cloud may appear to save a bundle. But the corresponding compute charges, or network, gateway and API fees, may kill any real savings. It's best to review the various services and see which one works best for you. That may mean paying a little more for storage in order to get more compute and lower latency. That might either be cheaper overall or gain the organization greater productivity.

"Look at all of your options, including where your applications are going to be located in order to maximize cloud efficiency," said Schulz. "Also, understand how licensing works. There can be pricing advantages which are constantly changing, as are the resiliency, regions and location support."
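To make those cost components concrete, here is a minimal bill-estimator sketch. The function name and every rate in it are invented placeholders (real Azure pricing varies by tier, region and redundancy option); the point is that egress bandwidth and API transactions join raw capacity on the invoice.

```python
# Back-of-the-envelope monthly bill model for cloud blob storage.
# All rates below are hypothetical placeholders, not actual Azure prices.

def monthly_storage_cost(gb_stored, gb_egress, api_calls,
                         rate_per_gb=0.02,           # capacity, $/GB-month
                         rate_per_gb_egress=0.08,    # outbound bandwidth, $/GB
                         rate_per_10k_calls=0.004):  # transaction fees
    """Sum the often-forgotten cost components, not just raw capacity."""
    capacity = gb_stored * rate_per_gb
    egress = gb_egress * rate_per_gb_egress
    transactions = (api_calls / 10_000) * rate_per_10k_calls
    return capacity + egress + transactions

# 10 TB stored, 2 TB downloaded, 50 million API operations in a month
print(f"${monthly_storage_cost(10_000, 2_000, 50_000_000):,.2f}")
```

Even with these made-up numbers, egress and transactions add nearly half as much again to the capacity charge, which is exactly the kind of surprise Schulz warns about.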

An important consideration in any cloud strategy is latency. You don't want your users to have to go out to the cloud every time they need to access data. That could mean delay. After all, the request has to come from your own internal systems, travel over the web to the cloud, be processed there, and then make its way back.

There are ways around this, of course. Azure offers compute resources, at a premium, that greatly reduce latency. Similarly, storage providers add value by taking the latency out of the process via various strategies. Nasuni, for example, has a cache-from-cloud architecture.

"Azure is used to store the authoritative gold copies of all files, but frequently accessed files are stored locally in edge appliances for fast access," said Warren Mead, vice president of alliances and business development at Nasuni.

To paraphrase Scottish poet Robert Burns, the best-laid plans of mice and storage managers often go astray. This is particularly the case when security is not taken into account in an otherwise carefully thought-out plan to slash storage costs when heading for the cloud. The plan may save millions, but if it violates security policies or leaves the organization with less control, it won't be approved.

"Many cloud storage services either do not use encryption, or hold the encryption keys themselves," said Mead. "You also need to consider what authentication and access procedures will be used for cloud storage."

Again, storage providers are coming up with ways to address security concerns and give enterprises greater control. Nasuni, for instance, lets customers hold their own encryption keys, which means neither Microsoft nor Nasuni can access sensitive data. Similarly, it integrates with on-premises Active Directory (AD) implementations and uses standard CIFS/SMB protocols to present file shares on the edge appliances just as a traditional NAS would. Access to the cached local data is governed by standard AD authentication, and the usual drive letters still apply. As a result, user drives don't have to be re-mapped, and automation scripts and workflows don't need to be changed.

Yes, caution is advisable, and Azure storage and services sometimes cost more than expected. But for everyone who has gotten an unfortunate surprise at the end of the billing period, there are many more who have reaped the benefits of cloud storage financially and otherwise.

That's why Schulz recommends running a proof of concept covering functionality, management, day-to-day operations, troubleshooting and process refinement, as well as testing performance, before you leap. Also, know what tools you have in your toolbox for moving, migrating, optimizing and managing cloud services and cloud storage.

"But don't be afraid of using cloud services; just be prepared and informed," said Schulz.

In other words, be bold. Fortune favors the bold, after all.

Photo courtesy of Shutterstock.


Trump Effect Could Cost US Cloud Providers Over $10 Billion: Report – Web Host Industry Review

Cloud computing companies in the U.S. could lose more than $10 billion by 2020 as a result of the Trump administration's reputation regarding data privacy, according to Swiss hosting company Artmotion.

A whitepaper published by Artmotion suggests that growth in U.S. cloud revenue relative to the rest of the world will decline significantly more than previously forecast by IDC.

See also: Tech Goes From White House to Doghouse in Trump's Washington

IDC's Worldwide Public Cloud Services Spending Guide predicts that the U.S. will account for 60 percent of cloud revenue worldwide through 2020. The same research, however, suggests revenue growth in the U.S. will be lower than in all seven other regions analyzed by IDC, and according to Artmotion it does not take into account the sharply falling confidence businesses have in the capacity of U.S. companies to protect the privacy of data in the cloud.

"While these figures may be concerning for U.S. service providers already, they don't take full account of the scale of the disapproval of President Trump's actions since taking office," according to Mateo Meier, CEO of Artmotion.

Artmotion's own research shows that half of U.S. and U.K. citizens feel online data privacy is less secure under President Trump. Further, 24 percent are most concerned about their own government, while only 20 percent consider the Russian government most concerning, and 15 percent fear the Chinese government. Both Russia and China were considered a greater threat to data privacy by Americans in Artmotion's 2015 survey.

Artmotion, which has seen a 14 percent increase in revenue from U.S. companies from 2016 to 2017, estimates the total loss in revenue to U.S. cloud companies will be $1 billion this year, more than $3 billion in each of 2018 and 2019, and $2.7 billion in 2020.

The whitepaper was released before the U.S. Department of Justice took the unprecedented step of demanding visitor logs containing the IP addresses of visitors to the anti-Trump website disruptj20.org.

The study cites survey results released by Pew Research Center, which show confidence in the U.S. president's handling of international affairs falling from 64 percent at the end of Obama's term to only 22 percent at the beginning of Trump's.

"(A)ny government, legislative and regulatory uncertainty is likely to make organizations think twice about where they host their data," Meier writes in the whitepaper.

Artmotion reported a 45 percent increase in revenue immediately following the PRISM program revelations of 2013, though the loss of business confidence in U.S. data privacy protections did not reach the $35 billion impact through 2016 estimated by the Information Technology and Innovation Foundation at the time.

Part of the reason for this may be slower than expected cloud adoption by European businesses.

IDC research indicates that Western European cloud adoption is about to catch up with that of U.S. businesses, just as the EU seeks clarification from the U.S. about the impact on the EU-US Privacy Shield of an executive order signed early in President Trump's term.


How to make upwards of $1,000 a month by mining cryptocurrency – Mashable

If you're confused about Bitcoin and other cryptocurrencies, you're not alone.

Image: Pixabay

By Team Commerce, Mashable Shopping, 2017-09-02 12:00:00 UTC

In essence, cryptocurrencies are decentralized digital currencies that can be sent to anyone over the internet. They aren't affiliated with any particular country, so there's no central bank that verifies transactions. Instead, cryptocurrency miners use special software that creates a public record of each transaction and pays the miner in return.

If you know what you're doing, you can make a lot of money mining this digital currency. But how does it work, and what's the best way to do it? You can learn all of this from the Beginner's Guide to Cryptocurrency Mining.

This course gives you access to 13 lectures so you can hit the ground running and make real money fast. You'll learn a mining system that has low startup costs and requires no affiliate marketing or graphics card. You'll also learn all the technical details about blockchains, general ledgers, hashes, and nonces that make up each successful transaction.
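The "hashes and nonces" mentioned above are the heart of proof-of-work mining. As a rough illustration (a toy, not a money-maker; the function and input string are invented for this sketch), a miner repeatedly hashes the block data with an incrementing nonce until the digest meets a difficulty target:

```python
# Toy proof-of-work: find a nonce whose SHA-256 digest starts with
# `difficulty` zero hex digits. Real mining uses the same idea, but at a
# difficulty and speed far beyond anything a home script can reach.
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # this nonce "wins" the block
        nonce += 1

nonce = mine("some transactions", difficulty=4)
print("winning nonce:", nonce)
```

Each extra zero in the target multiplies the expected work by 16, which is why real networks tune difficulty so blocks arrive at a steady rate.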

By the time you've finished this course, you could be earning up to $1,000 per month from the comfort of your own home. The Beginner's Guide to Cryptocurrency Mining normally costs $180, but you can get it for just $15 today. Plus, over Labor Day weekend you can save 15 percent by using the code BYESUMMER.


Chinese Officials Contemplate Suspending Cryptocurrency ICOs – The Merkle

Not too long ago, Chinese regulators voiced their concerns regarding cryptocurrency ICOs and the amount of money that companies have raised through those events. There is plenty of reason to be concerned over the lack of regulation, as most of these companies do not have a license to issue securities. It now appears the Chinese regulators are seeking to warn the public about the risks pertaining to ICOs. They are advising the general public to report any suspicious ICO activity to the police. Things are not looking good for ICOs in China.

Various governments around the world have not taken too kindly to the concept of cryptocurrency ICOs. Given the huge amount of money raised by projects without regulations or licenses in place, there is valid reason for concern. Some of these ICOs may pose a genuine risk to investors. Chinese regulators are very concerned about this way of raising a lot of money and want to address the hidden financial risks it presents.

According to Reuters, Chinese citizens are being asked to report suspected crimes to the police. In particular, they are instructed to report any suspicious ICO activity to the authorities as quickly as possible. It is a bit unclear what would classify an ICO as suspicious, since virtually every project runs into some issues along the way. Sites get hacked, information is leaked, or deadlines are not met. All of those events are suspicious in nature, although it is difficult to hold companies accountable for their actions (or lack thereof).

Chinese regulators are preparing new rules on digital currency offerings. It is certainly possible cryptocurrency ICOs might be suspended in China altogether until the new regulations are in place. No one knows for sure when that will happen, though. Regulatory efforts like these can take anywhere from a few weeks to months or even years to be fully implemented. With 65 ICOs organized in China to date and around US$400 million raised, these projects will face a lot of scrutiny from officials. They also highlight the dire need for proper ICO regulation in China and the rest of the world.

No one can deny these coin offerings have shaken up the financial industry quite a bit. They have "disrupted the economic order and created relatively large hidden risks," according to a Chinese spokesperson. Those are some very serious comments, which will hinder the growth of ICOs in China for the foreseeable future. Suspending such fundraising efforts will have a similar effect. China is known for focusing on regulation first and foremost, even at the cost of stifling innovation a bit. The same issues have affected Bitcoin exchanges recently.

Similar to Bitcoin, ICO tokens are a legal gray area for the time being. Over in the United States, a lot of these tokens may be labeled as securities, which could spell disaster for companies conducting ICOs in the past, present, and future. For the time being, it is unclear what the future will bring for cryptocurrency ICOs in China, but things are not looking all that great. If ICOs were officially suspended in the country, raising funds would become a lot more difficult, to say the least.

Suspending ICOs across China will be very difficult, though. Companies could try to prevent Chinese users from contributing money, but those measures would be bypassed with relative ease. The same applies to projects barring U.S. investors: all they do is introduce a short form and IP blocks, and a block can easily be bypassed with a VPN or proxy connection. For the time being, raising funds with cryptocurrency is still somewhat legal in China, although things may change pretty quickly.


VMware officially lands on AWS cloud with new management and security features – SiliconANGLE News (blog)

Nearly a year after signing a landmark deal to bring its software-defined data center technology to the Amazon Web Services Inc. cloud, VMware Inc. kicked off its VMworld conference in Las Vegas today with the news that VMware Cloud on AWS is now generally available.

The service essentially enables the vast majority of companies that use VMware inside their data centers to use VMware software, which allows different operating systems and multiple applications to run on the same physical computer, with AWS services as well.

To date, companies have had difficulty moving workloads to Amazon's cloud to take advantage of its more flexible and lower-cost computing and storage services, because many of their applications depended on VMware software that only ran on computers in company data centers. That presented customers of each provider with a tough choice: use the VMware technology their core applications were built on, but with none of the cost benefits and flexibility of cloud computing, or use Amazon's cloud, but without the VMware software their data centers are built on.

"They hated this binary decision that we were forcing on them," AWS Chief Executive Andy Jassy (pictured, right) said during an appearance this morning at VMworld with VMware CEO Pat Gelsinger (left). Now, the executives said, customers can more easily use so-called hybrid cloud services that combine on-premises software and hardware with cloud services as needed.

"If this fully works, CIOs have no excuse in regard to moving VMware loads to the cloud," said Holger Mueller, vice president and principal analyst at Constellation Research. "But let's see if this works."

VMware, part of Dell Technologies Inc.'s constellation of companies that also includes storage supplier Dell EMC, also announced a raft of services for the VMware Cloud today. Initially, VMware Cloud is available in the AWS U.S. West region, but other regions will be added throughout 2018. VMware said the integration will enable customers to run applications across operationally consistent vSphere-based private, public and hybrid cloud environments, with the option of expanding to AWS elastic or bare-metal infrastructure.

When the AWS-VMware deal was announced last October, it was apparent that it could reset the competitive environment in computing, in particular presenting new challenges for IBM Corp., which had signed a deal with VMware earlier in 2016, Google Inc.'s cloud platform, and Microsoft Corp., whose No. 2-ranked Azure public cloud had claimed the lead in hybrid cloud computing.

The arrangement with AWS offers some benefits for VMware, including a connection to the leading public cloud provider that its customers have been clamoring for. "When your own cloud fails, you need to join the ones that work," Mueller told SiliconANGLE. VMware now focuses on add-on software, such as application security.

But it also means AWS could ultimately steal some of VMware's customers, if it results in what Dave Vellante, chief analyst at SiliconANGLE Media's Wikibon, has called a potential "one-way trip to Amazon cloudville." Moreover, said Mueller, the arrangement doesn't help Dell sell more servers into on-premises data centers.

As for Amazon, Mueller said, "AWS needs a piece of the on-premises enterprise load and this is the way." He added that the fact that AWS is offering to host VMware instances on so-called bare-metal servers, those with no operating software installed on them, indicates how much it needs VMware's help to reach large enterprise customers, since AWS had generally eschewed bare-metal arrangements.

The offering will be delivered, sold and supported by VMware as an on-demand service. It's powered by VMware Cloud Foundation, a software-defined data center platform that includes vSphere, VMware vSAN and VMware NSX virtualization technologies managed by VMware vCenter. The initial set of cloud services includes six modules:

Discovery centralizes inventory information and cloud accounts across AWS, Microsoft Azure and VMware clouds, making it easier for information technology departments to search for and identify workloads. Administrators can group cloud resources even if they span multiple clouds. Built-in search and filters enable administrators to filter resources based on cloud attributes.

AppDefense protects applications by embedding application control and threat detection and response capabilities into vSphere-based environments. It's tightly integrated with the NSX networking platform, and operates within the vSphere hypervisor to create a knowledge base of the correct state and behavior of each endpoint for change detection.

Cost Insight helps organizations analyze their cloud spending and identify savings opportunities. It provides detailed visibility into public and private cloud costs on AWS, Azure and VMware environments and enables drill-down to identify cost drivers. Cost Insight also identifies stopped virtual machines and associated storage resources across public and private clouds to reduce waste.

Network Insight analyzes application traffic flows between different tiers, virtual and physical network layers and public and private clouds. This has application security and load balancing applications, and makes it easier for cloud administrators to manage and troubleshoot large-scale NSX deployments.

NSX Cloud provides a single management console and common application program interface for monitoring and securing applications that span multiple private and public clouds. It features a micro-segmentation security policy that can be defined once and applied to application workloads running anywhere.

Wavefront is a metrics monitoring and analytics platform that gives developers insight into the performance of highly distributed cloud-native services to detect performance anomalies while enabling high availability. Operating at what VMware says is massive scale, Wavefront gives DevOps teams instant visualization of millions of data points per second, helping them resolve bottlenecks more efficiently and proactively.

VMware also said it's expanding Cloud Foundation's scope with new partner offerings. They include support from CenturyLink Inc., Rackspace Inc. and Fujitsu Ltd. New hardware platforms that support Cloud Foundation include Dell EMC's VxRack SDDC, Hitachi Data Systems Corp.'s UCP-RS, Fujitsu Primeflex and Quanta Cloud Technology LLC's QxStack.

VMware's shares closed up nearly 2 percent today, at about $104.68 a share, on a relatively flat day for the overall market.

With reporting from Robert Hof

(* Disclosure: SiliconANGLE Medias video unit, theCUBE, is a paid media partner at VMworld. Stories on SiliconANGLE are written independently of coverage on theCUBE. Sponsors have no editorial influence on content on SiliconANGLE or theCUBE.)


Bitcoin Cash Statistics Confirm BCH is Just Another Altcoin – Live Bitcoin News

With almost 6,000 blocks mined on the Bitcoin Cash network, now is a good time to check some statistics. It is still less profitable to mine than Bitcoin, which is not surprising. Bitcoin Cash has around 8% of the value of Bitcoin right now, and its blockchain operates at 13% of the original chain's difficulty. Moreover, BCH is no longer the longest blockchain either. All of these developments are pretty interesting and somewhat surprising.

One could argue Bitcoin Cash has not achieved all that much. That would be a rather shortsighted statement, though. After all, no one expected this project to stick around for as long as it has. It still has a fair bit of support from mining pools and miners alike, which is good to see. Moreover, the BCH value has somewhat stabilized around $550 as well. Unfortunately, it is inferior to Bitcoin in all other aspects one can think of right now.

First of all, the Bitcoin Cash mining difficulty is still extremely low. The emergency difficulty adjustment (EDA) has always been subject to some wild speculation. Some people feel miners can effectively trick the EDA into allowing them to mine more coins. Whether or not this is the case will always remain a bit of a mystery. It is certainly true the difficulty adjustment algorithm shakes things up a bit.

Despite this lower difficulty, Bitcoin Cash is still pretty unprofitable to mine. It has been more profitable than BTC mining on only two occasions so far. The majority of the time, however, it won't net you any major income, unless you believe BCH will double or triple in value. Should that happen, the people mining BCH right now will have a big payday to look forward to. It is unclear what the future holds for this altcoin in this regard.
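The two ratios quoted earlier explain the profitability gap. Assuming equal block rewards and revenue proportional to price divided by difficulty (a simplification that ignores transaction fees and reward-schedule quirks), the article's figures work out to BCH mining earning only around 60% of what the same hash power would earn on Bitcoin:

```python
# Relative revenue per unit of hash power, under the simplifying
# assumption that revenue is proportional to (coin price / difficulty)
# and block rewards are equal on both chains.
def relative_profitability(price_ratio: float, difficulty_ratio: float) -> float:
    return price_ratio / difficulty_ratio

# Figures from the article: BCH at ~8% of BTC's value, ~13% of its difficulty.
ratio = relative_profitability(0.08, 0.13)
print(f"BCH mining earns ~{ratio:.0%} of equivalent BTC mining revenue")
```

By this rough measure, the lower difficulty does not compensate for the lower coin price, matching the article's observation that BCH mining usually pays less.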

Additionally, it appears the BCH blockchain is no longer the longest chain. Although it is only a minor title, it would have given Bitcoin Cash some advantage over Bitcoin. With this factor out of reach as well, there is no reason to consider BCH superior in any regard. It has all the traits of an altcoin, and not much is changing in this regard. That doesn't mean it's not worth paying attention to. However, it will not rival Bitcoin in any significant manner any time soon.

Header image courtesy of Shutterstock


We’re About to Cross The ‘Quantum Supremacy’ Limit in Computing – ScienceAlert

The 4th International Conference on Quantum Technologies held in Moscow last month was supposed to put the spotlight on Google, who were preparing to give a lecture on a 49-qubit quantum computer they have in the works.

A morning talk presented by Harvard University's Mikhail Lukin, however, upstaged that evening's event with a small announcement of his own: his team of American and Russian researchers had successfully tested a 51-qubit device, setting a landmark in the race for quantum supremacy.

Quantum computers are considered to be part of the next generation in revolutionary technology; devices that make use of the odd ‘in-between’ states of quantum particles to accelerate the processing power of digital machines.

The truth is both fascinating and disappointing. It’s unlikely we’ll be playing Grand Theft Auto VR8K-3000 on a quantum-souped Playstation 7 any time soon. Sorry, folks.

Quantum computing isn’t all about swapping one kind of chip for a faster one.

What it does do is give us a third kind of bit where typical computers have only two. In quantum computing, we apply quantum superposition (that odd cloud of 'maybes' a particle occupies before observation cements its existence as one of two different states) to solving highly complex computational problems.

While those kinds of problems are a long, tedious process that taxes even our best supercomputers, a quantum computer's "qubit" mix of 1s, 0s, and that extra space in between can make exercises such as simulating quantum systems in molecules or factoring large numbers into primes vastly easier to crunch.

That’s not to say quantum computing could never be a useful addition for your home desktop. But to even begin dreaming of the possibilities, there are a whole number of problems to solve first.

One of them is to ramp up a measly handful of qubits from less than 20 to something that can begin to rival our best classical supercomputers on those trickier tasks.

That number? About 50-odd, a figure that’s often referred to in rather rapturous terms as quantum supremacy.
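One way to see why roughly 50 qubits marks the horizon: a classical simulation must store one complex amplitude per basis state, i.e. 2^n amplitudes for n qubits. A quick back-of-the-envelope calculation (assuming 16 bytes per double-precision complex amplitude; the function name is ours) shows the memory wall:

```python
# Memory needed to hold a full n-qubit state vector on a classical machine:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (20, 30, 40, 50):
    print(f"{n} qubits -> {statevector_bytes(n) / 2**30:,.0f} GiB")
```

At 50 qubits the state vector runs to roughly 16 million GiB, which is why devices past this size can no longer be checked exhaustively by classical simulation.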

The Harvard device was based on an array of super-cooled atoms of rubidium held in a trap of magnets and laser ‘tweezers’ that were then excited in a fashion that allowed their quantum states to be used as a single system.

The researchers were able to control 51 of these trapped atoms in such a way that they could model some pretty complex quantum mechanics, something well out of reach of your everyday desktop computer.

While the modelling was mostly used to test the limits of this kind of set-up, the researchers gained useful insights into the quantum dynamics associated with what’s called many-body phenomena.

Fortunately they were still able to test their relatively simpler discoveries using classical computers, finding their technique was right on the money.

The research is currently on the preprint website arXiv.org, awaiting peer review. But the announcement certainly has the quantum computing community talking about the possibilities and consequences of reaching such limits.

The magical number of 50 qubits is more like a relative horizon than a true landmark. Not much has changed in the world of quantum computing with the Harvard announcement, and we still have a long way to go before this kind of technology will be useful in making any significant discoveries.

Google’s own plan for a 49-qubit device uses a completely different process to Lukin’s, relying on multiple-qubit quantum chips that employ a solid-state superconducting structure called a Josephson junction.

They’ve proven their technology with a simpler 9-qubit version, and plan to gradually step up to their goal.

Without going into detail, each of the technologies has its pros and cons when it comes to scaling and reliability.

A significant problem with quantum computing will be how to make the system as reliable and error-free as possible. While classical computing can duplicate processes to reduce the risk of mistakes, the probabilistic nature of qubits makes this impossible for quantum calculations.

There’s also the question on how to connect a number of units together to form ever larger processors.

Which methods will address these concerns best in the long run is anybody’s guess.

“There are several platforms that are very promising, and they are all entering the regime where it is getting interesting, you know, system sizes you cannot simulate with classical computers,” Lukin said to Himanshu Goenka from International Business Times.

“But I think it is way premature to pick a winner among them. Moreover, if we are thinking about truly large scales, hundreds of thousands of qubits, systems which will be needed for some algorithms, to be honest, I don’t think anyone knows how to go there.”

It’s a small step on the road to a hundred thousand qubits, but it doesn’t make passing this milestone any less significant.

Happy 51, Harvard!


Explaining the Most Recent Record for Quantum Computing: A 51-Qubit Quantum Computer Array – All About Circuits

Last month, a team of Russian and American scientists unveiled a quantum computer array with 51 qubits at the International Conference on Quantum Technologies in Moscow. Here’s a look at how they accomplished this new milestone with the use of cold atoms and lasers.

If you’re already familiar with quantum computing, I recommend skipping to the next section. If you’re not familiar with quantum computing, it is aptly named for its quantum properties. In quantum physics, particles do not have a defined location until they are observed. In classical computing, digital data is read in bits, which are 1s and 0s, or ON and OFF states, which we know as binary, which can be manipulated into different arrangements using various logic gates.

Quantum computing combines concepts from classical computing and quantum mechanics to make qubits (short for "quantum bits"). Unlike classical bits, a qubit can be a 1 and a 0 at the same time, much like Schrödinger's cat, which is in an indeterminate state until observed. So, four bits have 16 possible combinations (2^4), whereas four qubits can be in a superposition of every possible combination at the same time until they are observed. Loosely speaking, this lets a quantum computer explore many possible computations at once. A quantum search algorithm reduces the time required for large searches roughly to the square root of the number of entries.
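The square-root claim refers to Grover's algorithm for unstructured search. A sketch of the query-count comparison (this counts queries only; it is not a quantum simulation, and the function names are ours):

```python
# Classical unstructured search needs O(N) lookups in the worst case;
# Grover's algorithm needs on the order of sqrt(N) quantum queries.
import math

def classical_queries(n_entries: int) -> int:
    return n_entries  # worst case: check every entry

def grover_queries(n_entries: int) -> int:
    return math.isqrt(n_entries)  # ~sqrt(N), ignoring constant factors

for n in (16, 1_000_000, 2**40):
    print(f"{n:>15,} entries: classical {classical_queries(n):>15,}, "
          f"Grover ~{grover_queries(n):,}")
```

For a million entries the gap is a thousandfold, and it keeps widening as the search space grows, which is where the quantum advantage for this class of problem comes from.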

Quantum computers are not practical for most tasks handled by personal computers, but they excel at large-scale calculations such as searching databases, running simulations, and even breaking encryption. The video below is the simplest explanation of quantum computing I have seen so far.

It seems like every few months, quantum computing reaches a new milestone. Last month, at the International Conference on Quantum Technologies in Moscow, attendees and reporters gathered en masse for Professor John Martinis’ presentation of a chip embedded with 49 qubits. Instead, in a fashion that reminds me of Steve Harvey announcing the Miss Universe pageant, Mikhail Lukin, a Harvard professor and co-founder of the Russian Quantum Center, made his own announcement and stole the show.

Lukin’s team had successfully created the world’s most powerful, functional quantum computer to date, which runs on 51 qubits. The device was tested successfully at Harvard, where it solved physics problems that silicon chip-based supercomputers were struggling with.

Most quantum computers have been designed using superconductors and even semiconductors. Martinis’ 49-qubit chip was constructed in this fashion. Since traditional semiconductor materials are reaching their limits, Lukin’s team took a different approach.

The 51-qubit machine uses “cold atoms,” held in place by lasers, as its qubits. Cold atom physics is the discipline of studying atoms at incredibly low temperatures (around 0.0000001 kelvin) in order to recreate quantum conditions. Cooling atoms to temperatures near absolute zero slows their movement and makes them easier to observe. The video below gives an introduction to cold atom physics (starting at 1:35). After that, we’ll get into the biggest question I had about all of this:

How the heck do super-cooled atoms with lasers shining through them make a computer?

Lukin’s team wrote a research paper (PDF) explaining the experiment they set up. After sifting through the equations, I arrived at the data-reading mechanism. The setup consists of a linear array of 101 evenly spaced “optical tweezers,” which are generated by feeding a multi-tone RF signal into an acousto-optic deflector.

In simpler terms, they shine a laser beam through a vacuum tube and take fluorescence images (a type of laser scanning microscopy) of the atoms as they change between positions. The “traps” that control the position of the atoms are programmable, which allows this super-cooled vacuum tube with a laser shooting through it to function like a quantum computer.

As computing devices become ever smaller, engineers have been teaming up with scientists from other disciplines like physics and biology to make some outside-the-box computing devices. Although it’s unlikely that any of these will end up in personal devices anytime soon (or ever), this work always reminds me that a computer is just a device that calculates problems, and what our concept of a “computer” will look like in 100 years might just be beyond our current levels of comprehension.

If you’d like to learn more about quantum computing, I’ve compiled some resources below along with some of my favorite outlandish non-silicon computers!

Featured image used courtesy of Kurzgesagt


Searching for the future of currency, major companies try Bitcoin technology – PBS NewsHour

HARI SREENIVASAN: I’m walking on Wall Street with author Don Tapscott. He’s written a dozen books on technology and sees one that could change everything around us. He’s not the only believer. While the Dow Jones Industrial Average is up about 20 percent in the past year, Bitcoin, a digital currency, is up more than 700 percent, with a total value of near $80 billion. That’s more than American Express. The surge has people wondering whether Bitcoin is in a bubble.

For Tapscott, that question is missing the real story.

DON TAPSCOTT, AUTHOR, “BLOCKCHAIN REVOLUTION”: The real pony here is the underlying technology called the blockchain.

HARI SREENIVASAN: Tapscott and his son co-wrote a book called “Blockchain Revolution,” named after the technology that supports Bitcoin and other so-called cryptocurrencies. They’re called that because of the cryptography, or computer code, that makes them secure.

Tapscott says the technology is the key to creating trust in peer-to-peer transactions, like sending or receiving money without a bank or a credit card company in between.

DON TAPSCOTT: Trust is achieved not by a big intermediary; it’s achieved by cryptography, by collaboration and by some clever code.

HARI SREENIVASAN: Here’s how the blockchain works: when you send or receive an asset, the transaction is recorded in a global, public ledger. A network of millions of computers stores copies of that ledger and works to validate new transactions in blocks. When each block is verified, it’s sealed and connected to the preceding block, which in turn is connected to every block that has ever been validated, creating a secure blockchain.
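The block-linking described above can be sketched as a toy hash chain. This is a deliberately simplified Python model (the structure and names are mine, not Bitcoin’s actual data format), showing why tampering with one block breaks the link to the next:

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    # Each block commits to its transactions AND to the hash of the
    # previous block, so altering any earlier block changes every
    # hash that follows it.
    body = json.dumps({"txs": transactions, "prev": prev_hash}, sort_keys=True)
    return {"txs": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
block2 = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])

# Rewriting the first block yields a different hash, so the second
# block's stored "prev" link no longer matches.
tampered = make_block(["alice pays bob 500"], prev_hash="0" * 64)
print(block2["prev"] == genesis["hash"])   # True
print(block2["prev"] == tampered["hash"])  # False
```

In a real network, every node holds a copy of this chain and rejects any block whose link does not match, which is the basis of Tapscott’s “hack every block on millions of computers” point.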

DON TAPSCOTT: There is now an immutable record of that transaction. And if I wanted to go and hack that transaction, say to use that money to pay somebody else, I’d have to hack that block, plus the previous block, plus the entire history of commerce on that blockchain, not just on one computer but across millions of computers simultaneously, all using the highest level of cryptography, while the most powerful computing resource in the world is watching me. The way I like to think of it is that a blockchain is a highly processed thing, sort of like a chicken nugget, and if you wanted to hack it, it’d be like turning a chicken nugget back into a chicken. Now, someday someone will be able to do that. But for now, it’s going to be tough.

HARI SREENIVASAN: Tapscott predicts these global ledgers, or blockchains, could affect several parts of the economy during the next decade, in particular, the financial industry.

HARI SREENIVASAN: In a blockchain future, what happens to the New York Stock Exchange?

DON TAPSCOTT: Well, a likely scenario is it becomes a fabulous museum, and it is a beautiful building when you think about it. But buying and selling a stock can be done peer-to-peer now using new blockchain platforms.

HARI SREENIVASAN: He says routine transactions, like using a credit card or making online payments with PayPal or Venmo, could be replaced with instant, peer-to-peer blockchain transactions, cutting how long they take and shrinking the costs.

DON TAPSCOTT: Think about something like you tap your card in a Starbucks, and a bunch of messages go through different companies, some of them using, you know, 30-year-old technology, and three days later, a settlement occurs. Well, if all of that were on a blockchain, there would be no three-day delay. The payment and the settlement are the same activity, so it would happen instantly and in a secure way. So that’s either going to disintermediate those players, or, if those players are smart, they’ll embrace this technology to speed up the whole metabolism of the financial industry.

HARI SREENIVASAN: Beyond upending financial transactions, Tapscott imagines a future where a blockchain could be used to transfer any kind of asset, from a users personal data to intellectual property.

Some of that has already begun. This is Consensys, a technology start-up in Brooklyn, New York. Joseph Lubin founded Consensys and helped develop the Ethereum blockchain, the second biggest blockchain in the world after Bitcoin. Ethereum launched in 2015.

JOSEPH LUBIN, CONSENSYS: Ethereum is by far the most powerful blockchain platform out there. It has the most expressive programming language.

HARI SREENIVASAN: Meaning Ethereum can do something pretty radical: it allows for what are known as smart contracts to be built into the code. So it can also transfer a set of instructions or conditions.

DON TAPSCOTT: It’s kind of like what it sounds like: it’s a contract that self-executes, and it has a payment system built into it. Sort of like a contract that has built-in lawyers and governments and a bank account.
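A self-executing agreement of this kind can be sketched in ordinary Python. This is a toy model of the idea, not Ethereum’s actual contract language (real smart contracts are typically written in Solidity; the class, names, and rules here are mine):

```python
class EscrowContract:
    """Toy smart contract: holds funds and releases them only when
    a programmable condition is met -- no intermediary involved."""

    def __init__(self, buyer_deposit, release_condition):
        self.balance = buyer_deposit          # the built-in "bank account"
        self.release_condition = release_condition

    def settle(self, delivered):
        # The contract itself enforces the terms of the agreement.
        if self.release_condition(delivered):
            payout, self.balance = self.balance, 0
            return payout          # funds released to the seller
        return 0                   # condition unmet; funds stay locked

contract = EscrowContract(100, release_condition=lambda d: d is True)
print(contract.settle(delivered=False))  # 0 -- nothing released yet
print(contract.settle(delivered=True))   # 100 -- payment self-executes
```

On a real blockchain, this logic would run on every validating node, so no single party could change the terms after the fact.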

HARI SREENIVASAN: At Consensys, one project applies this idea to music.

JESSE GRUSHACK: Click “buy album”

HARI SREENIVASAN: Jesse Grushack is the founder of Ujo, a music platform for artists to distribute their music through the blockchain. Artists decide what price to sell their music at and pocket more from their intellectual property.

JESSE GRUSHACK, UJO MUSIC: We’re looking at how to make the music industry more efficient, but at the end of the day, our top-level goal is getting artists paid more for their work and all their creative content.

HARI SREENIVASAN: But Ujo is not yet easy to use. There is only one album on the platform, and it requires users to buy music with ether, the cryptocurrency used on the Ethereum blockchain.

JESSE GRUSHACK: The blockchain is still in its infancy right now. It’s still kind of in the Netscape phase, really, of the internet. You don’t have that AOL, you don’t have that landing page that opens the world up to you. It’s still a little nerdy, it’s still a little technical, but we’re working really hard to kind of make it usable, make the user experience seamless, because really we want this technology to be in the hands of everyone.

HARI SREENIVASAN: When he said “a little nerdy,” he wasn’t kidding. To get an idea, I went out and bought some cryptocurrencies online, and the process was not easy. Certainly not as easy as going to the bank to get cash or calling a stockbroker to buy a stock. But then, using my first email account in the early ’90s wasn’t easy either.

DON TAPSCOTT: I think we’re in 1994. And in ’94, we had the internet, and most people were using it for a single application: email. And that’s kind of like Bitcoin today. The application is called a currency, but we’re starting to see the rise of the web as we did in ’94: a general-purpose platform for building applications that changed many, many industries.

HARI SREENIVASAN: You’ve literally written the book on the blockchain. How do you know that this is actually working, that people are believing in this, investing in this, understanding the potential in this?

DON TAPSCOTT: In every single industry now, companies are starting to implement pilots to explore how this technology can change their operations.

HARI SREENIVASAN: Tapscott points to retailer Walmart, which has done a pilot using a blockchain to track food safety, and manufacturer Foxconn, which is experimenting with using a blockchain to track its supply chain.

Still, this blockchain believer acknowledges it has a lot left to prove.

HARI SREENIVASAN: There’s several critics out there that kind of look at this and say, “This is like tulip mania. This cryptocurrency stuff, this is a bubble, bigger than I’ve ever seen before. There’s a bunch of people that don’t know a thing about what’s going on that just want to see something go up.”

DON TAPSCOTT: Well, for sure there’s a hype cycle that we’re into now. But the biggest impact will be that blockchain itself is going to change the fundamental operations of banks, of retail companies, of supply chains, of manufacturing companies, of governments, and of every institution in society.


Bitcoin $5,000: Currency Hits New Record High | Fortune.com – Fortune

The price of the world’s best known digital currency briefly crossed the $5,000 mark on a major index for the first time on Friday evening ET, before retreating about 5% in subsequent hours.

The idea of Bitcoin breaking the symbolic milestone of $5,000 would have been unthinkable to most people at the start of 2017, when the price topped $1,000 for the first time. If you’re keeping track, the digital currency is up 500% this year, and nearly 2,200% since mid-2015, when it was in the doldrums at around $220.

There appears to be no single reason for the recent run-up. Instead, it can likely be explained by the same factors driving this year’s cryptocurrency bull run: publicity-driven speculation; new financial products creating unprecedented liquidity; trading surges in Asian markets; and institutional investors treating digital currency as a permanent new asset class.

Meanwhile, the $5,000 milestone is likely to trigger a new round of chatter that Bitcoin and other cryptocurrencies are in a bubble and vulnerable to a major price collapse. Bitcoin has experienced a series of spectacular crashes in the past (most recently in 2014 when it dropped around 75%) but has always recovered.

Get Data Sheet, Fortune’s technology newsletter.

Finally, it should be noted that Bitcoin crossed $5,000 on an index used by the trade publication Coindesk, but not on other major indexes. This is significant because Coindesk’s BPI index includes prices from several Asian exchanges, where prices are typically higher than on US or European ones. Here’s a screenshot showing the milestone (the time shown is GMT):

A more conservative price estimate can be found on an index created by the Winklevoss twins, who are among the world’s biggest holders of Bitcoin. Known as the Winkdex, the index draws data only from U.S.-dollar-denominated exchanges. As you can see, the index’s calculation shows that Bitcoin prices approached the $5,000 mark but did not break it:

Bitcoin is not the only cryptocurrency to achieve new highs this week. Ethereum nearly crossed the $400 mark for the first time while another smaller rival, Litecoin, briefly broke through the $90 mark.

As of Saturday late morning ET, Bitcoin was trading between $4,500 and $4,600.

This is part of Fortune’s new initiative, The Ledger, a trusted news source at the intersection of tech and finance. For more on The Ledger, click here.
