
Morro Data Announces CloudNAS – StorageReview.com

August 1st, 2017 by Adam Armstrong

Today Morro Data introduced two new service offerings, CloudNAS and CloudNAS Business, that aim to replace the need for a local NAS. According to Morro Data, these new service offerings can deliver the benefits of cloud storage with the performance and reliability of local NAS, all in a cost-effective manner. Morro states that the solutions will improve user productivity while cutting operating costs by as much as 80%.

Founded by Paul Tien (founder of ReadyNAS, acquired by NETGEAR) and his core team, Morro Data is attempting to revolutionize storage for Small- and Medium-Businesses (SMBs) with simple, fast, affordable, enterprise-class cloud storage and accelerated file distribution. The heart of Morro Data's services is CacheDrive, a cloud storage gateway, and Morro Storage, built on Amazon S3. Files are stored in the cloud, cached in the gateway, and synced globally.

Aimed at the prosumer, SOHO, and SMB markets, the new solutions offer several benefits, including the elimination of maintenance, infinite capacity, and plug-and-play setup. Morro also offers enterprise-class storage features with the on-demand file performance of an on-premises NAS system, something that would cost smaller end users a premium to have in their NAS. Morro Data works with common cloud storage options, meaning users can keep the storage solutions they have.
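The CacheDrive-plus-cloud architecture described above follows the general read-through cache pattern: reads are served from a local cache when possible and otherwise fetched from the backing cloud store. Below is a minimal, hypothetical sketch of that pattern; the class and names are invented for illustration and are not Morro Data's actual implementation.

```python
# Read-through cache sketch: a local gateway caches objects that live
# in a backing (cloud) store, so repeated reads are served locally.
class GatewayCache:
    def __init__(self, backing_store):
        self.backing_store = backing_store  # stand-in for an object store like S3
        self.cache = {}                     # stand-in for the local gateway cache
        self.hits = 0
        self.misses = 0

    def read(self, key):
        if key in self.cache:               # fast path: served from the gateway
            self.hits += 1
            return self.cache[key]
        self.misses += 1
        value = self.backing_store[key]     # slow path: fetch from the cloud
        self.cache[key] = value             # keep a local copy for next time
        return value

cloud = {"report.pdf": b"file contents"}
gw = GatewayCache(cloud)
gw.read("report.pdf")  # first read misses and pulls from the cloud
gw.read("report.pdf")  # second read is a local cache hit
```

A real gateway would also handle eviction, write-back, and the global sync mentioned above, all of which this sketch omits.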

The two solutions announced are CloudNAS and CloudNAS Business.


Cloud Hosting – Cloud Web Hosting Services – Bluehost

Monthly Billing

Add resources in a simple, no-nonsense way while avoiding any surprise usage fees at the end of the month.

Already have a shared hosting account? Enjoy the power of the cloud in minutes with a seamless transition to our new platform.

Cloud Sites automatically distributes three mirrored copies of your data across multiple devices to ensure safety and protection.

Enjoy automatically accelerated web content from day one with an advanced, customized NGINX/Varnish Caching configuration.

Monitor your site's performance by tracking traffic, load speed, global reach, and more from one centralized location.

Easily identify when more resources are needed to support your website, then power up with just the click of a button.

Our experts deal with maintaining languages, security patches, port access, and any issues that may arise.

Scale up CPU and RAM at any time and without a reboot. Our OpenStack-powered backend distribution technology makes it easy.

If a hardware device falters, your site is rapidly switched over to another device to provide maximum uptime for your site.


CCS points users of major expiring frameworks towards G-Cloud – PublicTechnology.net

G-Cloud 9 launched in May with a total of 2,847 suppliers Credit: Fotolia

Users of two soon-to-expire frameworks worth a cumulative total of more than £750m are being directed towards G-Cloud, the Crown Commercial Service (CCS) has said.

In its monthly update, CCS today announced that bodies currently using the Computer-Based Testing Services vehicle should look to G-Cloud when the deal expires on 17 October. The framework, which has a spending pot of up to £400m, was primarily launched to find a company that could provide a digital service for the roughly 1.7m people in the UK who sit the Driver and Vehicle Standards Agency's driving theory test each year.

The framework also came with a requirement to digitally deliver an estimated annual total of 400,000 literacy, numeracy, and computer skills tests on behalf of the Department for Education.

The deal, which commenced its scheduled four-year term on 18 October 2013, picked learndirect Limited as its sole supplier. The Sheffield-based learning services provider is not on G-Cloud 9, the latest iteration of the government's flagship cloud framework, which went live in May. The new agreement features a total of 2,847 suppliers across three lots: Cloud Hosting; Cloud Software; and Cloud Support.

The move to push users of the testing services framework towards G-Cloud follows similar guidance from CCS in its July update regarding the Managed eMail Framework, which expires later this month.

The vehicle, which came with a budget of £350m, was put in place in 2014 for an initial term of two years. Its remit was to find suppliers to deliver managed email services to central government departments and their arms-length bodies, non-departmental public bodies, NHS organisations, and local authorities.

From 29 August, any entities procuring services through the framework should, instead, turn to G-Cloud, CCS said.

The managed email contract was split into three lots, with the same four suppliers appointed to each: BT; Accenture; General Dynamics; and CSC. The first three of these are on G-Cloud 9; CSC is not.

Since the first iteration of G-Cloud was launched in 2012, a total of about £1.7bn has been spent through the various frameworks, as of November 2016. Monthly G-Cloud spending has reached as high as £71.6m, a figure achieved in August of last year.


Buy a share in a brewery: Pozible launches equity crowd-funding site Birchal – The Sydney Morning Herald

Dan Norris wants to turn the drinkers of his beers into investors in his brewery.

The co-founder of Black Hops Brewing is keen to use equity crowd-funding platform Birchal to raise further funds for his business.

"The sort of business we are in requires a lot of money to do what we do," he says. "Any sort of expansion to build a new brewery costs in the millions of dollars. As an entrepreneur I have always struggled to get people to invest in what I'm doing, the banks are really hard work and raising capital is really difficult, you often have to be an expert yourself."

Norris wants Black Hops Brewing to be one of the early businesses listed on equity crowd-funding platform Birchal, which launched in Melbourne on Tuesday night.

The platform is being spun out of Australian rewards-based crowd-funding platform Pozible and is founded by Pozible co-founder and chief executive Alan Crabbe and former Ashurst lawyer Matt Vitale.

Birchal will be applying to the Australian Securities and Investments Commission for a newly created "crowd-funding service" licence.

From September 29 unlisted public companies with annual turnover and gross assets of up to $25 million will be able to issue shares to the public.


This will enable retail or so-called "mum and dad" investors to take equity in small businesses and Birchal aims to provide a platform for this.

Crabbe and his team have spent over a year and under $500,000, plus "a lot of sweat equity", building the platform, which will be open for early access from Wednesday and should have the first company profiles available within a month.

Crabbe says there are two major benefits for businesses undertaking equity crowd-funding.

"Firstly, you are building a very loyal customer base, and secondly, if you look at these consumer brands that are very successful, they are able to build a community of ambassadors, and that creates word of mouth, and for every consumer brand that is very valuable."

Crabbe says there is a big opportunity for consumer brands and consumer businesses in Australia to leverage global trends.


"We are seeing consumer brands are growing as fast, if not faster than, technology companies globally," he says. "I think this is a space that tends to get overlooked, and tech companies, even like ourselves, tend to get the spotlight."

Crabbe points to brewer BrewDog in the UK, which was an early adopter of the equity crowd-funding model and was recently valued at over £1 billion.

For investors, Crabbe says the attraction is getting in early.

"They get access to early-stage businesses, as generally retail investors don't get access to these companies until they are at IPO stage."

Crabbe says interested businesses can register on Birchal and set up a company profile with basic details of what the business does and the product and services it offers. Companies can create an online pitch for equity or reward-based funding.

"When the legislation comes into effect on 29 September, these companies will be able to run a campaign to either get expressions of interest to raise funds or raise a round of investment," he says.

Birchal will take a 5.75 per cent success fee on successful raises and hopes to enable investors to invest as little as $100.

"With every raise there will be a minimum you have to raise and also a time frame, in the same way Pozible operates with an all-or-nothing model," Crabbe says.

Crabbe predicts that within three years equity crowd-funding will attract $100 million a year.

But he acknowledges it's not all upside.

"Of course these businesses are typically high risk, they are usually fast-growing scale operations, and generally the founders may have less experience than some of the bigger companies," he says. "There is potentially a higher risk for investors."

Norris has just closed a funding round of $400,000 to expand Black Hops Brewing and says equity crowd funding would have provided a great source for this money if it was available.

Black Hops Brewing is three years old, with five full-time staff and turnover of around $1 million a year.

"We have people coming to the bar and asking about it, they come in and want to invest but they don't have hundreds of thousands of dollars so this sounds perfect," says Norris. "We are at capacity at the moment, all the tanks are full and we have to expand to keep growing. I think it's a model that's going to be really interesting in Australia."


Cloud Computing in Industrial IoT: Market for Cloud support of IIoT by Software, Platforms, Infrastructure (SaaS … – Business Wire (press release)

DUBLIN--(BUSINESS WIRE)--The "Cloud Computing in Industrial IoT: Market for Cloud support of IIoT by Software, Platforms, Infrastructure (SaaS, PaaS, and IaaS) and Outlook for Centralized Cloud and Fog Computing for IIoT Devices and Objects/Things 2017 - 2022" report has been added to Research and Markets' offering.

Cloud Computing is moving beyond the consumer and enterprise markets into support for manufacturing and industrial automation across other industry verticals. The Industrial Internet of Things (IIoT) represents a substantial opportunity both for the centralized cloud as-a-service model for software, platforms, and infrastructure, and for distributed computing, often referred to as Fog Computing, wherein IIoT edge computing will provide industry with real-time processing and analytics.

This research evaluates the technologies, players, and solutions relied upon for Cloud Computing in IIoT. The report analyzes the impact of SaaS, PaaS, and IaaS upon IIoT as well as Cloud Computing software, platforms, and infrastructure in support of edge computing. The report also assesses market opportunities for Cloud Computing support of IIoT Devices and the Objects/Things that will be monitored, actuated, and controlled through IoT enabled processes. The report includes detailed forecasts for the global and regional outlook as well as by industry vertical, devices, and objects/things from 2017 to 2022.


Key Topics Covered:

1 Overview

2 IIoT Cloud Computing Ecosystem

3 Industrial IoT Cloud Computing Market

4 IIoT Cloud Connected Devices/Things Forecasts

5 Company Analysis

For more information about this report visit https://www.researchandmarkets.com/research/93wpcp/cloud_computing


Cloud Computing Market in Latin America 2016-2020 – Key vendors are Amazon Web Services (AWS), Microsoft, IBM … – Business Wire (press release)

DUBLIN--(BUSINESS WIRE)--The "Cloud Computing Market in Latin America 2016-2020" report has been added to Research and Markets' offering.

The analysts forecast that the cloud computing market in Latin America will grow at a CAGR of 10.16% during the period 2016-2020.
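As a quick sanity check on what that figure implies, compounding 10.16% over the four annual steps of the 2016-2020 window gives the overall growth multiplier:

```python
# Compound annual growth rate (CAGR): total growth over n years is (1 + r)**n.
cagr = 0.1016
years = 4  # 2016 -> 2020
multiplier = (1 + cagr) ** years
print(round(multiplier, 3))  # 1.473, i.e. roughly 47% total growth over the period
```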

The report, "Cloud Computing Market in Latin America 2016-2020", has been prepared based on an in-depth market analysis with inputs from industry experts. The report covers the market landscape and its growth prospects over the coming years. It also includes a discussion of the key vendors operating in this market.

One of the latest trends in the market is the growing momentum of cloud brokerage services. Cloud service brokers act as intermediaries between cloud service providers and businesses that integrate their infrastructures with cloud-computing platforms.

These brokers provide management and maintenance services to enterprises. They understand the enterprise requirements and provide them with a set of vendors that meet their current requirements and, to some extent, future needs as well.

Cloud brokers also provide consulting services and assistance in selecting the right vendor. After procuring solutions, cloud brokers assist clients in integrating applications with their existing computing infrastructures.

Along with installation and maintenance, cloud service brokers also deliver deduplication, security, and data protection services to enterprises.


Key Topics Covered:

Part 01: Executive summary

Part 02: Scope of the report

Part 03: Market research methodology

Part 04: Introduction

Part 05: Market landscape

Part 06: Market segmentation by services

Part 07: Market segmentation by IT deployment model

Part 08: Geographical segmentation

Part 09: Key leading countries

Part 10: Market drivers

Part 11: Impact of drivers

Part 12: Market challenges

Part 13: Impact of drivers and challenges

Part 14: Market trends

Part 15: Vendor landscape

Part 16: Key vendor profile

For more information about this report visit https://www.researchandmarkets.com/research/nb8smj/cloud_computing


Joining Apple, Amazon’s China Cloud Service Bows to Censors – New York Times

The move came at roughly the same time that Apple said it took down a number of apps from its China app store that help users vault the Great Firewall. Those apps helped users connect to the rest of the internet world using technology called virtual private networks, or VPNs.

Taken together, the recent moves by Apple and Amazon show how Beijing is increasingly forcing America's biggest tech companies to play by Chinese rules if they want to maintain access to the market. The push comes even as the number of foreign tech companies able to operate and compete in China has dwindled.

Beijing has become increasingly emboldened in pushing America's internet giants to follow its local internet laws, which forbid unregistered censorship-evasion software. Analysts say the government has been more aggressive in pressuring companies to make concessions following the passage of a new cybersecurity law, which went into effect June 1, and ahead of a sensitive Communist Party conclave set for late autumn.

The government has been intent on tightening controls domestically as well. It recently shut down a number of Chinese-run VPNs. New rules posted to government websites in recent days said Communist Party members can be punished for viewing illegal sites and that they must register all foreign or local social media accounts.

Also in response to the new law, Apple said it planned to open a new data center in China and store user data there.

Ms. Wang, who said that Sinnet handles Amazon Web Services' operations across China, said that the company had sent letters warning users about such services in the past but that the government had previously been more focused on other issues.

Amazon Web Services allows companies small and large to lease computing power instead of running their websites or other online services through their own hardware and software. Because Amazon's cloud services allow customers to lease servers in China, they could be used to give Chinese internet users access to various types of software that would help them get around the Great Firewall.

Keeping in line with censorship rules is only a part of it. In cloud computing, China requires that foreign companies have a local partner and restricts them from owning a controlling stake in any cloud company. Newly proposed laws, which have drawn complaints of protectionism from American politicians, further restrict the companies from using their own brand and call for them to terminate and report any behavior that violates China's laws.

While Microsoft and Amazon both run cloud services in China, similar ones run by local Chinese internet rivals dwarf them in scale. In particular, Chinese e-commerce giant Alibaba runs its own cloud services, which have grown rapidly in China. In order to operate in the country, China's biggest internet companies must stay in close contact with the government and carry out Beijing's various demands, whether a request for user data or a demand to censor various topics.

While China is not a major market for Amazon, the company has been in the country for a long time and has been pushing its cloud computing services there. The company also recently announced a partnership with the state-run telecom China Mobile to create a Kindle, the company's e-reader device, aimed at the local Chinese market.


Physicists Take Big Step Towards Quantum Computing and … – Universe Today

Quantum entanglement remains one of the most challenging fields of study for modern physicists. Described by Einstein as "spooky action at a distance," it is an aspect of quantum mechanics that scientists have long sought to reconcile with classical mechanics. Essentially, the fact that two particles can be connected over great distances violates the rules of locality and realism.

Formally, this is a violation of Bell's Inequality, a theorem that has been used for decades to test whether locality and realism can be reconciled with quantum mechanics. In a recent study, a team of researchers from the Ludwig-Maximilian University (LMU) and the Max Planck Institute for Quantum Optics in Munich conducted tests which once again violate Bell's Inequality and confirm the existence of entanglement.

Their study, titled "Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes", was recently published in Physical Review Letters. Led by Wenjamin Rosenfeld, a physicist at LMU and the Max Planck Institute for Quantum Optics, the team sought to test Bell's Inequality by entangling two particles at a distance.

John Bell, the Irish physicist who devised a test to show that nature does not hide variables as Einstein had proposed. Credit: CERN

Bell's Inequality (named after the Irish physicist John Bell, who proposed it in 1964) essentially rests on two assumptions: properties of objects exist independent of being observed (realism), and no information or physical influence can propagate faster than the speed of light (locality). These rules perfectly describe the reality we human beings experience on a daily basis, where things are rooted in a particular space and time and exist independent of an observer.

However, at the quantum level, things do not appear to follow these rules. Not only can particles be connected in non-local ways over large distances (i.e. entanglement), but the properties of these particles cannot be defined until they are measured. And while all experiments have confirmed that the predictions of quantum mechanics are correct, some scientists have continued to argue that there are loopholes that allow for local realism.

To address this, the Munich team conducted an experiment using two laboratories at LMU. While the first lab was located in the basement of the physics department, the second was located in the basement of the economics department, roughly 400 meters away. In both labs, teams captured a single rubidium atom in an optical trap and then excited it until it released a single photon.

As Dr. Wenjamin Rosenfeld explained in a Max Planck Institute press release:

"Our two observer stations are independently operated and are equipped with their own laser and control systems. Because of the 400 meters distance between the laboratories, communication from one to the other would take 1328 nanoseconds, which is much more than the duration of the measurement process. So, no information on the measurement in one lab can be used in the other lab. That's how we close the locality loophole."

The experiment was performed in two locations 398 meters apart at the Ludwig Maximilian University campus in Munich, Germany. Credit: Rosenfeld et al/American Physical Society
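The locality argument rests on simple arithmetic: even a light-speed signal cannot cross the separation between the labs within the duration of the measurement. A quick check of the quoted figure:

```python
# Light travel time across the 398 m separating the two laboratories.
c = 299_792_458          # speed of light in m/s
distance_m = 398         # separation between the labs
travel_time_ns = distance_m / c * 1e9
print(round(travel_time_ns))  # 1328 ns, matching the figure quoted by Rosenfeld
```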

Once the two rubidium atoms were excited to the point of releasing a photon, the spin states of the rubidium atoms and the polarization states of the photons were effectively entangled. The photons were then coupled into optical fibers and guided to a set-up where they were brought to interference. After conducting a measurement run for eight days, the scientists were able to collect around 10,000 events to check for signs of entanglement.

This would have been indicated by the spins of the two trapped rubidium atoms, which would be pointing in the same direction (or in the opposite direction, depending on the kind of entanglement). What the Munich team found was that, for the vast majority of the events, the atoms were in the same state (or in the opposite state), and that there were only six deviations consistent with Bell's Inequality.

These results were also statistically more significant than those obtained by a team of Dutch physicists in 2015. For that study, the Dutch team conducted experiments using electrons in diamonds at labs that were 1.3 km apart. In the end, their results (and other recent tests of Bell's Inequality) demonstrated that quantum entanglement is real, effectively closing the local realism loophole.

As Wenjamin Rosenfeld explained, the tests conducted by his team also went beyond these other experiments by addressing another major issue. "We were able to determine the spin-state of the atoms very fast and very efficiently," he said. "Thereby we closed a second potential loophole: the assumption that the observed violation is caused by an incomplete sample of detected atom pairs."

By obtaining proof of the violation of Bell's Inequality, scientists are not only helping to resolve an enduring incongruity between classical and quantum physics. They are also opening the door to some exciting possibilities. For instance, for years, scientists have anticipated the development of quantum processors, which rely on entanglement to go beyond the zeros and ones of binary code.

Computers that rely on quantum mechanics would be exponentially faster than conventional microprocessors and would usher in a new age of research and development. The same principles have been proposed for cybersecurity, where quantum encryption would be used to encipher information, making it invulnerable to hackers who rely on conventional computers.

Last, but certainly not least, there is the concept of quantum entanglement communications, which some hope could one day allow us to transmit information faster than the speed of light, although the no-communication theorem holds that entanglement alone cannot be used to do so. Imagine the possibilities for space travel and exploration if we were no longer bound by the limits of relativistic communication!

Einstein wasn't wrong when he characterized quantum entanglement as "spooky action." Indeed, many of the implications of this phenomenon are still as frightening as they are fascinating to physicists. But the closer we come to understanding it, the closer we will be to developing an understanding of how all the known physical forces of the Universe fit together, a.k.a. a Theory of Everything!

Further Reading: LMU, Physical Review Letters

By Matt Williams- Matt Williams is the Curator of Universe Today's Guide to Space. He is also a freelance writer, a science fiction author and a Taekwon-Do instructor. He lives with his family on Vancouver Island in beautiful British Columbia.



Clarifying complex chemical processes with quantum computers – Phys.Org

Future quantum computers will be able to calculate the reaction mechanism of the enzyme nitrogenase. The image shows the active centre of the enzyme and a mathematical formula that is central for the calculation. Credit: Visualisations: ETH Zurich

Science and the IT industry have high hopes for quantum computing, but descriptions of possible applications tend to be vague. Researchers at ETH Zurich have now come up with a concrete example that demonstrates what quantum computers will actually be able to achieve in the future.

Specialists expect nothing less than a technological revolution from quantum computers, which they hope will soon allow them to solve problems that are currently too complex for classical supercomputers. Commonly discussed areas of application include data encryption and decryption, as well as special problems in the fields of physics, quantum chemistry and materials research.

But when it comes to concrete questions that only quantum computers can answer, experts have remained relatively vague. Researchers from ETH Zurich and Microsoft Research are now presenting a specific application for the first time in the scientific journal PNAS: evaluating a complex chemical reaction. Based on this example, the scientists show that quantum computers can indeed deliver scientifically relevant results.

A team of researchers led by ETH professors Markus Reiher and Matthias Troyer used simulations to demonstrate how a complex chemical reaction could be calculated with the help of a quantum computer. To accomplish this, the quantum computer must be of a "moderate size", says Matthias Troyer, who is Professor for Computational Physics at ETH Zurich and currently works for Microsoft. The mechanism of this reaction would be nearly impossible to assess with a classical supercomputer alone, especially if the results are to be sufficiently precise.

One of the most complex enzymes

The researchers chose a particularly complex biochemical reaction as the example for their study: thanks to a special enzyme known as nitrogenase, certain microorganisms are able to split atmospheric nitrogen molecules in order to create chemical compounds with single nitrogen atoms. It is still unknown how exactly the nitrogenase reaction works. "This is one of the greatest unsolved mysteries in chemistry," says Markus Reiher, Professor for Theoretical Chemistry at ETH Zurich.

Computers that are available today are able to calculate the behaviour of simple molecules quite precisely. However, this is nearly impossible for the nitrogenase enzyme and its active centre, which is simply too complex, explains Reiher.

In this context, complexity is a reflection of how many electrons interact with each other within the molecule over relatively long distances. The more electrons a researcher needs to take into account, the more sophisticated the computations. "Existing methods and classical supercomputers can be used to assess molecules with about 50 strongly interacting electrons at most," says Reiher. However, there is a significantly greater number of such electrons at the active centre of a nitrogenase enzyme. Because with classical computers the effort required to evaluate a molecule doubles with each additional electron, an unrealistic amount of computational power is needed.

Another computer architecture

As demonstrated by the ETH researchers, hypothetical quantum computers with just 100 to 200 quantum bits (qubits) will potentially be able to compute complex subproblems within a few days. The results of these computations could then be used to determine the reaction mechanism of nitrogenase step by step.

That quantum computers are capable of solving such challenging tasks at all is partially the result of the fact that they are structured differently to classical computers. Rather than requiring twice as many bits to assess each additional electron, quantum computers simply need one more qubit.
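The scaling claim above can be made concrete. A full classical description of n strongly interacting two-level systems stores on the order of 2**n complex amplitudes, so memory doubles with each additional electron, whereas a quantum computer needs only about one extra qubit. A rough sketch, assuming 16 bytes per complex amplitude:

```python
# Memory needed for a full classical state vector of n two-level systems.
def state_vector_bytes(n, bytes_per_amplitude=16):  # complex128 amplitudes
    return (2 ** n) * bytes_per_amplitude

print(state_vector_bytes(50) / 1e15)  # ~18 petabytes for 50 electrons
print(state_vector_bytes(51) / 1e15)  # ~36 petabytes: one more electron doubles it
```

This doubling is why the article puts the practical classical limit at roughly 50 strongly interacting electrons.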

However, it remains to be seen when such "moderately large" quantum computers will be available. The currently existing experimental quantum computers use on the order of 20 rudimentary qubits. "It will take at least another five years, or more likely ten, before we have quantum computers with processors of more than 100 high-quality qubits," estimates Reiher.

Mass production and networking

Researchers emphasise the fact that quantum computers cannot handle all tasks, so they will serve as a supplement to classical computers, rather than replacing them. "The future will be shaped by the interplay between classical computers and quantum computers," says Troyer.

With regard to the nitrogenase reaction, quantum computers will be able to calculate how the electrons are distributed within a specific molecular structure. However, classical computers will still need to tell quantum computers which structures are of particular interest and should therefore be calculated. "Quantum computers need to be thought of more like a co-processor capable of taking over particular tasks from classical computers, thus allowing them to become more efficient," says Reiher.

Explaining the mechanism of the nitrogenase reaction will also require more than just information about the electron distribution in a single molecular structure; indeed, this distribution needs to be determined in thousands of structures. Each computation takes several days. "In order for quantum computers to be of use in solving these kinds of problems, they will first need to be mass produced, thereby allowing computations to take place on multiple computers at the same time," says Troyer.


More information: Markus Reiher et al. Elucidating reaction mechanisms on quantum computers, Proceedings of the National Academy of Sciences (2017). DOI: 10.1073/pnas.1619152114


When Will Quantum Computers Be Consumer Products? – Futurism

In Brief: Quantum computers are rapidly developing, but when will we be able to add one to our Christmas lists? Here is a timeline for when you can expect to see quantum computers on the shelves of your local tech store.

Quantum computers are making an entrance, and it's a dramatic one. Even in its infancy, the technology is outperforming the conventional competition and is expected to make the field of cryptography as we know it obsolete. Quantum computing has the potential to revolutionize several sectors, including the financial and medical industries.

Quantum computers can process a greater number of calculations because they rely on quantum bits (qubits), which can be ones and zeros simultaneously, unlike classical bits, which must be either a one or a zero. The company D-Wave is releasing a version of a quantum computer this year, but it's not a fully formed embodiment of this technology. So we asked our readers: when should we expect to see quantum computers available as consumer products?
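The "ones and zeros simultaneously" idea can be illustrated with a toy single-qubit calculation. This is a pedagogical sketch, not how real quantum hardware (or D-Wave's machines) works: a qubit's state is a pair of amplitudes for 0 and 1, and a Hadamard gate turns a definite 0 into an equal superposition of both.

```python
import math

# A qubit state is a pair of amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)            # a definite "0", like a classical bit
superposed = hadamard(zero)  # equal superposition of 0 and 1
probs = (superposed[0] ** 2, superposed[1] ** 2)
print(probs)  # both outcomes have probability 0.5 (up to float rounding)
```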

Almost 80 percent of respondents believed we will be able to buy our own quantum computer before 2050, and the decade that received the most votes, about 34 percent, was the 2030s. Respondent Solomon Duffin explained why his prediction, the 2040s, was slightly more pessimistic than those of the majority.

In the 2020s, we will have quantum computers that are significantly better than supercomputers today, but they most likely won't be in mass use by governments and companies until the 2030s. Eventually, toward the end of the 2030s and early 2040s, they'll shrink down to a size and cost viable for consumer use. Before that point, even with the exponential growth of technology, I don't think that it would be cost-efficient enough for the average consumer to replace regular computing with quantum computing.

Quantum computers are indeed currently out of the price range of the average consumer, and will likely stay that way for a few years at least. The $15 million price tag for the D-Wave 2000Q has a long way to drop before it makes it to a Black Friday sale.

But the technology is rapidly advancing, and experts are optimistic that we will soon see a bona fide, functioning quantum computer in all of its glory. In fact, an international team of researchers wrote in a study published in Physical Review, "Recent improvements in the control of quantum systems make it seem feasible to finally build a quantum computer within a decade."

Andrew Dzurak, Professor in Nanoelectronics at the University of New South Wales, said in an interview with CIO that he hopes quantum computers will be able to advance scientific research, for example by simulating what potential drugs would do in the human body. However, Dzurak said he expects it will take 20 years for quantum computers to be useful enough for that kind of application.

"I think that within ten years, there will be demonstrations of modelling of certain chemicals and drugs that couldn't be done today, but I don't think there will be a convenient, routine [system] that [people] can use," Dzurak said in the interview. "To move to that stage will take another decade further beyond that."

Dzurak also expressed his doubts that quantum computers will be very useful to the average consumer, since they can get most of what they want using conventional computers. But D-Wave international president Bo Ewald thinks that's just because we haven't imagined what we could do with the technology yet. This is why D-Wave has released a new software tool to help developers make programs for the company's computers.

D-Wave is driving the hardware forward, Ewald said in an interview with Wired. But we need more smart people thinking about applications, and another set thinking about software tools.

