
Cloud Storage Topics – SearchCloudStorage

The Cloud Backup and Recovery topic section offers resources and best practices for data storage professionals who are mapping a disaster recovery strategy using cloud storage and cloud services. Get the latest news and tips on cloud disaster recovery, and find out how cloud storage impacts restore times, business continuity and the organization of sensitive data. Cloud-based disaster recovery can bring benefits to your IT shop, but should be considered in the larger context of a long-term DR plan. More about Cloud Disaster Recovery

The Cloud Storage Backup and Recovery topic section offers resources and best practices for data storage professionals who are mapping a disaster recovery strategy using cloud storage and cloud services. Get the latest news and tips on cloud disaster recovery, and find out how cloud storage impacts restore times, business continuity and the organization of sensitive data. Cloud-based disaster recovery can bring benefits to your IT shop, but should be considered in the larger context of a long-term DR plan. More about Cloud Storage Backup

The Cloud Storage Management and Standards topic section offers comprehensive resources for data storage professionals looking for information on cloud storage management and cloud standards. Find out how to properly manage your data in the cloud and read our expert advice and guidance on cloud management. Get the latest news and tips on cloud storage strategies and what steps to take for better data management in the cloud. Read our essential guides, special reports and tutorials to keep up with the latest trends in cloud storage management, and see what tips and improvements you can apply to your own management strategy. More about Cloud Management

The Hybrid Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on hybrid storage clouds. Learn about the pros and cons of hybrid cloud storage and read our expert advice and guidance on hybrid clouds. Get the latest news and tips on using a hybrid cloud model in your environment and what types of environments are suited for hybrid storage clouds. Read our essential guides, special reports and tutorials to catch up on the most recent developments and advances in the hybrid cloud arena, and see what tips and improvements you can apply to your own hybrid cloud. More about Hybrid Cloud

The Private Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on private clouds. Learn about the pros and cons of private cloud storage and read our expert advice and guidance on private cloud infrastructure. Get the latest news and tips on private cloud providers, private cloud services, plus implementation tips, costs and private cloud management advice. Read our essential guides, special reports and tutorials to catch up on the most recent developments in the private cloud storage space, and see what tips and improvements you can apply to your own private cloud environment. More about Private Cloud

The Public Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on public clouds. Learn about the pros and cons of public cloud storage and read our expert advice and guidance on public clouds. Get the latest news and tips on public cloud services, public cloud providers, plus costs, implementation tips and common problems storage managers run into when using public clouds. Learn about the best practices regarding public clouds. Read our essential guides, special reports and tutorials to catch up on the most recent developments in the public cloud storage space, and see what tips and improvements you can apply to your own public cloud. More about Public Cloud

View original post here:
Cloud Storage Topics - SearchCloudStorage

Read More..

Cloud Storage – 2017 News, Articles and Trends – Tom’s IT Pro


January 30, 2017 in Best Picks

Before settling on a cloud backup option for your business, check out our buying advice and top picks, which include Amazon S3, Code 42 CrashPlan, OpenDrive, Microsoft Azure and more. Read More

October 13, 2016 in News

PDF signing, sharing through iMessage, viewing files without unlocking are all new business features of Dropbox for iOS. Find out what else is news. Read More

September 2, 2015 in Reviews

eFolder's Anchor is a unique EFSS option targeted at MSPs and VARs that offers customers the choice between cloud and on-premises storage, flexible security options and a customizable user interface. Read More

August 11, 2015 in Reviews

Egnyte is one of the few enterprise file sync and share vendors with a true hybrid cloud strategy, bolstered by veteran thought leadership, flexible UI options and a strong focus on security, encryption, and auditing and reporting capabilities. Read More

June 25, 2015 in News

Red Hat announced the updated Red Hat Ceph Storage 1.3 and Red Hat Gluster Storage 3.1 software-defined storage solutions. The new versions offer improved performance and data integrity for petabyte-scale deployments. Read More

June 18, 2015 in Reviews

Box Enterprise is more than just a simple cloud file sync and share tool. With a strong focus on security, integration and consistent cross-platform usability, Box has a lot to offer business customers and more to come in the near future. Read More

June 4, 2015 in Reviews

Citrix's ShareFile is a strong enterprise file sync and share (EFSS) solution that offers plenty of security controls and can leverage both cloud and on-premises storage. It's an especially attractive option for current Citrix customers. Read More

Read more:
Cloud Storage - 2017 News, Articles and Trends - Tom's IT Pro

Read More..

Cloud Storage Pricing Comparison: AWS, Azure, Google, and B2

So why do some vendors make it so hard to get information about how much you're storing and how much you're being charged?

Cloud storage is fast becoming the central repository for mission critical information, irreplaceable memories, and in some cases entire corporate and personal histories. Given this responsibility, we believe cloud storage vendors have an obligation to be as transparent as possible in how they interact with their customers.

In that light, we decided to challenge four cloud storage vendors and ask two simple questions: how much data are we storing with you, and how much are we being charged for it?

The detailed results are below, but if you wish to skip the details and the screen captures (TL;DR), we've summarized the results in the table below.

Our challenge was to upload 1 terabyte of data, store it for one month, and then download it.

Cloud Storage Test Details

For our tests, we chose Backblaze B2, Microsoft's Azure, Amazon's S3, and Google Cloud Storage. Our idea was simple: Upload 1 TB of data to the comparable service for each vendor, store it for 1 month, download that 1 TB, then document and share the results.

Let's start with the most obvious observation: the cost charged by each vendor for the test.

Later in this post, we'll see if we can determine the different cost components (storage, downloading, transactions, etc.) for each vendor, but our first step is to see if we can determine how much data we stored. In some cases, the answer is not as obvious as it would seem.

At the core, a provider of a service ought to be able to tell a customer how much of the service he or she is using. In this case, one might assume that providers of cloud storage would be able to tell customers how much data is being stored at any given moment. It turns out, it's not that simple.

Backblaze B2

Logging into a Backblaze B2 account, one is presented with a summary screen that displays all buckets. Each bucket displays key summary information, including data currently stored.

Clicking into a given bucket, one can browse individual files. Each file displays its size, and multiple files can be selected to create a size summary.

Summary: Accurate, intuitive display of storage information.

Microsoft Azure

Moving on to Microsoft's Azure, things get a little more exciting. There was no area that we could find where one can determine the total amount of data, in GB, stored with Azure.

There's an area entitled "Usage," but that wasn't helpful.

We then moved on to "Overview," but ran into a couple of challenges. The first issue was that we were presented with KiB (kibibytes) as the unit of measure. One GB (the unit used in Azure's pricing table) equates to roughly 976,563 KiB. It struck us as odd that storage would be summarized in a unit of measure different from the billing unit of measure.

Summary: Storage is being measured in KiB, but is billed by the GB. Even with a calculator, it is unclear how much storage we are using.
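For readers who want to check the conversion themselves, here is a minimal Python sketch; the KiB figure below is a hypothetical example, not the number from our console:

# Sanity-check conversion: Azure reports KiB, but its pricing table uses GB.
KIB = 1024                    # bytes per kibibyte
GB = 1_000_000_000            # bytes per gigabyte (decimal, as priced)

reported_kib = 1_048_576_000  # hypothetical value read from the Azure console
stored_gb = reported_kib * KIB / GB
print(f"{reported_kib:,} KiB is about {stored_gb:,.1f} GB")
# -> 1,048,576,000 KiB is about 1,073.7 GB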

Amazon S3

Next we checked on the data we were storing in S3. We again ran into problems.

In the bucket overview, we were able to identify our buckets. However, we could not tell how much data was being stored.

Drilling into a bucket, the detail view does tell us file size. However, there was no method for summarizing the data stored within that bucket or for multiple files.

Summary: Incomplete. From the file browsing user interface, while summaries of folders can be found, there is no reasonable way to understand how much data is being globally stored.

Google Cloud Storage (GCS)

GCS proved to have its own quirks, as well.

One can easily find the bucket summary; however, it does not provide information on data stored.

Clicking into the bucket, one can see files and the size of an individual file. However, there is no way to see a total for the data stored.

Summary: Incomplete. From the file browsing user interface, there is no reasonable way to understand how much data is being stored.

Test 1 Conclusions

We knew how much storage we were uploading and, in many cases, the user will have some sense of the amount of data they are uploading. However, it strikes us as odd that many vendors won't tell you how much data you have stored. Even stranger are the vendors that provide reporting in a unit of measure that is different from the units in their pricing table.

The cloud storage industry has done itself no favors with its tiered pricing that requires a calculator to figure out what's going on. Setting that aside for a moment, one would presume that bills would be created in clear, auditable ways.

Backblaze

Inside the Backblaze user interface, one finds a navigation link entitled "Billing." Clicking on that, the user is presented with line items for previous bills, payments, and an estimate for the upcoming charges.

One can expand any given row to see the line item transactions composing each bill.

Here's more detail.

Summary: Available on demand, and the site clearly defines what has been and what will be charged.

Azure

Trying to understand the Azure billing proved to be a bit tricky.

On August 6th, we logged into the billing console and were presented with this screen.

As you can see, on Aug 6th, billing for the period of May-June was not available for download. For the period ending June 26th, we were charged nearly a month later, on July 24th. Clicking into that row item does display line item information.

Summary: Available, but difficult to find. The nearly 30 day lag in billing creates business and accounting challenges.

Amazon S3

Amazon presents a clean billing summary and enables users to drill down into line items.

Going to the billing area of AWS, one can survey various monthly bills and is presented with a clean summary of billing charges.

Expanding into the billing detail, Amazon articulates each line item charge. Within each line item, charges are broken out into sub-line items for the different tiers of pricing.

Summary: Available on demand. While there are some line items that seem unnecessary for our test, the bill is generally straightforward to understand.

Google Cloud Storage (GCS)

This was an area where the GCS User Interface, which was otherwise relatively intuitive, became confusing.

Going to the Billing Overview page did not offer much in the way of an overview on charges.

However, moving down to the Transactions section did provide line item detail on all the charges incurred. But similar to Azure's use of KiB, Google introduces the equally confusing gibibyte (GiB). While all of Google's pricing tables are listed in terms of GB, the line items reference GiB; 1 GiB is about 1.074 GB.

Summary: Available, but provides descriptions in units that are not on the pricing table nor commonly used.
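The same kind of arithmetic applies here; a minimal sketch (the GiB value is a hypothetical example, not a line item from our actual bill):

# Sanity-check conversion: GCS line items are in GiB, but the pricing table uses GB.
GIB_TO_GB = 1.073741824       # 1 GiB = 2**30 bytes = 1.073741824 GB

line_item_gib = 931.32        # hypothetical GiB figure from a billing line item
print(f"{line_item_gib} GiB is about {line_item_gib * GIB_TO_GB:.1f} GB")
# -> 931.32 GiB is about 1000.0 GB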

Test 2 Conclusions

Clearly, some vendors do a better job than others in making their pricing available and understandable. From a transparency standpoint, it's difficult to justify why a vendor would publish its pricing table in units of X, but then put units of Y in the user interface.

Transparency: The Backblaze Way

Transparency isn't easy. At Backblaze, we believe in investing time and energy into presenting the most intuitive user interfaces that we can create. We take pride in our heritage in the consumer backup space; serving consumers has taught us how to make things understandable and usable. We do our best to apply those lessons to everything we do.

This philosophy reflects our desire to make our products usable, but it's also part of a larger ethos of being transparent with our customers. We are being trusted with precious data. We want to repay that trust with, among other things, transparency.

It's that spirit that was behind the decision to publish our hard drive performance stats, to open source the infrastructure that is behind us having the lowest cost of storage in the industry, and also to open source our erasure coding (the math that drives a significant portion of our redundancy for your data).

Why? We believe it's not just about a good user interface; it's about the relationship we want to build with our customers.

Ahin enjoys writing in the third person, cookies (digital or baked), and the new Chris Stapleton album.

Read more:
Cloud Storage Pricing Comparison: AWS, Azure, Google, and B2

Read More..

Altcoin Exchange, the Decentralized Cryptocurrency Exchange …

SAN DIEGO, Oct. 15, 2017 /PRNewswire/ -- Altcoin Exchange, the decentralized cryptocurrency exchange, is changing its name to Altcoin.io. This marks the beginning of a new era of altcoin trading that promises the cryptocurrency community complete security of their coins and a safer way to trade.

Altcoin.io wants to empower everyone, regardless of market knowledge, to trade altcoins securely and with confidence. This currently isn't possible with centralized exchanges. They're vulnerable to theft, and require you to relinquish control of your coins in order to trade.

Since June 2011, there have been 26 known centralized exchange hacks involving the loss of nearly $1,000,000,000 in user funds. Until now, the market has failed to respond with a robust solution, and Altcoin.io's goal is to solve this problem.

Altcoin.io aims to be the first decentralized exchange with an unparalleled focus on user experience, security, customer support, and design. By collaborating with the trading community, Altcoin.io will create a safe, trustless platform that enables users to make fast trades without worrying if they're exposing themselves to risk.

Andrew Gazdecki, CEO and founder of Altcoin.io, says, "Our new name is about setting the standard for how we move forward. With this company rebranding, we make it clear our goal is going beyond what already exists to make something better: a decentralized altcoin exchange where you can trade anonymously and securely, and stay in complete control of your funds."

By eschewing the centralized model, Altcoin.io lets traders retain full control of their coins so they can exchange with confidence. There's no single point of failure, no central repository for hackers to exploit, and full transparency in every transaction.

"The centralized exchange model is broken," says Andrew. "As altcoins continue their meteoric rise in popularity, it's more important than ever to establish a secure and trustless exchange but with the trading community involved. Altcoin.io is being built by traders, for traders."

Last week, Altcoin.io completed the world's first Atomic Swap between the Ethereum and Bitcoin blockchains (https://medium.com/@altcoinexchnge/the-first-ethereum-bitcoin-atomic-swap-79befb8373a8). Building on Decred's successful swap of Decred for Litecoin, Altcoin.io transferred 0.1245 Ethereum to 0.12345 Bitcoin without first passing ownership to a third party. Atomic Swaps are the key to creating secure decentralized trades that transact as quickly as centralized ones, and this milestone puts Altcoin.io on track for a community release in early 2018.

News of the swap has buoyed Altcoin.io's growing community and was enthusiastically covered by Bitcoin.com, CoinDesk, The Merkle, Cryptovest and others. While there are still some kinks to iron out, such as privacy, options, and order matching, Altcoin.io is confident this is just the beginning of better things in the cryptocurrency community. Altcoin.io hopes to shake up this dynamic market by giving traders what they've always wanted: a safe way to exchange and trade digital assets.

Related Files

altcoin.io.png

ethereum-swap.gif

Related Images

image1.png Altcoin.io Logo

image2.png Altcoin.io Decentralized Exchange

Related Links

="https://medium.com/@altcoinexchnge/altcoin-exchanges-purpose-mission-values-and-value-proposition-7d525303c005" rel="nofollow" target="_blank">Altcoin Exchange's purpose, mission, values and value proposition.

="https://medium.com/@altcoinexchnge/for-cryptocurrency-trading-to-flourish-we-need-a-truly-decentralized-exchange-6f2401e20323" rel="nofollow" target="_blank">For Cryptocurrency Trading To Flourish, We Need A Truly Decentralized Exchange.

View original content with multimedia: http://www.prnewswire.com/news-releases/altcoin-exchange-the-decentralized-cryptocurrency-exchange-rebrands-to-altcoinio-300536914.html

SOURCE Altcoin.io

Read more here:
Altcoin Exchange, the Decentralized Cryptocurrency Exchange ...

Read More..

Security Awareness – Encryption | Office of Information …

Overview

Encryption is the transformation of information into a form that is only readable by those with particular knowledge or technology, to prevent others who might have access to the information from reading it. It has long been used for messages in transit, whether carried by hand, transmitted via radio or sent over a computer network; if the message is intercepted, the interceptor would be unable to interpret the information. It also serves an important role for stored information, protecting it in case of loss or theft.

While the concepts and processes of encryption greatly pre-date modern computing, the topic has become increasingly popular in computing over the past few years. This has largely been fueled by the vast increase of information transfer over computer networks and the increased security concerns that accompany a massively interconnected "always online" computing environment.

OIT offers and supports PGP software and licenses to faculty and staff for whole disk encryption. Whole disk encryption will keep educational records and confidential data secure in case your laptop is lost or stolen. This information should only be stored on a mobile device, like a laptop, when there is a specific business purpose. Find out if PGP whole disk encryption is right for you.

If we had a number we wished to keep secret (say the combination to a safe), one option to protect it is to encrypt the number; after all, we can't store the combination to the safe inside the safe. Let's say the combination is 12-28-11, which we shorten to just 122811. Let's use some simple math to turn it into a scrambled number.

Here's an equation that adds a secret number (n) to the combination and then multiplies the result by the same secret number:

n * (combination + n) = scrambled number

If we pick 5 as our secret number, then we get:

5 * (122811 + 5) = 614080

Our scrambled number, 614080, is an encrypted version of our safe combination. To get our combination number back, we need to know our secret number and the formula used to create the scrambled number. Here's the formula:

(scrambled number / n) - n = combination

We insert our secret number and our scrambled number:

(614080 / 5) - 5 = combination

And solve the equation to find our combination:

122816 - 5 = 122811

We have successfully developed our own encryption process for our safe combination.
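For the programmatically inclined, the same toy scheme can be written as a short Python sketch (illustration only; this is not real encryption):

# Toy example only -- this is NOT real encryption.
# Key: the secret number n. Algorithm: n * (combination + n) = scrambled number.
def encrypt(combination: int, n: int) -> int:
    return n * (combination + n)

def decrypt(scrambled: int, n: int) -> int:
    return scrambled // n - n

combination = 122811              # the safe combination 12-28-11
n = 5                             # our secret number (the key)

scrambled = encrypt(combination, n)
print(scrambled)                  # 614080
print(decrypt(scrambled, n))      # 122811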

Encrypt

The process of transforming readable information into an unreadable form. Making the safe combination into the scrambled number.

Decrypt

The process of transforming encrypted information back into its readable form. Making the scrambled number back into the safe combination.

Key

The item used, along with the algorithm, to encrypt and decrypt information. In the example above, the secret number, n, was our key. The key could be a password, a special file or a hardware device often called a token. Strong encryption processes may use multiple keys, like both a password and a token.

Key length

The size of the key, typically measured in bits. In general, the longer the key, the harder it is to recover the encrypted information by guessing keys.

Algorithm

The mathematical technique used, along with the key(s), to encrypt and decrypt information. In the example above, the equation n * (combination + n) = scrambled number was our algorithm. Popular encryption algorithms include AES, DES, triple-DES, RSA, Blowfish and IDEA.

Information is considered "at rest" when it is saved to a computer or storage device (like a CD, tape or thumb drive), usually in contrast to "in transit." Note that data can be considered "at rest" while physically moving, like someone carrying a CD with information on it.

Information is "in transit" when it is being transferred over a network. This could be copying a file from a file server, submitting a webpage order form or sending an email.

The behavior of an encryption technology/product which keeps a file encrypted when it is moved between disks or computers. Many forms of encryption only keep information encrypted when stored in a particular location.

Symmetrical vs Asymmetrical

Encryption/decryption processes are often referred to as being either symmetrical or asymmetrical, which relates to what keys are used to encrypt and decrypt information.

In symmetrical encryption, the same key is used to encrypt and decrypt the information. The most common use of this technique is password encryption where the same password is used to encrypt and decrypt the information. This method is simple and useful when sharing the key isn't problematic (either the key isn't shared or all parties are trusted with the information). It requires that all parties who need to encrypt or decrypt the information safely obtain the key.

In asymmetrical encryption, there are two different keys: one used to encrypt the information and one used to decrypt it. In this approach, the key used to encrypt the information cannot be used to decrypt it. This technique is useful when sharing a key might be problematic. These two keys are often referred to as public and private keys. As the names imply, the public key is openly distributed, since it can only be used to encrypt information, while the private key, which can decrypt the information, is kept protected.
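The difference is easier to see in code. Below is a minimal sketch using the third-party Python cryptography package, chosen here purely for illustration (OIT's supported tool is PGP; this library is not part of that offering):

# Symmetric: one shared key both encrypts and decrypts.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()           # must be given to every trusted party
box = Fernet(shared_key)
token = box.encrypt(b"confidential record")
assert box.decrypt(token) == b"confidential record"

# Asymmetric: the public key encrypts, only the private key can decrypt.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()        # safe to distribute openly

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"confidential record", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"confidential record"

In the symmetric case, both sides must already share the key; in the asymmetric case, only the public key ever needs to travel.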

Key management: Perhaps the most important aspect of encryption deployment is management of keys. This includes what types of keys are used (passwords, files, tokens, certificates, etc.), how they are given to users, how they are protected and how to deal with a lost-key scenario. Each technology and product handles this differently, but the lost-key scenario is usually the most concerning, since it could lead either to an unauthorized person decrypting information or to authorized people being unable to decrypt information. Many encryption horror stories come in the form of not being able to decrypt the only copy of very important information. Pay careful attention to key generation, distribution, use, recovery and security when looking into encryption options.

Impacts to system/data management: When files or disks are encrypted, an IT administrator might have to adapt some of their management processes or tools. For example, what impact do encrypted hard drives have on system imaging? What about the use of wake-on-LAN for management? The answers to these questions vary with your management processes and the encryption product, so it's important to understand how encryption products will impact your IT environment.

When does encryption stay with the file? Many forms of encryption only protect information while it is transferred over the network (like a website using SSL) or while it is stored in a particular place (like on an encrypted hard drive). This means that once the file is moved out of that situation, it is no longer encrypted. This often confuses users who think encryption "sticks" to files: that they can email a file stored on an encrypted disk and it will stay encrypted as an attachment, or copy a file from an encrypted disk to a thumb drive and it will remain encrypted. It's important to understand the conditions under which a file will be encrypted and explain those conditions to those in your department. Since encryption conditions vary by technology, product and implementation, there isn't a general rule.

Follow this link:
Security Awareness - Encryption | Office of Information ...

Read More..

Data Encryption and Decryption (Windows)

This documentation is archived and is not being maintained.

Encryption is the process of translating plain text data (plaintext) into something that appears to be random and meaningless (ciphertext). Decryption is the process of converting ciphertext back to plaintext.

To encrypt more than a small amount of data, symmetric encryption is used. A symmetric key is used during both the encryption and decryption processes. To decrypt a particular piece of ciphertext, the key that was used to encrypt the data must be used.

The goal of every encryption algorithm is to make it as difficult as possible to decrypt the generated ciphertext without using the key. If a really good encryption algorithm is used, there is no technique significantly better than methodically trying every possible key. For such an algorithm, the longer the key, the more difficult it is to decrypt a piece of ciphertext without possessing the key.
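As a rough illustration of why key length matters under brute force, here is a back-of-the-envelope Python sketch; the attacker speed is an assumed figure for illustration, not a measured benchmark:

# Back-of-the-envelope only: assumes an attacker that can try one trillion keys
# per second, which is an illustrative figure, not a benchmark.
GUESSES_PER_SECOND = 1e12
SECONDS_PER_YEAR = 3.15e7

for bits in (40, 56, 128, 256):
    keyspace = 2 ** bits
    avg_years = keyspace / 2 / GUESSES_PER_SECOND / SECONDS_PER_YEAR  # half the keyspace on average
    print(f"{bits:3d}-bit key: {keyspace:.2e} possible keys, ~{avg_years:.1e} years on average")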

It is difficult to determine the quality of an encryption algorithm. Algorithms that look promising sometimes turn out to be very easy to break, given the proper attack. When selecting an encryption algorithm, it is a good idea to choose one that has been in use for several years and has successfully resisted all attacks.

For more information, see Data Encryption and Decryption Functions.

Go here to read the rest:
Data Encryption and Decryption (Windows)

Read More..

Introduction to Cryptocurrency – CryptoCurrency Facts

CryptoCurrency Facts takes a simplified look at digital currency like bitcoin to help everyone understand what it is, how it works, and its implications. On this site, we cover everything you need to know.

As of 2017, cryptocurrency has been used as a decentralized alternative to traditional fiat currencies (which are usually backed by some central government) such as the US dollar (USD).

For the average person, using cryptocurrency is as easy as:

What is a cryptocurrency address? A public address is a unique string of characters used to receive cryptocurrency. Each public address has a matching private key that can be used to prove ownership of the public address. With Bitcoin, the address is called a Bitcoin address. Think of it like a unique email address that people can send currency to instead of emails.
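As a rough illustration of how a private key proves ownership, here is a sketch using the third-party Python ecdsa package (illustration of the signing idea only; deriving a real Bitcoin address involves extra hashing and encoding steps not shown here):

# Sketch only: the private key signs, the matching public key verifies.
# Producing an actual Bitcoin address from the public key (SHA-256, RIPEMD-160,
# Base58Check) is omitted here.
import ecdsa

private_key = ecdsa.SigningKey.generate(curve=ecdsa.SECP256k1)  # keep this secret
public_key = private_key.get_verifying_key()                    # share this freely

message = b"I control the funds at this address"
signature = private_key.sign(message)

# Anyone holding only the public key can check the claim.
print(public_key.verify(signature, message))  # True; a forged signature raises BadSignatureError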

The first decentralized digital cryptocurrency can be traced back to Bit Gold, which was worked on by Nick Szabo between 1998 and 2005. Bit Gold is considered the first precursor to Bitcoin. In 2008, Satoshi Nakamoto (an anonymous person and/or group) released a paper detailing what would become Bitcoin.

Bitcoin became the first decentralized digital coin when it was created in 2008. It then went public in 2009. As of 2017, Bitcoin is the most commonly known and used cryptocurrency (with other coins like Ethereum and Litecoin also being notable). Given the popularity of Bitcoin as well as its history, the term "altcoin" is sometimes used to describe alternative cryptocurrencies to Bitcoin (especially coins with small market caps).

As of January 2015, there were over 500 different types of cryptocurrencies, or altcoins, for trade in online markets. However, only 10 of them had market capitalizations over $10 million.

As of September 2017, there were over 1,100 cryptocurrencies, and the total market capitalization of all cryptocurrencies reached an all-time high, surpassing $60 billion!

In other words, although the future is uncertain, cryptocurrency seems to be more than just a fad. Here in 2017, cryptocurrency is shaping up to be a growing market that (despite its pros and cons) is likely here for the long haul.

On this site, we explore every aspect of cryptocurrency. Simply choose a page from the menu, visit our "what is cryptocurrency" page for a more detailed explanation, or jump right into the "how cryptocurrency works" section to start learning about transactions, mining, and public ledgers.

See original here:
Introduction to Cryptocurrency - CryptoCurrency Facts

Read More..

quantum computing – engadget.com

According to Intel, the building blocks of quantum computing, qubits, are very fragile. They can only operate at extremely low temperatures (250 times colder than deep space) and must be packaged carefully to prevent data loss. Intel's research groups in Oregon and Arizona have found a way to manufacture 17-qubit chips with an architecture that makes them more reliable at higher temperatures and reduces RF interference between qubits. The chip can send and receive 10 to 100 times more signal than comparable wire-bonded chips and has an advanced design that allows for the techniques to be applied to larger quantum integrated circuits, which are much bigger than typical silicon chips.

"Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities," said Intel Labs' Dr. Michael Mayberry. "Intel's expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing."

Go here to read the rest:
quantum computing - engadget.com

Read More..

Qudits: The Real Future of Quantum Computing? – IEEE Spectrum

Instead of creating quantum computers based on qubits that can each adopt only two possible options, scientists have now developed a microchip that can generate qudits that can each assume 10 or more states, potentially opening up a new way of creating incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, that, because of the bizarre nature of quantum physics, can be in a state of superposition where they simultaneously act as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
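A quick way to see this scaling in a few lines of Python (the 10^80 atom count is the commonly cited rough estimate for the observable universe):

# n entangled qubits span 2**n basis states.
ATOMS_IN_UNIVERSE = 10 ** 80

for n in (1, 2, 3, 300):
    print(f"{n:3d} qubits -> 2**{n} = {2 ** n:.3e} states")

print(2 ** 300 > ATOMS_IN_UNIVERSE)  # True: roughly 2.0e90 states vs ~1e80 atoms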

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle use qudits with more than two states simultaneously. In principle, a quantum computer with two 32-state qudits, for example, would be able to perform as many operations as 10 qubits while skipping the challenges inherent in working with 10 qubits together.

Researchers used the setup pictured above to create, manipulate, and detect qudits. The experiment starts when a laser fires pulses of light into a micro-ring resonator, which in turn emits entangled pairs of photons. Because the ring has multiple resonances, the photons have optical spectrums with a set of evenly spaced frequencies (red and blue peaks), a process known as spontaneous four-wave mixing (SFWM). The researchers were able to use each of the frequencies to encode information, which means the photons act as qudits. Each qudit is in a superposition of 10 possible states, extending the usual binary alphabet (0 and 1) of quantum bits. The researchers also showed they could perform basic gate operations on the qudits using optical filters and modulators, and then detect the results using single-photon counters.

Now scientists have for the first time created a microchip that can generate two entangled qudits each with 10 states, for 100 dimensions total, more than what six entangled qubits could generate. "We have now achieved the compact and easy generation of high-dimensional quantum states," says study co-lead author Michael Kues, a quantum optics researcher at Canada's National Institute of Scientific Research (INRS, its French acronym) in Varennes, Quebec.
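The dimension counts quoted in this article can be checked with a few lines of arithmetic:

# Checking the dimension counts quoted in the article.
def dimensions(states_per_unit: int, units: int) -> int:
    return states_per_unit ** units

print(dimensions(10, 2))   # 100: two entangled 10-state qudits
print(dimensions(2, 6))    # 64:  six entangled qubits -- fewer than 100
print(dimensions(32, 2))   # 1024 = 2**10, so two 32-state qudits match ten qubits
print(dimensions(96, 2))   # 9216, a bit more than 2**13 = 8192 (13 qubits)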

The researchers developed a photonic chip fabricated using techniques similar to ones used for integrated circuits. A laser fires pulses of light into a micro-ring resonator, a 270-micrometer-diameter circle etched onto silica glass, which in turn emits entangled pairs of photons. Each photon is in a superposition of 10 possible wavelengths or colors.

For example, a high-dimensional photon can be red and yellow and green and blue, although the photons used here were in the infrared wavelength range, Kues says. Specifically, one photon from each pair spanned wavelengths from 1534 to 1550 nanometers, while the other spanned from 1550 to 1566 nanometers.

Using commercial off-the-shelf telecommunications components, the researchers showed they could manipulate these entangled photons. "The basic capabilities they show are really what you need to do universal quantum computation," says quantum optics researcher Joseph Lukens at Oak Ridge National Laboratory, in Tennessee, who did not take part in this research. "It's pretty exciting stuff."

In addition, by sending the entangled photons through a 24.2-kilometer-long optical fiber telecommunications system, the researchers showed that entanglement was preserved over large distances. This could prove useful for nigh-unhackable quantum communications applications, the researchers say.

"What I think is amazing about our system is that it can be created using components that are out on the market, whereas other quantum computer technologies need state-of-the-art cryogenics, state-of-the-art superconductors, state-of-the-art magnets," says study co-senior author Roberto Morandotti, a physicist at INRS in Varennes. "The fact that we use basic telecommunications components to access and control these states means that a lot of researchers could explore this area as well."

The scientists noted that current state-of-the-art components could conceivably generate entangled pairs of 96-state qudits, corresponding to more dimensions than 13 qubits. "Conceptually, in principle, I don't see a limit to the number of states of qudits right now," Lukens, from Oak Ridge, says. "I do think a 96-by-96-dimensional system is fairly reasonable, and achievable in the near future."

But he adds that several components of the experiment were not on the microchips, such as the programmable filters and phase modulators, which led to photon loss. Kues says that integrating such components with the rest of the chips and optimizing their micro-ring resonator would help reduce such losses to make their system more practical for use.

"The next big challenge we will have to solve is to use our system for quantum computation and quantum communications applications," Kues says. "While this will take some additional years, it is the final step required to achieve systems that can outperform classical computers and communications."

The scientists detailed their findings in the latest issue of the journal Nature.

Originally posted here:
Qudits: The Real Future of Quantum Computing? - IEEE Spectrum

Read More..

Intel Takes First Steps To Universal Quantum Computing

October 11, 2017, by Timothy Prickett Morgan

Someone is going to commercialize a general purpose, universal quantum computer first, and Intel wants to be the first. So does Google. So does IBM. And D-Wave is pretty sure it already has done this, even if many academics and a slew of upstart competitors don't agree. What we can all agree on is that there is a very long road ahead in the development of quantum computing, and it will be a costly endeavor that could nonetheless help solve some intractable problems.

This week, Intel showed off the handiwork of its engineers and those of partner QuTech, a quantum computing spinoff from the Technical University of Delft and Toegepast Natuurwetenschappelijk Onderzoek (TNO), which as the name suggests is an applied science research firm that, among other things, is working with Intel on quantum computing technology.

TNO, which was established in 1988, has a €500 million annual budget and does all kinds of primary research. The Netherlands has become a hotbed of quantum computing technology, along with the United States and Japan, and its government wants to keep it that way; hence the partnership in late 2015 with Intel, which invested $50 million in the QuTech partnership between TU Delft and TNO so it could jumpstart its own quantum computing program after sitting on the sidelines.

With this partnership, Intel is bringing its expertise in materials science, semiconductor manufacturing, interconnects, and digital systems into play to help develop two types of quantum bits, or qubits, which are the basic elements of processing in a quantum computer. The QuTech partnership involves the manufacturing of superconducting qubits, but Intel also is working on another technology called spin qubits that makes use of more traditional semiconductor technologies to create what is, in essence, the quantum transistor for this very funky and very parallel style of computing.

The big news this week is that Intel has been able to take a qubit design that its engineers created alongside those working at QuTech and scale it up to 17 qubits on a single package. A year ago, the Intel-QuTech partnership had only a few qubits on their initial devices, Jim Clarke, director of quantum hardware at Intel, tells The Next Platform, and two years ago it had none. So that is a pretty impressive roadmap in a world where Google is testing a 20-qubit chip and hopes to have one running at 49 qubits before the year is out. Google also has quantum annealing systems from D-Wave, which have much more scale in terms of qubits (1,000 today and 2,000 on the horizon) but, according to Intel, are not generic enough to be properly commercialized. And if Intel knows anything, it knows how to create a universal computing substrate and scale its manufacturing and its deployment in the datacenters of the world.

Production and cleanroom facilities for the quantum chip made at Intel's D1D/D1X plant in Hillsboro, Oregon, in April 2017.

"We are trying to build a general purpose, universal quantum computer," says Clarke. "This is not a quantum annealer, like the D-Wave machine. There are many different types of qubits, which are the devices for quantum computing, and one of the things that sets Intel apart from the other players is that we are focused on multiple qubit types. The first is a superconducting qubit, which is similar to what Google, IBM, and a startup named Rigetti Computing are working on. But Intel is also working on spin qubits in silicon, which are very similar to our transistor technologies, and you can expect to hear about that in the next couple of months. These spin qubits build on our expertise in ordinary chip fabrication, and what really sets us apart here is our use of advanced packaging at very low temperatures to improve the performance of the qubit, and with an eye towards scalability."

Just as people are obsessed with the number of transistors or cores on a standard digital processor, people are becoming a bit obsessed with the number of qubits on a quantum chip, and Jim Held, director of emerging technology research at Intel Labs, says that this focus is a bit misplaced. And for those of us who look at systems for a living, this makes perfect sense. Intel is focused on getting the system design right, and then scaling it up on all vectors to build a very powerful quantum machine.

Here is the situation as Held sees it, and breathe in deeply here:

"People focus on the number of qubits, but that is just one piece of what is needed. We are really approaching this as engineers, and everything is different about this kind of computer. It is not just the devices, but the control electronics and how the qubits are manipulated with microwave pulses and measured with very sensitive DC instrumentation, and it is more like an analog computer in some respects. Then it has digital electronics that do error correction because quantum devices are very fragile, and they are prone to errors and to the degree that we can correct the errors, we can compute better and longer with them. It also means a new kind of compiler in order to get the potential parallelism in an array of these qubits, and even the programs, the algorithms, written for these devices are an entirely different kind of thing from conventional digital programming. Every aspect of the stack is different. While there is research going on in the academic world at all levels, as an engineering organization we are coming at them all together because we know we have to deliver them all at once as a computer. Moreover, our experience tells us that we want to understand at any given point what our choices at one level are going to mean for the rest of the computer. What we know is that if you have a plate full of these qubits, you do not have a quantum computer, and some of the toughest problems with scaling are in the rest of the stack. Focusing on the number of qubits or the coherence time really does a disservice to the process of getting to something useful."

This is analogous to massively parallel machines that don't have enough bandwidth or low enough latency to talk across cores, sockets, or nodes efficiently and to share work. You can cram as many cores as you want in them, but the jobs won't finish faster.

And thus, Intel is focusing its research on the interconnects that will link qubits together on a device and across multiple devices.

"The interconnects are one of the things that concern us most with quantum computing," says Clarke. "From the outset, we have not been focused on a near-term milestone, but rather on what it would take from the interconnect perspective, from the point of view of the design and the control, to deliver a large-scale, universal quantum computer."

Interestingly, Clarke says that the on-chip interconnect on commercial quantum chips will be similar to that used on a conventional digital CPU, though it may be made not of copper wires but rather of superconducting materials.

The one used in the superconducting qubit chip that Intel just fabbed in its Oregon factory and packaged in its Arizona packaging facility is a bit ridiculous looking.

Quantum computing presents a few physical challenges, and superconducting qubits are especially tricky. Preserving the quantum states that allow superposition (a kind of multiple, concurrent state of the bits that allows for parallel processing at the bit level, to oversimplify hugely) requires these analog devices to be kept at extremely cold temperatures, yet they still have to interface with the control electronics of the outside world, crammed into a rack.

"We are putting these chips in an extremely cold environment, 20 millikelvins, and that is much colder than outer space," says Clarke. "And first of all, we have to make sure that the chip doesn't fall apart at these temperatures. You have thermal coefficient of expansion. Then you need to worry about package yield and then about the individual qubit yield. Then we worry about wiring them up in a more extensible fashion. These are very high quality radio or microwave frequency chips and we have to make sure we maintain that quality at low temperature once the device is packaged. A lot of the performance and yield that we are getting comes from the packaging."

So for this chip, Intel has wallpapered one side of the chip with standard coaxial ports, like the ones on the back of your home router. Each qubit has two or more coax ports going into it to control its state and to monitor that state. How retro:

"We are focused on a commercial machine, so we are much more interested in scaling issues," Held continues along this line of thinking. "You have to be careful to not end up in a dead end that only gets you so far." This quantum chip interconnect is not sophisticated like Omni-Path, and it does not scale well, Held adds with a laugh. "What we are interested in is improving on that to reduce the massive number of connections. A million qubits turning into millions of coax cables is obviously not going to work. Even at hundreds of qubits, this is not going to work. One way we are going to do this is to move the electronics that is going to control this quantum machine into this very cold environment, not down at the millikelvin level, but a layer or two up at the 4 kelvin temperature of liquid helium. Our partners at QuTech are experts at cryo-CMOS, which means making chips work in this 4 kelvin range. By moving this control circuitry from a rack outside of the quantum computer into the refrigeration unit, it cuts the length of the connections to the qubits."

With qubits, superposition allows a single qubit to represent two different states, and quantum entanglement (what Einstein called "spooky action at a distance") allows the number of states to scale exponentially as the qubit count goes up. Technically, n quantum bits yield 2 to the n states. (We wrote that out because there is something funky about superscripts in the Alike font we use here at The Next Platform.) The interconnect is not used to maintain the quantum states across the qubits (that happens because of physics) but to monitor the qubit states, maintain those states and, importantly, do error correction. Qubits can't be shaken or stirred or they lose their state, and they are extremely fussy. As Google pointed out two years ago at the International Super Computing conference in Germany, a quantum computer could end up being an accelerator for a traditional parallel supercomputer, which is used to do error correction and monitoring of qubits. Intel is also thinking this might happen.

The fussiness of superconducting qubits is probably one of the reasons why Intel is looking to spin qubits and a more standard semiconductor process to create a quantum computer chip whose state is easier to maintain. The other is that Intel is obviously an expert at manufacturing semiconductor devices. So, we think, the work with QuTech is as much about creating a testbed system and a software stack that might be portable as it is about investing in this particular superconducting approach. Time will tell.

And time, indeed, it will take. Both Held and Clarke independently think it will take maybe eight to ten years to get a general purpose, universal quantum computer commercialized and operating at a useful scale.

"It is research, so we are only coming to timing based on how we think we are going to solve a number of problems," says Held. "There will be a milestone where a machine will be able to tackle interesting but small problems, and then there will be a full scale machine that is mature enough to be a general purpose, widely useful accelerator in the supercomputer environment or in the cloud. These will not be free-standing computers because they don't do a lot of things that a classical digital computer does really well. They could do them, because in theory any quantum computer can do anything a digital computer can do, but they don't do it well. It is going to take on the order of eight to ten years to solve these problems we are solving now. They are all engineering problems; the physicists have done an excellent job of finding feasible solutions out of the lab and scaling them out."

Clarke adds a note of caution, pointing out that there are a lot of physics problems that need to be solved for the packaging aspects of a quantum computer. "But I think to solve the next level of physics problems, we need a healthy dose of engineering and process control," Clarke says. "I think eight to ten years is probably fair. We are currently at mile one of a marathon. Intel is already in the lead pack. But when we think of a commercially relevant quantum computer, we think of one that is relevant to the general population, and moreover, one that would show up on Intel's bottom line. The key is that we are building a system, and at first, that system is going to be pretty small but it is going to educate us about all aspects of the quantum computing stack. At the same time, we are designing that system for extensibility, both at the hardware level and at the architecture control level to get to many more qubits. We want to make the system better, and larger, and it is probably a bit premature to start assigning numbers to that other than to say that we are thinking about the longer term."

It seems we might need a quantum computer to figure out when we might get a quantum computer.

Categories: Cloud, Compute, HPC, Hyperscale

Tags: Delft, Intel, quantum, qubit, QuTech, spin qubit, Superconducting, TNO

See the rest here:
Intel Takes First Steps To Universal Quantum Computing

Read More..