
Avast Internet Security Download –

Avast Internet Security is the most complete suite the company offers. It bundles an antivirus module, a two-way firewall, and antispam and antispyware modules that combine to provide a solid wall against all kinds of threats.

Avast! Internet Security uses the same antivirus engine as the company's Pro Antivirus. It provides real-time protection and various scan modes for both your computer and removable devices. A special type of scan is the Boot-Time mode, which scans the operating system right before startup, making sure to clean any infected files (and it doesn't even take long).

In addition, the Firewall module builds a shield against hackers and protects your identity against theft. Your effort is reduced to selecting your network type (work, home or public). An expert mode is available for advanced users (it includes user-defined network and packet rules).

Moreover, Avast's Internet Security also provides protection for your email via the Antispam module, a feature that prevents phishing attempts and blocks untrusted senders for Outlook and POP3/IMAP servers.

Maximum safety for your working environment is achieved via the virtualization feature. It goes by the name of Sandbox, and it's a place you can use to open applications or webpages that you don't trust.

To top it all off, you've got the SafeZone module, which you quickly grow fond of because it acts as an isolated space for online shopping and e-banking. Your transactions are protected, and your activity is not in danger of being tracked.

In conclusion, Avast! Internet Security is a comprehensive protection suite. You can rest assured that it has the ability to protect your computer and your online and social activity in real time.

Go here to see the original:
Avast Internet Security Download –

Read More..

Cloud Storage Topics – SearchCloudStorage

The Cloud Backup and Recovery topic section offers resources and best practices for data storage professionals who are mapping a disaster recovery strategy using cloud storage and cloud services. Get the latest news and tips on cloud disaster recovery, and find out how cloud storage impacts restore times, business continuity and the organization of sensitive data. Cloud-based disaster recovery can bring benefits to your IT shop, but should be considered in the larger context of a long-term DR plan. More about Cloud Disaster Recovery

The Cloud Storage Backup and Recovery topic section covers the same ground from the backup angle: resources and best practices for protecting data with cloud storage and cloud services, plus news and tips on how cloud backup affects restore times, business continuity and the handling of sensitive data. More about Cloud Storage Backup

The Cloud Storage Management and Standards topic section offers comprehensive resources for data storage professionals looking for information on cloud storage management and cloud standards. Find out how to properly manage your data in the cloud and read our expert advice and guidance on cloud management. Get the latest news and tips on cloud storage strategies and what steps to take for better data management in the cloud. Read our essential guides, special reports and tutorials to keep up with the latest trends in cloud storage management, and see what tips and improvements you can apply to your own management strategy. More about Cloud Management

The Hybrid Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on hybrid storage clouds. Learn about the pros and cons of hybrid cloud storage and read our expert advice and guidance on hybrid clouds. Get the latest news and tips on using a hybrid cloud model in your environment and what types of environments are suited for hybrid storage clouds. Read our essential guides, special reports and tutorials to catch up on the most recent developments and advances in the hybrid cloud arena, and see what tips and improvements you can apply to your own hybrid cloud. More about Hybrid Cloud

The Private Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on private clouds. Learn about the pros and cons of private cloud storage and read our expert advice and guidance on private cloud infrastructure. Get the latest news and tips on private cloud providers, private cloud services, plus implementation tips, costs and private cloud management advice. Read our essential guides, special reports and tutorials to catch up on the most recent developments in the private cloud storage space, and see what tips and improvements you can apply to your own private cloud environment. More about Private Cloud

The Public Cloud Storage topic section offers comprehensive resources for data storage professionals looking for information on public clouds. Learn about the pros and cons of public cloud storage and read our expert advice and guidance on public clouds. Get the latest news and tips on public cloud services, public cloud providers, plus costs, implementation tips and common problems storage managers run into when using public clouds. Learn about the best practices regarding public clouds. Read our essential guides, special reports and tutorials to catch up on the most recent developments in the public cloud storage space, and see what tips and improvements you can apply to your own public cloud. More about Public Cloud

View original post here:
Cloud Storage Topics – SearchCloudStorage

Read More..

Cloud Storage – 2017 News, Articles and Trends – Tom’s IT Pro

January 30, 2017 in Best Picks

Before settling on a cloud backup option for your business, check out our buying advice and top picks, which include Amazon S3, Code 42 CrashPlan, OpenDrive, Microsoft Azure and more. Read More

October 13, 2016 in News

PDF signing, sharing through iMessage, and viewing files without unlocking are all new business features of Dropbox for iOS. Find out what else is new. Read More

September 2, 2015 in Reviews

eFolder's Anchor is a unique EFSS option targeted at MSPs and VARs that offers customers the choice between cloud and on-premises storage, flexible security options and a customizable user interface. Read More

August 11, 2015 in Reviews

Egnyte is one of the few enterprise file sync and share vendors with a true hybrid cloud strategy, bolstered by veteran thought leadership, flexible UI options and a strong focus on security and encryption, as well as auditing and reporting capabilities. Read More

June 25, 2015 in News

Red Hat announced the updated Red Hat Ceph Storage 1.3 and Red Hat Gluster Storage 3.1 software-defined storage solutions. The new versions offer improved performance and data integrity for petabyte-scale deployments. Read More

June 18, 2015 in Reviews

Box Enterprise is more than just a simple cloud file sync and share tool. With a strong focus on security, integration and consistent cross-platform usability, Box has a lot to offer business customers and more to come in the near future. Read More

June 4, 2015 in Reviews

Citrix's ShareFile is a strong enterprise file sync and share (EFSS) solution that offers plenty of security controls and can leverage both cloud and on-premises storage. It's an especially attractive option for current Citrix customers. Read More

Read more:
Cloud Storage – 2017 News, Articles and Trends – Tom’s IT Pro

Read More..

Cloud Storage Pricing Comparison: AWS, Azure, Google, and B2

So why do some vendors make it so hard to get information about how much you're storing and how much you're being charged?

Cloud storage is fast becoming the central repository for mission-critical information, irreplaceable memories, and in some cases entire corporate and personal histories. Given this responsibility, we believe cloud storage vendors have an obligation to be as transparent as possible in how they interact with their customers.

In that light, we decided to challenge four cloud storage vendors with two simple questions: how much data are we storing, and how much are we being charged for it?

The detailed results are below, but if you wish to skip the details and the screen captures (TL;DR), we've summarized the results in the table below.

Our challenge was to upload 1 terabyte of data, store it for one month, and then download it.

Cloud Storage Test Details

For our tests, we chose Backblaze B2, Microsoft's Azure, Amazon's S3, and Google Cloud Storage. Our idea was simple: upload 1 TB of data to the comparable service from each vendor, store it for 1 month, download that 1 TB, then document and share the results.

Let's start with the most obvious observation: the cost charged by each vendor for the test.

Later in this post, we'll see if we can determine the different cost components (storage, downloading, transactions, etc.) for each vendor, but our first step is to see if we can determine how much data we stored. In some cases, the answer is not as obvious as it would seem.

At its core, a provider of a service ought to be able to tell a customer how much of the service he or she is using. In this case, one might assume that providers of cloud storage would be able to tell customers how much data is being stored at any given moment. It turns out, it's not that simple.

Backblaze B2

Logging into a Backblaze B2 account, one is presented with a summary screen that displays all buckets. Each bucket displays key summary information, including data currently stored.

Clicking into a given bucket, one can browse individual files. Each file displays its size, and multiple files can be selected to create a size summary.

Summary: Accurate, intuitive display of storage information.

Microsoft Azure

Moving on to Microsoft's Azure, things get a little more exciting. There was no area we could find that showed the total amount of data, in GB, stored with Azure.

There's an area entitled Usage, but that wasn't helpful.

We then moved on to Overview, but ran into a couple of challenges. The first issue was that we were presented with KiB (kibibytes) as the unit of measure. One GB (the unit of measure used in Azure's pricing table) equates to roughly 976,563 KiB. It struck us as odd that usage would be summarized in a unit of measure different from the billing unit of measure.

Summary: Storage is measured in KiB but billed by the GB. Even with a calculator, it is unclear how much storage we are using.

Amazon S3

Next we checked on the data we were storing in S3. We again ran into problems.

In the bucket overview, we were able to identify our buckets. However, we could not tell how much data was being stored.

Drilling into a bucket, the detail view does tell us file size. However, there was no method for summarizing the data stored within that bucket or for multiple files.

Summary: Incomplete. While folder summaries can be found in the file-browsing user interface, there is no reasonable way to see how much data is being stored globally.

Google Cloud Storage (GCS)

GCS proved to have its own quirks, as well.

One can easily find the bucket summary; however, it does not provide information on the data stored.

Clicking into the bucket, one can see files and the size of each individual file. However, there is no way to see a total for the data stored.

Summary: Incomplete. From the file browsing user interface, there is no reasonable way to understand how much data is being stored.

Test 1 Conclusions

We knew how much data we were uploading and, in many cases, users will have some sense of the amount of data they are uploading. However, it strikes us as odd that many vendors won't tell you how much data you have stored. Even stranger are the vendors that report usage in a unit of measure different from the units in their pricing table.

The cloud storage industry has done itself no favors with tiered pricing that requires a calculator to figure out what's going on. Setting that aside for a moment, one would presume that bills would be presented in clear, auditable ways.


Backblaze B2

Inside the Backblaze user interface, one finds a navigation link entitled Billing. Clicking on that, the user is presented with line items for previous bills, payments, and an estimate of the upcoming charges.

One can expand any given row to see the line-item transactions composing each bill.

Here's more detail.

Summary: Available on demand, and the site clearly defines what has and will be charged for.


Microsoft Azure

Trying to understand Azure billing proved to be a bit tricky.

On August 6th, we logged into the billing console and were presented with this screen.

As you can see, on Aug 6th, billing for the period of May-June was not available for download. For the period ending June 26th, we were charged nearly a month later, on July 24th. Clicking into that row item does display line item information.

Summary: Available, but difficult to find. The nearly 30 day lag in billing creates business and accounting challenges.

Amazon S3

Amazon presents a clean billing summary and enables users to drill down into line items.

Going to the billing area of AWS, one can survey various monthly bills and is presented with a clean summary of billing charges.

Expanding into the billing detail, Amazon articulates each line item charge. Within each line item, charges are broken out into sub-line items for the different tiers of pricing.

Summary: Available on demand. While there are some line items that seem unnecessary for our test, the bill is generally straightforward to understand.

Google Cloud Storage (GCS)

This was an area where the GCS User Interface, which was otherwise relatively intuitive, became confusing.

Going to the Billing Overview page did not offer much in the way of an overview of charges.

Moving down to the Transactions section, however, did provide line-item detail on all the charges incurred. But similar to Azure's introduction of the KiB, Google introduces the equally confusing gibibyte (GiB). While all of Google's pricing tables are listed in terms of GB, the line items reference GiB. 1 GiB is 1.07374 GB.

Summary: Available, but provides descriptions in units that are not on the pricing table nor commonly used.
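The unit mismatches described above are easy to quantify. Here is a small sketch (plain Python, no vendor APIs) converting the binary units the consoles report into the decimal GB the pricing tables use:

```python
# The unit mismatch, as arithmetic: vendors price per GB (10**9 bytes),
# but the consoles report binary units, KiB (2**10 bytes) for Azure
# and GiB (2**30 bytes) for Google.
KIB, GIB, GB = 2**10, 2**30, 10**9

def kib_to_gb(kib: float) -> float:
    """Convert the KiB shown in the console to billable GB."""
    return kib * KIB / GB

def gib_to_gb(gib: float) -> float:
    """Convert the GiB shown in the console to billable GB."""
    return gib * GIB / GB

assert GB / KIB == 976562.5                 # the "roughly 976,563 KiB" per GB
assert round(gib_to_gb(1), 5) == 1.07374    # 1 GiB is about 1.07374 GB
```

Nothing here is exotic; the point is that customers shouldn't have to do it themselves just to audit a bill.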

Test 2 Conclusions

Clearly, some vendors do a better job than others of making their pricing available and understandable. From a transparency standpoint, it's difficult to justify why a vendor would publish a pricing table in units of X, but then put units of Y in the user interface.

Transparency: The Backblaze Way

Transparency isn't easy. At Backblaze, we believe in investing time and energy into presenting the most intuitive user interfaces we can create. We take pride in our heritage in the consumer backup space; servicing consumers has taught us how to make things understandable and usable. We do our best to apply those lessons to everything we do.

This philosophy reflects our desire to make our products usable, but it's also part of a larger ethos of being transparent with our customers. We are being trusted with precious data. We want to repay that trust with, among other things, transparency.

It's that spirit that was behind the decision to publish our hard drive stats, to open source the infrastructure behind our having the lowest cost of storage in the industry, and to open source our erasure coding (the math that drives a significant portion of the redundancy for your data).

Why? We believe it's not just about a good user interface; it's about the relationship we want to build with our customers.

Ahin enjoys writing in the third person, cookies (digital or baked), and the new Chris Stapleton album.

Read more:
Cloud Storage Pricing Comparison: AWS, Azure, Google, and B2

Read More..

Altcoin Exchange, the Decentralized Cryptocurrency Exchange …

SAN DIEGO, Oct. 15, 2017 /PRNewswire/ -- Altcoin Exchange, the decentralized cryptocurrency exchange, is changing its name. This marks the beginning of a new era of altcoin trading that promises the cryptocurrency community complete security of their coins and a safer way to trade. The exchange wants to empower everyone, regardless of market knowledge, to trade altcoins securely and with confidence. This currently isn't possible with centralized exchanges: they're vulnerable to theft, and they require you to relinquish control of your coins in order to trade.

Since June 2011, there have been 26 known centralized exchange hacks involving the loss of nearly $1,000,000,000 in user funds. Until now, the market has failed to respond with a robust solution, and the exchange's goal is to solve this problem. It aims to be the first decentralized exchange with an unparalleled focus on user experience, security, customer support, and design. By collaborating with the trading community, it will create a safe, trustless platform that enables users to make fast trades without worrying whether they're exposing themselves to risk.

Andrew Gazdecki, CEO and founder, says, "Our new name is about setting the standard for how we move forward. With this company rebranding, we make it clear our goal is going beyond what already exists to make something better: a decentralized altcoin exchange where you can trade anonymously, securely, and stay in complete control of your funds."

By eschewing the centralized model, the exchange lets traders retain full control of their coins so they can trade with confidence. There's no single point of failure, no central repository for hackers to exploit, and full transparency in every transaction.

"The centralized exchange model is broken," says Andrew. "As altcoins continue their meteoric rise in popularity, it's more important than ever to establish a secure and trustless exchange, built with the trading community involved. This exchange is being built by traders, for traders."

Last week, the team completed the world's first Atomic Swap between the Ethereum and Bitcoin blockchains. Building on Decred's successful swap of Decred for Litecoin, they transferred 0.1245 Ethereum to 0.12345 Bitcoins without first passing ownership to a third party. Atomic Swaps are the key to creating secure decentralized trades that transact as quickly as centralized ones, and this milestone puts the exchange on track for a community release in early 2018.

News of the swap has buoyed the exchange's growing community and was enthusiastically covered by CoinDesk, The Merkle, Cryptovest, and others. While there are still some kinks to iron out, such as privacy and options, the team is confident this is just the beginning of better things in the cryptocurrency community: it hopes to shake up this dynamic market by giving traders what they've always wanted: a safe way to exchange and trade digital assets.


Read more here:
Altcoin Exchange, the Decentralized Cryptocurrency Exchange …

Read More..

Security Awareness – Encryption | Office of Information …


Encryption is the transformation of information into a form that is readable only by those with particular knowledge or technology, preventing others who might have access to the information from reading it. It has long been used for messages in transit, whether carried by hand, transmitted via radio or sent over a computer network; if the message is intercepted, the interceptor is unable to interpret the information. Encryption also serves an important role for stored information, protecting it in case of loss or theft.

While the concepts and processes of encryption greatly pre-date modern computing, the topic has become increasingly popular in computing over the past few years. This has largely been fueled by the vast increase of information transfer over computer networks and the increased security concerns that accompany a massively interconnected “always online” computing environment.

OIT offers and supports PGP software and licenses to faculty and staff for whole disk encryption. Whole disk encryption will keep educational records and confidential data secure in case your laptop is lost or stolen. This information should only be stored on a mobile device, like a laptop, when there is a specific business purpose. Find out if PGP whole disk encryption is right for you.

If we had a number we wished to keep secret (say, the combination to a safe), one option to protect it is to encrypt the number; after all, we can't store the combination to the safe inside the safe. Let's say the combination is 12-28-11, which we shorten to just 122811. Let's use some simple math to make it into a scrambled number.

Here's an equation that adds a secret number (n) to the combination and then multiplies the result by the same secret number:

n * (combination + n) = scrambled number

If we pick 5 as our secret number, then we get:

5 * (122811 + 5) = 5 * 122816 = 614080

Our scrambled number, 614080, is an encrypted version of our safe combination. To get our combination back, we need to know our secret number and the formula used to create the scrambled number. Here's the formula:

(scrambled number / n) - n = combination

We insert our secret number and our scrambled number:

(614080 / 5) - 5

And solve the equation to find our combination:

122816 - 5 = 122811, which is 12-28-11

We have successfully developed our own encryption process for our safe combination.
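For readers who like to tinker, the whole scheme fits in a few lines of Python. This is, of course, a toy; it illustrates the roles of a key and an algorithm, not real-world encryption:

```python
# A toy implementation of the safe-combination scheme described above.
# The secret number n is the key; n * (combination + n) is the algorithm.

def encrypt(combination: int, n: int) -> int:
    """Scramble the combination with secret number n."""
    return n * (combination + n)

def decrypt(scrambled: int, n: int) -> int:
    """Reverse the process: divide by n, then subtract n."""
    return scrambled // n - n

assert encrypt(122811, 5) == 614080   # 12-28-11 scrambled with n = 5
assert decrypt(614080, 5) == 122811   # and recovered again
```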

Encryption

The process of transforming readable information into an unreadable form. Making the safe combination into the scrambled number.


Decryption

The process of transforming encrypted information back into its readable form. Making the scrambled number back into the safe combination.


Key

The item used, along with the algorithm, to encrypt and decrypt information. In the example above, the secret number, n, was our key. The key could be a password, a special file or a hardware device often called a token. Strong encryption processes may use multiple keys, like both a password and a token.

Key length

The size of the key, usually measured in bits. With a strong algorithm, each additional bit of key length doubles the number of possible keys an attacker must try.

Algorithm

The mathematical technique used, along with the key(s), to encrypt and decrypt information. In the example above, the equation n*(combination + n) = scrambled number was our algorithm. Popular encryption algorithms include AES, DES, triple-DES, RSA, Blowfish and IDEA.

Information is considered “at rest” when it is saved to a computer or storage device (like a CD, tape or thumb drive), usually in contrast to “in transit”. Note that data can be considered “at rest” while physically moving, like someone carrying a CD with information on it.

Information is “in transit” when it is being transferred over a network. This could be copying a file from a file server, submitting a webpage order form or sending an email.

The behavior of an encryption technology/product which keeps a file encrypted when it is moved between disks or computers. Many forms of encryption only keep information encrypted when stored in a particular location.

Symmetrical vs Asymmetrical

Encryption/decryption processes are often referred to as being either symmetrical or asymmetrical, which relates to what keys are used to encrypt and decrypt information.

In symmetrical encryption, the same key is used to encrypt and decrypt the information. The most common use of this technique is password encryption where the same password is used to encrypt and decrypt the information. This method is simple and useful when sharing the key isn’t problematic (either the key isn’t shared or all parties are trusted with the information). It requires that all parties who need to encrypt or decrypt the information safely obtain the key.
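As a concrete (and deliberately weak) illustration of the symmetrical idea, here is a toy Python cipher that XORs data against a repeating key. The same key both encrypts and decrypts; real systems use vetted algorithms such as AES instead:

```python
# Toy symmetric cipher: XOR each byte against a repeating key.
# The SAME key encrypts and decrypts, because XOR is its own inverse.
# Illustration only; never use this to protect real data.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"shared-secret"
ciphertext = xor_cipher(b"meet at noon", key)
plaintext = xor_cipher(ciphertext, key)   # applying the same key reverses it
assert plaintext == b"meet at noon"
```

Note that anyone who obtains the key can both read and forge messages, which is exactly the key-sharing problem the text describes.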

In asymmetrical encryption, there are two different keys: one used to encrypt the information and one used to decrypt it. In this approach, the key used to encrypt the information cannot be used to decrypt it. This technique is useful when sharing a key might be problematic. These two keys are often referred to as public and private keys. As the names imply, the public key is openly distributed, since it can only be used to encrypt information, while the private key that can decrypt the information is protected.
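The asymmetrical idea can be sketched with textbook RSA using tiny primes. This is purely didactic (keys this small are trivially breakable; real RSA keys are thousands of bits long), but it shows the key pair at work: the public exponent encrypts, and only the private exponent decrypts:

```python
# Textbook RSA with tiny primes, to show an asymmetric key pair at work.
# Requires Python 3.8+ for pow(e, -1, phi) (modular inverse).

p, q = 61, 53                  # two (absurdly small) secret primes
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: 2753

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```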

Key management

Perhaps the most important aspect of encryption deployment is management of keys. This includes what types of keys are used (passwords, files, tokens, certificates, etc.), how they are given to users, how they are protected and how to deal with a lost-key scenario. Each technology and product handles this differently, but the lost-key scenario is usually the most concerning, since it could lead either to an unauthorized person decrypting information or to authorized people being unable to decrypt information. Many encryption horror stories come in the form of not being able to decrypt the only copy of very important information. Pay careful attention to key generation, distribution, use, recovery and security when looking into encryption options.

Impacts to system/data management

When files or disks are encrypted, an IT administrator might have to adapt some of their management processes or tools. For example, what impact do encrypted hard drives have on system imaging? What about the use of wake-on-LAN for management? The answers to these questions vary with your management processes and the encryption product, so it's important to understand how encryption products will impact your IT environment.

When does encryption stay with the file?

Many forms of encryption only protect information while it is transferred over the network (like a website using SSL) or while it is stored in a particular place (like on an encrypted hard drive). This means that once the file is moved out of that situation, it is no longer encrypted. This often confuses users who think encryption “sticks” to files: that they can email a file stored on an encrypted disk and it will stay encrypted as an email attachment, or copy a file from an encrypted disk to a thumb drive and the file will remain encrypted. It's important to understand the conditions under which a file will be encrypted and explain those conditions to those in your department. Since encryption conditions vary by technology, product and implementation, there isn't a general rule.

Follow this link:
Security Awareness – Encryption | Office of Information …

Read More..

Data Encryption and Decryption (Windows)

This documentation is archived and is not being maintained.

Encryption is the process of translating plain text data (plaintext) into something that appears to be random and meaningless (ciphertext). Decryption is the process of converting ciphertext back to plaintext.

To encrypt more than a small amount of data, symmetric encryption is used. A symmetric key is used during both the encryption and decryption processes. To decrypt a particular piece of ciphertext, the key that was used to encrypt the data must be used.

The goal of every encryption algorithm is to make it as difficult as possible to decrypt the generated ciphertext without using the key. If a really good encryption algorithm is used, there is no technique significantly better than methodically trying every possible key. For such an algorithm, the longer the key, the more difficult it is to decrypt a piece of ciphertext without possessing the key.
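The "methodically trying every possible key" attack is easy to demonstrate on a toy cipher. The sketch below uses a hypothetical XOR cipher with a single-byte key, so there are only 256 possibilities; each additional bit of key length would double the loop:

```python
# Brute-forcing a toy cipher with a single-byte key (256 possibilities).
# Each additional bit of key length would double this search.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

secret_key = bytes([42])
ciphertext = xor_cipher(b"attack at dawn", secret_key)

# Methodically try every possible key (here we cheat by knowing
# exactly what plaintext to look for).
recovered_key = next(
    bytes([k]) for k in range(256)
    if xor_cipher(ciphertext, bytes([k])) == b"attack at dawn"
)
assert recovered_key == secret_key
```

A 128-bit key faces the same loop with 2^128 iterations, which is why key length matters once the algorithm itself has no shortcut.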

It is difficult to determine the quality of an encryption algorithm. Algorithms that look promising sometimes turn out to be very easy to break, given the proper attack. When selecting an encryption algorithm, it is a good idea to choose one that has been in use for several years and has successfully resisted all attacks.

For more information, see Data Encryption and Decryption Functions.

Go here to read the rest:
Data Encryption and Decryption (Windows)

Read More..

Introduction to Cryptocurrency – CryptoCurrency Facts

CryptoCurrency Facts takes a simplified look at digital currency like Bitcoin to help everyone understand what it is, how it works, and its implications. On this site, we cover everything you need to know about:

As of 2017, cryptocurrency has been used as a decentralized alternative to traditional fiat currencies (which are usually backed by some central government), such as the US dollar (USD).

For the average person, using cryptocurrency is as easy as:

What is a cryptocurrency address?: A public address is a unique string of characters used to receive cryptocurrency. Each public address has a matching private key that can be used to prove ownership of the public address. With Bitcoin, the address is called a Bitcoin address. Think of it like a unique email address that people can send currency to, as opposed to emails.
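As a highly simplified sketch of that address/private-key relationship: an address is derived by hashing the public key, so it can be shared freely without revealing the key material. (Real Bitcoin addresses use SHA-256 followed by RIPEMD-160 and Base58Check encoding; the stand-in key bytes and truncated hex digest below are illustrative only.)

```python
# Hashing a stand-in public key to derive an address-like identifier.
# The key bytes below are made up; real Bitcoin addresses also involve
# RIPEMD-160 and Base58Check, omitted here for brevity.
import hashlib

public_key = b"\x02" + bytes(range(32))               # hypothetical key bytes
address = hashlib.sha256(public_key).hexdigest()[:40]  # one-way derivation

# The address can be shared freely; the hash cannot be reversed to
# reveal the key, and only the matching private key proves ownership.
assert len(address) == 40
```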

The first decentralized digital cryptocurrency can be traced back to Bit Gold, which was worked on by Nick Szabo between 1998 and 2005. Bit Gold is considered the first precursor to Bitcoin. In 2008, Satoshi Nakamoto (an anonymous person and/or group) released a paper detailing what would become Bitcoin.

Bitcoin became the first decentralized digital coin when it was created in 2008. It then went public in 2009. As of 2017, Bitcoin is the most commonly known and used cryptocurrency (with other coins like Ethereum and Litecoin also being notable). Given the popularity of Bitcoin as well as its history, the term altcoin is sometimes used to describe alternative cryptocurrencies to Bitcoin (especially coins with small market caps).

As of January 2015, there were over 500 different types of cryptocurrencies, or altcoins, for trade in online markets. However, only 10 of them had market capitalizations over $10 million.

As of September 2017, there were over 1,100 cryptocurrencies, and the total market capitalization of all cryptocurrencies reached an all-time high, surpassing $60 billion!

In other words, although the future is uncertain, cryptocurrency seems to be more than just a fad. Here in 2017, cryptocurrency is shaping up as a growing market that (despite its pros and cons) is likely here for the long haul.

On this site, we explore every aspect of cryptocurrency. Simply choose a page from the menu, visit our what is cryptocurrency page for a more detailed explanation of cryptocurrency, or jump right in to the how cryptocurrency works section to start learning about transactions, mining, and public ledgers.

See original here:
Introduction to Cryptocurrency – CryptoCurrency Facts

Read More..

quantum computing –

According to Intel, the building blocks of quantum computing, qubits, are very fragile. They can only operate at extremely low temperatures (250 times colder than deep space) and must be packaged carefully to prevent data loss. Intel's research groups in Oregon and Arizona have found a way to manufacture 17-qubit chips with an architecture that makes them more reliable at higher temperatures and reduces RF interference between qubits. The chip can send and receive 10 to 100 times more signal than comparable wire-bonded chips and has an advanced design that allows the techniques to be applied to larger quantum integrated circuits, which are much bigger than typical silicon chips.

“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Intel Labs’ Dr. Michael Mayberry. “Intel’s expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing.”

Go here to read the rest:
quantum computing –

Read More..

Qudits: The Real Future of Quantum Computing? – IEEE Spectrum

Instead of creating quantum computers based on qubits that can each adopt only two possible states, scientists have now developed a microchip that can generate qudits that can each assume 10 or more states, potentially opening up a new way to create incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, that, because of the bizarre nature of quantum physics, can be in a state of superposition where they simultaneously act as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
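The exponential scaling described above is easy to check numerically: n entangled qubits span a 2^n-dimensional state space. A minimal back-of-the-envelope sketch (the function name is ours, not from the article):

```python
# Each additional entangled qubit doubles the dimension
# of the joint state space.
def qubit_dimensions(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 2, 3):
    print(n, "qubits ->", qubit_dimensions(n), "simultaneous amplitudes")

# 2**300 is roughly 2e90, dwarfing the ~1e80 atoms commonly
# estimated for the observable universe.
print(qubit_dimensions(300) > 10 ** 80)
```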

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle use qudits with more than two states each. In principle, a quantum computer with two 32-state qudits, for example, would be able to perform as many operations as 10 qubits while skipping the challenges inherent in working with 10 qubits together.
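The trade-off stated here is exact arithmetic: two 32-level qudits span 32² = 1024 dimensions, the same as the 2¹⁰ of ten qubits. A short sketch (helper names are ours):

```python
import math

# n entangled d-level carriers span a d**n-dimensional state space,
# just as n qubits span 2**n dimensions.
def register_dimensions(n_carriers: int, levels: int) -> int:
    return levels ** n_carriers

# Two 32-state qudits match ten qubits exactly: 32**2 == 2**10 == 1024.
assert register_dimensions(2, 32) == register_dimensions(10, 2)

# Expressed as an equivalent qubit count:
print(math.log2(register_dimensions(2, 32)))  # -> 10.0
```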

Researchers used the setup pictured above to create, manipulate, and detect qudits. The experiment starts when a laser fires pulses of light into a micro-ring resonator, which in turn emits entangled pairs of photons. Because the ring has multiple resonances, the photons have optical spectra with a set of evenly spaced frequencies (red and blue peaks), a process known as spontaneous four-wave mixing (SFWM). The researchers were able to use each of the frequencies to encode information, which means the photons act as qudits. Each qudit is in a superposition of 10 possible states, extending the usual binary alphabet (0 and 1) of quantum bits. The researchers also showed they could perform basic gate operations on the qudits using optical filters and modulators, and then detect the results using single-photon counters.

Now scientists have for the first time created a microchip that can generate two entangled qudits each with 10 states, for 100 dimensions total, more than what six entangled qubits could generate. "We have now achieved the compact and easy generation of high-dimensional quantum states," says study co-lead author Michael Kues, a quantum optics researcher at Canada's National Institute of Scientific Research (INRS, its French acronym) in Varennes, Quebec.

The researchers developed a photonic chip fabricated using techniques similar to ones used for integrated circuits. A laser fires pulses of light into a micro-ring resonator, a 270-micrometer-diameter circle etched onto silica glass, which in turn emits entangled pairs of photons. Each photon is in a superposition of 10 possible wavelengths or colors.

"For example, a high-dimensional photon can be red and yellow and green and blue, although the photons used here were in the infrared wavelength range," Kues says. Specifically, one photon from each pair spanned wavelengths from 1534 to 1550 nanometers, while the other spanned from 1550 to 1566 nanometers.
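As a rough consistency check (ours, not from the paper): treating the quoted 1534–1550 nm span as ten equal wavelength bins gives a spacing of 1.6 nm, which near 1550 nm corresponds to roughly 200 GHz via Δν = cΔλ/λ², a plausible free spectral range for a micro-ring resonator of this size. A hedged sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def channel_spacing_ghz(lambda_start_nm, lambda_stop_nm, n_channels):
    """Frequency spacing implied by equally spaced wavelength channels."""
    spacing_nm = (lambda_stop_nm - lambda_start_nm) / n_channels
    center_nm = (lambda_start_nm + lambda_stop_nm) / 2
    # delta_nu = c * delta_lambda / lambda**2
    delta_nu_hz = C * (spacing_nm * 1e-9) / (center_nm * 1e-9) ** 2
    return delta_nu_hz / 1e9

print(channel_spacing_ghz(1534, 1550, 10))  # roughly 200 GHz
```

Whether the span divides into ten bins or nine gaps between ten resonances changes the answer only slightly; either way the comb spacing sits near the 200 GHz telecom grid, which is why off-the-shelf telecommunications filters can address the individual states.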

Using commercial off-the-shelf telecommunications components, the researchers showed they could manipulate these entangled photons. "The basic capabilities they show are really what you need to do universal quantum computation," says quantum optics researcher Joseph Lukens at Oak Ridge National Laboratory, in Tennessee, who did not take part in this research. "It's pretty exciting stuff."

In addition, by sending the entangled photons through a 24.2-kilometer-long optical fiber telecommunications system, the researchers showed that entanglement was preserved over large distances. This could prove useful for nigh-unhackable quantum communications applications, the researchers say.

"What I think is amazing about our system is that it can be created using components that are out on the market, whereas other quantum computer technologies need state-of-the-art cryogenics, state-of-the-art superconductors, state-of-the-art magnets," says study co-senior author Roberto Morandotti, a physicist at INRS in Varennes. "The fact that we use basic telecommunications components to access and control these states means that a lot of researchers could explore this area as well."

The scientists noted that current state-of-the-art components could conceivably generate entangled pairs of 96-state qudits, corresponding to more dimensions than 13 qubits. "Conceptually, in principle, I don't see a limit to the number of states of qudits right now," Lukens, from Oak Ridge, says. "I do think a 96-by-96-dimensional system is fairly reasonable, and achievable in the near future."
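The comparison in that estimate checks out numerically: a pair of 96-level qudits spans 96² = 9216 dimensions, while 13 qubits span 2¹³ = 8192. A quick verification:

```python
import math

pair_dim = 96 ** 2    # two entangled 96-level qudits
qubit_dim = 2 ** 13   # thirteen qubits

print(pair_dim, "vs", qubit_dim)  # 9216 vs 8192
assert pair_dim > qubit_dim

# The pair is worth log2(96**2), i.e. just over 13 equivalent qubits.
print(math.log2(pair_dim))
```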

But he adds that several components of the experiment were not on the microchips, such as the programmable filters and phase modulators, which led to photon loss. Kues says that integrating such components with the rest of the chips and optimizing their micro-ring resonator would help reduce such losses to make their system more practical for use.

"The next big challenge we will have to solve is to use our system for quantum computation and quantum communications applications," Kues says. "While this will take some additional years, it is the final step required to achieve systems that can outperform classical computers and communications."

The scientists detailed their findings in the latest issue of the journal Nature.

Originally posted here:
Qudits: The Real Future of Quantum Computing? – IEEE Spectrum
