
Encryption: Avoiding the Pitfalls That Can Lead to Breaches

Cybercrime, Cybersecurity, Data Breach

The Marriott mega-breach is calling attention to the issues of whether organizations are storing too much data and whether they’re adequately protecting it with the proper encryption steps.


In its revised findings about a mega-breach that it now says affected 383 million customers, Marriott notes that 25.6 million passport numbers were exposed in the breach, of which 5.25 million were unencrypted. “There is no evidence that the unauthorized third party accessed the master encryption key needed to decrypt the encrypted passport numbers,” Marriott says. But that doesn’t mean that the attackers couldn’t later brute-force decrypt the numbers (see: Marriott Mega-Breach: Victim Count Drops to 383 Million).

Also exposed in the breach were approximately 8.6 million encrypted payment cards that were being stored by Marriott. By the time the breach was discovered in late 2018, however, Marriott says most of the payment cards had already expired. As with the passport data, “there is no evidence that the unauthorized third party accessed either of the components needed to decrypt the encrypted payment card numbers,” Marriott says.

U.S. Sen. Mark Warner, D-Virginia, says the breach highlights a failure by many organizations to minimize the amount of data they routinely store on consumers.

“It’s unacceptable that Marriott was retaining sensitive data like passport numbers for so long, and it’s unconscionable that it kept this data unencrypted,” said Warner, who co-chairs the Senate Cybersecurity Caucus, the Wall Street Journal reported.

Meanwhile, security experts around the world are calling attention to the need to take all necessary steps to properly encrypt sensitive data that organizations store.

Although cryptography is being added to more backend applications, it’s often being implemented incorrectly, contends Steve Marshall, chief information security officer and head of cyber consulting at Bytes Software Services, a U.K.-based IT company. “This often leaves organizations with a false sense of security, which, unfortunately, becomes evident when they are attacked,” he says.

And with governments across the world pushing for encryption backdoors to be used by law enforcement, the hacking risks could get worse.

Jagdeep Singh, head of risk and governance at Instarem, a Singapore-based payments company, says many companies worldwide make common mistakes when implementing encryption.

Tarun Pant, CEO at SecurelyShare, a Bangalore-based company, says too many organizations focus on encrypting data while it’s transmitted but fail to encrypt it when it’s at rest.

“Many organizations don’t do end-to-end encryption of data,” he says. “Hence, the weakest link is often the source of the breach. Data at rest, if not encrypted with source key, leads to breaches from within the organization.”

Too many companies take a “check list” approach to data security, focusing narrowly on regulatory compliance. These firms often don’t devote enough time and effort to properly implementing encryption, security experts say.

“Many development teams adding encryption to their code call it a day once they achieve the minimum security needed for a regulatory checkmark. This attitude is dangerous,” Singh says (see: Demystifying DevSecOps and Its Role in App Security).

Kevin Bocek, vice president of security strategy and threat intelligence for Salt Lake City, Utah-based Venafi, a cybersecurity company that develops software to secure and protect cryptographic keys, says managing machine identities that are used to establish encryption is challenging for many organizations.

“Investigations have shown that simply not keeping track of machine identities, like TLS certificates, can create encrypted tunnels for hackers to hide in,” Bocek says. “In addition, if a simple machine identity, like a key and certificate, is not updated, mobile networks across entire countries can be impacted.”

Depending on where encryption occurs – column level vs. application level – what encryption techniques are used and what kind of vulnerability is being exploited, attackers can use many different techniques to cause data breaches, says Sandesh Anand, managing consultant at Synopsys, a Mountain View, Calif.-based technology company.

“Practitioners should not build their own crypto algorithms or libraries,” he stresses. “They should instead focus on implementing well-known, peer-reviewed, secure algorithms properly.”

Anand says the best algorithms to use are AES (Advanced Encryption Standard) for symmetric encryption, RSA for asymmetric encryption and SHA-256 for hashing.
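As a concrete illustration (not from the article), here is a minimal sketch of authenticated symmetric encryption with AES-256-GCM, assuming the third-party Python `cryptography` package is available. The point mirrors Anand’s advice: call a vetted, peer-reviewed implementation rather than rolling your own primitives.

```python
# Sketch: AES-256-GCM authenticated encryption via the `cryptography` package
# (assumed installed); all values here are illustrative.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit random key
aesgcm = AESGCM(key)
nonce = os.urandom(12)  # GCM nonce must be unique per (key, message)

plaintext = b"passport number: 123456789"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)  # also authenticates the data

# Decryption fails loudly (InvalidTag) if ciphertext or nonce is tampered with.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
```

GCM gives both confidentiality and integrity in one step, which avoids the separate-MAC mistakes discussed later in the article.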

Mistakes in key management also can lead to trouble, Anand says. “Often firms end up either using short keys or they end up using the same key for months,” he says. “Then there is the problem of insecure key management.”

Pune-based Rohan Vibhandik, a security researcher with a multinational company, notes: “Storing or transmitting keys insecurely remains a common mistake, especially in case of a symmetric key where a single key is used at both ends – encryption and decryption.”

While it’s important to secure the storage of machine identities, including keys, it’s become even more critical to be able to change machine identities fast, Bocek stresses.

“Browsers can distrust Certificate Authorities. This means businesses have to quickly find and change out machine identities, like TLS keys and certificates, used for encryption,” he says.

While encryption plays an important role in data security, it’s not a cure-all, security experts stress.

“Encryption is just one of the many controls that protect data while in transit or at rest,” Singh says. “However, there are numerous ways to circumvent encryption in a client-server model. Also, encryption technologies and the way they get adopted are still evolving.”

Anand notes: “Remember: The strength of a chain is the weakest link. So, if crypto keys are lying around in insecure locations or if database admins use weak passwords, data can still be breached. Finally, insecure application controls can also lead to a breach.”

An important aspect of encryption is proper key management.

“Key management is a challenge that grows with the size and complexity of your environment,” Pant says. “The larger the user base, the more diverse the environment, the more distributed the keys are. Hence the challenges of key management will be greater.”

Singh recommends that organizations avoid saving keys on the same server as the encrypted data.

“One needs to ensure that private keys, when stored, are non-exportable. Also, one must not use the same keys for both directions,” he says. He also recommends adopting proper standards, including TLS, or Transport Layer Security, while data is in transit. “Avoid using Secure Sockets Layer, as it is outdated,” he emphasizes.
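Singh’s advice to prefer TLS over the outdated SSL can be enforced in code. A minimal sketch using Python’s standard ssl module (Python 3.7+):

```python
import ssl

# Build a client context with sane defaults: certificate verification on,
# hostname checking on, SSLv2/SSLv3 already disabled.
ctx = ssl.create_default_context()

# Additionally refuse anything older than TLS 1.2,
# ruling out legacy SSL and early TLS versions.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

Pinning a minimum protocol version in the context means every connection made with it inherits the policy, rather than relying on each call site to remember it.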

To help ensure that encrypted data has not been tampered with, adding a layer of hashing and salting is essential, Vibhandik says.

“When data is encrypted, one must hash it using functions like MD5 and SHA,” he says. “To provide further layered security to the hashed data, SALT function must be used; that can prevent tampering of data.

“One must remember that hashing does not add any privacy to data; it only saves against any data alteration or tampering attempts. Encryption provides privacy to your data but does not make it tamper proof. So a combination of both is important for endpoint and end-to-end communication and data security.”
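One caution on the quote above: MD5 is considered cryptographically broken, so modern practice favors SHA-256 and its relatives. A minimal stdlib sketch of salted hashing (for stored secrets) and a keyed HMAC (for tamper detection); all names and values here are illustrative:

```python
import hashlib
import hmac
import os

# Salted, deliberately slow hashing for a stored secret:
# PBKDF2 with HMAC-SHA256 and a per-record random salt.
salt = os.urandom(16)
stored = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 100_000)

# Tamper detection for a message: a keyed HMAC-SHA256 tag.
# Unlike a plain hash, the secret key stops an attacker who can
# rewrite both the data and its hash.
mac_key = os.urandom(32)
message = b"encrypted payload bytes"
tag = hmac.new(mac_key, message, hashlib.sha256).digest()

# Verification must use a constant-time comparison.
ok = hmac.compare_digest(tag, hmac.new(mac_key, message, hashlib.sha256).digest())
```

This matches Vibhandik’s point: the hash/MAC layer provides integrity, the encryption layer provides privacy, and robust designs combine both.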


Cloud Computing – Yahoo


What’s all the fluff about cloud computing? There are plenty of reasons it’s the most talked-about trend in technology, starting with the fact that it helps reduce the up-front capital needed to build IT infrastructure and develop software. Cloud services are so appealing that the total market is expected to nearly triple from 2010 to 2016. (Yep, you read that right.)

Of course, technology companies have clamored to add cloud computing to their repertoires, leading to lots of M&A activity. Software and Internet deals represented 57% of transactions closed in 2012, a figure that has grown steadily over the last two years. All of which leaves the cloud looking like a lot more than a passing storm.

We identified US-listed stocks and American Depository Receipts of companies that are engaged in activities relevant to this watchlist’s theme. We then filtered out companies that have a share price of less than .00 or a market capitalization less than 00 million, and excluded illiquid stocks by screening companies for liquidity i.e. average bid-ask spreads, dollar volume traded etc. Finally the proprietary Motif Optimization Engine determined the constituent stocks. Learn more about how we select our watchlists.

Motif is an online brokerage built on thematic portfolios of up to 30 stocks and ETFs. Founded in 2010 by Hardeep Walia, Motif combines complex proprietary algorithms with skilled advisers to develop these thematic portfolios. Learn more about our team.

First, we determined each company’s percentage of total revenue derived from this watchlist’s theme. Second, we applied a pure-play factor to give greater relative weight to companies that derive a higher percentage of their revenue from this theme. Finally, we weighted each company by its market capitalization adjusted for revenue exposure to the theme.

More details on how we build and weight watchlists are available here.


CES 2019: IBM’s Q System One Is the Rock Star Quantum …

IBM announced the world’s first commercially available quantum computer at CES 2019. Well. Kinda.

Called IBM Q System One, the computer is a glass box the size of a van with a sleek black cylinder hanging from the ceiling. Yet you won’t find it in your garage, or in the offices of your nearest Fortune 500 company. Those willing to pay to harness the power of the 20-qubit machine will access IBM Q System One over the cloud. The hardware will be housed at IBM’s Q Computation Center, set to open this year in Poughkeepsie, New York.

Reception has proven mixed. While the initial wave of news was positive, some have received the announcement with skepticism. Their points are valid. While IBM’s press release touts that Q System One enables “universal approximate superconducting quantum computers” to operate beyond the confines of the research lab, it will remain under IBM’s watchful eye. And IBM already offered cloud access to quantum computers at the Thomas J. Watson Research Center in Yorktown, New York.

In effect, IBM Q System One is an expansion of an existing cloud service, not a new product. Yet that doesn’t lessen its impact.

Quantum computing faces many massive scientific challenges. Q System One, with 20 qubits, is nowhere near capable of beating classical computers, even at tasks that will theoretically benefit from quantum computing. No universal quantum computer exists today, and no one knows when one will arrive.

Yet building a useful quantum computer will only be half the battle. The other half is learning how to use it. Quantum computing, once it arrives, will fundamentally change what computers can accomplish. Engineers will tackle the challenge of building a quantum computer that can operate in a normal environment, while programmers must learn to write software for hardware that computes in ways alien to binary computers.

Companies can’t rely on a “build it, and they will come” philosophy. That might suffice so long as quantum computing remains in the realm of research, but it won’t work as the quantum realm bumps up against the general public. Quantum will need a breakthrough device that wows everyone at a glance. IBM Q System One is such a device.

Impact is what IBM Q System One was meant to deliver from the start. Robert Sutor, IBM’s vice president of Q strategy and ecosystem, said as much, telling Digital Trends that “[we] have to step back and say, ‘What have we created so far?’ It’s amazing what we’ve created so far, but is it a system? Is it a well-integrated system? Are all the individual parts optimized and working together as best as possible?”

The answer, up until recently, was no. IBM’s quantum computers were not meant to be used outside of a lab and were built with no regard for aesthetics or ease of use. Q System One changes that, and in doing so, it could entirely change how the system, and quantum computers in general, are perceived.

This isn’t a new strategy for IBM. As Sutor will quickly point out, the company took a similar approach when it built computer mainframes in the 1960s and ’70s. “With all the focus now, people going back to mid-century modern, IBM has a long history of design. [...],” he told Digital Trends. “We are fully coming back to that.” Other examples of this tactic include Deep Blue’s famous chess match and the ThinkPad, which redefined how consumers thought of portable computers.

Q System One might not be a major leap forward for the science of quantum computing, but it will give the field the standard bearer it needs. It’s already making quantum feel less intimidating for those of us who lack a Ph.D. in quantum physics.


Bitcoin | Definition, Mining, & Facts

Bitcoin, digital currency created by an anonymous computer programmer or group of programmers known as Satoshi Nakamoto in 2009. Owners of Bitcoins can use various Web sites to trade them for physical currencies, such as U.S. dollars or euros, or can exchange them for goods and services from a number of vendors.

Nakamoto was concerned that traditional currencies were too reliant on the trustworthiness of banks to work properly. Nakamoto proposed a digital currency, Bitcoin, that could serve as a medium of exchange without relying on any financial institutions or governments. The proposal was made in October 2008 in a paper published on the Bitcoin Web site, which had been founded in August 2008.

Bitcoin relies on public-key cryptography, in which users have a public key that is available for everyone to see and a private key known only to their computers. In a Bitcoin transaction, users receiving Bitcoins send their public keys to users transferring the Bitcoins. Users transferring the coins sign with their private keys, and the transaction is then transmitted over the Bitcoin network. So that no Bitcoin can be spent more than once at the same time, the time and amount of each transaction is recorded in a ledger file that exists at each node of the network. The identities of the users remain relatively anonymous, but everyone can see that certain Bitcoins were transferred. Transactions are put together in groups called blocks. The blocks are organized in a chronological sequence called the blockchain. Blocks are added to the chain using a mathematical process that makes it extremely difficult for an individual user to hijack the blockchain. The blockchain technology that underpins Bitcoin has attracted considerable attention, even from skeptics of Bitcoin, as a basis for allowing trustworthy record-keeping and commerce without a central authority.
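The chaining described above can be made concrete in a few lines. This toy example (illustrative only, far simpler than the real Bitcoin block format) links blocks by SHA-256 hashes so that altering an earlier block invalidates the link recorded in every later one:

```python
import hashlib
import json

def block_hash(block):
    # Hash a canonical JSON encoding of the block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    return {"transactions": transactions, "prev_hash": prev_hash}

genesis = make_block(["coinbase -> alice: 50 BTC"], "0" * 64)
block2 = make_block(["alice -> bob: 10 BTC"], block_hash(genesis))

# The chain is intact: block2 points at the genesis block's current hash.
assert block2["prev_hash"] == block_hash(genesis)

# Rewriting history in the genesis block changes its hash,
# breaking the link recorded in the next block.
genesis["transactions"][0] = "coinbase -> mallory: 50 BTC"
assert block2["prev_hash"] != block_hash(genesis)
```

This is why hijacking the blockchain is so hard: a forger must redo the work for the altered block and every block after it, across a majority of the network’s nodes.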

New Bitcoins are created by users running the Bitcoin client on their computers. The client mines Bitcoins by running a program that solves a difficult mathematical problem in a file called a block received by all users on the Bitcoin network. The difficulty of the problem is adjusted so that, no matter how many people are mining Bitcoins, the problem is solved, on average, six times an hour. When a user solves the problem in a block, that user receives a certain number of Bitcoins. The elaborate procedure for mining Bitcoins ensures that their supply is restricted and grows at a steadily decreasing rate. About every four years, the number of Bitcoins in a block, which began at 50, is halved, and the number of maximum allowable Bitcoins is slightly less than 21 million. As of late 2017 there were almost 17 million Bitcoins, and it is estimated that the maximum number will be reached around 2140.
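The “slightly less than 21 million” cap follows directly from the halving schedule just described: 50 Bitcoins per block at the start, halved every 210,000 blocks, with rewards tracked in whole satoshis (10^-8 Bitcoin) so each halving rounds down. A quick check:

```python
reward = 50 * 100_000_000   # initial block reward, in satoshis
total = 0
while reward > 0:
    total += 210_000 * reward   # 210,000 blocks per reward era
    reward //= 2                # reward halves; integer division mimics satoshi rounding

# total / 1e8 comes out just under 21 million Bitcoins
print(total / 100_000_000)
```

The geometric series 50 + 25 + 12.5 + … per block converges toward 100 per block-slot, i.e. 210,000 × 100 = 21 million, and the satoshi rounding leaves the final supply slightly below that bound.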

Because the algorithm that produces Bitcoins makes them at a near-constant rate, early miners of Bitcoins obtained them more often than later miners because the network was small. The premium that early users received and Nakamoto’s silence after 2011 led to criticism of Bitcoin as a Ponzi scheme, with Nakamoto benefiting as one of the first users. (An analysis of the first 36,289 mined blocks showed that one miner, believed to be Nakamoto, had accumulated over 1 million Bitcoins. However, as of 2017, those Bitcoins, then valued at $10 billion, remained unspent.) Defenders of Bitcoin claim that early users should receive some return for investing in an unproven technology.

The value of Bitcoins relative to physical currencies fluctuated wildly in the years following its introduction. In August 2010 one Bitcoin was worth $0.05 (U.S.). Beginning in May 2011, Bitcoin increased sharply in value, reaching a peak of about $30 that June, but by the end of the year the value of a Bitcoin had collapsed to less than $3. However, Bitcoin began to attract the attention of mainstream investors, and its value climbed to a high of over $1,100 in December 2013. Some companies even began building computers optimized for Bitcoin mining.

With the marked increase in value, Bitcoin became a target for hackers, who could steal Bitcoins through such means as obtaining a user’s private key or stealing the digital wallet (a computer file recording a Bitcoin balance). The most spectacular theft was revealed in February 2014 when Mt. Gox, which had been the world’s third largest Bitcoin exchange, declared bankruptcy because of the theft of about 650,000 Bitcoins, then valued at about $380 million.

In 2017 the value of Bitcoins rose sharply from around $1,200 in April to more than $10,000 in November. The sharp rise in Bitcoin’s value encouraged more intensive mining. It was estimated in late 2017 that Bitcoin mining consumed 0.14 percent of the world’s electricity production.


Quantum Computing | The MIT Press


A thorough exposition of quantum computing and the underlying concepts of quantum physics, with explanations of the relevant mathematics and numerous examples.

The combination of two of the twentieth century’s most influential and revolutionary scientific theories, information theory and quantum mechanics, gave rise to a radically new view of computing and information. Quantum information processing explores the implications of using quantum mechanics instead of classical mechanics to model information and its processing. Quantum computing is not about changing the physical substrate on which computation is done from classical to quantum but about changing the notion of computation itself, at the most basic level. The fundamental unit of computation is no longer the bit but the quantum bit or qubit.

This comprehensive introduction to the field offers a thorough exposition of quantum computing and the underlying concepts of quantum physics, explaining all the relevant mathematics and offering numerous examples. With its careful development of concepts and thorough explanations, the book makes quantum computing accessible to students and professionals in mathematics, computer science, and engineering. A reader with no prior knowledge of quantum physics (but with sufficient knowledge of linear algebra) will be able to gain a fluent understanding by working through the book.

Hardcover (out of print) | ISBN: 9780262015066 | 392 pp. | 7 in x 9 in | 3 graphs, 79 figures, 2 tables | March 2011

Paperback $39.00 S | 30.00 | ISBN: 9780262526678 | 392 pp. | 7 in x 9 in | 3 graphs, 79 figures, 2 tables | August 2014

Authors: Eleanor G. Rieffel, Research Scientist at NASA Ames Research Center, and Wolfgang H. Polak, computer science consultant.


IBM thinks outside of the lab, puts quantum computer in a box

IBM unveiled the world’s first “universal approximate quantum computing system” installed outside of a research lab at CES earlier this week, and with it, the next era of computing.

The 20-qubit IBM Q System One represents the first major leap for quantum computers of 2019, but before we get into the technical stuff, let’s take a look at this thing.

All we can say is: wowzah! When can we get a review unit?

The commitment to a fully functional yet aesthetically pleasing design is intriguing, especially considering that, just last year, pundits claimed quantum computing was a dead-end technology.

To make the first integrated quantum computer designed for commercial use outside of a lab both beautiful and functional, IBM enlisted the aid of Goppion, the company responsible for some of the world’s most famous museum-quality display cases, along with Universal Design Studio and Map Project Office. The result is not only (arguably) a scientific first, but a stunning machine to look at.

Credit: IBM

This isn’t just about looks. That box represents a giant leap in the field.

It’s hard to overstate the importance of bringing quantum computers outside of laboratories. Some of the biggest obstacles to universal quantum computing have been engineering-related. It isn’t easy to manipulate the fabric of the universe, or at a minimum to observe it, and the machines that attempt it typically require massive infrastructure.

In order to decouple a quantum system from its laboratory lifeline, IBM had to figure out how to conduct super-cooling (necessary for quantum computation under the current paradigm) in a box. This was accomplished through painstakingly developed cryogenic engineering.

Those familiar with the company’s history might recall that, back in the 1940s, IBM’s classical computers took up an entire room. Eventually, those systems started shrinking. Now they fit on your wrist and have more computational power than all the computers from the mainframe era put together.

It sure looks like history is repeating itself:

TNW asked Bob Wisnieff, IBM’s quantum computing CTO, if today’s progress reminded him of that transition. He told us:

In some respects, quantum computing systems are at a similar stage as the mainframes of the 1960s. The big difference is the cloud access, in a couple of ways:

Imagine if everyone in the ’60s had five to ten years to explore the mainframe’s hardware and programming when it was essentially still a prototype. That’s where we are with quantum computing.

And now, in the IBM Q System One, we have a quantum system that is stable, reliable, and continuously available for commercial use in an IBM Cloud datacenter.

The IBM Q System One isn’t the most powerful quantum computer out there. It’s not even IBM’s most powerful. But it’s the first one that could, technically, be installed on-site for a commercial customer. It won’t be, however, at least not for the time being.

Instead, it can be accessed via the cloud as part of the company’s quantum computing Q initiative.

For more information about IBM’s Q System One, visit the official website. And don’t forget to check out TNW’s beginner’s guide to quantum computers.



Cloud Products & Services | HOSTING

The HOSTING Healthcare Cloud is a suite of secure, compliant cloud solutions that protects sensitive healthcare information (i.e., electronic protected health information (ePHI) and electronic medical records (EMRs)), while meeting specific compliance regulations for HIPAA, PCI DSS and SOC 2 and 3.


The HOSTING Unified Cloud is a complete unified cloud solution available on the AWS and Azure platforms. It provides customers with unprecedented flexibility to develop, run and manage custom applications on massive-scale clouds while leveraging the HOSTING suite of industry-leading managed services.


HOSTING combines creative problem-solving, unmatched technical skills and deep industry expertise to develop real-world solutions for our customers’ most complex business and IT challenges.


Ignoring the one-cloud-fits-all approach, HOSTING provides organizations with innovation, flexibility and choice. Advanced security and compliance features ensure that customers migrate to the cloud with confidence.


Your IT Dream Team whenever you need them. HOSTING partners with in-house IT teams to fine-tune strategies, manage operational details and drive business efficiencies.


HOSTING’s certified information security and compliance experts offer a complete range of compliant hosting solutions to meet the most stringent HIPAA, PCI DSS and SOC 2 and 3 obligations, all backed by HOSTING’s 100% Audit Assurance.


Some cloud service providers hang their hats on cloud infrastructure, paying scant attention to performance, availability and security. Others take infrastructure for granted and seize every opportunity to upsell their customers with a steady stream of services (whether they need them or not).

Were better than that.

HOSTING ignores the one-cloud-fits-all approach and avoids jumping on any cloud bandwagons. We provide proactive, forward-thinking cloud products and services that meet our customers’ evolving business needs and enable them to realize meaningful results. Whether an organization is new to the cloud, looking to expand its cloud presence or needs to migrate to a cloud leader that truly understands its requirements, our team helps it anticipate and respond to new service opportunities, consumer demands and compliance regulations.

From colocation to cloud hosting to managed services, HOSTING cloud architects serve as trusted business partners to our customers. We bring to bear a depth of knowledge and expertise that is unmatched in the marketplace. But we’re not about tooting our own horn. We let independent analysts like Gartner do that for us.

Simply put: our cloud pros develop custom solutions to enable organizations to streamline operational expenses, increase revenues and plan for future growth.



Cloud Hosting – Super Powerful and Fully Redundant | TMDHosting

TMDHosting, Best Cloud Web Hosting

Since its invention, cloud hosting has taken over every market and industry with endless possibilities and options that work for every business. Now you can set up any web platform without having to worry about space, speed, connectivity and much more. For TMDHosting, cloud hosting means a powerful, user-friendly, scalable and reliable hosting solution. This is precisely the level of service you will get from us.

We implement advanced technologies and deep expertise in creating cost-effective cloud solutions you can count on. We deliver cutting-edge web hosting services with private networking and multi-platform compatibility. There is no need to spend heavily on conventional web hosting with its surrounding charges and maintenance fees; we offer more convenient and reliable cloud web hosting for every business or niche.

Time is precious, and a single minute of downtime online could mean a lot. Your web visitors, especially new ones, should be able to access your website anytime and anywhere with no issues. Investing in a super-powerful cloud solution comes with huge advantages to take your business to the next level. The TMDHosting Cloud combines robust technologies with premium hardware, a low-density environment and blazingly fast SSD storage. All this, combined with three unique levels of caching, ensures extremely fast loading times for your website.

Safety is a deciding factor in evaluating a cloud provider. At TMDHosting we understand the need to keep your website and sensitive information safe from any government and third parties. All TMDHosting Cloud accounts reside in a private network, protected by hardware and software appliances. Our cloud-certified engineers work tirelessly to keep you safe 24/7/365.

With fully managed integrated caching, data mirroring and instant scaling, we go above and beyond to guarantee customer satisfaction. We ensure zero downtime, and you can bank on our services regardless of the size and functionality of your website. The TMDHosting Cloud storage is not only SSD-based but is also separated from the compute processes to provide maximum data transfer rates. This greatly speeds up your website and delivers extreme performance.

Personalize your cloud account with the open source application of your choice. We also provide cloud services for WordPress, OpenCart, SocialEngine, Drupal, Dolphin and PrestaShop. You can install virtually anything on your website, with open source application experts available to help you get the best out of it.

TMDHosting encourages everyone to become part of today’s and tomorrow’s leading websites with revolutionary cloud web hosting. At pocket-friendly prices, we provide an unbeatable offer to take full control of your business.

We have worked with thousands of clients around the world. We provide one of the best cloud-based shared hosting services in the US, with additional locations in the Netherlands, Japan, Singapore, Australia and the United Kingdom. With affordable rates, flawless customer service and state-of-the-art cloud technologies, TMDHosting is truly the best choice you can find as we work together to make a huge difference in the global spectrum.


Quantum computer | computer science

Quantum computer, device that employs properties described by quantum mechanics to enhance computations.


As early as 1959 the American physicist and Nobel laureate Richard Feynman noted that, as electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which, he suggested, might be exploited in the design of more powerful computers. In particular, quantum researchers hope to harness a phenomenon known as superposition. In the quantum mechanical world, objects do not necessarily have clearly defined states, as demonstrated by the famous experiment in which a single photon of light passing through a screen with two small slits will produce a wavelike interference pattern, or superposition of all available paths. (See wave-particle duality.) However, when one slit is closed, or a detector is used to determine which slit the photon passed through, the interference pattern disappears. In consequence, a quantum system exists in all possible states before a measurement collapses the system into one state. Harnessing this phenomenon in a computer promises to expand computational power greatly. A traditional digital computer employs binary digits, or bits, that can be in one of two states, represented as 0 and 1; thus, for example, a 4-bit computer register can hold any one of 16 (2^4) possible numbers. In contrast, a quantum bit (qubit) exists in a wavelike superposition of values from 0 to 1; thus, for example, a 4-qubit computer register can hold 16 different numbers simultaneously. In theory, a quantum computer can therefore operate on a great many values in parallel, so that a 30-qubit quantum computer would be comparable to a digital computer capable of performing 10 trillion floating-point operations per second (TFLOPS), comparable to the speed of the fastest supercomputers.
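The register arithmetic above is easy to make concrete: an n-qubit state is a vector of 2^n amplitudes, and applying a Hadamard gate to each qubit of the all-zeros state spreads the amplitude uniformly across all 16 basis states. A small illustrative simulation in plain Python (not tied to any quantum SDK):

```python
import math

# A 4-qubit register is a vector of 2**4 = 16 amplitudes.
n = 4
state = [0.0] * (2 ** n)
state[0] = 1.0  # start in |0000>

def hadamard(state, qubit):
    """Apply the Hadamard gate to one qubit of a state vector."""
    h = 1 / math.sqrt(2)
    new = list(state)
    for i in range(len(state)):
        if i & (1 << qubit) == 0:       # pair basis states differing in this qubit
            j = i | (1 << qubit)
            new[i] = h * (state[i] + state[j])
            new[j] = h * (state[i] - state[j])
    return new

for q in range(n):
    state = hadamard(state, q)

# Every one of the 16 basis states now carries amplitude (1/sqrt(2))**4 = 0.25,
# i.e. the register "holds" all 16 numbers simultaneously.
```

The catch, as the article notes, is that measurement collapses this superposition to a single outcome, which is why quantum algorithms must be cleverly designed rather than naively "parallel."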

During the 1980s and 90s the theory of quantum computers advanced considerably beyond Feynman's early speculations. In 1985 David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer, and in 1994 Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits (although many more qubits would be necessary for factoring large numbers in a reasonable time). When a practical quantum computer is built, it will break current encryption schemes based on multiplying two large primes; in compensation, quantum mechanical effects offer a new method of secure communication known as quantum encryption. However, actually building a useful quantum computer has proved difficult. Although the potential of quantum computers is enormous, the requirements are equally stringent. A quantum computer must maintain coherence between its qubits (known as quantum entanglement) long enough to perform an algorithm; because of nearly inevitable interactions with the environment (decoherence), practical methods of detecting and correcting errors need to be devised; and, finally, since measuring a quantum system disturbs its state, reliable methods of extracting information must be developed.
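The reason Shor's algorithm threatens RSA-style encryption is that factoring N reduces to finding the period r of a^x mod N, and period finding is exactly the step a quantum computer does exponentially faster. A minimal classical sketch of that reduction (the period search here is brute force, which is the part the quantum Fourier transform replaces):

```python
from math import gcd

def period(a, n):
    """Classical, exponential-time period finding: the smallest r > 0
    with a**r % n == 1. This is the step Shor's algorithm speeds up."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Shor's classical post-processing: an even period r of a (mod n)
    yields nontrivial factors via gcd, unless a**(r//2) is trivial."""
    r = period(a, n)
    if r % 2:
        return None  # odd period: retry with a different base a
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return (p, q) if 1 < p < n else None

print(shor_factor(15, 7))  # base a=7 has period 4 mod 15 -> factors (3, 5)
```

For a 2048-bit RSA modulus the loop in period() would take longer than the age of the universe; the quantum version finds r in polynomial time, which is why such a machine would break current public-key schemes.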

Plans for building quantum computers have been proposed; although several demonstrate the fundamental principles, none is beyond the experimental stage. Three of the most promising approaches are presented below: nuclear magnetic resonance (NMR), ion traps, and quantum dots.

In 1998 Isaac Chuang of the Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley created the first quantum computer (2-qubit) that could be loaded with data and output a solution. Although their system was coherent for only a few nanoseconds and trivial from the perspective of solving meaningful problems, it demonstrated the principles of quantum computation. Rather than trying to isolate a few subatomic particles, they dissolved a large number of chloroform molecules (CHCl₃) in water at room temperature and applied a magnetic field to orient the spins of the carbon and hydrogen nuclei in the chloroform. (Because ordinary carbon has no magnetic spin, their solution used an isotope, carbon-13.) A spin parallel to the external magnetic field could then be interpreted as a 1 and an antiparallel spin as 0, and the hydrogen nuclei and carbon-13 nuclei could be treated collectively as a 2-qubit system. In addition to the external magnetic field, radio frequency pulses were applied to cause spin states to flip, thereby creating superimposed parallel and antiparallel states. Further pulses were applied to execute a simple algorithm and to examine the system's final state. This type of quantum computer can be extended by using molecules with more individually addressable nuclei. In fact, in March 2000 Emanuel Knill, Raymond Laflamme, and Rudy Martinez of Los Alamos and Ching-Hua Tseng of MIT announced that they had created a 7-qubit quantum computer using trans-crotonic acid. However, many researchers are skeptical about extending magnetic techniques much beyond 10 to 15 qubits because of diminishing coherence among the nuclei.
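Mathematically, each radio-frequency pulse in such an experiment acts as a unitary matrix applied to the 2-qubit statevector. A minimal sketch (the matrices stand in for idealized pulse operations; real NMR pulse sequences are far messier) shows a Hadamard on the first qubit followed by a controlled-NOT producing an entangled state:

```python
from math import sqrt

# 2-qubit statevector, basis order |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply(matrix, state):
    """Apply a 4x4 unitary (as nested lists) to the statevector."""
    return [sum(matrix[i][j] * state[j] for j in range(4)) for i in range(4)]

h = 1 / sqrt(2)
H_ON_FIRST = [[h, 0, h, 0],   # Hadamard on the first qubit:
              [0, h, 0, h],   # puts it into an equal superposition
              [h, 0, -h, 0],
              [0, h, 0, -h]]
CNOT = [[1, 0, 0, 0],         # flip the second qubit
        [0, 1, 0, 0],         # whenever the first qubit is 1
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = apply(CNOT, apply(H_ON_FIRST, state))
print([round(a, 3) for a in state])  # -> [0.707, 0.0, 0.0, 0.707]
```

The result is the Bell state (|00> + |11>)/sqrt(2): the two qubits are entangled, so measuring one instantly fixes the other, which is the resource the coherence requirements in the previous paragraph exist to protect.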

Just one week before the announcement of a 7-qubit quantum computer, physicist David Wineland and colleagues at the U.S. National Institute of Standards and Technology (NIST) announced that they had created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. After confining the ions in a linear arrangement, a laser cooled the particles almost to absolute zero and synchronized their spin states. Finally, a laser was used to entangle the particles, creating a superposition of both spin-up and spin-down states simultaneously for all four ions. Again, this approach demonstrated basic principles of quantum computing, but scaling up the technique to practical dimensions remains problematic.

Quantum computers based on semiconductor technology are yet another possibility. In a common approach a discrete number of free electrons (qubits) reside within extremely small regions, known as quantum dots, and in one of two spin states, interpreted as 0 and 1. Although prone to decoherence, such quantum computers build on well-established, solid-state techniques and offer the prospect of readily applying integrated circuit scaling technology. In addition, large ensembles of identical quantum dots could potentially be manufactured on a single silicon chip. The chip operates in an external magnetic field that controls electron spin states, while neighbouring electrons are weakly coupled (entangled) through quantum mechanical effects. An array of superimposed wire electrodes allows individual quantum dots to be addressed, algorithms executed, and results deduced. Such a system necessarily must be operated at temperatures near absolute zero to minimize environmental decoherence, but it has the potential to incorporate very large numbers of qubits.


IBM unveils its first commercial quantum computer

At CES, IBM today announced its first commercial quantum computer for use outside of the lab. The 20-qubit system combines into a single package the quantum and classical computing parts it takes to use a machine like this for research and business applications. That package, the IBM Q System One, is still huge, of course, but it includes everything a company would need to get started with its quantum computing experiments, including all the machinery necessary to cool the quantum computing hardware.

While IBM describes it as "the first fully integrated universal quantum computing system designed for scientific and commercial use," it's worth stressing that a 20-qubit machine is nowhere near powerful enough for most of the commercial applications that people envision for a quantum computer with more qubits, and qubits that remain useful for more than 100 microseconds. It's no surprise, then, that IBM stresses that this is a first attempt and that the systems are "designed to one day tackle problems that are currently seen as too complex and exponential in nature for classical systems to handle." Right now, we're not quite there yet, but the company also notes that these systems are upgradable (and easy to maintain).

"The IBM Q System One is a major step forward in the commercialization of quantum computing," said Arvind Krishna, senior vice president of Hybrid Cloud and director of IBM Research. "This new system is critical in expanding quantum computing beyond the walls of the research lab as we work to develop practical quantum applications for business and science."

More than anything, though, IBM seems to be proud of the design of the Q systems. In a move that harkens back to Cray's supercomputers with their expensive couches, IBM worked with design studios Map Project Office and Universal Design Studio, as well as Goppion, the company that has built, among other things, the display cases that house the U.K.'s crown jewels and the Mona Lisa. IBM clearly thinks of the Q system as a piece of art and, indeed, the final result is quite stunning. It's a nine-foot-tall and nine-foot-wide airtight box, with the quantum computing "chandelier" hanging in the middle and all of the parts neatly hidden away.

If you want to buy yourself a quantum computer, you'll have to work with IBM, though. It won't be available with free two-day shipping on Amazon anytime soon.

In related news, IBM also announced the IBM Q Network, a partnership with ExxonMobil and research labs like CERN and Fermilab that aims to build a community that brings together the business and research interests to explore use cases for quantum computing. The organizations that partner with IBM will get access to its quantum software and cloud-based quantum computing systems.
