
Cloud computing at Ifes, IFs, and hospitals | RNP

In the second session on cloud computing at the III RNP Forum, representatives from hospitals, Federal Institutions of Higher Education (Institutos Federais de Ensino Superior, IFES), and Federal Institutes of Education, Science and Technology (IFs) spoke about what they expect from cloud computing resources.

Gabriel Pereira da Silva, professor and Superintendent of Information Technology and Communication at the Federal University of Rio de Janeiro (UFRJ), expressed the universities' desire: "We want a cloud that will solve all our problems, where we can put all our systems."

One of the cloud applications that UFRJ offers its community is OJS (Open Journal Systems), an electronic journal publishing service for the research community. For Gabriel, the Capes (Coordination for the Improvement of Higher Education Personnel) Journals Portal is an important resource, but it overlooks the increasingly electronic national production of research and journals. "Therefore, we offer OJS," he said.

Carlos Thiago Garantizado, from the Federal Institute of Amazonas (IFAM), presented the IFs' perspective on deploying cloud services and applications. "We work with infrastructure, platform, and software as services. The biggest challenge is to provide security. Not only to provide it, but to convey this security to the user," he affirmed.

Presenting a SWOT matrix of cloud deployment in the IFs, he highlighted the Federated Academic Community (CAFe) service and the teams' technical expertise as assets for the institutes' integration.

Moderated by Adenilson Raniery Pontes, from the Museu Paraense Emílio Goeldi (MPEG), the panel also included Marco Antonio Gutierrez, who heads the Computing Service and the Medical Informatics Laboratory of the Heart Institute (Instituto do Coração, Incor).

Gutierrez noted that health information systems must be made available very quickly. According to him, open cloud solutions cannot be used in the hospital sector: "We need to ensure the confidentiality and secrecy of the information."

At the end of his talk, the executive explained the economic constraints the healthcare industry faces regarding cloud computing. "Investment in technology within hospitals is still seen as a cost and not as an investment. Therefore, we cannot evolve toward private cloud solutions, due to financial issues," he stated.


MobileCoin: A New Cryptocurrency From Signal Creator Moxie …

In the early bitcoin years, proponents promised that you would soon be able to pay for anything and everything with cryptocurrency. Order pizza! Buy Etsy trinkets! Use a bitcoin ATM! While PayPal had existed for more than a decade, frictionless, social payment platforms like Venmo were just taking off, and cryptocurrency seemed like a legitimate way for digital transactions to evolve.

It didn't happen. Cryptocurrency remains confusing and challenging for the average person to acquire and manage, much less sell. And the protocols that underlie bitcoin and other mainstream cryptocurrencies like ethereum suffer significant scalability and transaction-bottleneck issues. Visa currently processes about 3,674 transactions per second; bitcoin's network, at best, might be able to process seven per second.

But now the creator of the dead-simple end-to-end encrypted messaging app Signal, Moxie Marlinspike, is on a mission to overcome those limitations, and to create a streamlined digital currency that's private, easy to use, and allows for quick transactions from any device. And while it may feel like the last thing the world needs is yet another cryptocurrency, Marlinspike's track record with Signal, and the organization behind it, Open Whisper Systems, makes this a project worth watching.

The currency Marlinspike has been working on as technical advisor for the last four months, alongside technologist Joshua Goldbard, is MobileCoin. The two based it on the open-source Stellar Consensus Protocol, an alternative payment platform that underlies systems like an inter-bank payment network run by IBM in the South Pacific and the low-fee international money-transfer service Tempo in Europe.

'Usability is the biggest challenge with cryptocurrency today.'

Signal Creator Moxie Marlinspike

The Stellar blockchain is also generally regarded as faster and more efficient than its predecessors. On Wednesday, the mobile messaging service Kik announced that it will move its Kin cryptocurrency platform from Ethereum to Stellar. "We've been using Ethereum to date, and to be honest I call it the dial-up era of blockchain," CEO Ted Livingston said.

MobileCoin wants to leverage this architecture to combine simplicity with real privacy protections and resilience against attacks. The ultimate goal: to make MobileCoin as intuitive as any other payment system.

That vision mirrors the animating purpose of Signal, which was developed to make robust end-to-end encrypted communication as easy and straightforward as less secure options, a simple experience that belies the complex cryptographic communication protocols that enable it.

"I think usability is the biggest challenge with cryptocurrency today," says Marlinspike. "The innovations I want to see are ones that make cryptocurrency deployable in normal environments, without sacrificing the properties that distinguish cryptocurrency from existing payment mechanisms."

Usability efforts for older generation cryptocurrency protocols, like bitcoin, have largely been left to services like Coinbase, which centralize everything from currency exchange to your wallet, key management, and processing transactions. These platforms make actually using cryptocurrency more realistic for the average person, but they also consolidate mechanisms that are meant to be kept separate in the private and decentralized concept of cryptocurrency. They generally detail extensive privacy and security protections, but they do require users to trust both their intentions and implementation.

By contrast, the idea of MobileCoin is to build a system that hides everything from everyone, leaving fewer (or theoretically no) opportunities for abuse.

Ideally, there would be a way to fix the structural problems of existing cryptocurrencies, rather than creating another new offering. But Marlinspike and Goldbard concluded that the only way to orient a cryptocurrency around user needs was to start from scratch, and architect everything with that "target user experience" in mind.

To that end, MobileCoin delegates all the complicated and processing-intensive work of participating in a blockchain ledger and validating transactions to nodes: servers with constant connectivity that store and work on a fully updated copy of the currency's blockchain. The nodes can then provide software services to users, like apps that seamlessly integrate easy and quick MobileCoin transactions. The nodes also handle key management for users, so the public, and particularly the private, numeric sequences that encrypt each person's transactions are stored and used by the node. But crucially, MobileCoin is designed so that node operators can never directly access users' private keys.

'If you can't look at the ledger, how can you cheat it?'

Joshua Goldbard, MobileCoin

This is where the special features of MobileCoin come in. The currency is designed to utilize an Intel processor component known as Software Guard Extensions, or a "secure enclave." SGX is a sequestered portion of a processor that runs code like any other, but the software inside it can't be accessed or changed by a device's broader operating system. Computers can still check that an enclave is running the right software to validate it before connecting, but neither MobileCoin users nor node administrators can decrypt and view the enclave.

For MobileCoin, the enclaves in all of the nodes of the network hide the currency's indelible ledger from view. Users' private keys are stored and shielded in the enclave, too.

"If you put the cryptocurrency inside of the secure enclave, then people can run the nodes without seeing whats happening inside them," Goldbard says. "If you cant look at the ledger, how can you cheat it?"

Marlinspike first experimented with SGX for Signal as a workaround so users can find people they know on Signal through their address books without exposing all of that data.

Secure enclaves create some technical challenges, because they have limited processing capacity. But MobileCoin is designed with efficiency in mind. The system does as much data processing as possible outside the enclave, and only uses SGX for sensitive computing that needs to be shielded. And not needing to trust the nodesbecause sensitive data isn't exposed on themmeans that more can happen off of a user's device without sacrificing privacy, making transactions quick and easy on mobile devices.

"MobileCoin is designed to be deployable in normal resource-constrained environments like mobile devices, and to deliver a simple user experience along with privacy and security," Marlinspike says. "The design gives you the benefits of server assistance without the downsides of having to trust a server to act appropriately and not be hacked.

The platform has other protections layered with SGX as well. Even if someone compromised a MobileCoin enclave and could view the transaction ledger, one-time addresses and special one-time signatures for each transaction would still prevent an attacker from being able to trace and link events. And a privacy bonus of the Stellar Consensus Protocol is that the nodes don't need to store a full transaction history in the blockchain; they can discard most data after each payment is completed. These components make MobileCoin more resistant to surveillance, whether it's coming from a government or a criminal who wants to track and extort users.

There are lots of potential applications for MobileCoin, but Goldbard and Marlinspike envision it first as an integration in chat apps like Signal or WhatsApp. Here's how it would work in practice: To start using MobileCoin, you would generate a public and private key, plus a recovery PIN. Then you would set up your account with an app that incorporates MobileCoin. The app would validate the software running in its service's node, establish an encrypted communication channel to the enclave, and then send your keys and the short, easy-to-remember recovery PIN that you'll use to access your MobileCoin, like a smartphone lock passcode.

To send MobileCoin to your friend Brian within a service that both of you use, your app would look up his public key, generate a one-time key and signature to use for the transaction, and send the transaction to the app's MobileCoin node. The node would sync and validate the transaction, update the ledger, and check the one-time key and signature to prevent spoofed double-spending. At this point Brian's MobileCoin node would take over, receiving and validating the transaction and communicating with Brian's app to generate the one-time private key that will allow Brian to receive the payment. And then Brian gets a notification that you paid him. The messaging app (or whatever service you're both using) doubles as a wallet for each of you.
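As a rough sketch only, the client-side sequence described above might look like the following. Every name and structure here is invented to mirror the prose, not taken from MobileCoin's actual design or SDK:

```python
import hashlib
import os

def one_time_key() -> str:
    """Stand-in for the one-time address generated per transaction."""
    return os.urandom(16).hex()

def send_payment(ledger: dict, sender_key: str, recipient_pub: str, amount: int) -> str:
    """Hypothetical flow: generate one-time credentials, submit to the node."""
    otk = one_time_key()
    # Stand-in one-time signature derived from the key and sender.
    sig = hashlib.sha256((otk + sender_key).encode()).hexdigest()
    if otk in ledger:
        raise ValueError("one-time key reused: possible double-spend")
    # The node validates the transaction and records it in the ledger.
    ledger[otk] = {"to": recipient_pub, "amount": amount, "sig": sig}
    return otk
```

In the real system these steps would run against the node's secure enclave; here a plain dict stands in for the ledger purely to make the sequence concrete.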

It's a complicated process to wade through. The point of MobileCoin, though, is that you and Brian don't have to worry about any of it. The complicated parts all take place in the background.

The MobileCoin site, where developers looking to adopt the cryptocurrency will ultimately be able to access the software development kit, currently houses a white paper describing how MobileCoin works in more detail. But Goldbard says that the currency is still six months to a year from release, while he and Marlinspike refine the platform to eliminate potential problems, like the possibility that secure enclaves can inadvertently leak data.

That means there are still plenty of questions to be answered, including one big one: whether MobileCoin will be able to cut through all the noise and hype of the cryptocurrency community to actually be adopted by mainstream apps that could put it in everyone's hands. Currencies, after all, need a critical mass of people to not just be able to use them, but to agree on their worth.

And though speculation has driven bitcoin to all-time-high valuations, most cryptocurrencies don't end up capturing much value, languishing instead in far-flung corners of the internet. Here again, though, MobileCoin's creators hope to emulate Signal. End-to-end encryption was once a fringe feature; then WhatsApp gave it to a billion people at once using the Signal Protocol.

"Nobody actually transacts in cryptocurrency," Goldbard says. "So making something that people can actually use is our first goal. And then we want to find additional ways that people can implement it over time. But initially all we want is to make it so people can actually complete transactions."

If it works, the project will give hope to people who once believed cryptocurrency could truly replace cash in modern societyeven if you're only buying a pizza.


security – Fundamental difference between Hashing and …

Well, you could look it up in Wikipedia... But since you want an explanation, I'll do my best here:

Hash functions provide a mapping between an arbitrary-length input and a (usually) fixed-length (or smaller) output. It can be anything from a simple crc32 to a full-blown cryptographic hash function such as MD5 or SHA1/2/256/512. The point is that there's a one-way mapping going on. It's always a many:1 mapping (meaning there will always be collisions), since every function produces a smaller output than it's capable of inputting (if you feed every possible 1 MB file into MD5, you'll get a ton of collisions).
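As a quick illustration of that fixed-length property (using Python's hashlib here, which is not part of the answer itself):

```python
import hashlib

# Inputs of wildly different lengths all map to a digest of the same size,
# so collisions must exist: there are more possible inputs than outputs.
for msg in [b"a", b"hello world", b"x" * 1_000_000]:
    digest = hashlib.sha256(msg).hexdigest()
    print(len(digest), digest[:16])  # always 64 hex characters (32 bytes)
```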

The reason they are hard (or practically impossible) to reverse is because of how they work internally. Most cryptographic hash functions iterate over the input set many times to produce the output. So if we look at each fixed-length chunk of input (which is algorithm-dependent), the hash function will call that the current state. It will then iterate over the state, change it to a new one, and use that as feedback into itself (MD5 does this 64 times for each 512-bit chunk of data). It then somehow combines the resultant states from all these iterations back together to form the resultant hash.

Now, if you wanted to decode the hash, you'd first need to figure out how to split the given hash into its iterated states (1 possibility for inputs smaller than the size of a chunk of data, many for larger inputs). Then you'd need to reverse the iteration for each state. Now, to explain why this is VERY hard, imagine trying to deduce a and b from the following formula: 10 = a + b. There are 10 positive combinations of a and b that can work. Now loop over that a bunch of times: tmp = a + b; a = b; b = tmp. For 64 iterations, you'd have over 10^64 possibilities to try. And that's just a simple addition where some state is preserved from iteration to iteration. Real hash functions do a lot more than 1 operation (MD5 does about 15 operations on 4 state variables). And since the next iteration depends on the state of the previous and the previous is destroyed in creating the current state, it's all but impossible to determine the input state that led to a given output state (for each iteration no less). Combine that, with the large number of possibilities involved, and decoding even an MD5 will take a near infinite (but not infinite) amount of resources. So many resources that it's actually significantly cheaper to brute-force the hash if you have an idea of the size of the input (for smaller inputs) than it is to even try to decode the hash.
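The toy feedback loop in the paragraph above can be written out directly (a sketch only; real hash compression functions do far more per round):

```python
def feedback(a: int, b: int, rounds: int) -> tuple[int, int]:
    """Each round overwrites the previous state, exactly like
    tmp = a + b; a = b; b = tmp in the text above."""
    for _ in range(rounds):
        a, b = b, a + b
    return a, b

# The intermediate states are destroyed each round, so reversing the final
# pair means guessing every prior state; the possibilities multiply per round.
print(feedback(1, 2, 3))  # → (5, 8)
```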

Encryption functions, by contrast, provide a 1:1 mapping between an arbitrary-length input and output. And they are always reversible. The important thing to note is that it's reversible using some method. And it's always 1:1 for a given key. Now, there are multiple input:key pairs that might generate the same output (in fact there usually are, depending on the encryption function). Good encrypted data is indistinguishable from random noise. This is different from a good hash output, which is always of a consistent format.

Use a hash function when you want to compare a value but can't store the plain representation (for any number of reasons). Passwords should fit this use-case very well since you don't want to store them plain-text for security reasons (and shouldn't). But what if you wanted to check a filesystem for pirated music files? It would be impractical to store 3 mb per music file. So instead, take the hash of the file, and store that (md5 would store 16 bytes instead of 3mb). That way, you just hash each file and compare to the stored database of hashes (This doesn't work as well in practice because of re-encoding, changing file headers, etc, but it's an example use-case).

Use a hash function when you're checking validity of input data. That's what they are designed for. If you have 2 pieces of input, and want to check to see if they are the same, run both through a hash function. The probability of a collision is astronomically low for small input sizes (assuming a good hash function). That's why it's recommended for passwords. For passwords up to 32 characters, md5 has 4 times the output space. SHA1 has 6 times the output space (approximately). SHA512 has about 16 times the output space. You don't really care what the password was, you care if it's the same as the one that was stored. That's why you should use hashes for passwords.

Use encryption whenever you need to get the input data back out. Notice the word need. If you're storing credit card numbers, you need to get them back out at some point, but don't want to store them plain text. So instead, store the encrypted version and keep the key as safe as possible.

Hash functions are also great for signing data. For example, if you're using HMAC, you sign a piece of data by taking a hash of the data concatenated with a known but not transmitted value (a secret value). So, you send the plain-text and the HMAC hash. Then, the receiver simply hashes the submitted data with the known value and checks to see if it matches the transmitted HMAC. If it's the same, you know it wasn't tampered with by a party without the secret value. This is commonly used in secure cookie systems by HTTP frameworks, as well as in message transmission of data over HTTP where you want some assurance of integrity in the data.
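A compact example of the HMAC pattern just described, using Python's standard hmac module (the secret and payload are of course made up):

```python
import hashlib
import hmac

SECRET = b"server-side secret"        # known to both ends, never transmitted
payload = b"user_id=42&role=editor"

# The sender transmits the payload plus this tag.
tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(data: bytes, received_tag: str) -> bool:
    """Receiver recomputes the tag; compare_digest avoids timing leaks."""
    expected = hmac.new(SECRET, data, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)
```

A tampered payload fails verification because an attacker cannot recompute the tag without the secret.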

A key feature of cryptographic hash functions is that they should be very fast to create, and very difficult/slow to reverse (so much so that it's practically impossible). This poses a problem with passwords. If you store sha512(password), you're not doing a thing to guard against rainbow tables or brute force attacks. Remember, the hash function was designed for speed. So it's trivial for an attacker to just run a dictionary through the hash function and test each result.

Adding a salt helps matters since it adds a bit of unknown data to the hash. So instead of finding anything that matches md5(foo), they need to find something that when added to the known salt produces md5(foo.salt) (which is very much harder to do). But it still doesn't solve the speed problem since if they know the salt it's just a matter of running the dictionary through.

So, there are ways of dealing with this. One popular method is called key strengthening (or key stretching). Basically, you iterate over a hash many times (thousands usually). This does two things. First, it slows down the runtime of the hashing algorithm significantly. Second, if implemented right (passing the input and salt back in on each iteration) actually increases the entropy (available space) for the output, reducing the chances of collisions. A trivial implementation is:
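The trivial implementation the answer refers to is missing here; a sketch of what it describes (passing the password and salt back in on every round) looks like this in Python:

```python
import hashlib

def stretched_hash(password: str, salt: str, rounds: int = 5000) -> str:
    """Key stretching: many rounds, re-introducing password and salt each time."""
    digest = b""
    for _ in range(rounds):
        digest = hashlib.sha512(digest + password.encode() + salt.encode()).digest()
    return digest.hex()
```

For real systems, prefer the standard constructions such as PBKDF2 or bcrypt over a hand-rolled loop; this sketch only illustrates the mechanism.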

There are other, more standard implementations such as PBKDF2, BCrypt. But this technique is used by quite a few security related systems (such as PGP, WPA, Apache and OpenSSL).

The bottom line, hash(password) is not good enough. hash(password + salt) is better, but still not good enough... Use a stretched hash mechanism to produce your password hashes...

Do not under any circumstances feed the output of one hash directly back into the hash function:
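The code block that originally accompanied this warning is not present; the anti-pattern it describes is, roughly (Python used for illustration):

```python
import hashlib

# ANTI-PATTERN: each round consumes only the previous digest, so a
# collision in any round becomes a collision in every later round.
digest = hashlib.sha1(b"password").hexdigest()
for _ in range(1000):
    digest = hashlib.sha1(digest.encode()).hexdigest()
```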

The reason for this has to do with collisions. Remember that all hash functions have collisions because the possible output space (the number of possible outputs) is smaller than the input space. To see why, let's look at what happens. To preface this, let's make the assumption that there's a 0.001% chance of collision from sha1() (it's much lower in reality, but for demonstration purposes).

Now, hash1 has a probability of collision of 0.001%. But when we do the next hash2 = sha1(hash1);, all collisions of hash1 automatically become collisions of hash2. So now, we have hash1's rate at 0.001%, and the 2nd sha1() call adds to that. So now, hash2 has a probability of collision of 0.002%. That's twice as many chances! Each iteration will add another 0.001% chance of collision to the result. So, with 1000 iterations, the chance of collision jumped from a trivial 0.001% to 1%. Now, the degradation is linear, and the real probabilities are far smaller, but the effect is the same (an estimate of the chance of a single collision with md5 is about 1/(2^128), or roughly 1/(3×10^38); while that seems small, thanks to the birthday attack it's not really as small as it seems).

Instead, by re-appending the salt and password each time, you're re-introducing data back into the hash function. So any collisions of any particular round are no longer collisions of the next round. So:
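The corrected loop being described, again sketched in Python, with the salt and password re-appended on every round:

```python
import hashlib

password, salt = b"correct horse", b"per-user-random-salt"

digest = b""
for _ in range(1000):
    # Fresh input data every round, so one round's collision
    # does not propagate to the next.
    digest = hashlib.sha512(digest + password + salt).digest()
```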

Hashing that way has the same chance of collision as a single application of the native sha512 function, which is what you want. Use that approach instead.


New silicon structure opens the gate to quantum computers

In a major step toward making a quantum computer using everyday materials, a team led by researchers at Princeton University has constructed a key piece of silicon hardware capable of controlling quantum behavior between two electrons with extremely high precision. The study was published Dec. 7 in the journal Science.

The team constructed a gate that controls interactions between the electrons in a way that allows them to act as the quantum bits of information, or qubits, necessary for quantum computing. The demonstration of this nearly error-free, two-qubit gate is an important early step in building a more complex quantum computing device from silicon, the same material used in conventional computers and smartphones.

"We knew we needed to get this experiment to work if silicon-based technology was going to have a future in terms of scaling up and building a quantum computer," said Jason Petta, a professor of physics at Princeton University. "The creation of this high-fidelity two-qubit gate opens the door to larger scale experiments."

Silicon-based devices are likely to be less expensive and easier to manufacture than other technologies for achieving a quantum computer. Although other research groups and companies have announced quantum devices containing 50 or more qubits, those systems require exotic materials such as superconductors or charged atoms held in place by lasers.

Quantum computers can solve problems that are inaccessible with conventional computers. The devices may be able to factor extremely large numbers or find the optimal solutions for complex problems. They could also help researchers understand the physical properties of extremely small particles such as atoms and molecules, leading to advances in areas such as materials science and drug discovery.

The two-qubit silicon-based gate consists of two electrons (blue balls with arrows) in a layer of silicon (Si). By applying voltages through aluminum oxide (Al2O3) wires (red and green), the researchers trapped the electrons and coaxed quantum behaviors that transform their spin properties into quantum bits of information, or qubits. The image on the left shows a scanning electron micrograph of the device, which is about 200 nanometers (nm) across. The image on the right is a diagram of the device from the side.

Image courtesy of Science/AAAS

Building a quantum computer requires researchers to create qubits and couple them to each other with high fidelity. Silicon-based quantum devices use a quantum property of electrons called "spin" to encode information. The spin can point either up or down in a manner analogous to the north and south poles of a magnet. In contrast, conventional computers work by manipulating the electron's negative charge.

Achieving a high-performance, spin-based quantum device has been hampered by the fragility of spin states: they readily flip from up to down, or vice versa, unless they can be isolated in a very pure environment. By building the silicon quantum devices in Princeton's Quantum Device Nanofabrication Laboratory, the researchers were able to keep the spins coherent, that is, in their quantum states, for relatively long periods of time.

To construct the two-qubit gate, the researchers layered tiny aluminum wires onto a highly ordered silicon crystal. The wires deliver voltages that trap two single electrons, separated by an energy barrier, in a well-like structure called a double quantum dot.

By temporarily lowering the energy barrier, the researchers allow the electrons to share quantum information, creating a special quantum state called entanglement. These trapped and entangled electrons are now ready for use as qubits, which are like conventional computer bits but with superpowers: while a conventional bit can represent a zero or a 1, each qubit can be simultaneously a zero and a 1, greatly expanding the number of possible permutations that can be compared instantaneously.

"The challenge is that its very difficult to build artificial structures small enough to trap and control single electrons without destroying their long storage times," said David Zajac, a graduate student in physics at Princeton and first-author on the study. "This is the first demonstration of entanglement between two electron spins in silicon, a material known for providing one of the cleanest environments for electron spin states."

The researchers demonstrated that they can use the first qubit to control the second qubit, signifying that the structure functioned as a controlled NOT (CNOT) gate, which is the quantum version of a commonly used computer circuit component. The researchers control the behavior of the first qubit by applying a magnetic field. The gate produces a result based on the state of the first qubit: If the first spin is pointed up, then the second qubit's spin will flip, but if the first spin is down, the second one will not flip.
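The gate behavior described above is the controlled-NOT truth table. Restricted to classical basis states (ignoring superposition), it can be sketched as:

```python
# CNOT on basis states |control, target>: the target flips
# if and only if the control qubit is 1 (spin up, in the article's terms).
def cnot(control: int, target: int) -> tuple[int, int]:
    return control, target ^ control

for c, t in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((c, t), "->", cnot(c, t))
```

The quantum version applies this same rule to superpositions of basis states as well, which is what lets the two-qubit gate produce entanglement.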

"The gate is basically saying it is only going to do something to one particle if the other particle is in a certain configuration," Petta said. "What happens to one particle depends on the other particle."

The researchers showed that they can maintain the electron spins in their quantum states with a fidelity exceeding 99 percent and that the gate works reliably to flip the spin of the second qubit about 75 percent of the time. The technology has the potential to scale to more qubits with even lower error rates, according to the researchers.

"This work stands out in a worldwide race to demonstrate the CNOT gate, a fundamental building block for quantum computation, in silicon-based qubits," said HongWen Jiang, a professor of physics and astronomy at the University of California-Los Angeles."The error rate for the two-qubit operation is unambiguously benchmarked. It is particularly impressive that this extraordinarily difficult experiment, which requires a sophisticated device fabrication and an exquisite control of quantum states, is done in a university lab consisting of only a few researchers."

Additional researchers at Princeton are graduate student Felix Borjans and associate research scholar Anthony Sigillito. The team included input on the theory aspects of the work by Jacob Taylor, a professor at the Joint Quantum Institute and Joint Center for Quantum Information and Computer Science at the National Institute of Standards and Technology and the University of Maryland, and Maximilian Russ and Guido Burkard at the University of Konstanz in Germany.

Research was sponsored by U.S. Army Research Office grant W911NF-15-1-0149, the Gordon and Betty Moore Foundation's EPiQS Initiative through grant GBMF4535, and National Science Foundation grant DMR-1409556. Devices were fabricated in the Princeton University Quantum Device Nanofabrication Laboratory.

The study, "Resonantly driven CNOT gate for electron spins," by David M. Zajac, Anthony J. Sigillito, Maximilian Russ, Felix Borjans, Jacob M. Taylor, Guido Burkard and Jason R. Petta was published online in the journal Science on Dec. 7, 2017.


Best Cloud Storage Providers 2017 Reviewed & Rated

No more email attachments

Large email attachments are a pain; it's hard to know whether they actually arrive safely at the recipient. With cloud storage, a file can be shared via a public link, and that link can then be sent by email so the recipient can download the file.

No more awkward file names like final, final-1, final-final

All file versions of a document, presentation, spreadsheet, or any file for that matter are stored in the cloud. All parties with access to that file or folder will always see the current version, no matter who edited it last. What used to be an endless chain of new documents, whose filename creativity decreased with each version, is now combined into one single file.

Knowing that all files are always there, aka file syncing

No matter if you're at the airport, in a café, a co-working space, traveling, at the office, or at home, all crucial files are always available, and you don't have to remember to transfer them onto a USB thumb drive or send yourself an email, because files are synced through cloud storage software across your (mobile) devices.

Cloud File Storage for Backups

Cloud file storage services can serve as a quick backup because multiple versions of a single file are available both in the cloud and on other devices. So if a computer meltdown occurs, having a subscription with a cloud storage service allows for quick restoration of lost files.

Online Storage for Collaboration

Certainly one of the major benefits of services like Dropbox is collaboration with a virtual team spread across different continents. All members of a team can have access to a shared folder and upload and download data, so everybody is on the same page with the project status. Even if somebody on the team makes a mistake, most services offer file versioning which means the admin can go back and recover deleted files or even older versions of the same file.

Best Online Storage for Offline access?

Even though they are called cloud storage providers, with the above-mentioned file synchronisation, all files are also available offline. So working on a project with a weak or nonexistent Internet connection is possible. Once re-connected, changes are uploaded automatically.



Microsoft Takes Path Less Traveled to Build a Quantum …

In the race to commercialize a new type of powerful computer, Microsoft Corp. has just pulled up to the starting line with a slick-looking set of wheels. There's just one problem: it doesn't have an engine, at least not yet.

The Redmond, Washington-based tech giant is competing with Alphabet Inc.'s Google, International Business Machines Corp., and a clutch of small, specialized companies to develop quantum computers: machines that, in theory, will be many times more powerful than existing computers by bending the laws of physics.

Microsoft says it has a different approach that will make its technology less error-prone and more suitable for commercial use. If it works. On Monday, the company unveiled a new programming language called Q#, pronounced "Q sharp," and tools that help coders craft software for quantum computers. Microsoft is also releasing simulators that will let programmers test that software on a traditional desktop computer or through its Azure cloud-computing service.

The machines are one of the advanced technologies, along with artificial intelligence and augmented reality, that Microsoft Chief Executive Officer Satya Nadella considers crucial to the future of his company. Microsoft, like IBM and Google, will most likely rent computing time on these quantum machines through the internet as a service.

D-Wave Systems Inc. in 2011 became the first company to sell a quantum computer, although its technology has been controversial and can only tackle a certain subset of mathematical problems. Google and IBM have produced machines that are thought to be close to achieving quantum supremacy: the ability to tackle a problem too complex to solve on any standard supercomputer. IBM and startup Rigetti Computing also have software for their machines.

Microsoft, in contrast, is still trying to build a working machine. It is pursuing a novel design based on controlling an elusive particle called a Majorana fermion that no one was sure even existed a few years ago. Engineers are close to being able to control the Majorana fermion in a way that will enable them to perform calculations, Todd Holmdahl, head of Microsoft's quantum computing efforts, said in an interview. Holmdahl, who led development of the Xbox and the company's HoloLens goggles, said Microsoft will have a quantum computer on the market within five years.

"We are talking to multiple customers today and we are proposing quantum-inspired services for certain problems," he added.

These systems push the boundaries of how atoms and other tiny particles work. While traditional computers process bits of information as ones or zeros, quantum machines rely on "qubits" that can be a one and a zero at the same time. So two qubits can represent four numbers simultaneously, three qubits can represent eight numbers, and so on. This means quantum computers can perform calculations much faster than standard machines and tackle problems that are far more complex.
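The scaling described above can be made concrete in a few lines of NumPy: n qubits share one state vector of 2**n complex amplitudes. The sketch below builds an equal superposition over all basis states; the function name and structure are illustrative, not part of any vendor's toolkit.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector of 2 complex
# amplitudes, and n qubits together share one vector of 2**n amplitudes.
def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Return the state in which all 2**n basis states are equally likely."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)
print(len(state))                                     # 8 amplitudes for 3 qubits
print(np.allclose(np.sum(np.abs(state) ** 2), 1.0))   # probabilities sum to 1
```

Doubling the register from 3 to 4 qubits doubles the vector to 16 amplitudes, which is exactly the exponential growth the article refers to.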

[Image: Krysta Svore. Source: Microsoft]

Applications could include things like creating new drugs and new materials or solving complex chemistry problems. The killer app of quantum computing is discovering a more efficient way to synthesize ammonia for fertilizer, a process that currently consumes three percent of the world's natural gas, according to Krysta Svore, who oversees the software aspects of Microsoft's quantum work.

The technology is still emerging from a long research phase, and its capabilities are hotly debated. Researchers have only been able to keep qubits in a quantum state for fractions of a second. When qubits fall out of a quantum state they produce errors in their calculations, which can negate any benefit of using a quantum computer.

Microsoft says it uses a different design, called a topological quantum computer, that in theory will create more stable qubits. This could produce a machine with an error rate 1,000 to 10,000 times better than the computers other companies are building, Holmdahl said.

Reducing or correcting the errors in quantum calculations is essential for the technology to fulfill its commercial potential, said Jonathan Breeze, a research fellow working on advanced materials at Imperial College London.

The lower error rate of Microsoft's design may mean it can be more useful for tackling real applications, even with a smaller number of qubits, perhaps fewer than 100. Svore said her team has already proven mathematically that algorithms that use a quantum approach can substantially speed up machine learning applications, enabling them to run as much as 4,000 times faster. (Machine learning is a type of artificial intelligence behind recent advances in computers' ability to identify objects in images, translate languages, and drive cars.)

"We want to solve today's unsolvable problems and we have an opportunity with a unique, differentiated technology to do that," Holmdahl said.

Read more from the original source:
Microsoft Takes Path Less Traveled to Build a Quantum ...

Read More..

Microsoft offers developers a preview of its quantum …

The simulator will allow developers to test programs and debug code with their own computers, which is necessary since there really aren't any quantum computers for them to test their work on yet. Microsoft is also offering a more powerful simulator -- one with over 40 logical qubits of computing power -- through its Azure cloud computing service. And because the kit is integrated into Microsoft's Visual Studio developer tool suite, many aspects of the new kit will be familiar.

"What you're going to see as a developer is the opportunity to tie into tools that you already know well, services you already know well," Todd Holmdahl, Microsoft's VP in charge of its quantum effort, said in a statement. "There will be a twist with quantum computing, but it's our job to make it as easy as possible for the developers who know and love us to be able to use these new tools that could potentially do some things exponentially faster, which means going from a billion years on a classical computer to a couple hours on a quantum computer."

Read more:
Microsoft offers developers a preview of its quantum ...

Read More..

Researchers create new type of quantum computer | Harvard Gazette

Programming a computer is generally a fairly arduous process, involving hours of coding, not to mention the laborious work of debugging, testing, and documenting to make sure it works properly.

But for a team of physicists from the Harvard-MIT Center for Ultracold Atoms and the California Institute of Technology, things are actually much tougher.

Working in a Harvard Physics Department lab, a team of researchers led by Harvard Professors Mikhail Lukin and Markus Greiner and Massachusetts Institute of Technology Professor Vladan Vuletic developed a special type of quantum computer, known as a quantum simulator, that is programmed by capturing super-cooled rubidium atoms with lasers and arranging them in a specific order, then allowing quantum mechanics to do the necessary calculations.

The system could be used to shed light on a host of complex quantum processes, including the connection between quantum mechanics and material properties, and it could investigate new phases of matter and solve complex real-world optimization problems. The system is described in a Nov. 30 paper published in the journal Nature.

The combination of the system's large size and high degree of quantum coherence makes it an important achievement, researchers say. With more than 50 coherent qubits, this is one of the largest quantum systems ever created with individual assembly and measurement.

In the same issue of Nature, a team from the Joint Quantum Institute at the University of Maryland described a similarly sized system of cold charged ions, also controlled with lasers. Taken together, these complementary advances constitute a major step toward large-scale quantum machines.

"Everything happens in a small vacuum chamber where we have a very dilute vapor of atoms which are cooled close to absolute zero," Lukin said. "When we focus about 100 laser beams through this cloud, each of them acts like a trap. The beams are so tightly focused, they can either grab one atom or zero; they can't grab two. And that's when the fun starts."

Using a microscope, researchers can take images of the captured atoms in real time, and then arrange them in arbitrary patterns for input.

"We assemble them in a way that's very controlled," said Ahmed Omran, a postdoctoral fellow in Lukin's lab and a co-author of the paper. "Starting with a random pattern, we decide which trap needs to go where to arrange them into desired clusters."

As researchers begin feeding energy into the system, the atoms begin to interact with each other. Those interactions, Lukin said, give the system its quantum nature.

"We make the atoms interact, and that's really what's performing the computation," Omran said. "In essence, as we excite the system with laser light, it self-organizes. It's not that we say this atom has to be a one or a zero (we could do that easily just by throwing light on the atoms); what we do is allow the atoms to perform the computation for us, and then we measure the results."

Those results, Lukin and colleagues said, could shed light on complex quantum mechanical phenomena that are all but impossible to model using conventional computers.

"If you have an abstract model where a certain number of particles are interacting with each other in a certain way, the question is why don't we just sit down at a computer and simulate it that way?" asked Ph.D. student Alexander Keesling, another co-author. "The reason is because these interactions are quantum mechanical in nature. If you try to simulate these systems on a computer, you're restricted to very small system sizes, and the number of parameters is limited."

"If you make systems larger and larger, very quickly you will run out of memory and computing power to simulate it on a classical computer," he added. "The way around that is to actually build the problem with particles that follow the same rules as the system you're simulating. That's why we call this a quantum simulator."

Though it's possible to use classical computers to model small quantum systems, the simulator developed by Lukin and colleagues uses 51 qubits, making it virtually impossible to replicate using conventional computing techniques.
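The arithmetic behind that claim is simple: a brute-force classical simulation stores one complex amplitude per basis state, so the memory requirement doubles with every added qubit. A minimal back-of-the-envelope sketch (the function name is ours, not from the paper):

```python
def sim_memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold a full state vector: 2**n complex
    amplitudes at 16 bytes each (two 64-bit floats)."""
    return 2 ** n_qubits * bytes_per_amplitude

for n in (20, 30, 51):
    gib = sim_memory_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.3f} GiB")
# 51 qubits -> 33,554,432 GiB (32 PiB), far beyond any existing machine
```

At 20 qubits the state vector fits in a laptop's RAM; at 51 qubits it alone would need 2**55 bytes, about 32 pebibytes, which is why the team must validate against small systems and then extrapolate.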

"It is important that we can start by simulating small systems using our machine," he said. "So we are able to show those results are correct until we get to the larger systems, because there is no simple comparison we can make."

By Peter Reuell, Harvard Staff Writer

"When we start off, all the atoms are in a classical state. And when we read out at the end, we obtain a string of classical bits, zeros and ones," said Hannes Bernien, another postdoctoral fellow in Lukin's lab and also a co-author. "But in order to get from the start to the end, they have to go through the complex quantum mechanical state. If you have a substantial error rate, the quantum mechanical state will collapse."

It's that coherent quantum state, Bernien said, that allows the system to work as a simulator, and also makes the machine a potentially valuable tool for gaining insight into complex quantum phenomena and eventually performing useful calculations. The system already allows researchers to obtain unique insights into transformations between different types of quantum phases, called quantum phase transitions. It may also help shed light on new and exotic forms of matter, Lukin said.

"Normally, when you talk about phases of matter, you talk about matter being in equilibrium," he said. "But some very interesting new states of matter may occur far away from equilibrium, and there are many possibilities for that in the quantum domain. This is a completely new frontier."

Already, Lukin said, the researchers have seen evidence of such states. In one of the first experiments conducted with the new system, the team discovered a coherent non-equilibrium state that remained stable for a surprisingly long time.

"Quantum computers will be used to realize and study such non-equilibrium states of matter in the coming years," he said. "Another intriguing direction involves solving complex optimization problems. It turns out one can encode some very complicated problems by programming atom locations and interactions between them. In such systems, some proposed quantum algorithms could potentially outperform classical machines. It's not yet clear whether they will or not, because we just can't test them classically. But we are on the verge of entering the regime where we can test them on fully quantum machines containing over 100 controlled qubits. Scientifically, this is really exciting."

Other co-authors of the study were visiting scientist Sylvain Schwartz, Harvard graduate students Harry Levine and Soonwon Choi, research associate Alexander S. Zibrov, and Professor Manuel Endres.

This research was supported with funding from the National Science Foundation, the Center for Ultracold Atoms, the Army Research Office, and the Vannevar Bush Faculty Fellowship.


Read more here:
Researchers create new type of quantum computer | Harvard Gazette

Read More..

Microsoft releases quantum computing development kit preview …

At the Microsoft Ignite Conference in September, Microsoft let it be known it was going to be a player in the future of quantum computing, and today the company took another step toward that goal when it released a preview of its quantum computing development kit.

The kit includes all of the pieces a developer needs to get started including a Q# language and compiler, a Q# library, a local quantum computing simulator, a quantum trace simulator and a Visual Studio extension.

This is a preview, so it's aimed at early adopters who want to understand what it takes to develop programs for quantum computers, which operate very differently from classical ones. Put in simple terms, with a classical computer a bit can only exist in a binary state of on or off, whereas with quantum programs a qubit (the quantum equivalent of a bit) can exist in multiple states at the same time. This could open the door to programs that simply couldn't have existed before.
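As a toy illustration of a qubit existing in multiple states until it is read out, the sketch below samples a single qubit prepared in an equal superposition (as a Hadamard gate would prepare it). This is a plain-Python stand-in, not Microsoft's Q# simulator; all names here are invented for illustration.

```python
import random

# One qubit after a Hadamard gate has amplitudes (1/sqrt(2), 1/sqrt(2)),
# so each measurement collapses it to 0 or 1 with probability 1/2.
AMP0 = AMP1 = 2 ** -0.5

def measure(n_shots: int, seed: int = 0) -> dict:
    """Sample the qubit n_shots times; each shot yields a classical bit."""
    rng = random.Random(seed)
    counts = {0: 0, 1: 0}
    for _ in range(n_shots):
        counts[0 if rng.random() < AMP0 ** 2 else 1] += 1
    return counts

print(measure(1000))  # roughly 500 zeros and 500 ones
```

A real quantum program manipulates the amplitudes themselves before measuring, which is where the potential speedups come from; a classical sketch like this can only mimic the statistics of the final readout.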

This is but one piece in Microsoft's big vision for quantum computing that it discussed at Ignite. Microsoft's Krysta Svore told TechCrunch's Frederic Lardinois in September that the idea was to offer a comprehensive full-stack solution for controlling the quantum computer and writing applications for it.

"We like to talk about co-development," she said. "We are developing those [the hardware and software stack] together so that you're really feeding back information between the software and the hardware as we learn, and this means that we can really develop a very optimized solution," she told Lardinois.

Microsoft clearly wants a piece of the quantum computing action, but it is hardly alone. IBM has had a quantum computing service available for programmers since last year, and last month it had a breakthrough with the release of a 20-qubit quantum computer. The company also announced a 50-qubit prototype.

Other companies working on quantum computing research include Google and Intel, along with a host of other established companies and startups.

We are still in very early days with this technology, and it has a long way to go, but the potential is so great that all of these companies, including Microsoft, want to get in as early as possible to capture developer hearts and minds. Today's release is part of that.

See the rest here:
Microsoft releases quantum computing development kit preview ...

Read More..

What Is Cloud Hosting? | GoDaddy – YouTube

What is cloud hosting? This video answers that question and explains what part cloud servers play in delivering quick and innovative cloud hosting. Learn more at: https://www.godaddy.com/pro/cloud-ser....

Are you wondering if your website should be hosted in the cloud? This video explains what cloud hosting is and how to determine if it's the right solution for you.

When it comes to hosting, the term cloud can be confusing and misleading. There's nothing airy or nebulous about it. Like all other types of hosting, cloud hosting relies on very real hardware servers.

But unlike shared and dedicated hosting, where everything that's needed to make your website or application function is housed on a single server or server stack, cloud hosting uses innovative technologies to spread the files, data, resources, bandwidth, and computing operations among multiple servers that are networked to act as one system.

This distributed solution eliminates any single point of failure, provides redundant data storage, and guarantees exclusive use of server resources, such as processing and memory.
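To illustrate the idea of spreading data across networked servers with redundant copies (a deliberately simplified sketch; real cloud platforms use far more sophisticated placement schemes, and the node names here are made up):

```python
import hashlib

SERVERS = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical node names
REPLICAS = 2                                        # redundant copies per file

def place(filename: str) -> list:
    """Deterministically pick REPLICAS distinct servers for a file by
    hashing its name, so the data survives any single server failing."""
    start = int(hashlib.md5(filename.encode()).hexdigest(), 16) % len(SERVERS)
    return [SERVERS[(start + i) % len(SERVERS)] for i in range(REPLICAS)]

print(place("index.html"))  # two distinct nodes, chosen deterministically
```

Because every file lands on more than one node, losing a server takes out no data, which is the "no single point of failure" property described above.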

Cloud hosting also provides great flexibility and scalability, making it easy to upgrade or downgrade your hosting environment and accommodate for periods of higher or lower traffic.

Another big bonus of cloud hosting is that you pay only for the resources you consume.

All of this makes cloud hosting a powerful solution for large, high-traffic internet sites and applications that require the highest level of performance.

You may have already guessed that managing a cloud hosting solution is not a simple process. It takes the skills of an experienced Linux server administrator or IT professional, so it's not a good solution for most small businesses or do-it-yourselfers.

To recap, cloud hosting distributes the files, data, and resources of internet sites and applications among multiple connected servers that operate as a single system. It provides a very high level of flexibility, scalability and performance, but must be managed by an IT pro.

Before you opt for cloud hosting, be sure to weigh its pros and cons against your online needs, budget, and business goals.

See the article here:
What Is Cloud Hosting? | GoDaddy - YouTube

Read More..