
Qudits: The Real Future of Quantum Computing? – IEEE Spectrum

Instead of creating quantum computers based on qubits, which can each adopt only two possible states, scientists have now developed a microchip that can generate qudits that can each assume 10 or more states, potentially opening up a new way of creating incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, that, because of the bizarre nature of quantum physics, can be in a state of superposition where they simultaneously act as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
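That doubling is simple to check. A quick sketch of the arithmetic (an illustrative aside, not part of the article):

```python
# Illustrative sketch: the number of basis states an n-qubit register can
# hold in superposition grows as 2**n, which is why a 300-qubit machine is
# compared to the roughly 10**80 atoms in the known universe.
def state_space(n_qubits: int) -> int:
    """Dimension of the state space of an n-qubit register."""
    return 2 ** n_qubits

print(state_space(1), state_space(2), state_space(3))  # 2 4 8
assert state_space(300) > 10 ** 80  # exceeds the atom-count estimate
```

Python's arbitrary-precision integers make the 300-qubit comparison exact rather than a floating-point approximation.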

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle use qudits, which can juggle more than two states simultaneously. A quantum computer with two 32-state qudits, for example, would be able to perform as many operations as 10 qubits while skipping the challenges inherent in working with 10 qubits together.
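The dimension counting behind that comparison can be verified directly; a short sketch using the article's numbers (illustrative only):

```python
import math

def register_dimension(levels: int, count: int) -> int:
    """Dimension of a register of `count` qudits with `levels` states each."""
    return levels ** count

# Two 32-state qudits span 32**2 = 1024 dimensions, exactly 2**10,
# i.e. the same state space as 10 qubits.
assert register_dimension(32, 2) == 2 ** 10

# The "qubit equivalent" of a qudit register is log2 of its dimension.
print(math.log2(register_dimension(32, 2)))  # 10.0
```

The same formula covers the chip in the study: two 10-state qudits give 100 dimensions, more than the 64 of six qubits, and a 96-state pair gives 9,216 dimensions, more than 13 qubits' 8,192.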

Researchers used the setup pictured above to create, manipulate, and detect qudits. The experiment starts when a laser fires pulses of light into a micro-ring resonator, which in turn emits entangled pairs of photons. Because the ring has multiple resonances, the photons have optical spectra with a set of evenly spaced frequencies (red and blue peaks), a process known as spontaneous four-wave mixing (SFWM). The researchers were able to use each of the frequencies to encode information, which means the photons act as qudits. Each qudit is in a superposition of 10 possible states, extending the usual binary alphabet (0 and 1) of quantum bits. The researchers also showed they could perform basic gate operations on the qudits using optical filters and modulators, and then detect the results using single-photon counters.

Now scientists have for the first time created a microchip that can generate two entangled qudits each with 10 states, for 100 dimensions total, more than what six entangled qubits could generate. "We have now achieved the compact and easy generation of high-dimensional quantum states," says study co-lead author Michael Kues, a quantum optics researcher at Canada's National Institute of Scientific Research (INRS, its French acronym) in Varennes, Quebec.

The researchers developed a photonic chip fabricated using techniques similar to ones used for integrated circuits. A laser fires pulses of light into a micro-ring resonator, a 270-micrometer-diameter circle etched onto silica glass, which in turn emits entangled pairs of photons. Each photon is in a superposition of 10 possible wavelengths or colors.

"For example, a high-dimensional photon can be red and yellow and green and blue, although the photons used here were in the infrared wavelength range," Kues says. Specifically, one photon from each pair spanned wavelengths from 1534 to 1550 nanometers, while the other spanned from 1550 to 1566 nanometers.

Using commercial off-the-shelf telecommunications components, the researchers showed they could manipulate these entangled photons. "The basic capabilities they show are really what you need to do universal quantum computation," says quantum optics researcher Joseph Lukens at Oak Ridge National Laboratory, in Tennessee, who did not take part in this research. "It's pretty exciting stuff."

In addition, by sending the entangled photons through a 24.2-kilometer-long optical fiber telecommunications system, the researchers showed that entanglement was preserved over large distances. This could prove useful for nigh-unhackable quantum communications applications, the researchers say.

"What I think is amazing about our system is that it can be created using components that are out on the market, whereas other quantum computer technologies need state-of-the-art cryogenics, state-of-the-art superconductors, state-of-the-art magnets," says study co-senior author Roberto Morandotti, a physicist at INRS in Varennes. "The fact that we use basic telecommunications components to access and control these states means that a lot of researchers could explore this area as well."

The scientists noted that current state-of-the-art components could conceivably generate entangled pairs of 96-state qudits, corresponding to more dimensions than 13 qubits. "Conceptually, in principle, I don't see a limit to the number of states of qudits right now," Lukens, from Oak Ridge, says. "I do think a 96-by-96-dimensional system is fairly reasonable, and achievable in the near future."

But he adds that several components of the experiment were not on the microchips, such as the programmable filters and phase modulators, which led to photon loss. Kues says that integrating such components with the rest of the chips and optimizing their micro-ring resonator would help reduce such losses to make their system more practical for use.

"The next big challenge we will have to solve is to use our system for quantum computation and quantum communications applications," Kues says. "While this will take some additional years, it is the final step required to achieve systems that can outperform classical computers and communications."

The scientists detailed their findings in the latest issue of the journal Nature.

Originally posted here:
Qudits: The Real Future of Quantum Computing? – IEEE Spectrum


Intel Takes First Steps To Universal Quantum Computing

October 11, 2017, by Timothy Prickett Morgan

Someone is going to commercialize a general purpose, universal quantum computer first, and Intel wants to be the first. So does Google. So does IBM. And D-Wave is pretty sure it already has done this, even if many academics and a slew of upstart competitors don't agree. What we can all agree on is that there is a very long road ahead in the development of quantum computing, and it will be a costly endeavor that could nonetheless help solve some intractable problems.

This week, Intel showed off the handiwork of its engineers and those of its partner QuTech, a quantum computing spinoff from the Technical University of Delft and Toegepast Natuurwetenschappelijk Onderzoek (TNO), which, as the Dutch name suggests, is an applied science research firm that, among other things, is working with Intel on quantum computing technology.

TNO, which was established in 1988, has a €500 million annual budget and does all kinds of primary research. The Netherlands has become a hotbed of quantum computing technology, along with the United States and Japan, and its government wants to keep it that way; hence the partnership in late 2015 with Intel, which invested $50 million in the QuTech partnership between TU Delft and TNO so it could jumpstart its own quantum computing program after sitting on the sidelines.

With this partnership, Intel is bringing its expertise in materials science, semiconductor manufacturing, interconnects, and digital systems to bear to help develop two types of quantum bits, or qubits, which are the basic element of processing in a quantum computer. The QuTech partnership involves the manufacturing of superconducting qubits, but Intel also is working on another technology called spin qubits that makes use of more traditional semiconductor technologies to create what is, in essence, the quantum transistor for this very funky and very parallel style of computing.

The big news this week is that Intel has been able to take a qubit design that its engineers created alongside those working at QuTech and scale it up to 17 qubits on a single package. A year ago, the Intel-QuTech partnership had only a few qubits on their initial devices, Jim Clarke, director of quantum hardware at Intel, tells The Next Platform, and two years ago it had none. So that is a pretty impressive roadmap in a world where Google is testing a 20 qubit chip and hopes to have one running at 49 qubits before the year is out. Google also has quantum annealing systems from D-Wave, which have much more scale in terms of qubits (1,000 today and 2,000 on the horizon) but, according to Intel, are not generic enough to be properly commercialized. And if Intel knows anything, it knows how to create a universal computing substrate and scale its manufacturing and its deployment in the datacenters of the world.

Production and cleanroom facilities for the quantum chip made at Intel's D1D/D1X plant in Hillsboro, Oregon, in April 2017.

"We are trying to build a general purpose, universal quantum computer," says Clarke. "This is not a quantum annealer, like the D-Wave machine. There are many different types of qubits, which are the devices for quantum computing, and one of the things that sets Intel apart from the other players is that we are focused on multiple qubit types. The first is a superconducting qubit, which is similar to what Google, IBM, and a startup named Rigetti Computing are working on. But Intel is also working on spin qubits in silicon, which are very similar to our transistor technologies, and you can expect to hear about that in the next couple of months. These spin qubits build on our expertise in ordinary chip fabrication, and what really sets us apart here is our use of advanced packaging at very low temperatures to improve the performance of the qubit, and with an eye towards scalability."

Just as people are obsessed with the number of transistors or cores on a standard digital processor, people are becoming a bit obsessed with the number of qubits on a quantum chip, and Jim Held, director of emerging technology research at Intel Labs, says that this focus is a bit misplaced. And for those of us who look at systems for a living, this makes perfect sense. Intel is focused on getting the system design right, and then scaling it up on all vectors to build a very powerful quantum machine.

Here is the situation as Held sees it, and breathe in deeply here:

People focus on the number of qubits, but that is just one piece of what is needed. We are really approaching this as engineers, and everything is different about this kind of computer. It is not just the devices, but the control electronics and how the qubits are manipulated with microwave pulses and measured with very sensitive DC instrumentation, and it is more like an analog computer in some respects. Then it has digital electronics that do error correction because quantum devices are very fragile, and they are prone to errors and to the degree that we can correct the errors, we can compute better and longer with them. It also means a new kind of compiler in order to get the potential parallelism in an array of these qubits, and even the programs, the algorithms, written for these devices are an entirely different kind of thing from conventional digital programming. Every aspect of the stack is different. While there is research going on in the academic world at all levels, as an engineering organization we are coming at them all together because we know we have to deliver them all at once as a computer. Moreover, our experience tells us that we want to understand at any given point what our choices at one level are going to mean for the rest of the computer. What we know is that if you have a plate full of these qubits, you do not have a quantum computer, and some of the toughest problems with scaling are in the rest of the stack. Focusing on the number of qubits or the coherence time really does a disservice to the process of getting to something useful.

This is analogous to massively parallel machines that don't have enough bandwidth or low enough latency to talk efficiently across cores, sockets, or nodes and to share work. You can cram as many cores as you want in them, but the jobs won't finish faster.

And thus, Intel is focusing its research on the interconnects that will link qubits together on a device and across multiple devices.

"The interconnects are one of the things that concerns us most with quantum computing," says Clarke. "From the outset, we have not been focused on a near-term milestone, but rather on what it would take from the interconnect perspective, from the point of view of the design and the control, to deliver a large scale, universal quantum computer."

Interestingly, Clarke says that the on-chip interconnect on commercial quantum chips will be similar to that used on a conventional digital CPU, but it may not be made out of copper wires, but rather superconducting materials.

The one used in the superconducting qubit chip that Intel just fabbed in its Oregon factory and packaged in its Arizona packaging facility is a bit ridiculous looking.

Quantum computing presents a few physical challenges, and superconducting qubits are especially tricky. Preserving the quantum states that allow superposition (a kind of multiple, concurrent state of the bits that allows for parallel processing at the bit level, to oversimplify hugely) requires these analog devices to be kept at extremely cold temperatures while they still have to interface with the control electronics in the outside world, crammed into a rack.

"We are putting these chips in an extremely cold environment, at 20 millikelvin, and that is much colder than outer space," says Clarke. "And first of all, we have to make sure that the chip doesn't fall apart at these temperatures. You have thermal coefficient of expansion. Then you need to worry about package yield and then about the individual qubit yield. Then we worry about wiring them up in a more extensible fashion. These are very high quality radio or microwave frequency chips and we have to make sure we maintain that quality at low temperature once the device is packaged. A lot of the performance and yield that we are getting comes from the packaging."

So for this chip, Intel has wallpapered one side of the chip with standard coaxial ports, like the ones on the back of your home router. Each qubit has two or more coax ports going into it to control its state and to monitor that state. How retro:

"We are focused on a commercial machine, so we are much more interested in scaling issues," Held continues along this line of thinking. "You have to be careful to not end up in a dead end that only gets you so far." This quantum chip interconnect is not sophisticated like Omni-Path, and it does not scale well, Held adds with a laugh. "What we are interested in is improving on that to reduce the massive number of connections. A million qubits turning into millions of coax cables is obviously not going to work. Even at hundreds of qubits, this is not going to work. One way we are going to do this is to move the electronics that control this quantum machine into this very cold environment, not down at the millikelvin level, but a layer or two up, at the 4 kelvin temperature of liquid helium. Our partners at QuTech are experts at cryo-CMOS, which means making chips work in this 4 kelvin range. By moving this control circuitry from a rack outside of the quantum computer into the refrigeration unit, it cuts the length of the connections to the qubits."
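Held's cabling worry is easy to quantify. A hypothetical back-of-the-envelope sketch (the two-lines-per-qubit figure comes from the coax ports described above; everything else is assumed for illustration):

```python
# Back-of-the-envelope sketch (assumed numbers): with roughly two coax
# lines per qubit for control and readout, cable count grows linearly
# with qubit count, which is exactly the scaling wall Held describes.
def coax_lines(n_qubits: int, lines_per_qubit: int = 2) -> int:
    """Total coax connections for a brute-force wiring scheme."""
    return n_qubits * lines_per_qubit

for n in (17, 1_000, 1_000_000):
    print(f"{n} qubits -> {coax_lines(n)} coax lines")
```

Even at a thousand qubits the raw cable count is already unwieldy, which motivates moving control electronics into the cryostat.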

With qubits, superposition allows a single qubit to represent two different states, and quantum entanglement (what Einstein called "spooky action at a distance") allows the number of representable states to scale exponentially as the qubit counts go up. Technically, n quantum bits yield 2 to the n states. (We wrote that out because there is something funky about superscripts in the Alike font we use here at The Next Platform.) The interconnect is not used to maintain the quantum states across the qubits (that happens because of physics) but to monitor the qubit states, maintain those states, and, importantly, to do error correction. Qubits can't be shaken or stirred or they lose their state, and they are extremely fussy. As Google pointed out two years ago at the International Supercomputing Conference in Germany, a quantum computer could end up being an accelerator for a traditional parallel supercomputer, which is used to do error correction and monitoring of qubits. Intel is also thinking this might happen.

The fussiness of superconducting qubits is probably one of the reasons why Intel is looking to spin qubits and a more standard semiconductor process to create a quantum computer chip whose state is easier to maintain. The other is that Intel is obviously an expert at manufacturing semiconductor devices. So, we think, the work with QuTech is as much about creating a testbed system and a software stack that might be portable as it is investing in this particular superconducting approach. Time will tell.

And time, indeed, it will take. Both Held and Clarke independently think it will take maybe eight to ten years to get a general purpose, universal quantum computer commercialized and operating at a useful scale.

"It is research, so we are only coming to timing based on how we think we are going to solve a number of problems," says Held. "There will be a milestone where a machine will be able to tackle interesting but small problems, and then there will be a full scale machine that is mature enough to be a general purpose, widely useful accelerator in the supercomputer environment or in the cloud. These will not be free-standing computers because they don't do a lot of things that a classical digital computer does really well. They could do them, because in theory any quantum computer can do anything a digital computer can do, but they don't do it well. It is going to take on the order of eight to ten years to solve these problems we are solving now. They are all engineering problems; the physicists have done an excellent job of finding feasible solutions out of the lab and scaling them out."

Clarke adds a note of caution, pointing out that there are a lot of physics problems that need to be solved for the packaging aspects of a quantum computer. "But I think to solve the next level of physics problems, we need a healthy dose of engineering and process control," Clarke says. "I think eight to ten years is probably fair. We are currently at mile one of a marathon. Intel is already in the lead pack. But when we think of a commercially relevant quantum computer, we think of one that is relevant to the general population, and moreover, one that would show up on Intel's bottom line. The key is that we are building a system, and at first, that system is going to be pretty small, but it is going to educate us about all aspects of the quantum computing stack. At the same time, we are designing that system for extensibility, both at the hardware level and at the architecture control level, to get to many more qubits. We want to make the system better, and larger, and it is probably a bit premature to start assigning numbers to that, other than to say that we are thinking about the longer term."

It seems we might need a quantum computer to figure out when we might get a quantum computer.

Categories: Cloud, Compute, HPC, Hyperscale

Tags: Delft, Intel, quantum, qubit, QuTech, spin qubit, Superconducting, TNO

See the rest here:
Intel Takes First Steps To Universal Quantum Computing


Quantum Computing | Intel Newsroom

Quantum computing is an exciting new computing paradigm with unique problems to be solved and new physics to be discovered. Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can't handle. For example, quantum computers may simulate nature to advance research in chemistry, materials science and molecular modeling.

In 2015, Intel established a collaborative relationship with QuTech to accelerate advancements in quantum computing. The collaboration spans the entire quantum system or stack from qubit devices to the hardware and software architecture required to control these devices as well as quantum applications. All of these elements are essential to advancing quantum computing from research to reality.

Download A Quantum Computing Primer

Intel's director of quantum hardware, Jim Clarke, holds the new 17-qubit superconducting test chip. (Credit: Intel Corporation)

Intel's 17-qubit superconducting test chip for quantum computing has unique features for improved connectivity and better electrical and thermo-mechanical performance. (Credit: Intel Corporation)

Researchers work in the quantum computing lab at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: QuTech)

Professor Leo DiCarlo poses in the quantum computing lab at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: QuTech)

Intel is collaborating with QuTech in the Netherlands to advance quantum computing research. Intel in October 2017 provided QuTech a 17-qubit superconducting test chip for quantum computing. (Credit: Intel Corporation)

Intel's new 17-qubit superconducting test chip packaged for delivery to research partners at QuTech, Intel's quantum research partner in the Netherlands. Intel in October 2017 provided QuTech with the 17-qubit superconducting test chip for quantum computing. (Credit: Intel Corporation)

Read more here:
Quantum Computing | Intel Newsroom


Intel moves towards production quantum computing with new 17 …

Intel's quantum computing efforts have yielded a new 17-qubit chip, which the company has just delivered to its partner in that field, QuTech in the Netherlands. It's not a major advance in the actual computing power or applications (those are still in very early days) but it's a step toward production systems that can be ordered and delivered to spec rather than experimental ones that live in a physics lab somewhere.

Intel's celebration of this particular chip is a bit arbitrary; 17 isn't some magic number in the quantum world, nor does this chip do any special tricks other quantum computer systems can't. Intel is just happy that its history and undeniable expertise in designing and fabricating chips and architectures is paying off in a new phase of computing.

I chatted with Intel's director of quantum hardware, Jim Clarke, about the new system.

The test chip itself (the gold ports aren't the qubits themselves, obviously)

"We're relying on our expertise in hardcore engineering," he said. "We're working on all parts of the compute stack: the chip, the control electronics, the system architecture, the algorithm."

It's not quite like popping out a new Core processor every year, but there's plenty of overlap.

"Our infrastructure allows us to adapt the materials and the package," Clarke said. If you think of a material that might be good for a qubit chip, Intel likely already has a mature process for that material, or at least experience with it.

That isn't easy when the field of computing they're attempting to enter is largely theoretical. That's why partners like QuTech, a research institute under TU Delft, are essential. Intel isn't short on big brains, but a dedicated facility under a major technical university is likely more fertile ground for this kind of bleeding-edge work.

The basic relationship is that Intel makes the chips, and QuTech tests them with the latest algorithms, models, and instruments. They turn around and say something like "that was great, but we'll need at least 14 qubits to do this next thing, and we saw a lot of interference under such and such conditions." Intel jots it down and a few months later (there's no set timeline), out comes a new one, and the cycle repeats.

I'm simplifying, of course, because I don't know the details of all this quantum tomfoolery (who can, really?), but that's a powerful cycle to nurture.

The results so far let Intel boast of a chip that, thanks to the company's manufacturing prowess and the work by QuTech, has considerably improved in reliability and performance over the last two years, while the architecture, system infrastructure (such as interconnects and testing methods) and so on have evolved alongside.

Of course, these amazing quantum computers still don't really do anything yet, and they have to operate at around 20 thousandths of a degree above absolute zero. But the first problem is more exciting than limiting (the potential of these machines, theoretically, is enormous), and the second one, to my surprise, isn't really a big deal any more.

Turns out (perhaps you knew, but I didnt) that you can package a multi-qubit quantum computing system, cooled to the millikelvin level, in an enclosure the size of an oil drum.

There's a long way to go in the quantum computing world, but it's a no-brainer for companies like Intel to bet on the concept; its billions of dollars in infrastructure serve excellently as collateral.

Here is the original post:
Intel moves towards production quantum computing with new 17 …


What will you actually use quantum computing for? | ZDNet

With a tip of the hat to our Big on Data bro George Anadiotis, this week, we’re breaking from our usual routine of the here and now to look at what’s coming next. Mention the words quantum computing, and your first impression is that we’re probably going to be spouting science fiction.

So what is quantum computing? It harnesses the physics of subatomic particles to provide a different way to store data and solve problems compared to conventional computers. Specifically, it totally turns the world of conventional binary computing on its side because quantum computing bits, or qubits, can represent multiple states at once, rather than just 0 or 1. The result is that quantum computers could solve certain HPC-like problems more efficiently.

Oh and by the way, did we mention that quantum computers must run at 4 degrees Kelvin? That's 4 degrees above absolute zero, nearly as cold as interstellar space.

It’s tempting to dismiss quantum computers as the computing equivalent of Warp Speed out of Star Trek. Then again, it was barely a few months ago that we saw SAS founder James Goodnight talking to Alexa to gin up a SAS analytics run in much the same way that Captain James T. Kirk spoke to his computers.

So why are we having this conversation?

Our attention was piqued by a chain of events over the past month. IBM first convened an analyst call around an upcoming article in the scientific journal Nature showing how a quantum computing modeling problem for complex molecular behavior would be documented in a Jupyter notebook. (If you want to get technical, it was about how to derive the lowest energy state of a molecule of beryllium hydride.)

Then Satya Nadella assembled a panel of Microsoft researchers to conclude his Ignite conference keynote with a session on pure theoretical physics that sailed straight over the heads of the business analyst and developer audience. Fortunately, the IBM call was way more plain spoken, addressing how quantum computers could be applied to common business problems, and where the technology stands today.

Turns out, quantum computers represent advances that would look familiar to veterans of big data analytics where you could query all of the data, not just a sample. It would also look familiar to those working with graph computing where you could factor the complexity of many-to-many relationships that would otherwise require endless joins with relational data models.

Quantum computing lends itself to any optimization problem where the combination of what-ifs, and all the permutations associated with them, would simply overwhelm a conventional binary computer. That lends itself to a large trove of mundane business and operational problems that are surprisingly familiar.

For instance, if you try to optimize a supply chain, chances are, you are narrowing down the problem to tackle the dozen most likely scenarios. With the resources of quantum computing, you could widen and deepen the analysis to virtually all possible scenarios. The same goes with tangible business challenges like managing financial risk when you have a complex tangle of interlocking trading systems across the globe. Or imagine, during drug testing, that a clinical research team could model all the potential interactions of a new drug with virtually the entire basket of medications that a specific patient cohort would likely also be taking? And from there, could true personalized medicine be far behind?

But quantum computing development is still embryonic. A small Canadian startup, D-Wave Systems, is selling units on a limited basis today. IBM is offering machines of 5 to 17 qubits in the cloud, while Google is developing architectures that could scale up to 49. So it's not surprising that quantum still hits the wall with classes of problems that require complex, iterative processing (which, by the way, is what Spark excels at).

A good example of the type of problem that for now is just out of reach is encryption/decryption. As the algorithms grow more complex, it means factoring larger and larger numbers. Turns out, the interactions between qubits (which is called quantum entanglement) could short-cut such problems by taking the square root of the number of entries, and reducing the number of steps accordingly. The bottleneck is memory; such computations would require storing of state or interim results, much like a Spark or MapReduce problem. The problem is that, while development of compute chips is underway, nobody yet knows what true quantum memory would look like.
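That square-root shortcut is easy to quantify. A rough sketch (an illustrative aside, not from the article; the pi/4 constant is the standard Grover-search iteration count):

```python
import math

# Rough illustration of the square-root speedup for unstructured search:
# a Grover-style quantum search needs about (pi/4) * sqrt(N) steps, versus
# about N/2 expected lookups for a classical scan of N entries.
def grover_steps(n_entries: int) -> int:
    return math.ceil(math.pi / 4 * math.sqrt(n_entries))

def classical_steps(n_entries: int) -> int:
    return n_entries // 2

for n in (10 ** 6, 10 ** 12):
    print(n, grover_steps(n), classical_steps(n))
```

For a million entries that is a few hundred quantum steps against half a million classical ones, and the gap widens as the search space grows.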

That would imply that for some problems, a division of labor where quantum factors the permutations while conventional scale-out systems handle the iterative processing might be an interim (or long-term) step.

There are a surprisingly sizable number of organizations currently pursuing quantum computing. Right now, most of the action is basic government-funded R&D, although some reports estimate VC investment over the past three years amounting to roughly $150 million. On one hand, it would be easy to get overly optimistic on near-term prospects for development given the rate at which technologies as varied as smart mobile devices, Internet of things, big data analytics, and cloud computing have blossomed from practically nothing a decade ago.

But the barriers to adoption of quantum are both physical and intellectual.

There is the physical need to super-cool machines that, in eras past, would have posed huge obstacles. But the cloud will likely do for quantum machines what they are already starting to do for GPUs: provide the economics for scale-out.

That leaves several more formidable hurdles. The physics of scale out still require basic rather than applied research – we still need to figure out how to scale such a large, fragile system. But the toughest challenge is likely to be intellectual, as it will likely require a different way of thinking to conceptualize a quantum computing problem. That suggests that the onramp to quantum will likely prove more gradual compared to the breakout technologies of the last decade.

See the article here:
What will you actually use quantum computing for? | ZDNet


Trump's DOJ tries to rebrand weakened encryption as responsible …

A high-ranking Department of Justice official took aim at encryption of consumer products today, saying that encryption creates “law-free zones” and should be scaled back by Apple and other tech companies. Instead of encryption that can’t be broken, tech companies should implement “responsible encryption” that allows law enforcement to access data, he said.

“Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety,” Deputy Attorney General Rod Rosenstein said in a speech at the US Naval Academy today (transcript). “Encrypted communications that cannot be intercepted and locked devices that cannot be opened are law-free zones that permit criminals and terrorists to operate without detection by police and without accountability by judges and juries.”

Rosenstein was nominated by President Donald Trump to be the DOJ’s second-highest-ranking official, after Attorney General Jeff Sessions. He was confirmed by the Senate in April.

Rosenstein's speech makes several references to Apple, continuing a battle over encryption between Apple and the US government that goes back to the Obama administration. Last year, Apple refused to help the government unlock and decrypt the San Bernardino gunman's iPhone, but the FBI ended up paying hackers for a vulnerability that it used to access data on the device.

“Fortunately, the government was able to access data on that iPhone without Apple’s assistance,” Rosenstein said. “But the problem persists. Today, thousands of seized devices sit in storage, impervious to search warrants.”

“If companies are permitted to create law-free zones for their customers, citizens should understand the consequences,” he also said. “When police cannot access evidence, crime cannot be solved. Criminals cannot be stopped and punished.”

We asked Apple for a response to Rosenstein’s speech and will update this story if we get one.

Separately, state lawmakers in New York and California have proposed legislation to prohibit the sale of smartphones with unbreakable encryption.

Despite his goal of giving law enforcement access to encrypted data on consumer products, Rosenstein acknowledged the importance of encryption to the security of computer users. He said that “encryption is a foundational element of data security and authentication,” that “it is essential to the growth and flourishing of the digital economy,” and that “we in law enforcement have no desire to undermine it.”

But Rosenstein complained that “mass-market products and services incorporating warrant-proof encryption are now the norm,” that instant-messaging service encryption cannot be broken by police, and that smartphone makers have “engineer[ed] away” the ability to give police access to data.

Apple CEO Tim Cook has argued in the past that the intentional inclusion of vulnerabilities in consumer products wouldn't just help law enforcement solve crimes; it would also help criminals hack everyday people who rely on encryption to ensure their digital safety.

Rosenstein claimed that this problem can be solved with “responsible encryption.” He said:

Responsible encryption is achievable. Responsible encryption can involve effective, secure encryption that allows access only with judicial authorization. Such encryption already exists. Examples include the central management of security keys and operating system updates; the scanning of content, like your e-mails, for advertising purposes; the simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop.

No one calls any of those functions a “back door.” In fact, those capabilities are marketed and sought out by many users.

It’s not clear exactly how Rosenstein would implement his desired responsible encryption.

Rosenstein's "key recovery when a user forgets the password to decrypt a laptop" reference seems to refer to Apple and Microsoft providing the ability to store recovery keys in the cloud. But users who encrypt Mac or Windows laptops aren't required to do this; they can store the keys locally if they prefer. To guarantee law enforcement access in this scenario, people who encrypt laptops would have to be forced to store their keys in the cloud. Alternatively, Apple and Microsoft would have to change the way their disk encryption systems work, overriding the consumer's preference to have an encrypted system that cannot be accessed by anyone else.
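The trade-off can be sketched concretely: the disk's data key is wrapped (encrypted) once under a key derived from the user's password and once under an escrow key held elsewhere. This is a hypothetical illustration only; real products such as FileVault and BitLocker use AES key wrapping rather than the XOR one-time pad used here for brevity.

```python
# Sketch of a dual-wrapped disk key. Whoever holds the escrow copy can
# recover the data key without the user -- which is both the appeal to law
# enforcement and the security risk. Hypothetical; not any vendor's design.
import hashlib
import secrets

def derive_key(password: str, salt: bytes) -> bytes:
    # Derive a 32-byte key from the user's password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def wrap(data_key: bytes, wrapping_key: bytes) -> bytes:
    # XOR with an equal-length random key is a one-time pad; real systems
    # would use AES key wrap here.
    return bytes(a ^ b for a, b in zip(data_key, wrapping_key))

salt = secrets.token_bytes(16)
data_key = secrets.token_bytes(32)        # encrypts the disk contents
user_key = derive_key("hunter2", salt)    # only the user can reproduce this
escrow_key = secrets.token_bytes(32)      # optionally stored in the cloud

blob_user = wrap(data_key, user_key)      # user-recoverable copy
blob_escrow = wrap(data_key, escrow_key)  # third-party-recoverable copy

assert wrap(blob_user, user_key) == data_key
assert wrap(blob_escrow, escrow_key) == data_key
```

The point of the sketch is that the escrow copy is a second door into the same key: deleting or never creating it is exactly the user choice Rosenstein's proposal would remove.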

Rosenstein gave some further insight into how “responsible encryption” might work in this section of his speech:

We know from experience that the largest companies have the resources to do what is necessary to promote cybersecurity while protecting public safety. A major hardware provider, for example, reportedly maintains private keys that it can use to sign software updates for each of its devices. That would present a huge potential security problem, if those keys were to leak. But they do not leak, because the company knows how to protect what is important. Companies can protect their ability to respond to lawful court orders with equal diligence.

Of course, there are many examples of companies leaking sensitive data due to errors or serious vulnerabilities. The knowledge that errors will happen at some point explains why technology companies take so many precautions to protect customer data. Maintaining a special system that lets third parties access data that would otherwise only be accessible by its owner increases the risk that sensitive data will get into the wrong hands.

Rosenstein claimed that “responsible encryption can protect privacy and promote security without forfeiting access for legitimate law enforcement needs supported by judicial approval.” But he doubts that tech companies will do so unless forced to:

Technology companies almost certainly will not develop responsible encryption if left to their own devices. Competition will fuel a mindset that leads them to produce products that are more and more impregnable. That will give criminals and terrorists more opportunities to cause harm with impunity.

“Allow me to conclude with this thought,” Rosenstein said just before wrapping up his speech. “There is no constitutional right to sell warrant-proof encryption. If our society chooses to let businesses sell technologies that shield evidence even from court orders, it should be a fully-informed decision.”

Go here to see the original:
Trump's DOJ tries to rebrand weakened encryption as responsible …


Cryptocurrency Alternatives to Bitcoin –

When you think of cryptocurrency, chances are the first thing that comes to mind is Bitcoin.

By now, Bitcoin is something that most people are aware of even if they aren't exactly sure what it is.

Accepting cryptocurrencies can make sense for your business, whether you sell physical goods or you're a freelancer. But using a cryptocurrency as a medium of exchange doesn't mean you have to go with Bitcoin.

In fact, there are a surprising number of cryptocurrency alternatives to Bitcoin.

One of the cryptocurrency alternatives to Bitcoin that's gaining a lot of ground right now is Dash. That's because Dash is open source and very centered on privacy.

On top of that, there are low fees that come with Dash. To tell the truth, most cryptocurrencies are going to come with lower fees than what you pay with bank and credit card transactions. However, there are cases where it's even free to send Dash.

It's also nice that Dash is an instant peer-to-peer cryptocurrency. Payments are private and appear instantly to the person on the other side of the transaction, anywhere in the world.

There's a reason Dash is one of the most popular Bitcoin alternatives out there.

One of the oldest cryptocurrency alternatives to Bitcoin is Litecoin. This cryptocurrency has been around for several years. Interestingly enough, even though it is capable of handling a higher transaction volume than Bitcoin, it still isn't as well-known.

Litecoin makes use of open source software, and its decentralized network relies on mathematics for security. Litecoin also comes with some other cool features.

Due to its availability and other features, it's no surprise Litecoin is on the rise.

Peercoin is one of the most potentially inflationary cryptocurrency alternatives to Bitcoin. There's a lot that goes into rewarding miners, and there is no upper limit to how many coins will be mined. Minting uses Proof of Stake for network security, which means that Peercoin's security is not impacted by selfish mining the way Bitcoin's mining is.

It's also worth noting that Peercoin is derived from Bitcoin. So if you have hardware that works with the Bitcoin network, it will also work with Peercoin.

This is great for mining, but it can also allow you to accept payment for more than one cryptocurrency without the need to use different networks.

The peer-to-peer technology used for Feathercoin is designed to create borderless payments. One of the cool things is that Feathercoin is somewhat unique among cryptocurrency alternatives in that it has a number of features designed to bypass banking entirely.

In fact, Feathercoin is working on open source projects for ATMs and Point of Sale equipment. Right now, it can be cumbersome to use cryptocurrencies. In many cases, it's hard to use cryptocurrencies in real-world transactions that take place offline.

That might change if Feathercoin's projects come to fruition. Physical, laser-etched coins and easy access to the cryptocurrency at Point of Sale terminals and ATMs really set this digital currency apart.

When it comes to mining, Quarkcoin offers the opportunity to just about anyone with a CPU. It doesn't give an advantage to special equipment or server farms. So, if you are looking for a way to mine a little more, Quarkcoin can help.

Like Bitcoin and its other alternatives, Quarkcoin is peer-to-peer. You can make payments directly to the person you want to, almost instantly. Plus, there is a high level of security with Quarkcoin. We're talking nine rounds of hashing, as opposed to the single hash used by most cryptocurrencies.
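The "nine rounds of hashing" idea is simply chaining: each round feeds its digest into the next. The sketch below substitutes SHA-2 family functions from Python's hashlib; the real Quark algorithm chains different primitives (BLAKE, Grøstl, JH, Keccak, Skein), so this illustrates only the structure, not the coin's actual hash.

```python
# Chained multi-round hashing in the style of Quark's proof-of-work.
# The specific round sequence here is illustrative, not Quark's.
import hashlib

ROUNDS = ["sha256", "sha512", "sha384", "sha224",
          "sha512", "sha384", "sha224", "sha512", "sha256"]

def chained_hash(data: bytes) -> bytes:
    digest = data
    for algo in ROUNDS:              # nine rounds, each feeding the next
        digest = hashlib.new(algo, digest).digest()
    return digest

# Deterministic, and breaking it would require breaking the whole chain.
print(chained_hash(b"block header").hex()[:16])
```

The design rationale is defense in depth: an attacker has to find weaknesses that survive every round in sequence, not just one hash function.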

One of the interesting things about Digitalcoin is that it is accepted by a number of businesses. Sometimes, it can be difficult to find someone willing to accept your cryptocurrency payment if it isn't Bitcoin or Dash. Digitalcoin offers stability as well, with block rewards produced at a lower rate than many other digital currencies.

Digitalcoin, like other cryptocurrencies, is decentralized and secure. You can send and receive the currency anywhere in the world, and it's free to use.

This is another of the highly private cryptocurrency alternatives to Bitcoin. Stablecoin is distinguished by the fact that its transactions are not only encrypted but also untraceable. While this currency isn't quite as widely used as many others, it is working hard to move forward, especially in China. If it catches on in China, which is a huge economy, it could grow elsewhere.

The essential question is whether or not you should buy cryptocurrencies with the idea of capital appreciation in mind. Do you buy (or mine) these cryptocurrencies in the hope that you can sell them on an exchange and make a profit?

There are those who look at the widely accepted Litecoin and refer to it as silver to Bitcoin's gold. But does that really make sense in the long term?

While it can be tempting to think of cryptocurrencies as investments, the reality is that they might not be solid ones. Sure, you can mine cryptocurrencies. But they might be most useful as mediums of exchange. They are inexpensive, and blockchain technology allows for almost instant transfer, so it's possible to set up a low-cost global payment system.

The real value might be in the way Bitcoin and its cryptocurrency alternatives are changing the way we think about money and do business.

Some of the more interesting blockchain developments are Ethereum and Namecoin.

Ethereum is interesting because it is at once a digital currency and an application layer. If you are hoping to get involved with smart contracts, one of the best choices is Ethereum.

The decentralized, open source Ethereum allows developers to create their own applications. This includes smart contracts, as well as token systems, which can themselves be used as part of the smart contract process. It's possible to layer applications on top of Ethereum, which means this blockchain development could change the face of business.
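A token system of the kind Ethereum contracts implement is, at its core, a balance ledger whose transfer rule is enforced by code rather than by a bank. Real contracts are written in Solidity and executed on-chain; this Python stand-in (names hypothetical) shows only the core idea.

```python
# Minimal token ledger, echoing the ERC-20 style "transfer" rule:
# the code itself rejects overdrafts, so no clearinghouse is needed.
class Token:
    def __init__(self, supply: int, owner: str):
        # The entire initial supply is credited to the deployer.
        self.balances = {owner: supply}

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

t = Token(1_000, "alice")
t.transfer("alice", "bob", 250)
assert t.balances == {"alice": 750, "bob": 250}
```

On Ethereum, the same logic runs identically on every node, which is what makes the ledger trustworthy without a central operator.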

Namecoin is another interesting blockchain development. Namecoin technology isn't about currencies and money. It's all about decentralizing the Internet itself. Namecoin is about increasing privacy, resisting censorship, and improving the security of the infrastructure of the Internet. This is an interesting open source project that could change the way the Internet itself works.

Innovation in the way we see money and the way we do business is the main result of blockchain technologies. Bitcoin really brought the blockchain and cryptocurrencies into the mainstream consciousness. However, what comes next in terms of the way we conduct business on a global scale could be even more exciting.

See original here:
Cryptocurrency Alternatives to Bitcoin –


Learn BitCoin and master the world of cryptocurrency

It’s about time you learned how Bitcoin works.

Image: pixabay

By Team Commerce, Mashable Shopping, 2017-10-04 16:47:54 UTC

You may think you're too late to invest in cryptocurrency like Bitcoin and wear suits made of money, but you're actually just in time.

Cryptocurrency has been growing in popularity, but only a tiny percentage (0.01 percent) of people have gotten wind of how to make money from it. And that's not because the goal is out of reach. Those who know their way around cryptocurrency know that the high-risk investment has huge potential for getting you up to your elbows in hundred-dollar bills. Not to mention, you don't have to worry about high bank fees or fluctuations based on government regulations.

Interested in being one of the 0.01 percent? The #1 cryptocurrency investment course can help you get there. You'll learn different buying strategies for making gains in the short, medium, and long terms, and strategies for protecting the money you make. And since it's not just about Bitcoin anymore, you'll also learn which cryptocurrencies are worth investing in.

The course also gives you access to a private community of like-minded investors, so you can learn from others, get your questions answered, and get live updates on the market. Get the #1 cryptocurrency investment course for $15 here.

Go here to read the rest:
Learn BitCoin and master the world of cryptocurrency


Cryptocurrency Flash Crash Is Said to Draw Scrutiny From CFTC …

A popular digital-coin exchange is drawing scrutiny from U.S. regulators over a June flash crash that erased most of the value in the second-largest cryptocurrency before traders had time to blink their eyes.

The Commodity Futures Trading Commission has requested information from Coinbase Inc. about a June 21 incident on its GDAX platform in which the ether digital token suffered a precipitous drop, falling to 10 cents from $317.81 in milliseconds before quickly recovering, said two people familiar with the matter.

Among the issues the agency is focused on is what role leverage might have played in the plunge, as Coinbase allowed traders to use borrowed money to make bigger wagers than would have otherwise been possible, said the people, who asked not to be named because the review isn't public.

The CFTC inquiry is the latest sign that federal authorities are growing worried about a market with scattershot oversight that has attracted big money. Coinbase, which says it has served 10.6 million customers and facilitated $20 billion in digital currency transactions, is regulated by various states through a patchwork system.

It's not registered with the CFTC, the main U.S. watchdog of currency futures. Coinbase doesn't allow traders to buy and sell derivatives, and firms don't typically fall under the regulator's direct jurisdiction unless they allow swaps trading. Coinbase does hold licenses with financial agencies in dozens of states, as well as Puerto Rico, according to its website.

The CFTC sent San Francisco-based Coinbase a letter with a list of questions, including queries about margin trading, one of the people said. Coinbase began offering margin accounts in March, as it sought to attract institutional investors by providing them loans to amplify their bets. The company disabled the service after the June crash.

"As a regulated financial institution, Coinbase complies with regulations and fully cooperates with regulators," the company said in an emailed statement. "After the GDAX market event in June 2017, we proactively reached out to a number of regulators, including the CFTC. We also decided to credit all customers who were impacted by this event. We are unaware of a formal investigation."

CFTC spokeswoman Erica Elliott Richardson declined to comment.

Coinbase's ether plunge was caused by a single $12.5 million trade, one of the biggest ever, that prompted selling by other investors. The decline triggered automatic sell orders from traders who'd requested to bail out of the currency if prices dropped to certain levels, and led GDAX to liquidate some margin trades.
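The mechanics described here, a large market sell walking down a thin order book and tripping resting stop-loss orders that add further selling, can be simulated in a few lines. All prices and sizes below are hypothetical, chosen only to echo the $317-to-10-cents move.

```python
# Toy flash-crash simulation: bids is the resting buy side of the book,
# stops maps a stop price to the sell size triggered when trades print
# at or below it. Numbers are invented for illustration.
bids = [(317.0, 10_000), (250.0, 8_000), (100.0, 5_000), (0.10, 1_000_000)]
stops = {250.0: 6_000, 100.0: 4_000}   # mutated as stops fire

def market_sell(size: float) -> float:
    book = list(bids)                  # fresh copy of the bid ladder
    queue = [size]                     # pending sell orders
    last_price = book[0][0]
    while queue and book:
        remaining = queue.pop(0)
        while remaining > 0 and book:
            price, depth = book[0]
            fill = min(remaining, depth)
            remaining -= fill
            last_price = price
            if fill == depth:
                book.pop(0)            # level fully consumed
            else:
                book[0] = (price, depth - fill)
            # printing at or below a stop level releases more sell pressure
            for stop in [s for s in stops if price <= s]:
                queue.append(stops.pop(stop))
    return last_price

print(market_sell(30_000))  # one big sell can print far below fair value
```

A 30,000-unit sell eats the top three levels, fires both stops, and the last trade prints at the 10-cent bid, after which (as in the real event) fresh buy orders would step back in.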

While the drop was dramatic, it was also temporary. Computer algorithms quickly started issuing buy orders that drove prices back up to $300 within 10 seconds.

Bitcoin and other cryptocurrencies have surged this year. But regulators and financial executives are concerned that investors are inflating a bubble that's destined to pop. South Korea banned margin trading in bitcoin and ether on Sept. 29, after China earlier cracked down on digital currencies. Last month, JPMorgan Chase & Co. Chief Executive Officer Jamie Dimon likened cryptocurrencies to the infamous Dutch tulip bulb mania of the 17th century.

The Securities and Exchange Commission has been grappling with how to police digital currencies. SEC Chair Jay Clayton warned lawmakers last week that initial coin offerings are probably full of fraud. The next day, the agency sued a company for misrepresentations tied to a bitcoin offering purportedly backed by diamonds and real estate.

One risk of allowing margin trades is that in a sharp market reversal, trading platforms could run into problems if investors can't repay the money they've borrowed.

That's what happened in January 2015 when the currency brokerage FXCM Inc. almost toppled after Switzerland shocked markets by letting its currency appreciate. When the franc jumped, FXCM customers lost more money than they had in their accounts, forcing the company to seek a $300 million bailout from Leucadia National Corp.

Coinbase's GDAX had a margin funding limit of $10,000. To qualify, investors had to meet at least one of several qualifications laid out under federal law.

For instance, individuals can only trade with borrowed money if they have more than $5 million invested in various financial markets and are using their margin accounts purely to hedge risks. Individuals are exempt from the hedging requirement if they have more than $10 million invested. The rules are looser for institutional investors, such as hedge funds and corporations.
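The qualification rules summarized above can be expressed as a small checking function. The thresholds mirror the article's description of the federal rules; treat this as a simplified sketch, not a statement of the actual regulation.

```python
# Simplified margin-eligibility check based on the thresholds described
# in the article: >$5M invested plus hedging-only use, >$10M with no
# hedging requirement, and looser treatment for institutions.
def may_trade_on_margin(invested: float, hedging_only: bool,
                        institutional: bool) -> bool:
    if institutional:
        return True                      # funds and corporations: looser rules
    if invested > 10_000_000:
        return True                      # exempt from the hedging requirement
    return invested > 5_000_000 and hedging_only

assert may_trade_on_margin(6_000_000, hedging_only=True, institutional=False)
assert not may_trade_on_margin(6_000_000, hedging_only=False, institutional=False)
assert may_trade_on_margin(11_000_000, hedging_only=False, institutional=False)
```

Bitfinex's sanction, described below, amounted to letting through traders for whom every branch of a check like this returns false.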

Last year, the CFTC sanctioned a different digital token market, Bitfinex, for allowing investors who didn't meet the $10 million threshold to make margin trades. Bitfinex also broke the law because it didn't deliver some bitcoins that investors had bought using leverage within a required timeframe, the CFTC said. Instead, it held the tokens in accounts that it owned and controlled, according to the regulator. Bitfinex agreed to pay $75,000 to settle the case, without admitting or denying the allegations.

Coinbase has suffered outages and other performance problems as it has struggled to handle the surge in volume that's accompanied skyrocketing cryptocurrency prices. It has also faced a sharp increase in customer complaints. Almost 500 consumer grievances have been flagged about the company this year on a database maintained by the Consumer Financial Protection Bureau, compared with just six for all of 2016.

Coinbase and investors who use it to trade have piqued regulators' interest in the past. In 2016, the Internal Revenue Service asked a court for permission to serve a summons against Coinbase, seeking records about taxpayers who have traded digital currencies.

It also has attracted prominent investors, including Marc Andreessen's venture capital firm and the New York Stock Exchange. In August, Coinbase received $100 million from a group led by Institutional Venture Partners, a Menlo Park, California-based venture capital firm.

With assistance by Nick Baker, and Matthew Leising

See original here:
Cryptocurrency Flash Crash Is Said to Draw Scrutiny From CFTC …


Here’s what quantum computing is and why it matters

Researchers at IBM, Google, Intel, and others are in a fantastic scientific arms race to build a commercially viable quantum computer. They already exist in laboratories, and we're only a few years away from the beginning of what may turn out to be a wholesale shift in how we think about computing.

A typical computer, like the one inside the phone or laptop you're reading this on, is a binary system, basically a yes/no device. The most amazing thing about computer programmers is how they can take something as basic and simple as a computer chip and spit out something like Microsoft Office by creating a series of "if this, then that" scenarios. This showcases how useful the computer is as a tool for humans to accomplish tasks.

The quantum computer, however, is an entirely different concept; the reason it's quantum is that it doesn't use binary logic. By its nature a quantum computer is a yes/no/both device. When developers make a logic choice they aren't limited to "if this, then that"; they can also ask "if this, then that, or both," and that makes all the difference in the world.

There are several instances where a binary computer can't feasibly solve a problem the way we'd like to. When asked to solve a problem where every answer is equally likely, a binary computer has to take the time to individually assess each possibility. Quantum computers can assess more than one probability at a time, through something called quantum entanglement.
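The "assess each possibility individually" cost can be made concrete: a classical search over N equally likely answers inspects candidates one at a time, so the worst case is N checks (and the average is N/2). This toy function, a sketch rather than anything from the article, just counts the checks.

```python
# Classical unstructured search: try every candidate until the secret is
# found, counting how many checks that takes. Quantum algorithms such as
# Grover's reduce the query count to roughly the square root of N.
def classical_search(n_bits: int, secret: int) -> int:
    checks = 0
    for candidate in range(2 ** n_bits):
        checks += 1
        if candidate == secret:
            return checks
    raise ValueError("secret not in range")

# Worst case for a 20-bit search space: every one of 2**20 candidates tested.
assert classical_search(20, secret=2**20 - 1) == 2**20
```

Doubling the number of bits squares the size of the search space, which is why this approach stops being feasible long before key sizes used in practice.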

When two particles become entangled, a phenomenon occurs where anything that happens to one of the particles happens to the other. Einstein called this "spooky action at a distance," and he was spot-on. The lion's share of the research that's been done in quantum computing since the 1980s has focused on figuring out how to use quantum entanglement to our advantage.

The quantum internet of the future is also being built right now, with Chinese researchers making amazing strides in quantum communications.

A quantum internet would be unhackable, as there's no transmission of data. Of course, storage vulnerabilities will still exist, but by then our security will be handled by AI anyway. The weird and wonderful phenomenon of entanglement means you can stick data in one side and it pops out the other, like teleportation. There's nothing swirling through the ether; whatever happens to one entangled particle instantly happens to the other.

The technology is here already, but there are numerous challenges to overcome on the way to full-scale implementation. First, the quantum computing we're capable of today still lags the binary computing we've mastered. We also need to overcome physical constraints, such as the fact that, in the IBM lab for example, the processors need to be kept within hundredths of a degree of absolute zero.

Despite these formidable problems, the outlook is very bright. Recent breakthroughs include the first-ever space-based video call secured by quantum encryption.

The video call connected a Chinese scientist in Beijing with an Austrian scientist in Vienna. The distance between the two was over 4,000 miles. The communication was sent to a satellite in space then beamed back down to earth. Scientists have chosen to investigate the quantum network this way due to issues of signal loss through traditional methods of sending photons like fiber-optic cables.

These quantum encrypted communications would be impossible to hack using a binary computer. On the flip side, the successful completion of a commercially viable quantum computer may signal the end of binary-based encryption systems. Theoretically, given the same computing resources as any binary system, a quantum computer could crack 128-bit encryption almost instantly.
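For scale, the arithmetic behind that claim is worth working out (a back-of-envelope sketch, not from the article). For symmetric 128-bit keys, Grover's quantum search algorithm cuts a brute-force search from about 2^128 trials to about 2^64 queries, a quadratic rather than exponential speedup; the starker threat is to public-key schemes such as RSA, which Shor's algorithm breaks outright.

```python
# Grover speedup arithmetic: quantum unstructured search needs on the
# order of sqrt(N) queries where classical search needs N trials.
import math

classical_trials = 2 ** 128                       # brute-force a 128-bit key
grover_queries = math.isqrt(classical_trials)     # sqrt(2**128) == 2**64

assert grover_queries == 2 ** 64
exponent = grover_queries.bit_length() - 1        # == 64
print(f"classical: 2^128 trials, Grover: 2^{exponent} queries")
```

This is why the usual mitigation for symmetric ciphers is simply doubling the key length (e.g. AES-256), while public-key cryptography needs wholly new, post-quantum algorithms.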

Perhaps the best way to look at the change that quantum computing represents is to compare it to binary computing in the exact same way you would compare the iPhone X's capabilities with those of a Timex calculator watch from the 1980s.


View post:
Here’s what quantum computing is and why it matters
