
The Bizarre Quantum Test That Could Keep Your Data Secure – WIRED


At the Ludwig-Maximilian University of Munich, the basement of the physics building is connected to the economics building by nearly half a mile's worth of optical fiber. It takes a photon three millionths of a second, and a physicist about five minutes, to travel from one building to the other. Starting in November 2015, researchers beamed individual photons between the buildings, over and over again for seven months, for a physics experiment that could one day help secure your data.

Their immediate goal was to settle a decades-old debate in quantum mechanics: whether the phenomenon known as entanglement actually exists. Entanglement, a cornerstone of quantum theory, describes a bizarre scenario in which the fates of two quantum particles, such as a pair of atoms, photons, or ions, are intertwined. You could separate these two entangled particles to opposite sides of the galaxy, but when you mess with one, you instantaneously change the other. Einstein famously doubted that entanglement was actually a thing and dismissed it as "spooky action at a distance."

Over the years, researchers have run all sorts of complicated experiments to poke at the theory. Entangled particles exist in nature, but they're extremely delicate and hard to manipulate. So researchers make them, often using lasers and special crystals, in precisely controlled settings to test that the particles behave the way theory prescribes.

In Munich, researchers set about their test in two laboratories, one in the physics building, the other in economics. In each lab, they used lasers to coax a single photon out of a rubidium atom; according to quantum theory, colliding those two photons would entangle the rubidium atoms. That meant they had to get the atoms in both departments to emit a photon pretty much simultaneously, accomplished by firing a tripwire electric signal from one lab to the other. "They're synchronized to less than a nanosecond," says physicist Harald Weinfurter of the Ludwig-Maximilian University of Munich.

The researchers collided the two photons by sending one of them over the optical fiber. Then they did it again. And again, tens of thousands of times, followed up by statistical analysis. Even though the atoms were separated by a quarter of a mile, along with all the intervening buildings, roads, and trees, the researchers found the two particles' properties were correlated. Entanglement exists.

So, quantum mechanics isn't broken, which is exactly what the researchers expected. In fact, this experiment basically shows the same results as a series of similar tests that physicists started to run in 2015. They're known as Bell tests, named for John Stewart Bell, the Northern Irish physicist whose theoretical work inspired them. Few physicists still doubt that entanglement exists. "I don't think there's any serious or large-scale concern that quantum mechanics is going to be proven wrong tomorrow," says physicist David Kaiser of MIT, who wasn't involved in the research. "Quantum theory has never, ever, ever let us down."
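
A Bell test of the most common variety (CHSH) boils down to a single statistic. Each side measures its particle at one of two angles, the correlations from many runs are combined, and any "classical" local hidden-variable explanation is bounded at 2, while entangled particles can reach about 2.83. Here is a minimal Python sketch of the quantum prediction, assuming the textbook singlet-state correlation E(a, b) = -cos(a - b); a real experiment estimates these correlations from coincidence counts rather than computing them.

```python
# CHSH statistic for an ideal entangled (singlet) pair: quantum theory
# predicts correlation E(a, b) = -cos(a - b) for measurement angles a, b.
import math

def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices (radians) for maximal violation.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))      # 2.828... = 2*sqrt(2), the quantum maximum
assert abs(S) > 2  # any local hidden-variable model obeys |S| <= 2
```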

But despite their predictable results, researchers find Bell tests interesting for a totally different reason: They could be essential to the operation of future quantum technologies. "In the course of testing this strange, deep feature of nature, people realized these Bell tests could be put to work," says Kaiser.

For example, Google's baby quantum computer, which it plans to test later this year, uses entangled particles to perform computing tasks. Quantum computers could execute certain algorithms much faster because entangled particles can hold and manipulate exponentially more information than regular computer bits. But because entangled particles are so difficult to control, engineers can use Bell tests to verify their particles are actually entangled. "It's an elementary test that can show that your quantum logic gate works," Weinfurter says.

Bell tests could also be useful in securing data, says University of Toronto physicist Aephraim Steinberg, who was not involved in the research. Currently, researchers are developing cryptographic protocols based on entangled particles. To send a secure message to somebody, you'd encrypt your message using a cryptographic key encoded in entangled quantum particles. Then you send your intended recipient the key. "Every now and then, you stop and do a Bell test," says Steinberg. If a hacker tries to intercept the key, or if the key was defective in the first place, you will be able to see it in the Bell test's statistics, and you would know that your encrypted message is no longer secure.

In the near future, Weinfurter's group wants to use their experiment to develop a setup that could send entangled particles over long distances for cryptographic purposes. But at the same time, they'll keep performing Bell tests to prove, beyond any inkling of a doubt, that entanglement really exists. Because what's the point of developing applications on top of an illusion?

Read this article:
The Bizarre Quantum Test That Could Keep Your Data Secure - WIRED


IBM makes a leap in quantum computing power – PCWorld


IBM has some new options for businesses wanting to experiment with quantum computing.

Quantum computers, when they become commercially available, are expected to vastly outperform conventional computers in a number of domains, including machine learning, cryptography and the optimization of business problems in the fields of logistics and risk analysis.

Where conventional computers deal in ones and zeros (bits), the processors in quantum computers use qubits, which can simultaneously hold the values one and zero. This, to grossly oversimplify, allows a quantum computer with a 5-qubit processor to perform a calculation for 32 different input values at the same time.
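
To see where the 32 comes from, here is a minimal statevector sketch in Python (a simulation of the textbook math, not of IBM's hardware): n qubits are described by 2^n amplitudes, so putting all five qubits into an equal superposition spreads the state across all 32 classical input values at once. The catch, and the reason the claim above is a gross oversimplification, is that measuring the register collapses it to just one of those values.

```python
# Five qubits = a statevector of 2**5 = 32 complex amplitudes.
import numpy as np

n_qubits = 5
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Build the 32x32 operator that applies H to every qubit at once.
gate = H
for _ in range(n_qubits - 1):
    gate = np.kron(gate, H)

state = np.zeros(2 ** n_qubits)
state[0] = 1.0        # start in |00000>
state = gate @ state  # equal superposition over all 32 basis states

print(len(state))     # 32 amplitudes, one per classical input value
print(state[:4])      # each is 1/sqrt(32), about 0.177
```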

On Wednesday, IBM put a 16-qubit quantum computer online for IBM Cloud platform customers to experiment with, a big leap from the five-qubit machine it had previously made available. The company said that machine has already been used to conduct 300,000 quantum computing experiments by its cloud service users.

But that's not all: IBM now has a prototype 17-qubit system working in the labs, which it says offers twice the performance of the 16-qubit machine.

Quantum computing performance is hard to compare. Much depends on the quality of the qubits in the processor, which rely on short-lived atomic-level quantum phenomena and are thus somewhat unstable.

IBM is proposing a new measure of quantum computing performance that it calls quantum volume, which takes into account the interconnections between the qubits and the reliability of the calculations they perform.

The company's quantum computing division, IBM Q, has set its sights on producing a commercial 50-qubit quantum computer in the coming years.

Peter Sayer covers European public policy, artificial intelligence, the blockchain, and other technology breaking news for the IDG News Service.

See more here:
IBM makes a leap in quantum computing power - PCWorld


Quantum Computers Sound Great, But Who’s Going to Program Them? – TrendinTech

While everyone's in a rush to get to the end of the quantum computer race, has anyone really given a moment's thought as to who will actually program these machines? The idea of achieving quantum supremacy came after Google unveiled its new quantum chip design, and is all about creating a device that can perform a calculation impossible for a conventional computer to carry out.

Quantum computers should have no trouble outperforming conventional computers because they work on the basis of qubits. Unlike the bits that run conventional computers, which are either a 0 or a 1, qubits can be both at the same time, a phenomenon known as superposition. But in order to demonstrate that, thousands of qubits would be needed, and right now that's just not possible. So instead, Google is planning to compare the computers' ability to simulate the behavior of a random arrangement of quantum circuits, and estimates it will take around 50 qubits to outdo the most powerful of computers.

IBM is getting ready to release the world's first commercial universal quantum computing service later this year, which will give users the chance to connect to one of its quantum computers via the cloud for a fee. But there are still many hurdles to overcome before this technology becomes mainstream. One of these problems is that programming a quantum computer is much harder than programming a conventional computer. So, who's going to program them?

There are a number of quantum simulators available now that will help users get familiar with quantum computing, but a simulator is not the real thing, and real hardware is likely to behave very differently. MIT physicist Isaac Chuang said, "The real challenge is whether you can make your algorithm work on real hardware that has imperfections." It will take time for any computer programmer to learn the skills needed for quantum computing, but until the systems have been developed, what will they learn on?

This is one of the reasons for the push to make quantum devices more accessible. D-Wave made its qbsolv and QMASM tools available earlier this year in an attempt to bring more people into the realms of quantum computing. If the tools are available, more people will be tempted to have a go, and budding quantum computer scientists will be born. And as Google's researchers wrote in a statement, "If early quantum-computing devices can offer even a modest increase in computing speed or power, early adopters will reap the rewards."


Originally posted here:
Quantum Computers Sound Great, But Who's Going to Program Them? - TrendinTech


IBM builds two new Quantum Computing processors – Enterprise Times

IBM has successfully built and tested two new universal quantum computing processors. The first one is aimed at developers, researchers and programmers. Many of these have been working with the APIs IBM released for the IBM Quantum Experience in March. They will now be able to move from testing their applications on a five-qubit processor to one with 16 qubits. This is a significant jump forward, and it will be interesting to see if it leads to an acceleration in academic papers on the benefits of quantum computing.

The second processor is arguably far more interesting. IBM delivered a roadmap for its quantum computing journey that would see 50-qubit systems by 2020. It said that early access IBM Q commercial partners would get a new quantum chip this year, and it is now delivering them the prototype of that chip. It will have 17 qubits and a number of other new technologies. At present IBM talks about its current quantum computers as accelerators to classical computers.

The delivery of this chip raises two questions. The first is: what level of performance are early access customers getting from using a 17-qubit accelerator? Secondly, is the 17-qubit system capable of being a standalone solution and, if so, what level of classical computing is it comparable to?

According to Arvind Krishna, senior vice president and director of IBM Research and Hybrid Cloud: "The significant engineering improvements announced today will allow IBM to scale future processors to include 50 or more qubits, and demonstrate computational capabilities beyond today's classical computing systems. These powerful upgrades to our quantum systems, delivered via the IBM Cloud, allow us to imagine new applications and new frontiers for discovery that are virtually unattainable using classical computers alone."

Interestingly this has changed since the March announcement. Back then IBM was targeting five areas, all of which it has invested a lot of money in solving with existing computer technology. The five were:

The latest list compresses this into four areas with Business Optimisation showing that this is more than just a research project. The four areas listed in this release are:

What is not known is how many of the 300,000 experiments already run on the IBM Quantum Experience are aimed at these target areas. It would be useful to know how quickly current researchers and business partners are testing the limits of quantum computing.

IBM set out its roadmap to say that by 2020 it expected to deliver commercial scale quantum computing through the IBM Cloud. It is adopting the same model for this as it has with IBM Watson: pick key industry areas and then use the cloud to deliver it. Customers don't have to justify hardware costs, while research teams will be quick to experiment with the technology.

So far there has been limited feedback from the early adopter customers who took to the platform in March. That they have had a significant upgrade since then will no doubt please many of them. However, there is a need to understand just how much quantum computing is delivered as an accelerator. IBM is about to deliver POWER9 to the market, and all the public presentations show that the key focus of that processor is its accelerator technology.

The biggest indicator of where we are in terms of a commercial deployment will only arrive when we get details of how well a 17-qubit system compares with current technology. Until there is a clear indication of where it sits in terms of performance, it is hard to assess how close we are to IBM reaching its target of delivering a quantum computer that is more powerful than any other computer.

IBM is heading into a large number of launches this year. IBM Power Systems will launch the POWER9 processor. There is also a refresh due for the z Systems mainframes with the likelihood that this will not just focus on z14 but also on a new generation of LinuxONE boxes. In the background IBM is making blockchain announcements at an almost weekly rate. There is a danger that some of the questions over quantum computing will get lost in the noise which could be why IBM has made this announcement now.

Whatever the reason, we are moving faster and faster towards a whole new computing paradigm.

See original here:
IBM builds two new Quantum Computing processors - Enterprise Times


This week’s poll: Priorities for improving internet security – The Engineer

What is the most important target for investment to guard against incidents like last week's cyberattacks?

The WannaCry ransomware attacks last Friday caught everyone unawares, and institutions around the world, not least our very own National Health Service, are still assessing the damage and picking up the pieces. Attacks like these will almost always come without warning, and there seems to be widespread agreement that money needs to be spent to help prevent malware and hacking from damaging valuable and safety-critical systems and putting them out of action. As yet, however, there doesn't seem to be much clear consensus on the most important target for such investment, so we're making that the subject of this week's poll.

Should the main priority be on hardware; that is, replacing vulnerable IT systems? Should it be on updating software, such as the Windows XP operating system, which most of the affected computers were running, or on dedicated anti-malware programs to detect and deflect attacks? Or should the priority be skills: training systems administrators in the best ways to handle malware attacks and mitigate their effects if necessary?

Comments are, of course, welcomed, and as ever, particularly useful if you choose the "None of the Above" option. We will post the results of this survey on this page on 23 May.

Continued here:
This week's poll: Priorities for improving internet security - The Engineer


Akamai Releases First Quarter 2017 State of the Internet / Security Report – PR Newswire (press release)

Download the latest State of the Internet / Security Report for data, analysis, and graphics at akamai.com/stateoftheinternet-security.

"If our analysis of Q1 tells us anything, it's that risks to the Internet and to targeted industry sectors remain and continue to evolve" said Martin McKeay, senior security advocate and senior editor, State of the Internet / Security Report. "Use cases for botnets like Mirai have continued to advance and change, with attackers increasingly integrating Internet of Things vulnerabilities into the fabric of DDoS botnets and malware. It's short sighted to think of Mirai as the only threat, though. With the release of the source code, any aspect of Mirai could be incorporated into other botnets. Even without adding Mirai's capabilities, there is evidence that botnet families like BillGates, elknot, and XOR have been mutating to take advantage of the changing landscape."

Highlights from Akamai's First Quarter 2017 State of the Internet / Security Report include:

DDoS Attacks

Web Application Attacks

Top Attack Vectors

A complimentary copy of the Q1 2017 State of the Internet / Security Report is available for download at akamai.com/stateoftheinternet-security. Download individual figures, including associated captions, here.

About Akamai
As the world's largest and most trusted cloud delivery platform, Akamai makes it easier for its customers to provide the best and most secure digital experiences on any device, anytime, anywhere. Akamai's massively distributed platform is unparalleled in scale, with over 200,000 servers across 130 countries, giving customers superior performance and threat protection. Akamai's portfolio of web and mobile performance, cloud security, enterprise access, and video delivery solutions is supported by exceptional customer service and 24/7 monitoring. To learn why the top financial institutions, e-commerce leaders, media & entertainment providers, and government organizations trust Akamai, please visit www.akamai.com, blogs.akamai.com, or @Akamai on Twitter.

Contacts:

Rob Morton, Media Relations, 617-444-3641, rmorton@akamai.com
Tom Barth, Investor Relations, 617-274-7130, tbarth@akamai.com

To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/akamai-releases-first-quarter-2017-state-of-the-internet--security-report-300458077.html

SOURCE Akamai Technologies, Inc.

http://www.akamai.com

View original post here:
Akamai Releases First Quarter 2017 State of the Internet / Security Report - PR Newswire (press release)


Do Macs get viruses? – PC Advisor

Do Apple Macs need security software, or is the Mac OS safe to use without antivirus? Do Macs even get viruses? We explore the issues surrounding Macs and security software: why Macs don't need security software (mostly), but why you should still have antivirus on your Mac.

By Matt Egan | 15 May 17

The question of whether Macs need antivirus is not a new one, but the answer is changing. While Macs are generally more secure than Windows PCs, they are far from immune.

Head over to the forums on Apple's website and you'll find the same answer time and again: 'you don't need security software because I haven't got security software and it's never been a problem'.

The scientists among you will recognise this as a confusion of cause and effect. It's also a simplification of a complex issue.

2017 has already seen multiple reports that suggest Macs are less secure than they once were. Business Insider claims that Macs are now more vulnerable to viruses and attack than even Windows PCs. And Fortune has warned of Mac malware that can freeze Apple computers.

Here then, are reasons for and against the suggestion that Macs don't require antivirus.

There are no technical reasons why the Mac OS cannot be targeted by cybercriminals. Indeed, there are exploits in the wild: albeit they are principally Trojans, and require a user to erroneously install them.

Yes, criminals target the lowest-hanging fruit, it is harder to target Macs, and the numbers of Mac users are relatively small, but that situation could change.

Windows is becoming more secure - Windows 8 and Windows 10 are the most secure Windows ever - and the Mac market share in wealthy western countries is around 20-30 percent.

When you consider that Macs are expensive, and so their owners tend to be wealthier than the average PC user, they start to look like an attractive target.

I'm not scare-mongering - the threat is not there at any significant scale. But someday it could be, and that may make AV a worthwhile investment.

It's also worth remembering that the end user is always the weakest link. In many ways security software exists to save you from bad decisions - installing apps that appear to offer something for nothing, but turn out to be spyware or viruses.

Even Mac users can fall victim in this way. So for the price of a cup of coffee each week, it makes sense to install security software and then forget all about it.

Our colleagues over at Macworld UK have the definitive guide to the best antivirus for Mac. It is regularly updated, and is worth checking out as some of the best Mac antivirus products are free. Right now the number one recommended product is BitDefender Antivirus and the number two is Norton Antivirus.

What follows is the counter argument as to why Macs don't need antivirus, but we still think it's better to be safe than sorry.

Quite simply, because all the evidence suggests they don't. I've had an unprotected Mac connected to the web for nearly 15 years, and I have never had a problem. Why this is the case is worth investigating, however.

The argument most often put forward is a simple one of market economics: because Apple's global market share is in single figures, criminals go after the bigger shoals of fish in the Windows world.

There is something in this - virtually all current malware exists to generate cash for criminals. Crooks are not known for their application or invention, so the biggest, easiest target gets all the attention.

In practice, cybercriminal gangs are focused exclusively on Windows because there are more Windows users, yes, but also because Windows is still easier to hack.

As a Unix-based operating system, the Mac OS is by its very nature sandboxed. It's like having a series of fire doors - even if malware gains access to your Mac, it is unable to spread to the heart of the machine.

Macs are not unhackable, but they are more difficult to exploit than Windows PCs. So just as a burglar could break into a house with an alarm system but will probably choose the unprotected dwelling next door, a Mac makes a less attractive target in a world in which only attractive targets tend to be attacked.

The most recent versions of macOS - everything since OS X 10.8 Mountain Lion - take this even further. They include the Gatekeeper function, which by default prevents Mac users from installing anything other than Apple-approved software.

The existence of the Mac App Store means that Apple computer users can install software with total peace of mind.

And the lack of Java and Flash plugins removes the temptation to install fake versions of both - previously the principal vectors of infection for Macs.

I'd say that if you are using your Mac at home, mostly for non-business purposes, you can close this article and continue to operate without security software. Yes, it is a risk. But using the internet is a risk, and in my considered view running a Mac without AV is a worthwhile calculated risk.

There are exceptions, however. If you are running a business with a fleet of Macs, or a network of both Macs and Windows PCs, I'd suggest getting in some protection. It's a belt and braces approach that may not be necessary, but if you have a lot to lose it's a small price to pay for peace of mind.

You may also consider using antivirus on your Mac if for some reason you could be targeted individually - if you have access to sensitive or high-value data, for instance. If you do choose to buy Mac antivirus, take a look at the reviews roundup put together by our colleagues on Macworld: Best Mac antivirus software.

More here:
Do Macs get viruses? - PC Advisor


Encryption | Android Open Source Project

Encryption is the process of encoding all user data on an Android device using symmetric encryption keys. Once a device is encrypted, all user-created data is automatically encrypted before committing it to disk, and all reads automatically decrypt data before returning it to the calling process. Encryption ensures that even if an unauthorized party tries to access the data, they won't be able to read it.

Android has two methods for device encryption: full-disk encryption and file-based encryption.

Android 5.0 and above supports full-disk encryption. Full-disk encryption uses a single key, protected with the user's device password, to protect the whole of a device's userdata partition. Upon boot, the user must provide their credentials before any part of the disk is accessible.

While this is great for security, it means that most of the core functionality of the phone is not immediately available when users reboot their device. Because access to their data is protected behind their single user credential, features like alarms could not operate, accessibility services were unavailable, and phones could not receive calls.

Android 7.0 and above supports file-based encryption. File-based encryption allows different files to be encrypted with different keys that can be unlocked independently. Devices that support file-based encryption can also support a new feature called Direct Boot that allows encrypted devices to boot straight to the lock screen, thus enabling quick access to important device features like accessibility services and alarms.

With the introduction of file-based encryption and new APIs to make applications aware of encryption, it is possible for these apps to operate within a limited context. This can happen before users have provided their credentials while still protecting private user information.
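
The pattern is easier to see in miniature. Below is a toy Python sketch of the file-based idea - not Android's actual implementation, which works at the kernel and filesystem level - in which every file gets its own key, and each file key is wrapped by one of two class keys: a device key available from boot, and a credential key that exists only after the user unlocks. It assumes the third-party cryptography package (pip install cryptography).

```python
# Toy sketch of file-based encryption with two key classes, loosely
# mirroring Android's Device Encrypted (DE) vs Credential Encrypted (CE)
# storage. Illustration only, not Android's real key hierarchy.
from cryptography.fernet import Fernet

# The device key is available in Direct Boot mode; the credential key
# only becomes available once the user enters their PIN or password.
device_key = Fernet(Fernet.generate_key())
credential_key = Fernet(Fernet.generate_key())

def encrypt_file(data: bytes, class_key: Fernet):
    """Encrypt under a fresh per-file key; wrap that key with a class key."""
    file_key = Fernet.generate_key()
    ciphertext = Fernet(file_key).encrypt(data)
    wrapped_key = class_key.encrypt(file_key)  # each file unlockable independently
    return wrapped_key, ciphertext

def decrypt_file(wrapped_key: bytes, ciphertext: bytes, class_key: Fernet) -> bytes:
    file_key = class_key.decrypt(wrapped_key)
    return Fernet(file_key).decrypt(ciphertext)

# An alarm database can live in device-encrypted storage and work before
# unlock; private user files stay sealed until the credential arrives.
alarm_blob = encrypt_file(b"07:00 wake up", device_key)
photo_blob = encrypt_file(b"<jpeg bytes>", credential_key)

print(decrypt_file(*alarm_blob, device_key))      # readable in Direct Boot
print(decrypt_file(*photo_blob, credential_key))  # needs the user's credential
```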

Follow this link:
Encryption | Android Open Source Project


NuCypher is using proxy re-encryption to lift more enterprise big data into the cloud – TechCrunch

After spending time at a London fintech accelerator last year, enterprise database startup ZeroDB scrapped its first business plan and mapped out a new one. By January this year it had a new name: NuCypher. It was no longer going to try to persuade enterprises to switch out their Oracle databases but rather to sell them on a specialized encryption layer to enhance their ability to perform big data analytics by tapping into the cloud. Its slogan: "body armor for big data."

Today it's launching an open source version of its general release product here at TechCrunch Disrupt New York. At this point, the almost 1.5-year-old startup is also running a handful of pilots with major banks, says co-founder MacLane Wilkison.

"It's a combination of cloud and big data," he says of the underlying drivers, which the team reckons are creating a need for the technology. "Now all of a sudden you're working in computing environments that are distributed across hundreds or thousands of machines, and that could be spanning both some on-prem, some private and even public cloud. And that sort of scenario presents a lot of new and different security challenges."

Instead of building an open source end-to-end encrypted database, NuCypher is selling a proxy re-encryption platform for corporates with large amounts of sensitive data stored in encrypted databases, to let them securely tap into the power of cloud computing. An idea that might need a bit of explaining to appreciate, but one that's grounded in a genuine need, at least based on what NuCypher's early banking partners are telling it.

On the competitors front, Wilkison names the likes of HP-owned Voltage and Protegrity as the largest existing players in the space, albeit he says they're both doing tokenization of data, whereas NuCypher reckons proxy re-encryption technology offers greater security for certain types of data.

Unlike some other approaches to processing big data in the cloud, he emphasizes that NuCypher is not using tokenization to mask any data, arguing this is necessary for the target customers because certain types of data, when masked with tokens, can be vulnerable to statistical attacks.

While proxy re-encryption is an existing area of cryptography, applying it to big data is what's novel here, according to Wilkison, who says the tech has mostly been used in academia thus far. "We're the only people that applied it to big data platforms like Hadoop and Spark," he says. "As far as I know we're the only one using proxy re-encryption in business."

So while the team's early ideas focused mostly on looking at data archiving and encryption to enable banks to make use of cloud storage, he says the business was pulled onto its current rails after banks asked if they could apply the encryption tech the team had been building for data archiving to big data cloud processing.

Safe to say, this mini pivot is a familiar story for enterprise startups - after all, who knows the business needs better than the target customers?

"When we originally started the company, my co-founder and I had built an open-source database and then an encrypted database that allows you to operate on encrypted data without sharing encryption keys with the database server," says Wilkison. "What the banks were particularly interested in was taking some of what we had built for that and applying it to more compute-heavy types of workloads."

"After a period of talking to customers we took some of what we had built for that and made it into a more generalized encryption layer for different platforms, specifically for the big data space. So Hadoop, Kafka and Spark."

So what is proxy re-encryption - aka NuCypher's "secret sauce," as Wilkison puts it - and why is the technique useful for banks?

"Proxy re-encryption is a set of encryption algorithms that allow you to transform encrypted data. Specifically, it allows you to re-encrypt data - so you have data that's encrypted under one set of keys, and you can re-encrypt the data, without decrypting it first, so that now it's encrypted under a second, different set of keys," is how Wilkison explains it.

He gives the example of a person who has some encrypted files stored in Dropbox. If they want to share the files with someone else, that could be achieved by downloading them, decrypting them with their key and then re-encrypting them with the public key of the person they want to share with. But obviously at scale that's a pretty network-intensive and cumbersome process.

Even more naively, this person could just share their private encryption key with the person they want to share the file with. But then they're abandoning all control of their security.

Clearly neither scenario is ideal for NuCypher's target customers with their vast lakes of sensitive, highly regulated data. This is where NuCypher reckons proxy re-encryption can step in to offer an edge.

"What I can do with proxy re-encryption that's much more elegant and secure than either of those alternatives is I can basically delegate access to my encrypted data to someone else's public key," he adds.

The platform creates a re-encryption token off of the public key of the entity with whom its customer wants to share data. That token can then be uploaded to the cloud, where the third party can access it, in turn enabling them to decrypt and access the data.

Wilkison says re-encryption tokens can be created and used to delegate access "to as many people as I like."
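
To make the mechanics concrete, here is a minimal Python sketch of a classic ElGamal-style proxy re-encryption scheme in the spirit of Blaze, Bleumer and Strauss (BBS98) - not NuCypher's own construction: the proxy holds only a re-encryption key and transforms Alice's ciphertext into one Bob can decrypt, without ever seeing the plaintext or either private key. The group parameters are toy-sized for readability and far too small for real use.

```python
# Toy ElGamal-style proxy re-encryption (BBS98 flavor). Illustration only.
import secrets

# Public parameters: safe prime p = 2q + 1 and a generator g of the
# order-q subgroup. Real deployments use much larger groups or curves.
q = 1019
p = 2 * q + 1  # 2039, prime
g = 4          # quadratic residue, so it generates the order-q subgroup

def keygen():
    sk = secrets.randbelow(q - 1) + 1
    return sk, pow(g, sk, p)  # (private, public)

def encrypt(m, pk):
    """Ciphertext (pk^r, m * g^r): only the key owner can strip off g^r."""
    r = secrets.randbelow(q - 1) + 1
    return pow(pk, r, p), (m * pow(g, r, p)) % p

def rekey(sk_a, sk_b):
    """Re-encryption key b/a mod q, computed by Alice, handed to the proxy."""
    return (sk_b * pow(sk_a, -1, q)) % q

def reencrypt(ct, rk):
    """Proxy transform: (g^(a*r))^(b/a) = g^(b*r). Plaintext never exposed."""
    c1, c2 = ct
    return pow(c1, rk, p), c2

def decrypt(ct, sk):
    c1, c2 = ct
    shared = pow(c1, pow(sk, -1, q), p)  # (g^(sk*r))^(1/sk) = g^r
    return (c2 * pow(shared, -1, p)) % p

sk_a, pk_a = keygen()                    # Alice, the data owner
sk_b, pk_b = keygen()                    # Bob, the delegatee
ct = encrypt(42, pk_a)                   # stored encrypted under Alice's key
ct_b = reencrypt(ct, rekey(sk_a, sk_b))  # proxy re-targets it to Bob
assert decrypt(ct_b, sk_b) == 42
```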

Ensuring compliance with regulations around the processing of sensitive data - data such as a bank or healthcare company might hold - is one key selling point for the platform.

He points to a regulation like HIPAA, which sets standards for protecting healthcare data, as one example where a lot of care is needed when handling data to ensure compliance. He also flags up the European Union's incoming GDPR (General Data Protection Regulation), which ramps up penalties for violations of rules on processing citizens' personal data, as another instance of data-centric laws creating data processing pain-points that NuCypher's platform is setting out to fix.

Other target data-laden industries could include telecoms and insurance, though the team has kicked off focusing on financial services, and the current pilot phase of the platform is with major banks.

Wilkison says there are essentially three main use-cases for the platform:

Another benefit he notes is that NuCypher's proxy re-encryption technology enables it to give customers the ability to manage access controls without needing to provide full access to the data, which means it can remove any single point of failure (i.e. via an admin who has to have full access control to all of the data).

"With NuCypher a hacker would have to hack into each node individually in order to get all the data," he adds.

Given the complexities of the technology, customer education is clearly one of the big challenges, with Wilkison saying this boils down to explaining how the approach differs from standard encryption.

And on that front, he says one selling point for the platform is that the proxy re-encryption tech works with NIST-standardized encryption algorithms, which means NuCypher customers don't have to abandon the tried and tested encryption algorithms they're comfortable using, such as AES-256, in order to make use of the tech.

"That was one of the pieces that we added that took a pretty significant amount of research to develop, for us to get proxy re-encryption to work with things like ECIES, which is a standard elliptic curve, NIST-certified," he notes. "So we can go to a customer and say, everything that we're doing on a crypto level is very standardized, very well understood by industry. So they're not having to rely on newly rolled crypto."

NuCypher's platform exists as an SDK and an encryption library, so its business model is licensing the software; it's not hosting any data itself, confirms Wilkison. Customers can install the software on premise, such as within an existing Hadoop deployment, or directly in the cloud on the infrastructure they're managing.

Funding-wise, the team has raised a $750,000 seed round to date, from Valley investors including Base Ventures, NewGen Capital and some angels. It also went through Y Combinator last summer. Wilkison says it will be looking to raise again in Q3 this year.

How big do they reckon this market is? Wilkison says he's hoping the current six to seven pilot customers of NuCypher will turn into high double digits, or maybe low triple digits, in a year's time. But with those target large enterprises typically spending vast amounts of money on securely storing the sensitive data they're entrusted with, there's also a very sizeable incentive for them to shift some of that compute load into the cloud. And, potentially, a lot of money at stake if NuCypher can convince them to buy in.

NuCypher presents at Startup Battlefield at TechCrunch Disrupt NY 2017

Judges Q&A

Q: Can you talk a bit more about how far along you are with some of the early clients? A: We're in pilot stage right now. The bulk of our early customers are in financial services. We're starting to get traction in healthcare and telcos as well. Pilot phase at this stage.

Q: Tell me a bit more about the competition. A: There's a couple of ways to look at this. One: the platforms that we support do have some native data protection built in. So Hadoop, for example. These tend not to be robust enough for the types of enterprise customers that we're working with. Other alternatives include data masking and tokenization. HP Voltage, for example.

Q: You worked before at Morgan Stanley. Why did you leave a steady job with a nice salary and Wall Street and go into this kind of adventure? A: Ultimately I wanted to get back to a more technical role, and actually start building a product in a company again as opposed to building financial models and pitch decks

Q: And this is the actual launch of the product? A: We're launching the open source version. We've had Hadoop available for a while. And then Kafka is launching as well

Q: What did your mother say when you told her that you were leaving Morgan Stanley for this adventure? A: She was supportive. Although maybe she didn't quite understand what we were doing

Q: Can you tell me more about the implementation? What does it look like as you deploy to an enterprise - how do you get all of their existing data encrypted, and how do you do key management? A: On the key management side we actually integrate with hardware security modules, so at lots of banks we use HSMs from vendors like Thales or SafeNet.

For Hadoop we encrypt at the HDFS layer. And everything is transparent to applications running on top of Hadoop, so it doesn't change the experience for someone running Hive queries, for example.

And we also integrate with access control tools like Ranger and Sentry. So people can keep using the standard tools that they use.

Q: Is your business a classic SaaS model? A: We're not hosting anything. It's not software as a service. We have term-based subscriptions, and then also a consumption-based model for cloud deployments.

Q: How do you intend to go to market? Sales force? Direct sales? A: Some combination of direct sales, which we've done today, and then also the channel partners and big data vendors and the cloud service providers as well, folks like Amazon and Microsoft.

Q: Who are your main competitors? A: The data masking and tokenization companies are the ones we run into most regularly. Voltage, which is now part of HP. In Europe we see a company called Protegrity pretty frequently. And then, as I mentioned before, a lot of the underlying platforms will have some sort of protection tools natively.

Q: Do you run into people like Ciphercloud or Ionic? A: Not so much anymore. We're similar in some ways to them; we're more focused on infrastructure like Hadoop and data platforms

Q: How many people are you now? A: We're the two founders and then seven people total on the team

Q: And how much money did you raise? A: We've raised $750K so far from Y Combinator, NewGen Capital and Base Ventures

Q: How long ago? A: Last fall

Q: How hard would it be for your competitors to replicate the work that you've done? A: Certainly it's a lot easier now that it's open source. That said, we do have an open core approach, so we have certain enterprise features that are still proprietary, that are only available in the enterprise version. Additionally, if the Hadoop vendors integrated what we're doing natively into Hadoop, that's still just for Hadoop.

So NuCypher's meant to be layered; it sits across all of the organization's big data platforms. Right now they use Hadoop, Kafka, Spark. In the future that could include some new SQL databases, and potentially structured databases as well

Q: Judging from your experience with your colleague, how do you compare the American level of mathematics and physics to the Russian one? A: The American approach is lacking. I'm hugely impressed. Not only is my co-founder Russian educated, and Russian born, a lot of our engineers are as well, so we've been very fortunate in that regard

See the article here:
NuCypher is using proxy re-encryption to lift more enterprise big data into the cloud - TechCrunch


How to make Fully Homomorphic Encryption "practical and usable … – Network World


By Bob Brown

News Editor, Network World | May 15, 2017 11:39 AM PT

Fully Homomorphic Encryption (FHE) for years has been a promising approach to protecting data while it's being computed on, but making it fast enough and easy enough to use has been a challenge.

The Intelligence Advanced Research Projects Activity, which has been leading the Department of Defense's examination of this topic, recently awarded research and development firm Galois a $1M contract to explore ways to bring FHE to programmers.

The goal, says Galois Principal Investigator Dr. David Archer, is making FHE "practical and usable," and his outfit is working with researchers at the New Jersey Institute of Technology on this front via the Rapid Machine-learning Processing Applications and Reconfigurable Targeting of Security (RAMPARTS) initiative.

While it's up to IARPA as to what becomes of the researchers' output, Archer says he wouldn't be surprised if it goes the open source route.

Just to step back: Archer describes FHE as a variant of Somewhat [or Semi] Homomorphic Encryption that allows for computing a function on data that is encrypted, so that the data is never in the clear while the computation is going on. It produces an encrypted result that a user with the right key can decrypt. The first live construction of FHE showed up from IBM in 2009, whereas Somewhat Homomorphic Encryption has been around since the 1970s and 80s.

FHE allows for conducting more complex functions than Somewhat Homomorphic Encryption.

"There's more and more data available," Archer says. "And people are recognizing, maybe not for the first time, that it's important to keep that data private, yet it would be great if we could get utility out of it."

A researcher studying the opioid crisis, for example, might benefit from running an analysis on private healthcare data, but the owner of that data might not feel comfortable providing access due to privacy concerns. FHE, if trusted by both parties, could allow a way for researchers to make use of data without actually seeing the original information.
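
The idea is easier to see with a small runnable example. The sketch below uses the Paillier cryptosystem, which is only partially (additively) homomorphic rather than fully homomorphic, and is unrelated to the lattice-based schemes this research targets, but it shows the core trick: an untrusted party computes on ciphertexts it cannot read, and only the key holder can decrypt the result.

```python
# Minimal Paillier cryptosystem: additively homomorphic, so a party
# holding only ciphertexts can compute an encrypted sum. Toy key size!
import math
import secrets

p, q = 1009, 1013             # toy primes; real keys use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
g = n + 1                     # standard choice of generator
lam = math.lcm(p - 1, q - 1)  # Carmichael's lambda(n)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Homomorphic property: Enc(a) * Enc(b) mod n^2 decrypts to a + b.
a, b = 123, 456
c = (encrypt(a) * encrypt(b)) % n2  # done by the untrusted server
assert decrypt(c) == a + b          # only the key holder sees 579
```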

Dr. David Archer, Principal Investigator at Galois, says FHE could boost network management

Archer even cites an example that could apply to IT networking pros, such as if you wanted to study network data traces from service providers to possibly spot attacks from new adversaries.

But he acknowledges that even a promising technology like FHE could be a tough sell.

"There are a couple of problems you need to get people past," he says. "One is belief that the data is never going to be in the clear, because there's still a bit of black magic sound to this... Another challenge is you have to think beyond just the thing you want to compute, you also have to think about what does it reveal? Because none of these things can hide the output. So if someone was to be a little malicious and said the analysis I want to run is just show me the data, then that wouldn't accomplish this privacy preservation that the data owners would want. So there's also this negotiation challenge between the researcher and data owner."

Statutory regulation is one more challenge, as there could be statutes that don't even let you share encrypted data, Archer says.

For now, Galois and New Jersey Institute of Technology are building a prototype that lets an analyst using the Julia language write programs that run functions on FHE data just as they would any other programs, except they'd tag these functions to deal with encrypted data. "Think of it as a transparent homomorphic subsystem that can run in a Julia environment," Archer says.

The reality is that "your mileage may vary" when using FHE, which is still very limited in production applications and still needs plenty of work on optimization, Archer says. So it might work quite well on certain functions like linear regressions, but be terribly inefficient on other operations, he says. However, Archer says the future for FHE is bright. About 5 years ago, using FHE was 10 to 12 orders of magnitude slower than computing in the clear, whereas a couple of years ago that had been improved by 6 or 7 orders of magnitude, he says. We're still not near real-time processing, but FHE is definitely getting faster, Archer says.


Bob Brown is a news editor for Network World, blogs about network research, and works most closely with our staff's wireless/mobile reporters. Email me at bbrown@nww.com with story tips or comments on this post. No need to follow up on PR pitches via email or phone (I read my emails and will be in touch if interested, thanks)


Continue reading here:
How to make Fully Homomorphic Encryption "practical and usable ... - Network World
