
Will quantum computing overwhelm existing security tech in the near future? – Help Net Security

More than half (54%) of cybersecurity professionals have expressed concerns that quantum computing will outpace the development of other security tech, according to research from Neustar.

Keeping a watchful eye on developments, 74% of organizations admitted to paying close attention to the technology's evolution, with 21% already experimenting with their own quantum computing strategies.

A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cyber security professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years.

Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true quantum supremacy will never happen.

Despite expressing concerns that other technologies will be overshadowed, 87% of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious and under the impression that the technology would create more harm than good.

"At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years," said Rodney Joffe, Chairman of NISC and Security CTO at Neustar.

"Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we've ever seen."

"For both today's major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately."

"The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar."

"Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity," added Joffe.

The report also highlighted a steep two-year increase on the International Cyber Benchmarks Index. Calculated based on changes in the cybersecurity landscape, including the impact of cyberattacks and changing levels of threat, November 2019 saw the highest score yet at 28.2. In November 2017, the benchmark sat at just 10.1, demonstrating an 18-point increase over the last couple of years.

During September and October 2019, security professionals ranked system compromise as the greatest threat to their organizations (22%), with DDoS attacks and ransomware following very closely behind (21%).

See original here:
Will quantum computing overwhelm existing security tech in the near future? - Help Net Security

Read More..

Quantum expert Robert Sutor explains the basics of Quantum Computing – Packt Hub

What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits written by Robert Sutor. Robert helps you understand how quantum computing works and delves into the math behind it with this quantum computing textbook.

Let's start with C8H10N4O2, also known as 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, and this translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
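As a quick back-of-the-envelope check (not part of the excerpt), the figure follows from caffeine's approximate molar mass of 194.19 g/mol and Avogadro's number:

```python
# Rough check of the caffeine figure quoted above (values approximate).
AVOGADRO = 6.022e23            # molecules per mole
MOLAR_MASS = 194.19            # g/mol for caffeine, C8H10N4O2 (approximate)

mass_g = 0.095                 # 95 mg of caffeine, in grams
molecules = mass_g / MOLAR_MASS * AVOGADRO
print(f"{molecules:.2e}")      # ~2.95e+20
```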

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule and contains protons, neutrons, and electrons. If we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information to describe this is staggering. In particular, the number of bits, the 0s and 1s, needed is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information. It handles everything from that single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
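That arithmetic is easy to verify on any classical machine; a two-line check (mine, not the book's):

```python
# 2**160 is an exact integer in Python; also print it in scientific notation.
n = 2 ** 160
print(n)
print(f"{float(n):.2e}")  # ~1.46e+48
```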

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think random. This is called a fair flip. It is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By secret coin flip, I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
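A minimal sketch of such a routine (my illustration, not code from the book; random.randint is only pseudorandom, which is exactly the caveat raised above):

```python
import random

def R(bit: int) -> int:
    """Fair 'secret coin flip': the input is ignored, and 0 or 1 is
    returned with equal probability, so the input cannot be recovered."""
    return random.randint(0, 1)

# Applying R once or many times gives the same 50/50 statistics:
once = [R(1) for _ in range(100_000)]
twice = [R(R(1)) for _ in range(100_000)]
print(sum(once) / len(once), sum(twice) / len(twice))  # both close to 0.5
```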

There is a catch, though. In the quantum case there is an operation, call it H, whose output also looks random but which, unlike R, can be undone by applying it a second time. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
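To get a feel for that exponential growth, here is a small illustrative loop (mine, not the author's) comparing a classical n-bit register with the 2^n amplitudes needed to describe a general n-qubit state:

```python
# A classical n-bit register holds just one of its 2**n possible values at
# any moment; a general n-qubit state needs 2**n complex amplitudes to describe.
for n in (8, 16, 32, 64):
    print(f"{n:>2} bits:   one value at a time out of {2**n:,}")
    print(f"{n:>2} qubits: described by {2**n:,} amplitudes")
```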

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. These three cases are in some sense the small, medium, and large ways quantum computing might complement classical techniques:

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and where it might be applied to artificial intelligence.

Get the quantum computing book Dancing with Qubits by Robert Sutor today, in which he explores the inner workings of quantum computing. The book entails some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.

Intel introduces cryogenic control chip, Horse Ridge for commercially viable quantum computing

Microsoft announces Azure Quantum, an open cloud ecosystem to learn and build scalable quantum solutions

Amazon re:Invent 2019 Day One: AWS launches Braket, its new quantum service and releases

See the original post:
Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub

Read More..

Traditional cryptography doesn’t stand a chance against the quantum age – Inverse

Quantum computers will make easy work of our current encryption systems, putting some of the worlds most sensitive data at risk. And John Prisco, CEO of the security company Quantum Xchange, tells Inverse that the time for new encryption is already here.

Traditional cryptography relies on a system of public and private encryption keys that protect data by making decryption depend on solving incredibly complex math, namely the factoring of large numbers into their prime components. For today's computers, trying to find the answer through brute force (e.g. guessing as many different answers as possible) would be nearly impossible. But for quantum computers, such computational hurdles would be trivial.
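To make that asymmetry concrete, here is a toy brute-force factoring sketch (my illustration, not from the article); it is instant for a four-digit number but hopeless for the hundreds-of-digits moduli used in practice, which is the gap a quantum algorithm such as Shor's is expected to close:

```python
def smallest_factor(n: int) -> int:
    """Brute-force trial division: return the smallest prime factor of n."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

# A toy modulus factors instantly; real RSA moduli are hundreds of digits,
# and the number of candidates to try grows exponentially with key length.
print(smallest_factor(3233))  # 3233 = 53 * 61, so this prints 53
```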

"Before computers were as powerful as they are today, that [kind of cryptography] was going to be good for a million years," says Prisco. "[But] a million years got truncated into just a handful of years."

But such computational might, for the time being, is still fairly theoretical. Google was only able to achieve quantum supremacy (a benchmark that compares its computational abilities to a classical computer) this year and quantum systems are far from office staples. Yet, Prisco tells Inverse that waiting until these machines become more widespread to begin improving our encryption methods would be too late.

"People are stealing data today and then harvesting [and] storing it," says Prisco. "And when they crack the key, then they've got the information. So if you have data that has a long shelf life, like personal information, personnel records, you really can't afford to not future proof that."

And government agencies, says Prisco, are worried about this too. In 2017, NIST (the National Institute of Standards and Technology) put out a call for new, quantum-resistant algorithms. Out of the 82 submissions it received, only 26 are still being considered for implementation. But Prisco tells Inverse that simply creating algorithms to combat these advanced computers won't be enough. Instead, we need to fight quantum with quantum.

That's where Prisco's company, Quantum Xchange, comes in. Instead of focusing on quantum-resistant algorithms, Quantum Xchange creates new encryption keys that themselves rely on the physics of quantum mechanics.

Just as today's keys are made up of numbers, says Prisco, their quantum key (called a QKD, for quantum key distribution) would be made up of photons.

"[The QKD's] photons are encoded with ones and zeros, but rather than relying on solving a difficult math problem, it relies on a property of physics," says Prisco. "And that property is associated with not being able to observe a photon in any way, shape, or form without changing its quantum state."

This quantum property that Prisco refers to is a law of physics called the Heisenberg Uncertainty Principle. According to this principle, the quantum state of the QKD is only stable as long as it's not observed. So, even if a nefarious actor were to steal the QKD, Prisco tells Inverse, the very act of stealing it would count as observation and would thus change the QKD altogether and render it moot.

"You could steal the quantum key," says Prisco, "but it would no longer be the key that was used to encrypt and therefore it would no longer be able to decrypt."

Prisco tells Inverse that he believes this new generation of quantum keys would remain resilient as long as the laws of quantum physics did. So in theory, a very, very long time.
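The detection principle can be illustrated with a toy, BB84-style simulation (a sketch of the general idea, not Quantum Xchange's product or protocol): an eavesdropper who measures photons in randomly chosen bases and re-sends them introduces roughly 25% errors into the sifted key, which the communicating parties can spot by comparing a sample of bits.

```python
import random

BASES = "+x"

def measure(bit, prep_basis, meas_basis):
    """Toy model of a photon measurement: matching bases read the bit
    faithfully; mismatched bases give a random outcome (state disturbed)."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def sifted_error_rate(rounds=20_000, eavesdrop=False):
    errors = kept = 0
    for _ in range(rounds):
        bit = random.randint(0, 1)
        basis_a = random.choice(BASES)          # sender prepares the photon
        send_bit, send_basis = bit, basis_a
        if eavesdrop:                           # eavesdropper measures and re-sends,
            basis_e = random.choice(BASES)      # unavoidably disturbing the state
            send_bit = measure(send_bit, send_basis, basis_e)
            send_basis = basis_e
        basis_b = random.choice(BASES)          # receiver measures
        result = measure(send_bit, send_basis, basis_b)
        if basis_a == basis_b:                  # keep only matching-basis rounds
            kept += 1
            errors += result != bit
    return errors / kept

print(f"no eavesdropper:   {sifted_error_rate():.1%}")                 # ~0%
print(f"with eavesdropper: {sifted_error_rate(eavesdrop=True):.1%}")   # ~25%
```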

While other experts have estimated that it will be ten years until such quantum attacks really start taking place, Prisco tells Inverse he believes it will be less than five. And waiting to develop these technologies will not only put our data at risk, but could put us behind the curve when it comes to competing with other countries in this arena as well, particularly China, which Prisco says is outspending the U.S. 10-to-1 in quantum technology.

Going forward, Prisco says that the U.S.'s best bet will be to incorporate both the quantum-resistant algorithms being developed by NIST and other government agencies as well as a quantum key like their QKD.

"I'm a proponent for combining what NSA and NIST are doing with quantum-resistant algorithms with quantum keys," says Prisco. "You know, it may seem like a revolutionary concept in the United States, but I can tell you that China's doing this, all of Europe's doing this, Russia's doing this. Everybody kind of realizes that the quantum computer is an offensive weapon when it comes to cryptography. And that the first defensive weapon one can deploy are the quantum keys, and then quantum-resistant algorithms when they're available."

Go here to read the rest:
Traditional cryptography doesn't stand a chance against the quantum age - Inverse

Read More..

China is beating the US when it comes to quantum security – MIT Technology Review

It's been more than four years since hackers linked with China breached the US Office of Personnel Management's computer system and stole sensitive information about millions of federal employees and contractors. It was the sort of information that's collected during background checks for security clearances: very personal stuff. But not all was lost. Even though there were obviously some massive holes in the OPM's security setup, some of its data was encrypted. It was useless to the attackers.

Perhaps not for much longer. It's only a matter of time before even encrypted data is at risk. That's the view of John Prisco, CEO of Quantum Xchange, a cybersecurity firm based in Bethesda, Maryland. Speaking at the EmTech Future Compute event last week, he said that China's aggressive pursuit of quantum computing suggests it will eventually have a system capable of figuring out the key to access that data. Current encryption doesn't stand much of a chance against a quantum system tasked with breaking it.

China is moving forward with a "harvest today, read tomorrow" approach, said Prisco. The country wants to steal as much data as possible, even if it can't access it yet, because it's banking on a future when it finally can, he said. Prisco says China is outspending the US in quantum computing 10 times over. It's allegedly spending $10 billion alone to build the National Laboratory for Quantum Information Sciences, scheduled to open next year (although this number is disputed). America's counterpunch is just $1.2 billion over five years toward quantum information science. "We're not really that safe," he said.


Part of China's massive investment has gone toward quantum security itself, including the development of quantum key distribution, or QKD. This involves sending encrypted data as classical bits (strictly binary information) over a fiber-optic network, while sending the keys used to decrypt the information in the form of qubits (which can represent more than just two states, thanks to quantum superposition). The mere act of trying to observe the key changes its state, alerting the sender and receiver of a security breach.

But it has its limits. QKD requires sending information-carrying photons over incredibly long distances (tens to hundreds of miles). The best way to do this right now is by installing a fiber-optic network, a costly and time-consuming process.

It's not foolproof, either. The signals eventually scatter and break down over long stretches of fiber optics, so you need to build nodes that will continue to boost them forward. These networks are also point-to-point only (as opposed to a broadcast connection), so you can communicate with only one other party at a time.

Nevertheless, China looks to be all in on QKD networks. It's already built a 1,263-mile link between Beijing and Shanghai to deliver quantum keys. And a successful QKD demonstration by the Chinese Micius satellite was reported across the 4,700 miles between Beijing and Vienna.

Even Europe is making aggressive strides: the European Union's OPENQKD initiative calls for using a combination of fiber optics and satellites to create a QKD-safe communications network covering 13 nations. The US, Prisco argues, is incredibly far behind, for which he blames a lack of urgency. The closest thing it has is a 500-mile fiber-optic cable running down the East Coast. Quantum Xchange has inked a deal to use the cable to create a QKD network that secures data transfers for customers (most notably the financial companies based around New York City).

With Europe and China already taking QKD seriously, Prisco wants to see the US catch up, and fast. "It's a lot like the space race," he said. "We really can't afford to come in second place."

Update: This story has been amended to note that the funding figures for the National Laboratory for Quantum Information Sciences are disputed among some experts.

See the original post:
China is beating the US when it comes to quantum security - MIT Technology Review

Read More..

Technology to Highlight the Next 10 Years: Quantum Computing – Somag News

Technology to Highlight the Next 10 Years According to a Strategy Expert: Quantum Computing

It is said that quantum computing will have an impact on human history in the coming years. A Bank of America strategist said that quantum computing will mark the 2020s.

Bank of America strategist Haim Israel said the revolutionary technology that will emerge in the 2020s will be quantum computing. The iPhone was released in 2007, and we felt its real impact in the 2010s. Similarly, we will not see the first business applications for quantum computing until the end of the next decade, he said.

Israel stated that the effect of quantum computing on business will be more radical and revolutionary than the effect of smartphones. Let's take a closer look at quantum computing.

What is Quantum Computing?

Quantum computing is a fairly new technology based on quantum theory in physics. Quantum theory, in the simplest terms, describes the behavior of subatomic particles and states that these particles can exist in more than one state at once until they are observed. Quantum computers, unlike today's computers, go beyond storing just zeros and ones and gain enormous computing power.

In October, Google, a subsidiary of Alphabet Inc., claimed to have completed, in 200 seconds on a 53-qubit quantum computing chip, a calculation that would take 10,000 years on the fastest supercomputer. Amazon said earlier this month that it intends to cooperate with experts to develop quantum computing technologies. IBM and Microsoft are also among the companies developing quantum computing technologies.

Quantum computing could reshape health care, the Internet of Things, and cybersecurity:

Israel said quantum computing would have revolutionary implications in areas such as health care, the Internet of Things, and cybersecurity. Pharmaceutical companies will be the first commercial users of these devices, he said, adding that only quantum computers can solve the pharmaceutical industry's big data problem.

Quantum computing will also have a major impact on cybersecurity. Today's cybersecurity systems are based on cryptographic algorithms, but with quantum computing these equations can be broken in a very short time. Even the most powerful encryption algorithms will be significantly weakened by quantum computing in the future, said Okta marketing manager Swaroop Sham.

For investors, Israel said that the first one or two companies to develop commercially applicable quantum computing in this field will be able to access huge amounts of data, which makes their software very valuable to customers.


View post:
Technology to Highlight the Next 10 Years: Quantum Computing - Somag News

Read More..

DeepMind proposes novel way to train safe reinforcement learning AI – VentureBeat

Reinforcement learning agents, or AI that's progressively spurred toward goals via rewards (or punishments), form the foundation of self-driving cars, dexterous robots, and drug discovery systems. But because they're predisposed to explore unfamiliar states, they're susceptible to what's called the safe exploration problem, wherein they become fixated on unsafe states (like a mobile robot driving into a ditch, say).

That's why researchers at Alphabet's DeepMind investigated, in a recent paper, a method for reward modeling that operates in two phases and is applicable to environments in which agents don't know where unsafe states might be. The researchers say their approach not only successfully trains a reward model to detect unsafe states without visiting them, it can also correct reward hacking (loopholes in the reward specification) before the agent is deployed, even in new and unfamiliar environments.

Interestingly, their work comes shortly after the release of San Francisco-based research firm OpenAI's Safety Gym, a suite of tools for developing AI that respects safety constraints while training, and that measures safety by the extent to which the AI avoids mistakes while learning. Safety Gym similarly targets reinforcement learning agents with constrained reinforcement learning, a paradigm that requires AI systems to make trade-offs to achieve defined outcomes.

The DeepMind team's approach encourages agents to explore a range of states through hypothetical behaviors generated by two systems: a generative model of initial states and a forward dynamics model, both trained on data like random trajectories or safe expert demonstrations. A human supervisor labels the behaviors with rewards, and the agents interactively learn policies to maximize their rewards. Only after the agents have successfully learned to predict rewards and unsafe states are they deployed to perform desired tasks.

Above: DeepMind's safe reinforcement learning approach tested on OpenAI Gym, an environment for AI benchmarking and training.

Image Credit: DeepMind

As the researchers point out, the key idea is the active synthesis of hypothetical behaviors from scratch to make them as informative as possible, without interacting with the environment directly. The DeepMind team calls it reward query synthesis via trajectory optimization, or ReQueST, and explains that it generates four types of hypothetical behaviors in total. The first type maximizes the uncertainty of an ensemble of reward models, while the second and third maximize the predicted rewards (to elicit labels for behaviors with the highest information value) and minimize predicted rewards (to surface behaviors for which the reward model might be making incorrect predictions). As for the fourth category of behavior, it maximizes the novelty of trajectories so as to encourage exploration regardless of predicted rewards.

Finally, once the reward model reaches a satisfactory state, a planning-based agent is deployed, one that leverages model-predictive control (MPC) to pick actions optimized for the learned rewards. Unlike model-free reinforcement learning algorithms that learn through trial and error, MPC enables agents to avoid unsafe states by using the dynamics model to anticipate actions' consequences.
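For intuition, here is a minimal random-shooting sketch of that planning step (my illustration, not DeepMind's code; the function names, the 2-D continuous action space, and the stand-in models are assumptions, and the paper uses a more sophisticated trajectory optimizer):

```python
import numpy as np

def mpc_action(state, dynamics, reward, horizon=10, candidates=500, rng=None):
    """Random-shooting MPC: sample candidate action sequences, roll the
    learned dynamics model forward, score them with the learned reward
    model, and return the first action of the best sequence."""
    rng = rng or np.random.default_rng()
    best_score, best_first = -np.inf, None
    for _ in range(candidates):
        s, score = state, 0.0
        actions = rng.uniform(-1.0, 1.0, size=(horizon, 2))  # 2-D actions (assumed)
        for a in actions:
            s = dynamics(s, a)       # predicted next state
            score += reward(s, a)    # predicted reward for that transition
        if score > best_score:
            best_score, best_first = score, actions[0]
    return best_first

# Stand-in models; in ReQueST these would be trained neural networks.
dynamics = lambda s, a: s + 0.1 * a
reward = lambda s, a: -float(np.linalg.norm(s))   # e.g. "staying near the origin is safe"
print(mpc_action(np.array([1.0, -2.0]), dynamics, reward))
```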

"To our knowledge, ReQueST is the first reward modeling algorithm that safely learns about unsafe states and scales to training neural network reward models in environments with high-dimensional, continuous states," wrote the coauthors of the study. "So far, we have only demonstrated the effectiveness of ReQueST in simulated domains with relatively simple dynamics. One direction for future work is to test ReQueST in 3D domains with more realistic physics and other agents acting in the environment."

Read the original here:

DeepMind proposes novel way to train safe reinforcement learning AI - VentureBeat

Read More..

Terence Crawford has next foe in mind after impressive knockout win – New York Post

Terence Crawford's latest opponent had not only never been knocked down, but Crawford couldn't recall seeing him even hurt.

Then again, he'd never been in a ring with a fighter like Crawford before.

Crawford dropped Egidijus Kavaliauskas three times before stopping him in the ninth round Saturday night to remain unbeaten and defend his welterweight title at Madison Square Garden.

"I wanted to give the crowd a knockout," Crawford said. "When I started letting my hands go, I started landing more fatal shots."

Crawford knocked down the challenger once in the seventh round and twice more in the ninth before referee Ricky Gonzalez stopped it at 44 seconds of the round.

Crawford (36-0, 27 KOs) absorbed perhaps more shots than usual but seemed to enjoy getting to show he has power, too, letting out a big smile as Kavaliauskas returned to his corner looking frustrated after one round late in the fight.

"I thought I had to entertain you all for a little bit," Crawford said. "He's a strong fighter, durable, and I thought I'd give the crowd something to cheer for."

Kavaliauskas (21-1-1), a Lithuanian who was the mandatory challenger for Crawford's WBO belt, had some good moments in the first few rounds before Crawford took control midway through the fight and then poured it on late.

Crawford fought cautiously at the outset, and Kavaliauskas showed why there was reason to when he landed a big right early in the third round and then a couple more punches inside as Crawford tried to hold on. Crawford ended up going to a knee but Kavaliauskas wasn't credited with a knockdown, the referee apparently determining Crawford had been pushed down.

Crawford said afterward he wasn't hurt by that shot, and it wasn't long before he was the one doing more damage.

Kavaliauskas kept throwing big punches that drove Crawford backward when they landed, but Crawford used his speed advantage to slip out of the way of many of them while landing his own combinations.

Crawford took a hard shot early in the seventh but then began answering and finally caught Kavaliauskas with a looping right near the ear that sent him to the canvas.

Crawford finished it two rounds later, first using a three-punch combination to set up a right uppercut that sent Kavaliauskas to the canvas. He got up but Crawford then threw a right hook that returned the two-time Olympian to the canvas and the fight was immediately waved off.

The 32-year-old Crawford bristled this week when asked if getting in tougher fights would earn him extra appreciation, saying all that mattered was winning. But this fight certainly appeared harder than the skilled Nebraska native's first three after moving up to welterweight, all stoppages, after he won all four major belts at 140.

He's still searching for better opposition in the deep 147-pound division, and promoter Bob Arum indicated Crawford may look next to veteran Shawn Porter, who is coming off a competitive loss to Errol Spence Jr. in a unification bout in September.

A Crawford-Spence bout would likely be the most attractive possible, but Spence was injured in a car accident and it's unknown when he can fight again. That could leave Porter as the next choice.

"Porter is the next best guy," Arum said. "He proved himself with Spence."

Crawford said he's ready for whichever fighter is next.

"I'll fight anybody. I've been saying that for I don't know how long," Crawford said.

Earlier, Teofimo Lopez won a lightweight belt with a second-round stoppage of Richard Commey, and Michael Conlan beat Olympic rival Vladimir Nikitin.

Lopez (15-0, 12 KOs) was spectacular in his first title fight, wobbling Commey with a left hand early in the second round and then flooring him with a hard right hand. He finished the fight with a barrage of punches in the corner and perhaps next moves on to a 135-pound unification bout with two-time Olympic gold medalist Vasiliy Lomachenko.

Conlan (13-0, 7 KOs) had lost to Nikitin twice as an amateur, including in the 2016 Olympic quarterfinals. He blasted the international boxing federation for being corrupt after the decision was announced and extended his middle finger to the judges at ringside.

He had also lost a close fight to Nikitin in 2013 but the judges saw this one as no contest, giving Conlan a lopsided decision by scores of 100-90, 99-91 and 98-92.

View post:

Terence Crawford has next foe in mind after impressive knockout win - New York Post

Read More..

What Are Normalising Flows And Why Should We Care – Analytics India Magazine

Machine learning developers and researchers are constantly in pursuit of a well-defined probabilistic model that correctly describes the processes that produce data. A central need in all of machine learning is to develop the tools and theories to build better-specified models that lead to even better insights into data.

One such attempt has been made by Danilo Rezende in the form of normalising flows. Today, building probability distributions with normalising flows is an active area of ML research.

Normalizing flows operate by pushing an initial density through a series of transformations to produce a richer, more multimodal distribution, like a fluid flowing through a set of tubes. Flows can be used for joint generative and predictive modelling by using them as the core component of a hybrid model.

Normalizing flows provide a general way of constructing flexible probability distributions over continuous random variables.

Let x be a D-dimensional real vector, and suppose we would like to define a joint distribution over x. The main idea of flow-based modelling is to express x as a transformation T of a real vector u sampled from a base distribution p_u(u).
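The density of x then follows from the standard change-of-variables identity (a textbook formula, stated here in generic notation rather than quoted from the article):

```latex
p_{\mathbf{x}}(\mathbf{x})
  = p_{\mathbf{u}}(\mathbf{u})\,\bigl|\det J_{T}(\mathbf{u})\bigr|^{-1},
  \qquad \mathbf{u} = T^{-1}(\mathbf{x}),
```

where J_T is the Jacobian of T. Practical flows are built by composing transformations whose inverses and Jacobian determinants are cheap to compute.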

According to the Google Brain team, the key idea behind normalising flows can be summarised as follows:

The flow can be thought of as an architecture where the last layer is a (generalised) linear model operating on the features, and these features' distribution can be viewed as a regulariser on the feature space. In turn, flows are effective in any application requiring a probabilistic model with either of those capabilities.

Normalizing flows, due to their ability to be expressive while still allowing for exact likelihood calculations, are often used for probabilistic modelling of data. They have two primitive operations: density calculation and sampling.
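A toy one-dimensional example (mine; a real flow would stack many learned, invertible layers) shows both primitives for a simple affine transformation over a standard normal base:

```python
import numpy as np

class AffineFlow:
    """Toy 1-D flow x = scale * u + shift over a standard normal base,
    illustrating the two primitives: sampling and exact density evaluation."""

    def __init__(self, scale=2.0, shift=1.0):
        self.scale, self.shift = scale, shift

    def sample(self, n, rng=None):
        u = (rng or np.random.default_rng()).standard_normal(n)
        return self.scale * u + self.shift              # push base samples through T

    def log_prob(self, x):
        u = (x - self.shift) / self.scale               # invert T
        log_base = -0.5 * (u**2 + np.log(2 * np.pi))    # standard normal log-density
        return log_base - np.log(abs(self.scale))       # minus log |det Jacobian|

flow = AffineFlow()
xs = flow.sample(3)
print(xs, flow.log_prob(xs))
```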

For example, invertible ResNets have been explored for classification with residual flows and have already seen a first big improvement. The improvement can be something as significant as a reduction of the model's memory footprint, by obviating the need to store activations for backpropagation.

This achievement may help one understand to what degree discarding information is crucial to deep learning's success.

Normalizing flows allow us to control the complexity of the posterior at run-time by simply increasing the flow length of the sequence.

Rippel and Adams (2013) were the first to recognise that parameterizing flows with deep neural networks could result in quite general and expressive distribution classes.

Like with deep neural networks, normalizing the intermediate representations is crucial for maintaining stable gradients throughout the flow.

Normalizing flows can also be integrated into traditional Markov chain Monte Carlo (MCMC) sampling by using the flow to reparameterize the target distribution. Since the efficiency of Monte Carlo methods drastically depends on the target distribution, normalizing flows would make it easier to explore.
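As a rough sketch of that idea (my illustration, using a hand-picked flow rather than a learned one), one can run a random-walk Metropolis sampler in the flow's base space, targeting the pullback density log p(T(u)) + log|det J_T(u)|, and map the samples back through T:

```python
import numpy as np

def metropolis_in_base_space(log_target, T, log_det_J, steps=5000, step_size=0.5, rng=None):
    """Random-walk Metropolis on u, targeting the pullback of the target
    through the flow T: log_target(T(u)) + log|det J_T(u)|. Base-space
    samples are mapped back through T to give samples of the target."""
    rng = rng or np.random.default_rng()
    u = 0.0
    logp = log_target(T(u)) + log_det_J(u)
    samples = []
    for _ in range(steps):
        u_new = u + step_size * rng.standard_normal()
        logp_new = log_target(T(u_new)) + log_det_J(u_new)
        if np.log(rng.uniform()) < logp_new - logp:   # Metropolis accept/reject
            u, logp = u_new, logp_new
        samples.append(T(u))
    return np.array(samples)

# Example: a log-normal target. The flow T(u) = exp(u) pulls it back to a
# plain Gaussian in u, which the random-walk proposal explores easily.
log_target = lambda x: -0.5 * np.log(x) ** 2 - np.log(x)   # log-normal(0, 1), up to a constant
samples = metropolis_in_base_space(log_target, np.exp, lambda u: u)
print(samples.mean())   # close to exp(0.5), about 1.65
```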

Normalizing flows can be thought of as implementing a generalised reparameterization trick, as they leverage a transformation of a fixed distribution to draw samples from a distribution of interest.

For instance, generative modelling has been a popular application of flows in machine learning, and there are many others.

In a paper titled Normalizing Flows for Probabilistic Modeling and Inference, researchers from DeepMind investigated the state of flow models in detail.

They list the kinds of flow models in use, their evolution, and their significance in domains like reinforcement learning, imitation learning, and image, audio, and text classification, among many others.

The authors also note that while "many flow designs and specific implementations will inevitably become out-of-date as work on normalizing flows continues, we have attempted to isolate foundational ideas that will continue to guide the field well into the future."

The large-scale adoption of normalising flows in place of conventional probabilistic models is advantageous because, unlike other probabilistic models that require approximate inference as they scale, flows usually admit analytical calculations and exact sampling even in high dimensions.

However, the obstacles that are currently preventing wider application of normalizing flows are similar to those faced by any probabilistic models. With the way research is accelerating, the team at DeepMind are optimistic about the future of flow models.


Continue reading here:

What Are Normalising Flows And Why Should We Care - Analytics India Magazine

Read More..

Cloud computing in 2020: views of the industry – Techerati

We asked six industry experts to weigh in on what's now and what's next in cloud computing.

When we published our selection of cloud predictions last year, most predicted container orchestrator Kubernetes to consolidate its stranglehold over the container space and, correspondingly, modern cloud infrastructure.

Last November, one of the most extensive customer surveys bore this prediction out. In its study of thousands of companies, cloud and infrastructure monitoring company Datadog found 45 percent of its customers were using Kubernetes. And if that isn't evidence enough, just reflect on VMware's announcement in March that it plans to transition its enterprise virtualisation platform to a system that runs (and runs on) Kubernetes.

But in reality, Kubernetes' centrality to cloud was put beyond doubt weeks before we published last year's roundup. In January, IBM steamrollered into 2019 fresh off the back of its $34 billion acquisition of Red Hat. This year IBM confirmed it would integrate Red Hat's popular Kubernetes implementation, OpenShift, into a new multi-cloud business focus.

It is in this context that most of this year's experts consulted their cloud crystal balls. Rackspace's Lee James predicts 2020 to be a year of stiff competition between enterprise IT giants jostling to deliver a Kubernetes solution that unlocks multi-cloud for their customers. On the other hand, Stephan Fabel of Canonical says end-users will start to understand the limitations of Kubernetes and, accordingly, utilise it more strategically. Lastly, Pivotal's Michael Cote expects companies to use this new-found savoir-faire to establish a singular, overall Kubernetes strategy.

Read the predictions in their entirety below.

Hybrid becomes the new multi-cloud, again

While the popularity of multi-cloud is undisputed, with 81 per cent of companies using cloud technologies in some way, many firms are still making investments in their private cloud solutions. This is due to a number of reasons, such as security posture, ongoing data centre leasing, or just because it's the best platform for the application or, in some cases, the business. Indeed, even the UK Government plans to revise its "cloud first" policy to "cloud right" (or something similar) early next year, acknowledging that public cloud isn't right for everyone or every use case.

Reflecting this trend, we've seen the cloud giants respond with private cloud solutions that link directly into their public cloud solutions, such as Azure Arc, Google Cloud Anthos and AWS Outposts.

In 2020, there's going to be significant competition between the three biggest cloud hyperscalers and VMware as they all explore and deliver on how Kubernetes will unlock their potential to be the multi- and hybrid-cloud provider of choice for customers. For customers, it's ultimately going to come down to which fits and works best, as well as what gives the best bang for their buck. But this sets us up for an exciting year of new product and service announcements as each of the major cloud companies tries to establish itself as the cloud broker of choice.

Unicorn start-ups will begin repatriating workloads from the cloud

There has been a lot said about cloud repatriation of late. While this won't be a mass exodus from the cloud (in fact quite the opposite, with public cloud growth expected to increase), 2020 will see cloud-native organisations leveraging a hybrid environment to enjoy greater cost savings.

For businesses starting out or working with limited budgets, which require an environment for playing around with the latest technology, public cloud is the perfect place to start. With the public cloud, you are your own limit and get immediate reward for innovation. But as these costs begin mounting, it's prudent to consider how to regain control of cloud economics.

Repatriating workloads to on-premise is certainly a viable option, but it doesn't mean that we will start to see the decline of cloud. As organisations get past each new milestone in the development process, repatriation becomes more and more of a challenge. What we will likely see is public cloud providers reaching into the data centre to support this hybrid demand, so that they can capitalise on the trend.

Kubernetes has become an integral part of modern cloud infrastructure and serves as a gateway to building and experimenting with new technology. It's little surprise that many companies we observe are doubling down on the application and reorienting their DevOps team around it to explore new things such as enabling serverless applications and automating data orchestration. We think this trend will continue at strength in 2020.

On a more cautious note, we may also see some companies questioning whether Kubernetes is really the correct tool for their purposes. While the technology can provide tremendous value, in some cases it can be complex to manage and requires specialist skills.

As Kubernetes is now commonly being used for production at scale, it becomes increasingly likely that users encounter issues around security and downtime. As a result of these challenges, we can expect the community to mature and, in some cases, come to the viewpoint that it might not be right for every application, or see an increased need to bring in outsourced vendors to aid with specialised expertise.

Organisations should try to find one standard Kubernetes approach

Kubernetes has already emerged as the leading choice for running containerised or cloud native applications. Organisations will now spend the time to create a Kubernetes strategy, choosing the distro or services they'll use, to then run a few applications on it. Having multiple initiatives here would be a huge waste and delay the overall strategy. Instead organisations should try to find one standard Kubernetes approach. In 2020, though, the bulk of the work will be finding and modernising the apps that will run on that platform.

Most large organisations are doing just that and will spend 2020 modernising how they build and run software. To start, they need to find a handful of small, high value applications and the services those applications use. Then, run a working Proof of Concept (POC) to validate the platform choice by launching actual applications on the platform.

If it goes well, organisations can then put more apps on it. If it doesn't go well, they can try to find out why and try again, maybe with a new platform. It's important to look at the full end-to-end process, from development, to running in production, to re-deploying; companies need to judge the success of the platform choice across all of it.

All applications will become mission-critical

The number of applications that businesses classify as mission-critical will rise during 2020, paving the way to a landscape in which every app is considered a high priority. Previously, organizations have been prepared to distinguish between mission-critical apps and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, the ability to make this distinction becomes very difficult.

The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision-makers say their business can tolerate a maximum of two hours' downtime for mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organisations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.

Businesses will continue to pick and choose the storage technologies and hardware that work best for their organisation, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past. Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. Allowing businesses to create a blueprint of what infrastructure should do, then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites.

Software-defined approaches such as IaC and cloud-native (a strategy which natively utilises services and infrastructure from cloud computing providers) are not all about cost, though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability, enabling organisations to deploy applications with speed and ease. With over three-quarters of organisations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.

Companies will take a step down from the cloud

The pattern goes something like this: lift and shift infrastructure VMs to the cloud, see costs actually go up, then move some workloads back to on-prem; then application lifecycle drivers push new apps (or new features for old apps) to be built in the cloud using PaaS/DbaaS technologies with a more favourable cost model, then retire old IaaS apps. The key takeaway is that this dynamic is one of the drivers for hybrid approaches.

We saw many companies this past year lift and shift to the cloud. Now, in 2020, I expect we'll see companies take a step back and reevaluate their all-in approach. After feeling the effects of the full shift to cloud, the high associated costs and lack of flexibility, many companies will likely move to a multi-cloud or hybrid cloud approach next year.

Taking a hybrid cloud approach enables organizations to leverage the cloud (using AWS, Azure, or GCP) for some applications and computing needs while still keeping mission-critical or sensitive data closer to home. With a multi-cloud strategy, organizations can reduce costs and, instead of being constrained to one cloud, departments have the flexibility to select the service that works best for their individual needs (using a mix of AWS, Azure, or GCP).

Customers and organisations will really begin to look for more management layers on top of their solutions

One of the things I believe we'll see in 2020 is the true adoption of things like hybrid and multi-cloud solutions, but the difference in this upcoming year will be that customers and organisations will really begin to look for more management layers on top of their solutions.

A lot of companies already have things like backup-in-the-cloud and DRaaS somewhere else, so what they're now looking for is a uniform management layer on top of that to give visibility on cost, as well as a knowledge of where all data is located. It's important to know where data lives, whether workloads are protected, and whether they need to move workloads between different clouds if and when requirements change.

Go here to read the rest:
Cloud computing in 2020: views of the industry - Techerati

Read More..

Cloud Computing and Security: The Risks You Need to Know – TechDecisions

Cloud computing has become a valuable and increasingly popular approach to digital technology that includes on-demand self-service, broad network access, software as a service, and much, much more. As businesses continue to explore cloud options for a number of applications, it's critical that they assess and align specific needs with the appropriate cloud vendor and service early in the cloud transition. Misaligning them, or underestimating cloud computing risks, can spell trouble.

Cloud platforms and services, available from technology giants including IBM, Google, Amazon, Salesforce, SAP and Oracle, provide formats of IT service that offer users significant advantages over old-school, on-premises data centers.

For example, the user's capital costs are lower and better matched to actual consumption; no hardware or software installations are required. Cloud-based IT infrastructure provides customers with rapid access to computing power whenever it's needed.

Beyond that, the most significant misconception is that cloud services are protected by 24/7 armies of security experts, making them virtually bullet-proof. That vision of an unimpregnable fortress, safeguarding client data against all adversaries, is comforting, but it's misguided.

Using cloud services actually carries its own set of risks: risks that are unique to the cloud provider's own operating environment, as well as other risks associated with traditional data centers.

Clients who don't recognize those risks and accept their own responsibilities for mitigating them are almost as likely to experience data loss and compromise as they were before migrating to cloud operations.

Understanding and managing these shared cloud computing risks is key to successfully utilizing a cloud service. And it is equally important to recognize the cloud is not a monolithic concept; clouds vary both in who can use them and in what they do.

For one thing, just as in meteorological cloud formations, there are also different computing cloud configurations.

They include private clouds, which are hosted internally and used by a single organization; public clouds, which are commercial ventures available to the general public; community clouds which are only accessible to specific groups of users, and hybrid clouds which include elements of two or more such arrangements.

Cloud platforms and services are owned and operated by different companies, each with their own policies, prices, and resources.

There are also differences in the types of computing services they offer. Infrastructure as a Service, or IaaS, controls user access to computing resources (servers, storage, network and so on) which are actually owned by the client.

Platform as a Service, or PaaS, controls user access to the operating software and services needed to develop new applications.

The third and most popular cloud operations product is Software as a Service, or SaaS, which gives users direct access to the client's software applications.

For example, once they're migrated to the cloud, client organizations lose a good amount of visibility and control over their assets and operations.

The monitoring and analysis of information about the company's applications, services, data and users never loses importance, but it will need to take a different form than it did when the client's own network monitoring and logging procedures were in place.

Before a client's data ever gets to the cloud, it travels across the internet. Unless the user's network and internet channel are secure, powered by strong authentication standards and encrypted data, information in transit is susceptible to exposure.

Vulnerabilities in shared servers and system software used by public clouds to keep the data of multiple tenants separate can be exploited, enabling an attacker to access one organization's data via a separate organization and/or user.

Confirming the permanent removal of sensitive data that a client wants securely deleted is difficult because of the reduced visibility inherent in cloud operations, which frequently involve data distributed over an assortment of storage devices. Any residual data can become available to attackers.

If a cloud service provider goes out of business or fails to meet your business and/or security needs, transferring data from that operator to another can be more costly in terms of time, effort and money than it was to initially become a subscriber. Additionally, each provider's non-standard and proprietary tools can complicate data transfer.

Cloud operations are complicated by their technology, policies and implementation methods. This complexity requires the client's IT staff to learn new ways of handling their information, because as complexity grows, so does the potential for a data breach.

Insider abuse has the potential to inflict even greater damage on the client's data than it did before, due to the cloud's ability to provide users with more access to more resources. Depending on your cloud service, the forensic capabilities needed to trace and detect any malicious insider may not be available.

The loss of stored data, due to an accidental deletion or a physical catastrophe such as a fire or earthquake, can be permanent. A well thought out data recovery strategy needs to be in place, but the client and service provider must work together to establish a secure and effective process.

Managing user identities (carefully controlling users' identity attributes and regulating their privileged access) remains as challenging a task in cloud operations as it ever was in on-premises environments. Due to the nature of cloud services, the challenge in some cases can be much greater than in on-premises environments.

Providing appropriate levels of secure access for different user roles, such as employees, contractors and partners is critical to protecting your cloud environment, making Identity Governance a high priority when migrating to the cloud. Cloud computing and security should constantly be thought of as joined concepts, not separate silos.


Cloud operations provide a variety of valuable avenues to exploit. And while the childlike faith that cloud platforms and services are immune to malicious attacks may be touching, it's simply not true. Vigilance is equally, if not more, important than it was before migrating to the cloud.

See more here:
Cloud Computing and Security: The Risks You Need to Know - TechDecisions

Read More..