
Nudges and machine learning triples advanced care conversations – Penn Today

An electronic nudge to clinicians, triggered by an algorithm that used machine learning methods to flag patients with cancer who would most benefit from a conversation about end-of-life goals, tripled the rate of those discussions, according to a new prospective, randomized study of nearly 15,000 patients from Penn Medicine published in JAMA Oncology.

Early and frequent conversations with patients suffering from serious illness, particularly cancer, have been shown to increase satisfaction, quality of life, and care that's consistent with their values and goals. However, many patients do not get the opportunity to have those discussions with a physician or loved ones because their disease has progressed too far and they're too ill.

"Within and outside of cancer, this is one of the first real-time applications of a machine learning algorithm paired with a prompt to actually help influence clinicians to initiate these discussions in a timely manner, before something unfortunate may happen," says co-lead author Ravi B. Parikh, an assistant professor of medical ethics and health policy and medicine in the Perelman School of Medicine and a staff physician at the Corporal Michael J. Crescenz VA Medical Center. "And it's not just high-risk patients. It nearly doubled the number of conversations for patients who weren't flagged, which tells us it's eliciting a positive cultural change across the clinics to have more of these talks."

Christopher Manz, of the Dana-Farber Cancer Institute, who was a fellow in the Penn Center for Cancer Care Innovation at the time of the study, serves as co-lead author.

In a separate JAMA Oncology study, the research team validated the effectiveness of the Penn Medicine-developed machine learning tool at predicting short-term mortality in patients in real time, using clinical data from the electronic health record. The algorithm considers more than 500 variables (age, hospitalizations, and co-morbidities, for example) from patient records, all the way up until the patient's appointment. That's one of the advantages of using the EHR to identify patients who may benefit from a timely conversation.
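
The excerpt names a few of the algorithm's inputs but not the model itself. Purely as an illustration of the general approach (not the Penn tool), here is a minimal sketch that trains a gradient-boosted classifier on synthetic EHR-style features; the feature set, label, and model choice are all assumptions:

```python
# Hypothetical sketch of EHR-based short-term mortality prediction.
# The actual Penn Medicine model and features are not specified here.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Synthetic stand-ins for a few of the ~500 variables the article mentions.
X = np.column_stack([
    rng.integers(30, 90, n),   # age
    rng.poisson(1.5, n),       # hospitalizations in the past year
    rng.integers(0, 10, n),    # comorbidity count
    rng.normal(0, 1, n),       # recent lab-value trend (standardized)
])
logit = -6 + 0.05 * X[:, 0] + 0.4 * X[:, 1] + 0.3 * X[:, 2] + 0.5 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic mortality label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]
print("held-out AUC:", roc_auc_score(y_te, risk))

# Patients whose risk exceeds a chosen threshold would trigger the nudge.
flagged = risk > 0.2
```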

Read more at Penn Medicine News.

Read the original:
Nudges and machine learning triples advanced care conversations - Penn Today

Read More..

insitro Strengthens Machine Learning-Based Drug Discovery Capabilities with Acquisition of Haystack Sciences – Business Wire

SAN FRANCISCO--(BUSINESS WIRE)--insitro, a machine learning-driven drug discovery and development company, today announced the acquisition of Haystack Sciences, a private company advancing proprietary methods for machine learning-enabled drug discovery. Haystack's approach focuses on synthesizing, breeding, and analyzing large, diverse combinatorial chemical libraries encoded by unique DNA sequences, called DNA-encoded libraries, or DELs. Financial details of the acquisition were not disclosed.

insitro is building the leading company at the intersection of machine learning and biological data generation at scale, with a core focus on applying these technologies to more efficient drug discovery. With the acquisition of Haystack, insitro will leverage the company's DEL technology to collect massive small-molecule data sets that inform the construction of machine learning models able to predict drug activity from molecular structure. With the addition of the Haystack technology and team, insitro has taken a significant step towards building in-house capabilities for fully integrated drug discovery and development. insitro's capabilities in this space are being further developed via a collaboration with DiCE Molecules, a leader in the DEL field. The collaboration, executed earlier this year, is aimed at combining the power of machine learning with high-quality DEL datasets to address two difficult protein-protein interface targets that DiCE is pursuing.
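
The press release doesn't disclose insitro's models or featurization. As a hedged illustration of the idea it describes (predicting activity from molecular structure using DEL-scale data), here is a toy sketch in which synthetic fingerprint bit-vectors stand in for real chemistry:

```python
# Illustrative only: learn a structure-to-activity mapping from synthetic
# fingerprint bit-vectors. Not insitro's or Haystack's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_mols, n_bits = 2000, 512
fps = rng.integers(0, 2, size=(n_mols, n_bits))   # stand-in structural fingerprints
w = rng.normal(0, 1, n_bits) * (rng.random(n_bits) < 0.05)  # few bits drive activity
activity = fps @ w + rng.normal(0, 0.5, n_mols)   # stand-in DEL enrichment signal

X_tr, X_te, y_tr, y_te = train_test_split(fps, activity, random_state=1)
model = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("R^2 on held-out molecules:", model.score(X_te, y_te))
```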

"We are thrilled to have the Haystack team join insitro," said Daphne Koller, Ph.D., founder and chief executive officer of insitro. "For the past two years, insitro has been building a company focused on the creation of predictive cell-based models of disease in order to enable the discovery of novel targets and evaluate the benefits of new or existing molecules in genetically defined patient segments. This acquisition enables us to expand our capabilities to the area of therapeutic design and advances us towards our goal of leveraging machine learning across the entire process of designing and developing better medicines for patients."

Haystack's platform combines multiple elements, including the capability to synthesize broad, diverse, small-molecule collections, the ability to execute rapid iterative follow-up, and a proprietary semi-quantitative screening technology, called nDexer, that generates higher-resolution datasets than is possible through conventional panning approaches. These capabilities will greatly enable insitro's development of multi-dimensional predictive models for small-molecule design.

"The nDexer™ capabilities we have advanced at Haystack, combined with insitro's state-of-the-art machine learning models, will enable us to build a platform at the forefront of applying DEL technology to next-generation therapeutics discovery," said Richard E. Watts, co-founder and chief executive officer of Haystack Sciences, who will be joining insitro as vice president, high-throughput chemistry. "I am excited by the opportunity to join a company with such a uniquely open and collaborative culture and to work with and learn from colleagues in data science, machine learning, automation and cell biology. The capabilities enabled by joining our efforts are considerably greater than the sum of the parts, and I look forward to helping build core drug discovery efforts at insitro."

"Haystack's best-in-class DEL technology is uniquely aligned with insitro's philosophy of addressing the critical challenges in pharmaceutical R&D through predictive machine learning models, all enabled by producing quality data at scale," said Vijay Pande, Ph.D., general partner at Andreessen Horowitz and member of insitro's board of directors. "This investment will power insitro's swift prosecution of the multiple targets emerging from their platform, as well as the creation of a computational platform for molecule structure and function optimization. Having seen the field of computationally driven molecule design mature over the past twenty years, I look forward to the next chapter in therapeutics design written by the combined efforts of insitro and Haystack."

About insitro

insitro is a data-driven drug discovery and development company using machine learning and high-throughput biology to transform the way that drugs are discovered and delivered to patients. The company is applying state-of-the-art technologies from bioengineering to create massive data sets that enable the power of modern machine learning methods to be brought to bear on key bottlenecks in pharmaceutical R&D. The resulting predictive models are used to accelerate target selection, to design and develop effective therapeutics, and to inform clinical strategy. The company is located in South San Francisco, CA. For more information on insitro, please visit the company's website at http://www.insitro.com.

About Haystack Sciences

Haystack Sciences seeks to inform and speed drug discovery by acquiring data of best-in-class accuracy and dimensionality from DNA-encoded libraries (DELs). This is enabled by proprietary technologies for in vitro evolution of fully synthetic small molecules and high-throughput mapping of structure-activity relationships for selection of molecules with drug-like properties. The company's technologies, including its nDexer platform, allow for generation of better libraries and quantification of the binding affinities of entire DELs against a given target in parallel. The combination of these approaches with machine learning has the potential to greatly accelerate the discovery of optimized drug candidates. Haystack Sciences is based in South San Francisco, California. It was incubated at the Illumina Accelerator and is backed by leading investors including Viking Global Investors, Nimble Ventures, HBM Genomics, and Illumina. More information is available at: http://www.haystacksciences.com/

See the rest here:
insitro Strengthens Machine Learning-Based Drug Discovery Capabilities with Acquisition of Haystack Sciences - Business Wire

Read More..

Revolutionizing IoT with Machine Learning at the Edge | Perceive’s Steve Teig – IoT For All

In episode 88 of the IoT For All Podcast, Perceive Founder and CEO Steve Teig joins us to talk about how Perceive is bringing the next wave of intelligence to IoT through machine learning at the edge. Steve shares how Perceive developed Ergo, their chip announced back in March, and how these new machine learning capabilities will transform consumer IoT.

Steve Teig is an award-winning technologist, entrepreneur, and inventor on 388 US patents. He's been the CTO of three EDA software companies, two biotech companies, and a semiconductor company; of these, two went public during his tenure, two were acquired, and one is a Fortune 500 company. As the CEO and founder of Perceive, Steve is leading a team building solutions and transformative machine learning technology for consumer edge devices.

To start the episode, Steve gave us some background on how Perceive got started. While serving as CTO of Xperi, Steve worked with a wide array of imaging and audio products and saw an opportunity to make the edge smart by leveraging machine learning at the edge. "What if you could make gadgets themselves intelligent?" Steve asked. "That's what motivated me to pursue it technically and then commercially with Perceive."

At its core, Perceive builds chips and machine learning software for edge inference, providing data-center-class accuracy at the low power that edge devices, like IoT, require. "The kinds of applications we go after," Steve said, "are from doorbell cameras to home security cameras, to toys, to phones; wherever you have a sensor, it would be cool to make that sensor understand its environment without sending data to the cloud."

Of the current solutions for device intelligence, Steve said you have two options, and neither of them is ideal: first, you can send all of the data your sensor collects to someone else's cloud, giving up your privacy; or second, you can have a tiny chip that, while low power enough for your device, doesn't provide the computing power to produce answers you can actually trust.

"We fix that problem by providing the kind of sophistication you would expect from the big cloud providers, but at low enough power that you can run it at the edge," Steve said, adding that their chip is 20 to 100 times more power efficient than anything else currently on the market.
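
Perceive's Ergo toolchain is proprietary, so as a generic stand-in for the pattern Steve describes (inference running entirely on the device, with no pixels sent to a cloud), here is a sketch using the open-source TensorFlow Lite runtime; the model file name is hypothetical:

```python
# Generic on-device inference sketch (TensorFlow Lite, not Perceive's stack).
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="person_detector.tflite")  # hypothetical model
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

frame = np.zeros(inp["shape"], dtype=inp["dtype"])  # stand-in camera frame
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()                                 # runs entirely on-device
scores = interpreter.get_tensor(out["index"])        # raw pixels never leave the device
print(scores)
```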

Steve also spoke to some of the use cases that Ergo enables. Currently, the main applications are doorbell cameras, home security cameras, and appliances. "As we look forward," Steve said, "being able to put really serious contextual awareness into gadgets opens up all kinds of applications." One of the examples he gave was a microwave that could identify both the user and the food to be heated, and adjust its settings to match that user's preferences. Another example would be a robot vacuum cleaner that you could ask to find your shoes.

Changing gears, Steve shared Perceive's philosophy on machine learning, saying that because they were looking to make massive improvements, they had to start fresh. "We had to start with the math. We really started from first principles." That philosophy has led to a number of new and proprietary techniques, both on the software and hardware side.

Moving to the industry at large, Steve shared some observations on the smart home space during the pandemic. Those observations highlighted two somewhat conflicting viewpoints: while there has been broader interest in smart home technology, with people spending more time at home, people have also become more sensitive about their privacy. Steve also shared how Ergo handles data in order to meet these security and privacy concerns.

To close out the episode, Steve shared some of the challenges his team faced while developing Ergo and what those challenges meant as he built out the team itself. He also shared some of his thoughts on the future of the smart home and consumer IoT space, with the introduction of these new machine learning capabilities.

Interested in connecting with Steve? Reach out to him on LinkedIn!

About Perceive: Steve Teig, founder and CEO of Perceive, drove the creation of the company in 2018 while CTO of its parent company and investor, Xperi. Launching Perceive, Steve and his team had the ambitious goal of enabling state-of-the-art inference inside edge devices running at extremely low power. Adopting an entirely new perspective on machine learning and neural networks allowed Steve and his team to very quickly build and deploy the software, tools, and inference processor Ergo that make up the complete Perceive solution.

(00:50) Intro to Steve

(01:25) How did you come to found Perceive?

(02:30) What does Perceive do? What's your role in the IoT space?

(03:37) What makes your offering unique to the market?

(04:49) Could you share any use cases?

(09:41) How would you describe your philosophy when it comes to machine learning?

(11:37) What is Ergo and what does it do?

(12:39) What does a typical customer engagement look like?

(14:57) Have you seen any change in demand due to the pandemic?

(20:47) What challenges have you encountered building Perceive and Ergo?

(22:24) Where do you see the market going for smart home devices?

Read this article:
Revolutionizing IoT with Machine Learning at the Edge | Perceive's Steve Teig - IoT For All

Read More..

Mastercard Says its AI and Machine Learning Solutions Aim to Stop Fraudulent Activities which have Increased Significantly due to COVID – Crowdfund…

Ajay Bhalla, President, Cyber and Intelligence Solutions, Mastercard, notes that artificial intelligence (AI) algorithms are part of the payment company's first line of defense in protecting the more than 75 billion transactions that Mastercard processes on its network every year.

Bhalla recently revealed the different ways that Mastercard applies its AI expertise to solve some of the most pressing global challenges, from cybersecurity to healthcare, and discussed the impact the COVID-19 pandemic has had on the way we conduct our lives and interact with those around us.

Cybersecurity fraud rates have reached record highs, with nearly 50% of businesses now claiming that they may have been targeted by cybercriminals during the past two years. Fraudulent activities carried out via the Internet may have increased significantly due to the Coronavirus crisis, because many more people are conducting transactions online.

Mastercard aims to protect consumers from becoming victims of online fraud. The payments company has added AI-based algorithms to its network's multi-layered security strategy. This allows Mastercard's network to support a coordinated set of AI-enabled services that respond within milliseconds to potential online security threats. Last year, Mastercard reportedly helped save around $20 billion of fraud via its AI-enhanced systems (which include SafetyNet, Decision Intelligence, and Threat Scan).
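
Mastercard's SafetyNet and Decision Intelligence systems are proprietary, but the general pattern (train a model offline, then score each transaction within a millisecond-scale budget) can be sketched with a generic anomaly detector; the features and thresholds here are invented for illustration:

```python
# Toy real-time transaction-scoring sketch; not Mastercard's actual system.
import time
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
# Train offline on historical "normal" transactions: [amount, hour-of-day].
normal = rng.normal([50, 12], [30, 5], size=(10_000, 2))
model = IsolationForest(random_state=2).fit(normal)

txn = np.array([[4999.0, 3.0]])        # a large purchase at 3 a.m.
t0 = time.perf_counter()
score = model.decision_function(txn)   # lower = more anomalous
elapsed_ms = (time.perf_counter() - t0) * 1000
print(f"anomaly score {score[0]:.3f}, scored in {elapsed_ms:.2f} ms")
```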

In statements shared with Arab News, Bhalla noted:

"One of the impacts of this pandemic is the rapid migration to digital technologies. Recent data shows that we vaulted five years forward in digital adoption, both consumer and business, in a matter of eight weeks. Whether it's online shopping, contactless payments or banks transitioning to remote sales and service teams, this trend is here to stay; it is not the new normal, it is the next normal."

Bhalla also mentioned that with many more consumers interacting and performing transactions via the Internet, we're now creating large amounts of data. He revealed that, by 2025, we'll be creating approximately 463 exabytes of data per day, and this number is going to keep increasing rapidly.

He further noted that more professionals are now working from the comfort of their home and that this may have also opened new doors for cybercriminals and hackers.

He remarked:

"The current crisis is breeding fear, anxiety and stress, with people understandably worried about their health, safety, family and jobs. Unfortunately, that creates a fertile breeding ground for criminals preying on those insecurities, resulting in more cyberattacks and fraud."

He confirmed that Mastercard's NuData tech has seen cyberattacks increase in volume and sophistication, with around one in every three online attacks now able to closely emulate human behavior.

Bhalla claims that Mastercard has made considerable investments in AI for over a decade and it has also added AI capabilities to all key parts of its business operations.

He also noted:

"Our AI and machine learning solutions stop fraud, reduce credit risk, fight financial crime, prevent health care fraud and so much more. In health care, we're working with organizations on cyber assessments to help safeguard their cyber systems, staff and patients at this challenging time. In retail, criminals are increasingly targeting digital channels as we shift to shopping online."

He revealed that Card Not Present fraud currently accounts for about 90% of all fraudulent activities carried out via online platforms. "This type of fraud accounted for 75% of all Internet fraud before COVID," Bhalla said. He claims that Mastercard's AI was able to rapidly learn this new behavior and change its scoring to reflect the new pattern, delivering stronger performance during the pandemic.

Read more:
Mastercard Says its AI and Machine Learning Solutions Aim to Stop Fraudulent Activites which have Increased Significantly due to COVID - Crowdfund...

Read More..

Machine-Learning Inference Chip Travels to the Edge – Electronic Design

Flex Logix has unveiled its InferX X1 machine-learning (ML) inference system, which is packed into a 54-mm² chip. The X1 incorporates 64 1D Tensor processing units (TPUs) linked by the XFLX interconnect (Fig. 1). The dual 1-MB level 2 SRAM holds activations, while the level 1 SRAM holds the weights for the next layer of computation. An on-chip FPGA provides additional customization capabilities. There's also a 4-MB level 3 SRAM, an LPDDR4 interface, and a x4 PCI Express (PCIe) interface.

1. The InferX X1's XFLX interconnect links the 1D TPU engines with memory and each other.

The company chose to implement one-dimensional Tensor processors (Fig. 2), which can be combined to handle two- and three-dimensional tensors. The units support a high-precision Winograd acceleration option. This approach is more flexible and delivers a high level of system utilization.

2. The InferX X1 is built around one-dimensional Tensor processors that provide extreme flexibility and high utilization.

The simplified concept and the underlying embedded FPGA (eFPGA) architectural approach allow the system to be reconfigured rapidly and enable layers to be fused together. This means that intermediate results can be handed to the next layer without having to store them in memory, which slows down the overall system (Fig. 3). Moving data around ML hardware is often hidden from view, but it can have a significant impact on system performance.

3. Fast reconfiguration and support for soft logic within a process flow can eliminate the need to store intermediate results.
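
The fusion idea can be sketched in plain NumPy. In the unfused form, the first layer's full output tensor is written out and read back by the next operation; in the fused form both steps happen in one pass, so on an accelerator the intermediate can stay in on-chip SRAM. (Pure illustration; not Flex Logix's toolchain.)

```python
# Sketch of layer fusion: avoid materializing an intermediate tensor.
import numpy as np

x = np.random.rand(1, 64, 56, 56).astype(np.float32)  # activations
scale = np.random.rand(64).astype(np.float32)         # per-channel weights

def unfused(x):
    t = x * scale[None, :, None, None]  # full intermediate tensor stored
    return np.maximum(t, 0)             # ...then read back for the ReLU

def fused(x):
    # One pass; on real hardware the product never leaves local memory.
    return np.maximum(x * scale[None, :, None, None], 0)

assert np.allclose(unfused(x), fused(x))
```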

The inclusion of an eFPGA and a simplified, reconfigurable TPU architecture makes it possible for Flex Logix to provide a more adaptable ML solution. It can handle a standard conv2d model as well as a depth-wise conv2d model.

The chips are available separately or on a half-height, half-length PCIe board (Fig. 4). The PCIe board includes a x8 PCIe interface. The X1P1 board has a single chip while the X1P4 board incorporates four chips. Both plug into a x8 PCIe slot. The reason for going that route rather than x16 for the X1P4 is that server motherboards typically have more x8 slots than x16, and the throughput difference for ML applications is minimal. As a result, more boards can be packed into a server. The X1P1 is only $499, while the X1P4 goes for $999.

4. The InferX X1 is available on a PCIe board. The X1P1 has a single chip, while the X1P4 includes four chips.

The X1M M.2 version is expected to arrive soon. The 22- × 80-mm module has a x4 PCIe interface and will be available in 2021. It targets embedded servers, PCs, and laptops.

Read more:
Machine-Learning Inference Chip Travels to the Edge - Electronic Design

Read More..

Abstract Perspective: Long-term PM2.5 Exposure and the Clinical Application of Machine Learning for Predicting Incident Atrial Fibrillation – DocWire…

READ THE ORIGINAL ABSTRACT HERE.

Although atrial fibrillation (AF) often leads to complications such as stroke in patients without an awareness of such preexisting diseases, electrocardiogram screening is not sufficient to detect AF in the general population. Some scoring systems for predicting incident AF have been introduced, including the CHADS2, CHA2DS2-VASc, and HATCH scores; however, their prediction accuracies are not sufficient for wide application. Although epidemiological studies have suggested that an elevated level of ambient particulate matter <2.5 μm in aerodynamic diameter (PM2.5) is consistently associated with adverse cardiac events and arrhythmias, including AF, the role of PM2.5 in incident AF remains to be investigated.

We previously published on the association between PM2.5 and incident AF in the general population; in this study we then applied machine learning methods, which showed improved accuracy for predicting incident AF in the general population when averaged PM2.5 concentration was added, compared to the existing risk scoring systems.

And finally, we developed an online individual risk calculator for predicting incident AF that incorporates averaged PM2.5 concentration (https://ml4everyoneisk2.shinyapps.io/RiskCalcAFPM25_ISK). This study was performed with a South Korean general population exposed to high levels of air pollution. Further external validation is warranted, especially in Western countries with low levels of air pollution.

-Dr. Im-Soo Kim, co-author
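
The abstract excerpt doesn't specify the machine learning model. Purely as an illustration of the comparison the authors describe (a baseline clinical risk score versus the same score plus averaged PM2.5), here is a synthetic-data sketch; the effect sizes and features are invented:

```python
# Hedged sketch: does adding PM2.5 exposure improve AF risk prediction?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 8000
base_score = rng.integers(0, 7, n)       # stand-in CHA2DS2-VASc-like score
pm25 = rng.gamma(4.0, 6.0, n)            # stand-in long-term PM2.5 (ug/m3)
logit = -5 + 0.5 * base_score + 0.03 * pm25
af = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic incident-AF label

Xb = base_score.reshape(-1, 1)                  # baseline feature only
Xp = np.column_stack([base_score, pm25])        # baseline + PM2.5
Xb_tr, Xb_te, Xp_tr, Xp_te, y_tr, y_te = train_test_split(Xb, Xp, af, random_state=5)

auc_base = roc_auc_score(y_te, LogisticRegression().fit(Xb_tr, y_tr).predict_proba(Xb_te)[:, 1])
auc_pm25 = roc_auc_score(y_te, LogisticRegression().fit(Xp_tr, y_tr).predict_proba(Xp_te)[:, 1])
print(f"AUC baseline: {auc_base:.3f}  with PM2.5: {auc_pm25:.3f}")
```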

The rest is here:
Abstract Perspective: Long-term PM2.5 Exposure and the Clinical Application of Machine Learning for Predicting Incident Atrial Fibrillation - DocWire...

Read More..

Machine Learning Data Catalog Software Market share forecast to witness considerable growth from 2020 to 2025 | By Top Leading Vendors IBM, Alation,…

The latest research report, titled Global Machine Learning Data Catalog Software Market Insights, Forecast to 2025, includes a special section on the impact of COVID-19. It covers the Machine Learning Data Catalog Software market by major key players, types, applications, and leading regions, with segment outlooks, business assessments, competition scenarios, and trends. The report also gives a 360-degree overview of the competitive landscape of the industry. SWOT analysis has been used to understand the strengths, weaknesses, opportunities, and threats facing the businesses. Moreover, it offers estimations of the CAGR, market share, and market size of key regions and countries. Players can use this study to explore untapped Machine Learning Data Catalog Software markets to extend their reach and create sales opportunities.

Top key players profiled in the report include: IBM, Alation, Oracle, Cloudera, Unifi, Anzo Smart Data Lake (ASDL), Collibra, Informatica, Hortonworks, Reltio, Talend, and more.

To Get a PDF Sample Copy of the Report (with COVID-19 Impact Analysis): https://www.globmarketreports.com/request-sample/9714

The Machine Learning Data Catalog Software market competitive landscape section offers details by company, providing a complete analysis and precise statistics on revenue by the major participants for the period 2020-2025. The report also illustrates in minute detail the micro- and macroeconomic factors governing the Machine Learning Data Catalog Software market that seem to have a dominant and long-term impact, directing the course of popular trends in the global market.

Market Type Segmentation: Cloud Based, Web Based

Industry Segmentation: Large Enterprises, SMEs

Regions Covered in the Global Machine Learning Data Catalog Software Market:
The Middle East and Africa (GCC Countries and Egypt)
North America (the United States, Mexico, and Canada)
South America (Brazil, etc.)
Europe (Turkey, Germany, Russia, UK, Italy, France, etc.)
Asia-Pacific (Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia, and Australia)

Years Considered to Estimate the Market Size:
History Year: 2015-2019
Base Year: 2019
Estimated Year: 2020
Forecast Year: 2020-2025

Get Chance of up to 30% Extra Discount @ https://www.globmarketreports.com/request-discount/9714

Some Major TOC Points:

For More Information, including the full TOC: https://www.globmarketreports.com/industry-reports/9714/Machine-Learning-Data-Catalog-Software-market

Customization of the Report: Glob Market Reports provides customization of reports as per your needs. This report can be personalized to meet your requirements. Get in touch with our sales team, who will guarantee you get a report that suits your needs.

Get Customization of the Report: https://www.globmarketreports.com/report/request-customization/9714/Machine-Learning-Data-Catalog-Software-market

Get in Touch with Us:
Mr. Marcus Kel
Call: +1 915 229 3004 (U.S.), +44 7452 242832 (U.K.)
Email: [emailprotected]

More here:
Machine Learning Data Catalog Software Market share forecast to witness considerable growth from 2020 to 2025 | By Top Leading Vendors IBM, Alation,...

Read More..

Every Thing You Need to Know About Quantum Computers – Analytics Insight

Quantum computers are machines that use the properties of quantum physics to store data and perform calculations based on the probability of an object's state before it is measured. This can be extremely advantageous for certain tasks, where they could vastly outperform even the best supercomputers.

Quantum computers can process massive and complex datasets more efficiently than classical computers. They use the fundamentals of quantum mechanics to speed up the process of solving complex calculations. Often, these computations incorporate a seemingly unlimited number of variables, and the potential applications span industries from genomics to finance.

Classical computers, which include smartphones and laptops, carry out logical operations using the definite position of a physical state. They encode information in binary bits that can either be 0s or 1s. In quantum computing, operations instead use the quantum state of an object to produce the basic unit of memory, called a quantum bit, or qubit. Qubits are made using physical systems, such as the spin of an electron or the orientation of a photon. These systems can be in many different arrangements all at once, a property known as quantum superposition. Qubits can also be inextricably linked together using a phenomenon called quantum entanglement. The result is that a series of qubits can represent different things simultaneously. These states are the undefined properties of an object before they've been detected, such as the spin of an electron or the polarization of a photon.
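
In standard notation (ours, not the article's), a qubit's superposition and a two-qubit entangled state look like this:

```latex
% A qubit in superposition: measuring yields 0 with probability |\alpha|^2
% and 1 with probability |\beta|^2.
\[ \lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle,
   \qquad \lvert\alpha\rvert^2 + \lvert\beta\rvert^2 = 1 \]
% A maximally entangled (Bell) state: the two qubits' measurement outcomes
% are perfectly correlated even though neither is determined in advance.
\[ \lvert\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(\lvert 00\rangle + \lvert 11\rangle\bigr) \]
```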

Instead of having a clear position, unmeasured quantum states occur in a mixed superposition that can be entangled with the states of other objects, meaning their final outcomes will be mathematically related even before they are measured. The complex mathematics behind these unsettled, entangled states can be plugged into special algorithms to make short work of problems that would take a classical computer a long time to work out.

American physicist and Nobel laureate Richard Feynman remarked on the potential of quantum computers as early as 1959. He stated that when electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which might be exploited in the design of more powerful computers.

During the 1980s and 1990s, the theory of quantum computers advanced considerably beyond Feynman's early speculation. In 1985, David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer. In 1994, Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits. Later, in 1998, Isaac Chuang of Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California created the first quantum computer with 2 qubits that could be loaded with data and output a solution.

Recently, physicist David Wineland and his colleagues at the US National Institute of Standards and Technology (NIST) announced that they have created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. Today, quantum computing is poised to upend entire industries, from telecommunications and cybersecurity to advanced manufacturing, finance, medicine, and beyond.

There are three primary types of quantum computing. Each type differs by the amount of processing power (qubits) needed and the number of possible applications, as well as the time required to become commercially viable.

Quantum annealing is best for solving optimization problems. Researchers are trying to find the best and most efficient possible configuration among many possible combinations of variables.

Volkswagen recently conducted a quantum experiment to optimize traffic flows in the overcrowded city of Beijing, China. The experiment was run in partnership with Google and D-Wave Systems. The Canadian company D-Wave developed the quantum annealer used, though it is difficult to tell whether it actually exhibits any real quantumness so far. The algorithm successfully reduced traffic by choosing the ideal path for each vehicle.
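
D-Wave's hardware and Volkswagen's formulation aren't detailed here, but the flavor of the problem (assign each vehicle a route so that shared congestion is minimized) can be sketched with classical simulated annealing, the algorithm that quantum annealing is named after:

```python
# Classical simulated-annealing sketch of a toy traffic-assignment problem.
# Not D-Wave's quantum annealer; just the classical analogue of the idea.
import math
import random

random.seed(3)
n_vehicles, n_routes = 50, 3

def congestion(assign):
    # Cost grows quadratically with how many vehicles share each route.
    counts = [assign.count(r) for r in range(n_routes)]
    return sum(c * c for c in counts)

assign = [random.randrange(n_routes) for _ in range(n_vehicles)]
cost, temperature = congestion(assign), 10.0
for _ in range(20_000):
    i, r = random.randrange(n_vehicles), random.randrange(n_routes)
    old = assign[i]
    assign[i] = r
    new_cost = congestion(assign)
    # Accept improvements always; accept worse moves with decaying probability.
    if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temperature):
        cost = new_cost
    else:
        assign[i] = old
    temperature *= 0.9995
print("final congestion cost:", cost)  # optimum is ~17/17/16 vehicles per route
```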

Quantum simulations explore specific problems in quantum physics that are beyond the capacity of classical systems. Simulating complex quantum phenomena could be one of the most important applications of quantum computing. One area that is particularly promising for simulation is modeling the effect of a chemical stimulation on a large number of subatomic particles also known as quantum chemistry.

Universal quantum computers are the most powerful and most generally applicable, but also the hardest to build. Remarkably, a universal quantum computer would likely make use of over 100,000 qubits, and some estimates put it at 1 million qubits. Disappointingly, the most qubits we can access now is just 128. The basic idea behind the universal quantum computer is that you could direct the machine at any massively complex computation and get a quick solution. This includes solving the aforementioned annealing equations, simulating quantum phenomena, and more.

Read more:
Every Thing You Need to Know About Quantum Computers - Analytics Insight

Read More..

Quantum computing will impact the enterprise–we just don’t know how – TechRepublic

Quantum computing promises to take on problems that were previously unsolvable. This whole new level of compute power will make it possible to crunch incredible volumes of data that traditional computers can't manage. It will allow researchers to develop new antibiotics, polymers, electrolytes, and so much more.

While the options for quantum computing uses may seem endless, the enterprise is still deciding if this is all just a pipe dream or a future reality.

TechRepublic Premium recently surveyed 598 professionals to learn what they know about quantum computing and what they don't. This report will fill in some of those gaps.

The survey asked the following questions:

Quantum computing is unknown territory for almost all of the survey respondents, as 90% stated that they had little to no understanding of the topic. In fact, only 11% of the 598 respondents said they had an excellent understanding of quantum computing.

Further, 36% of respondents said they were not sure which company was leading the race to develop a quantum computer. IBM got 28% of the votes, and Google got 18%. 1QBit and D-Wave each got 6% of votes. Honeywell came in at 3%.

In terms of industry impact, more than half of the respondents (58%) said that quantum computing will have either a significant impact or somewhat of an impact on the enterprise. While all industries will benefit through different use cases, because quantum computing allows data to be consumed and processed faster while using less energy, 42% of survey respondents said IT would benefit the most. The pharmaceutical and finance sectors followed at 14% and 12%, respectively.

To read all of the survey results, plus analysis, download the full report.

See the rest here:
Quantum computing will impact the enterprise--we just don't know how - TechRepublic

Read More..

Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright Brothers' first flight or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a Quantum Computer that could perform a calculation considered impossible for the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It is a field that has been a point of interest for physicists and computer researchers since the 1980s. Google's announcement, however, has brought it to the mainstream and shone a spotlight on the promise this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise of powering major advances in fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been nothing more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in this nascent field, giving rise to multiple projects across the world in pursuit of creating a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, like RSA, Diffie-Hellman, and others, rely for their security on very hard mathematical problems, such as prime factorization, which a quantum computer would be able to solve much faster than a classical computer.

For example, it would take a classical computer centuries, or even longer, to break modern algorithms like DH or RSA-2048 by using brute-force methods. However, given the power and efficiency of quantum machines in calculations such as finding the prime factors of large numbers, it may be possible for a quantum computer to break current asymmetric algorithms in a matter of days.
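
A toy example makes the asymmetry concrete: classical trial division takes time roughly proportional to the square root of the smallest factor, which is exponential in the bit-length of the modulus, while Shor's algorithm on a quantum computer runs in polynomial time. (Small primes below for illustration; real RSA-2048 uses primes of roughly 1024 bits.)

```python
# Classical factoring by trial division: instant for small semiprimes,
# infeasible at RSA sizes. This is the asymmetry quantum computers would upend.
def smallest_factor(n: int) -> int:
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n

p, q = 1_000_003, 1_000_033      # toy primes; RSA-2048 primes are ~1024 bits
print(smallest_factor(p * q))    # fast here, hopeless for a 2048-bit modulus
```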

So, while the encrypted internet is not at risk at the moment, all a bad actor has to do is capture the encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is a particular problem for organizations that have large amounts of sensitive data to protect over the long term, such as banks, governments, and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, now is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are far more robust and can withstand a brute-force attack by a quantum computer, i.e., that are quantum-resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort towards the standardization of post-quantum secure algorithms. However, given the lengthy process involved, this may take many years to fructify.

An alternative is to use Quantum Key Distribution (QKD) techniques with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys using the quantum properties of photons. Any attempt to tap this secure channel changes the quantum state of the photons and can be immediately detected, so the key cannot be intercepted without being noticed. One limitation of QKD in this method is the need for a dedicated optical channel that cannot span more than 50 km between the two terminals. Of course, this also means that the existing encryption devices or routers must be capable of ingesting such quantum-generated keys.
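
The article doesn't name a specific protocol; BB84 is the canonical QKD scheme, and a toy classical simulation of its basis-matching step conveys why tampering is detectable (a real implementation manipulates actual photon polarizations, not random numbers):

```python
# Toy BB84-style key-exchange simulation (illustrative assumption; the
# article does not specify which QKD protocol is used).
import random

random.seed(4)
n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # two polarization bases
bob_bases   = [random.choice("+x") for _ in range(n)]

# Measuring in the wrong basis gives a random outcome; an eavesdropper who
# measured first would inject detectable errors even into matching bases.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (not bits) and keep the matches.
key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print("sifted key:", key)
```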

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. However, recognizing that an implementable standard may be some years away, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim that leverage existing network devices, like the routers most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer, and Lionel Florit) and the engineering team in Cisco India, led by Amjad Inamdar and Ramas Rangaswamy, developed an API interface called the Secure Key Import Protocol, or SKIP, through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, this team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum Supremacy is an event that demonstrates a quantum machine solving a problem no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies joining the bandwagon, and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable, and only a question of time. When it does happen, currently used cryptography techniques will become vulnerable, and therefore limited in their security. The good news is that there are methods available to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC and the views expressed in this article are his own)

More:
Quantum Computing and the Cryptography Conundrum - CXOToday.com

Read More..