
ServiceNow pulls on its platforms, talks up machine learning, analytics in biggest release since ex-SAP boss took reins – The Register

As is the way with the 21st century, IT companies are apt to get meta and ServiceNow is no exception.

In its biggest product release since the arrival of SAP revenue-boosting Bill McDermott as new CEO, the cloudy business process company is positioning itself as the "platform of platforms". Which goes to show, if nothing else, that platformization also applies to platforms.

To avoid plunging into an Escher-esque tailspin of abstraction, it is best to look at what Now Platform Orlando actually does and who, if anyone, it might help.

The idea is that ServiceNow's tools make routine business activity much easier and slicker. To this the company is adding intelligence, analytics and AI, it said.

Take the arrival of a new employee. They might need to be set up on HR and payroll systems, get access to IT equipment and applications, have facilities management give them the right desk and workspace, be given building security access and perhaps have to sign some legal documents.

Rather than multiple people doing each of these tasks with different IT systems, ServiceNow will make one poor soul do it using its single platform, which accesses all the other prerequisite applications, said David Flesh, ServiceNow product marketing director.
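To make that fan-out concrete, here is a minimal sketch of a single onboarding workflow that dispatches tasks to the various back-end systems mentioned above. Every system name and function in it is a hypothetical placeholder for illustration, not a ServiceNow API.

    # Minimal sketch of the "one platform fans out to many systems" idea.
    # All system names and functions are hypothetical placeholders.

    def onboard_employee(name: str, role: str) -> list[str]:
        """Run every onboarding task from a single workflow and report the results."""
        tasks = [
            ("HR/payroll record", lambda: f"payroll account created for {name}"),
            ("IT access", lambda: f"laptop and app licences assigned to {name}"),
            ("Facilities", lambda: f"desk and workspace reserved for a {role}"),
            ("Building security", lambda: f"badge issued to {name}"),
            ("Legal", lambda: f"contract sent to {name} for signature"),
        ]
        return [f"{system}: {action()}" for system, action in tasks]

    if __name__ == "__main__":
        for line in onboard_employee("Ada Lovelace", "data engineer"):
            print(line)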

It is also chucking chatbots at that luckless staffer. In January, ServiceNow bought Passage AI, a startup that helps customers build chatbots in multiple languages. It is using this technology to create virtual assistants to help with some of the most common requests that hit HR and IT service desks, for example password resets, getting access to Wi-Fi, that kind of thing.

This can also mean staffers don't have to worry about where to send requests, meaning if, for example, they've just found out they're going to become a parent, they can fire questions at an agent rather than HR, their boss or the finance team. The firm said: "Agents are a great way for employees to find information, and they abstract that organizational complexity."

ServiceNow has also introduced machine learning, for example, in IT operations management, which uses systems data to identify when a service is degrading and what could be causing the problem. "You get more specific information about the cause and suggested actions to take to actually remediate the problem," Flesh said.

Customers looking to use this feature will still have to train the machine learning models on historic datasets from their operations and validate the models, as per the usual ML pipeline. But ServiceNow makes the process more graphical, and its platform comes with built-in knowledge of common predictors of operational problems.
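As a rough illustration of that pipeline, the sketch below trains a classifier on synthetic operations metrics to flag degrading services. It shows the generic train-and-validate steps the paragraph describes, built with scikit-learn on made-up data rather than ServiceNow's actual tooling.

    # Generic train/validate sketch on synthetic operations metrics.
    # Illustrates the usual ML pipeline, not ServiceNow's models.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(0)
    n = 2000
    # Features: CPU %, memory %, errors per minute, p95 latency (ms)
    X = np.column_stack([
        rng.uniform(0, 100, n),
        rng.uniform(0, 100, n),
        rng.poisson(2, n),
        rng.gamma(2.0, 50.0, n),
    ])
    # Toy labelling rule: "degrading" when resource pressure and latency are both high
    y = ((X[:, 0] > 80) & (X[:, 3] > 150)) | (X[:, 2] > 6)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))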

Lastly, analytics is a new feature in the update. Users can include key performance indicators in the workflows they create, and the platform includes the tools to track and analyse those KPIs and suggest how to improve performance. It also suggests useful KPIs.

Another application of the analytics tools is for IT teams - traditionally the company's core users - monitoring cloud services. ServiceNow said it helps optimise organisations' cloud usage by "making intelligent recommendations on managing usage across business hours, choosing the right resources and enforcing usage policies".

With McDermott's arrival and a slew of new features and customer references, ServiceNow is getting a lot of attention, but many of these technologies exist in other products.

There are independent robotic process automation (RPA) vendors who build automation into common tasks, while application vendors are also introducing automation within their own environments. But as application and platform upgrade cycles are sluggish, and RPA has proved difficult to scale, ServiceNow may find a receptive audience for its, er, platform of platforms.


AI in the Translation Industry The 5-10 Year Outlook – AiThority

Artificial intelligence (AI) has had a major and positive impact on a range of industries already, with the potential to give much more in the future. We sat down with Ofer Tirosh, CEO of Tomedes, to find out how the translation industry has changed as a result of advances in technology over the past 10 years and what the future might hold in store for it.

Translation services have felt the impact of technology in various positive ways during recent years. For individual translators, the range and quality of computer-assisted translation (CAT) tools have increased massively. A CAT tool is a piece of software that supports the translation process. It helps the translator to edit and manage their translations.

CAT tools usually include translation memories, which are particularly valuable to translators. They store sentences and their translations for future use and can save a vast amount of time during the translation process. This means that translators can work more efficiently, without compromising on quality.
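As a toy sketch of that idea (assuming nothing about any particular CAT tool), a translation memory is essentially a store of segment pairs plus a fuzzy lookup that surfaces earlier work for reuse:

    # Toy translation memory: store segment pairs, return exact or fuzzy matches.
    # Not modelled on any specific CAT tool.
    from difflib import SequenceMatcher

    class TranslationMemory:
        def __init__(self):
            self.segments: dict[str, str] = {}

        def add(self, source: str, target: str) -> None:
            self.segments[source] = target

        def lookup(self, source: str, threshold: float = 0.75):
            """Return (stored_source, translation, similarity) for the best match, if any."""
            if not self.segments:
                return None
            best_src = max(self.segments, key=lambda s: SequenceMatcher(None, source, s).ratio())
            score = SequenceMatcher(None, source, best_src).ratio()
            return (best_src, self.segments[best_src], score) if score >= threshold else None

    tm = TranslationMemory()
    tm.add("The invoice is due within 30 days.", "La facture est payable sous 30 jours.")
    print(tm.lookup("The invoice is due within 14 days."))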

There are myriad other ways that technology has helped the industry. Everything from transcription to localization services has become faster and better as a result of tech advances. Even things like contract automation make a difference, as they speed up the overall time taken to set up and deliver on each client contract.


Machine translation is an issue that affects not just our translation agency but the industry as a whole. Human translation still outdoes machine translation in terms of quality, but the fact that websites that can translate for free are widely available has tempted many companies to try machine translation. The resulting translations are not good quality, and this acceptance of below-par translations isn't great for the industry as a whole, as it drives down standards.

There were some fears around machine translation taking over from professional translation services when machine learning was first used to move away from statistical-based machine translation. However, those fears haven't really materialized. Indeed, the Bureau of Labor Statistics is projecting 19% growth for the employment of interpreters and translators between 2018 and 2028, which is well above the average growth rate.

Instead, the industry has adapted to work alongside the new machine translation technology, with translators providing post-editing machine translation services, which essentially tidy up computerized attempts at translation and turn them into high-quality documents that accurately reflect the original content.

It was the introduction of neural networks that really took machine language learning to the next level. Previously, computers relied on the analysis of phrases (and before that, words) from existing human translations in order to produce a translation. The results were far from ideal.

Neural networks have provided a different way forward. A machine learning algorithm is used so that the machine can explore the data in its own way, learning and progressing in ways that were not previously possible. What is particularly exciting about this approach is the adaptability of the model that the machine creates. It's not a static process but one that can flex and change over time and based on new data.


I think the fears of machines taking over from human translation professionals have been put to bed for now. Yes, machines can translate better than they used to, but they still can't translate as well as humans can.

I think that we'll see a continuation of the trend towards more audio and video translation. Video, in particular, has become such an important marketing and social connection tool that demand for video translation is likely to boom in the years ahead, just as it has for the past few years.

I've not had access yet to any Predictive Intelligence data for the translation industry, unfortunately, but we're definitely likely to experience an increase in demand for more blended human and machine translation models over the coming years. There's an increasing need to translate faster without a drop in quality, for example in relation to the spread of coronavirus. We need to ensure a smooth, rapid flow of accurate information from country to country in order to tackle the situation as a global issue and not a series of local ones. That's where both machines and humans can support the delivery of high quality, fast translation services, by working together to achieve maximum efficiency.

AI has had a major impact on the translation industry over the past ten years and I expect the pace of change over the next ten to be even greater, as the technology continues to advance.



2020-2027 Machine Learning in Healthcare Cybersecurity Industry Trends Survey and Prospects Report – 3rd Watch News

Summary: Global Machine Learning in Healthcare Cybersecurity Market 2020 by Company, Regions, Type and Application, Forecast to 2027

This report gives in-depth research on the overall state of the Machine Learning in Healthcare Cybersecurity market and an overview of its growth. It also covers the crucial elements of the market across the major global regions in detail. A number of primary and secondary research methods were used to collect the data required to complete this report, and several industry-based analytical techniques were applied for a better understanding of the market.

It explains the key market drivers, trends, restraints and opportunities to give the precise data that readers require and expect. It also analyzes how these aspects affect the market globally, helping readers make a wider and better-informed choice about market positioning. The Machine Learning in Healthcare Cybersecurity market's growth and developments are studied and a detailed overview is given.

Get a sample copy of this report: Global Machine Learning in Healthcare Cybersecurity Market 2020, Forecast to 2027

Leading Key Players: (further details are available in the sample; enquire via the links provided)

This report studies the Machine Learning in Healthcare Cybersecurity market status and outlook of the global and major regions from the angles of players, countries, product types and end industries; it analyzes the top players in the global market and splits the Machine Learning in Healthcare Cybersecurity market by product type and applications/end industries.

Regions and Countries Level Analysis

Regional analysis is another highly comprehensive part of the research and analysis study of the global Machine Learning in Healthcare Cybersecurity market presented in the report. This section sheds light on the sales growth of different regional and country-level Machine Learning in Healthcare Cybersecurity markets. For the historical and forecast period 2015 to 2027, it provides detailed and accurate country-wise volume analysis and region-wise market size analysis of the global Machine Learning in Healthcare Cybersecurity market.

The report offers in-depth assessment of the growth and other aspects of the Machine Learning in Healthcare Cybersecurity market in important countries (regions), including:

North America (United States, Canada and Mexico)

Europe (Germany, France, UK, Russia and Italy)

Asia-Pacific (China, Japan, Korea, India and Southeast Asia)

South America (Brazil, Argentina, etc.)

Middle East & Africa (Saudi Arabia, Egypt, Nigeria and South Africa)

THIS REPORT PROVIDES COMPREHENSIVE ANALYSIS OF

Key market segments and sub-segments

Evolving market trends and dynamics

Changing supply and demand scenarios

Quantifying market opportunities through market sizing and market forecasting

Tracking current trends/opportunities/challenges

Competitive insights

Opportunity mapping in terms of technological breakthroughs

Machine Learning in Healthcare Cybersecurity Application Services

Reasons to buy

Identify high potential categories and explore further market opportunities based on detailed value and volume analysis

Existing and new players can analyze key distribution channels to identify and evaluate trends and opportunities

Gain an understanding of the total competitive landscape based on detailed brand share analysis to plan effective market positioning

Our team of analysts have placed a significant emphasis on changes expected in the market that will provide a clear picture of the opportunities that can be tapped over the next five years, resulting in revenue expansion

Analysis of key macro-economic indicators such as real GDP, nominal GDP, consumer price index, household consumption expenditure, and population (by age group, gender, rural-urban split, employed people and unemployment rate). It also includes an economic summary of the country along with labor market and demographic trends.

TABLE OF CONTENTS:

Global Machine Learning in Healthcare Cybersecurity Market 2020 by Company, Regions, Type and Application, Forecast to 2027

1 Market Overview

2 Manufacturers Profiles

3 Sales, Revenue and Market Share by Manufacturer

4 Global Market Analysis by Regions

5 North America by Country

6 Europe by Country

7 Asia-Pacific by Regions

8 South America by Country

9 Middle East & Africa by Countries

10 Market Segment by Type

11 Global Machine Learning in Healthcare Cybersecurity Market Segment by Application

12 Market Forecast

13 Sales Channel, Distributors, Traders and Dealers

14 Research Findings and Conclusion

15 Appendix

Enquire for complete report: Global Machine Learning in Healthcare Cybersecurity Market 2020, Forecast to 2027

About Reports and Markets:

REPORTS AND MARKETS is not just another company in this domain; it is part of a veteran group called Algoro Research Consultants Pvt. Ltd. It offers premium progressive statistical surveying, market research reports, and analysis & forecast data for a wide range of sectors, for both government and private agencies across the world. The company's database is updated on a daily basis and covers a variety of industry verticals, including Food & Beverage, Automotive, Chemicals and Energy, IT & Telecom, Consumer, and Healthcare. Each and every report goes through the appropriate research methodology and is checked by professionals and analysts.

Contact Info

Reports and Markets

Sanjay Jain

Manager Partner Relations & International Marketing

http://www.reportsandmarkets.com

Connect with Us:LinkedIn|Facebook|Twitter

Ph: +1-352-353-0818 (US)


Owkin and the University of Pittsburgh Launch a Collaboration to Advance Cancer Research With AI and Federated Learning – AiThority

Owkin, a startup that deploys AI and Federated Learning technologies to augment medical research and enable scientific discoveries, announces a collaboration with the University of Pittsburgh. This pilot leverages the high-quality datasets and world-class medical research within Pitt's Departments of Biomedical Informatics and Pathology, as well as Owkin's pioneering technologies and research platform. Collaborations such as these have the potential to advance clinical research and drug development.

Pitt researchers led by Michael Becich, MD, PhD, Associate Vice Chancellor for Informatics in the Health Sciences and Chairman and Distinguished University Professor of the Department of Biomedical Informatics (DBMI), will team up with Owkin to develop and validate prognostic machine learning models. The pilot project will then have the potential to expand into several key therapeutic areas for the University.


"The Pitt Department of Biomedical Informatics, in partnership with the Department of Pathology, is committed to improving biomedical research and clinical care through the innovative application of informatics and best practices in next-generation data sharing. This collaboration with Owkin will expand our innovations in the computational pathology space," Dr. Becich said. "Our currently funded projects explore areas such as the intersection of genomics and machine learning applied to histopathologic imaging (computational pathology) to broaden our understanding of the role of the tumor microenvironment for precision immuno-oncology."

This partnership makes it possible for Pitt to join the Owkin Loop, a federated network of US and European academic medical centers that collaborate with Owkin to generate new insights from high-quality, curated, research-grade, multi-modal patient data captured in clinical trials or research cohorts. Loop-generated insights can inform pharmaceutical drug development strategy, from biomarker discovery to clinical trial design and product differentiation. Owkin seeks to create a movement in medicine by establishing federated learning at the core of future research.


Federated learning technologies enable researchers in different institutions and different geographies to collaborate and train multicentric AI models on heterogeneous datasets, resulting in better predictive performance and higher generalizability. Data does not move, only the algorithms travel, thus protecting an institution's data governance and privacy. Furthermore, Owkin's data use is compliant with local ethical body consent processes and data compliance regulations such as HIPAA and GDPR.
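A minimal federated-averaging sketch of that principle, using invented data and names rather than Owkin's actual platform: each institution fits a model on its own records, and only the fitted parameters leave the site to be averaged.

    # Minimal federated-averaging sketch: data stays local, parameters travel.
    # Illustrative only; not Owkin's platform.
    import numpy as np

    def local_fit(X: np.ndarray, y: np.ndarray) -> np.ndarray:
        """Least-squares fit computed entirely inside one institution."""
        return np.linalg.lstsq(X, y, rcond=None)[0]

    rng = np.random.default_rng(1)
    true_w = np.array([0.5, -1.2, 2.0])

    site_models = []
    for _ in range(3):                       # three institutions; data is never pooled
        X = rng.normal(size=(200, 3))
        y = X @ true_w + rng.normal(scale=0.1, size=200)
        site_models.append(local_fit(X, y))

    global_model = np.mean(site_models, axis=0)   # only the coefficients are shared
    print("federated estimate:", np.round(global_model, 3))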

"We're thrilled to launch this project with Dr. Becich and his team at Pitt. The quality and size of the University's research cohorts, in combination with the DBMI's mandate to bring together healthcare physicians and innovative academics to work on some of the most cutting-edge science, makes this collaboration a great opportunity to develop predictive AI models and to scale other research in the future. Owkin is proud to bring their expertise in machine learning technologies and data scientists to the table to foment new clinical insights," said Meriem Sefta, Owkin Head of Partnerships.



Next-gen supercomputers are fast-tracking treatments for the coronavirus in a race against time – CNBC

A computer image created by Nexu Science Communication together with Trinity College in Dublin, shows a model structurally representative of a betacoronavirus which is the type of virus linked to COVID-19.

Source: NEXU Science Communication | Reuters

Research has gone digital, and medical science is no exception. As the novel coronavirus continues to spread, for instance, scientists searching for a treatment have drafted IBM's Summit supercomputer, the world's most powerful high-performance computing facility, according to the Top500 list, to help find promising candidate drugs.

One way of treating an infection could be with a compound that sticks to a certain part of the virus, disarming it. With tens of thousands of processors spanning an area as large as two tennis courts, the Summit facility at Oak Ridge National Laboratory (ORNL) has more computational power than 1 million top-of-the-line laptops. Using that muscle, researchers digitally simulated how 8,000 different molecules would interact with the virus, a Herculean task for your typical personal computer.

"It took us a day or two, whereas it has traditionally taken months on a normal computer," said Jeremy Smith, director of the University of Tennessee/ORNL Center for Molecular Biophysics and principal researcher in the study.

Simulations alone can't prove a treatment will work, but the project was able to identify 77 candidate molecules that other researchers can now test in trials. The fight against the novel coronavirus is just one example of how supercomputers have become an essential part of the process of discovery. The $200 million Summit and similar machines also simulate the birth of the universe, explosions from atomic weapons and a host of events too complicated or too violent to recreate in a lab.

The current generation's formidable power is just a taste of what's to come. Aurora, a $500 million Intel machine currently under installation at Argonne National Laboratory, will herald the long-awaited arrival of "exaflop" facilities capable of a billion billion calculations per second (five times more than Summit) in 2021 with others to follow. China, Japan and the European Union are all expected to switch on similar "exascale" systems in the next five years.

These new machines will enable new discoveries, but only for the select few researchers with the programming know-how required to efficiently marshal their considerable resources. What's more, technological hurdles lead some experts to believe that exascale computing might be the end of the line. For these reasons, scientists are increasingly attempting to harness artificial intelligence to accomplish more research with less computational power.

"We as an industry have become too captive to building systems that execute the benchmark well without necessarily paying attention to how systems are used," says Dave Turek, vice president of technical computing for IBM Cognitive Systems. He likens high-performance computing record-seeking to focusing on building the world's fastest race car instead of highway-ready minivans. "The ability to inform the classic ways of doing HPC with AI becomes really the innovation wave that's coursing through HPC today."

Just getting to the verge of exascale computing has taken a decade of research and collaboration between the Department of Energy and private vendors. "It's been a journey," says Patricia Damkroger, general manager of Intel's high-performance computing division. "Ten years ago, they said it couldn't be done."

While each system has its own unique architecture, Summit, Aurora, and the upcoming Frontier supercomputer all represent variations on a theme: they harness the immense power of graphical processing units (GPUs) alongside traditional central processing units (CPUs). GPUs can carry out more simultaneous operations than a CPU can, so leaning on these workhorses has let Intel and IBM design machines that would have otherwise required untold megawatts of energy.

IBM's Summit supercomputer currently holds the record for the world's fastest supercomputer.

Source: IBM

That computational power lets Summit, which is known as a "pre-exascale" computer because it runs at 0.2 exaflops, simulate one single supernova explosion in about two months, according to Bronson Messer, the acting director of science for the Oak Ridge Leadership Computing Facility. He hopes that machines like Aurora (1 exaflop) and the upcoming Frontier supercomputer (1.5 exaflops) will get that time down to about a week. Damkroger looks forward to medical applications. Where current supercomputers can digitally model a single heart, for instance, exascale machines will be able to simulate how the heart works together with blood vessels, she predicts.

But even as exascale developers take a victory lap, they know that two challenges mean the add-more-GPUs formula is likely approaching a plateau in its scientific usefulness. First, GPUs are strong but dumb, best suited to simple operations such as arithmetic and geometric calculations that they can crowdsource among their many components. Researchers have written simulations to run on flexible CPUs for decades, and shifting to GPUs often requires starting from scratch.

GPUs have thousands of cores for simultaneous computation, but each handles simple instructions.

Source: IBM

"The real issue that we're wrestling with at this point is how do we move our code over" from running on CPUs to running on GPUs, says Richard Loft, a computational scientist at the National Center for Atmospheric Research, home of Top500's 44th ranking supercomputerCheyenne, a CPU-based machine "It's labor intensive, and they're difficult to program."

Second, the more processors a machine has, the harder it is to coordinate the sharing of calculations. For the climate modeling that Loft does, machines with more processors better answer questions like "what is the chance of a once-in-a-millennium deluge," because they can run more identical simulations simultaneously and build up more robust statistics. But they don't ultimately enable the climate models themselves to get much more sophisticated.

For that, the actual processors have to get faster, a feat that bumps up against what's physically possible. Faster processors need smaller transistors, and current transistors measure about 7 nanometers. Companies might be able to shrink that size, Turek says, but only to a point. "You can't get to zero [nanometers]," he says. "You have to invoke other kinds of approaches."

If supercomputers can't get much more powerful, researchers will have to get smarter about how they use the facilities. Traditional computing is often an exercise in brute forcing a problem, and machine learning techniques may allow researchers to approach complex calculations with more finesse.


Take drug design. A pharmacist considering a dozen ingredients faces countless possible recipes with varying amounts of each compound, which could take a supercomputer years to simulate. An emerging machine learning technique known as Bayesian optimization asks, does the computer really need to check every single option? Rather than systematically sweeping the field, the method helps isolate the most promising drugs by implementing common-sense assumptions. Once it finds one reasonably effective solution, for instance, it might prioritize seeking small improvements with minor tweaks.
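The sketch below contrasts an exhaustive sweep with a small search budget that mostly refines around the best recipe found so far, the behaviour described above. It is a toy illustration of that budgeting idea, not a full Bayesian optimizer (which would also fit a surrogate model of the objective), and the potency function is invented.

    # Toy comparison: exhaustive sweep vs. a small budget that refines around the
    # best recipe found so far. The potency function is an invented stand-in for
    # an expensive simulation; a real Bayesian optimizer would add a surrogate model.
    import random

    def potency(dose_a: float, dose_b: float) -> float:
        return -(dose_a - 3.2) ** 2 - (dose_b - 7.5) ** 2

    random.seed(0)

    # Exhaustive sweep: a 100 x 100 grid means 10,000 simulated recipes
    grid_best = max(potency(a / 10, b / 10) for a in range(100) for b in range(100))

    # Budgeted search: 30 random probes, then 70 small tweaks around the best so far
    best_x, best_score = None, float("-inf")
    for i in range(100):
        if i < 30 or best_x is None:
            cand = (random.uniform(0, 10), random.uniform(0, 10))
        else:
            cand = (best_x[0] + random.gauss(0, 0.3), best_x[1] + random.gauss(0, 0.3))
        score = potency(*cand)
        if score > best_score:
            best_x, best_score = cand, score

    print(f"grid best {grid_best:.3f} in 10000 runs; budgeted best {best_score:.3f} in 100 runs")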

In trial-and-error fields like materials science and cosmetics, Turek says that this strategy can reduce the number of simulations needed by 70% to 90%. Recently, for instance, the technique has led to breakthroughs in battery design and the discovery of a new antibiotic.

Fields like climate science and particle physics use brute-force computation in a different way, by starting with simple mathematical laws of nature and calculating the behavior of complex systems. Climate models, for instance, try to predict how air currents conspire with forests, cities, and oceans to determine global temperature.

Mike Pritchard, a climatologist at the University of California, Irvine, hopes to figure out how clouds fit into this picture, but most current climate models are blind to features smaller than a few dozen miles wide. Crunching the numbers for a worldwide layer of clouds, which might be just a couple hundred feet tall, simply requires more mathematical brawn than any supercomputer can deliver.

Unless the computer understands how clouds interact better than we do, that is. Pritchard is one of many climatologists experimenting with training neural networks, a machine learning technique that looks for patterns by trial and error, to mimic cloud behavior. This approach takes a lot of computing power up front to generate realistic clouds for the neural network to imitate. But once the network has learned how to produce plausible cloudlike behavior, it can replace the computationally intensive laws of nature in the global model, at least in theory. "It's a very exciting time," Pritchard says. "It could be totally revolutionary, if it's credible."
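A minimal sketch of that emulator idea, with an invented stand-in for the expensive physics: generate training data from the costly routine once, fit a small neural network to it, then call the cheap network in its place inside the larger model.

    # Minimal emulator sketch: train a small neural network to imitate an
    # "expensive" routine, then use the cheap network in its place.
    # The physics function here is invented; a real cloud scheme is far more complex.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def expensive_cloud_physics(humidity, temperature):
        """Stand-in for a fine-scale simulation too costly to run everywhere."""
        return np.tanh(3 * humidity - 1.5) * np.exp(-((temperature - 290) / 15) ** 2)

    rng = np.random.default_rng(0)
    X = np.column_stack([rng.uniform(0, 1, 5000), rng.uniform(260, 320, 5000)])
    y = expensive_cloud_physics(X[:, 0], X[:, 1])      # costly step, done once offline

    emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    emulator.fit(X, y)

    # Inside the global model, the cheap emulator now stands in for the physics call
    test = np.array([[0.8, 285.0], [0.2, 305.0]])
    print("emulator:", np.round(emulator.predict(test), 3))
    print("truth:   ", np.round(expensive_cloud_physics(test[:, 0], test[:, 1]), 3))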

Companies are preparing their machines so researchers like Pritchard can take full advantage of the computational tools they're developing. Turek says IBM is focusing on designing AI-ready machines capable of extreme multitasking and quickly shuttling around huge quantities of information, and the Department of Energy contract for Aurora is Intel's first that specifies a benchmark for certain AI applications, according to Damkroger. Intel is also developing an open-source software toolkit called oneAPI that will make it easier for developers to create programs that run efficiently on a variety of processors, including CPUs and GPUs.

As exascale and machine learning tools become increasingly available, scientists hope they'll be able to move past the computer engineering and focus on making new discoveries. "When we get to exascale that's only going to be half the story," Messer says. "What we actually accomplish at the exascale will be what matters."


Quantum computing, AI, China, and synthetics highlighted in 2020 Tech Trends report – VentureBeat

The world's tech industry will be shaped by China, artificial intelligence, cancel culture, and other key trends, according to the Future Today Institute's 2020 Tech Trends Report.

Now in its thirteenth year, the document is put together by the Future Today Institute and director Amy Webb, who is also a professor at New York University's Stern School of Business. The report attempts to recognize connections between tech and future uncertainties, like the outcome of the 2020 U.S. presidential election, as well as the spread of diseases like COVID-19.

Among major trends in the report, 2020 is expected to be the synthetic decade.

"Soon we will produce designer molecules in a range of host cells on demand and at scale, which will lead to transformational improvements in vaccine production, tissue production, and medical treatments. Scientists will start to build entire human chromosomes, and they will design programmable proteins," the report reads.

Augmentation of senses like hearing and sight, social media scaremongering, new ways to measure trust, and China's role in the growth of AI are also listed among key takeaways.

Artificial intelligence is again the first item highlighted on the list, and the tech Webb says is sparking a third wave of computing comes with positives, like the role AlphaFold can play in discovering cures for diseases, as well as negatives, like AI's current impact on the criminal justice system.

Tech giants in the U.S. and China, like Amazon, Facebook, Google, and Microsoft in the United States and Tencent and Baidu in China, continue to deliver the greatest impact. Webb predicts how these companies will shape the world in her 2019 book The Big Nine.

"Those nine companies drive the majority of research, funding, government involvement, and consumer-grade applications of AI. University researchers and labs rely on these companies for data, tools, and funding," the report reads. "Big Nine AI companies also wield huge influence over AI mergers and acquisitions, funding AI startups, and supporting the next generation of developers."

Other AI trends include synthetic data, a military-tech industrial complex, and systems made to recognize people.

Visit the Future Today Institute website to read the full report, which flags trends that require immediate action and highlights trends by industry.

Webb urges readers to digest the 366-page report in multiple sittings, rather than trying to read it all at once. She typically debuts the report with a presentation to thousands at the SXSW conference in Austin, Texas, but the conference was cancelled due to COVID-19.


Early investment in quantum computing could result in a competitive advantage – Help Net Security

Improved AI capabilities, accelerated business intelligence, and increased productivity and efficiency were the top expectations of organizations currently investing in cloud-based quantum computing technologies, according to IDC.

Initial survey findings indicate that while cloud-based quantum computing is a young market, and allocated funds for quantum computing initiatives are limited (0-2% of IT budgets), end-users are optimistic that early investment will result in a competitive advantage.

The manufacturing, financial services, and security industries are currently leading the way by experimenting with more potential use cases, developing advanced prototypes, and being further along in their implementation status.

Complex technology, skillset limitations, lack of available resources, and cost deter some organizations from investing in quantum computing technology. These factors, combined with a large interdisciplinary interest, have forced quantum computing vendors to develop quantum computing technology that addresses multiple end-user needs and skill levels.

The result has led to increased availability of cloud-based quantum computing technology that is more easily accessible and user friendly for new end users. Currently, the preferred types of quantum computing technologies employed across industries include quantum algorithms, cloud-based quantum computing, quantum networks, and hybrid quantum computing.

"Quantum computing is the future industry and infrastructure disruptor for organizations looking to use large amounts of data, artificial intelligence, and machine learning to accelerate real-time business intelligence and innovate product development. Many organizations from many industries are already experimenting with its potential," said Heather West, senior research analyst, Infrastructure Systems, Platforms, and Technology at IDC.


UC Riverside to lead scalable quantum computing project using 3D printed ion traps – 3D Printing Industry

UC Riverside (UCR) is set to lead a project focused on enabling scalable quantum computing after winning a $3.75 million Multicampus-National Lab Collaborative Research and Training Award.

The collaborative effort will see contributions from UC Berkeley, UCLA and UC Santa Barbara, with UCR acting as project coordinator.

Scalable quantum computing

Quantum computing is currently in its infancy but it is expected to stretch far beyond the capabilities of conventional computing in the coming years. Intensive tasks such as modeling complex processes, finding large prime numbers, and designing new chemical compounds for medical use are what quantum computers are expected to excel at.

Quantum information is stored on quantum computers in the form of quantum bits, or qubits. This means that quantum systems can exist in two different states simultaneously as opposed to conventional computing systems which only exist in one state at a time. Current quantum computers are limited in their qubits, however, so for quantum computing to realize its true potential, new systems are going to have to be scalable and include many more qubits.

"The goal of this collaborative project is to establish a novel platform for quantum computing that is truly scalable up to many qubits," said Boerge Hemmerling, an assistant professor of physics and astronomy at UC Riverside and the lead principal investigator of the three-year project. "Current quantum computing technology is far away from experimentally controlling the large number of qubits required for fault-tolerant computing. This stands in large contrast to what has been achieved in conventional computer chips in classical computing."

3D printed ion trap microstructures

The research team will use advanced 3D printing technology, available at Lawrence Livermore National Laboratory, to fabricate microstructure ion traps for the new quantum computers. Ions are used to store qubits and quantum information is transferred when these ions move in their traps. According to UCR, trapped ions have the best potential for realizing scalable quantum computing.

Alongside UCR, UC Berkeley will enable high-fidelity quantum gates with the ion traps. UCLA will integrate fiber optics with the ion traps, UC Santa Barbara will put the traps through trials in cryogenic environments and demonstrate shuttling of ion strings while the Lawrence Berkeley National Laboratory will be used to characterize and develop new materials. The project coordinator, UCR, will develop simplified cooling schemes and research the possibility of trapping electrons with the traps.

"We have a unique opportunity here to join various groups within the UC system and combine their expertise to make something bigger than a single group could achieve," Hemmerling stated. "We anticipate that the microstructure 3D printed ion traps will outperform ion traps that have been used to date in terms of the storage time of the ions and ability to maintain and manipulate quantum information."

He adds, "Most importantly, our envisioned structures will be scalable in that we plan to build arrays of interconnected traps, similar to the very successful conventional computer chip design. We hope to establish these novel 3D-printed traps as a standard laboratory tool for quantum computing with major improvements over currently used technology."

Hemmerling's concluding remarks explain that many quantum computing approaches, while very promising, have fallen short of providing a scalable platform that is useful for processing complex tasks. If an applicable machine is to be built, new routes must be considered, starting with UCR's scalable computing project.

Early quantum technology work involving 3D printing has paved the way for UCRs future project. When cooled to near 0K, the quantum characteristics of atomic particles start to become apparent. Just last year, additive manufacturing R&D company Added Scientific 3D printed the first vacuum chamber capable of trapping clouds of cold atoms. Elsewhere, two-photon AM system manufacturer Nanoscribe introduced a new machine, the Quantum X, with micro-optic capabilities. The company expects its system to be useful in advancing quantum technology to the industrial level.


Featured image shows the University of California, Riverside campus. Photo via UCR.


Quantum Computing for Everyone – The Startup – Medium

Qubits are exponentially faster than bits in several computing problems, such as database searches and factoring (which, as we will discuss soon, may break your Internet encryption).

An important thing to realize is that qubits can hold much more information than a bit can. One bit holds the same amount of information as one qubit: they can both only hold one value. However, four bits must be used to store the same amount of information as two qubits. A two-qubit system in equal superposition holds values for four states, which, on a classical computer, would need at least four bits to hold. Eight bits are needed to store the same amount of information as three qubits, since a three-qubit system can store eight states: 000, 001, 010, 011, 100, 101, 110, and 111. This pattern continues.

The graph below provides a visual for the computing power of qubits. The x-axis represents the number of qubits used to hold a certain amount of information. The blue line's y value represents the number of bits needed to hold the same amount of information as the number of qubits on the x-axis, or 2 to the power of x. The red line's y value represents the number of qubits needed to hold that same amount of information (y = x).

Imagine the exponential speedup quantum computing can provide! A gigabyte (8E+09 bits) worth of information can be represented with log(8E+09)/log(2) = 33 (rounded up from 32.9) qubits.
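The counting argument can be checked directly. The short calculation below (a sketch that follows the article's own counting) reproduces the figures above, including the 33-qubit gigabyte example.

    # Mirrors the counting in the article: an n-qubit register has 2**n basis states,
    # so matching it classically takes 2**n bits, and covering N bits takes ceil(log2(N)) qubits.
    import math

    def bits_to_match(n_qubits: int) -> int:
        return 2 ** n_qubits

    def qubits_to_match(n_bits: int) -> int:
        return math.ceil(math.log2(n_bits))

    for n in (2, 3, 10, 33):
        print(f"{n} qubits <-> {bits_to_match(n):,} bits")

    print("one gigabyte (8e9 bits) <->", qubits_to_match(8 * 10**9), "qubits")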

Quantum computers are also great at factoring numbers, which leads us to RSA encryption. The security protocol that secures Medium and probably any other website you've been on is known as RSA encryption. It relies on the fact that, with current computing resources, it would take a very, very long time to factor a 30+-digit number m that has only one factorization, namely p times q, where both p and q are large prime numbers. However, dividing m by p or q is computationally much easier, and since m divided by q returns p and vice versa, it provides a quick key-verification system.
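A toy illustration of that asymmetry, using deliberately small primes rather than RSA-sized ones: verifying a known factor of m takes a single division, while recovering a factor from m alone means searching.

    # Verifying a factor is one division; finding one means searching.
    # These primes are tiny for demonstration; real RSA moduli have hundreds of digits.
    p, q = 104_729, 1_299_709          # two known primes
    m = p * q

    # Verification: anyone who knows p recovers q instantly
    assert m % p == 0 and m // p == q

    # Factoring: knowing only m, we must search for a divisor
    def trial_division(n: int) -> int:
        d = 3
        while d * d <= n:
            if n % d == 0:
                return d
            d += 2
        return n

    print("recovered factor:", trial_division(m))   # ~50,000 trial divisions here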

A quantum algorithm called Shor's algorithm has shown exponential speedup in factoring numbers, which could one day break RSA encryption. But don't buy into the hype yet: as of this writing, the largest number factored by quantum computers is 21 (into 3 and 7). The hardware has not been developed yet for quantum computers to factor 30-digit numbers or even 10-digit numbers. Even if quantum computers one day do break RSA encryption, a new security protocol called BB84 that relies on quantum properties is verified safe from quantum computers.

So will quantum computers ever completely replace the classical PC? Not in the foreseeable future.

Quantum computing, while developing very rapidly, is still in an infantile stage, with research only being conducted semi-competitively by large corporations like Google, Microsoft, and IBM. Much of the hardware to accelerate quantum computing is not currently available. There are several obstacles to a quantum future, of which a major one is addressing gate errors and maintaining integrity of a qubits state.

However, given the amount of innovation that has happened in the past few years, it seems inevitable during our lifetimes that quantum computing will make huge strides. In addition, complexity theory has shown that there are several cases where classical computers perform better than quantum computers. IBM quantum computer developers state that quantum computing will probably never completely eliminate classical computers. Instead, in the future we may see a hybrid chip that relies on quantum transistors for certain tasks and classical transistors for others, depending on which one is more appropriate.


Deltec Bank, Bahamas – Quantum Computing Will have Positive Impacts on Portfolio Optimization, Risk Analysis, Asset Pricing, and Trading Strategies -…

Quantum computing is expected to be the new technology, fully integrated with the financial sector within five to ten years. This form of computer, often described as a supercomputer, offers highly advanced processing power that takes in massive amounts of data to solve a problem in a fraction of the time it would take the best traditional computer on the market.

Traditional Computer vs. Quantum Computing

A typical computer today stores information in the form of bits, which are represented in the binary language (0s and 1s). In quantum computing, the bits are known as qubits. Qubits take on similar input but, rather than reducing it to definite 0s and 1s, can represent far more states at once, which is where the potentially enormous gains in computational speed come from.

Quantum Computing in Banking

Let's examine personal encryption in banking, for example. Using a security format called RSA-2048, traditional computers would be able to break the encryption in about 10^34 steps. With our best computers on the market, even with a processor capable of performing a trillion calculations per second, these steps translate to roughly 317 trillion years to break the secure code. While it is possible, it is not practical enough for a cyber-criminal to make it worthwhile.

A quantum computer, on the other hand, would be able to resolve this problem in about 10^7 steps. With a basic quantum computer running at one million calculations per second, this translates to ten seconds to resolve the problem.
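A quick back-of-the-envelope check of those timescales, using the step counts and processor speeds quoted above and an approximate year length:

    # Back-of-the-envelope check of the timescales implied by the step counts above.
    SECONDS_PER_YEAR = 3.15e7

    classical_seconds = 1e34 / 1e12    # ~10^34 steps at a trillion calculations per second
    quantum_seconds = 1e7 / 1e6        # ~10^7 steps at a million calculations per second

    print(f"classical: {classical_seconds:.1e} s (~{classical_seconds / SECONDS_PER_YEAR:.0e} years)")
    print(f"quantum:   {quantum_seconds:.0f} s")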

While this example centered on breaking complex security, many other use cases can emerge from the use of quantum computing.

Trade Transaction Settlements

Barclays bank researchers have been working on a proof of concept regarding the transaction settlement process. As settlements can only be worked on a transaction-by-transaction basis, they can easily queue up only to be released in batches. When a processing window opens, as many trades as possible are settled.

Trades are complex by their very nature, and traders can end up tapping into funds before the transaction has cleared. Trades will only be settled if the funds are available or if a collateral credit facility has been arranged.

While you could probably handle a small number of trades in your head, you would need to rely on a computer after about 10-20 transactions. Much the same is true of our current computational power: it is nearing the point where it will need more and more time to resolve hundreds of trades at a time.

A quantum computer using a seven-qubit system would be able to run a far greater number of complex trades in the time it takes a traditional system to complete its batch; it would take the equivalent of about two hundred traditional computers to match that speed.

Simulating a Future Product Valuation

Researchers at JP Morgan have been working on a concept that simulates the future value of a financial product. The team is testing quantum computers to perform the computationally intensive pricing calculations that normally take a traditional computer hours to complete. This is a problem because each year greater complexity is added via newer algorithms, to the point where the calculations are becoming impractical to run in any useful timeframe.

The research team has discovered that using quantum computing resulted in finding a resolution to the problem in mere seconds.

Final Thoughts

Banks are already running successful tests with quantum computing to resolve extremely resource-intensive calculations for financial problem scenarios. From trading to fraud detection to AML, this is a technology not to be overlooked.

According to Deltec Bank, Bahamas: "Quantum computing having positive impacts on portfolio optimization, risk analysis, asset pricing, and trading strategies is just the tip of the iceberg of what this technology could provide."

Disclaimer: The author of this text, Robin Trehan, has an Undergraduate degree in economics, Masters in international business and finance and MBA in electronic business. Trehan is Senior VP at Deltec International http://www.deltecbank.com. The views, thoughts, and opinions expressed in this text are solely the views of the author, and not necessarily reflecting the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd. and Long Cay Captive Management.

Media Contact

Company Name: Deltec International Group

Contact Person: Media Manager

Email: rtrehan@deltecial.com

Phone: 242 302 4100

Country: Bahamas

Website: https://www.deltecbank.com/

Source: http://www.abnewswire.com
