
Study: Our universe may be part of a giant quantum computer – The Next Web

A pair of physicists from Immanuel Kant Baltic Federal University (IKBFU) in Russia recently proposed an entirely new view of the cosmos. Their research takes the wacky idea that we're living in a computer simulation and mashes it up with the mind-boggling many-worlds theory to say that, essentially, our entire universe is part of an immeasurably large quantum system spanning uncountable multiverses.

When we think about quantum systems, like IBM's and Google's quantum computers, we usually imagine a device that's designed to work with subatomic particles (qubits) to perform quantum calculations.

These computers may one day perform advanced calculations that classical computers today can't, but for now they're useful as a way to research the gap between classical and quantum reality.

Artyom Yurov and Valerian Yurov, the IKBFU researchers behind the aforementioned study, posit that everything in the universe, including the universe itself, should be viewed as a quantum object. This means that to experience quantum reality we don't need to look at subatomic particles or qubits: we're already there. Everything is quantum!

Yurov and Yurov begin their paper by stating that they've turned currently popular theoretical physics views on their head:

"We present a new outlook on the cosmology, based on the quantum model proposed by Michael and Hall. In continuation of the idea of that model we consider finitely many classical homogeneous and isotropic universes whose evolutions are determined by the standard Einstein-Friedmann equations but that also interact with each other quantum-mechanically."

The paper goes on to mathematically describe how our entire universe is, itself, a quantum object. This means, like a tiny subatomic particle, it exhibits quantum properties that should include superposition. Theoretically, our universe should be able to be in more than one place or state at a time, and that means there simply must be something out there for it to interact with even if that means it uses jaw-droppingly unintuitive quantum mechanics to interact with itself in multiple states simultaneously.

The problem with expanding quantum mechanics to large objects, like, say, a single cell, is that other theoretical quantum features stop making as much sense. In this case decoherence, or how quantum objects collapse from multiple states into the physical state we see in our classical observations, doesn't seem to pass muster at the cosmic scale.

Yurov and Yurov have a simple solution for that: they state unequivocally in their work that "there is no such thing as decoherence."

According to an article from Sci-Tech Daily, lead author on the paper Artyom Yurov said:

"Back in the days I was skeptical about the idea, because it is known that the bigger an object is, the faster it collapses. Even a bacterium collapses extremely fast, and here we are talking about the Universe. But here [Pedro Gonzales Diaz, a late theoretical physicist whose work partially inspired this study] asked me: 'What does the Universe interact with?' and I answered: 'Nothing.' There is nothing but the Universe and there is nothing it can interact with."

But the more Yurov and Yurov explored the many interacting worlds (MIW) theory, which says all quantum functions manifest physically in alternate realities (the cat is dead in one world, alive in another, and dancing the cha-cha in another, etc.), the more they realized it not only makes sense, but the math and science seem to work out better if you assume everything, the universe included, has quantum features.

Per the study:

"This implies that the reason the quantum phenomena are so fragile has nothing to do with a collapse of a wave function (whatever that means); in fact, such an object as a wave function is inessential and can be completely avoided in the MIW formalism. No, the existence of quantum phenomena relies solely on the mutual positions of the neighbouring worlds: when they are sufficiently close, the quantum potential is alive and kicking; when they depart, the quantum potential abates and the particles become effectively classical again."
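The "quantum potential between neighbouring worlds" picture can be illustrated with a deliberately crude numerical toy. This is not the authors' model: the softened 1/r^3 repulsion, the trap strength, and the step sizes below are arbitrary choices for illustration only. A handful of classical "worlds" in a harmonic trap, pushed apart by a repulsion that dies off as the worlds separate, spread into a quantum-looking cloud, while far-apart worlds barely feel each other:

```python
import numpy as np

def interworld_force(x, eps=0.1):
    """Toy stand-in for the MIW 'quantum potential': a softened pairwise
    repulsion between worlds that decays like 1/r^3 at large separation."""
    diff = x[:, None] - x[None, :]        # pairwise separations
    f = diff / (diff**2 + eps) ** 2       # softened repulsive kernel
    np.fill_diagonal(f, 0.0)              # no self-interaction
    return f.sum(axis=1)

def relax(x, steps=5000, dt=1e-3, k=1.0):
    """Overdamped dynamics: harmonic trap (-k*x) plus inter-world repulsion."""
    for _ in range(steps):
        x = x + dt * (-k * x + interworld_force(x))
    return x

# Eleven nearly coincident worlds spread into a cloud of nonzero width,
# loosely mimicking quantum zero-point spread.
worlds = relax(np.linspace(-0.1, 0.1, 11))

# Two far-separated worlds barely interact: the "classical" limit.
residual = abs(interworld_force(np.array([0.0, 10.0]))[0])
```

In the toy, the spread of the worlds plays the role of quantum uncertainty, and the vanishing force between distant worlds mirrors the quoted claim that particles "become effectively classical" when neighbouring worlds depart.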

The researchers then used their assumptions to come up with calculations that expand the many-worlds theory to encompass multiple universes, or multiverses. The big idea here is that, if the universe is a quantum object, it must interact with something, and that something is probably other universes.

But what the research doesn't explain is why our universe and everything in it would exist as something analogous to a single qubit in a gigantic quantum computer spanning multiple universes simultaneously. If humans aren't the magical observers who cause the quantum universe to collapse into classical reality by measuring it, we might instead be cogs in the machine: maybe the universe is a qubit, maybe we're the qubits. Perhaps we're just noise that the universes ignore while they go about their calculations.

Maybe we do live in a computer simulation after all. But instead of being some advanced creature's favorite NPCs, we're just bits of math that help the operating system run.

You can read the Yurov duo's paper, "The day the universes interacted: quantum cosmology without a wave function," on Springer.


First quantum computing conference to take place in Cambridge – Cambridge Independent

Riverlane is inviting submissions for contributed talks at next year's inaugural Quantum Computing Theory in Practice (QCTIP) 2020 conference, which will take place in Cambridge.

Talks will be selected on the basis of scientific excellence and workshop-friendliness. Topics will include applications and architectures of quantum computers; quantum algorithms; quantum compilation and circuit optimisation; quantum error correction and fault tolerance; simulation of quantum systems; theory of near-term quantum computing; and verification of quantum devices.

QCTIP has emerged from a series of Heilbronn quantum algorithms meetings hosted in Bristol and Cambridge since 2010.

Themes to be explored at the event, on April 6-8 at the Centre for Mathematical Sciences, start with the theory of the whole quantum software stack, followed by practical aspects of running experiments on current NISQ devices, and finally scaling up to more, higher-quality qubits.

The programme committee includes Srinivasan Arunachalam of MIT/IBM Research and chair Iordanis Kerenidis, CNRS senior researcher/QCWare. Speakers from IBM Research, Google and Oxford University have been invited.

All submissions for talks must be made online through the EasyChair submission system.

Riverlane, based at St Andrews House in the centre of town, is run by Dr Steve Brierley, who spent 10 years researching algorithms and architectures for quantum computers, most recently as a senior research fellow in applied mathematics at the University of Cambridge. The company writes software for quantum computers, with the software being run on a quantum computer based at Oxford Quantum Circuits.

"There's only 50 quantum computers currently around, and it's likely to remain a limited number," says Dr Brierley. "Quantum computers are very good at certain things: you won't see one on your phone any time soon, though it might be used to make the chips on the phone run faster.

"It costs several million pounds to buy the components to build a quantum computer and you have to get the staff; there are very few people who know how to build one. We work with companies that already use computational modelling in design, for instance Merck, which has a performance materials division which includes everything from lip gloss to organic LEDs in a TV."

Dr Brierley's expertise was recently called upon by the Guardian to resolve a bit of a spat between Google and IBM. Google announced its Sycamore quantum processor had performed a specific task in 200 seconds that would take the world's best supercomputer 10,000 years to complete, meaning it had achieved quantum supremacy by exceeding the potential of traditional devices. But in a blog post, IBM researchers said the result should be treated with a large dose of scepticism due to the complicated nature of benchmarking against an appropriate metric.

Dr Brierley told the Guardian: "It's clearly an amazing achievement. I think this is going to be one of those moments when people look back and say, 'That was the time that really changed this field of quantum computing.' It is also a great moment in time to stop talking about quantum supremacy, which has unfortunate historical connotations, and move on to talking about quantum advantage, which has a useful application."

Quantum computing is so new that there isn't a standard operating system, so Riverlane is writing one.

"It's quite difficult because if you write software for one quantum computer it won't work on any other, so we're currently developing an operating system, which we expect to be complete within 18 months as an initial product," Dr Brierley told the Cambridge Independent. "The challenge in the sector is what is the best way to build a quantum computer, and this operating system will remove the uncertainty."


World High Performance Computing (HPC) Market Outlook Report, 2019-2024 – HPC Will Be Integral to Combined Classical & Quantum Computing Hybrid…

DUBLIN, Nov. 28, 2019 /PRNewswire/ -- The "High Performance Computing (HPC) Market by Component, Infrastructure, Services, Price Band, HPC Applications, Deployment Type, and Region 2019-2024" report has been added to ResearchAndMarkets.com's offering.

This report evaluates the HPC market including companies, solutions, use cases, and applications. Analysis includes HPC by organizational size, software and system type, server type and price band, and industry verticals. The report also assesses the market for integration of various artificial intelligence technologies in HPC.

It also evaluates the exascale-level HPC market including analysis by component, hardware type, service type, and industry vertical. The report provides HPC market sizing by component, hardware type, service type, and industry vertical from 2019 to 2024.

High Performance Computing (HPC) refers to high speed computation, which may be provided via a supercomputer or via parallel processing techniques such as leveraging clusters of computers to aggregate computing power. HPC is well-suited for applications that require high performance data computation and analysis such as high frequency trading, autonomous vehicles, genomics-based personalized medicine, computer-aided design, deep learning, and more. Specific examples include computational fluid dynamics, simulation, modeling, and seismic tomography.
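The split-and-aggregate pattern behind cluster computing can be sketched in miniature with Python's standard multiprocessing pool. This is an illustrative toy, not any vendor's API: the work unit (a sum of squares) and the chunking scheme are arbitrary stand-ins for real HPC workloads:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Work unit handled by one 'node': sum of squares over a sub-range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def clustered_sum(n, workers=4):
    """Split [0, n) into chunks, farm them out to workers, aggregate results."""
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(clustered_sum(1_000_000))
```

A real HPC cluster performs the same decomposition across many machines (typically with MPI or a batch scheduler), with the interconnect and scheduler doing the heavy lifting.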

No longer solely the realm of supercomputers, HPC is increasingly provided via cluster computing. By way of example, Hewlett Packard Enterprise (HPE) provides a computational clustering solution in conjunction with Intel that represents HPC Infrastructure as a Service (IaaS). This particular HPC IaaS offering environment provides customized tenant clusters tailored to client and application requirements. Key to this particular solution is the intelligent use of APIs, which enable a high degree of flexibility and what HPE refers to as Dynamic Fabric Configuration.

HPC capabilities are often used to solve very specific problems for large institutions. Examples include financial services organizations, government R&D facilities, university research, etc. However, the cloud-based as-a-Service model allows HPC capabilities to be extended via HPC-as-a-Service (HPCaaS) to a much wider range of industry verticals and companies, thereby providing computational services to solve a much broader array of problems. Industry use cases are increasingly emerging that benefit from HPC-level computing, many of which benefit from split processing between a localized device/platform and HPCaaS.

Today, HPC is universally associated with classical computing. While quantum computing does not utilize a faster clock speed than classical computing, it is much faster than traditional computing infrastructure for solving certain problems, as quantum computers can handle exponentially larger data sets. Accordingly, quantum computing is well-positioned to support certain industry verticals and solve certain problems, such as cybersecurity and cryptocurrencies, that rely upon prime factoring. Current classical computing technologies would take an inordinate amount of time to break down the prime factors that support cryptology and blockchain technology.
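The prime-factoring point is easy to make concrete. Classical trial division takes on the order of sqrt(n) steps, which is hopeless for the hundreds-of-digits moduli used in RSA-style cryptography, whereas Shor's algorithm on a sufficiently large quantum computer would factor them in polynomial time. A minimal classical sketch (the numbers are small illustrative examples):

```python
def trial_factor(n):
    """Classical trial division: O(sqrt(n)) divisions in the worst case.
    Fine for small n; utterly infeasible for cryptographic-size semiprimes."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

print(trial_factor(2021))   # (43, 47)
print(trial_factor(10403))  # (101, 103)
```

Doubling the number of digits in n roughly squares the work here, which is exactly the scaling wall that quantum factoring sidesteps.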

Due to the limitations of quantum computing, and the evolution of HPC, we see a future in which hybrid systems utilize both quantum and classical CPUs on the same computing platform. These next generation computing systems will provide the best of both worlds: high-speed general-purpose computing combined with use-case-specific ultra-performance for certain tasks that will remain outside the range of binary computation for the foreseeable future. Mind Commerce sees this combined general-purpose and use-case-specific computation solving many industry problems in a more scalable and economical manner.

Key Topics Covered

1 Executive Summary

2 Introduction
2.1 Next Generation Computing
2.2 High Performance Computing
2.2.1 HPC Technology
2.2.1.1 Supercomputers
2.2.1.2 Computer Clustering
2.2.2 Exascale Computation
2.2.2.1 United States
2.2.2.2 China
2.2.2.3 Europe
2.2.2.4 Japan
2.2.2.5 India
2.2.2.6 Taiwan
2.2.3 High Performance Technical Computing
2.2.4 Market Segmentation Considerations
2.2.5 Use Cases and Application Areas
2.2.5.1 Computer Aided Engineering
2.2.5.2 Government
2.2.5.3 Financial Services
2.2.5.4 Education and Research
2.2.5.5 Manufacturing
2.2.5.6 Media and Entertainment
2.2.5.7 Electronic Design Automation
2.2.5.8 Bio-Sciences and Healthcare
2.2.5.9 Energy Management and Utilities
2.2.5.10 Earth Science
2.2.6 Regulatory Framework
2.2.7 Value Chain Analysis
2.2.8 AI to Drive HPC Performance and Adoption

3 High Performance Computing Market Analysis and Forecast
3.1 Global High Performance Computing Market 2019 - 2024
3.1.1 Total High Performance Computing Market
3.1.2 High Performance Computing Market by Component
3.1.2.1 High Performance Computing Market by Hardware and Infrastructure Type
3.1.2.1.1 High Performance Computing Market by Server Type
3.1.2.2 High Performance Computing Market by Software and System Type
3.1.2.3 High Performance Computing Market by Professional Service Type
3.1.3 High Performance Computing Market by Deployment Type
3.1.4 High Performance Computing Market by Organization Size
3.1.5 High Performance Computing Market by Server Price Band
3.1.6 High Performance Computing Market by Application Type
3.1.6.1 High Performance Technical Computing Market by Industry Vertical
3.1.6.2 Critical High Performance Business Computing Market by Industry Vertical
3.1.1 High Performance Computing Deployment Options: Supercomputer vs. Clustering
3.1.2 High Performance Computing as a Service (HPCaaS)
3.1.3 AI Powered High Performance Computing Market
3.1.3.1 AI Powered High Performance Computing Market by Component
3.1.3.2 AI Powered High Performance Computing Market by AI Technology
3.2 Regional High Performance Computing Market 2019 - 2024
3.3 Exascale Computing Market
3.3.1 Exascale Computing Driven HPC Market by Component
3.3.2 Exascale Computing Driven HPC Market by Hardware Type
3.3.3 Exascale Computing Driven HPC Market by Service Type
3.3.4 Exascale Computing Driven HPC Market by Industry Vertical
3.3.1 Exascale Computing as a Service

4 High Performance Computing Company Analysis
4.1 HPC Vendor Ecosystem
4.2 Leading HPC Companies
4.2.1 Amazon Web Services Inc.
4.2.2 Atos SE
4.2.3 Advanced Micro Devices Inc.
4.2.4 Cisco Systems
4.2.5 DELL Technologies Inc.
4.2.6 Fujitsu Ltd
4.2.7 Hewlett Packard Enterprise
4.2.8 IBM Corporation
4.2.9 Intel Corporation
4.2.10 Microsoft Corporation
4.2.11 NEC Corporation
4.2.12 NVIDIA
4.2.13 Rackspace Inc.

5 Conclusions and Recommendations

6 Appendix: Future of Computing
6.1 Quantum Computing
6.1.1 Quantum Computing Technology
6.1.2 Quantum Computing Considerations
6.1.3 Market Challenges and Opportunities
6.1.4 Recent Developments
6.1.5 Quantum Computing Value Chain
6.1.6 Quantum Computing Applications
6.1.7 Competitive Landscape
6.1.8 Government Investment in Quantum Computing
6.1.9 Quantum Computing Stakeholders by Country
6.1 Other Future Computing Technologies
6.1.1 Swarm Computing
6.1.2 Neuromorphic Computing
6.1.3 Biocomputing
6.2 Market Drivers for Future Computing Technologies
6.2.1 Efficient Computation and High Speed Storage
6.2.2 Government and Private Initiatives
6.2.3 Flexible Computing
6.2.4 AI enabled, High Performance Embedded Devices, Chipsets, and ICs
6.2.5 Cost Effective Computing powered by Pay-as-you-go Model
6.3 Future Computing Market Challenges
6.3.1 Data Security Concerns in Virtualized and Distributed Cloud
6.3.2 Funding Constrains R&D Activities
6.3.3 Lack of Skilled Professionals across the Sector
6.3.4 Absence of Uniformity among NGC Branches including Data Format

For more information about this report visit https://www.researchandmarkets.com/r/j1jwus

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For EST Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com


Threat of quantum computing hackathon to award $100,000 – App Developer Magazine

Communique Laboratory Inc. launched its quantum hackathon tackling the threat of quantum computing. Cybersecurity companies, computer science students, and hackers have begun challenging the Company's quantum-safe encryption in a $100,000 hackathon.

The Company hosted an innovation celebration event with technology presentations from industry experts in artificial intelligence and cybersecurity. Andrew Cheung, 01 Communique's CEO, was one of the presenters, addressing business people, students, and hackers on the threat quantum computers present with respect to keeping data safe. He revealed the purpose behind the hackathon and why he is confident enough to offer a $100,000 prize.

Andrew Cheung enthusiastically described the hackathon challenge: "Our hackathon will show the world that our encryption is rock-solid. We are the only Canadian company and the first post-quantum encryption provider to offer a prize of $100,000. We have invested over three years in developing our IronCAP technology with a development team that has a combined 50 years of experience in code-based encryption. We are very confident that our technology will withstand any attempt by any participant to crack the code in our hackathon."
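Code-based encryption, the family IronCAP is described as belonging to (McEliece is the classic example), hides a message by deliberately adding errors that only the holder of the secret code structure can efficiently correct. The sketch below uses a tiny Hamming(7,4) code purely to show that encode/add-error/correct round trip; it is in no way IronCAP's actual construction, and real schemes use large, disguised codes precisely because a single-error Hamming code like this is trivially breakable:

```python
import numpy as np

# Hamming(7,4): encodes 4 message bits into 7, corrects any single bit flip.
G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix [I4 | P]
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix [P^T | I3]
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encrypt(m, error_pos):
    """'Ciphertext' = codeword plus a deliberate error (the secret noise)."""
    c = (m @ G) % 2
    c[error_pos] ^= 1
    return c

def decrypt(c):
    """Syndrome decoding: the syndrome equals the H-column at the error."""
    s = (H @ c) % 2
    c = c.copy()
    if s.any():
        pos = next(i for i in range(7) if np.array_equal(H[:, i], s))
        c[pos] ^= 1                    # correct the flipped bit
    return c[:4]                       # recover the message bits

msg = np.array([1, 0, 1, 1])
cipher = encrypt(msg, error_pos=2)
recovered = decrypt(cipher)
```

The security of real code-based schemes rests on the hardness of decoding a general-looking linear code without knowledge of its hidden structure, a problem believed hard even for quantum computers.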

The Company expects contestants from around the world to challenge its quantum-safe encryption. The hackathon is available online globally; anyone who has a Google or Facebook account can sign up to participate. Contestants will be given 30 days to crack IronCAP's code. A cash prize of $100,000 will be awarded to the first person (if any) who is able to break the encryption. Participants are required to submit a paper describing the method used to crack the encryption.

Innovative people working in tech along with researchers, computer scientists, students, and hackers are encouraged to sign up for the hackathon. The contest closes on December 12, 2019. Results will be announced on or about December 16, 2019.


ETU "LETI" wins the Bertrand Meyer Award for the first time – QS WOW News

The 15th Software Engineering Conference Russia 2019 (SECR 2019), a key annual event in this field in Eastern Europe, took place on November 14-15 in St. Petersburg; more than 500 participants from the industry took part. Traditionally, the winners of the Bertrand Meyer Award were announced at the closing of the conference. Bertrand Meyer is a professor at Politecnico di Milano and Innopolis University and the original designer of the Eiffel method and language. The prize is awarded annually for the best paper presented in the SECR program.

This year, the award was shared between two works by the decision of the program committee. One of them is "Automated Generation of Quantum Circuit Specifications Based on Reed-Muller Expressions" by Vitaly Kalmychkov and Irina Matveeva, associate professors of the Department of Computer Aided Design of ETU LETI.

Vitaly Kalmychkov presented the paper at the conference. The presentation was devoted to current trends in quantum computing, a promising area for the creation and programming of quantum computers. Vitaly Kalmychkov spoke about the experience of developing a modular system for the automatic generation of quantum circuit specifications, used as the basis for the logical representation of a quantum computing process. He also proposed methods for the automatic minimization of quantum circuits based on scalable templates, their evaluation, and automatic verification. Today, quantum technologies are already used in telecommunications (security, cryptography), fast computing (artificial intelligence and processing of large amounts of data), and the modeling of complex systems (physical, chemical) and materials in medicine.

"In our study, we offer automation of the design process of quantum circuit specifications, based on a combination of classical mathematical foundations with a common approach to the development of quantum algorithms based on a set of CkNOT converters with multiple control, taking into account the nearest-neighbor architecture. Our toolkit provides automatic compilation of all possible variants of quantum circuits using Reed-Muller expressions, including automatic modes for switching to the linearly nearest neighbor while minimizing the number of SWAP converters based on the scalable patterns that we have implemented, automatic statistics collection, visualization, and lexical verification of the equivalence of quantum circuit compilation results. In general, this allows us to choose from all automatically generated specifications the variants of quantum circuits that fit various criteria."
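The "linearly nearest neighbor" constraint mentioned above is easy to illustrate: on hardware where only adjacent qubits can interact, a CNOT between distant qubits must be bracketed by SWAP gates, and minimizing that overhead is exactly what template-based tools target. Below is a deliberately naive router, not the authors' Reed-Muller-based toolchain, just the SWAP-counting idea under a linear qubit layout:

```python
def route_linear(gates):
    """Naively map CNOTs onto a linear nearest-neighbour architecture by
    inserting SWAPs until control and target are adjacent (then swapping
    back), and report the SWAP overhead.  Real compilers minimise this."""
    routed, swaps = [], 0
    for ctrl, targ in gates:
        lo, hi = sorted((ctrl, targ))
        for q in range(lo, hi - 1):            # walk one qubit toward the other
            routed.append(("SWAP", q, q + 1))
            swaps += 1
        routed.append(("CNOT", hi - 1, hi))    # now the pair is adjacent
        for q in range(hi - 2, lo - 1, -1):    # undo to restore the layout
            routed.append(("SWAP", q, q + 1))
            swaps += 1
    return routed, swaps

circuit = [(0, 3), (1, 2)]      # CNOT(q0 -> q3), CNOT(q1 -> q2)
routed, overhead = route_linear(circuit)
print(overhead)                 # 4: the distant pair costs 2 SWAPs each way
```

A real compiler would also reorder gates and choose qubit placements to cut this count further, which is where automatic minimization over scalable templates pays off.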

Today, the work of ETU LETI researchers is highly relevant. Quantum computers are shifting from the realm of scientific interest and research laboratories to the mass user. On October 23, 2019, one of Google's divisions announced achieving quantum supremacy: the company introduced a quantum algorithm that solves the problem of generating a random sequence on a quantum processor. IBM provides a cloud service for those wishing to implement quantum algorithms on an existing quantum computer. Rosatom announced a large-scale project to create a Russian quantum computer.

Also at the conference, Vladimir Litoshenko, a graduate of the Faculty of Computer Science and Technology of ETU LETI in 2006 and Deputy General Director of First Line Software, presented the practical results of using the IBM Q cloud quantum platform for quantum computing.

"There was a friendly, creative atmosphere at the conference, which was made possible by the organizing committee with the involvement of volunteers, among whom were ETU LETI students," says Vitaly Kalmychkov, Associate Professor of the Department of Computer Aided Design of ETU LETI.

In total, researchers submitted more than 150 applications to the conference. After a careful selection, organizers accepted 99 papers on topics including programming tools, cloud services, the Internet of Things, development team management, and others.


Global Quantum Computing Market is Set to Experience Revolutionary Growth With +25% CAGR by 2025 | Top Players D-Wave Systems Inc., QX Branch, Google…

Quantum Computing Market is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. It is the use of quantum mechanical phenomena such as superposition and entanglement to perform computation. A quantum computer is used to perform such computation, which can be implemented theoretically or physically.
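The superposition and entanglement named above can be shown in a few lines of linear algebra: a Hadamard gate puts one qubit into superposition, and a CNOT entangles it with a second, leaving only the correlated outcomes 00 and 11 possible. A minimal state-vector sketch:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
CNOT = np.array([[1, 0, 0, 0],                               # basis order:
                 [0, 1, 0, 0],                               # 00, 01, 10, 11
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# |00> -> (H on qubit 0) -> CNOT yields the Bell state (|00> + |11>)/sqrt(2)
state = CNOT @ np.kron(H @ ket0, ket0)
probs = np.abs(state) ** 2
print(probs.round(3))   # only outcomes 00 and 11, each with probability 0.5
```

Measuring either qubit instantly fixes the other's outcome, and that correlation, with no 01 or 10 component, is the entanglement quantum algorithms exploit.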

The Quantum Computing Market is expected to grow at a CAGR of more than 25% during the forecast period 2019-2025.

According to the Market Research Inc research report, the Quantum Computing Market is likely to expand substantially over the forthcoming years, and a growing set of market drivers is projected to add to that growth in the near future. The worldwide market is analyzed on the basis of its various segments and its geographical reach, and the report describes how those segments are propelling the market and which factors are boosting demand.

Request a Sample Quantum Computing Market Research Report at:

https://www.marketresearchinc.com/request-sample.php?id=16150

Key players in the Quantum Computing market include:

D-Wave Systems Inc. (Canada), QX Branch (US), International Business Machines Corporation (US), Cambridge Quantum Computing Limited (UK), 1QB Information Technologies (Canada), QC Ware, Corp. (US), StationQ- Microsoft (US), Rigetti Computing (US), Google Inc. (US), River Lane Research (US).

Avail Discount up to 40%

https://www.marketresearchinc.com/ask-for-discount.php?id=16150

On the geographical front, the global market is classified into Europe, Asia-Pacific, Middle East & Africa, North America, and Latin America. The leading region of this global market and the region which is projected to continue its dominance over the forthcoming years is given in the study. The key driving force behind the growth of this market in the near future is also presented.

For the product type segment, this report lists the main product types of the Quantum Computing market globally and in China.

For end use/application segment, this report focuses on the status and outlook for key applications. End users are also listed.

Ask Your Queries or Requirements:

https://www.marketresearchinc.com/enquiry-before-buying.php?id=16150

In this study, the years considered to estimate the size of Quantum Computing are as follows:

History Year: 2013-2018

Base Year: 2018

Estimated Year: 2019

Forecast Year: 2019-2025

Table of Content:

Market Research Inc studies the Quantum Computing market status and outlook of global and major regions from the angles of players, countries, product types, and end industries; the report analyzes the top players in the global market and splits the Quantum Computing market by product type and application/end industry.

If you have any special requirements for this report, please let us know and we can provide a custom report.

About Us:

Market Research Inc is farsighted in its view and covers massive ground in global research. Local or global, we keep a close check on both markets. Trends and concurrent assessments sometimes overlap and influence the other. When we say market intelligence, we mean a deep and well-informed insight into your products, market, marketing, competitors, and customers. Market research companies are leading the way in nurturing global thought leadership. We help your product/service become the best they can with our informed approach.

Studying consumer behavior, changing preference patterns and events that impact different courses and flow of businesses and their corresponding markets, is our forte. Once we join hands with you, what you do, will be guided by our expertise, every step of the way.

Contact Us:

Market Research Inc.

Kevin

51 Yerba Buena Lane,

Ground Suite, Inner Sunset San Francisco,

CA 94103, USA.

+1(628) 225-1818

Write to us at sales@marketresearchinc.com


The Best Artificial Intelligence Stocks of 2019 — and The Top AI Stock for 2020 – The Motley Fool

Artificial intelligence (AI) -- the capability of a machine to mimic human thinking and behavior -- is one of the biggest growth trends today. Spending on AI systems will increase by more than two and a half times between 2019 and 2023, from $37.5 billion to $97.9 billion, for a compound annual growth rate of 28.4%, according to estimates by research firm IDC. Other sources are projecting even more torrid growth rates.
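The growth figure can be sanity-checked with the standard CAGR formula, (end/start)^(1/years) - 1. The rounded endpoints given here ($37.5 billion in 2019 to $97.9 billion in 2023, four compounding years) imply roughly 27%, so the quoted 28.4% presumably compounds from IDC's exact base-year figures rather than these rounded ones:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# $37.5B (2019) -> $97.9B (2023): four compounding years
print(f"{cagr(37.5, 97.9, 4):.1%}")   # ~27.1%
```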

There are two broad ways you can get exposure to the AI space: investing in companies that produce and sell AI products and services, or investing in companies that primarily use AI to improve their own products and services.

With this background in mind, let's look at which AI stocks are performing the best so far this year (through Nov. 25) and which one is my choice for best AI stock for 2020.


The following chart isn't meant to be all-inclusive, as that would be impossible, and the chart has limits on the number of metrics. Notable among the companies missing are Advanced Micro Devices and Intel. They were left out largely because NVIDIA is currently the leader in supplying AI chips. While there are things to like about shares of both of these companies, NVIDIA stock is the better play on AI, in my view.

Data by YCharts.

Graphics processing unit (GPU) specialist NVIDIA (NASDAQ:NVDA), e-commerce and cloud computing service titan Amazon, computer software and cloud computer service giant Microsoft, Google parent and cloud computing service provider Alphabet, old technology guard and multifaceted AI player IBM, and Micron Technology, which makes computer memory chips and related storage products, would best be put in the first category above. They produce and sell AI-related products and/or services. They're all also probably using AI internally, with Amazon and Alphabet being notably heavy users of the tech to improve their products.

iPhone maker Apple (NASDAQ:AAPL), social media leader Facebook (NASDAQ:FB), video-streaming king Netflix, and Stitch Fix, an online personal styling service provider, would best be categorized in the second group since they're either primarily or solely using AI to improve their products and services.

Now let's look at some basic stats for the three best performers of this group.

Company | Market Cap | P/E (Forward) | Wall Street's 5-Year Estimated Average Annual EPS Growth | 5-Year Stock Return
Apple | | | |
NVIDIA | | | |
Facebook | | | |
S&P 500 | | -- | -- |

Data sources: YCharts (returns) and Yahoo! Finance (all else). P/E = price-to-earnings ratio. EPS = earnings per share. Data as of Nov. 25, 2019.

On a valuation basis alone, Facebook stock looks the most compelling when we take earnings growth estimates into account; Apple would come next, then NVIDIA. However, there are other factors to consider, with the biggie being that projected earnings growth is just that: projected.

There's a good argument to be made that NVIDIA has a great shot at exceeding analysts' earnings estimates. Why? Because it has a fantastic record of doing so, and all one needs to do is listen to enough quarterly earnings calls with Wall Street analysts to realize why this is so: a fair number of them don't seem to have a strong grasp of the company's operations and products. (I'm not knocking them, as most analysts don't have technical backgrounds, and they cover a lot of companies.)

Facebook stock probably has the potential to continue to be a long-term winner. But its relatively high regulatory risk profile makes it a poor fit for some investors. Moreover, it will likely have to keep spending a ton of money to help prevent "bad actors" from using its site for various nefarious purposes. Indeed, this is one of the major internal functions for which the company is using AI. It also uses the tech to recognize and tag uploaded images, among other things.

Apple uses AI internally in various ways, with the most consumer-facing one being powering its voice assistant Siri. It's the best of these three stocks for more conservative investors, as it has a great long-term track record and pays a modest dividend. NVIDIA, however, is probably the better choice for growth-oriented investors who are comfortable with a moderate risk level.


NVIDIA is the leading supplier of graphics cards for computer gaming, with AMD a relatively distant second. In the last several years, it has transformed itself into a major AI player, or more specifically, a force to be reckoned with in the fast-growing deep-learning category of AI. Its GPUs are the gold standard for AI training in data centers, and it's now making inroads into AI inferencing. (Inferencing involves a machine or device applying what it's learned in its training to new data. It can be done in data centers or "at the edge" -- meaning at the location of the machine or device that's collecting the data.)
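The training-versus-inferencing distinction described above can be made concrete with a toy example. The sketch below is illustrative only (a from-scratch nearest-centroid classifier, not anything NVIDIA-specific): "training" is the compute-heavy phase of learning from labeled data, while "inferencing" is the cheap phase of applying the learned model to new, unseen data.

```python
# Toy sketch of training vs. inferencing (illustrative only; real
# deep-learning training is far heavier and typically runs on GPUs).

def train(samples):
    """'Training': learn one centroid per label from labeled examples."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def infer(centroids, features):
    """'Inferencing': apply what was learned to a new, unseen sample."""
    def dist2(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], features))
    return min(centroids, key=dist2)

# Training phase: done once, on the full labeled dataset.
model = train([([0.0, 0.0], "cat"), ([0.2, 0.1], "cat"),
               ([5.0, 5.0], "dog"), ([4.8, 5.2], "dog")])

# Inference phase: cheap enough to run "at the edge", on the device itself.
print(infer(model, [4.9, 5.1]))  # -> dog
```

The asymmetry is the point: training touches every example and is done centrally, while each inference call is a small, fast lookup that can run wherever the data is collected.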

NVIDIA is in the relatively early stages of profiting from many gigantic growth trends, including AI, esports, driverless vehicles, virtual reality (VR), smart cities, drones, and more. (There is some overlap in these categories, as AI is involved to some degree in most of NVIDIA's products.) There are no pure plays on AI, to my knowledge, but NVIDIA would probably come the closest.

Read more:

The Best Artificial Intelligence Stocks of 2019 -- and The Top AI Stock for 2020 - The Motley Fool

Read More..

It Pays To Break Artificial Intelligence Out Of The Lab, Study Confirms – Forbes


Yes, artificial intelligence (AI) is proving itself to be a worthwhile tool in the business arena, at least in focused, preliminary projects. Intelligent chatbots are a classic example. Now it's a question of how quickly it can be expanded to deliver on a wider basis across the business, automating decisions around inventory or investments, for example.

There's progress on this front, as shown in McKinsey's latest survey of 2,360 executives, which shows a nearly 25 percent year-over-year increase in the use of AI in various business processes, and there has been a sizable jump in companies spreading AI across multiple processes.

"A majority of executives in companies that have adopted AI report that it has increased revenues in areas where it is used, and 44 percent say it has reduced costs," the survey's authors, Arif Cam, Michael Chui, and Bryce Hall, all with McKinsey, state.

The results also show that a small share of companies (the authors call them AI high performers) are attaining outsize business results from AI. Close to two in three companies, 63 percent, report revenue increases from AI adoption in the business units. Respondents from high performers are nearly three times likelier than their lagging counterparts to report revenue gains of more than 10 percent, the survey shows.

The leading AI use cases include marketing and sales, product and service development, and supply-chain management. In marketing and sales, respondents most often report revenue increases from AI use in pricing, prediction of likelihood to buy, and customer-service analytics, the survey's authors report. In product and service development, revenue-producing use cases include the creation of new AI-based products and new AI-based enhancements. And in supply-chain management, respondents often cite sales and demand forecasting and spend analytics as use cases that generate revenue.
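To make one of the cited use cases concrete, here is a deliberately minimal sketch of demand forecasting (not from the survey, and far simpler than the machine-learning models companies actually deploy): predict next-period demand as a moving average of recent sales.

```python
# Toy demand-forecasting sketch: a simple moving average.
# Real AI-driven forecasting uses far richer models and features;
# this only illustrates the shape of the problem.

def forecast_demand(past_sales, window=3):
    """Predict next-period demand as the mean of the last `window` periods."""
    recent = past_sales[-window:]
    return sum(recent) / len(recent)

# Example: monthly unit sales; forecast next month from the last 3 months.
print(forecast_demand([120, 135, 128, 142, 150]))  # -> 140.0
```

Even this crude baseline captures why the use case generates revenue: a forecast that tracks recent demand lets a company carry less excess inventory while stocking out less often.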

What are these high performers doing differently? Strategy is a key area. For example, 72 percent of respondents from AI high performers say their companies' AI strategy aligns with their corporate strategy, compared with 29 percent of respondents from other companies. Similarly, 65 percent from the high performers report having a clear data strategy that supports and enables AI, compared with 20 percent from other companies. Also, the application of standardized tools to be used across the enterprise is more likely to be seen at high performers.

Adoption of Strategic AI Approaches

Retraining workers is also a key differentiator, the survey shows. One-third of high performers (33%) indicate the majority of their workforce has received AI-related training over the past year, compared with 5% of lagging organizations. Over the next three years, 42% of high performers intend to extend such training to most of their workers, versus only 17% of their lagging counterparts.

For AI to take hold, the McKinsey authors urge ramping up workforce retraining. Even the AI high performers have work to do in several key areas, the survey's authors point out. Only 36 percent of respondents from these companies say their frontline employees use AI insights in real time for daily decision making. Only 42 percent report they systematically track a comprehensive set of well-defined key performance indicators for AI. Likewise, only 35 percent of respondents from AI high performers report having an active continuous learning program on AI for employees.

Read the original here:

It Pays To Break Artificial Intelligence Out Of The Lab, Study Confirms - Forbes

Read More..

Artificial intelligence will affect Salt Lake, Ogden more than most areas in the nation, study shows – KSL.com

SALT LAKE CITY – The Salt Lake and Ogden-Clearfield areas are among the top 10 regions in the United States that will be most affected by the rise of artificial intelligence, according to a study recently released by the Brookings Institution, a Washington, D.C.-based research group.

In the past, research has suggested that AI will disproportionately affect blue-collar and low-income workers, like factory employees or office clerks, who will soon find themselves replaced by machines. But past research hasn't often distinguished between the coming effects of advancements in robotics and software, and those of artificial intelligence, or computers that can plan, learn, reason and problem solve.

As robotics and software become more sophisticated, they'll replace employees in industries like manufacturing, construction or clerical work, the study claims. But artificial intelligence will change the world of the white-collar worker more than anything else, and Salt Lake and Ogden will be in the thick of it.

In fact, AI will disproportionately affect areas that specialize in industries like technology, engineering, science, transportation, manufacturing and law, the study shows. And Utah's booming tech sector has not gone unnoticed.

"Among the most AI-exposed large metro areas are San Jose, Calif., Seattle, Salt Lake City and Ogden, Utah, all high-tech centers," the study reads.

Those four tech hubs are joined in the top 10 most-affected areas by agriculture, logistics and manufacturing centers like Bakersfield, California; Greenville, South Carolina; Detroit, Michigan; and Louisville, Kentucky.

Higher educated and higher paid workers will be most affected by the rise of AI in the coming decades, and workers with bachelor's degrees will be more than five times as exposed to artificial intelligence as workers with high school degrees, the study shows.

Eventually, AI will be a significant factor in the future work lives of relatively well-paid managers, supervisors and analysts, according to the report.

"Nobody can predict the future," said Dan Ventura, a computer science professor at Brigham Young University who specializes in artificial intelligence research.

While the study's methodology and predictions are "kind of cool" and "better than nothing," they're just that: predictions, Ventura explained. And the study acknowledges its shortcomings, too.

"While the present assessment predicts areas of work in which some kind of impact is expected, it doesn't specifically predict whether AI will substitute for existing work, complement it, or create entirely new work for humans," the study reads.

AI is getting disturbingly good at pattern recognition and pattern matching, including tasks like facial recognition or medical diagnosis from images, Ventura said. But it falls short in other areas.

"AI is not good at judgement right now. And even to the extent that it is good at judgement, people don't trust it and don't know if they can trust it. So they're not going to turn that kind of thing over to AI. At least, they shouldn't," he said.

So while Ventura believes jobs that require skills like pattern recognition may be threatened, those that involve judgment calls are probably safe for a while.

"What's interesting about this (study) is the claim that they're making that, probably for the first time, this sort of displacement concern isn't focused on lower education, lower skill; it's the other kind of people that they're worried about. And I think that's pretty interesting, even if I'm not sure I buy it all the way," he said.

Ventura does predict, however, that even if AI replaces certain high-skill jobs, new jobs will pop up in response. The rise of artificial intelligence will most likely require (at least in the beginning) something like AI quality control to ensure that the new technology isn't making mistakes.


And while the rise of AI may cause some workforce casualties along the way, Ventura expects the labor market will adapt to the technological advancements, as it has throughout all of human history.

Mark Knold, chief economist of Utahs Department of Workforce Services, agrees.

His research shows that there simply aren't enough workers to maintain the size of the U.S. economy as it stands. Instead, the labor market must either allow more immigrants into the country, let the economy shrink in size, or let machines do some of the work, he said.

"Artificial intelligence won't replace workers, it will replace missing workers," he argues.

"A lot of these studies can leave you the impression with a fear of the future, Knold said. I think thats the wrong takeaway from studies like this. Theres always new technologies coming that threaten old technologies and workers in those old technologies. But yet, as time goes on, they transition to the new ones, and things are even bigger and better.

If workers are going to be ready to adapt to the change artificial intelligence brings to the workforce, education will need to adapt too, Ventura explained.

But the BYU professor believes the states educational system is already behind, even at the university level.

"In my little computer science environment, we're not out of touch with it at all," he said. "But if you look at the general education program (at BYU), there's nothing. There's no computer science (or) algorithmic stuff in general education. It's just not a thing."

Utah's fast-growing tech companies have been aware of a talent gap for a while as they scramble to find employees to fill their ever-expanding needs. But research shows that unless children are exposed to computer science at an early age, they're much less likely to choose it as a career.

While Utah is working to bring computer science to all K-12 schools in the state by 2022, it's a difficult feat, and educational curriculums don't change nearly as fast as technology.

"If this AI boom continues to happen, and technology continues to march forward, and we see some of these paradigm-shifting kinds of things, that'll just make us even more behind," Ventura said.

Continue reading here:

Artificial intelligence will affect Salt Lake, Ogden more than most areas in the nation, study shows - KSL.com

Read More..

2019 Artificial Intelligence in Precision Health – Dedication to Discuss & Analyze AI Products Related to Precision Healthcare Already Available -…

DUBLIN--(BUSINESS WIRE)--The "Artificial Intelligence in Precision Health" book from Elsevier Science and Technology has been added to ResearchAndMarkets.com's offering.

Artificial Intelligence in Precision Health: From Concept to Applications provides a readily available resource to understand artificial intelligence and its real-time applications in precision medicine in practice. Written by experts from different countries and with diverse backgrounds, the content encompasses accessible knowledge easily understandable for non-specialists in computer sciences. The book discusses topics such as cognitive computing and emotional intelligence, big data analysis, clinical decision support systems, deep learning, personal omics, digital health, predictive models, prediction of epidemics, drug discovery, precision nutrition and fitness. Additionally, there is a section dedicated to discussing and analyzing AI products related to precision healthcare already available.

Key Topics Covered:

Section I: Artificial Intelligence Technologies
1. Interpretable Artificial Intelligence: Addressing the Adoption Gap in Medicine
2. Artificial Intelligence methods in computer-aided diagnostic tools and decision support analytics for clinical informatics
3. Deep learning in Precision Medicine
4. Machine learning systems and precision medicine: a conceptual and experimental approach to single individual statistics
5. Machine learning in digital health, recent trends and on-going challenges
6. Data Mining to Transform Clinical and Translational Research Findings into Precision Health

Section II: Applications and Precision Systems/Application of Artificial Intelligence
7. Predictive Models in Precision Medicine
8. Deep Neural Networks for Phenotype Prediction: Application to rare diseases
9. Artificial Intelligence in the management of patients with intracranial neoplasms
10. Artificial Intelligence to aid the early detection of Mental Illness
11. Use of Artificial Intelligence in Alzheimer Disease Detection
12. Artificial Intelligence to predict atheroma plaque vulnerability
13. Decision support systems in cardiovascular medicine through artificial intelligence: applications in the diagnosis of infarction and prognosis of heart failure
14. Artificial Intelligence for Decision Support Systems in Diabetes
15. Clinical decision support systems to improve the diagnosis and management of respiratory diseases
16. Use of Artificial Intelligence in Neurosurgery and Otorhinolaryngology (Head and Neck Surgery)
17. Use of Artificial Intelligence in Emergency Medicine
18. Use of Artificial Intelligence in Infectious diseases
19. Artificial Intelligence techniques applied to patient care and monitoring
20. Use of artificial intelligence in precision nutrition and fitness

Section III: Precision Systems
21. Artificial Intelligence in Precision health: Systems in practice


For more information about this book visit https://www.researchandmarkets.com/r/i5n12k

Read more here:

2019 Artificial Intelligence in Precision Health - Dedication to Discuss & Analyze AI Products Related to Precision Healthcare Already Available -...

Read More..