
How machine learning can revolutionize the quality of hearing – TechHQ

As machine learning (ML) integrates itself into almost every industry, from automotive and healthcare to banking and manufacturing, the most exciting advancements appear still to come. Machine learning, a subset of artificial intelligence (AI), has been among the most significant technological developments in recent history, with few fields possessing the same potential to disrupt such a wide range of industries.

And while many applications of ML technology go unseen, companies are harnessing its power in countless new and intriguing ways. That said, ML's revolutionary impact is perhaps most striking when it is put to use on age-old problems.

Hearing loss is not a new condition by any means, and people have suffered from it for centuries. The first electric hearing aid was designed in 1898 by Miller Reese Hutchison, and the first commercially manufactured hearing aids were introduced in 1913. With an estimated 48 million Americans experiencing some form of hearing loss, hearing aids can be a lifeline for many who struggle with the quality of their hearing.

And while it may seem hard to believe, today's most prevalent hearing aids can be painful to wear, having been designed 50-100 years ago. In response to this stagnant area of development, ML is being leveraged alongside deep learning and advanced signal processing techniques at a level of detail previously impossible.

Through the application of software-based solutions, ML algorithms can power hearing aids to detect, predict, and suppress unwanted background noise. Neural network models take structured and unstructured data and augment it with other data sets relating to the spectrum of age, language, and voice types. The data is then refined by being fed into neural network training, which begins a process of ongoing product improvement.
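None of the vendors publish their model internals, but the core idea of frequency-domain noise suppression can be sketched with a classical baseline. The snippet below is a hand-rolled NumPy illustration; the function name, frame size, and threshold factor are all assumptions for the example, not any company's actual algorithm. A learned model would, in effect, replace the fixed per-bin threshold with one inferred from training data.

```python
import numpy as np

def spectral_gate(signal, noise_sample, frame=256, factor=1.5):
    """Suppress stationary background noise by zeroing frequency bins
    whose magnitude falls below a threshold derived from a noise-only
    sample. A classical baseline, not a trained neural model."""
    # Estimate a per-bin noise floor from the noise-only sample.
    noise_mag = np.abs(np.fft.rfft(noise_sample[:frame]))
    out = np.zeros_like(signal)
    for start in range(0, len(signal) - frame + 1, frame):
        spec = np.fft.rfft(signal[start:start + frame])
        mask = np.abs(spec) > factor * noise_mag   # keep bins above the floor
        out[start:start + frame] = np.fft.irfft(spec * mask, n=frame)
    return out
```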

In an interview with Forbes, Andre Esteva, Head of Medical AI at Salesforce, says that the limits of traditional approaches have been exposed by the manual processes involved: acquiring data, molding it into a usable format, and preparing basic algorithms to be deployed to devices. ML training protocols, on the other hand, Esteva says, automatically process data before updating themselves for redeployment.

The effect is a significant reduction in product feedback cycles and an increase in the range of capabilities available. "The beauty of this approach is that the underlying intelligence improves over time as the neural nets go through iterative training," added Esteva.

As of today, several companies provide AI-powered hearing aids. The most recent is Whisper, a startup that has obtained around US$50 million in funding as it prepares to put its first product into production. Whisper's AI-powered hearing aids self-tune over time, continually improving their performance.

Elsewhere, MicroTech claims its Essentia Edge product scans environments and makes adjustments to boost speech intelligibility, while Widex's Evoke hearing aids combine real-time input with sounds previously learned from users and millions of other listening domains. The goal of introducing machine learning technologies in healthcare is to enhance the experience of patients and users. As intelligent, innovative solutions continue to emerge in a field full of noise, the buzz around revolutionary tech only seems to get louder.

Optimize manufacturing with AI, machine learning and digitalization – The Manufacturer

From production planning and mechanical engineering to project planning, advanced technologies and digitalization effectively mitigate challenges while optimizing key processes.

INFORM Software Corporation, a leading provider of AI-based optimization software that improves decision-making, processes, and resource management across diverse industries, will be hosting three free webinars focused on optimizing manufacturing and assembly processes in sectors that include machine building, marine and aircraft engines, turbines, generators, fluid power and air motors, pumps, hydraulics, and industrial cranes, where a high diversity of end products leads to complex planning processes. A short video of the company's intelligent production planning solution is also available.

The first webinar, scheduled for Tuesday, 1 June 2021, 17:00-17:30 CEST (11:00-11:30 AM EDT), is on "Production planning of the future: The road to digitalization, AI and machine learning."

It will demonstrate how companies can use AI and machine learning in target environments to increase efficiency and improve planning in their production. Attendees will learn how to capitalize on their data by making decisions with the assistance of intelligent solutions that increase planning reliability and schedule adherence in the long term; how machine learning can help to accurately predict replenishment lead times; and how to optimally schedule production despite limited capacities.

INFORM Software Corporation Chief Operating Officer Justin Newell noted, "By leveraging AI and machine learning, manufacturers can derive key benefits, including shorter throughput times, cost savings, improved materials management, optimized procurement and identification of critical paths within the supply chain, and how to mitigate potential problems."

The second webinar, on Tuesday, 8 June 2021, 17:00-17:30 CEST (11:00-11:30 AM EDT), will cover "Digitalization in mechanical engineering: Mastering complexity in production."

Its focus will be the growing challenges faced by mechanical and plant engineers stemming from increasing production variants, smaller batch sizes, and daily changes often introduced on short notice. Takeaways will include advice on how to master these daily challenges and leverage an intelligent planning solution to achieve delivery reliability, realistic scheduling, and agility within manufacturing and assembly planning processes.

The topic of the last webinar, held on Tuesday, 15 June 2021, 17:00-17:30 CEST (11:00-11:30 AM EDT), will be "Increased transparency in project and assembly planning," focused on machine and plant engineering.

"Machine and plant engineering demands transparency with up-to-date information, visualization of the impact of postponements on production processes and orders, synchronization of work lists to capacities and materials, as well as a reduction in missing parts, residue and routine work, all of which can be achieved using intelligent planning solutions," continued Newell. Through this webinar, participants will learn how to achieve their project management goals while planning and delivering parallel projects on schedule. "They will gain insights relating to centralized planning of all resources, from employees and facility space to materials and machines, and how to achieve real-time synchronization with their ERP systems," added Newell.

Register here for the free webinars.

Does the date not work for you? Register anyway to receive the presentations after the webinars are held.

Savills, MRI Software Announce Expanded Partnership to Accelerate AI and Machine Learning Capabilities for Knowledge Cubed – KPVI News 6

NEW YORK, April 8, 2021 /PRNewswire/ -- Global real estate services firm Savills today announced that it has expanded its global partnership with MRI Software, a worldwide leader in real estate software.

As part of the extended agreement, Savills will expand its integration of MRI's artificial intelligence-powered data abstraction tool, MRI Contract Intelligence powered by Leverton AI, to include MRI ProLease, a cloud-based solution for lease administration, lease accounting, lease analysis and workplace management. The integrated capabilities will deliver enhanced real estate management applications to clients using Savills' award-winning business intelligence platform, Knowledge Cubed.

"Corporate occupiers require access to data historically locked away in leases to effectively manage and reevaluate their portfolios," said Saurabh Abhyankar, MRI Software's chief product officer. "The expanded integration of Leverton AI automates and simplifies the complex data extraction process, enabling Savills clients to easily access and analyze data from leases, contracts and legal documents."

Savills and MRI will leverage a jointly developed data model to accelerate document abstraction and structuring for corporate occupiers. The proprietary machine-learning algorithm will allow smaller teams to quickly set up digital applications within Knowledge Cubed and highlight actionable insights to enable better management of real estate portfolios.

"By integrating our algorithm within Knowledge Cubed applications, we are able to provide clients an unparalleled speed and scale advantage that helps analyze portfolios in real time with access to the source documents in one click," said Patrick McGrath, Savills chief information officer and head of client technologies.

The MRI partnership continues Savills' ongoing investment in innovative client technologies and data partnerships. Launched in 2016, Knowledge Cubed brings together key technologies (e.g., machine learning, cloud, IoT, big data, mobile apps, cybersecurity, and digital contracts) to help clients better understand and optimize global real estate and human capital investments.

During the last 12 months, Savills expanded and signed new partnerships with key partners such as Matterport, CoStar, and CompStak to further invest in the best-in-class technology and data platform designed for corporate occupiers.

About Savills Inc.

As one of the world's leading property advisors, Savills' services span the globe, with 39,000 experts working across 600 offices in the Americas, Europe, Asia Pacific, Africa and the Middle East. Sharply skilled and fiercely dedicated, the firm's integrated teams of consultants and brokers are experts in better real estate. With services in tenant representation, workforce and incentives strategy, workplace strategy and occupant experience, project management, and capital markets, Savills has elevated the potential of workplaces around the corner, and around the world, for 160 years and counting.

For more information, please visit Savills.us and follow us on LinkedIn, Twitter, Instagram and Facebook.

View original content to download multimedia: http://www.prnewswire.com/news-releases/savills-mri-software-announce-expanded-partnership-to-accelerate-ai-and-machine-learning-capabilities-for-knowledge-cubed-301265002.html

SOURCE Savills

27 million galaxy morphologies quantified and cataloged with the help of machine learning – Penn Today

Research from Penn's Department of Physics and Astronomy has produced the largest catalog of galaxy morphology classifications to date. Led by former postdocs Jesús Vega-Ferrero and Helena Domínguez Sánchez, who worked with professor Mariangela Bernardi, this catalog of 27 million galaxy morphologies provides key insights into the evolution of the universe. The study was published in Monthly Notices of the Royal Astronomical Society.

The researchers used data from the Dark Energy Survey (DES), an international research program whose goal is to image one-eighth of the sky to better understand dark energy's role in the accelerating expansion of the universe.

A byproduct of this survey is that the DES data contains many more images of distant galaxies than other surveys to date. "The DES images show us what galaxies looked like more than 6 billion years ago," says Bernardi.

And because DES has millions of high-quality images of astronomical objects, it's the perfect dataset for studying galaxy morphology. "Galaxy morphology is one of the key aspects of galaxy evolution. The shape and structure of galaxies have a lot of information about the way they were formed, and knowing their morphologies gives us clues as to the likely pathways for the formation of the galaxies," Domínguez Sánchez says.

Previously, the researchers had published a morphological catalog for more than 600,000 galaxies from the Sloan Digital Sky Survey (SDSS). To do this, they developed a convolutional neural network, a type of machine learning algorithm, that was able to automatically categorize whether a galaxy belonged to one of two major groups: spiral galaxies, which have a rotating disk where new stars are born, and elliptical galaxies, which are larger, and made of older stars which move more randomly than their spiral counterparts.
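The classification step described above can be pictured with a toy forward pass. This is an illustrative NumPy sketch with placeholder, untrained weights, not the authors' actual network (their model is a deep convolutional neural network with many layers and trained parameters):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution: slide the kernel across the image,
    taking a dot product at each position. This is the basic operation
    a CNN uses to detect local structure such as spiral arms."""
    kh, kw = kernel.shape
    h, w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def predict_spiral(image, kernel, weight, bias):
    """Toy one-filter 'network': convolution -> ReLU -> global average
    pool -> sigmoid, returning a number in [0, 1] analogous to the
    catalog's probability that a galaxy is spiral."""
    feature = np.maximum(conv2d(image, kernel), 0.0).mean()
    return 1.0 / (1.0 + np.exp(-(weight * feature + bias)))
```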

But the catalog developed using the SDSS dataset was primarily made up of bright, nearby galaxies, says Vega-Ferrero. In their latest study, the researchers wanted to refine their neural network model to classify fainter, more distant galaxies. "We wanted to push the limits of morphological classification and try to go beyond, to fainter objects or objects that are farther away," Vega-Ferrero says.

To do this, the researchers first had to train their neural network model to classify the more pixelated images in the DES dataset. They started with a training set of previously known morphological classifications, comprising 20,000 galaxies that overlapped between DES and SDSS. Then, they created simulated versions of new galaxies, mimicking what the images would look like if the galaxies were farther away, using code developed by staff scientist Mike Jarvis.

Once the model was trained and validated on both simulated and real galaxies, it was applied to the DES dataset, and the resulting catalog of 27 million galaxies includes information on the probability of an individual galaxy being elliptical or spiral. The researchers also found that their neural network was 97% accurate at classifying galaxy morphology, even for galaxies that were too faint to classify by eye.

"We pushed the limits by three orders of magnitude, to objects that are 1,000 times fainter than the original ones," Vega-Ferrero says. "That is why we were able to include so many more galaxies in the catalog."

"Catalogs like this are important for studying galaxy formation," Bernardi says of the significance of this latest publication. "This catalog will also be useful to see if the morphology and stellar populations tell similar stories about how galaxies formed."

For the latter point, Domínguez Sánchez is currently combining their morphological estimates with measures of the chemical composition, age, star-formation rate, mass, and distance of the same galaxies. Incorporating this information will allow the researchers to better study the relationship between galaxy morphology and star formation, work that will be crucial for a deeper understanding of galaxy evolution.

Bernardi says that there are a number of open questions about galaxy evolution that both this new catalog, and the methods developed to create it, can help address. The upcoming LSST/Rubin survey, for example, will use similar photometry methods to DES but will have the capability of imaging even more distant objects, providing an opportunity to gain even deeper understanding of the evolution of the universe.

Mariangela Bernardi is a professor in the Department of Physics and Astronomy in the School of Arts & Sciences at the University of Pennsylvania.

Helena Domínguez Sánchez is a former Penn postdoc and is currently a postdoctoral fellow at the Instituto de Ciencias del Espacio (ICE), which is part of the Consejo Superior de Investigaciones Científicas (CSIC).

Jesús Vega-Ferrero is a former Penn postdoc and currently a postdoctoral researcher at the Instituto de Física de Cantabria (IFCA), which is part of the Consejo Superior de Investigaciones Científicas (CSIC).

The Dark Energy Survey is supported by funding from the Department of Energy's Fermi National Accelerator Laboratory, the National Center for Supercomputing Applications, and the National Science Foundation's NOIRLab. A complete list of funding organizations and collaborating institutions is available on the Dark Energy Survey website.

This research was supported by NSF Grant AST-1816330.

PODCAST: rise of the machine (learning) – BlueNotes

Jason is working on a few of the complex processes we've been wanting to automate for some time now, and he's seeing some positive results.

"[We're] looking to automate the home loan process - very document driven - trying to condense that, trying to extract data they can send into our decision systems for me to make a decision," Jason says.

"The really exciting part is in today's world, using the old school techniques [such as neural networks and gradient boosted models], we can make a decision after all those processes have been conducted within four seconds."

A faster decision means customers don't need to find supplementary documentation or spend time waiting for approval. They can get their answer and focus on what's important: getting into their new home.

But it's not just the home loan process that's seen the benefit of new technologies. Our Institutional team has been using machine learning for the past few years, and Sreeram says even three years ago the team saw the promise the tool held. Now, they're seeing results.

"I'm excited because it is really good for our staff. You know, there's so much value added from an individual point of view because banking can be notoriously paper intensive," he says.

"This is a combination of technologies and capabilities. The machine now [handles] the transfer of paper to image, the quality and accuracy of imaging, the ability to read, the ability to interpret and then the ability to process; this is coming together for the first time, at least in my career."

"We have seen cases where 50 per cent of the manual effort before has been [removed]. We have seen cases where our internal times have improved roughly 40 to 50 per cent. So I think it's absolutely made things better."

Although, as with any technology, Sreeram reminds us that it comes with its challenges and requires caution and careful governance.

Rackspace Technology Works with Brave Software to Improve Machine Learning Functionality in the Web Browser – Yahoo Finance

SAN ANTONIO, April 08, 2021 (GLOBE NEWSWIRE) -- Rackspace Technology (NASDAQ: RXT), a leading end-to-end, multicloud technology solutions company, today announced its relationship with Brave Software, which provides a free, open-source, private and secure web browser for PC, Mac, and mobile environments.

Brave gives users a fast and private web experience, helps advertisers achieve better conversions and increases publishers' revenue share. Its machine learning functionality helps match advertisements to the Brave Ads content categories in which Brave users would have the most interest, while preserving user privacy.

Brave worked with Onica, an AWS Premier Consulting Partner and a Rackspace Technology company, to improve the scalability of Brave's software, increase the team's efficiency, and reduce infrastructure costs by 50 percent. Rackspace Technology used a wide range of AWS services to build cloud infrastructure tailored to Brave's needs.

Before working with Rackspace Technology, Braves processes for training and deploying machine learning models were slower, involving manual steps spanning several days. Brave needed a more robust pipeline and fully automated processes.

"Working with Rackspace Technology and AWS was beneficial to the continued success and scaling of Brave," said Jimmy Secretan, VP of Services and Operations, Brave Software. "It substantially improved the way we created and deployed new models, which has helped us to be much more responsive to advertisers' needs."

"Rackspace Technology is one of only a few providers to have achieved AWS Machine Learning Competency status," said Jeff Deverter, CTO, Solutions at Rackspace Technology. "This unique combination of expertise in AWS services and machine learning made us an ideal partner for Brave."

To learn more about Rackspace Technology's work and capabilities, please visit http://www.rackspace.com.

About Rackspace Technology

Rackspace Technology is a leading end-to-end multicloud technology services company. We can design, build and operate our customers' cloud environments across all major technology platforms, irrespective of technology stack or deployment model. We partner with our customers at every stage of their cloud journey, enabling them to modernize applications, build new products and adopt innovative technologies.


Media Contact
Natalie Silva
Rackspace Technology Corporate Communications
publicrelations@rackspace.com

i.MX 8M plus eval kit with machine learning and voice and vision capabilities – Electropages

09-04-2021 | Mouser Electronics | Design & Manufacture

Mouser now stocks the i.MX 8M Plus evaluation kit from NXP Semiconductors. The comprehensive kit offers a complete evaluation platform for the new i.MX 8M Plus embedded multi-core heterogeneous applications processors, the first in the family to combine a dedicated NPU for advanced machine learning inference at the edge in industrial and IoT applications.

The evaluation kit comprises a compact compute module with an onboard i.MX 8M Plus Quad processor and a larger baseboard that brings out the wide connectivity required for product evaluation. The processor integrates four Arm Cortex-A53 cores running at up to 1.8GHz, plus an 800MHz Arm Cortex-M7 core for low-power real-time processing. Using the integrated NPU, the i.MX 8M Plus processor can simultaneously run multiple highly complex neural network functions, including human pose and emotion detection, multi-object surveillance, and the recognition of more than 40,000 English words.

The kit is well suited for developing designs in applications such as surveillance, robot vision, smart retail, home health monitoring, building control, smart home, smart city, and industrial IoT.

Machine Learning in Healthcare Market: Find Out Essential Strategies to expand The Business and Also Check Working in 2021-2029 KSU | The Sentinel…

The recently released report by Market Research Inc titled Global Machine Learning in Healthcare Market is a detailed analysis that gives the reader an insight into the intricacies of various elements, such as the growth rate and the impact of socio-economic conditions on the market space. An in-depth study of these numerous components is essential, as all these aspects need to blend in seamlessly for businesses to achieve success in this industry.

Request a sample copy of this report @: https://www.marketresearchinc.com/request-sample.php?id=16640

Top key players:

Intel Corporation, IBM Corporation, Nvidia Corporation, Microsoft Corporation, Alphabet Inc (Google Inc.), General Electric (GE) Company, Enlitic, Inc., Verint Systems, General Vision, Inc., Welltok, Inc., iCarbonX

The geographical segmentation includes a study of global regions such as North America, Latin America, Asia-Pacific, Africa, the Middle East and Europe. The report also draws attention to recent advancements in technologies and certain methodologies that further help to boost business outcomes. Furthermore, it offers comprehensive data on cost structure, such as the cost of manpower, tools, technologies, and raw materials. The report is an expansive source of analytical information on different business verticals, such as type, size, applications, and end-users.

This market research report on the Global Machine Learning in Healthcare Market is an all-inclusive study of the business sector's up-to-date outlines, industry enhancement drivers, and constraints. It provides market projections for the coming years. It contains an analysis of recent augmentations in innovation, a Porter's five forces model analysis, and progressive profiles of hand-picked industry competitors. The report additionally formulates a survey of minor and full-scale factors relevant both to new applicants in the market and to those already in it, along with a systematic value chain exploration.

Get a reasonable discount on this premium report @:https://www.marketresearchinc.com/ask-for-discount.php?id=16640

Additionally, this report provides a pin-point investigation of shifting competition dynamics to keep you ahead of the competition. It offers a quick view of the different variables driving or restraining the development of the market. It helps in understanding the key product areas and their future. It guides knowledgeable business decisions by giving a complete picture of the market and by including a comprehensive analysis of market subdivisions. To sum up, it also provides certain graphics and a personalized SWOT analysis of premier market sectors.

According to the research report, the global Machine Learning in Healthcare market has gained substantial momentum over the past few years. The swelling acceptance and the escalating demand and need for this market's products are mentioned in this study, as are the factors powering their adoption among consumers. The report estimates the market taking a number of imperative parameters, such as type and application, into consideration. In addition, the geographical presence of this market has been scrutinized closely in the research study.

Further information:https://www.marketresearchinc.com/enquiry-before-buying.php?id=16640

In this study, the years considered to estimate the size of Machine Learning in Healthcare are as follows:

History Year: 2015-2019

Base Year: 2020

Forecast Year: 2021-2029

Table of Contents:

Machine Learning in Healthcare Market Overview

Impact on Machine Learning in Healthcare Market Industry

Machine Learning in Healthcare Market Competition

Machine Learning in Healthcare Market Production, Revenue by Region

Machine Learning in Healthcare Market Supply, Consumption, Export and Import by Region

Machine Learning in Healthcare Market Production, Revenue, Price Trend by Type

Machine Learning in Healthcare Market Analysis by Application

Machine Learning in Healthcare Market Manufacturing Cost Analysis

Internal Chain, Sourcing Strategy and Downstream Buyers

Marketing Strategy Analysis, Distributors/Traders

Market Effect Factors Analysis

Machine Learning in Healthcare Market Forecast (2021-2029)

Appendix

About Us

Market Research Inc is farsighted in its view and covers massive ground in global research. Local or global, we keep a close check on both markets. Trends and concurrent assessments sometimes overlap and influence each other. When we say market intelligence, we mean a deep and well-informed insight into your products, market, marketing, competitors, and customers. Market research companies are leading the way in nurturing global thought leadership. We help your product/service become the best it can be with our informed approach.

Contact Us

Market Research Inc

Kevin

51 Yerba Buena Lane, Ground Suite,

Inner Sunset San Francisco, CA 94103, USA

Call Us:+1 (628) 225-1818

Write to us: [emailprotected]

https://www.marketresearchinc.com

Graphs, quantum computing and their future roles in analytics – TechRepublic

Graphs are used in mathematics, engineering and computer science, and they are growing as a technology in IT analytics. Here's how they relate to quantum computing.


A graph is a collection of points, called vertices, and lines between those points, called edges. Graphs are used in mathematics, engineering and computer science, and they are growing as a technology in IT analytics.

"Graphs can be much more flexible than other [artificial intelligence] techniques, especially when it comes to adding new sources of data," said Steve Reinhardt, VP of product development at Quantum Computing Inc., which produces quantum computing software that operates on graphs. "For instance, if I'm storing patient data and I want to add a dimension to track the unlikely event of testing positive for coronavirus after being vaccinated, graphs only consume storage proportional to the number of patients encountering the rare event."

SEE: The CIO's guide to quantum computing (free PDF) (TechRepublic)

Graphs can be heady stuff, so let's break that down.

Database software, whether SQL or NoSQL, would be a logical technology to use if you want to plot the many different relationships between data. Analytics programs then operate on this data and how it is interrelated to derive insights that answer a specific business query.

Unfortunately, to process all of the data relationships in Reinhardt's patient example, a relational database must go through all patient records and store them in order to identify that subset of patients who tested positive for the coronavirus after being vaccinated. For an average hospital, this processing could involve hundreds of thousands of patient records and all of their multiple relationships to the coronavirus and the vaccine.

Now let's put that same problem into a graph. A graph stores the data as points (vertices) joined by lines (edges) wherever two points share a common context. This shared context enables the graph to identify the subset of patients who tested positive for COVID-19 after they had a vaccine and to store only that subset of data for processing. Because a graph can identify a subset of data through its relationships before the data gets processed, processing time is saved.
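As a toy illustration of that idea (hypothetical patient data, not a real clinical schema), the records can be stored as a bipartite graph of patient-to-attribute edges, so the subset is found by following edges rather than scanning a full patient-by-attribute table:

```python
# Bipartite graph: patient vertices on one side, attribute vertices on
# the other; an edge means the patient has that attribute.
edges = {
    "p1": {"vaccinated"},
    "p2": {"vaccinated", "positive"},
    "p3": {"positive"},
    "p4": {"vaccinated", "positive"},
}

def patients_with(edges, *attributes):
    """Return the subset of patient vertices whose edge sets contain
    every requested attribute. Storage and work scale with the edges
    actually present, not with a full patient x attribute table."""
    wanted = set(attributes)
    return {patient for patient, attrs in edges.items() if wanted <= attrs}
```

Only the edges that actually exist are stored, which is why, as Reinhardt notes, tracking a rare event consumes storage proportional to the number of patients who encountered it.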

SEE: Big data graphs are playing an important role in the coronavirus pandemic (TechRepublic)

As IT expands into more data sources for its analytics and data stores, processing will grow more complex and cumbersome. This is where a combination of graphs and quantum computing will one day be able to process data faster than traditional methods.

"Graphs have a rich set of well-understood techniques for analyzing them," Reinhardt said. "Some of these are well-known from analyzing graphs that occur naturally, such as the PageRank algorithm that Google originally used to gauge the importance of web pages, and the identification of influencers in social networks. This is why we are focused on making these algorithms more practically usable."
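PageRank itself is a good example of such a well-understood graph technique: each page's importance is the long-run share of a random surfer who mostly follows links and occasionally jumps to a random page. A minimal power-iteration version (a textbook sketch, not Google's production implementation) fits in a few lines:

```python
import numpy as np

def pagerank(adjacency, damping=0.85, iterations=100):
    """Power-iteration PageRank. adjacency[i][j] = 1 means page j
    links to page i; damping is the probability of following a link
    rather than jumping to a random page."""
    a = np.asarray(adjacency, dtype=float)
    # Normalize columns so each page splits its vote among its out-links.
    col_sums = a.sum(axis=0)
    col_sums[col_sums == 0] = 1.0          # avoid dividing by zero for dangling pages
    m = a / col_sums
    n = m.shape[0]
    rank = np.full(n, 1.0 / n)             # start with equal importance
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * m @ rank
    return rank
```

On a three-page cycle every page ends up equally important, while a page that everyone links to accumulates the highest rank.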

That sounds good to IT, where the challenge is understanding enough about graphs and quantum computing to put them to use.

SEE: Research: Quantum computing will impact the enterprise, despite being misunderstood (TechRepublic)

"The goal is to develop solutions so users need to know nothing about the details of quantum computers, including low-level architectural features such as qubits, gates, circuits, couplers and QUBOs," Reinhardt said. "Today, quantum processors are almost never faster than the best classical methods for real-world problems, so early users need to have appropriate expectations. That said, the performance of quantum processors has been growing dramatically, and the achievement of quantum advantage, superior quantum performance on a real-world problem, may not be far off, so organizations that depend on a computing advantage will want to be prepared for that event."

And that is the central point: While graphs and quantum computing are still nebulous concepts to many IT professionals, it isn't too early to start placing them on IT roadmaps, since they will certainly play roles in future analytics.


615 Million Euros Awarded to Quantum Delta NL for Quantum Research in the Netherlands – HPCwire

April 9, 2021 Quantum Delta NL, a research programme in which Leiden University participates, has been awarded 615 million euros from the National Growth Fund to help develop the Netherlands into a top player in quantum technology. This has been announced at the presentation of the honoured proposals in The Hague.

Quantum Delta NL is a cooperation of companies and research institutes in which the research has been organised in five hubs at the universities of Delft, Leiden, Amsterdam, Twente and Eindhoven.

The research group Applied Quantum Algorithms (aQa) at the Leiden institutes for physics and computer science develops quantum algorithms for chemistry and materials science applications, in cooperation with Google, Shell, Volkswagen and Total.

Great enthusiasm

"Research into quantum computing has been going on for twenty years, bringing real-world applications ever closer," says Carlo Beenakker, professor in Theoretical Physics and Deputy Chair of Quantum Delta NL. "I see great enthusiasm in my students to apply abstract concepts from quantum physics to the solution of practical problems. This is the revolutionary technology of their generation."

The goal of aQa is to make quantum algorithms practically applicable to questions of societal and economic relevance. "We cooperate closely with our industrial partners to make these large investments as useful as possible," says computer science researcher Vedran Dunjko. Recently, he published in the journal Nature about artificial intelligence implemented on quantum computers.

Quantum technology

Quantum Delta NL's ambition is to position the Netherlands as a Silicon Valley for quantum technology in Europe over the coming seven years. The programme provides for the further development of the quantum computer and the quantum internet, which will be open to end users in business and societal sectors, including education.

It aims for a flourishing ecosystem where talent is fostered at all levels, and where cooperation happens over institutional borders to develop a new European high-tech industry.

Source: Leiden University
