
Neural networks facilitate optimization in the search for new materials – MIT News

When searching through theoretical lists of possible new materials for particular applications, such as batteries or other energy-related devices, there are often millions of potential materials that could be considered, and multiple criteria that need to be met and optimized at once. Now, researchers at MIT have found a way to dramatically streamline the discovery process, using a machine learning system.

As a demonstration, the team arrived at a set of the eight most promising materials, out of nearly 3 million candidates, for an energy storage system called a flow battery. This culling process would have taken 50 years by conventional analytical methods, they say, but they accomplished it in five weeks.

The findings are reported in the journal ACS Central Science, in a paper by MIT professor of chemical engineering Heather Kulik, Jon Paul Janet PhD '19, Sahasrajit Ramesh, and graduate student Chenru Duan.

The study looked at a set of materials called transition metal complexes. These can exist in a vast number of different forms, and Kulik says they "are really fascinating, functional materials that are unlike a lot of other material phases. The only way to understand why they work the way they do is to study them using quantum mechanics."

To predict the properties of any one of millions of these materials would require either time-consuming and resource-intensive spectroscopy and other lab work, or time-consuming, highly complex physics-based computer modeling for each possible candidate material or combination of materials. Each such study could consume hours to days of work.

Instead, Kulik and her team took a small number of different possible materials and used them to teach an advanced machine-learning neural network about the relationship between the materials' chemical compositions and their physical properties. That knowledge was then applied to generate suggestions for the next generation of possible materials to be used for the next round of training of the neural network. Through four successive iterations of this process, the neural network improved significantly each time, until reaching a point where it was clear that further iterations would not yield any further improvements.

This iterative optimization system greatly streamlined the process of arriving at potential solutions that satisfied the two conflicting criteria being sought. The set of best solutions in such situations, where improving one factor tends to worsen the other, is known as a Pareto front, which can be represented as a graph of the points such that any further improvement of one factor would make the other worse. In other words, the graph represents the best possible compromise points, depending on the relative importance assigned to each factor.
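
To make the Pareto idea concrete, here is a minimal sketch (not the paper's code; the candidate values below are invented) of how Pareto-optimal candidates can be identified once a model has predicted two competing properties, such as solubility and energy density, for each material:

```python
import numpy as np

def pareto_front(points):
    """Return a boolean mask marking the Pareto-optimal rows.

    Each row of `points` holds one candidate's predicted objectives
    (here: solubility, energy density), larger being better. A point is
    Pareto-optimal if no other point is at least as good on both
    objectives and strictly better on at least one.
    """
    points = np.asarray(points)
    optimal = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        dominated = (np.all(points >= points[i], axis=1)
                     & np.any(points > points[i], axis=1)).any()
        if dominated:
            optimal[i] = False
    return optimal

# Invented (solubility, energy density) predictions for five candidates.
candidates = np.array([[0.9, 0.2], [0.7, 0.5], [0.4, 0.8],
                       [0.3, 0.3],   # dominated by [0.7, 0.5]
                       [0.1, 0.9]])
print(pareto_front(candidates))  # [ True  True  True False  True]
```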

Training typical neural networks requires very large data sets, ranging from thousands to millions of examples, but Kulik and her team were able to use this iterative process, based on the Pareto front model, to streamline the process and provide reliable results using only a few hundred samples.

In the case of screening for the flow battery materials, the desired characteristics were in conflict, as is often the case: The optimum material would have high solubility and a high energy density (the ability to store energy for a given weight). But increasing solubility tends to decrease the energy density, and vice versa.

Not only was the neural network able to rapidly come up with promising candidates, it also was able to assign levels of confidence to its different predictions through each iteration, which helped to allow the refinement of the sample selection at each step. "We developed a better than best-in-class uncertainty quantification technique for really knowing when these models were going to fail," Kulik says.
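
The team's exact uncertainty quantification method isn't detailed in the article, but a common way to attach confidence to neural network predictions is a deep ensemble, where the spread across several independently trained models serves as the uncertainty estimate and the least certain candidates are labeled next. A hedged sketch on made-up data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_train = rng.random((200, 10))       # a few hundred labeled candidates
y_train = X_train @ rng.random(10)    # stand-in property labels
X_pool = rng.random((5000, 10))       # large pool of unlabeled candidates

# Train an ensemble; disagreement among members approximates uncertainty.
ensemble = [MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                         random_state=s).fit(X_train, y_train)
            for s in range(5)]
preds = np.stack([m.predict(X_pool) for m in ensemble])  # shape (5, 5000)
uncertainty = preds.std(axis=0)

# Active learning: label the candidates the ensemble is least sure about.
next_batch = np.argsort(uncertainty)[-20:]
```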

The challenge they chose for the proof-of-concept trial was materials for use in redox flow batteries, a type of battery that holds promise for large, grid-scale batteries that could play a significant role in enabling clean, renewable energy. Transition metal complexes are the preferred category of materials for such batteries, Kulik says, but there are too many possibilities to evaluate by conventional means. They started out with a list of 3 million such complexes before ultimately whittling that down to the eight good candidates, along with a set of design rules that should enable experimentalists to explore the potential of these candidates and their variations.

Through that process, the neural net "both gets increasingly smarter about the [design] space, but also increasingly pessimistic that anything beyond what we've already characterized can further improve on what we already know," she says.

Apart from the specific transition metal complexes suggested for further investigation using this system, she says, the method itself could have much broader applications. "We do view it as the framework that can be applied to any materials design challenge where you're really trying to address multiple objectives at once. You know, all of the most interesting materials design challenges are ones where you have one thing you're trying to improve, but improving that worsens another. And for us, the redox flow battery redox couple was just a good demonstration of where we think we can go with this machine learning and accelerated materials discovery."

For example, optimizing catalysts for various chemical and industrial processes is another kind of such complex materials search, Kulik says. Presently used catalysts often involve rare and expensive elements, so finding similarly effective compounds based on abundant and inexpensive materials could be a significant advantage.

"This paper represents, I believe, the first application of multidimensional directed improvement in the chemical sciences," she says. "But the long-term significance of the work is in the methodology itself, because of things that might not be possible at all otherwise. You start to realize that even with parallel computations, these are cases where we wouldn't have come up with a design principle in any other way. And these leads that are coming out of our work, these are not necessarily at all ideas that were already known from the literature or that an expert would have been able to point you to."

"This is a beautiful combination of concepts in statistics, applied math, and physical science that is going to be extremely useful in engineering applications," says George Schatz, a professor of chemistry and of chemical and biological engineering at Northwestern University, who was not associated with this work. He says this research addresses how to do machine learning when there are multiple objectives. Kulik's approach uses leading-edge methods to train an artificial neural network that is used to predict which combination of transition metal ions and organic ligands will be best for redox flow battery electrolytes.

Schatz says this method can be used in many different contexts, so it has the potential to transform machine learning, which is a major activity around the world.

The work was supported by the Office of Naval Research, the Defense Advanced Research Projects Agency (DARPA), the U.S. Department of Energy, the Burroughs Wellcome Fund, and the AAAS Marion Milligan Mason Award.

See the original post:
Neural networks facilitate optimization in the search for new materials - MIT News

Read More..

Coronavirus lockdown: 10 free online computer science courses from Harvard, Princeton & other top universities to study – Gadgets Now

As India fights the spread of coronavirus disease with 21 days of lockdown, it may be a good idea to utilise the extra time at home to learn something new. There are lots of free online computer science courses from top universities like Harvard, Princeton, Stanford, MIT and others available online which you can start anytime and learn at your own pace. Class Central, a platform for free online courses, lists out thousands of courses in computer science, business, data science, humanities and more. Here are 10 free online computer science courses from Harvard, Princeton & other top universities that you may want to consider to upskill yourself and make the most of the lockdown period. (Note that only basic or introductory courses are listed and there are thousands of free online courses available which you can try.)

Read more:
Coronavirus lockdown: 10 free online computer science courses from Harvard, Princeton & other top universities to study - Gadgets Now

Read More..

Natural Language Processing is an Untapped AI Tool for Innovation – Yahoo Finance

Natural language processing (NLP) will improve processes including technology landscaping, competitive analysis, and weak signal detection

BOSTON, March 26, 2020 /PRNewswire/ -- Innovation leaders are seeking ways to use artificial intelligence (AI) effectively to extract value and leverage data for maximum impact. Lux considers natural language processing (NLP) and topic modeling the AI tools of choice. These tools have the potential to accelerate the front end of innovation across many industries, but remain underutilized. According to Lux Research's new whitepaper, "Improving the Front End of Innovation with Artificial Intelligence and Machine Learning," NLP can improve processes including technology landscaping, competitive analysis, and weak signal detection.


NLP enables rapid analysis of huge volumes of text, which is where most of the data driving innovation lives.

"When utilized effectively, machine learning can quickly mine data to produce actionable insights, significantly decreasing the time it takes for a comprehensive analysis to be performed. An analysis that would have previously taken weeks can now be reduced to days," said Kevin See, Ph.D., VP of Digital Products for Lux Research.

The speed conferred through NLP is enabled by the comprehensiveness of topic modeling, which extracts important concepts from text while eliminating the human assumption and bias associated with it. "Previously, an investigation was hindered by either the limited knowledge or bias of the primary investigator, both of which are mitigated when using machine learning. A beneficial technology or idea is less likely to be missed due to an error in human judgement," explained See.

There are many relevant applications that use machine learning to leverage speed and comprehensiveness in innovation. Landscaping is used to build a taxonomy that defines the trends for key areas of innovation under a specific topic. Concept similarity can take one piece of content and find other relevant articles, patents, or news to accelerate the innovation process. Topic modeling can also be used for competitive portfolio analysis when applied to a corporation instead of a technology, or for weak signal detection when applied to large data sets like news or Twitter.
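
As a rough illustration of two of those applications, here is a minimal sketch (using scikit-learn on an invented four-document corpus; Lux's actual tooling is not shown) of topic modeling via non-negative matrix factorization and of concept similarity via cosine similarity:

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [  # stand-ins for patents, papers, and news articles
    "solid-state battery electrolyte patent filing",
    "machine learning accelerates battery materials discovery",
    "new catalyst lowers the cost of hydrogen production",
    "startup raises funding for hydrogen electrolyzer plants",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

# Topic modeling: factor the document-term matrix into two topics.
doc_topics = NMF(n_components=2, random_state=0).fit_transform(X)
print(doc_topics.argmax(axis=1))  # dominant topic per document

# Concept similarity: rank the other documents against the first one.
sims = cosine_similarity(X[0], X).ravel()
print(sims.argsort()[::-1])  # most similar documents first
```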

When defining a successful AI and machine learning strategy, there are a few key points to consider, including whether you'll buy or build your technology, what data sources you'll use, and how you'll leverage experts to define and interpret the data. It's also important to adopt a culture of acceptance of these tools so that valued human resources see them as an asset to their skills rather than competition. "The confidence and speed AI and machine learning bring to the decision-making process is enabling innovation to happen at a more rapid pace than ever before, but don't think this means humans are no longer needed," said See. People are still necessary to define the starting points of an analysis, label topics, and extract insights from the data collected. "It's clear that a collaboration between humans and machines can generate better results, benefiting all involved," See continued.

For more information, download a copy of Lux Research's whitepaper here.

About Lux Research


Lux Research is a leading provider of tech-enabled research and advisory services, helping clients drive growth through technology innovation. A pioneer in the research industry, Lux uniquely combines technical expertise and business insights with a proprietary intelligence platform, using advanced analytics and data science to surface true leading indicators. With quality data derived from primary research, fact-based analysis, and opinions that challenge traditional thinking, Lux empowers clients to make more informed decisions today to ensure future success.

For more information, visit http://www.luxresearchinc.com, read our blog, connect on LinkedIn, or follow @LuxResearch.

Contact Information: Jess Bonner, press@luxresearchinc.com, (617) 502-3219

View original content to download multimedia: http://www.prnewswire.com/news-releases/natural-language-processing-is-an-untapped-ai-tool-for-innovation-301030014.html

SOURCE Lux Research

See the article here:
Natural Language Processing is an Untapped AI Tool for Innovation - Yahoo Finance

Read More..

Udacity offers free tech training to laid-off workers due to the coronavirus pandemic – CNBC

A nanodegree in autonomous vehicles is just one of 40 programs that Udacity is offering for free to workers laid off in the wake of the COVID-19 pandemic.

Udacity

Online learning platform Udacity is responding to the COVID-19 pandemic by offering free tech training to workers laid off as a result of the crisis.

On Thursday the Mountain View, California-based company revealed that in the wake of layoffs and furloughs by major U.S. corporations, including Marriott International, Hilton Hotels and GE Aviation, it will offer its courses, known as nanodegrees, for free to individuals in the U.S. who have been let go because of the coronavirus. The average price for an individual signing up for a nanodegree is about $400 a month, and the degrees take anywhere from four to six months to complete, according to the company.

The hope is that while individuals wait to go back to work, or in the event that the layoff is permanent, they can get training in fields that are driving so much of today's digital transformation. Udacity's courses include artificial intelligence, machine learning, digital marketing, product management, data analysis, cloud computing, and autonomous vehicles, among others.

Gabe Dalporto, CEO of Udacity, said that over the past few weeks, as he and his senior leadership team heard projections of skyrocketing unemployment numbers as a result of COVID-19, he felt the need to act. "I think those reports were a giant wake-up call for everybody," he says. "This [virus] will create disruption across the board and in many industries, and we wanted to do our part to help."


Dalporto says Udacity is funding the scholarships completely and that displaced workers can apply for them at udacity.com/pledge-to-americas-workers beginning March 26. Udacity will take the first 50 eligible applicants from each company that applies, and within 48 hours individuals should be able to begin the coursework. Dalporto says the offer will be good for the first 20 companies that apply and that "after that we'll evaluate and figure out how many more scholarships we are going to fund."

The company also announced this week that any individual, regardless of whether they've been laid off, can enroll for free in any one of Udacity's 40 different nanodegree programs. Users will get the first month free when they enroll in a monthly subscription, but Dalporto pointed out that many students can complete a course in a month if they dedicate enough time to it.

Udacity's offerings at this time underscore the growing disconnect between the skills workers have and the talent that organizations need today and in the years ahead. The company recently signed a deal with Royal Dutch Shell, for instance, to provide training in artificial intelligence. Shell says about 2,000 of its 82,000 employees have either expressed interest in the AI offerings or have been approached by their managers about taking the courses on everything from Python programming to training neural networks. Shell says the training is completely voluntary.


And as more workers lose their jobs in the wake of the COVID-19 pandemic, it will be even more crucial that they're able to reenter the job market armed with the skills companies are looking for. According to the World Economic Forum's Future of Jobs report, at least 54% of all employees will need reskilling and upskilling by 2022. Yet only 30% of employees at risk of job displacement because of technological change received any training over the past year.

"America is facing a massive shortage of workers with the right technical skills, and as employers, retraining your existing workforce to address that shortage is the most efficient, cost-effective way to fill those gaps in an organization," Dalporto says. "The great irony in the world right now is that at the same time that a lot of people are going to lose their jobs, there are areas in corporations where managers just can't hire enough people for jobs in data analytics, cloud computing and AI."

Dalporto, who grew up in West Virginia, says he sees this point vividly every time he revisits his hometown. "When I go back, I see so many businesses and companies boarded up and people laid off because they didn't keep pace with automation and people didn't upskill," he says. As a result, many of these workers wind up in minimum wage jobs and that "just creates a lot of pain for them and their families," he adds. What's happening now is only fueling that cycle, one that Dalporto says can be minimized with the right action.

"Laying people off is never an easy decision, but companies have to move the conversation beyond how many weeks of severance they're going to offer," he says. "We have to be asking how are we going to help them get the skills they need to be successful in their careers moving forward when this is all behind us."

Continue reading here:
Udacity offers free tech training to laid-off workers due to the coronavirus pandemic - CNBC

Read More..

How AI Can Realize The Promise Of Adaptive Education – Forbes

Making education more effective, equitable and available to every child in the world has long been the holy grail for education technologists and entrepreneurs, who have developed countless solutions, like massive open online courses. Now, with advances in artificial intelligence (AI), we can take this a step further and pursue adaptive education.

The core promise of adaptive education is an intelligent, 1:1 experience for every student. Research has long established that students who receive personalized, one-on-one tutoring from human teachers outscore their peers on final exams. As early as 1984, Dr. Benjamin Bloom, the late educational psychologist, showed that students who receive personalized tutoring score higher than almost all of their fellow students who attend standard classroom lectures.

The implications are dramatic: We can build a computer-based tutor for millions of students around the world.

Adaptive education represents a paradigm shift from the conventional model (an instructor-centric, passive learning experience) to an intelligent one (a learner-centric, interactive, active learning experience). In the adaptive model, each student is paired with a virtual coach. It's a concept that can be scaled to millions of students at a fraction of the cost of human tutors.

The full spectrum of technology innovation in education has long been understood using the SAMR framework: substitution, augmentation, modification and redefinition. Adaptive education powered by AI, not surprisingly, fits into the final stage of redefinition, where transformative experiences truly happen and learning itself is redefined. (For more information on these perspectives, check out Rewiring Education, for which I co-wrote a Chinese version.)

How far away are we from putting adaptive learning platforms to work? In order to deliver on their promise, companies must first build an adaptive learning engine that can acquire, store and analyze data. At my company, we have invested heavily in the most advanced machine learning algorithms and continue to conduct research that can improve not only our products but also the AI and science that drives them.


For example, one machine learning model we developed, called a multidimensional probabilistic knowledge state, uses data from learners to measure their proficiency across a wide range of variables. The application module that uses an AI algorithm can measure a student's approximate proficiency level on each of 500 knowledge factors using just 25-30 questions.
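
The company's model is proprietary, but the flavor of a probabilistic knowledge state can be sketched with a Bayesian update in the spirit of knowledge tracing: maintain a mastery probability for each knowledge factor, update it after every answer, and always probe where the model is least certain. This toy version treats the 500 factors as independent; covering them all with 25-30 questions additionally requires modeling correlations between factors, which is omitted here.

```python
import numpy as np

SLIP, GUESS = 0.1, 0.2   # wrong answer despite mastery / lucky guess

def update(p, correct):
    """Bayes update of P(mastered) for one factor after one answer."""
    p_right = p * (1 - SLIP) + (1 - p) * GUESS
    return p * (1 - SLIP) / p_right if correct else p * SLIP / (1 - p_right)

rng = np.random.default_rng(0)
belief = np.full(500, 0.5)                # uninformative prior per factor

for _ in range(30):                       # a 25-30 question adaptive session
    k = int(np.argmin(np.abs(belief - 0.5)))  # probe the most uncertain factor
    answer = bool(rng.random() < 0.7)     # stand-in for the student's response
    belief[k] = update(belief[k], answer)
```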

For years, conventional education has been defined, in part, by geography and the financial means to pursue college degrees. New types of online learning in recent years have been helpful in overcoming some of those limitations, but they haven't solved the core challenge that adaptive AI can: shifting the teaching model from one-to-many to one-to-one.

Ultimately, I believe adaptive systems, combining AI and human coaching, represent our best hope for giving all students the chance to reach their full potential.


See the rest here:
How AI Can Realize The Promise Of Adaptive Education - Forbes

Read More..

Micron Gains as Cloud Strength Boosts its Earnings and Guidance – TheStreet

While Micron's (MU) sales to some major customers are under pressure, its sales to some others are holding up pretty well right now.

After the bell on Wednesday, the memory giant reported February quarter (fiscal second quarter) revenue of $4.8 billion (down 18% annually) and non-GAAP EPS of $0.45, topping consensus analyst estimates of $4.69 billion and $0.37.

Micron also guided for May quarter revenue of $4.6 billion to $5.2 billion and non-GAAP EPS of $0.40 to $0.70. The midpoints of those wider-than-usual guidance ranges are slightly above consensus estimates of $4.87 billion and $0.52.

With markets having been on edge about the COVID-19 pandemic's impact on Micron's sales, the numbers are going over well. As of the time of this article, Micron's stock is up 5.3% in after-hours trading to $44.75. Shares are now up 44% from a March 18 low of $31.13, albeit still down 27% from a Feb. 12 52-week high of $61.19.

In its prepared remarks, Micron did caution that it expects smartphone, consumer electronics and automotive demand to be below prior expectations during the second half of its fiscal 2020 (which ends in August 2020). However, the company also noted it has seen higher notebook demand as more workers and students work and learn from home, and that demand from data center end-markets (already on the upswing in recent months thanks to a cloud capex rebound) is strong as usage for various online/cloud services grows, and is even leading to shortages.

In addition, Micron reported that China's COVID-19 pandemic weighed on its sales to consumer electronics clients and caused factory shutdowns for some clients. However, it added local data center demand was strong, and that Chinese smartphone production volumes have begun to rebound.

Micron's near-term demand outlook. Source: Micron.

Micron's demand commentary has some things in common with what GPU giant and Micron graphics DRAM client Nvidia (NVDA) shared on a Tuesday conference call. Among other things, Nvidia said it's seeing strong demand for GPUs going into notebooks and cloud servers, and indicated Chinese demand has begun normalizing. Micron, for its part, disclosed that its bit shipments of GDDR6 graphics DRAM (used by some of Nvidia's GPUs) rose over 40% sequentially last quarter.

With full-year supply and demand trends quite uncertain right now, Micron chose not to provide calendar 2020 outlooks for DRAM and NAND flash memory bit supply and demand growth -- either for itself or the memory industry at-large.

In December, Micron guided for DRAM industry bit demand to grow by a mid-teens percentage this year, and for NAND bit demand to grow by a high-20s to low-30s percentage. Now, Micron is merely reiterating long-term guidance for DRAM bit demand to see a mid-to-high teens compound annual growth rate (CAGR), and for NAND bit demand to see a roughly 30% CAGR.

Micron is also for now reiterating fiscal 2020 capital spending guidance of $7 billion to $8 billion, while adding that it's evaluating its capex plans for calendar 2020. Several chip equipment makers, including Applied Materials (AMAT), have withdrawn their quarterly guidance, while noting that recent lockdown orders have impacted their manufacturing operations.

Micron received a few questions on its earnings call about inventory levels -- both its own and those of its customers.

Micron's inventory rose by $300 million sequentially to $5.2 billion, leading its days of inventory to rise by 13 to 134. The company insisted that much of this growth was due to seasonality and the holding of additional NAND inventory ahead of a technology transition. However, it also reported building its raw materials stockpiles due to supply chain uncertainty.

Separately, Micron admitted that just as it's stockpiling raw materials, some of its customers could be stockpiling memory products, and that these efforts could be masking weakening end-market demand.

Following a sharp downturn in late 2018 and early 2019, DRAM and (especially) NAND pricing trends have seen meaningful improvement, as industry capex cuts make themselves felt. And for now at least, Micron insists memory pricing trends remain favorable.

During the February quarter, Micron's DRAM average selling price (ASP) was roughly flat sequentially, after having dropped by a high-single-digit percentage in its November quarter. NAND ASP rose by a high-single-digit percentage, after having risen by a low-single-digit percentage in the November quarter.

Excerpt from:
Micron Gains as Cloud Strength Boosts its Earnings and Guidance - TheStreet

Read More..

Cloud-Based Security Tool Adoption: Latest Research Findings – MSSP Alert

Most organizations have migrated security tools to the cloud, according to a survey from security information & event management (SIEM) provider Exabeam.

by Dan Kobialka Mar 25, 2020

Approximately 58% of organizations have migrated at least a quarter of their security tools to cloud-based options, according to a survey from security information and event management (SIEM) platform provider Exabeam. In addition, 33% said they have moved more than half of their security tools to cloud options.

Exabeam's survey surfaced several other findings as well.

The survey also revealed that cloud-based security tools are being used to protect many different types of data.

Cloud-based security tools may be beneficial, but organizations must still maintain visibility into their cloud services, Exabeam Security Strategist Sam Humphries stated. In doing so, these organizations can use cloud-based security tools to enjoy the functionality of traditional on-premise security solutions, along with reduced costs and maintenance issues.

Organizations that want to implement cloud-based security can leverage Exabeam SaaS Cloud, which is available for hosting in 15 locations across several regions.

SaaS Cloud helps organizations identify cyber threats and meet compliance and policy requirements, the company indicated. It also provides data lake, behavioral analytics, case management, security orchestration and incident response automation capabilities.

Read the original here:
Cloud-Based Security Tool Adoption: Latest Research Findings - MSSP Alert

Read More..

COVID-19 puts corporate WFH capabilities to the test – SC Magazine

While many organizations already have telecommute policies and solutions in place, they are most commonly for either fully-remote workers or for employees who typically work in the office but need flexibility for unusual situations. The current environment most companies now face may put their remote workplace capabilities to the test.

This is most pronounced when considering security controls, cyber-hygiene, and reducing risk exposure that a more remote workforce creates. Are organizations prepared for such a distributed workforce and the potential risks that come with it?

When it comes to IT administration teams, outsourced IT, and third-party vendors who might have privileged access to systems and infrastructure, they need secure, granular access to critical infrastructure resources regardless of location and without the hassles of a virtual private network (VPN). Ideally, how privileged users access these systems shouldn't be different, regardless of whether they are in an on-premise data center or accessing remotely.

Ditch the VPN

Last year it was reported that Citrix was breached through a password spraying attack that also sought to leverage VPN access. Ars Technica also reported last year that energy companies have specifically become targets of attacks that use password spraying and VPN hacking.

Unlike a VPN, which generally gives users visibility into the entire network, organizations should grant access only on a per-resource basis. This gives privileged internal IT admins access to only as much infrastructure as necessary, while limiting access by an outsourced team to only the servers and network hardware their role requires.

Privileged users should authenticate through Active Directory, LDAP, or whatever the authoritative identity store is; organizations can also grant granular, federated privileged access to resources for business partners and third-party vendors.

Guard against cyber-attacks by combining risk-level with role-based access controls, user context and MFA to enable intelligent, automated and real-time decisions for granting privileged access to users who are remotely accessing servers, on password checkout or when using a shared account to log into remote systems.
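
As a hypothetical sketch of that decision logic (not Centrify's actual API; the role names, resources, and thresholds below are invented), per-resource entitlements can be combined with a runtime risk score to allow access, step up to MFA, or deny:

```python
from dataclasses import dataclass

ROLE_GRANTS = {  # per-resource entitlements, not whole-network visibility
    "internal-it": {"db-prod-01", "web-prod-01", "core-switch"},
    "outsourced-noc": {"core-switch"},
}

@dataclass
class AccessRequest:
    role: str
    resource: str
    risk_score: float   # 0.0 = normal context, 1.0 = highly anomalous

def decide(req: AccessRequest) -> str:
    if req.resource not in ROLE_GRANTS.get(req.role, set()):
        return "deny"                 # no entitlement for this resource
    if req.risk_score >= 0.8:
        return "deny"                 # too anomalous even with step-up auth
    if req.risk_score >= 0.3:
        return "allow-with-mfa"       # require multi-factor authentication
    return "allow"

print(decide(AccessRequest("outsourced-noc", "core-switch", 0.5)))  # allow-with-mfa
print(decide(AccessRequest("outsourced-noc", "db-prod-01", 0.1)))   # deny
```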

Secure Privileged Access for On-Site and Remote Administration

Here are six ways any organization can create consistency in their privileged access management (PAM) approaches to secure remote access to data center and cloud-based infrastructures through a cloud-based service or on-premises deployment.

Nate Yocom is Chief Technology Officer at Centrify

Link:
COVID-19 puts corporate WFH capabilities to the test - SC Magazine

Read More..

Supermicro Accelerates AI and Deep Learning with NGC-Ready Servers – insideHPC

Today Supermicro announced the industry's broadest portfolio of validated NGC-Ready systems optimized to accelerate AI and deep learning applications. Supermicro is highlighting many of these systems today at the Supermicro GPU Live Forum in conjunction with NVIDIA GTC Digital.

Supermicro NGC-Ready systems allow customers to train AI models using NVIDIA V100 Tensor Core GPUs and to perform inference using NVIDIA T4 Tensor Core GPUs. NGC hosts GPU-optimized software containers for deep learning, machine learning and HPC applications, pre-trained models, and SDKs that can run anywhere the Supermicro NGC-Ready systems are deployed, whether in data centers, the cloud, edge micro-datacenters, or in distributed remote locations as environment-resilient and secured NVIDIA-Ready for Edge servers powered by the NVIDIA EGX intelligent edge platform.

"With over 26 years of experience delivering state-of-the-art computing solutions, Supermicro systems are the most power-efficient, the highest performing, and the best value," said Charles Liang, CEO and president of Supermicro. "With support for fast networking and storage, as well as NVIDIA GPUs, our Supermicro NGC-Ready systems are the most scalable and reliable servers to support AI. Customers can run their AI infrastructure with the highest ROI."

Supermicro currently leads the industry with the broadest portfolio of NGC-Ready Servers optimized for data center and cloud deployments and is continuing to expand its portfolio. In addition, the company offers five validated NGC-Ready for Edge servers (EGX) optimized for edge inferencing applications.

"NVIDIA's container registry, NGC, enables superior performance for deep learning frameworks and pre-trained AI models with state-of-the-art accuracy," said Ian Buck, vice president and general manager of Accelerated Computing at NVIDIA. "The NGC-Ready systems from Supermicro can deliver users the performance they need to train larger models and provide low latency inference to make critical, real-time business decisions."

As the leader in AI system technology, Supermicro offers multi-GPU optimized thermal designs that provide the highest performance and reliability for AI, deep learning, and HPC applications. With 1U, 2U, 4U, and 10U rackmount NVIDIA GPU systems as well as GPU blade modules for our 8U SuperBlade enclosure, Supermicro offers the industry's best and widest selection of GPU systems.


View original post here:
Supermicro Accelerates AI and Deep Learning with NGC-Ready Servers - insideHPC

Read More..

Discover aspects of the Hybrid Cloud Market as its value achieves $171,926 million with CAGR 21.7% – WhaTech Technology and Markets News

Growing need for higher computational power and increasing demand across various organizations to enhance their IT service management capabilities without addition of servers would boost the growth of the global hybrid cloud market.

Increasing need for more computational power among organizations and growing awareness about the benefits of hybrid cloud drive the growth of the global hybrid cloud market. However, lurking concerns about data privacy and security hamper the market growth.

On the other hand, rapid increase in adoption of hybrid cloud among small- and large-sized companies and augmented demand among various organizations to boost its IT service management capabilities without the addition of servers are expected to create lucrative opportunities for the market players in near future.

The key market players analyzed in the report include Microsoft Corporation, VMware, Inc., Hewlett Packard Enterprise, Dell EMC, Google LLC, Cisco Systems, Inc., Amazon Web Services, Inc., Rackspace Inc., IBM Corporation (International Business Machines), and Verizon Enterprise. These market players have adopted various strategies such as partnerships, collaboration, mergers & acquisitions, and new product launch to maintain their leading position in the industry.

North America contributed about half of the market share in 2017, owing to the increasing number of cloud-based service providers in the region. However, the Asia-Pacific region would grow at the fastest CAGR of 25.3% during the study period, owing to the rise in usage of cloud-based services and growth in deployment of data centers in developing countries such as India and China.

In addition, the hybrid cloud market in Europe is expected to grow gradually from 2018 to 2025.

The global hybrid cloud market was pegged at $36.14 billion in 2017 and is estimated to reach $171.93 billion by 2025, registering a CAGR of 21.7% from 2018 to 2025.
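
As a quick sanity check, the compound annual growth rate implied by those two figures can be computed directly, and over the eight years from the 2017 base to 2025 it lands near the reported 21.7%:

```python
# CAGR = (end_value / start_value) ** (1 / years) - 1
start, end, years = 36.14, 171.93, 8     # $B, from the 2017 base through 2025
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")                     # ~21.5%, in line with the reported 21.7%
```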

Among industry verticals, the global hybrid cloud market is analyzed across IT & telecom, healthcare, BFSI, retail, government, media & entertainment, transportation & logistics, manufacturing, and other sectors. In 2017, the BFSI segment was the largest contributor, holding about one-third share of the market, and would continue to retain its dominant position during the forecast period.

However, the healthcare segment is expected to manifest the fastest CAGR of 27.9% during the forecast period, as concerns regarding security, cost, and complexity have considerably increased among healthcare organizations.

In 2017, the small & medium enterprises segment was the largest contributor to the global hybrid cloud market in terms of revenue, holding more than two-thirds share. Moreover, the segment is expected to portray the fastest CAGR of 22.5% during the study period.

On the other hand, the large enterprises segment is estimated to manifest gradual growth through 2025.

Download Sample Report: www.alliedmarketresearch.com/request-sample/256

Based on service model, the global hybrid cloud market is segmented into Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS). The SaaS segment held the largest market share, contributing about 61% of the total revenue, owing to the increasing adoption of SaaS among organizations that seek complex software and hardware management.

However, IaaS segment is expected to register the fastest CAGR of 26.1% through 2025, owing to various benefits such as enhanced performance, improved productivity, flexible computing capabilities, and increased delivery speed. In addition, the PaaS segment is expected to grow at a steady rate during the forecast period.

The services segment is estimated to register the fastest CAGR of 23.1% during the forecast period, owing to the rise in adoption of hybrid cloud services on account of their cost-effectiveness and ease of access. However, the solutions segment held the largest market share, contributing about two-thirds of the total revenue, owing to the increasing inclination of companies toward building multi-cloud architectures.

For Inquiry: www.alliedmarketresearch.com/-enquiry/256


Follow this link:
Discover aspects of the Hybrid Cloud Market as its value achieves $171,926 million with CAGR 21.7% - WhaTech Technology and Markets News

Read More..