Nvidia, Baidu expand AI partnership to include cloud GPUs and self-driving cars

Baidu's Silicon Valley research lab. (Baidu Photo)

Baidu plans to announce a sweeping expansion of its partnership with graphics chip maker Nvidia on Wednesday, bringing Nvidia's latest chips into Baidu's cloud services for artificial intelligence research and optimizing Baidu's deep learning system.

The two companies have been working together on AI-related projects for a few years now, and the new agreement, scheduled to be announced during the Baidu Create 2017 AI developer conference, expands on much of that existing work.

Baidu is the Google of China, with over 75 percent of that country's internet search market share and hundreds of millions of active users of its other internet services, including public cloud computing.

It's also one of the world's leading artificial intelligence research organizations, with an AI division that until recently was led by former Stanford professor and AI expert Andrew Ng. Former Microsoft search and AI expert Qi Lu is now the president and chief operating officer at Baidu, and the company is expected to challenge rivals like Google, Facebook, and Microsoft in artificial intelligence research for years to come.

For its part, Nvidia's GPUs (graphics processing units) are being snapped up left and right by AI researchers to run their models, and are also offered by all the major public cloud providers as a service to their customers. Google is mounting a challenge with its Tensor Processing Units, but Nvidia has a lot of momentum on its side.

"We see AI transforming really every industry, and our strategy is to help democratize AI everywhere," said Ian Buck, vice president and general manager of accelerated computing at Nvidia, in a briefing last week.

The new Nvidia-Baidu partnership consists of four segments.

Artificial intelligence services are expected to be the next big area of competition for public cloud providers. There are only a handful of companies in the world that can afford to do leading-edge AI research, and most of those companies also happen to be among the few who can afford to invest in the massive computing power that enables the public cloud.

As Amazon Web Services' Swami Sivasubramanian put it at our GeekWire Cloud Tech Summit last month, "AI research concepts have long been discussed, but what has accelerated adoption of it is that we have specialized compute infrastructure, such as GPUs, specialized CPUs, FPGAs (field programmable gate arrays), you name it." In this new market for chips, Nvidia is in the driver's seat.
