Google Training Gemini On Its Own Chips Reveals Another Of Its Advantages – Forbes

Google unveiled its highly anticipated new AI model.

Google on Wednesday unveiled its highly anticipated new artificial intelligence model Gemini, an impressive piece of software that can solve math problems, understand images and audio and mimic human reasoning. But Gemini also reveals Google's unique advantage over other AI players: Google trained it on its own chips, designed in-house, not the highly coveted GPUs the rest of the industry is scrambling to stockpile.

As the AI arms race has heated up, GPUs, or graphics processing units, have become a powerful currency in Silicon Valley. The scrum has turned Nvidia, a company founded 30 years ago that was primarily known for gaming, into a trillion-dollar behemoth. The White House has clamped down on chip exports to China in an attempt to keep the AI prowess of a foreign adversary at bay.

But analysts say the fact that Google DeepMind, the tech giant's AI lab, trained its marquee AI model on custom silicon highlights a major advantage large companies have over upstarts, in an age when giants like Google and Microsoft are already under intense scrutiny for their market dominance.

Google's compute hardware is so effective it was able to produce the industry's most cutting-edge model, apparently one-upping OpenAI's ChatGPT, which was largely built using Nvidia GPUs. Google claims that Gemini outperforms OpenAI's latest model, GPT-4, in several key areas, including language understanding and the ability to generate code. Google said its TPUs allow Gemini to run significantly faster than earlier, less-capable models.

"If Google is delivering a GPT-4-beating model trained and run on custom silicon, we believe this could be a sign that AI tech stacks vertically integrated from silicon to software are indeed the future," Fred Havemeyer, head of U.S. AI research at the financial services firm Macquarie, wrote in a note to clients. Havemeyer added, however, that Google is uniquely positioned to make use of custom chips like few others can, flexing its scale, budget and expertise.

"Google showed that it's at least possible," Havemeyer told Forbes. "We think that's really interesting because right now the market has been really constrained by access to GPUs."

Big tech companies have been developing their own silicon for years, hoping to wean themselves off dependency on the chip giants. Google has spent nearly a decade developing its own AI chips, called Tensor Processing Units, or TPUs. Aside from helping to train Gemini, the company has used them to help read the names on signs captured by its roving Street View cameras and to develop protein-folding health tech for drug discovery. Amazon has also launched its own AI accelerator chips, called Trainium and Inferentia, and Facebook parent Meta announced its own chip, MTIA, earlier this year. Microsoft is reportedly working on custom silicon as well, code-named Athena. Apple, which has long designed its own silicon, unveiled a new chip earlier this year called R1, which powers the company's Vision Pro headset.

Lisa Su, CEO of the chip giant AMD, which has a smaller share of the GPU market, has shrugged off concerns that big tech customers could someday be competitors. "It's natural," she told Forbes earlier this year. She said it makes sense for companies to want to build their own components as they look for efficiencies in their operations, but she was doubtful big tech companies could match AMD's expertise built up over decades. "I think it's unlikely that any of our customers are going to replicate that entire ecosystem."

Google's new model has the potential to shake up the AI landscape. The company is releasing three versions of Gemini with varying levels of sophistication. The most powerful version, a model that can analyze text and images called Gemini Ultra, will be released early next year. The smallest version, Gemini Nano, will be used to power features on Google's flagship Pixel 8 Pro smartphone. The mid-level version, Gemini Pro, is now being used to power Bard, the company's generative chatbot launched earlier this year. The bot initially garnered a lukewarm reception, generating an incorrect answer during a promo video and wiping out $100 billion in Google parent Alphabet's market value. Gemini could be Google's best shot at overtaking OpenAI, after that company's bout of instability last month, when CEO Sam Altman was ousted and reinstated in a matter of days.

Google also used the Gemini announcement to unveil the newest version of its custom chips, the TPU v5p, which Google will make available to outside developers and companies to train their own AI. "This next-generation TPU will accelerate Gemini's development and help developers and enterprise customers train large-scale generative AI models faster, allowing new products and capabilities to reach customers sooner," Google CEO Sundar Pichai and DeepMind cofounder Demis Hassabis said in a blog post.

Gemini is the outcome of a massive push inside Google to speed up its shipping of AI products. Last November, the company was caught flat-footed when OpenAI released ChatGPT, a surprise hit that captured the public's imagination. The frenzy triggered a "code red" inside Google and prompted cofounder Sergey Brin, long absent after leaving his day-to-day role at the company in 2019, to begin coding again. In April, the company merged its two research labs, Google Brain and DeepMind, which had previously been notoriously distinct, in an attempt to give product development a push.

"These are the first models of the Gemini era and the first realization of the vision we had when we formed Google DeepMind earlier this year," Pichai said. "This new era of models represents one of the biggest science and engineering efforts we've undertaken as a company."
