Google AI heavyweight Jeff Dean talks about algorithmic breakthroughs and data center emissions – Fortune

Google sent a jolt of unease into the climate change debate this month when it disclosed in its annual environmental report that emissions from its data centers rose 13% in 2023, citing the AI transition. But according to Jeff Dean, Google's chief scientist, the report doesn't tell the full story and gives AI more than its fair share of the blame.

Dean, who is chief scientist at both Google DeepMind and Google Research, said that Google is not backing off its commitment to be powered by 100% clean energy by the end of 2030. But, he said, that progress is "not necessarily a linear thing" because some of Google's work with clean energy providers will not come online until several years from now.

"Those things will provide significant jumps in the percentage of our energy that is carbon-free energy, but we also want to focus on making our systems as efficient as possible," Dean said at Fortune's Brainstorm Tech conference on Tuesday, in an onstage interview with Fortune's AI editor Jeremy Kahn.

Dean went on to make the larger point that AI is not as responsible for increasing data center usage, and thus carbon emissions, as critics make it out to be.

"There's been a lot of focus on the increasing energy usage of AI, and from a very small base that usage is definitely increasing," Dean said. "But I think people often conflate that with overall data center usage, of which AI is a very small portion right now but growing fast, and then attribute the growth rate of AI-based computing to the overall data center usage."

Dean said that it's important to examine "all the data and the true trends that underlie this," though he did not elaborate on what those trends were.

One of Google's earliest employees, Dean joined the company in 1999 and is credited with being one of the key people who transformed its early internet search engine into a powerful system capable of indexing the internet and reliably serving billions of users. Dean cofounded the Google Brain project in 2011, spearheading the company's efforts to become a leader in AI. Last year, Alphabet merged Google Brain with DeepMind, the AI company Google acquired in 2014, and made Dean chief scientist reporting directly to CEO Sundar Pichai.

By combining the two teams, Dean said, the company has "a better set of ideas to build on" and can "pool the compute so that we focus on training one large-scale effort like Gemini rather than multiple fragmented efforts."

Dean also responded to a question about the status of Google's Project Astra, a research project that DeepMind leader Demis Hassabis unveiled in May at Google I/O, the company's annual developer conference. Described by Hassabis as "a universal AI agent that can understand the context of a user's environment," Astra was shown in a video demonstration in which users point their phone camera at nearby objects and ask the AI agent relevant questions such as "What neighborhood am I in?" or "Did you see where I left my glasses?"

At the time, the company said the Astra technology would come to the Gemini app later this year. But Dean put it more conservatively: "We're hoping to have something out into the hands of test users by the end of the year," he said.

"The ability to combine Gemini models with models that actually have agency and can perceive the world around you in a multimodal way is going to be quite powerful," Dean said. "We're obviously approaching this responsibly, so we want to make sure that the technology is ready and that it doesn't have unforeseen consequences, which is why we'll roll it out first to a smaller set of initial test users."

As for the continued evolution of AI models, Dean noted that additional data and computing power alone will not suffice. "A couple more generations of scaling will get us considerably farther," Dean said, but eventually there will be a need for "some additional algorithmic breakthroughs."

Dean said his team has long focused on ways to combine scaling with algorithmic approaches to improve factuality and reasoning capabilities, so that a model can imagine plausible outputs and reason its way through which one makes the most sense.

"Those kinds of advances," Dean said, will be important "to really make these models robust and more reliable than they already are."

