Google debuts new AI assistant that looks through cameras, glasses – Quartz

AI assistants are picking up more senses. On Monday, OpenAI showed off a new ChatGPT model that promises to see, hear, and speak through smartphones, among other new abilities. Now Google is announcing a rival assistant with similar capabilities.


At the company's I/O developer conference Tuesday, DeepMind CEO Demis Hassabis debuted a prototype of Google's new "expert" AI assistant that can see through a user's phone and other devices like smart glasses. The assistant "build[s] on Gemini," Google's existing chatbot, the company says, and some of its capabilities are coming to the Gemini app and web experience later this year.

The development is part of Google DeepMind's Project Astra, which aims to create "a universal AI agent" for users' everyday lives. "It's easy to envisage a future where you can have an expert assistant by your side, through your phone, or new exciting form factors like glasses," Hassabis told a crowd of a few thousand developers in Mountain View, California.

A demo video shows a person speaking with an AI agent through their phone while walking through an office. Through the phone's camera, they show the AI assistant a container of crayons as if they were talking through FaceTime and ask it to make "a creative alliteration."

"Creative crayons color cheerfully," it said. "They certainly craft colorful creations." The person continues interacting with the AI bot on their walk, then realizes they forgot their glasses and asks for help finding them. "They're on the desk near a red apple," the bot responds.

When the user puts on those glasses, the AI assistant can look through them, too, and identifies an illustration representing Schrödinger's cat on a whiteboard.

It's unclear if those glasses are a new product Google plans to launch. The augmented reality glasses on display in the demo didn't look like Google Glass, the company's existing smart glasses, nor did they resemble typical, bulkier headsets.

"An agent like this has to understand and respond to our complex and dynamic world just like we do," said Hassabis at the conference. "It would need to take in and remember what it sees so it can understand context and take action. And it would have to be proactive, teachable, and personal. So you can talk to it naturally without lag or delay."

That's what Project Astra is aiming to do, he said, and it's making "great strides."

While Google's prototype AI assistant is available to demo for attendees of its Google I/O conference, it will probably be a while before the tech makes its way into the hands of everyday consumers.
