I Wore Meta Ray-Bans in Montreal to Test Their AI Translation Skills. It Did Not Go Well

Imagine you've just arrived in another country, you don't speak the language, and you stumble upon a construction zone. The air is thick with dust. You're tired. You still stink like airplane. You try to ignore the jackhammers to decipher what the signs say: Do you need to cross the street, or walk up another block, or turn around?

I was in exactly such a situation this week, but I came prepared. I'd flown to Montreal to spend two days testing the new AI translation feature on Meta's Ray-Ban smart sunglasses. Within 10 minutes of setting out on my first walk, I ran into a barrage of confusing orange detour signs.

The AI translation feature is meant to give wearers a quick, hands-free way to understand text written in foreign languages, so I couldn't have devised a better pop quiz on how it works in real time.

As an excavator rumbled, I looked at a sign and started asking my sunglasses to tell me what it said. Before I could finish, a harried Québécois construction worker started shouting at me and pointing northwards, and I scurried across the street.

Right at the start of my AI adventure, I'd run into the biggest limitation of this translation software: it doesn't, at the moment, tell you what people say. It can only parse the written word.

I already knew that the feature was writing-only at the moment, so that was no surprise. But soon, I'd run into its other, less obvious constraints. Over the next 48 hours, I tested the AI translation on a variety of street signs, business signs, advertisements, historical plaques, religious literature, children's books, tourism pamphlets, and menus, with wildly varied results.

Sometimes it was competent, like when it told me that the book I picked up for my son, Trois Beaux Bébés, was about three beautiful babies. (Correct.) It told me repeatedly that "ouvert" meant "open," which, to be frank, I already knew, but I wanted to give it some layups.

Other times, my robot translator was not up to the task. It told me that the sign for the notorious adult movie theater Cinéma L'Amour translated to "Cinéma L'Amour." (F for effort; Google Translate at least changed it to "Cinema Love.")

At restaurants, I struggled to get it to read me every item on a menu. For example, instead of telling me all of the different burger options at a brew pub, it simply told me that there were burgers and sandwiches, and refused to get more specific despite my wheedling.

When I went to an Italian spot the next night, it similarly gave me a broad summary of the offerings rather than breaking them down in detail. I was told there were grilled meat skewers, but not, for example, that there were duck confit, lamb, and beef options, or how much they cost.

All in all, right now, the AI translation is more of a temperamental party trick than a genuinely useful travel tool for foreign climes.

To use the AI translation, a glasses-wearer needs to say the magic words "Hey Meta, look at" and then ask it to translate what it's looking at.

The glasses take a snapshot of whatever is in front of you, and then tell you about the text after a few seconds of processing. I'd expected more straightforward translations, but it rarely spits out word-for-word breakdowns. Instead, it paraphrases what it sees or offers a broad summary.
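For the technically curious, that snapshot-then-summarize behavior suggests a rough pipeline. The sketch below is purely hypothetical and not Meta's actual code: the OCR library (pytesseract), the French language setting, and the summarization-style prompt are all my assumptions, chosen to illustrate why the output might read as a paraphrase rather than a literal translation.

from PIL import Image
import pytesseract  # open-source OCR; requires the tesseract binary and French data

def translate_snapshot(image_path: str, target_lang: str = "English") -> str:
    """A guess at the glasses' flow for a single snapshot."""
    # Step 1: the snapshot of whatever is in front of the wearer.
    snapshot = Image.open(image_path)

    # Step 2: pull whatever written text the camera captured.
    extracted = pytesseract.image_to_string(snapshot, lang="fra")

    # Step 3: hand the text to a language model. A summarization-style
    # prompt like this one would explain the broad, non-literal answers
    # described above; the real prompt is unknown.
    prompt = (
        f"Summarize the following text in {target_lang}, "
        f"rather than translating it word for word:\n{extracted}"
    )
    return prompt  # in the real product, this presumably goes to an LLM

print(translate_snapshot("menu_photo.jpg"))

If something like that final summarization step is in play, it would account for a menu's burger list collapsing into "there were burgers and sandwiches."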
