Google's latest smartphones put AI photo-editing in your hands – Sydney Morning Herald

Where many smartphone-makers fill their software with cascading options and heaps of micro features, Pixels have always put a focus on attractive, understated design and proactively useful touches. The new Pixels are crammed full of stuff; it's just that it tends to pop up when it's required, rather than being tucked away in menus or needing to be stacked all over your home screen.

The Pixel 8 has new cameras, more powerful AI and keeps the chunky camera bar. Supplied

As you might expect, a lot of Googly features are baked in and benefit greatly from the Pixel 8's new Tensor G3 chip, which provides on-device machine learning. That includes fast and accurate voice-to-text for sending messages, interpreting other languages and automatically transcribing recordings, as well as powerful photo and video editing, and health-tracking via Fitbit. The tech even extends to using the device for old-fashioned phone calls, with great noise-cancelling that makes the caller sound clear and the ability to have Google Assistant screen your calls.

But while the Pixel 8s continue the tradition of putting the best of Google front and centre, and may be the best Androids on the market, they're also a testing ground for new features and developments that don't always feel ready for prime time.

Physically not a lot has changed from last year's Pixel 7, with the new phones packing the familiar and wonderfully symmetrical chunky camera bar on the back. Both phones have great screens that can get phenomenally bright (2000 nits on the 8, 2400 on the Pro, though you'll only get those maximum results in direct sun), and both have 120Hz refresh rates for very smooth scrolling and animation.

In fact, the two phones are the closest they've been since the introduction of the Pro moniker with the Pixel 6. The main differences are screen size (6.2 inches or 6.7 inches) and an extra camera, with the Pro packing a 5x telephoto lens in addition to the main shooter and ultra-wide. Oh yeah, and the Pixel 8 Pro has a thermometer.

The Pixel 8 Pro and Pixel 8 (right), with the iPhone 15 Pro Max and iPhone 15 (left). Tim Biggs

I have to assume that Google intended this sensor to be for checking people's foreheads, and then didn't get the appropriate medical approval or gave up on the idea, because there's no other reason for it to be there. When you use it for the first time you have to tick a box to say you understand it is not a medical device, and then you're free to put it five centimetres in front of any object to check its temperature. It seems to work, but I couldn't find a compelling use for it.

Despite a similar hardware design, Google's own take on the new Android 14 software does a lot to give the Pixel 8s a fresh look. There are a lot of lock-screen styles to choose from, which double as always-on-display designs (I especially like the new one that puts the date and temperature against the phone's longer sides), and the options to colour-theme your phone, widgets and icons automatically to your wallpaper continue to impress.

Embedded in the top of the display is a little selfie shooter similar to last year's, but it has a new trick thanks to AI, in that it can recognise you a lot more accurately. The change means you can authenticate banking apps and sign in to services with your face, just like you can on an iPhone, without a big black spot at the top of the phone. The one downside to Google's approach here is it doesn't work in the dark, so you still need a fingerprint or PIN.

Speaking of AI, this year's update is big on generative features, both obvious and subtle. One example of the former is the new AI wallpaper feature, which is a DALL-E-style text-to-image generator but with a lot of limitations. Choose a theme, pick some keywords, wait 20 seconds and you'll be given eight different options to choose from. They're always fairly abstract and don't stand up to close scrutiny (to be honest, I'd always prefer one of the many real photos or paintings from Google's wallpaper collection), but as a gimmick it works and will only get better.
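Google hasn't said how the wallpaper generator works under the hood, but the interaction it exposes (pick a theme, fill in a few keywords, get back a handful of abstract images) maps onto a familiar text-to-image flow. Here is a minimal sketch of that flow using the open-source diffusers library as a stand-in; the templates, keywords and model name are illustrative assumptions, not Google's actual system.

```python
# Illustrative sketch only: Google has not published how the Pixel's AI
# wallpaper feature works. This shows a generic theme-plus-keywords
# text-to-image flow with the open-source diffusers library standing in.
from diffusers import StableDiffusionPipeline
import torch

# Hypothetical theme templates, mirroring the constrained keyword picker.
THEME_TEMPLATES = {
    "imaginary": "a surreal {subject} made of {material}, soft light, abstract",
    "classic art": "an oil painting of a {subject} in {material} tones",
}

def build_prompt(theme: str, subject: str, material: str) -> str:
    """Fill a fixed template with the user's chosen keywords."""
    return THEME_TEMPLATES[theme].format(subject=subject, material=material)

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
).to("cuda")

prompt = build_prompt("imaginary", "coastline", "stained glass")
# Generate several candidates, since the phone offers multiple options to pick from.
images = pipe(prompt, num_images_per_prompt=4).images
for i, img in enumerate(images):
    img.save(f"wallpaper_option_{i}.png")
```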

Google's Best Take feature can swap people's faces when you take multiple shots. Supplied

Generative AI is more present in the new Google Photos editing suite too, where it walks the line between allowing you to realise the intention of a photograph and letting you straight-up invent stuff. The marquee feature here is Magic Editor, which is scarily good at replacing a gloomy sky with a nice blue one, shifting the colours for a golden-hour look, removing unwanted elements or even completely changing the composition.

For example, I grabbed an older photo of my two kids at mini golf, posing on the left of the frame. Behind them is a giant statue of a cartoon rhino, centre frame. In the Magic Editor I tapped one kid to select him, held down to edit, then dragged him over to the right of the frame. After processing, the image just looks like the kids were naturally standing on either side of the rhino. The part where the repositioned kid used to be standing now has some convincing invented detail, including a bit of path, some scattered bark chips and one of the rhino's hands. It even gives you a few options to choose from so you can pick the most natural. The feature is also great at enlarging the moon, or removing dead tree branches. But for whatever reason, any time I tried to change the size of a person I was told it was against the company's ethics policy.
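The basic idea behind that edit is conceptually simple, even if Google's generative fill is not: cut the subject out along a mask, fill the hole it leaves with plausible background, then paste the subject back somewhere else. A rough sketch of that pipeline follows, using OpenCV; the file paths, mask and pixel offset are placeholders, and OpenCV's classical inpainting merely stands in for the generative model Google presumably uses.

```python
# Rough sketch of the "select, drag, fill the hole" idea behind moving a
# subject in a photo. Classical inpainting stands in for generative fill;
# image paths, the mask and the shift distance are all placeholders.
import cv2
import numpy as np

img = cv2.imread("minigolf.jpg")                          # original photo
mask = cv2.imread("kid_mask.png", cv2.IMREAD_GRAYSCALE)   # white where the subject is

# 1. Cut the subject out, remembering where it was.
subject = cv2.bitwise_and(img, img, mask=mask)

# 2. Fill the hole the subject leaves behind with surrounding texture.
filled = cv2.inpaint(img, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)

# 3. Paste the subject back at a new position (shifted 400 px to the right).
dx = 400
shift = np.float32([[1, 0, dx], [0, 1, 0]])
moved_subject = cv2.warpAffine(subject, shift, (img.shape[1], img.shape[0]))
moved_mask = cv2.warpAffine(mask, shift, (img.shape[1], img.shape[0]))

result = filled.copy()
result[moved_mask > 0] = moved_subject[moved_mask > 0]
cv2.imwrite("minigolf_moved.png", result)
```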

On the left is my original photo, the middle is with Google's AI-generated sky, and the right has golden hour turned on. Tim Biggs

There's also a feature called Best Take, which appears if you've taken a series of photos featuring a group of people. Pick one of the photos, and you can tap on each person's face to cycle through the various expressions they made throughout the set, ending up with one picture featuring everyone's best face. Like the Magic Editor it's far from foolproof, but it can result in fakes nobody would pick at a glance.

To be honest, I can't ever see myself using these tools on my own personal photos. I'm aware that some level of AI processing has been present in smartphone photography for a long time and is here to stay, but intentionally changing the content feels unnerving. That said, I can definitely see using it in place of Photoshop if I needed a specific edit of a non-human subject quickly.

For video, a new Audio Eraser will analyse clips for sounds and show a few of them as separate waveforms (for example, wind, speech or nature). Then you can watch and listen to the video while moving the levels around to cut out talking or annoying gusts. Like a lot of AI photo editing, it works very well but leaves artefacts you'll notice if you're specifically looking for them. Other features, like a video enhancer that utilises Google's cloud servers, are not present at the phones' launch.
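The sliders in Audio Eraser amount to per-source level mixing once the clip's audio has been split into separate tracks. Google hasn't described how it does that separation, so the sketch below assumes you already have the separated stems (for instance from an open-source source-separation model) and only shows the mixing step the sliders imply; the filenames and gain values are made up for illustration.

```python
# Sketch of the mixing step only: assumes the clip's audio has already been
# separated into per-source stems of equal length and sample rate. Filenames
# and gains are illustrative.
import numpy as np
import soundfile as sf

stems = {
    "speech": 1.0,   # keep the talking at full level
    "wind":   0.1,   # pull the annoying gusts almost all the way down
    "nature": 0.6,   # keep some background ambience
}

mix = None
for name, gain in stems.items():
    audio, sr = sf.read(f"{name}.wav")
    audio *= gain
    mix = audio if mix is None else mix + audio

# Avoid clipping after summing the weighted stems, then write the result.
mix = np.clip(mix, -1.0, 1.0)
sf.write("clip_audio_remixed.wav", mix, sr)
```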

The new Pixels come hot on the heels of new iPhones, and the two families of devices share some similarities despite being fundamentally tough to compare. Google's phones have grown more premium and expensive year over year, while Apple's have become more open (notably this year with the introduction of a USB-C port), so they're closer to equivalent than ever.

Comparing the standard phones, the Pixel 8 and iPhone 15, Google's immediately stands out as more premium despite being $300 cheaper. They're similar sizes and have similar camera set-ups, but Apple's lacks a fast refresh rate and an always-on display. Under the hood, Apple has also withheld some features from the standard iPhone that the Pixel happily supports, including USB 3.2 for much faster data transfer, and autofocus on the ultra-wide camera, which makes for a great close-up macro mode.

When it comes to the high end, the Pixel 8 Pro against the iPhone 15 Pro Max, the gap is much narrower, though the prices are further apart. (The iPhone starts at 256GB here, so a true like-for-like comparison would put the Pixel 8 Pro at $1800, but that's still a $400 gap.)


The phones are evenly matched across almost all specs, though Apple has the strong advantage of an immensely powerful processor that outpaces the Pixel in raw strength. In cameras, Google has opted for bigger sensors and more flexibility in editing, and the Pro has an excellent manual control mode, but Apple's shots tend to be more pleasant for quick snaps since you can preset your preferred temperature and crop. When it comes to portrait mode and low-light photography the Pixel is far more confident, and both phones now support adding a bokeh blur after the fact: Apple by capturing depth data when it detects a face, and Google through the Magic Editor.
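That after-the-fact bokeh comes down to blending a sharp frame with a blurred copy of itself, guided by a depth map. Neither company has published its pipeline, so this is a deliberately simplified sketch with a single blur level (real implementations scale the blur with distance); the image and depth-map filenames and the blur strengths are placeholders.

```python
# Simplified illustration of depth-guided "bokeh after the fact": blur the
# whole frame, then blend sharp and blurred pixels using a depth map.
# One fixed blur level; real pipelines vary blur with distance.
import cv2
import numpy as np

img = cv2.imread("portrait.jpg").astype(np.float32)
depth = cv2.imread("portrait_depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Treat nearer pixels (higher values in this depth map) as the subject to keep
# sharp, and soften the matte edge so the transition isn't a hard cut-out.
subject_weight = cv2.GaussianBlur(depth, (0, 0), sigmaX=3)[..., None]

blurred = cv2.GaussianBlur(img, (0, 0), sigmaX=12)
out = subject_weight * img + (1.0 - subject_weight) * blurred
cv2.imwrite("portrait_bokeh.jpg", np.clip(out, 0, 255).astype(np.uint8))
```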

Both happily shoot 4K HDR video that looks amazing, though Apple supports Dolby Vision and ProRes, which may suit professionals better.

iPhone users have access to a pretty full suite of Google products these days, and even features like the Magic Eraser have made their way from Pixel to iPhone, though the opposite is not true; those who choose Apple services tend to have a hard time on Android. The unique strength of the Pixel, then, is hardware that's been specially tuned for Google's AI tasks, and the latest in experimental features that may or may not become widely used in the future.

Still, AI features aside, these are the nicest Pixels Google has yet produced, with some of the most genuinely useful features and best cameras you'll find on an Android.
