How to use the intelligence features in iOS 16 to boost productivity and learning – TechRepublic

Apple has packed intelligence features into iOS 16 to allow for translations from videos, copying the subject of a photo and removing the background, and copying text from a video.

A few years ago, Apple began betting on local machine learning in iOS to boost the user experience. It started simply with Photos, but machine learning is now a mainstay in iOS and can help boost productivity at every turn. iOS 16 builds on these features by letting you copy text from a video, perform quick text actions on photos and videos, and easily copy the subject of a photo while removing the background, creating an easy instant alpha layer.

We'll walk through these three new intelligence features in iOS 16, find out how to use them, and show you all of the ways that you can use these features to boost your productivity and more.

SEE: iCloud vs. OneDrive: Which is best for Mac, iPad and iPhone users? (free PDF) (TechRepublic)

All of the features below work only on iPhones with an A12 Bionic processor or later, and the translation and text features are only available in the following languages: English, Chinese, French, Italian, German, Japanese, Korean, Portuguese, Spanish and Ukrainian.

One of the cooler features in iOS 16 is the ability to lift the subject of a photo off its background, creating an instant alpha of the subject. This removes the background from the photo and leaves you with a perfectly cut-out subject that you can easily paste into a document, iMessage or anywhere else you can imagine (Figure A).

Figure A

This feature works on iPhone with A12 Bionic and later, and can be done by performing these steps inside of the Photos app:

1. Open a photo with a clear subject.
2. Touch and hold the subject until an outline appears around it.
3. Tap Copy in the menu that appears, or tap Share to send the cut-out directly to another app.
4. Paste the subject into a document, message or other app.

This doesn't only work in Photos: it also works in the Screenshot utility, Quick Look, Safari and, soon, other apps. This feature saves a lot of time over opening the photo in a photo editor and manually removing the background.
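Apple doesn't expose the one-tap subject lift as a public API in iOS 16, but developers can get a similar effect for photos of people with the Vision framework's person segmentation request, which produces a grayscale matte you can use as an alpha mask. A minimal sketch, assuming `inputImage` is a `CGImage` you have already loaded:

```swift
import Vision

// Sketch: generate an alpha matte for the person in a photo — the same idea
// Photos uses when it lifts a subject off its background. The returned pixel
// buffer is a grayscale mask: white where the person is, black elsewhere.
func personMatte(for inputImage: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate          // slower, but cleaner edges
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: inputImage, options: [:])
    try handler.perform([request])

    return request.results?.first?.pixelBuffer
}
```

The mask can then be composited against the original with Core Image's blend-with-mask filter to produce the same kind of cut-out the Photos app creates.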

iOS 15 introduced Live Text, which lets you copy text from a photo or search through your Photo library using text that might be contained in a photo (Figure B). Apple is ramping up this feature in iOS 16 by allowing you to pause a video and copy text from it as well.

Figure B

It works like this:

1. Play a video in a supported app, such as the Photos app or the built-in iOS video player.
2. Pause the video on a frame that contains text.
3. Touch and hold the text, or tap the Live Text button if it appears.
4. Select the text, then choose Copy, Translate or another action from the menu.

This feature is great for online learning environments where students might need to copy an example and paste it into a document or other file.
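The same on-device text recognition that powers Live Text is available to developers through the Vision framework, so the paused-video trick can be reproduced programmatically. A sketch, assuming `frame` is a `CGImage` grabbed from the player (for example via `AVAssetImageGenerator`):

```swift
import Vision

// Sketch: run on-device text recognition on a single paused video frame,
// returning one string per line of detected text.
func recognizeText(in frame: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["en-US"]   // any supported language code

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    let observations = request.results ?? []
    // Take the top candidate string for each detected text region.
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```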

Live Text has been around for two iterations of iOS, so Apple has started building additional features around it, namely the ability to perform actions on text from a photo or paused video frame (Figure C).

Figure C

When you select text in a photo or paused video, you now have the option of performing the following actions on the text:

- Copy, Select All, Look Up, Translate and Search the Web
- Call, message or FaceTime a detected phone number
- Open a detected address in Maps
- Convert currencies and units of measurement

You can do this by selecting the text from the photo or video, then selecting one of the quick actions presented. This works in the Camera app, Photos app, QuickLook and in the iOS video player.
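These quick actions amount to data detection over the recognized text. Foundation's `NSDataDetector` finds the same kinds of items — links, phone numbers, dates and addresses — in any string, which is a reasonable sketch of what happens after Live Text extracts the text:

```swift
import Foundation

// Sketch: detect actionable items in a string of recognized text.
let text = "Call 555-0123 or visit https://apple.com on June 1"
let types: NSTextCheckingResult.CheckingType = [.link, .phoneNumber, .date, .address]
let detector = try NSDataDetector(types: types.rawValue)
let matches = detector.matches(in: text,
                               range: NSRange(text.startIndex..., in: text))

for match in matches {
    switch match.resultType {
    case .phoneNumber: print("Phone:", match.phoneNumber ?? "")
    case .link:        print("Link:", match.url?.absoluteString ?? "")
    case .date:        print("Date:", match.date ?? Date())
    default:           break
    }
}
```

Each match carries a range into the original string, which is how the system can present a tappable action anchored to the exact text in the photo or frame.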

