
Apple features replicate user’s voice, make iPhones easier to understand


Starting later this year, you’ll be able to type out a friendly greeting or your coffee order on an iPhone and hear your own voice — or something like it — speak it aloud.

It won’t require any additional apps or accounts, either; just a free software update from Apple.

To people who have full use of their voice, this tool — a feature the company calls Personal Voice — may not seem like much more than a clever curiosity. But for those who can no longer speak with the clarity or confidence they once did, tools like this could help them interact with the world, and the people in it, a little more easily.

Personal Voice is one of a handful of new assistive features that will arrive on Apple devices like iPhones, iPads and Mac computers “later this year,” the company said.

Apple wouldn’t elaborate on exactly when users could expect to try these tools for themselves, but the company often highlights features like these before they appear in new versions of its iOS, iPadOS, and macOS software, which historically launch in the fall.

To help you keep these new accessibility features straight — and to help flag ones you may want to use yourself — here’s our brief guide to the tools coming to an Apple device near you.

Live Speech

How it works: Once it’s enabled, you’ll be able to type out messages and remarks on an iPhone, iPad or Mac computer for the device to read out loud. And if there are certain sentences or phrases you rely on frequently, you can save them as shortcuts to play aloud with a tap.

Unless you create a Personal Voice model (more on that in a moment), you’ll hear Siri’s voice reading your words. And the tool isn’t limited to in-person conversations: it can also feed that spoken audio into phone and FaceTime calls.
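
To give a sense of what’s happening under the hood, here is a rough sketch of on-device text-to-speech using Apple’s existing, publicly documented AVSpeechSynthesizer API in Swift. It illustrates the basic capability Live Speech builds on (type text, hear it spoken); it is not Apple’s actual implementation of the feature.

```swift
import AVFoundation

// Rough sketch: type-to-speech with Apple's public AVSpeechSynthesizer API.
// This illustrates the concept behind Live Speech, not Apple's implementation.
final class TypedSpeech {
    // Keep the synthesizer alive for the duration of playback.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // Default system voice; Live Speech uses Siri's voice unless a
        // Personal Voice model is available.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Saved phrases could simply be strings you replay with a tap.
let speech = TypedSpeech()
speech.speak("I'd like a medium coffee with oat milk, please.")
```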

Personal Voice

How it works: To make those Live Speech messages sound like you, you’ll have to create a Personal Voice model. While we haven’t gotten to try the feature for ourselves, the company claims an iPhone or iPad can create a sound-alike voice after it’s provided 15 minutes of spoken samples, in this case a set of randomly chosen voice prompts you read aloud. (Don’t worry: if you get busy or uncomfortable, you don’t have to finish the process in one sitting.)

Once that’s done, you can expect a bit of a wait — your device will chew on those samples overnight, after which you’ll be able to type out messages and hear them played back in your voice.

There’s one more thing to keep in mind: when you build a Personal Voice model, it lives only on the device you created it on by default. That means you’ll have to repeat the training process on any other device where you want to use it, unless you give explicit permission for the model to be shared across your devices.
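
For developers wondering how a finished Personal Voice might surface in their own apps, here is a hedged sketch. It assumes the voice shows up in the standard AVSpeechSynthesisVoice list once the user grants access; the authorization call and the .isPersonalVoice trait shown are part of Apple’s iOS 17 speech framework, but exactly how and when third-party apps can use them wasn’t spelled out in this announcement.

```swift
import AVFoundation

// Hedged sketch: assumes a finished Personal Voice appears among the installed
// system voices once the user authorizes access. The authorization request and
// the .isPersonalVoice trait come from Apple's iOS 17 speech APIs.
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Pick the first voice flagged as a Personal Voice, if any exists.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice // nil falls back to the default voice
        synthesizer.speak(utterance)
    }
}
```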

Assistive Access

How it works: Designed for users with cognitive impairments, Assistive Access strips all the visual cruft out of the iPhone and iPad to give people a dead-simple way to interact with their device.

Consider your phone’s home screen, for example — you’re meant to choose a small number of apps you rely on to take center stage, with each getting an enormous app icon for easy visibility. Apple also streamlined how other common phone functions work — rather than asking you to make phone calls and FaceTime calls from different apps, for instance, you can set up a handful of favorite contacts for quick access. Tapping one of those names then brings up options for voice or video calls. Think of it as a sort of beginner’s mode for iPhones and iPads, and you’re on the right track.

Point and Speak

How it works: This feature, which will live in Apple’s Magnifier app, does exactly what it says: point your finger at something in front of the camera, and the app will read out whatever text is on or near it. In a brief example Apple showcased, a person could use Point and Speak to read the tiny text on microwave buttons as they pointed to different ones. The catch? This feature will work only on Apple devices with a built-in lidar sensor, and for now at least, those are exclusive to the company’s pricey Pro iPhones and iPads.
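
The text-reading half of this is something developers can already approximate with Apple’s Vision framework; the sketch below recognizes text in a camera frame and reads it aloud. The lidar-driven finger tracking that narrows the reading down to whatever you’re pointing at is the part unique to Apple’s feature and isn’t reproduced here.

```swift
import Vision
import AVFoundation

// Rough sketch of the text-recognition half of Point and Speak: find text in a
// camera frame with the Vision framework, then speak it. The lidar-based finger
// tracking that limits this to what you're pointing at is not reproduced here.
func readTextAloud(in image: CGImage, with synthesizer: AVSpeechSynthesizer) throws {
    let request = VNRecognizeTextRequest { request, _ in
        // Collect the best guess for each detected line of text.
        let lines = (request.results as? [VNRecognizedTextObservation] ?? [])
            .compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```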