
Apple brings advanced accessibility features – Live Speech, Personal Voice and more

US tech giant Apple has announced new software features for cognitive, visual, auditory and mobility accessibility, as well as innovative tools for people who do not speak or are at risk of losing their ability to speak.

Coming later this year, the new Live Speech feature on iPhone, iPad and Mac will allow users to type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in-person conversations. Users can also save frequently used phrases to chime in quickly during lively conversations with family, friends and colleagues.

“At Apple, we’ve always believed that the best technology is technology built for everyone. We’re excited to share incredible new features that build on our long history of making technology accessible so everyone can create, communicate and do what they love,” said Tim Cook, Apple CEO.

For users who are at risk of losing their ability to speak – for example, those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other diseases that can gradually affect the ability to speak – Personal Voice is a simple and safe way to create a voice that sounds like them.

Apple said users can create a personalized voice by reading along with a randomized set of text prompts to record 15 minutes of audio on an iPhone or iPad.

Separately, the Assistive Access feature uses design innovations to distill apps and experiences down to their essential features in order to ease cognitive load.

The feature includes a customized experience for Phone and FaceTime combined into a single Calls app, along with Messages, Camera, Photos and Music. It offers a distinct user interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience to the individual they’re supporting.

Users and trusted supporters can also choose between a more visual, grid-based layout for the home screen and apps, or a row-based layout for users who prefer text. In addition, the Point and Speak feature in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have multiple text labels.

Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be used with other Magnifier features like people recognition, door recognition, and image descriptions to help users navigate their physical environment.

Voice Control gains phonetic suggestions for text editing, so users who type with their voice can choose the correct word from several that may sound alike. Additionally, the Voice Control Guide helps users learn tips and tricks for using voice commands as an alternative to touch and typing on iPhone, iPad and Mac.

Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual gamepad to play games on iPhone and iPad. The SignTime feature will launch in Germany, Italy, Spain and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters.

Point and Speak is available on iPhone and iPad models with the LiDAR Scanner, in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese and Ukrainian.
