Modest Apple talks up these 'incredible' advances in iOS
It's 2023 so of course it involves machine learning
Apple on Tuesday previewed several accessibility features planned for future operating system updates, including a way to have its text-to-speech app respond in the user's voice.
CEO Tim Cook described the forthcoming features as "incredible," a word which can be found more than 500 times in Apple press releases.
The features – Assistive Access, Live Speech, Personal Voice, and Point and Speak in Magnifier, among others – are being talked up in advance of Global Accessibility Awareness Day (GAAD) on Thursday, May 18.
Being able to access digital devices is critical for people with physical or cognitive challenges. It remains a largely unresolved issue, prompting recently proposed legislation like the American Websites and Software Applications Accessibility Act to make websites and software more usable by everyone.
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, referring to ease of device interaction for people with disabilities and not access that facilitates repairs or enables the installation of unapproved apps.
"These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways."
While major operating system vendors like Apple, Google, and Microsoft, and the open source Linux community all implement assorted accessibility features, Apple's iOS hardware gets high marks from the American Foundation for the Blind. The organization notes, "Apple’s iOS has been the standard in accessibility for tablets and smartphones since these devices were first developed."
Apple said that later this year, Assistive Access will be available for iPhone and iPad. Designed to simplify app interfaces for people with cognitive disabilities, Assistive Access unifies Phone and FaceTime functions into a single Calls app that sits beside Messages, Camera, Photos, and Music. The pared-down grid interface consists of large, high-contrast buttons and labels. And there's a row-based layout tuned for text, in case that's preferred.
The Live Speech service, intended for people with speech challenges, lets the user type text and have that text spoken during calls and messaging sessions. It works on iPhone, iPad, and Mac.
The synthesized voice employed to read typed text can be set to sound like the non-speaking user, provided a suitably long recording of that person's voice has been stored in advance. This feature, Personal Voice, is intended for people dealing with a condition expected to lead to voice deterioration or speaking disability.
Personal Voice is set up by having the user read a set of random text prompts for fifteen minutes. This recording is then run through on-device machine learning so a sound-alike voice can be used with Live Speech.
Users of Magnifier, an iPhone and iPad app that enlarges what the device camera sees and offers other accessibility functions like identifying doors, will be able to use Detection Mode in conjunction with a capability called Point and Speak. The feature identifies text visible to the device camera and reads it aloud, to make interactions with things like household appliances more manageable.
These accessibility features are scheduled to arrive "later this year," which may or may not mean in iOS/iPadOS 17, expected in September or October, and in the as-yet-unnamed macOS 14 revision. Some of these features, perhaps in a more limited form, may also show up in tvOS, watchOS, and rOS (realityOS), the internal name for the augmented reality headset Apple is expected to announce at its Worldwide Developer Conference next month. Possibly coincidentally, Apple has just registered the xrOS name in New Zealand.
The update also includes: the ability to pair Made for iPhone hearing devices to the Mac; phonetic suggestions for text editing in Voice Control to make voice recognition more flexible; turning any switch on an iPhone or iPad into a virtual game controller via Switch Control; easier Text Size adjustment across various Mac apps; pausing images with moving elements like GIFs in Messages and Safari; and Siri voice improvements in Voice Over, including playback speed adjustment.
"We are pleased to see Apple taking steps to make their products more accessible to people with all types of disabilities," said Marlene Sallo, executive director of the National Disability Rights Network, in an email to The Register.
"Accessible technology can expand opportunities for more people with disabilities to go to school, communicate with their friends and family, gain employment, and pursue their dreams. We still have a long way to go before we reach universal accessible design. Hopefully, this sends a signal to other companies that making accessible products is not only the right thing to do but can also be profitable." ®