One year ago, at 2019's WWDC, Apple showed off a set of accessibility features designed to help people operate their iPhone, iPad, or Mac without the use of their hands.
Voice Control was supposed to help those with disabilities better use all the features of their macOS and iOS gear, and in the process regain some independence.
One year on from that unveiling – see the video below for Apple's recap of the technology – Cupertino's heralded control tools are sorely lacking in a number of key areas, we're told, preventing folks from carrying out some of the basic tasks that keyboard and mouse-equipped users take for granted.
Garbled Google, and never-ending calls
"When Voice Control was unveiled at WWDC last year, Apple made a big deal of it, and the mainstream media reported it uncritically – probably out of ignorance as most journalists won’t ever need to rely on it as their only source of communication," Hughes told The Register yesterday. "The truth is, if you ask most severely physically disabled users, who can’t take to the keyboard, they will say they find Voice Control disappointing, particularly the accuracy of its dictation."
Among the most frustrating aspects of the software is its inability to perform a simple Google search properly. Hughes said that, when using Voice Control, the dictated query text is often mangled, spaces are left out, and other basic errors are inserted, leaving you with irrelevant or imprecise search results. This, he said, has been the case since Voice Control debuted last year.
The Register asked Apple for comment, and was referred to the iGiant's support line. Hughes said he has tried to explain the software's shortcomings to Apple support to no avail.
It doesn't end with search. Hughes has encountered a host of other problems with Voice Control that prevent basic operations.
For example, using Siri on iOS devices, users can place a call by saying "Hey Siri, call…", but there's no "Hey Siri, end call" command. This isn't too much of a problem if the other person on the line can hang up and end the call for you. However, this does mean that if you call a number and go to voicemail, you have to wait until the mailbox times out before the call ends – which is frustrating for both parties.
Speaking of phone calls, Hughes told us Voice Control also lacks any command to accept an incoming call by voice. There is an auto-answer feature but, well, we'll talk more about that later.
The bigger picture
Voice Control's failings aren't down to just one or two bugs or poorly implemented features. Rather, they're symptoms of a larger problem stemming from Apple not quite understanding how people with disabilities or limited mobility use their Macs and iOS gear.
Take, if you will, the auto-answer function on the iPhone we mentioned above. This feature lets you receive calls without touching the screen, which is critical for folks who lack mobility or the ability to use their hands. Unfortunately, there is no option to enable auto-answer with your voice, meaning folks have to rely on – you guessed it – the touchscreen to turn it on.
Or, there's the issue that has been the bane of many a Brit who tries to interact with voice-recognition tech: dialects and accents. As with so many voice-control services, Apple's is tuned primarily for an American voice. Hughes, like other Brits, often runs into issues where Siri doesn't recognize commands or clearly hear his input.
This is made far worse by one key missing feature: the ability to fine-tune recognition and train the system on specific words. While vocabulary can be added, some words and names cannot be properly learned, it seems. Hughes said this is a problem when, for example, he wants to type the name of a Polish friend that Siri can't recognize.
"These are 'below the radar' features that won't bother many people but will be incredibly important for physically disabled people, and they are things I have been campaigning for the past couple years," he said. "Having them can literally make or break my day, my life even."
All of these problems start to add up, Hughes told us, to a highly frustrating experience for people who last year were really looking forward to iOS and Mac products that would work well for everyone, regardless of how they control their hardware.
"I feel Apple has behaved in a lip-service, patronizing-kind-of-way with Voice Control. It is not worthy of the Apple name or brand, it really is a very poor application and they need to fix it and invest in it if it is to be usable," he said. "Until they do, they have no right to claim how accessibility-friendly they are as a company, which they do very often."
This is all made worse by the fact that Apple is the only real game in town on its own kit, since market leader Nuance bailed from the Mac platform in 2018.
Hope on the horizon?
To be fair to Apple, there were a few accessibility-related features shown off at this year's WWDC. It's also hoped Apple engineers will iron out the above flaws, and related shortcomings, in the upcoming iOS 14 and macOS 11.0 Big Sur releases. What is needed to ultimately solve the problem, Hughes said, is a shift in the way software development is approached, and not only by Apple.
"When I complain about accessibility on Apple devices, Android users say 'why don’t you get Android?' And Apple users contradict that and say Apple has a great record on accessibility," Hughes said. "My own view is if every company started with inclusive design we would be in a better place." ®