One of the perils of launching a clever new feature at the zoo called Mobile World Congress is that it gets lost in the noise.
With its P10, one of the three main flagships launched at the show, Huawei quietly introduced a new feature that few people even noticed. It's abolished the Navigation Bar by folding its three main functions into a single sensor button. In the small world of mobile UX, this is big news: the Navigation Bar has barely changed in almost a decade. As I've discovered, this changes how you use Android quite significantly. To see why, let's recap how two billion people have grappled with Android so far.
The Huawei P10 Plus
Buttons. We got buttons
Google's first version of Android was basically a BlackBerry clone, and it had been under development for three years when, in January 2007, Apple unveiled the iPhone. The Android team made a dramatic pivot to emulate Apple's new UI standard, a little too closely for Steve Jobs' liking. Android's UI evolved from "mainly QWERTY with some touchscreen" to "mostly touchscreen" to "full touch" very quickly, but retained a few BlackBerry quirks. BlackBerries had a back key and a menu key, and so did the first full-touch Androids. Second-generation devices, such as the HTC Hero, retained these buttons: the Hero had six hardware keys (home, menu, back, search and two call keys: the old "send" and "receive").
Did you know it's an Overview key, not a Task Switcher? Well, Google says so.
Given the huge pent-up demand for a modern phone experience (Apple was rationing its iPhone through territorial carrier exclusives, and neither Nokia nor BlackBerry could produce anything competitive), Android became a smash hit. Gradually the buttons began to vanish. Designers honed the keys down to three or four: invariably some permutation of home, menu, back and search. Samsung's runaway success with the S and S2 (menu – home – back – search) proved you didn't need quite so many.
Samsung's Galaxy S2 cut down the number of buttons
And since then there have only really been two changes. Android 4.0 arrived in late 2011, allowing designers to dispense with dedicated hardware buttons and replace them with an onscreen navigation bar, saving money and reducing the size of the bezel. And the menu key became largely superfluous, replaced by a task switcher.
That brings us to the generic slab we have today. Only Samsung has persisted with a real hardware button. However, the shift to onscreen navigation has had a few downsides: it's rather more fiddly for the user, and it eats up pixels that would otherwise go to the application. Huawei's idea is to add swipe gestures to a front fingerprint sensor. The onscreen navigation bar is still there if you don't want to use the sensor, and you can restore it, but it's turned off by default.
So did it pass a real-life test? There was only one way to find out.
Huawei had already experimented with loading gestures on to the fingerprint sensor in its Honor 7, a breakthrough device for the company here. A down swipe would pull down the Notifications shade, which is one of the most common actions on a modern Android phone.
The P10's fingerprint sensor understands three gestures when in general use. A long press returns you home. A tap takes you back. And a swipe brings up the task switcher (note for pedants: according to Google it's now officially called the "Overview" button, but most phone OEMs ignore this).
There's no way to customise this in current software builds: you get what you're given.
Coming straight from another Huawei phone, the Mate 9, I immediately missed its rear-mounted sensor, just as Tim Anderson, who wrote our Hands On with the P10, did. Putting the fingerprint sensor on the back isn't popular with everyone, and it makes using a flip case a bit awkward. But I'd very quickly become accustomed to picking the phone up, unlocking it, and using the down swipe to see new notifications.
I suspect most people will discover the feature in the P10 by accident. They'll accidentally tap or swipe the sensor and find something unexpected has happened. And this is what happened to me. Programming my muscle memory to use the sensor for navigation took a bit more than a day. Very gingerly at first, you find a tap sends you back. And it's so much quicker than shifting focus to "find the back key" and carefully pressing. Your brain is doing less. It's much less disruptive.
Using a swipe to task switch takes a bit longer than tap-to-go-back as it's a kind of toggle. Swipe the sensor once to bring up the task switcher, and again to dispense with it.
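The mapping above amounts to a tiny state machine: two stateless gestures (tap for back, long press for home) plus one toggle (swipe for Overview). A minimal sketch, purely illustrative – all names here are invented, and Huawei's actual implementation lives in firmware, not in app-level code like this:

```python
class SensorNavigator:
    """Hypothetical model of the P10's sensor-gesture dispatch, as
    described above. Not a real Huawei or Android API."""

    def __init__(self):
        self.overview_open = False  # the swipe gesture acts as a toggle

    def on_gesture(self, gesture: str) -> str:
        if gesture == "tap":
            return "back"          # tap takes you back
        if gesture == "long_press":
            return "home"          # long press returns you home
        if gesture == "swipe":
            # one swipe opens the task switcher (Overview), another dismisses it
            self.overview_open = not self.overview_open
            return "open_overview" if self.overview_open else "close_overview"
        return "ignored"

nav = SensorNavigator()
print(nav.on_gesture("tap"))    # back
print(nav.on_gesture("swipe"))  # open_overview
print(nav.on_gesture("swipe"))  # close_overview
```

The toggle is the only piece of state, which is why the swipe feels slower than the tap: your brain has to track whether the switcher is currently up.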
There are a few downsides to the new approach, just as there are downsides to any design decision. On previous Huawei phones, a swipe on the onscreen navigation bar invoked a "reachability" function, created to make the top of the device accessible in one-handed use. With no onscreen navigation bar, this is now invoked by a diagonal swipe from one of the bottom corners of the screen. Which hardly ever works. And the notifications shade remains as hard to reach one-handed as on other phones. Could Huawei have overloaded the sensor with another up/down gesture? Perhaps they tried and users got confused. Or perhaps the sensor got confused. I'll try to find out.
Android dominates mobile, running on over 80 per cent of new devices shipped – but very little changes from year to year. It's nice to see manufacturers trying something a little different.
Nokia's abandoned all-gesture Harmattan UI dispensed with navigation buttons completely. Model: Nokia N9
This user regrets that the ground-up, all-swipe approach to design expressed in the Nokia N9 and BlackBerry's BB10 never caught on. Neither required any onscreen buttons at all. Android evolved so rapidly between 2009 and 2011 that there was no opportunity for a rethink, and "unused" gestures were quickly adopted by application developers.
More than 30 years after Microsoft introduced Windows, it still supports Alt-F4 to close a window, just as IBM said it must in 1987. ®