EyeSight tries to please MWC crowd with touch-free fingering
Next-gen interface tech for TV, PC and mobe
MWC 2013: Israeli gesture tech specialist EyeSight is in Barcelona demonstrating how one finger can take control of a phone, TV or computer without so much as a caress, using the already-present camera.
EyeSight's hand-level gesture detection is already used in TVs from Hisense and phones from ZTE, but the company has refined the technology to permit navigation with a single pointed finger, tracked using the camera already embedded in most devices - enabling the fondleslab to give up being fondled and ending the search for the TV remote control forever.
Recognising a hand isn't that hard - the Xbox Kinect has been doing it for years - but increasing the resolution is more challenging. The Kinect can't see fingers at all, resulting in the weird selection process which involves holding one's hand stationary for a few seconds to trigger a "click". More intuitive is clenching a fist to "click", which is what Samsung's Smart TVs do and how EyeSight's existing deployments work, though on the ZTE phone you have to push towards the screen slightly.
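For the curious, that Kinect-style dwell selection is simple enough to sketch: watch the tracked position and fire a "click" once it has stayed put long enough. The snippet below is a minimal illustration of the idea - the DwellClicker class and its thresholds are our own invention, not EyeSight's or Microsoft's code.

    import time

    class DwellClicker:
        """Fire a 'click' when a tracked point stays within a small
        radius for a set dwell time (the Kinect-style selection)."""

        def __init__(self, radius_px=20, dwell_secs=2.0):
            self.radius_px = radius_px
            self.dwell_secs = dwell_secs
            self.anchor = None   # position where the current dwell started
            self.since = None    # when the point settled there

        def update(self, x, y, now=None):
            """Feed the latest tracked position; returns True on a click."""
            now = time.monotonic() if now is None else now
            moved = (
                self.anchor is None
                or (x - self.anchor[0]) ** 2 + (y - self.anchor[1]) ** 2
                > self.radius_px ** 2
            )
            if moved:
                # The point wandered off: restart the dwell timer here.
                self.anchor, self.since = (x, y), now
                return False
            if now - self.since >= self.dwell_secs:
                self.anchor, self.since = None, None   # re-arm after firing
                return True
            return False

Fed one position per camera frame, a steady hand trips the click after a couple of seconds while any movement resets the timer - which is exactly why the approach feels so sluggish next to a fist-clench or finger-crook.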
It might seem pointless to use gestures on a device you're already holding, but waving a hand in front of a smartphone to scroll sideways is surprisingly intuitive. It need not replace touch - everyone likes to caress now and then - but gestures provide an additional way of interacting.
Right now gesture systems are novelties - as proven by the excessively expansive gestures most interfaces require - but EyeSight has its technology working on an unmodified Nexus 7 tablet, using the integrated camera to track a single finger, which can be crooked to trigger a "click".
EyeSight talks a lot about the natural appeal of navigation by pointing, but that's something of a misnomer: you don't really point at the interface at all; instead you present your finger to the camera by pointing at the ceiling. The software tracks the finger well, and "clicking" worked perfectly when we tried it, but the experience felt much more like using an invisible, frictionless touchscreen than navigating by pointing.
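That "invisible touchscreen" feel is easy to picture in code: the fingertip's position in the camera frame is simply scaled onto screen coordinates, mirrored so that moving your hand right moves the cursor right. The function below is our guess at the general approach, not EyeSight's implementation.

    def camera_to_screen(cx, cy, cam_w, cam_h, scr_w, scr_h):
        """Map a fingertip position in camera pixels to screen pixels.

        The x-axis is mirrored because the camera faces the user:
        without the flip, the cursor would move opposite to the hand.
        """
        nx = 1.0 - (cx / cam_w)   # mirror horizontally, normalise to 0..1
        ny = cy / cam_h           # vertical axis already matches
        return int(nx * (scr_w - 1)), int(ny * (scr_h - 1))

    # A fingertip at the centre of a 1280x1024 frame lands mid-screen:
    print(camera_to_screen(640, 512, 1280, 1024, 1920, 1080))  # -> (959, 539)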
We were also disappointed by how far away you had to be to make the system work. EyeSight celebrates the range, pointing out that a 1.3-megapixel camera is good enough to track the user's digits from 5 metres (16ft) off, but it's not often that you want to control a mobile phone, or even a TV, from 5m away. When we brought this up we were told the focal distance of the cameras, and the configuration of the software, could both be changed to enable closer operation, but we didn't get to see that in action.
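Some back-of-the-envelope arithmetic shows why finger tracking at that range is the hard part. Assuming a typical 60-degree horizontal field of view and a 1,280-pixel-wide frame (our assumptions - 1.3MP cameras are commonly 1280x1024, but EyeSight quoted neither figure), the camera at 5m covers nearly 6m of scene, leaving a finger only a handful of pixels wide:

    import math

    def finger_width_px(distance_m, fov_deg=60.0, frame_px=1280, finger_m=0.016):
        """How many camera pixels a ~16mm-wide finger spans at a given
        distance, for an assumed horizontal field of view and frame width."""
        scene_width = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
        return finger_m / scene_width * frame_px

    print(round(finger_width_px(5.0), 1))   # ~3.5 pixels at 5 metres
    print(round(finger_width_px(0.5), 1))   # ~35.5 pixels at arm's length

Pulling a recognisable fingertip out of three-and-a-half pixels per frame is a respectable trick, whatever you think of the use case.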
The other problem is that a traditional computer configuration puts your hands well out of camera shot, so right now the technology is limited to more innovative devices like Lenovo's Yoga uber-slab, which comes with EyeSight pre-installed. Gestures are clearly coming, and in a few years we'll be waving and crooking and fisting at our machines happily, but for the moment the cutting edge is still bleeding a little, even if it can see when it's being given the finger. ®