How would you like a phone that gives you gesture recognition - without needing to buy a new phone?
That's the tantalising prospect offered by a project at the University of Washington (UoW), which uses the existing Wi-Fi capabilities of consumer-grade devices (laptops were used for the research paper) to work out people's movements.
In this paper at Arxiv, Rajalakshmi Nandakumar, Bryce Kellogg and Shyamnath Gollakota of the UoW say that the interaction between bodies and Wi-Fi signals, particularly its effect on the amplitude of the signals, is enough for software to infer the user's gesture.
Their software, dubbed Wi-Fi Gestures, “detects large amplitude peaks caused by human gestures”, the authors write.
“Specifically, as the human moves her arm, the wireless reflections from her arm either constructively or destructively interfere with the direct signal from the Wi-Fi transmitter. This results in peaks and troughs in the amplitude of the received signals”, the paper states. The algorithm they created in their research classifies gestures according to the size and timing of the peaks.
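The idea can be illustrated with a toy sketch (this is not the authors' code; the function names, thresholds, and gesture labels here are invented for illustration): scan the amplitude trace for large local peaks, then classify the gesture from the number and spacing of those peaks.

```python
# Illustrative sketch only: classify a "gesture" from amplitude samples
# by finding large peaks and comparing their timing. Thresholds and
# gesture names are arbitrary assumptions, not values from the paper.

def find_peaks(samples, threshold):
    """Return (index, value) pairs for local maxima above threshold."""
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i] >= samples[i - 1] and samples[i] > samples[i + 1]:
            peaks.append((i, samples[i]))
    return peaks

def classify(samples, threshold=2.0):
    """Toy classifier: one big peak = "push"; two close peaks = "double-tap"."""
    peaks = find_peaks(samples, threshold)
    if not peaks:
        return "no gesture"
    if len(peaks) == 1:
        return "push"
    gap = peaks[1][0] - peaks[0][0]          # samples between first two peaks
    return "double-tap" if gap < 10 else "wave"

# Synthetic amplitude trace: flat baseline with two nearby interference peaks
trace = [1.0] * 30
trace[10] = 3.0
trace[15] = 3.2
print(classify(trace))  # double-tap
```

A real implementation would, of course, work on noisy received-signal-strength readings rather than a clean synthetic trace, and would need smoothing before peak detection.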
The technique works at distances of “up to one [foot]” and the researchers claim 91 per cent accuracy. Because of the relatively short range, they say Wi-Fi Gestures is suitable for use even in an environment like an office, especially if a “start gesture” is implemented to cull false positives.
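The “start gesture” idea amounts to simple gating: ignore all detected gestures until a distinctive preamble gesture arms the system, then accept exactly one command. A minimal sketch of that logic (the class name, the `"preamble"` label, and the one-shot behaviour are assumptions for illustration, not details from the paper):

```python
# Toy "start gesture" gate: gestures are ignored until a preamble gesture
# arms the system; the next gesture is then accepted and the gate re-locks.

class GestureGate:
    def __init__(self):
        self.armed = False

    def feed(self, gesture):
        """Return the gesture if the gate is armed, else None."""
        if not self.armed:
            if gesture == "preamble":
                self.armed = True        # start gesture seen: arm the gate
            return None                  # everything else is a false positive
        self.armed = False               # accept one command, then re-lock
        return gesture

gate = GestureGate()
print(gate.feed("push"))      # None  (stray motion, culled)
print(gate.feed("preamble"))  # None  (arms the gate)
print(gate.feed("push"))      # push  (accepted)
```

The design choice is that accidental arm movements near the device produce no action unless deliberately preceded by the agreed start gesture.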
The University of Washington's earlier gesture-recognition system, WiSee, however, was based on the phase of signals, and relied on devices big enough to have multiple antennas.
Because Wi-Fi Gestures only needs amplitude to work with, it should be possible to implement it as a smartphone app, and the paper notes that the range is great enough that it could be used even if the device is in a backpack or a pocket (so you could, for example, answer the phone without reaching for it). ®