Boffins spy on iPhone screens from 200ft away

Shoulder surfing goes high-tech


Vid North Carolina boffins have been watching text entered into iPhones from 60 meters (197ft) behind the shoulders of users – or from the front, by reading the reflections in the users' glasses.

The process uses a standard video camera. It is even possible with an iPhone's camera, though the range decreases. The technique relies on the iPhone's habit of popping up an enlarged version of each character as it is typed. Once the video has been fed through the researchers' image-stabilisation software, then run through optical character recognition and natural-language analysis, the meaning emerges, as the researchers' (silent) demonstration video shows.
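To give a flavour of the shape of such a pipeline, here is a minimal sketch in Python using OpenCV and Tesseract (via pytesseract). It is illustrative only, not the researchers' software: it assumes the footage has already been stabilised and cropped so the key pop-up appears at a known screen position, and the popup_region coordinates, the Tesseract settings, and the naive de-duplication are all assumptions. The paper's natural-language analysis step, which cleans up OCR errors, is omitted.

    # Illustrative sketch only; assumes OpenCV and Tesseract (pytesseract)
    # are installed, and that the input video is already stabilised.
    import cv2
    import pytesseract

    def recover_keypresses(video_path, popup_region):
        """Scan a video for the enlarged key pop-ups and OCR each one.

        popup_region is a hypothetical (x, y, w, h) crop covering where
        the magnified character appears in the stabilised footage.
        """
        x, y, w, h = popup_region
        keys = []
        cap = cv2.VideoCapture(video_path)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            crop = frame[y:y + h, x:x + w]
            gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
            # Binarise to make the pale pop-up bubble easier for OCR.
            _, thresh = cv2.threshold(gray, 0, 255,
                                      cv2.THRESH_BINARY | cv2.THRESH_OTSU)
            # Ask Tesseract for a single character per frame (--psm 10).
            text = pytesseract.image_to_string(
                thresh, config="--psm 10").strip()
            # Naively merge consecutive frames showing the same pop-up.
            # (This would miss double letters; a real system needs more.)
            if text and (not keys or keys[-1] != text):
                keys.append(text)
        cap.release()
        return "".join(keys)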

Apple's iPhone isn't the only smartphone that provides visual feedback by popping up an enlarged version of the character pressed, but the technique won't work on handsets that don't. The researchers also admit that alternative text-entry methods, such as Swype, will confound the recognition, though those are used by only a minority.

There are some other videos showing how reflections can be read, and the accuracy possible, on the boffins' own site. Their full paper (PDF, interesting, but very mathematical in places) demonstrates that, with a decent video camera, they were able to collect very accurate renditions of what was typed from a considerable distance.

It seems the biggest limitation was motion blur. Stabilisation can only do so much, and since each character pops up on screen only for a moment, a single blurred frame can render it unreadable. That's easily addressed with better video equipment and better analysis, but this research was deliberately based on standard kit.
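One common way to cope with that, sketched below, is simply to discard blurred frames before attempting OCR, using the variance-of-Laplacian sharpness measure. This is a standard computer-vision blur test, not necessarily what the researchers did, and the threshold of 100.0 is a hypothetical value that would need tuning per camera and distance.

    # Standard sharpness test for dropping motion-blurred frames;
    # illustrative only, and the threshold is an assumed value.
    import cv2

    def is_sharp_enough(frame, threshold=100.0):
        """Return True if the frame looks sharp enough to OCR."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # A blurred frame has few strong edges, so the Laplacian
        # response is flat and its variance is low.
        return cv2.Laplacian(gray, cv2.CV_64F).var() > threshold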

One can imagine Jason Bourne using such a technique, and it's interesting to hear that it is possible. It might pay to think about your surroundings when entering a password, but in reality there are already plenty of other threats to be concerned about without worrying about what someone might pick up reflected in your sunglasses. ®
