Sensors, not CPUs, are the tech that swings the smartphone market

The more your phone knows about the world, the more useful - and invasive - it becomes


A computer without sensors is a pitiful, useless thing. Keyboards are sensors, as are mechanical-optical paper-tape readers, magnetic heads on storage discs, and the logic scanning for ones and zeroes on an Ethernet interface. Everything a computer does - outside of calculations - involves a sensor.

Despite this, we tend to judge our computers by their CPUs rather than by their complement of sensors - or we did, until the smartphone came along. Although the CPU is important on a smartphone - my iPhone 6S Plus is faster than any desktop I’d purchased before this year - the raw grunt at the core only matters if the mobile has the right complement of sensors. The history of the smartphone is an arms race, fought with sensors.

The most important sensor on a smartphone - its raison d'être - is its microphone. (Or was, back when we used our smartphones to make voice calls.) A video camera came along quickly. Apple cleverly added a proximity sensor - infrared reflected off skin - so the smartphone screen dims (saving battery) when cradled against your head.

Apple’s competitors, looking for a bigger slice of the smartphone market, upped the ante with sensors of their own. Before it imploded, Nokia built high-quality, ultra-high-resolution cameras into its smartphones. And, from the first iPhone, smartphones have shipped with accelerometers - gyroscopes followed a few years later - so they can detect and adapt to portrait and landscape orientation.

While that last may seem a trifling addition, it opened the smartphone to something completely unexpected.

Flash back a quarter of a century: I’m sourcing components for a consumer virtual reality system. An accelerometer is an absolute necessity in a head-mounted display, because it senses the motion of the head. Accelerometers exist in silicon, but priced at US$25 apiece, their only customer is the automotive industry, which uses them to trigger airbag deployment in a crash.

In the end, I invented my own sensor, because silicon accelerometers cost too much.

A few hundred million smartphones later, accelerometers and gyroscopes have become cheap as chips. Literally. From twenty-five dollars to less than twenty-five cents, the conjunction of Moore’s Law and Steve Jobs made these sensors cheap and abundant.

With many smartphones using high-quality accelerometer/gyroscope sensors, the groundwork had been laid for Google’s Cardboard - really no more than a cheap set of plastic lenses set at the right distance from a smartphone screen. Everything else about the Cardboard experience happened inside the smartphone - because the smartphone suddenly had the right suite of sensors to generate a head-tracking display.

Theoretically, Google’s Cardboard should give you the same smooth virtual reality experience as Samsung’s Gear VR. But it’s like chalk and cheese: Cardboard does the job, but it always feels as though you’re fighting the hardware, where Gear VR feels as comfortable as an old shoe.

The reason for that lies with the sensors built into Gear VR. Oculus CTO John Carmack worked with Samsung to specify an accelerometer/gyroscope sensor suite that could feed Samsung's flagship Galaxy S6 smartmobe with a thousand updates a second. The stock sensors on a typical smartphone - even the very powerful Galaxy S6 - come nowhere near that.

Head tracking can only be as good as the sensors used to track the head. The proof of this is the difference between Galaxy S6 in Cardboard, and Galaxy S6 in Gear VR - try both and see for yourself.
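The arithmetic behind that difference is simple: an orientation estimate is integrated from discrete sensor samples, so the estimate trails the true head pose by something on the order of one sample interval - roughly 10 ms at a hundred updates a second, 1 ms at a thousand. A minimal sketch of the kind of gyro/accelerometer fusion involved (a complementary filter, with every input value invented for illustration - real trackers use far more sophisticated fusion):

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyro and accelerometer samples into one pitch-angle track.

    gyro_rates:   angular velocity per sample, in rad/s (hypothetical data)
    accel_angles: pitch derived from the gravity vector each sample, in rad
    dt:           sample interval in seconds (0.001 at 1,000 updates/sec)
    """
    angle, track = accel_angles[0], []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        # Integrate the gyro for fast response; blend in the
        # accelerometer's gravity reading to cancel gyro drift.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        track.append(angle)
    return track

# A head turning at a steady 1 rad/s for 100 ms, sampled at 1 kHz:
dt, n = 0.001, 100
track = complementary_filter([1.0] * n, [i * dt for i in range(n)], dt)
# the estimate trails the true 0.1 rad by about one sample interval
```

Halve the update rate and the tracking lag doubles - which is why Carmack pushed Samsung for a thousand updates a second.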

This is one bleeding edge in the smartphone sensor arms race. Within the next eighteen months, every high-end smartphone will specify incredibly sensitive and fast accelerometers and gyroscopes. Smartphones will work well both in the palm of your hand and when mounted over your eyes. Every major manufacturer will have their own Gear VR-like plastic case for wearing their latest top-of-the-line handset. Except at the very high end - the province of serious gamers and information designers - smartphones and VR will become entirely interchangeable.

Studded with sensors, our smartphones become increasingly sensitive. My iPhone tracks every step I take and every flight of stairs I climb, courtesy of a dedicated motion sensor. I can track my energy output against my weight in Apple’s Health app, and I don’t need a Fitbit for that - my smartphone is enough.
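At heart, step counting is just peak detection on the accelerometer's magnitude stream: gravity contributes a constant ~9.8 m/s², and each footfall spikes above it. A toy sketch - the threshold and refractory window here are invented for illustration, and this is emphatically not Apple's actual algorithm:

```python
def count_steps(magnitudes, threshold=10.5, refractory=20):
    """Count steps in a stream of accelerometer-magnitude samples (m/s^2).

    A step is a sample crossing `threshold` (gravity sits near 9.8, so a
    footfall spikes above it), with a `refractory` window of samples so a
    single stride isn't counted twice. Toy values, chosen for illustration.
    """
    steps, cooldown = 0, 0
    for m in magnitudes:
        if cooldown > 0:
            cooldown -= 1          # still inside the current stride
        elif m > threshold:
            steps += 1             # footfall detected
            cooldown = refractory
    return steps

# Five simulated footfalls spaced 60 samples apart on a 9.8 baseline:
trace = [9.8] * 300
for i in range(0, 300, 60):
    trace[i] = 12.0
```

Dedicated motion co-processors run exactly this kind of loop continuously, at a tiny fraction of the power the main CPU would burn doing it.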

But sensitivity also has its downside: in 2014, security researchers at Stanford showed that a smartphone's motion sensors can be used to ‘listen’ to conversations happening around the handset, because those sensors pick up the air-pressure vibrations of a voice.
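The attack works because even a motion sensor sampling at a couple of hundred hertz captures energy at the bottom of the vocal range. An illustrative sketch, using the Goertzel algorithm to measure the power at a single frequency in a simulated sensor trace - the signal values are invented, and a real attack needs vastly more signal processing than this:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power at one frequency bin of a sensor trace (Goertzel algorithm)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)   # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

# Simulated trace: a 200 Hz motion-sensor feed carrying a faint 80 Hz
# vibration (roughly the fundamental of a low human voice).
rate, n = 200, 400
trace = [0.01 * math.sin(2 * math.pi * 80 * t / rate) for t in range(n)]
assert goertzel_power(trace, rate, 80) > goertzel_power(trace, rate, 30)
```

A voice's fundamental stands out clearly against the quiet bins - which is all an eavesdropper needs to start reconstructing speech.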

Back during the Cold War, the Soviets were caught out shining laser beams onto the windows at the White House, reading voices out of the reflections. The White House responded by pointing speakers at their windows, playing music just loud enough to drown out any other signal. We may need a new app for our smartphones, one that keeps just enough music piping out its speaker to confound anyone using our newly sensitive accelerometers against us. ®

