Collar the lot of us! The biometric delusion

Optimism beats evidence in the drive to fingerprint the world


Special report Until the 16th century, educated opinion, as codified by Ptolemy, held that the Earth is at the centre of the universe. Then along came Copernicus.

On 29 June 2009, the Identity & Passport Service (IPS) published their latest paper on the National Identity Service (NIS). According to Safeguarding Identity (pdf), "the vision for the NIS is that it will become an essential part of everyday life, underpinning interactions and transactions between individuals, public services and businesses and supporting people to protect their identity" (para.3.32).

Early adopter

Placing the NIS at the centre of social interaction like this makes IPS about 92 million miles wide of the mark.

How is the NIS supposed to achieve IPS's vainglorious objective? “Our intention is that, at the core of the information used to prove identity will be biometrics, such as photographs and fingerprints” (para.3.6).

It follows that, in the eyes of IPS, the NIS stands or falls on the reliability of the biometrics chosen. If they don’t work, the NIS can’t work.

When he was Home Secretary, David Blunkett told us that biometrics "will make identity theft and multiple identity impossible. Not nearly impossible. Impossible". That is the commonly held view.

It may be the commonly held view, but is it correct?

Ask the Home Office's scientists

Not everyone agrees. Dr Tony Mansfield and Mr Marek Rejman-Greene, for example, opened their February 2003 report to the Home Office by saying the exact opposite: “Biometric methods do not offer 100% certainty of authentication of individuals” (para.4).

Tony Mansfield specialises in biometric device testing at the National Physical Laboratory and Marek Rejman-Greene is the Senior Biometrics Advisor at the Home Office Scientific Development Branch (HOSDB). Who is right? Those two individuals? Or David Blunkett and all the politicians and civil servants and journalists in the UK and abroad who agree with him?

For Copernicans, the answer depends on the evidence.

There follows a review of the biometrics evidence that has come to light over the past six years.

Tony Mansfield and Marek Rejman-Greene’s report makes the distinction between two different jobs for biometrics – identification (section 2.1) and verification (section 2.2).

Identification is the job of proving that each person has one and only one entry on the population register. Professor John Daugman, the father of iris-based biometrics, demonstrates that this job is not feasible for large populations.
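The arithmetic behind that infeasibility claim is straightforward. De-duplicating a register means comparing every enrolment against every other, and even a tiny false match rate is multiplied by an enormous number of comparisons. The sketch below uses illustrative figures, not Daugman's published numbers: a matcher with a one-in-a-million false match rate applied to a UK-scale register of 60 million people.

```python
def expected_false_matches(population: int, false_match_rate: float) -> float:
    """Expected number of spurious matches in a full de-duplication sweep,
    i.e. comparing every enrolment against every other enrolment."""
    comparisons = population * (population - 1) / 2  # all unordered pairs
    return comparisons * false_match_rate

# Illustrative figures only: 60 million enrolments, FMR of 1 in a million.
print(expected_false_matches(60_000_000, 1e-6))  # ~1.8 billion false matches
```

On those assumptions the system would flag roughly 1.8 billion spurious matches, each one a potential manual investigation, which is why identification at national scale demands a matcher with an error rate many orders of magnitude better than verification requires.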

