Scotland wins pals with pledge not to keep hold of innocents' mugshots and biometric data

So why can't Whitehall do the same? – campaigners

The Open Rights Group has backed the Scottish government's plans to immediately delete mugshots at the end of legal retention periods – something Whitehall said is impossible in its own systems.

The Scottish government is consulting on proposals to improve oversight of the use and retention of biometric data, which would see the nation appoint its first biometrics commissioner.

They would be responsible for overseeing adherence to a Code of Practice that sets out the rules on how long authorities can keep DNA, fingerprints and custody images.

The code (PDF), which has been published as part of the consultation, is clear that retaining biometric data interferes with people's right to privacy, and that "the obvious approach is to have a presumption in favour of deletion following the expiry of any minimum retention period as prescribed in law".

As such, all data must be deleted as soon as the relevant retention period has passed – and authorities must ensure records are deleted from both the primary database and any other databases to which they are replicated.
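The deletion rule above can be sketched in a few lines. This is a hypothetical illustration only – the record structure, field names and `purge_expired` function are invented for the example, not taken from any Police Scotland system.

```python
from datetime import date

def purge_expired(primary, replicas, today=None):
    """Delete every record whose retention period has ended, from the
    primary database and from every replicated copy."""
    today = today or date.today()
    expired = [rid for rid, rec in primary.items()
               if rec["retention_ends"] < today]
    for rid in expired:
        del primary[rid]              # remove from the primary database
        for replica in replicas:      # ...and from each replica
            replica.pop(rid, None)
    return expired

# Two invented custody records: one past its retention date, one not.
primary = {
    "A1": {"retention_ends": date(2019, 1, 1)},   # period expired
    "B2": {"retention_ends": date(2030, 1, 1)},   # still within period
}
replicas = [{"A1": {"retention_ends": date(2019, 1, 1)}}]

deleted = purge_expired(primary, replicas, today=date(2019, 6, 1))
```

The point the code makes is the one the code of practice makes: expiry triggers deletion everywhere the record lives, with no request from the individual required.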

Establishing such a rule would be in contrast to the situation in England and Wales, where custody images are retained indefinitely in a mammoth database – it now holds 21 million shots of faces and identifying features – and only removed if someone requests it.

This is widely thought to go against a 2012 High Court ruling that said keeping images of presumed innocent people on file was unlawful, and that there must be a distinction between convicted and non-convicted people.

But the Home Office has countered that it isn't technically possible to automatically link or delete records because national and local databases don't talk to each other, and that doing it manually would be too costly to justify. It claimed ongoing efforts to update the systems will address this in the longer-term.

However, its approach – and ministers' attitudes – are a source of constant frustration for activists and opponents.

By comparison, the Scottish government's proposal demands automatic deletion, and indicates that where a system won't allow it, steps must still be taken to protect non-convicted people until legacy systems are replaced.

"In relation to custody images held by Police Scotland on legacy force custody systems where there is no automated means of distinguishing between records of convicted and non-convicted persons, it will suffice for the records within those systems to be protected from access in the operational environment until deleted as those systems are shut down," it said.
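That fallback amounts to a simple gate: where a legacy store can't tell convicted from non-convicted records apart, the whole store is blocked from operational access until it is shut down and its records deleted. A minimal sketch of that rule, with all class and field names invented for illustration:

```python
class LegacyCustodyStore:
    """Hypothetical legacy custody-image store with no automated way to
    distinguish convicted from non-convicted records."""

    def __init__(self, records):
        self._records = records
        # Under the proposed code, such stores default to access-protected.
        self.operational_access_blocked = True

    def fetch(self, record_id, context):
        # Operational queries are refused outright; only non-operational
        # workflows (e.g. audit, deletion) may touch legacy records.
        if context == "operational" and self.operational_access_blocked:
            raise PermissionError("legacy custody records are access-protected")
        return self._records.get(record_id)

store = LegacyCustodyStore({"C3": {"image": "mugshot.jpg"}})
```

The design choice here mirrors the code's logic: rather than trying to retrofit per-record distinctions onto an old system, access is denied wholesale until the system is decommissioned.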

Campaigners have welcomed the plan, and urged the Home Office to follow suit.

"Open Rights Group called for rules establishing an automatic deletion procedure," said the organisation's Scotland director, Matthew Rice. "It is welcome to see them included in the Code of Practice for Scotland and we encourage the rest of the UK to follow Scotland's lead."


Elsewhere in the code, the Scottish government proposed handing out a "biometrics information sheet or leaflet" as a "practical way" to ensure that people whose biometric data is captured understand how it might be used and how they can appeal.

This is another area in which Whitehall has fallen short in the eyes of critics, who argue that most people who have been taken into custody have no idea their images are retained or that they need to request they be deleted.

The Scottish government also noted that the code covered not just DNA, fingerprints and custody images – but also biometric data generated by second-generation tech, like facial recognition software, remote iris recognition and voice pattern analysis.

It said that the code would apply to Police Scotland and the Scottish Police Authority, as well as any bodies that collect data while exercising powers of arrest for devolved purposes – but not for national security or private companies.

However, the Open Rights Group said that this "does not reflect the direction of travel for biometrics in our lives" as there is an increasing amount of surveillance carried out by housing associations and private firms in the retail sector.

"These applications will have an effect on individuals' rights, and the Code should reflect that," Rice said. "At the moment, adoption in other areas such as public bodies or private bodies is on a voluntary basis. The Code should go further and apply to those bodies directly."

Rice also called for the biometrics commissioner to be granted more power should an organisation break the code. As proposed, a breach is not a civil or criminal offence; rather, the role-holder would only be able to issue an "improvement notice".

The consultation closes on 1 September. ®
