Will Police Scotland use real-time discrimination-happy face-recog tech? Senior cop tells us: We won't... for now

After panel urges halt to live matching, top brass says it would only be 'used in an intelligence-led, targeted way'


A Scottish Parliamentary panel has urged police not to invest in live facial-recognition technology, and the plod seem to agree.

In a report published this week, the Justice Sub-Committee on Policing noted that today's real-time facial-recognition software discriminated against "females, and those from black, Asian and ethnic minority communities." In other words, it said, the vast majority of facial-recognition algorithms struggle to identify women and people of color as accurately as they identify white men.

"For this reason, the sub-committee believes that there would be no justifiable basis for Police Scotland to invest in this technology," it continued. "We therefore welcome confirmation from Police Scotland that they have no intention to introduce it at this time."

Live facial recognition, to be clear, refers to software that captures camera footage of a person and, in real time, runs it against a database to look for a match. The sub-committee confirmed Police Scotland does use facial recognition retrospectively, meaning it takes stills previously obtained from CCTV footage and other cameras, and searches those for matches.
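
For illustration only, here is a minimal, hypothetical sketch of the matching step both modes boil down to, assuming faces have already been reduced to fixed-length embedding vectors; the watchlist, threshold, and embeddings below are invented for the example and bear no relation to any system Police Scotland might buy.

```python
# Hypothetical sketch: matching a face embedding against a watchlist.
# Live recognition would run this search on every incoming camera frame;
# retrospective recognition runs the same search once, on a still already
# extracted from CCTV footage.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_watchlist(probe: np.ndarray, watchlist: dict, threshold: float = 0.8):
    """Return the ID of the closest watchlist entry scoring above the threshold, if any."""
    best_id, best_score = None, threshold
    for person_id, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Invented 128-dimensional embeddings standing in for enrolled faces.
    watchlist = {f"person_{i}": rng.normal(size=128) for i in range(3)}
    # A noisy capture of person_1, as a camera still might produce.
    probe = watchlist["person_1"] + rng.normal(scale=0.05, size=128)
    print(search_watchlist(probe, watchlist))  # expected: person_1
```

In a sketch like this, the accuracy concern maps onto that threshold: set it too loose and the search returns false matches, and if the embeddings themselves are systematically worse for some groups of people, no threshold fixes that.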

The sub-committee recommended that the Scottish Police Authority and a future Scottish Biometrics Commissioner review procedures for using face-matching AI, now and going forward. The office of Scottish Biometrics Commissioner was proposed in a bill last year, and the appointment has yet to be confirmed.

The sub-committee's report was sparked by Police Scotland's announcement that it hoped to employ live facial recognition by 2026. Although parliamentarians disapproved of investing in the technology right now, they did not rule out the cops using it in the future.

Instead, the panel insisted that if the software is to be used in the future, it must be accurate enough to avoid false positives for women and people of color. Police Scotland would also have to demonstrate public consent for the use of real-time facial-recognition technology before introducing it. In the meantime, Scotland's Cabinet Secretary for Justice must provide legal and regulatory oversight to help Police Scotland legitimize the technology before it is implemented.

In any case, the tech is not on the immediate horizon in Scotland.

"Police Scotland is not using, trialing or testing live facial recognition technology,” Assistant Chief Constable Duncan Sloan, lead for Major Crime and Public Protection, confirmed to The Register on Wednesday.

"We are keeping a watching brief on the trialing of the technology in England and Wales. Prior to any such technology being implemented we would carry out a robust program of public consultation and engagement around the use of this technology, its legitimacy, viability, and value for money.

"This would include taking advice and guidance on ethical, human rights and civil liberties considerations. In my view, the use of such technology would not be widespread, but would be used in an intelligence-led, targeted way." ®

