It's bizarre that we're at a point where reports have to spell out that human rights trump AI

But that's what a UN group has done


The protection of human rights should be front and centre of any decision to deploy AI-based systems, whether they're used as corporate tools, such as in recruitment, or in areas such as law enforcement.

And unless sufficient safeguards are in place to protect human rights, there should be a moratorium on the sale and use of AI systems, while those that cannot comply with international human rights law should be banned outright.

Those are just some of the conclusions of a report prepared for the Geneva-based Human Rights Council (HRC) by the office of the United Nations High Commissioner for Human Rights, Michelle Bachelet.

"The right to privacy in the digital age" [download] takes a close look at how AI – including profiling, automated decision-making, and other machine-learning technologies – affects people's rights.

While the report acknowledges that AI "can be a force for good," it also highlights serious concerns around how data is stored, what it's used for, and how it might be misused.

"AI technologies can have negative, even catastrophic, effects if they are used without sufficient regard to how they affect people's human rights," Bachelet said in a statement.

"Given the rapid and continuous growth of AI, filling the immense accountability gap in how data is collected, stored, shared and used is one of the most urgent human rights questions we face."

The report is critical of the way governments and businesses have "often rushed to incorporate AI applications, failing to carry out due diligence," citing "numerous cases of people being treated unjustly because of AI" including being arrested due to "flawed facial recognition."

In July, the US House Committee on the Judiciary heard how facial recognition technology (FRT) is being used by law enforcement agencies in America. The hearing took testimony from all sides of the debate as legislators seek to balance the benefits of FRT against issues such as the right to personal privacy and wrongful identification.

But it was the personal testimony of Robert Williams – who was wrongly identified, arrested, and detained all because of a "blurry, shadowy image" – that brought the debate into sharp focus.

Indeed, the issue of how AI is used in law enforcement and the criminal justice system has also been keeping the House of Lords Justice and Home Affairs Committee in the UK busy over the summer.

Most recently, Professor Elizabeth E Joh, of the UC Davis School of Law, told the committee that there are concerns over some predictive policing tools.

In some cases, Joh explained, there have been calls for technologies such as facial recognition to be banned, but attempts to do so have been "piecemeal" and not on a national scale. And with respect to some predictive policing tools, she suggested they "may not be as reliable or as effective as promised."

It's a point picked up by the HRC report, which flagged that some predictive tools "carry an inherent risk of perpetuating or even enhancing discrimination, reflecting embedded historic racial and ethnic bias in the data sets used, such as a disproportionate focus of policing of certain minorities."
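
The mechanism is easy to see in miniature. The following deliberately simplified Python sketch (a hypothetical illustration with invented figures, not anything drawn from the report) models a naive "patrol where past arrests were recorded" allocator seeded with data from a district that was historically over-policed; both districts have identical underlying crime rates, yet the historic skew perpetuates itself year after year.

    import random

    random.seed(42)

    districts = ["A", "B"]
    true_crime_rate = {"A": 0.1, "B": 0.1}  # identical underlying rates

    # Historic records: district A was patrolled twice as heavily, so
    # twice as many of its incidents were ever observed and logged.
    arrests = {"A": 200, "B": 100}

    for year in range(5):
        total = sum(arrests.values())
        # Naive "predictive" allocation: 1,000 patrols divided in
        # proportion to past recorded arrests.
        patrols = {d: round(1000 * arrests[d] / total) for d in districts}
        for d in districts:
            # Only patrolled incidents are recorded, so the skewed
            # deployment feeds straight back into next year's data set.
            arrests[d] += sum(random.random() < true_crime_rate[d]
                              for _ in range(patrols[d]))
        print(f"year {year}: patrols {patrols}")

Run it and district A keeps drawing roughly two-thirds of the patrols, and so keeps generating roughly two-thirds of the new arrest records, purely because it did so in the past.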

The report recommends making human rights "a central objective" of the "development, use and governance of AI."

It also calls on states to ban "AI applications that cannot be operated in compliance with international human rights law" and to "impose moratoriums on the sale and use of AI systems that carry a high risk for the enjoyment of human rights, unless and until adequate safeguards to protect human rights are in place."

Bachelet said: "We cannot afford to continue playing catch-up regarding AI – allowing its use with limited or no boundaries or oversight, and dealing with the almost inevitable human rights consequences after the fact. Action is needed now to put human rights guardrails on the use of AI, for the good of all of us."

Asked to comment on the HRC report, the UK Home Office declined to be drawn on specifics, but insisted that policy on issues such as facial recognition remains open to change and that it is keen to ensure a "consistent approach is taken nationwide."

It pointed to last year's ruling by the Court of Appeal, which found that South Wales Police broke the law with an indiscriminate deployment of its automated facial-recognition technology in Cardiff city centre between December 2017 and March 2018.

As a result, the Home Office is updating its Surveillance Camera Code of Practice to reflect the judgment; the revised code will then face Parliamentary scrutiny.

A Home Office spokesperson told us: "This government is delivering on a manifesto commitment to empower the police to use new technologies, like facial recognition to help identify and find suspects, to protect the public.

"There is a robust legal framework for the use of such technology, in keeping with last year's Court of Appeal ruling. The independent College of Policing has been consulting extensively on national guidance to ensure a consistent approach is taken nationwide."

No one from the House of Lords Justice and Home Affairs Committee was available to comment at the time of writing. ®
