Dems are at it again, trying to break open black-box algorithms

Opening up code used in criminal prosecutions for scrutiny? But where's the text-to-vid hype and doomsaying?

Democratic lawmakers once again have proposed legislation to ensure that the software source code used for criminal investigations can be examined and is subject to standardized testing by the government.

On Thursday House Representatives Mark Takano (D-CA) and Dwight Evans (D-PA) reintroduced the Justice in Forensic Algorithms Act of 2024, a draft law that would bar trade-secret claims from being used to prevent defense attorneys from reviewing source code relevant to criminal cases, and would establish a federal testing regime for forensic software.

Trade-secret privileges of software developers should never trump the due process rights of defendants

The bill, previously introduced in 2019 and 2021 without success, aims to guarantee that criminal defendants have the opportunity to assess the fairness of software used against them.

Often this isn't the case, because makers of forensic software can resist public review of their source code by claiming it constitutes a trade secret.

"As the use of algorithms proliferates in the prosecution of Americans, we must ensure that they can see and challenge black boxes that could determine if they are convicted," said Takano in a statement. "The trade secrets privileges of software developers should never trump the due process rights of defendants in the criminal justice system."

And yet they do. Northpointe, the developer of a system called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), used for calculating recidivism risk to inform pre- and post-trial decisions, considers its system to be proprietary and has refused to reveal how it works.

"As a privately developed algorithm, COMPAS is afforded the protections of trade secret law," wrote Andrew Lee Park in a 2019 UCLA Law Review article. "That means that COMPAS's algorithm – including its software, the types of data it uses, and how COMPAS weighs each data point – is all but immune from third-party scrutiny."

This might be tolerable if COMPAS treated everyone fairly, said Park, but research suggests it does not. Specifically, a 2016 analysis by ProPublica found that COMPAS was biased against African Americans and often inaccurate.

Northpointe published rebuttal research claiming its software is fair, and ProPublica countered that it stood by its findings. Either way, making criminal justice decisions without disclosing how those decisions were reached remains problematic.

"We support the transparency and standards requirements in the Justice in Forensic Algorithms Act," said EFF staff attorney Hannah Zhao told The Register. "Criminal defendants and the public have a right to examine the algorithms being used to put people behind bars."

Defendants have argued that being denied access to such software's source code violates the Sixth Amendment right to confront one's accuser, an issue raised unsuccessfully in an appeal [PDF] to overturn the 2015 murder conviction of John Wakefield.

In a December 2023 law review article titled "Algorithmic Accountability and the Sixth Amendment: The Right to Confront an Artificial Witness," University of Baltimore School of Law student Dallon Danforth argues the court system will have to resolve the tension between the right to confront an accuser and the right to protect intellectual property.

The Justice in Forensic Algorithms Act of 2024 has something to say about that. Too bad it probably won't make it through both the House and Senate this time around either. ®
