UK watchdogs ask how they can better regulate algorithms
We have bad news: you probably can't... but good luck anyway
UK watchdogs under the banner of the Digital Regulation Cooperation Forum (DRCF) have called for views on the benefits and risks of how sites and apps use algorithms.
While "algorithm" can be defined as a strict set of rules to be followed by a computer in calculations, the term has become a boogeyman as lawmakers grapple with the revelation that algorithms are involved in every digital service we use today.
Whether it's which video to watch next on YouTube, which film you might enjoy on Netflix, who turns up in your Twitter feed, search autosuggestions, or what you might like to buy on Amazon – algorithms govern them all and much more.
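At its simplest, a recommendation algorithm is just a ranking function: score every item for a user, sort, take the top few. The sketch below is a toy illustration only – the features, weights, and catalogue are entirely invented, and no real service works this simply:

```python
# Toy recommendation ranking: score items against a user's history, pick the top k.
# Purely illustrative - the signals and weights here are invented.

def score(item, user_history):
    """Score one item: overlap with the user's watched tags, plus raw popularity."""
    similarity = len(user_history & item["tags"]) / max(len(item["tags"]), 1)
    return 0.7 * similarity + 0.3 * item["popularity"]

def recommend(items, user_history, k=2):
    """Return the k highest-scoring items for this user."""
    return sorted(items, key=lambda i: score(i, user_history), reverse=True)[:k]

catalogue = [
    {"name": "cat video",    "tags": {"cats", "funny"},  "popularity": 0.9},
    {"name": "maths talk",   "tags": {"maths", "talks"}, "popularity": 0.2},
    {"name": "cookery show", "tags": {"food", "funny"},  "popularity": 0.5},
]
watched = {"cats", "funny"}
print([i["name"] for i in recommend(catalogue, watched)])
# → ['cat video', 'cookery show']
```

Even this toy shows where the trouble starts: the weights are arbitrary editorial choices, and whoever sets them shapes what everyone sees.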
While that all sounds pretty benign, regulators are hip to the fact that algorithms don't always work for the benefit of consumers.
"Algorithmic systems, particularly modern Machine Learning (ML) or Artificial Intelligence (AI) approaches, pose significant risks if used without due care," the group said. "They can introduce or amplify harmful biases that lead to discriminatory decisions or unfair outcomes that reinforce inequalities. They can be used to mislead consumers and distort competition."
The DRCF – made up of the Competition and Markets Authority (CMA), the Information Commissioner's Office (ICO), and the Office of Communications (Ofcom) – has set out a workplan for the coming year in which it aims to:
- Better protect children online
- Promote competition and privacy in online advertising
- Support improvements in algorithmic transparency
- Enable innovation in the industries they regulate
However, one does not simply pop the hood on an algorithm and say "yep, this checks out." We're talking about vastly complex mathematics whose effects can often only be assessed after the system has done the job it was built for, if you're looking to root out (un)intentional biases and other nastiness.
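That after-the-fact assessment is the essence of an algorithmic audit: you can't always read the maths, but you can measure the outcomes. A minimal sketch of one such check, comparing approval rates across groups, follows – the data and the comparison are invented for illustration, and real audits are far more involved (the "four-fifths" ratio test is borrowed from US employment-law practice, not from any DRCF guidance):

```python
# Toy outcome audit: given (group, decision) pairs from a deployed system,
# compare approval rates across groups. Data here is invented.

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def impact_ratio(rates, group_a, group_b):
    """Ratio of group_a's approval rate to group_b's; values well below 1.0
    (commonly, below 0.8) are often treated as a red flag worth investigating."""
    return rates[group_a] / rates[group_b]

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
rates = approval_rates(decisions)
print(rates)                          # group A approves 2/3, group B only 1/3
print(impact_ratio(rates, "B", "A"))  # → 0.5, well under the 0.8 rule of thumb
```

The point is that none of this requires seeing the proprietary model – only its decisions – which is why auditing outcomes, rather than source code, is where regulators tend to start.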
We're also talking about information that the companies who own them consider proprietary – as evidenced by the lawsuits launched over this fact.
While digital secretary Nadine Dorries – who reportedly asked Microsoft "when they were going to get rid of algorithms" – may not understand them, at least the DRCF has opened the floor to people who... might?
Gill Whitehead, DRCF chief executive, said in a statement: "The task ahead is significant – but by working together as regulators and in close co-operation with others, we intend for the DRCF to make an important contribution to the UK's digital landscape to the benefit of people and businesses online.
"Just one of those areas is algorithms. Whether you're scrolling on social media, flicking through films or deciding on dinner, algorithms are busy but hidden in the background of our digital lives.
"That's good news for a lot of us a lot of the time, but there's also a problematic side to algorithms. They can be manipulated to cause harm or misused because firms plugging them into websites and apps simply don't understand them well enough. As regulators, we need to make sure the benefits win out."
Stefan Hunt, the CMA's Chief Data and Technology Insight Officer, added: "Much work has already been done on algorithms by the CMA, FCA, ICO and Ofcom but there is more to do. We're asking now, what more is needed, including from us as regulators and also from industry?"
As for what that work might be, two discussion papers from the DRCF on the benefits and harms of algorithms, and the landscape of algorithmic auditing and the role of regulators therein can be viewed in detail here.
Both point out: "This discussion paper is intended to foster debate and discussion among our stakeholders. It should not be taken as an indication of current or future policy by any of the member regulators of the DRCF."
It may be that we're just too far gone to do anything useful about harmful algorithms. It's not like the tech giants, for whom proprietary software prints money, are going to suddenly bin the programming that underpins their products and services.
And while one compsci boffin may be an "algorithm expert," there's no guarantee they'd be able to look at Twitter's algorithms and have a clue what any of it means or refers to.
The opportunity to comment is open until June 8. Opinions are to be submitted to DRCF@ofcom.org.uk.
To the regulators, we say good luck. And don't listen to any views from journalists. ®