Tough Euro crackdown on AI use passes key vote

It's a familiar story: Legislation versus rapidly evolving technology

A sweeping European Union-wide AI regulatory bill is one step closer to adoption, with the European Parliament's Internal Market and Civil Liberties Committees voicing their approval by an overwhelming majority. Should the bill become law, it could lead to tough times for AI operators in the economic bloc.

Passed 84 to 7 (with 12 abstentions), the EU's Artificial Intelligence Act imposes progressively stricter rules on AI providers based on a system's perceived level of risk. Under those regulations, AI systems the EU deems to carry "an unacceptable level of risk to people's safety" would be banned outright.

Such technologies initially included social scoring systems and technology that uses subliminal techniques to manipulate users, but the committees voting to send the Act along to a plenary vote made a number of amendments.

Committee MEPs added a variety of biometric identification systems to the "unacceptable" category – including real-time biometric ID systems used in public spaces, systems that group people using sensitive categories, predictive policing systems, and emotion recognition systems. Scraping biometric data from social media and CCTV to create facial databases was also added to the list of banned AI practices.

Other changes made in committee saw politicians expand systems classified as "high risk" to include AI models that can "harm people's health, safety, fundamental rights or the environment" and AI systems that could be used to sway voters.

Committees address previous AI Act concerns

Several members of the group working on the AI Act expressed concern last month that general purpose AI systems like ChatGPT were progressing faster than expected, leaving the Act on the back foot when it comes to regulating such tools.

In response, several lines were added to cover foundational models and their providers, requiring that guarantees of "robust protection of fundamental rights, health and safety and the environment, democracy and the rule of law" be coded into their bots. To enforce that, MEPs added requirements that AI providers regularly assess and mitigate risks, and register their products in the EU database.

Generative foundational models like ChatGPT will also face new requirements if the AI Act passes. Transparency rules would require providers to disclose that content was generated by AI, build in rules that prevent models from generating illegal content, and publish summaries of copyrighted data used to train the model. 

The AI Act has been a work in progress since April 2021, and general purpose AI has hardly been the only concern to crop up with the rule since then. Last year, US think tank Brookings said it was worried the rule would lead to open source software efforts in the EU being hamstrung over concerns that devs could be held accountable for misuse of their code. 

In response to that, committee MEPs added exemptions to the Act's rules for researchers and AI components provided under open source licenses. "We are confident our text balances the protection of fundamental rights with the need to provide legal certainty to businesses and stimulate innovation in Europe," said AI Act co-rapporteur Brando Benifei. 

Responding to the committees' passing of the reinforced AI Act, Amnesty International said the move was a significant step forward.

"Today the European Parliament sent a strong signal that human rights must be at the forefront of this landmark legislation, by voting to ban several AI-based practices which are incompatible with human rights," said Amnesty International advocacy advisor Mher Hakobyan.

If it passes an upcoming European Parliament plenary session next month, the Act could become the first set of rules in the world governing the use of AI. From there, it would advance to the European Council for negotiations on the final form of the law. ®
