
Tougher rules on targeted ads, deepfakes, crafty web design, and more? Euro lawmakers give a thumbs up

'This is strongly limiting the scope of maneuver by Big Tech', expert tells El Reg

Analysis The European Parliament has adopted a set of amendments to the Digital Services Act (DSA) that makes the pending legislation even more protective of personal privacy and requires businesses to give greater consideration to advertising technology, user choice, and web design.

The DSA, advanced by the European Commission in late 2020, aims to police online services and platforms by creating "a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses." It's a set of rules for limiting illegal content and misinformation online and for making digital advertising more accountable and transparent.

It complements the Digital Markets Act (DMA), which focuses on regulating large technology "gatekeepers" like Amazon, Apple, Google, Meta (Facebook), and Microsoft.

Both of these packages of rules – the DSA and the DMA – are expected to take effect in 2023 or thereafter, subject to final approval from the European Parliament and Council.

On Tuesday, Members of the European Parliament (MEPs) voted 530 to 78, with 80 abstentions, to approve the text of the DSA, which will now be subject to negotiation with member states.

"Online platforms have become increasingly important in our daily life, bringing new opportunities, but also new risks," said Christel Schaldemose, an MEP from Denmark, in a statement. "It is our duty to make sure that what is illegal offline is illegal online. We need to ensure that we put in place digital rules to the benefit of consumers and citizens."

The revised DSA rules [PDF] are even stricter in some cases than they were initially, Dr Lukasz Olejnik, privacy researcher and consultant, told The Register in an email. As examples, he pointed to limitations on targeted advertisements and a requirement that deepfakes be labeled.

Recital 52 disallows targeted advertising to minors and prohibits the use of sensitive data (e.g. religion) for targeting adults. The rules also now require the ad repositories maintained by very large platforms to include with archived ads both data on the advertiser "and, if different, the natural or legal person who paid for the advertisement."

In addition, dark patterns have been explicitly forbidden: "Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof ('dark patterns')."

Olejnik observes that the DSA rules formalize a strict interpretation of user consent, outlined more generally in Europe's General Data Protection Regulation (GDPR).

The amended rules state that service providers should refrain from "urging a recipient of the service to change a setting or configuration of the service after the recipient has already made a choice." In other words, browser settings that decline tracking must be respected.
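The DSA text does not prescribe any particular mechanism, but a minimal sketch of what respecting such a browser-level choice could look like is shown below. It assumes an Express server and checks the real Sec-GPC (Global Privacy Control) and DNT request headers; the personalizeAds flag is a hypothetical per-request setting, not anything mandated by the regulation.

```typescript
import express from "express";

const app = express();

// Illustrative middleware: honour browser-level opt-out signals such as
// Global Privacy Control (Sec-GPC) or the older DNT header before any
// ad-personalisation logic runs. "personalizeAds" is a hypothetical flag
// used only for this sketch.
app.use((req, res, next) => {
  const gpc = req.get("Sec-GPC");
  const dnt = req.get("DNT");
  const optedOut = gpc === "1" || dnt === "1";
  res.locals.personalizeAds = !optedOut;
  next();
});

app.get("/", (req, res) => {
  res.send(
    res.locals.personalizeAds
      ? "Serving personalised content"
      : "User opted out of tracking – serving non-personalised content"
  );
});

app.listen(3000);
```

The point of the sketch is simply that the user's earlier choice is read once and then governs downstream behaviour, rather than being overridden or re-prompted.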

"The novel and precise consent granting-rejecting-withdrawal stipulations in a separate article, are highly relevant," Olejnik said.

"This is strongly limiting the scope of maneuver by Big Tech when it comes to the data protection granting prompts. However, despite the fact that the article claims that it is 'without prejudice' to GDPR, it is in fact with prejudice to GDPR. I assume that this particular change may be highly contentious."

Olejnik expects both EU and US businesses will have to make substantial changes to adapt.

"Technology risk assessments will have to be developed," he said. "This means new needs of the tech and tech policy analysis teams. Chances are that the privacy impact process may be adapted to this broader risk assessment process. As with GDPR, the companies which deploy resources sooner than later, will be better prepared. This will be a competitive advantage, too. Societies will start expecting certain guarantees."

Olejnik, however, said such risk assessments will be futile if they are not forward-looking enough to consider future risks. And he noted that tech giants haven't demonstrated much capacity to anticipate the problems they've created.

"It is clear that Big Tech could foresee that their infrastructures will be used in microtargeting and disinformation," he said, pointing to his own cautionary post on the subject from 2016 as evidence such matters were being openly discussed at the time.

"But they did not want to devote corporate cycles to this scope. Now they'll be forced to do this. This is very fortunate because technology will continue impacting societies in the future."

Finalizing the DSA won't be without challenges: there's also the risk these rules will end up doing greater harm than good. Consider recent efforts in the US to tame social media platforms with legislation such as Texas' HB 20, which legal scholars warn violates First Amendment speech protections.

"The big problem will be to weigh and balance each fundamental rights and freedoms, such as privacy, security, freedom of speech," said Olejnik. "It would be shameful if the application of DSA ends in legalized censorship." ®
