The European Union is directing major technology companies to protect the European Parliament elections in June from disinformation and the threat of online hacking.
“We know that this electoral period that’s opening up in the European Union is going to be targeted either via hybrid attacks or foreign interference of all kinds. We can’t have half-baked measures,” Internal Market Commissioner Thierry Breton said in February.
The European Commission on Tuesday unveiled a set of new guidelines for the biggest tech platforms to follow, aimed at reducing election risks such as the spread of viral misinformation and orchestrated campaigns by Russian bots or fake media outlets.
These guidelines, issued under the Digital Services Act (DSA), apply only to the largest platforms and search engines, specifically those with over 45 million active users within the bloc.
Under these guidelines, platforms including Facebook, YouTube, and TikTok must clearly label political ads and AI-generated deepfakes, and adjust their recommendation algorithms to promote a diversity of content rather than leaning left or right.
They must also have dedicated teams in place to monitor emerging threats and narratives in each of the 27 EU member countries. The Commission recommended measures such as pop-up alerts for users attempting to share posts containing debunked misinformation, and emergency protocols for situations in which a deepfake of a European leader circulates widely on a platform.
Companies are also required to maintain a public, searchable archive of political ads, updated in near real time, that allows third parties to see who was targeted by specific content.
The guidelines are the Commission's recommendations on how best to comply with the DSA's rules. Companies have the flexibility to implement them as they see fit, but those that choose not to follow the EU's advice must demonstrate to the Commission that their alternative measures are just as effective.
Companies that fail to comply can face penalties as steep as 6% of their worldwide revenue.
“We adopted the Digital Services Act to make sure technologies serve people and the societies that we live in. Ahead of crucial European elections, this includes obligations for platforms to protect users from risks related to electoral processes – like manipulation, or disinformation. Today’s guidelines provide concrete recommendations for platforms to put this obligation into practice,” said Margrethe Vestager, the EU’s executive vice-president for a Europe Fit for the Digital Age.