This is a new era for digital regulation in Europe. Major web platforms have until August 25 to comply with the DSA (Digital Services Act), the new European regulation on digital services, which aims to make platforms accountable for hateful content, “fake news” and illicit products sold online. The idea: what is prohibited offline should also be prohibited online.

Published in October 2022, the law will apply to all web players from February 17, 2024, except for the “very large platforms,” which the European Commission will hold to it from this summer. Among the 19 players concerned, Thierry Breton, the European Commissioner for Digital, lists the expected names — Facebook, Google, Twitter, TikTok, Instagram, Snapchat and Amazon — but also more surprising companies such as Zalando, which has decided to sue the European Commission over its inclusion on the list.

With great scale comes great responsibility

But what, concretely, will change for platforms, search engines and their millions of European users? “With the DSA, we are no longer content simply to leave the ball in their court,” replies Jean Cattan, secretary general of the National Digital Council. “Web players are required to show that they comply with European rules and, therefore, that they prevent risks such as the dissemination of illegal content, violations of fundamental rights and threats to public security.”

Beyond specific measures, such as the ban on targeted advertising aimed at minors or based on religion, sexual orientation, health data or political beliefs, the leitmotif of the regulation is to demand transparency from platforms.

On moderation, for example, the web giants will have to show that the methods they apply are the most effective for their model. “Whether they opt for algorithmic moderation operated by artificial intelligence, a human moderation service or community moderation as on Wikipedia, they will have to report to the European Commission, justify their choice and prove that it is effective.”

Another major novelty: platforms will have to be able to explain how their recommendation systems work and even systematically offer an alternative recommendation method not based on user profiling — a chronological feed without personalized content, for example. To ensure these rules are respected, platforms will have to publish transparency reports, and annual audits will be carried out by independent bodies.

“With the DSA, a new balance of power is being built: public authorities are now armed to respond to the platforms’ arguments,” observes the secretary general of the National Digital Council. “An arsenal, a toolbox, allows them to verify the accuracy of these players’ assertions.”

Among these tools: a far greater capacity for intervention than in the past. “It totally changes the game,” insists Jean Cattan. “The European Commission can collect much more information from the players it regulates. And if a social network does not answer its questions, it can decide to send the judicial police or Commission agents to carry out on-site inspections. This is the practice of ‘dawn raids,’ the early-morning swoops the Commission is already familiar with in the financial world, for example. Platforms will no longer be able to say whatever they like; we will have the means to verify it.”

Citizens are also invited to become more involved in the regulation of platforms. While France already has tools such as the Pharos reporting platform, the DSA plans to “ease the task of the person reporting content by guaranteeing them a streamlined process,” explains the secretary general of the National Digital Council. “Today’s tools are not fully satisfactory because they are not accessible enough, so people have lost confidence in this lever.”

From August 25, major platforms will have to systematically respond to users and inform them of the action taken on their reports. Users will then be able to challenge the decision before an out-of-court dispute settlement body to seek redress if it does not satisfy them.

But what, for Jean Cattan, constitutes the main advance of the Digital Services Act is its Article 40, which opens platform data to accredited researchers. “They need to conduct studies and highlight the risks associated with the dissemination of illegal content,” he explains. “Concretely, this means that a consortium of Sciences Po and the CNRS could decide to carry out an investigation into whether Twitter promotes more far-right content than other political opinions, for example. This article can have a colossal impact: we are emerging from thirty years of platform self-regulation and public ignorance of how they work, and this involvement of the research community will give us a finer understanding of social networks, and also help deconstruct certain myths.”

While the DSA provides for severe sanctions in the event of non-compliance — fines of up to 6% of a company’s annual worldwide turnover — Jean Cattan finds these relatively “ineffective” and puts more faith in the new regulatory tools the law provides.

The countdown is on: the internet behemoths now have less than twenty days to fall into line, and Thierry Breton has said he is somewhat worried. “During the interviews I had with the platforms’ CEOs to ensure that they will be ready for the entry into force of the DSA obligations, I could see that there is still work to be done, especially with regard to the resources devoted to content moderation,” the European Commissioner for Digital told Le Point.

That fear was confirmed by a report from Arcom (formerly the CSA) published last July, which deemed the efforts made so far by platforms to fight online hate insufficient.