Since the adoption of the law on the fight against the manipulation of information (LOI n° 2018-1202 du 22 décembre 2018 relative à la lutte contre la manipulation de l'information), online platforms with more than 5 million unique users per month in France* must report each year to the audiovisual regulator – the CSA – on the measures and methods they have implemented to fight information disorder (as reported in our news item of 16 March).
In this regard, eleven operators replied to the questionnaire sent by the CSA, covering sixteen different online services, including search engines, marketplaces, video-sharing platforms, online forums and social media. On this basis, the CSA published its first assessment report, covering the year 2019, in July.
The main findings of the report are as follows:
Reporting mechanisms: even though these mechanisms have been well developed by the platforms, the lack of data on the human and financial resources dedicated to content moderation makes it difficult to assess their effectiveness (e.g. platforms did not provide any information on moderators' training or working processes).
Transparency of algorithms: in this area, the lack of comprehensibility and the incompleteness of the answers provided by the platforms prevent the CSA from properly analysing compliance with the legal requirement.
Promotion of content from press publishers, news agencies and audiovisual media services: this first report highlights the lack of consistency in both the mechanisms used (content labelling, increasing or decreasing content visibility) and the working processes (outsourcing, dedicated in-house teams, no fact-checking service…).
Fight against user accounts massively spreading false information: online operators emphasise the difficulty of identifying such accounts.
Information provided to users about the identity of the persons paying for the promotion of news content: in general, platforms identify sponsored content and apply an advertising policy. With regard to sponsored content specifically related to public interest issues, approaches vary, ranging from a complete ban to limited targeting possibilities.
Measures and tools to promote media literacy: platforms are involved in media and information literacy (MIL) initiatives mainly through partnerships with associations or public institutions, or through financial support for research projects. Some platforms also provide dedicated areas within their services offering specific resources for young audiences, or measures that allow users to clearly identify relevant and trusted content.
More generally, while the French audiovisual regulator welcomes the operators' willingness to cooperate and the development of a fruitful dialogue, the CSA calls on the platforms to provide more information – especially with regard to the human and financial resources dedicated to fighting information disorder and to the comprehensibility and accountability of algorithms – to allow an effective assessment of the measures undertaken by the online operators.
The report also includes an overview of all the measures put in place by the platforms, as well as each platform's completed questionnaire.