
French Law against information disorder: first assessment by the regulator

posted on 17 September 2020

While welcoming the willingness of online platforms to cooperate, the CSA calls for more data to assess effectiveness


[news item updated on 17/09/20 with the addition of the link to the English version of the report]

Since the adoption of the Law on the fight against information disorder (LOI n° 2018-1202 du 22 décembre 2018 relative à la lutte contre la manipulation de l'information), online platforms with more than 5 million unique users per month in France* have to report each year to the audiovisual regulator, the CSA, on the measures and methods they have implemented to fight information disorder (as reported in our news item of 16 March).

In this regard, eleven operators replied to the questionnaire sent by the CSA, representing sixteen online services, including search engines, marketplaces, video-sharing platforms, online forums and social media. On this basis, the CSA released its first assessment report in July, covering the year 2019.

 

Update 17/09/2020: the summary of the report is now available in English (see below).

 

The main findings of the report are as follows:

  1. Reporting mechanisms: even though these mechanisms have been well developed by the platforms, the lack of data regarding the human and financial means dedicated to content moderation makes it difficult to assess their effectiveness (e.g. platforms did not provide any information on the moderators’ training or the work process).

The CSA encourages platforms to:
  • include a clear "false information" category in the reporting mechanism and improve the accessibility and intelligibility of the reporting process and the moderation policy (remedies, fact-checker partnerships…);
  • launch an 'emergency' reporting process;
  • develop partnerships with external fact-checkers and pool resources through a shared public database.
     
  2. Transparency of algorithms: in this area, the answers provided by the platforms were too incomplete and too difficult to understand for the CSA to properly assess compliance with the legal requirement.

The CSA reminds the platforms that data covered by business confidentiality will not be publicly released, and invites operators to submit more information on their algorithm policies (criteria, data used and the role played by algorithms in content moderation) and to give users more transparency regarding the parameters and the possibility to adjust them.
 
  3. Promotion of content from publishing companies, news agencies and audiovisual media services: this first report highlights the lack of consistency in the mechanisms used (content labelling or an increase/decrease of content visibility) and in the work processes (outsourcing, in-house dedicated teams, no fact-checking service…).

The CSA encourages platforms to:
  • clearly identify the source of content;
  • explain to users how and why content is identified as 'trusted' content;
  • label content from news agencies and audiovisual media providers.
     
  4. Fight against users' accounts massively spreading false information: online operators emphasise the difficulty of identifying such accounts.

To get a clearer picture of the situation, the CSA invites online platforms to provide more information on their detection mechanisms and processes, as well as on the advertising revenues generated by such accounts.
 
  5. Information provided to users about the identity of the persons paying for the promotion of news content: in general, platforms identify sponsored content and apply an advertising policy. With regard to sponsored content specifically related to public-interest issues, approaches vary, ranging from a complete ban to limited targeting possibilities.

The CSA recommends that platforms:
  • put in place a clear content-flagging system;
  • inform users of the advertising parameters;
  • adopt measures to protect trademarks and prevent free-riding ("parasitism");
  • improve the dialogue with other platforms to harmonise approaches;
  • publicly release their "advertising library" with a high level of transparency (amounts spent, names of advertisers…).
     
  6. Measures and tools to promote media literacy: platforms are involved in media and information literacy (MIL) initiatives mainly through partnerships with associations or public institutions, or through financial support to research projects. Some platforms also offer dedicated areas within their services with resources for young audiences, or features that help users clearly identify relevant and trusted content.

The CSA invites platforms to launch more multi-dimensional projects targeting all ages, and to share more data with the academic world to allow, for instance, an assessment of the impact of MIL measures. Moreover, multi-year partnerships should be encouraged.


More generally, while the French audiovisual regulator welcomes the operators' willingness to cooperate and the development of a fruitful dialogue, the CSA calls on the platforms to provide more information, especially with regard to the human and financial means dedicated to fighting information disorder and to the comprehensibility and accountability of algorithms, so that the measures undertaken by online operators can be effectively assessed.

The report also includes an overview of all the measures put in place by the platforms and the completed questionnaire of each platform.

Source: CSA

* with activity exceeding 5 million visitors per month, calculated on the basis of the last calendar year
