Following the public outcry over information disorder and hate speech online, social media networks are coming under increasing scrutiny from governments and regulators across Europe.
In France, several initiatives aim to make the big social media players more accountable by requiring them to adopt clear, effective and transparent mechanisms under the supervision of the regulator. This approach is reflected in two recent documents, both now available in English:
Under the Law on the fight against the manipulation of information, adopted in France in December 2018, platform operators whose activity exceeds five million unique visitors per month (calculated on the basis of the last calendar year) have a duty to cooperate and are required to send the French audiovisual regulator, the CSA, an annual declaration of the measures implemented to that end. The CSA is entitled to assess the effectiveness of these measures and to issue recommendations. On 15 May 2019, the French regulator released Guidelines to help the platforms comply with the new legal requirements. Pursuant to the law, the CSA must also publish a periodic report on the implementation and effectiveness of the measures.
The following table compares the legal requirements with examples drawn from the CSA's recommendations:
| Legal requirements | Examples of CSA's recommendations for compliance |
| --- | --- |
| To implement an accessible and visible reporting mechanism (compulsory) | A clear click-on title with a user-friendly mechanism, right next to the content/account to report. |
| To guarantee the transparency of algorithms (recommended) | Traceability of data and clear, accessible and sufficiently detailed information on the settings and the way to change them; a communication tool for real-time exchange with the operator to explain algorithms. |
| To promote content from professional media providers (recommended) | Clear, visible indicators of the certified origin of the content and technological means to highlight ‘fact-checking’ content. |
| To fight against accounts that massively spread ‘false information’ (recommended) | Procedures to detect accounts disseminating false information on a massive scale and proportionate measures to restrict their actions (warnings, deletion, quarantine, etc.); public monitoring mechanisms and accessible information on practices likely to result in restriction of use. |
| To provide information on the origin of the content and how it is disseminated (recommended) | Clear identification of sponsored content and of the origin of any content (and, if possible, the method of dissemination used). |
| To promote media literacy (recommended) | Tools to help users identify reliable sources of information (video modules, guides, etc.); development and support of partnerships with stakeholders. |
Following a mission letter from the State Secretariat for Digital Affairs, an interministerial mission known as the “Facebook experiment”, composed of seven high-level experts and three permanent rapporteurs from a range of ministries, was tasked to "explore a general framework for the regulation of social networks, starting from the fight against online hatred and relying on the voluntary cooperation of Facebook". The findings of the mission were published in May 2019.
Source: Websites of CSA and DINSIC (Direction interministérielle du numérique et du système d’information et de communication de l’Etat)