20 March, 2018
Creating a safe and enabling online environment where stakeholders know their rights and obligations is key to better content moderation
The Council of Europe adopted on 7 March 2018 policy guidelines on the roles and responsibilities of internet intermediaries such as search engines and social media. The Recommendation on the roles and responsibilities of internet intermediaries calls on States to provide a human rights and rule of law-based framework that lays out the main obligations of the States with respect to the protection and promotion of human rights in the digital environment, and the respective responsibilities of intermediaries.
The recommendation calls on States to create a safe and enabling online environment where all affected parties know their rights and duties; to encourage the development of appropriate self- and co-regulatory frameworks; and to ensure the availability of redress mechanisms for all claims of violations of human rights in the digital environment. It also underlines the importance of more transparency being introduced in all processes of content moderation.
The Recommendation’s key provisions aimed at governments include:
Public authorities should only make “requests, demands or other actions addressed to internet intermediaries” that interfere with human rights and fundamental freedoms when prescribed by law;
Legislation giving powers to public authorities to interfere with Internet content should clearly define the scope of those powers and available discretion, to protect against arbitrary application;
When internet intermediaries restrict access to third-party content based on a State order, State authorities should ensure that effective redress mechanisms are made available and adhere to applicable procedural safeguards;
When intermediaries remove content based on their own terms and conditions of service, this should not be considered a form of control that makes them liable for the third-party content to which they provide access.
The Recommendation’s provisions aimed at service providers include:
A “plain language” and accessible formats requirement for their terms of service;
A call to include outside stakeholders in the process of drafting terms of service;
Transparency on how restrictions on content are applied and detailed information on how algorithmic and automated means are used;
Any measures taken to remove or block content as a result of a State order should be implemented using the least restrictive means.
The standard-setting proposals on the roles and responsibilities of internet intermediaries were the key task entrusted to the Committee of experts on Internet Intermediaries (MSI-NET), alongside a study on the human rights dimensions of automated data processing techniques (in particular algorithms) and possible regulatory implications.
Note that the impact of algorithms on plurality was the focus of a plenary session at the 45th EPRA meeting in Edinburgh in May 2017, with a keynote presentation by Prof. Natali Helberger.
Source: Council of Europe Website