Transparency reports from VLOPSEs: briefing by the Council for Media Services (SK)

posted on 09 February, 2024   (public)

Despite valuable insights, further standardisation is needed to achieve meaningful transparency

On 31 January 2024, the Council for Media Services (CMS), Slovakia's media regulatory authority, published a brief analysis of the transparency reports provided by very large online platforms and search engines (VLOPSEs), as required by Article 15 of the new EU Digital Services Act (DSA).

Overall, the CMS stresses the lack of standardisation in the structure and format of the reports and the absence of a common understanding of the required metrics. The Slovak regulator calls for simplified language and a homogeneous format to “achieve effective and actionable transparency”.

Some key findings from the CMS:

  • Removal requests

- Requests from state bodies are usually treated as a top priority (Article 9 DSA)

- Suspected attempts to discourage users from reporting content (e.g. an account is required to submit a report)

- Difficulty in assessing the effectiveness of the measures taken (no information on the type of action applied with respect to the type of alleged content).

  • Own-initiative moderation

- Clear tendency to limit visibility rather than to remove content

- Removal is more prevalent on marketplaces.

  • Complaint handling

- When a complaint is lodged, there is a high chance that the original decision will be overturned

  • Automated content moderation

- Extensive reliance on automated content moderation (mainly on social media; surprisingly, less so on marketplaces)

  • Account suspensions

- Similar policies, but significant differences in the number of account suspensions

  • Human resources

- More detailed data needed on staff qualifications, training and support

- Except for Google, Facebook, Instagram and TikTok (which cover all EU Member State languages), only a few major EU languages are covered by human reviewers

Source: The CMS (SK)