Investigating the role of online platforms before/after the Bratislava shooting - Slovak regulator publishes report

posted on 16 March, 2023   (public)

A concise, well-structured report deploring slow and inadequate content moderation, pointing to the shortcomings of online platforms, and calling for more consistent and enforceable regulatory oversight

On 9 March 2023, the Council for Media Services in Slovakia (CMS) published a report entitled "The Bratislava Shooting, Report on the role of online platforms". The report, jointly produced with Reset, investigates the role of Twitter, Facebook, Instagram, and YouTube before and after the Bratislava shooting which took place on 12 October 2022.

Under Act no. 264/2022 - which also transposed the AVMS Directive in Slovakia and established a new regulatory authority with enlarged powers (the Council for Media Services), replacing the Council for Broadcasting and Retransmission - the CMS is entrusted with the responsibility and legal competence to prevent the dissemination of illegal content online via systemic oversight of digital platforms. This entails cooperation with the platforms in the effective, proportionate, and non-discriminatory application of their community rules, norms, and standards.

On 12 October 2022, a radicalised attacker killed two people and injured another outside an LGBTQ+ bar, an attack soon reclassified as terrorism by Slovak authorities. Prior to the attack, the shooter had shared a manifesto containing extremist content on file-sharing platforms, and subsequently posted on Twitter and 4chan before taking his own life. Debate about the attack spread rapidly on social media, accompanied by a wave of hateful comments aimed at the LGBTQ+ community.

After the attack, the Slovak Council engaged in bilateral communications with Facebook, YouTube, and Twitter to prevent further incitement to violence. The CMS also started to monitor the online environment with the help of external tools to find potentially harmful or problematic content - with a specific focus on Facebook given its prevalence on the Slovak market.

Key findings

The report analyses the platforms' content moderation policies as well as their responsiveness. In particular, it draws attention to:

  • Clear failures in content moderation: both before and after the attack, and regardless of language (i.e. Slovak but also English content)
  • A shocking lack of content moderation resources: with only one Facebook-contracted fact-checker for Slovakia, thus highlighting the limited resources invested in small markets
  • Inadequate platform responses: responses to CMS requests flagging problematic content were slow and insufficient
  • Failures to act against "repeat offenders": these accounts were the most frequent authors of potentially illegal or harmful content
  • Inconsistent & outdated counter-terrorism policies: the report notes wide variation in the platforms' policies and a focus on terrorist organisations to the detriment of lone perpetrators.

The report calls for more consistent and enforceable regulatory oversight and welcomes the advent of the Digital Services Act (DSA), which will notably require platforms to set up systems and mechanisms to assess the risks their services pose to citizens.

The report also encourages platforms to cooperate with each other to create a common baseline for the definition and identification of extremist and terrorist content.

Source: Council for Media Services, Slovakia