Measuring the impact of YouTube's recommendation algorithm on information diversity and pluralism: French CSA publishes empirical study (available in French and English)

posted on 3 December 2019

YouTube's algorithm appears to rely more on keywords attached to a video than on its success

Do video-sharing platforms offer a pluralistic information space for citizens? This is the question that the French Conseil supérieur de l'audiovisuel (CSA) raised in a recent study. On 13 November 2019, the CSA published the findings of an empirical study which was conducted in order to understand how the recommendation algorithm works on the YouTube video-sharing platform and the potential impact on pluralism of information and the diversity of opinions presented to users. The study aimed to contribute to the public debate on the responsibility of content sharing platforms in democratic societies.

More specifically, the study focused on the algorithms that select videos that play automatically, a feature that "is similar to linear television broadcasting":

  • To understand how YouTube's automatic playback works, the CSA asked 38 volunteer staff members to watch videos using their Google Accounts, and also created four fictitious profiles.
  • The content revolved around 23 topics identified as likely to provoke polarized views, such as secularism, global warming or vaccines.
  • In total, the study was able to consider over 39,000 recommendations.

The main conclusions can be summarized as follows:

  • The selection of the first two videos presented automatically seems to be made on the basis of "the keywords associated with the initial theme". The number of views and the publication date seem to play a less crucial role.
  • More than a third of the recommended videos "express the same point of view as the original video". This indicates a potential risk of an "echo chamber", i.e. the repetition of the same ideas to the detriment of other points of view, thus confirming users in their assumptions.
  • Nevertheless, there seems to be a threshold from the third recommended video onwards. From that point, the tool tends to move away from the original subject, favouring videos that are mostly recent and have a large number of views. The algorithm seems to individualize recommendations without necessarily taking users' profiles into account.

While the CSA advocates caution in interpreting the findings, pointing out that these observations cannot necessarily be generalized to all algorithms, it nevertheless confirms that video-sharing platforms play a central role in access to information through their scheduling and content-recommendation functions. The conclusions also highlight the need to strengthen media literacy and relations between public authorities and platforms, in order to improve the transparency of algorithms and provide users with clear information on the recommendations made to them.

In a statement included in the report, YouTube stated that its recommendation engine had undergone several hundred changes since the tests were conducted last year. 

Update 7 January 2020: the study is now available in English.

Source: CSA Website
