While the use of AI systems can be a competitive factor in the digital marketplace and for the resilience of journalism in the digital age, the ability to access and use AI systems in conformity with human rights and professional values is essential. Against this background, on 1 December 2023, the Steering Committee on Media and Information Society (CDMSI) adopted Guidelines on the responsible implementation of artificial intelligence (AI) systems in journalism, which were elaborated by the Committee of Experts on Increasing Resilience of Media (MSI-RES).
The Guidelines provide practical guidance to news media organisations and media professionals to implement journalistic AI systems in a responsible manner.
They also offer guidance for AI technology providers and platform companies and for States and national regulatory authorities to create favourable conditions for such a responsible implementation.
The Guidelines cover:
- (6.3.) Independent NRAs (or news media self-regulatory bodies) should be encouraged to help develop guidelines and standards for the responsible use and development of journalistic AI (e.g. the legal status of training data, best practice for fair data extraction, attribution and labelling of synthetic content, transparency, and human oversight).
- (6.4.) Independent NRAs, news media self-regulatory bodies or standard-setting bodies should be encouraged to help news organisations develop procurement guidelines, making available standard clauses for the responsible procurement of journalistic AI systems. (Annex 1 of the Guidelines provides a procurement checklist.)
- (6.5.) Independent and accountable regulators may play a role in creating the conditions for critical review of the fairness of commercial relationships and contractual agreements between news organisations, platforms and technology providers (e.g. possible imbalances in negotiating power affecting smaller or local news organisations).