This article, relating to Irish law, was written by the team in our Dublin office for Browne Jacobson Ireland LLP.
Ireland’s Coimisiún na Meán (the “Media Commission”) has announced its new Online Safety Code (the “Code”). The Code ushers in a new, more intensely regulated era for online platforms. To date, large online platforms have acted of their own accord, establishing trust and safety teams to maintain a safe experience for social media users (“Users”). Various European Union measures have now required Ireland (and the online platforms that do business here) to establish and conform to new safety standards.
The main legislative driver for the Code is the Audiovisual Media Services Directive (“AVMSD”), a significant EU Directive which overhauled how video and broadcasting services are regulated in the EU. Ireland transposed that Directive through the Online Safety and Media Regulation Act 2022 (the “Act”), the same legislation which established the Media Commission. The Media Commission has quickly established itself and has already begun substantive regulation, including own-initiative regulatory sweeps of online platforms.
The Code
The Code has a clearly defined scope: it applies to video-sharing platform services which are “under the jurisdiction of the State”. This is a legal concept discussed in the Act, but it can be summarised as covering video-sharing platforms (“Platforms”) that have their EU headquarters in Ireland. The Media Commission has designated ten companies as Platforms to be covered by the Code: Facebook, Instagram, YouTube, Udemy, TikTok, LinkedIn, X, Tumblr, Pinterest, and Reddit (although the Media Commission has not yet reached a final designation decision in respect of Reddit).
Snapchat has not been included in the above list, despite its strong presence online and popularity among young people, as its headquarters are not in Ireland. However, the Media Commission will work closely with its regulatory counterparts in other EU member states to hold platforms like Snapchat accountable for how they keep younger Users safe.
The Code was expected to be a much longer and more detailed document; however, the Media Commission has a broad and sophisticated mandate across multiple pieces of legislation, which may prompt further codes and regulatory guidance in due course.
Code implementation
General obligations will apply from next month, including obligations relating to video content that may impair the physical, mental, or moral development of minors. There will also be a nine-month implementation period for more detailed provisions, such as those addressing cyberbullying, the promotion of self-harm, suicide, eating disorders or dangerous challenges, as well as access to pornography or extreme gratuitous violence. The Code will be legally binding, and companies could face fines of up to €20 million or 10% of the platform’s annual turnover.
The Code and recommender systems
Recommender systems, commonly known as “algorithms”, determine what Users see on the relevant Platforms based on their likes, interests, and personal profile. That profile might include, for example, their search history, age, and location. Such practices have been the source of scrutiny from a data protection perspective, and the Media Commission must now regulate such recommender systems from a content perspective. Importantly, the Code will not deal directly with recommender systems. The basis for how the Media Commission will regulate recommender systems is the Digital Services Act (“DSA”), a separate but related piece of content regulation legislation.
Concluding thoughts
By establishing clear obligations and emphasising user safety, the Media Commission aims to create a safer online environment, particularly for younger Users. As Platforms adapt to these new standards, the focus on accountability and protection will play a crucial role in shaping the future of digital media in Ireland.