The Council of Europe has published a Guidance Note “Best practices towards effective legal and procedural frameworks for self-regulatory and co-regulatory mechanisms of content moderation”.
Adopted by the Steering Committee on Media and Information Society, the document offers practical guidance to Council of Europe member States on the development of policy and regulation for content moderation in the online environment, as well as to internet intermediaries, which have human rights responsibilities of their own.
It was adopted at the Council of Europe Conference of Ministers responsible for Media and Information Society, "Artificial Intelligence – Intelligent Politics: Challenges and Opportunities for Media and Democracy", held on June 11, 2021 in Nicosia, in the presence of 44 Ministers.
The Guidance Note references the I&J Outcomes: Mappings of Key Elements of Content Moderation, which identify the different categories of notifiers without assigning any value or hierarchy among them, and list the components that should be included in any notification made by individuals, identified notifiers (including trusted flaggers), or public authorities.
Learn more and download the I&J Outcomes here.
The Outcomes were produced by over 30 senior-level global key actors from governments, internet companies, technical operators, civil society, leading universities, and international organizations, working together in the Internet & Jurisdiction Policy Network's Content & Jurisdiction Program Contact Group to map key elements of content moderation. The resources aim to help frame approaches to identifying and reporting problematic online content and to build a common understanding of the processes that can ensure due process.
One of the key challenges in dealing with online content is addressing abuses in a way that is timely and effective, yet fully respectful of international human rights principles, while enabling the further development of the digital economy. Content moderation often involves several jurisdictions. To enable the coexistence of different norms in online spaces, and to ensure that content restrictions are necessary and proportionate, it can be helpful to divide the process of content moderation into four stages: identification and notification, evaluation, choice of action, and recourse.
The I&JPN Toolkit on Cross-Border Content Moderation provides an overview of this process and highlights some of the key issues that arise when managing online content in light of diverse local laws and norms. It is intended to support Service Providers in designing their content moderation activities, Notifiers in detecting and reporting problematic or abusive content, and Legislators and Policy-Makers in determining procedures for dealing with different types of content and abusive behavior.
The Guidance Note also references the Internet & Jurisdiction Global Status Report 2019, which highlighted issues of relevance to content moderation. The pioneering report is a resource for policy makers and shapers, intended to enable evidence-based policy innovation.