Moderation and Removal of Content

Among the major factors behind the success of the Internet has been the open, honest and freewheeling nature of online discourse. However, the sense of anonymity that comes with being behind a computer or mobile screen can also encourage people’s darker impulses. The Internet can be a prime vehicle for vitriol and threats, as well as for the distribution of illegal material. This places intermediaries in a difficult position. On the one hand, the free flow of information is, for many of them, their bread and butter. On the other, their growing influence has placed them under increasing pressure, including from their own users, to mitigate the less desirable aspects of online speech. Gender-based harassment is notoriously endemic online, though it is only one part of a broader “civility” problem, which has led some intermediaries towards more active content management. This, in turn, raises difficult questions about when, and how forcefully, to intervene:

Recommendations for Private Sector Online Intermediaries:

Clarity and Communication

  • Intermediaries should post, in a prominent place, clear, thorough and easy to understand guides to their policies and practices for taking action in relation to content, including detailed information about how those policies are enforced. Where policies need to be complex because they form the basis of a legal contract with users, they should be accompanied by clear, concise and easy to understand summaries or explanatory guides.
  • Intermediaries’ copyright reporting mechanisms should provide information to both complainants and users about limitations and exceptions to copyright and, where applicable, warn complainants about the potential consequences of filing false claims.
  • Policies to address problematic content (such as deletion or moderation) which go beyond formal legal requirements should be clear and pre-determined, and should be justifiable by reference to a standard that is set out in the policy and rests on objective criteria (such as maintaining a family-friendly service) rather than on ideological or political goals. Where possible, intermediaries should consult with their users when determining such policies.

Process for Receiving and Adjudicating Complaints

  • Third parties who file a complaint about inappropriate or illegal content should be required to indicate what legal or policy rule the content allegedly violates.
  • Intermediaries should be consistent in applying any content moderation policies or legal rules and should scrutinise claims under such policies or rules carefully before applying any measures. They should have in place processes to track abuses of their content moderation systems and should apply more careful scrutiny to claims from users who repeatedly file frivolous or abusive claims.
  • Intermediaries should, subject only to legal or technical constraints, promptly notify users when content they have created, uploaded or host is subject to a complaint or restriction. The notification should include a reference to the legal or policy rule in question, an explanation of the procedure being applied, the opportunities available to the user to provide input before a decision is taken, and common defences to the application of the procedure.
  • Where action is proposed in relation to content a user has created, uploaded or hosts, that user should normally be given an opportunity to contest the action. Subject to reasonable resource and technical constraints, users should also be given a right to appeal against any decision to take action against the content at issue.
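The complaint-handling process described above can be sketched in code. The following is a minimal illustrative model, not an implementation drawn from the source: all class, field and method names are assumptions. It captures three of the recommendations: a complaint must cite the rule allegedly violated, the affected user is notified with that rule before a decision, and complainants whose claims are repeatedly rejected attract extra scrutiny.

```python
from dataclasses import dataclass


@dataclass
class Complaint:
    complainant: str
    content_id: str
    cited_rule: str           # the legal or policy rule allegedly violated
    status: str = "received"  # received -> notified -> rejected/actioned


class ComplaintDesk:
    """Hypothetical complaint intake for an online intermediary."""

    def __init__(self):
        self.complaints = []
        self.rejections = {}  # complainant -> count of rejected claims

    def file(self, complainant, content_id, cited_rule):
        # Complaints that cite no legal or policy rule are refused outright.
        if not cited_rule:
            raise ValueError("complaint must cite a legal or policy rule")
        complaint = Complaint(complainant, content_id, cited_rule)
        self.complaints.append(complaint)
        return complaint

    def notify_user(self, complaint):
        # In practice this would message the content's creator with the cited
        # rule, the procedure being applied, and how to respond beforehand.
        complaint.status = "notified"
        return (f"Your content {complaint.content_id} is subject to a "
                f"complaint under rule: {complaint.cited_rule}")

    def reject(self, complaint):
        # Track rejected claims so repeat filers can be scrutinised.
        complaint.status = "rejected"
        self.rejections[complaint.complainant] = (
            self.rejections.get(complaint.complainant, 0) + 1)

    def needs_extra_scrutiny(self, complainant, threshold=3):
        # Repeatedly rejected claims trigger more careful review.
        return self.rejections.get(complainant, 0) >= threshold
```

A real system would of course add persistence, contest and appeal steps, and reversible enforcement actions; the sketch only shows how the documented safeguards map onto a simple data model.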

Restricting Content

  • Actions to remove or otherwise restrict third party content should be as targeted as possible and should only apply to the specific content which offends against the relevant legal or policy standard.
  • Intermediaries should consider whether less intrusive measures are available which provide protection against harmful content without necessarily taking that content down, such as providing for opt-ins to access the content.
  • Where action is taken against content, the intermediary should, subject to reasonable technical constraints, retain the means to reverse that action for as long as any appeal against the action, including any legal appeal, remains pending.
  • Where a user’s account is deleted or de-activated, the user should be given an option to preserve and export the data from that account, unless the material is patently illegal (e.g. child sexual abuse imagery) or has been declared illegal by a clear and binding legal order.

