Content Moderation

Frequently Asked Questions

What is content moderation?

Content moderation is the practice of reviewing and removing inappropriate or illegal content that users post online. Typically, online platforms have a set of rules, often called community guidelines, that outline what is and is not acceptable to post on their site. If a user violates these rules, the site may take a number of actions to remedy the situation (a rough code sketch of these actions follows the list), including:

  • flagging the post
  • removing the post 
  • suspending and/or banning the creator
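As a loose illustration, a moderation system might represent these outcomes as a small set of enforcement actions attached to a decision record. This is a minimal sketch under assumed names and structure, not any real platform's implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    """Illustrative enforcement actions (not any real platform's API)."""
    FLAG = auto()     # mark the post for review or add a warning label
    REMOVE = auto()   # take the post down
    SUSPEND = auto()  # temporarily lock the creator's account
    BAN = auto()      # permanently remove the creator

@dataclass
class Decision:
    post_id: str
    action: Action
    reason: str  # which community guideline was violated

# Example: a reviewer removes a post under a hypothetical spam rule.
decision = Decision(post_id="p123", action=Action.REMOVE, reason="spam")
print(decision)
```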

Who engages in content moderation?

If a platform hosts any user-generated content (e.g., comment sections or reviews), it likely engages in moderation. Social media companies such as Meta and YouTube are good examples of platforms with content moderation, since the majority of their content is user-generated and requires constant review; however, content moderation is practiced in some form by most online platforms.

What common practices are employed to monitor content?

Content moderation can be handled in a number of ways that vary in efficacy depending on a platform's size and the volume of user-generated content it hosts. Companies may review content before it is allowed to be posted, a process known as pre-moderation.

As a site grows and the volume of user-generated content increases, pre-moderation becomes too restrictive, and platforms often shift to post-moderation, in which content is published immediately and reviewed afterward.
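The difference between the two approaches is essentially when review happens relative to publication. The sketch below contrasts the two flows; the function names and stubs are illustrative assumptions, not a real API:

```python
from collections import deque

def publish(post):
    print(f"published: {post}")

def reject(post):
    print(f"rejected: {post}")

def pre_moderate(post, review):
    """Pre-moderation: nothing goes live until review approves it."""
    if review(post):  # review() stands in for a human or automated check
        publish(post)
    else:
        reject(post)

def post_moderate(post, review_queue):
    """Post-moderation: publish immediately, review afterward."""
    publish(post)
    review_queue.append(post)  # a reviewer works through this queue later

queue = deque()
pre_moderate("hello world", review=lambda p: True)
post_moderate("hello again", review_queue=queue)
```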

Large platforms often combine post-moderation with reactive and automated moderation. Under a reactive moderation regime, the platform relies on users to flag inappropriate content. Flagged content is then reviewed by an employee of the platform, who decides whether it should be taken down (exactly what you did in our Takedown game).
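In code, reactive moderation amounts to a user-driven report queue feeding a human decision. The following is a simplified sketch under assumed names; no real platform's interface is implied:

```python
from collections import deque

report_queue = deque()  # reports filed by users, oldest first

def flag(post_id: str, reporter: str, reason: str) -> None:
    """A user reports a post; it joins the queue for human review."""
    report_queue.append({"post_id": post_id, "reporter": reporter, "reason": reason})

def review_next(take_down) -> None:
    """A human moderator pulls the oldest report and decides."""
    if not report_queue:
        return
    report = report_queue.popleft()
    if take_down(report):  # the moderator's judgment call
        print(f"removed {report['post_id']}: {report['reason']}")
    else:
        print(f"kept {report['post_id']}")

flag("p42", reporter="u7", reason="harassment")
review_next(take_down=lambda r: r["reason"] == "harassment")
```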

Under an automated or AI moderation regime, the platform runs a software review process that automatically flags and/or removes content that violates the community guidelines. These decisions are also reviewed by human moderators when necessary.
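A common pattern, assumed here purely for illustration, is to score content with a classifier and act automatically only at high confidence, routing borderline cases to human reviewers. The scoring function and thresholds below are placeholders, not a real model or real policy settings:

```python
REMOVE_THRESHOLD = 0.95   # assumed cutoffs; real systems tune these per policy
REVIEW_THRESHOLD = 0.60

def score_violation(text: str) -> float:
    """Placeholder for a trained classifier; here, a trivial keyword check."""
    return 0.99 if "forbidden-word" in text else 0.10

def auto_moderate(text: str) -> str:
    score = score_violation(text)
    if score >= REMOVE_THRESHOLD:
        return "remove"        # confident violation: act automatically
    if score >= REVIEW_THRESHOLD:
        return "human_review"  # uncertain: escalate to a moderator
    return "allow"

print(auto_moderate("a normal comment"))         # allow
print(auto_moderate("this has forbidden-word"))  # remove
```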

For platforms with large quantities of user-generated content, moderation is typically handled through the combined efforts of teams across departments. The policy team develops the community guidelines that users must adhere to, the 'Trust and Safety' team manages the overall moderation effort, and the engineering team ensures that the infrastructure needed to manually and automatically monitor content is in place.

What are the primary laws or regulations that govern content moderation?

Fundamentally, the First Amendment protects the right of online platforms to practice content moderation. The First Amendment limits the government’s ability to restrict protected speech. As a result, online platforms, being private companies, have the right to decide what content they will and will not host.

Section 230 of the Communications Decency Act of 1996 (Section 230) is the most relevant piece of legislation pertaining to online content moderation. Section 230 states that, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In practice, this law shields online platforms from being held liable for user-generated content unless the platform is complicit in the creation or development of said content. 

Prior to Section 230, platforms were only shielded from liability if they made no effort to moderate the content that appeared on their site. This rule disincentivized content moderation and led platforms to adopt an "anything-goes" approach to user-generated content. Section 230 allowed platforms to moderate content without opening themselves up to liability, which has allowed the internet to flourish.
