
Regulating Platforms’ Invisible Hand: Content Moderation Policies and Processes

Published on Aug 07, 2022

21 Wake Forest J. Bus. & Intell. Prop. L. 171

Before 2016, calls for comprehensive regulation of social media
platforms in the United States were few. Since then, the push for
regulation has grown across the entire political spectrum. Yet,
despite the introduction of several Congressional bills, no extensive
regulation has passed on a national scale in the United States.
Instead, Europe is taking the lead in promulgating law governing social
media platforms, covering issues ranging from terrorism to political
advertisements. As the United States continues to debate and assess its
approach, whether to follow in Europe’s footsteps or retain laissez-faire
policies, it is important to understand that any regulation should be
nuanced and tailored to the specific issues at hand.

By addressing how to regulate content moderation, this Article
endeavors to demonstrate the nuance required when approaching
any issue involving social media platforms. Through a critique of the
divergent approaches to content moderation taken by the United
States and Europe, laissez-faire and heavy-handed regulation
respectively, this Article argues both that platforms are best suited to
create, apply, and enforce content moderation policies and that the
United States should regulate the consequences of such freedom by
compelling transparency and procedural due process.
