Title
Moderating Content Moderation: A Framework for Nonpartisanship in Online Governance
Document Type
Article
Publication Date
2021
Abstract
Internet platforms serve two important roles that often conflict. Facebook, Twitter, YouTube, and other internet platforms facilitate the unfettered exchange of speech by millions of people, yet they also moderate or restrict that speech according to their “community standards,” such as prohibitions against hate speech and the advocacy of violence, to provide a safe environment for their users. These dual roles give internet platforms unparalleled power over online speech—even more so than most governments. Yet, unlike government actors, internet platforms are not subject to the checks and balances that courts or agencies must follow, such as promulgating well-defined procedural rules and affording notice, due process, and appellate review to individuals. Internet platforms have devised their own policies and procedures for content moderation, but the platforms’ enforcement remains opaque—especially when compared to courts and agencies. Based on an independent survey of the community standards of the largest internet platforms, this Article shows that few internet platforms disclose the precise procedural steps and safeguards of their content moderation—perhaps hoping to avoid public scrutiny of those procedures. This lack of transparency has left internet platforms vulnerable to vocal accusations of an “anti-conservative bias” in their content moderation, especially from politicians. Internet platforms deny such a bias, but their response has not mollified Republican lawmakers, who have proposed amending, if not repealing, Section 230 of the Communications Decency Act to limit the permissible bases and scope of content moderation that qualify for civil immunity under the section. This Article provides a better solution to this perceived problem—a model framework for nonpartisan content moderation (NCM) that internet platforms should voluntarily adopt as a matter of best practices.
The NCM framework provides greater transparency and safeguards to ensure nonpartisan content moderation in a way that avoids messy government entanglement in enforcing speech codes online. The NCM framework is an innovative approach to online governance that draws upon safeguards designed to promote impartiality in various sectors, including courts and agencies, clinical trials, peer review, and equal protection under the Fourteenth Amendment.
Recommended Citation
Edward Lee, Moderating Content Moderation: A Framework for Nonpartisanship in Online Governance, 70 Am. U. L. Rev. 913 (2021).
Available at:
https://scholarship.kentlaw.iit.edu/fac_schol/1078