What the future of online content moderation might look like

Panel discusses the future of Section 230 of the Communications Decency Act

January 29, 2021

Censorship. Denying free speech. Fomenting an insurrection. Deplatforming. These are just some of the recent accusations directed at social media platforms by lawmakers who either think these companies are doing too much — or aren’t doing enough — to stop problematic material from being circulated on their sites.

Under Section 230 of the Communications Decency Act, “interactive computer services” like Twitter and Facebook can remove or moderate user material that the platforms deem obscene or offensive, as long as the platforms act in good faith. Legislators on both sides of the aisle consider the law to be outdated and insufficient, and the odds that it will outlast the Biden administration without change are slim. So what will the future of Section 230 and online content moderation look like? Are “the 26 words that created the Internet” (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”) headed for an overhaul, or for the scrap heap?

Future Tense, a partnership between New America, Slate and Arizona State University, recently brought together experts and professionals in the technology and public policy space to discuss the future of online content moderation and what it might look like in the coming years. During a conversation with Jennifer Daskal, New America ASU Future Security Fellow and professor and faculty director of the Tech, Law and Security Program at American University, technology policy leaders and professionals offered suggestions on what they think the Biden administration can do to promote effective online content moderation while remaining ethical and promoting growth and vigorous, free discourse on the internet.

Matt Perault, the director of the Center on Science and Technology Policy at Duke University, offered several policy suggestions for the new administration to combat some of the concerns that people have about Section 230, like creating more guidance to outline the difference between hosting content and creating content, or designing more products to facilitate individual accountability.

“The first priority is to modernize federal criminal law for the digital age. The important thing about changing or creating federal criminal law is that Section 230 doesn’t allow platforms to use 230 as a defense for criminal allegations,” Perault explained. “So, if there’s existing federal criminal law, and platforms are brought into court for potential violations of it, they can’t use 230 as a defense — and there are many areas of federal criminal law that I think are ripe for reform and updating to deal with some of the new challenges we have in our digital age, including voter suppression and voter fraud.”

With current debates on Section 230 focusing on its reform or repeal and the law’s impact on hate speech, disinformation and bias, lawmakers have trained their sights on the biggest, most influential platforms, such as Facebook, Twitter and YouTube. Smaller platforms, like Tumblr, are often an afterthought, and a one-size-fits-all approach to updating Section 230 risks harming them.

“I think the biggest thing that we’re concerned about is this liability-freedom of expression conversation, and also the fact that we’re big fans of innovation and diversity in tech — much of our business thrives off that,” said Victoria McCullough, director of public policy and social impact at Automattic, which owns Tumblr and WordPress. “If legislation is not thoughtful, those of us representing the medium- and small-sized companies will be the ones likely to get hurt, and I also think you’ll see a real hesitation and a slowing down of the type of diversity and innovation we’ve seen in the last 20 years.”

Given the disparities in workforce sizes at online platforms like Reddit, Tumblr, TikTok and Facebook, a sweeping policy change like a repeal of Section 230 could have lasting, damaging effects on smaller companies’ operations, prompting some to rethink content moderation and develop new models, such as a scalable version focused on ethics.

“The big idea is that it is certainly possible for the tech companies large and small to agree on ethical standards for treatment of individual users,” said Mike Godwin, visiting fellow at the Information Society Project at Yale Law School and member of the Internet Society board of trustees. “In fact, I think you can do that and also meet many of the obligations that are proposed under various implementations of the General Data Protection Regulation.”

While adopting a scalable, ethical model of content moderation is one of many suggestions for updating Section 230, the panelists noted that there is no sure way to know how policy changes would unfold and affect companies or users, prompting a call for greater data transparency around moderation to better inform future policymaking.

“If we admit that there is a lot in this area that we don’t know, then we need to take a more experimental approach to policy, which means getting good data that will inform future policymaking,” Perault said. “From my standpoint, the issues around data transparency and data sharing that would enable researchers to really look deeply at these issues and understand first what the problem is and then have various different solutions would either exacerbate or help to address the problem seems to me to be a critical first step.”
