What the future of online content moderation might look like



Censorship. Denying free speech. Fomenting an insurrection. Deplatforming. These are just some of the recent accusations directed at social media platforms by lawmakers who either think these companies are doing too much — or aren’t doing enough — to stop problematic material from being circulated on their sites.

Under Section 230 of the Communications Decency Act, “interactive computer services” like Twitter and Facebook can remove or moderate user material that the platforms deem obscene or offensive, as long as the platforms act in good faith. Legislators on both sides of the aisle consider the law to be outdated and insufficient, and the odds that it will outlast the Biden administration without change are slim. So what will the future of Section 230 and online content moderation look like? Are “the 26 words that created the Internet” (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”) headed for an overhaul, or for the scrap heap?

Future Tense, a partnership between New America, Slate and Arizona State University, recently brought together experts and professionals in technology and public policy to discuss what online content moderation might look like in the coming years. During a conversation with Jennifer Daskal, New America ASU Future Security Fellow and professor and faculty director of the Tech, Law and Security Program at American University, technology policy leaders offered suggestions on what the Biden administration can do to promote effective, ethical online content moderation while encouraging growth and vigorous, free discourse on the internet.

Matt Perault, the director of the Center on Science and Technology Policy at Duke University, offered several policy suggestions for the new administration to address some of the concerns that people have about Section 230, such as creating clearer guidance on the difference between hosting content and creating content, or designing more products that facilitate individual accountability.

“The first priority is to modernize federal criminal law for the digital age. The important thing about changing or creating federal criminal law is that Section 230 doesn’t allow platforms to use 230 as a defense for criminal allegations,” Perault explained. “So, if there’s existing federal criminal law, and platforms are brought into court for potential violations of it, they can’t use 230 as a defense — and there are many areas of federal criminal law that I think are ripe for reform and updating to deal with some of the new challenges we have in our digital age, including voter suppression and voter fraud.”

With current debates on Section 230 focusing on its reform or repeal and on the law’s impact on hate speech, disinformation and bias, lawmakers have trained their sights on the biggest, most influential technology companies, such as Facebook, Twitter and YouTube. Smaller platforms, like Tumblr, are often an afterthought, and a one-size-fits-all approach to updating Section 230 risks harming them.

“I think the biggest thing that we’re concerned about is this liability-freedom of expression conversation, and also the fact that we’re big fans of innovation and diversity in tech — much of our business thrives off that,” said Victoria McCullough, director of public policy and social impact at Automattic, which owns Tumblr and WordPress.com. “If legislation is not thoughtful, those of us representing the medium- and small-sized companies will be the ones likely to get hurt, and I also think you’ll see a real hesitation and a slowing down of the type of diversity and innovation we’ve seen in the last 20 years.”

Given the disparities in workforce size across online platforms like Reddit, Tumblr, TikTok and Facebook, a sweeping policy change like a repeal of Section 230 could have lasting, damaging effects on smaller companies’ operations, prompting some to rethink content moderation and propose new models, like a scalable version focused on ethics.

“The big idea is that it is certainly possible for the tech companies large and small to agree on ethical standards for treatment of individual users,” said Mike Godwin, visiting fellow at the Information Society Project at Yale Law School and member of the Internet Society board of trustees. “In fact, I think you can do that and also meet many of the obligations that are proposed under various implementations of the General Data Protection Regulation.”

While a scalable, ethical model of content moderation is one of many suggestions for updating Section 230, the panelists noted that there is no sure way to know how policy changes would unfold and affect companies or users, prompting a call for greater data transparency to inform future moderation policy.

“If we admit that there is a lot in this area that we don’t know, then we need to take a more experimental approach to policy, which means getting good data that will inform future policymaking,” Perault said. “From my standpoint, the issues around data transparency and data sharing that would enable researchers to really look deeply at these issues and understand first what the problem is, and then whether various different solutions would exacerbate or help to address the problem, seem to me to be a critical first step.”

