Al Gore didn’t invent the internet, but he did play a role in ensuring that the internet could grow organically as an open platform through his work on the Telecommunications Act of 1996. The act, signed by President Clinton in an elaborate ceremony at the Library of Congress at the dawn of the public internet era, established that broadband internet service would be less heavily regulated than other telecommunications infrastructure like radio, telephone landlines, and broadcast, cable, and satellite TV. Title V of the act encompassed the Communications Decency Act, which included provisions designed to protect children from indecent online materials. The Supreme Court pared back many of those provisions in a series of cases, but one part of the CDA that survived intact was Section 230—at least for now.
Section 230 addressed the “moderator’s dilemma”: whether or not to moderate user-generated content. Under defamation law, a communications platform that takes no steps to moderate content usually is not a “publisher,” and therefore is not liable for that content. It is like a bulletin board on which anyone can paste any kind of flyer or handbill: if the board’s owner makes no effort to vet or remove what is posted, the owner takes no responsibility for the flyers and can’t be held liable for their content. But as soon as the platform owner takes some steps to moderate (say, by tearing down handbills that say mean things about the neighbors), the owner assumes the role of a publisher. Whatever the owner doesn’t take down, the owner implicitly endorses, and defamation liability may attach for failing to moderate adequately. Section 230 resolves this dilemma by providing a safe harbor to online platform providers, allowing them to host and moderate user-generated content without fear of publisher or other secondary liability.
This special carve-out for online publishers has been part of the laissez-faire legal environment that helped incubate the rapid growth and unfettered creativity of the early internet age. It removed significant legal (and therefore financial) risks for internet startups, and allowed the growth of everything from Facebook and Reddit to Nextdoor and Spotify.
There are serious concerns that the anything-goes environment of big social media platforms under Section 230 can be abused by terrorists, mass shooters, child pornographers, human traffickers, conspiracy theorists, cyber thieves, and other criminals. But the MAGA movement has been in a tizzy about Section 230 for years because of the perceived (and perhaps sometimes real) political bias of the people and algorithms that moderate huge platforms such as Twitter. Now that Donald Trump has been deplatformed and Parler has been de-hosted, these objections have compounded.
The terrible irony in this “conservative” reaction to social media bias is that the proposals being floated read like the vodka-soaked ravings of a drunken…