There’s a bipartisan effort to change laws that govern speech on the internet
Internet companies, websites and web applications have a kind of legal immunity that they say makes the internet as we know it possible. First, they’re generally immune from legal liability if a Facebook or Twitter user posts something illegal. They’re also immune from liability if they take down a post they find objectionable. Users generally can’t legally challenge that.
There’s a concerted campaign underway in Congress to roll back some of that immunity. “Marketplace Morning Report” host Sabri Ben-Achour spoke about that effort with Daphne Keller, director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center. The following is an edited transcript of their conversation.
Sabri Ben-Achour: Just how big is this effort to weaken these types of legal immunity the internet companies enjoy?
Daphne Keller: It is very big. It has support on both sides of the aisle, although you find that often what Republicans want out of it and what Democrats want out of it are inconsistent goals. But I think we should expect to see changes to this law in the near future. And if we’re lucky, there will be smart changes. And if we are unlucky, they will be not very smart changes.
Ben-Achour: Well, what is first the argument in favor of limiting these types of immunity?
Keller: People have a lot of different goals in proposing changes to this law. Some people want platforms to take down more harmful and offensive content. I think that’s actually very widespread. Lots of people want to see more content taken down from platforms like Facebook or Twitter. Part of the issue is that a lot of that is what we call “lawful but awful” speech. It’s protected by the First Amendment. And so changing an immunity like CDA 230 wouldn’t necessarily change the appearance of that content anyway. But we also see people who want to change the immunity in order to get platforms to leave up more of this lawful but awful speech, and compel them to carry things like, maybe racist diatribes or anti-vaccine scientific theories, or even potentially electoral misinformation, and not give them the leeway to take that stuff down.
Ben-Achour: What’s the downside to having tech platforms, internet sites, be held liable for the content that their users post? Like promoting terrorism or child exploitation or something?
Keller: To be clear, they already face liability for a number of those things. Anything that is a federal crime, like supporting terrorism and child exploitation, doesn’t have a special immunity in the first place. But, broadly, the issue with putting liability on platforms for their users’ speech is that it gives them powerful incentives to err on the side of taking things down, just in case it’s illegal, because they don’t want to get in trouble. And you can imagine what that would have done to the Me Too movement, for example. If a platform felt like it had to take down any allegation that could possibly prove to be defamatory later on, obviously, that has a consequence for speech.
It’s also a problem for competition. If we changed the laws and platforms had to assume more risk for user speech or put in place expensive processes, that’s something that Facebook and Google could probably deal with, but their smaller competitors could not. And then, finally, if we have new obligations on platforms to police their users’ speech, that gives them reasons to adopt clumsy tools like automated speech filters. And we know that those have disparate impact, for example, on speakers of African American English. So there’s this mess of speech issues, competition issues, equality issues, and there are ways to respond to them. It’s not that regulation is impossible here. But almost none of the bills we’re seeing in Congress now really even try to grapple with them in intelligent ways.
Ben-Achour: Let’s turn to the question of whether you can sue Instagram [if it] takes down your post. What’s the downside of seeking to erode that immunity?
Keller: It means that platforms will have to worry more about lawsuits from really extreme speakers, Alex Jones being an example, saying, you have to carry my speech. Even if you don’t want to carry Holocaust denial, you have to. Even if you don’t want to carry organization for a white supremacist rally, like the one in Charlottesville, you have to. And we’ve seen a number of those lawsuits and the platforms ultimately win them all. And even without 230, they would ultimately win them all, but it would be much more expensive without 230. The nuisance cost of dealing with these lawsuits trying to compel platforms to carry speech that violates their policies would be significant. It would give them reason to just give up and carry it, rather than face that burden. And again, the smaller platforms who might be competitors to today’s incumbents are particularly unable to bear the burden of the kinds of nuisance suits that these changes would enable.
Ben-Achour: Under the Department of Justice proposal, a company would be protected from legal liability for taking down something that promotes terrorism or is unlawful, but they would be open to getting sued for taking down something racist or false about coronavirus — something just generally objectionable. What do you make of that?
Keller: I think this is one of a hundred cases where the gloves have come off in American politics, really in the past few months. And this one hasn’t had that much attention because people see this as regulation about platforms and technology. But the proposal from [Sen.] Lindsey Graham, the proposal from Attorney General Barr, and several other proposals, are just remarkably naked in their speech preferences, in the rules that they want platforms to uphold. And so they’re saying platforms should be encouraged and protected to take down some content, like pornography, but we should take away protection when they take down these “lawful but awful” categories, including hate speech, and white supremacist speech, and misogynist speech, and racist speech, and disinformation. The other thing that we see in the Graham proposal, and then in the Justice Department proposal, is that they say to platforms, if you do fact-checking, and you put labels on people’s posts or tweets saying this is false or this is very debatable, you risk liability for that. You can’t even leave the speech up and put a label on it without getting in trouble.
Ben-Achour: Are we talking about this simply because the president is angry that Twitter moderates his tweets and Facebook takes down some of his political advertisements? I mean, is that what this is about, ultimately?
Keller: No, I don’t think it is. I mean, I think the specific proposals we’ve seen recently, in particular from the Justice Department, those were absolutely prompted by President Trump’s executive order. And that seems to have been triggered by Twitter putting a label on his tweets. But overall, the sense of a need to regulate platforms is bipartisan. It’s global. It transcends politics. And the sense that platforms are acting as gatekeepers of discourse, that they’re the new public square, and that it’s kind of crazy that private companies are setting the speech rules, that’s global, too.
In the U.S., it’s very much a conservative talking point. But I think in the U.S. and all over the world, everyone has concerns. It’s one of the biggest policy questions of our age. And that means that we should do the work to make smart laws. And if instead Congress just passes some hastily drafted, politically motivated law, that’ll be kind of a dereliction of duty, in my opinion. And we will all have to live with the result for years.