In this week’s Scheer Intelligence podcast, Electronic Frontier Foundation (EFF) Legal Director Corynne McSherry discusses with host Robert Scheer the internet control issues raised by Elon Musk’s purchase of Twitter and what may lie ahead for it and other social media giants.
Twitter, along with Facebook, Google and others of that ilk, has been treated as a public utility delivering a neutral product comparable to water or natural gas, and allowed to develop monopoly dominance in its market as a means of ensuring efficiency. Toward that end, these companies have been allowed to ignore antitrust restraints in gobbling up potential competitors, and they are protected by laws like the infamous Section 230 that shield them from libel and defamation suits.
But in the current hyper-charged political environment, the material carried on those networks is often judged to be anything but neutral or benign, and there are incessant demands for “content moderation” to exclude material some find toxic. The ascendancy of Elon Musk to the ownership of Twitter has brought fresh publicity to that debate, because Musk has at times spoken against moderation that he defines as censorship. What will he do now that he controls the world’s most influential forum for this controversy?
Corynne McSherry, and the Electronic Frontier Foundation she represents in legal action, is a staunch opponent of political censorship but believes that responsible content moderation is now called for and fears that Musk will not deliver. Host Robert Scheer believes content moderation is a polite label for censorship in the hands of humongous private corporations that are not restricted by the free speech and press guarantees of our Constitution that apply only to government actors. Both agree that breaking up the large monopolies to provide greater competition is the best way to proceed.
Hi, this is Robert Scheer with another edition of Scheer Intelligence, which sounds arrogant, but the intelligence comes from my guest, in this case Corynne McSherry, one of the better-known communications attorneys. Fortunately, she works in the public interest, at the Electronic Frontier Foundation, rather than for some high-priced law firm or company. And I must say, when it comes to civil liberties questions, free speech and what have you, I trust the EFF probably more than just about any other group. It used to be a close call with the ACLU, but they’re getting a little more political these days.
And so, when this whole controversy about Elon Musk buying Twitter erupted, I thought, what better group to turn to than the EFF and Corynne McSherry, who’s their top attorney. And people are alarmed. Elon Musk is quite outspoken. As we do this, he’s already taking on Facebook, telling people to close their accounts. There are a lot of issues about content moderation and what he will do. So why don’t I just begin by asking your take on the significance of his assuming leadership, and what is at stake?
Well, so first of all, thank you for having me. It’s my pleasure. Even though I’m in San Francisco, I’m a KCRW fan, so I always like to be on. What I think is at stake here is a couple of different things. One is that Twitter actually is not nearly as big as Facebook, despite what some people think. But it is an important platform for online speech. People use it to organize, people use it to make jokes. People use it also to be hateful and hurt people. It’s a complicated forum, so its policies matter, and it’s a little scary when one person is apparently setting the policy all by himself. That, I think, rubs a lot of people the wrong way. It might not be illegal, but it makes a lot of folks nervous.
And the other thing that’s happening is that some of his initial decisions don’t look great. His threat to fire most of the content moderation people, I think that’s really dangerous. What Twitter actually needs more than anything is to hire more content moderation people if it wants to do a good job of being a strong space for speech. And I’m really worried that Musk will go the other direction and think he can just automate everything. We’ve been studying this for many years, and it’s pretty clear that you can’t actually do a good job of content moderation through robots.
Well, let me play devil’s advocate here. I think Elon Musk is a breath of fresh air in that he’s making us aware that these are controversial matters and that people should be debating them. What do we mean by content moderation? It sounds so neutral, but the fact is it’s a form of political censorship. You could say you’re on the right side, that you’re preserving civilization and decency, but as we’ve seen already, the internet has gotten very tight. I have a well-known journalist writing for my website, Patrick Lawrence, whose Twitter account somehow got frozen. I know Elon Musk came to the defense of another writer that I published, and others who are controversial.
It just seems to me: who are these moderators? And I want to raise what I think is the basic question. These public… By the way, the writer I was fumbling for was Chris Hedges, certainly a very important journalist, who was with the New York Times for 20 years. And then somehow he runs into problems of this kind. And we saw that even the former president gets banned from Twitter. That’s a bad precedent, that a leading politician, particularly one who was president, can’t communicate on what was supposed to be a public utility. Let me just begin with the legal question.
The reason these companies were allowed to become so big, and antitrust wasn’t used or anything else, is that it was assumed they were just being neutral, whether you’re talking about Facebook, Google or Twitter. So it didn’t really matter. And that’s why you even have an exemption in Section 230 of the Communications Decency Act, which says they can’t be sued for libel or anything else; it was just supposed to be a neutral activity, and you hoped you didn’t get a lot of electronic static or something. Now we’re demanding that they become editors and regulate our dialogue. Isn’t that a contradiction?
Well, so let me clarify a little bit about Section 230 in particular, because it gets a lot of attention these days. What Section 230 did, in layman’s terms, is say that platforms wouldn’t be held liable for the content posted by their users under traditional state law like defamation. And as you just said, there are some exceptions to that; we don’t have to pay attention to those right here. The reason that provision was passed is that, prior to it, platforms were in a tight spot. A pair of cases had come down, both of them defamation cases. In one, the platform in question was moderating content on its site and just missed the defamatory material, and that platform was held liable. The other platform didn’t moderate at all.
They were like, “Hey, go forth, whatever. We don’t moderate.” And they weren’t held liable, because the court said, “Look, you don’t have to intervene, but once you start intervening, now you’re taking responsibility for the content.” That set up a situation where platforms wouldn’t moderate at all, and a lot of people didn’t think that was a great idea. So what Section 230 actually does is free platforms up to moderate without taking on that liability. And in fact, I think in many cases we want them to be able to do that. Many people don’t want an unmoderated forum; people are leaving Twitter because they think it’s going to become a forum for hate speech and racism. Now, I don’t know that that’s true or that it’s going to happen, but a lot of people don’t actually want that.
They sometimes say they want it, but I don’t think they really do. And there’s lots of other platforms that aren’t like Twitter or Facebook that are just sort of regular knitting forums, stuff like that. And we want them to be able to say, “This is a space where we’re going to have political conversations and this is a place where we’re not going to have political conversations.” Section 230 is what sort of makes that all possible under the law. And then the other thing is the First Amendment protects the right of platforms to host speech or not. They get to edit their sites, they get to curate their sites. So to set a legal baseline, that’s where we are. And I actually don’t think that the biggest social media platforms have been neutral for a very long time. It’s been well over a decade that they’ve been actively engaging in content moderation, including YouTube for example.
The problem is they don’t do a particularly good job of it. They end up taking down lots of speech that they shouldn’t take down and leaving up other speech that does violate their own policies. So usually what we’ve been arguing for is: please be consistent, be clear, be transparent, so users actually know what you’re doing. And also make sure that there’s a way for people to appeal when speech is taken down improperly. If we could get to that, I think it would be a very good outcome for everybody. I want to make one last point because I can’t resist making it. I know we’re here to talk about Twitter and Musk, and that’s fine and it’s important, but I always want to remind people that the internet is not Twitter or Facebook or even social media. That’s just an area that gets a lot of attention.
But really, the internet is your email. The internet is text messaging. It’s a million other things that the internet enables. And the reason I think that’s important is that lawmakers come in and start yelling at big tech, by which they mostly mean social media, and then want to pass a bunch of new regulations that aren’t confined to just the big platforms but could actually end up messing with all kinds of other bits of internet infrastructure. So that was maybe more of an answer than you were looking for, but-
Oh no, it’s an answer, and it’s an answer that can be found in your very significant writings on this question. However, I think there’s a contradiction, frankly. I know this has been the main argument in defense of Silicon Valley from the beginning: that the great fear in our constitutional system is not of private-sector regulation but of government regulation, and that’s why we have the First Amendment and other protections that are aimed at government. But here you have a situation where monopolies, which arguably amount to a restraint of trade, were allowed to develop and were not broken up. The big industries in Silicon Valley were allowed to become dominant in their areas and gobble up all opposition. So you don’t have a free market, meaning if you want to leave Twitter, lots of luck finding something comparable.
You want to leave Facebook, lots of luck finding something comparable. Google, find an alternative search engine. They exist, but they don’t have the reach; the big platforms are, in effect, public utilities. These companies got to be so large legally because the government said they were basically providing a politically neutral commodity and therefore could be trusted to do that. That was the basis of their effective exemption from antitrust. Now we’re in a situation where a company can have such political power that it can deny an ex-president of the United States access to it. So what is Donald Trump supposed to do? Go find the alternative? Go invent the alternative? Or somebody like Chris Hedges who might be excluded, or someone who’s excluded from Facebook, go find the alternative?
So we’ve allowed these restraints of trade, these subversions of the free market to become so big, these monopolies, and then we say, “Oh, you should form a board of people you pick to basically regulate the main arenas of potential debate in this country.” That’s where most people get the debate, the issues, learn about them and so forth. And we’re asking whether it’s Elon Musk or his predecessors, we’re saying, you pick a board that you trust to regulate this debate. That is a very dangerous concept, is it not?
Well, so I would mostly agree with you. I don’t think, as a matter of law, that they got an antitrust exemption. And in fact, Facebook in particular is facing pretty heavy scrutiny, in private lawsuits and from the FTC, for a lot of what you just said. Because Facebook has such enormous market power, not just over speech but over commerce, over all kinds of things, it’s really coming under a lot of pressure, and I think rightly so. I think-
Well, can I correct you on one thing?
The pressure, as you point out in your columns… I can’t complain about what, say, Facebook does. I happen to edit a website called ScheerPost. Okay, let’s say Facebook decides to ban it, or their algorithm excludes me. I have no free press protection against what they do.
I would if the government tried to do it. But the government has allowed Facebook to get this big. That, to my mind, is the contradiction. These companies are only allowed to have this enormous monopoly power because government has granted it to them, and yet they’re not subject to the restraints the government itself would be subject to. They’re going to pick a private board; who knows who sits on that board, who knows what their algorithm really says, and why do they get the right to ban, in effect, much of the speech they don’t like?
Right. Okay. So that’s what I was getting at. For one thing, I do think that Facebook in particular is under a great deal of pressure right now. But where I completely agree with you is that it is very hard to just switch, and that’s a real problem. Lots of people might want to leave Facebook, but their friends are all on Facebook and their family’s on Facebook. It’s a place where they communicate and have community. And Facebook, of course, has invested heavily in making sure that everybody’s stuck on Facebook or Twitter or pick your platform. So I agree with you that it is a real concern; it’s just a question of how we respond to it. One of the things that all of these companies should be doing, especially if they want to avoid getting regulated, is make it easier for people to leave if they want to and take their friends with them, and make it easier for competitors to rise up.
I’m old enough, and I suspect you are too, to remember when there was this crazy service called MySpace that was a competitor to Facebook, and many people thought it was going to win the battle; many people thought we’d all be on MySpace right now, not Facebook. I continue to hold out hope that we will see new competitors emerge that we can’t even imagine right now, because that sometimes is what happens in the history of the internet and technology, at least recent history. But nonetheless, it’s a real problem, and I just don’t know that we’re going to see that. I don’t think changes in government regulation are mostly going to be the answer, except to the extent that the government can find ways to foster more competition so people have choices. This applies to YouTube as well, by the way. Because YouTube is such an enormous force in the online video market, there really isn’t a meaningful competitor to it in the Western world.
So if you are an online creator, you make videos or even do news, you have a channel on YouTube; you have to have a channel on YouTube. You can’t just be on Vimeo. So they too, I think, have enormous power and control over online discourse. And for all of these companies, their terms of service are effectively the constitution for their little fiefdom. So this is all, I think, something to be really concerned about. I just don’t think there’s a regulatory way around it, at least when it comes to the speech they choose to host or not host, because the First Amendment says they can make those choices. And on balance, I think we want them to.
Well, but you are a leading constitutional lawyer. I don’t know if you define yourself that way, but you’ve been involved in major cases that involve the Constitution. And it seems to me that we’re up against a real contradiction in our political system, and it goes to the question of whether the Constitution is a living document. A lot of liberal people criticize the Roberts Court, but on the question of, for instance, privacy protection in relation to cell phones, the Roberts Court declared that modern technology does not trump the intention of the founders: when the police grab your cell phone, they can’t crack it and grab your data, because that would be a violation of the Fourth Amendment. Now, it seems to me that a court could say the same thing applies to the First Amendment protections of press and speech.
That if a company has this power, this is not what the founders intended. In fact, in the Constitution you can find the basis, as we have, for breaking up that power. But as long as they have that power, they have to exercise it in a politically neutral way. Otherwise, we don’t have free speech and we don’t have a free press. If the power of YouTube, which you mentioned, is such that they can prevent people from effectively getting movies or videos to the market, or if Facebook doesn’t expose you to traffic, you don’t exist. The same thing with Twitter.
And so you’ve had a phenomenon over just the last few years, thanks I think more to the Democrats than the Republicans, of constantly attacking these big organizations and demanding that they clean up their act. Well, that’s a call for censorship. I’m no lover of Donald Trump, but to say that a former president of the United States cannot speak to the citizenry through something like Twitter seems to me an enormous power, one that would violate the spirit of the free press and free speech protections of the First Amendment, if we really want them to be vital in the modern world.
Well, there are at least two state legislatures that agree with you, Florida and Texas, and both of those states have passed what we’re calling must-carry laws. There are differences between the two laws, and they’re not very well written, but they are basically trying to say: platforms, you have to carry all speech. If you’re going to carry X kind of speech, you have to carry Y kind of speech; you have to carry all the views. You can’t discriminate based on political viewpoint, for example, and if you do, you can be sued for it. One appellate court has said that’s okay, and another appellate court has said that’s not okay. So we have two appeals courts in direct conflict, which means it’s going to make its way up to the Supreme Court for sure, and we’ll get a final determination from them.

But here’s why I worry about these laws. As a practical matter, actually making that a reality on any platform is essentially impossible, and the only way to do it is to not moderate at all. And I think we would regret it very much if that actually were the world we lived in, where none of the platforms moderated in any way. One thing, by the way, I’ve been meaning to say in this conversation: I lose no sleep over the fact that a former president can’t speak, and the reason is that he’s the former president of the United States. He has many, many arenas in which to speak. I’m not really worried that we won’t get to hear what Trump has to say about things. And that’s true for many, many political figures. One of the reasons there’s a lot of focus on Twitter, by the way, is that politicians use Twitter more than, I think, any other social media service.
But the reason I worry about the must-carry laws is that I do think we want a world where platforms, large and small, can make decisions about what speech they host. The First Amendment aside, a lot of the moderation they do is designed to correct misinformation and disinformation. It’s designed to intervene when a mob of people is going after a particular person and essentially driving them offline, silencing that person. The problem, from my viewpoint, is not that they moderate; it’s that they do it badly and inconsistently. They have policies and they don’t follow them. And there’s very little transparency: it’s very difficult for users to understand why they make the choices they make, especially when it’s a big company. And when you disagree with a decision a company makes, it’s really hard to appeal it, especially if you’re not in the United States.
And that’s one last point I would make: let’s say we had a world in which we made it incredibly difficult for platforms to decide what content they’re going to host or what accounts they’re going to keep. Well, in that world you’ve now got an international conflict, because the laws of many other countries actually require moderation. And that’s another thing, coming back to Twitter: I have no idea what Elon Musk is going to do, whether he’s really going to try to reconcile what he’s talking about doing for the United States with what Thailand requires, for example, or Poland or Germany or many other countries where he’s got Tesla factories and therefore will be under pressure from authorities. It’s really going to be a mess, and I don’t think he really understands what he’s gotten himself into.
Well, democracy’s supposed to be messy.
Yeah, well, [inaudible 00:25:19]-
Right. Well, this is what the founders warned us about, because they knew that even the more limited press of their day could be quite scurrilous and attack them viciously. Nonetheless, they put this absolute protection in there as a requirement, as a good. And after all, the United States proclaims itself an example for the world, so what our rules are, particularly for companies that are registered here and considered American companies, becomes very important. For example, in Brazil right now, you had a closely contested election, and fortunately, so far, the old, more conservative government is willing to let the more progressive government come in. But they could easily have declared it invalid. What’s important in these principles enshrined in our Constitution is a notion of limits on power and an implication of transparency.
Unfortunately, these private companies are now a target for government intrusion. Government holds hearings, brings in the head of Facebook, now will bring in Elon Musk, challenges him: you’ve messed everything up. And it demands that they then control the debate. In fact, the internet was much freer, speaking as an editor who has worked on the internet for the last 20 years. We’ve gone from a situation where there was a lot of free press and free speech, where you could find an audience even without major resources, to a very controlled media. You have people, for instance, at Consortium News, run by respected journalists, who can be blocked because they took a position that people think is more favorable to Russia than Ukraine, and they’re not alone. And that happens to right-wing organizations too. The Democrats, when they control Congress, can intimidate heads of companies like Facebook, and when the Republicans get in, they can do the same.
And we know that these big private companies cannot be trusted to be some judicial moderator of content. It concerns me that there isn’t more awareness of that. One of the really good things about having Elon Musk out there is that he’s willing to make these controversies known, including, by the way, by charging for the service, something we should talk about before we end. One of the real deceptions of the internet is that we, the public, are not the customers of Facebook or Twitter. Now, at least by putting in a fee, he’s giving us the power of the customer: do we want to pay for the service?
And so we’ve had this illusion about these companies; yes, they’re mining our data, they’re exploiting our privacy. But at least Elon Musk has let the cat out of the bag, and if he starts charging, at least it puts things on a commercial footing. Maybe others will come in and say, “We’ll give you the service cheaper and we won’t censor you.” But again, I want to end by talking about this contradiction of government permitting monopolies as long as the monopolies do what government wants. That means we lose the protection against big government, but we’re still at the mercy of its power.
So I think the concerns about the outsize power of a small group of companies are absolutely legitimate. It’s something we’ve actually been working on a lot; we have a whole competition working group now. And the reason is not that we care about competition by itself, in the abstract, but that the lack of competition affects our civil liberties in a bunch of different ways. As you suggest, it means that a small group of companies, a small group of people, have enormous power over not just our speech but our privacy and our ability to function in the world. It also affects innovation: the new services and new opportunities that we might otherwise see emerge don’t. So a very small group of people have a lot of control over human rights in particular, and I do think that’s worth paying attention to.
And in fact, part of the answer to a lot of the concerns about content moderation decisions, to my mind, would be to focus more than we do on what we can do to foster alternatives, so that we don’t have to care. One of the things I often think is, I would really like it if I could go, I don’t know, even 24 hours without having to care what Mark Zuckerberg or Elon Musk think about anything. But I have to, because my job is to protect civil liberties on the internet, and unfortunately those two people, just those two people, have an enormous amount of power over what happens on the internet. That seems wrong to me. And I think the only question is how you best get at that problem. I think the way you best get at it is not by having new speech regulations or must-carry laws, but rather by addressing the more fundamental predicate, which I think you’re speaking to: these companies are too big, and because they’re so big, it’s very easy for them to shut down alternatives.
One of the things that happens is Facebook just buys its competitors, and then there’s no competition. It’s not that competitors haven’t emerged over the years; they get acquired. One thing I think really should happen is that when those acquisitions occur, when Google buys anything, when Apple buys new companies, there should be a lot more scrutiny of those mergers in advance. We do have processes in place for this, but they haven’t been applied with the rigor they need. The current Federal Trade Commission, I think, is a lot more interested than any FTC I’ve seen before in actually putting the screws to the companies, really applying that scrutiny and actually bringing lawsuits, which they’re doing. So I’m hopeful that takes us in the right direction and that we have more alternatives down the line than we do right now.
Well, that’s a good point on which to end. I just want to say, I think what you’re saying is not just a possible suggestion, it is essential. Because, come on, Silicon Valley has lived with an illusion of the free marketplace, as if they’re all disciples of Adam Smith or Ricardo. And the fact is, you need a free market; you have to break them up. The argument against breaking up Standard Oil was, “Oh no, it’ll hurt their efficiency. They have to be big and they have to control it, or the pipelines will all be different.” We broke up Standard Oil, and it didn’t destroy the efficiency of the petroleum industry. And I think we’re going to have to bite the bullet. You go one of two ways: you either break up these companies and create the space for competition, meaning they can’t buy up their competitors.
That’s a restraint of trade, that’s a violation of antitrust, and you’d be vigorous about it. Or you say, “Hey, they are public utilities delivering a neutral product like water, and others will have to worry about moderation,” and in that sense you take away their exemption from liability for libel and all the other things they would otherwise be accountable for. So I don’t know if you agree with that or not, but it seems to me there’s a fundamental contradiction here. We’re expecting companies with secret algorithms and moderator boards that are not accountable and not elected, whose members and viewpoints we don’t even know, basically to control debate in America. And if they can control debate in America, it makes a mockery of our constitutional free speech and free press protections. Is that a point on which we can agree?
I know that’s a mouthful. Let me get back to my celebration of your organization: I would hope that the Electronic Frontier Foundation would accept that. After all, to the degree that you have some libertarian spirit floating around there in San Francisco, there was a belief in the free market. The free market requires an invisible hand. It involves competition. It means you have to have antitrust. Does it not?
Yeah. I think we fundamentally agree. It may be that we disagree somewhat on tactics, but I don’t think we disagree that the fact that a small number of companies has enormous power over the internet and online speech is a very serious problem. Nor do I think we disagree that, in very significant part, it comes down to the fact that we don’t have competitive alternatives, and we need them, and we needed them yesterday. I just wish more folks in Congress were actually focusing on that issue as opposed to where they are focusing, because it’s pretty frustrating from where we sit.
Well, that’s a positive point on which to end. I want to thank you for being so generous with your time. I always promise I’ll keep this under 30 minutes, and well, we’re close. Thank you very much, Corynne McSherry from the Electronic Frontier Foundation. They have a great website, they have a lot of literature, and it’s a very good place to start. If you want an education about freedom in the internet world, it’s probably the best source. I want to thank Laura Kondourajian and Christopher Ho from KCRW, the very strong NPR station in Santa Monica, for posting these shows. I want to thank our own executive producer, Joshua Scheer, for having found our guests and putting these shows together. And I want to thank the JKW Foundation, which, in memory of a very independent journalist, Jean Stein, helps fund these shows. See you next week with another edition of Scheer Intelligence.