Robert Scheer SI Podcast

James Steyer: Wrestling Back Privacy From the Jaws of Big Tech

The Common Sense Media founder is taking on Mark Zuckerberg and other tech barons, and he wants to empower the rest of us to do the same.
[Illustration by Mr. Fish]

Over the course of just a few decades, technology has come to play a role in nearly every aspect of our lives. While there have been undeniable benefits to technological advances, one of the main concerns that has grown alongside technology’s presence in daily life is how tech companies collect, use and profit from our data in ways we’re often unaware of. James Steyer, a professor at Stanford University and the founder and CEO of Common Sense Media, a leading consumer advocacy group that promotes safe media and technology for families, joins Robert Scheer on this week’s installment of “Scheer Intelligence” to discuss how we can fight back against tech companies’ encroachments on our privacy. Steyer’s most recent book, a collection of essays he edited titled Which Side of History?: How Technology Is Reshaping Democracy and Our Lives, examines various aspects of technology through a wide range of voices from different backgrounds, such as Shoshana Zuboff, the author of The Age of Surveillance Capitalism, and actor Sacha Baron Cohen.

“Technology is everywhere in our lives,” asserts the Stanford professor. “It is shaping all of our lives, our kids’ lives, our futures, our economy, but it’s also dramatically impacting [American] democracy and the way we live every day.

“The question we pose [in the book] is ‘Which side of history are [tech companies] on?’” Steyer continues. “If you look at Facebook and Instagram, for example, they’re clearly on the wrong side of history when it comes to impact on democracy, elections, and the wellbeing of children and families.”

A recent piece for The Guardian calls Steyer “the man who took on Mark Zuckerberg” precisely because of his outspoken criticism of the Facebook founder’s failure to address how his platform has become a mouthpiece for white supremacy and other forms of racism, as well as his work setting up the Stop Hate for Profit campaign. Campaigns such as these, which find ways to curtail Facebook’s power through ad boycotts and other means, are an important tool in the fight against big tech’s ever-growing power. In Which Side of History?, many of the authors offer concrete ideas about how to challenge Silicon Valley’s outsized role everywhere, from classrooms to voting booths and beyond.

Listen to the full conversation between Steyer and Scheer as they discuss the possible break-up of tech monopolies and the work of NSA whistleblower Edward Snowden, as well as how a few tech companies, such as Apple, might actually be on the right side of history when it comes to privacy.

Credits: 

Host:
Robert Scheer

Producer:
Joshua Scheer

Introduction:
Natasha Hakimi Zapata 

Transcript:
Lucy Berbeo 

RS: Hi, this is Robert Scheer with another edition of Scheer Intelligence, where the intelligence comes from my guests. In this case it’s Jim Steyer. His author name on the book, Which Side of History?: How Technology Is Reshaping Democracy and Our Lives, is James P. Steyer. He’s a lawyer, professor at Stanford University, right there in the belly of the beast of this whole internet world. And I must say, the subtitle of this book, Which Side of History?, understates it. It’s a collection of articles by very prominent, expert people on where the internet is going, and the subtitle is How Technology Is Reshaping Democracy and Our Lives. I think a better subtitle would be “how big technology corporations are destroying democracy.” 

And the Stanford Magazine captured that about Jim’s two other books that I know of that are very important, in earlier critiques of Facebook and other companies. One is called The Other Parent, and what Stanford Magazine says is it “paints a frightening picture of greedy media companies.” I think that’s what your current book does, also. And his other book states it very clearly in the title: Talking Back to Facebook. And Talking Back is on–both of those books have a lot to do with Jim’s concern about children, how they’re impacted by technology. 

And I must say, my hat’s off to you, because you know, you’re taking these folks on right where they live. I mean, much of this internet world, originally developed by the Pentagon as part of Cold War military strategy, nonetheless really took off with Stanford as the center. And you teach there, and a number of these people that write in the book have actually been important to that. So why don’t you lay out how you would summarize this new collection, and the takeaways. Do we–as a congressional subcommittee just recommended as we’re going to air, that they be broken up? Are they that kind of a menace? Do you see some prospect of reform? Just first introduce the book. 

JS: OK, that’s great, Bob. And thank you so much for having me on your show. I have been a big fan of yours for many years. So first of all, I’m the head of the biggest kids’ media and child advocacy group in the United States, Common Sense Media. And Common Sense is also the largest tech advocacy group, not just in the U.S., but globally. So we have been raising these issues about the impact of technology on society for 15-plus years. Now, we have well over 100 million unique users who go to CommonSenseMedia.org for ratings and reviews of movies, TV, video games, websites, apps, books, et cetera. So the broader public–and I’m sure our listening audience knows Common Sense mostly for the consumer platform that’s so popular with parents here in the U.S. and around the world.

But the truth is we’re significantly the largest advocacy group in this space, both in the United States and globally now. So we, for example, are the people who authored and spearheaded the passage of the California Consumer Privacy Act, the landmark legislation in 2018 that is basically the national law of the land regarding privacy. We’ve passed and spearheaded most of the major tech legislation in the United States over the past 10 years, usually framing it in the context of children and families. But as you can tell from this book, we also are big players in issues related to the impact of technology on democracy. And for example, most recently, we’ve been partnering with the ADL and the NAACP to run a major campaign against Facebook called Stop Hate for Profit, which led to a massive advertising boycott of the Facebook and Instagram platforms, and other sanctions against Facebook, which we considered the biggest offender of all the tech companies.

To your beginning question, I agree, I could have had a more provocative title in terms of really going after the big tech companies. But the truth is, what Which Side of History? is about is a series of essays by friends and colleagues of mine in the field about the impact of technology on society. So some of them are from people like Kara Swisher, or Sacha Baron Cohen, or Marc Benioff, the CEO of Salesforce, about the bigger impacts of technology on society. Some of them are about how tech is hurting kids, so that’s people like Willow Bay, the dean at the Annenberg School at USC; Madeline Levine, the well-known psychologist; Chelsea Clinton, who writes about kids’ brain development. Then we have chapters in here about big tech being a threat to democracy, and that includes everybody from Larry Lessig, the Harvard professor, to the conservative Yuval Levin, who has written really well about the internet, to Senator Mark Warner, who writes about the assault on civil discourse.

And then, honestly, we look at issues about breaking up big tech. People like Tim Wu at Columbia; Ken Auletta, the journalist who writes for the New Yorker; Tom Friedman, the New York Times columnist. And we write about tech and race with some of the leading folks in the country around that, like Michelle Alexander, or Shaun Harper, another USC professor; Ruha Benjamin, a Princeton professor. This is–it’s a broad collection of essays, but there’s one theme, which is this: technology is everywhere in our lives, Bob. It is shaping all of our lives, our kids’ lives, our futures, our economy. But it’s also dramatically impacting our democracy and the way we live every day. 

So it’s a big book, and the question we pose, and I’d love to discuss with you, is which side of history are these guys on? And I would say that if you look at Facebook and Instagram, for example, they’re clearly on the wrong side of history when it comes to the impact on democracy, elections, and the well-being of children and families. So that’s a long-winded overview, but this is a broad look at tech’s impact on society at an absolutely critical moment in American history. 

RS: OK. So let me stipulate at the outset, I spent quite a bit of time reading this book, and I want to push it, I want to recommend it; I think it’s a really necessary part of the discussion. And so I don’t want to, in anything I do now, to dissuade people from reading it. However, I am going to push back, because I think it’s only part of a discussion. And frankly, it contains some alarming points to my mind. First of all, let me stipulate, I agree with the Democrats on the House Judiciary Subcommittee who recommend breaking up these companies. Whatever happened to anti-monopoly? That is true, they have enormous power. And they’ve been indulged, by the way, by both Democrats and Republicans as this wunderkind, and they’ve been allowed to grow enormously. And their business model is atrocious; it’s basically exploiting our privacy, certainly for Google and Facebook; less so, say, for Apple, that actually makes things. But the fact of the matter is, everyone was asleep at the switch. 

And I recall, because I was covering this as a reporter for the Los Angeles Times back in the 1990s when Bill Clinton was president, and we had the Financial Services Modernization Act, one of the most horrible pieces of legislation that deregulated Wall Street, and which Bill Clinton did with the Republicans. But at that time a conservative columnist, William Safire, who had actually been Richard Nixon’s speechwriter, supported people like Ed Markey, who was then in the House–he’s now the senator from Massachusetts–and Shelby, who was a conservative from Alabama. Nonetheless, they said something very important about the deregulation of Wall Street. They said, if you’re going to allow these commercial and investment banks and insurance companies all to merge, the most important thing they’re merging is the data about Americans. Their most intimate medical records, financial records, and so forth. And they pushed, at that time, for something called opt in. Not opt out; they’re different. 

And in your book, in your essay in the book, you favor opt out. Now, opt out is the reason–just to explain to people, that’s where if you learn they’re doing something with your data, and you’re offended, you write a letter, you complain, and then hopefully they’ll drop the use of your private data. You know, your shoe size or what books you read or what movies you watched, or who your friends are and so forth. Opt in means when they gather your data–which is the only reason they’re giving you this stuff for free; the reason we don’t pay for Google, the reason we don’t pay for Facebook, is because we are not the customer. We’re the sucker whose privacy is being exploited; that, your book and your writing makes quite clear. 

But the only way to interfere with their profit model–and these are enormously profitable companies, to the expense of traditional journalistic outlets and what have you, which are all floundering–is to require that if they’re going to use your data, what book you bought at Amazon or who you ate with, that you have to opt in. You have to give specific permission for them to use that. And if I have–so one error in what you’ve written in this current book, you stress opt out, which I think, frankly, is a cop-out. Because it doesn’t really work; it just inundates people with a lot of mail that they don’t read. And the way to get at their business model, even more important than breaking them up, is to require them to have the explicit permission of their customers to use their private data. And as I say, William Safire is someone who grasped it back in 1997, ‘8, ‘9, and Bill Clinton didn’t; he brushed it away, and he gave these banks and insurance companies the right to do whatever they want with our data. 

JS: Yep. Can I answer?

RS: Yeah, you can take the–

JS: I agree with you.

RS: Oh, OK.

JS: Bob, I agree with you. I believe in opt in. Let me tell you, I’ve had this argument with senators, with President Obama, probably with President Clinton; I agree with you. I think the right standard actually is opt in. And Common Sense, when we passed the California Consumer Privacy Act in 2018, the landmark privacy law in the United States, we initially proposed an opt in, explicit permission standard. So I actually agree with you, personally. The truth is, we’ve never been able to get anybody in the government to go that far with us. Maybe Bill Safire did years ago, but we’ve never been able to. But I agree with you.

So I’ll start out by saying you’re right, opt in is really the most protective standard for consumers. I would go there, personally, as Jim Steyer; I think Common Sense Media would, too. But it’s never been really on the table. Obviously, the tech companies don’t want that at all. There’s a really good piece in Which Side of History? by a professor at Harvard Business School, Shoshana Zuboff, called “Surveillance Capitalism.” Now, that’s a great way to look at the absolute hoovering up of data by these big tech companies. And let’s be honest, data is the new oil. It’s the new holy grail. And they have absolutely hoovered up everybody’s data without any regulation over the past 10 to 15 years. 

But I’ll be frank with you, Bob, I agree with you that opt in should be the standard; but we’ve never been able to get there close, politically, in any of the discussions we had. And so when we initially wrote the CCPA, the California law that is the law of the land in the United States–and even in GDPR, which is the European privacy law–we always wanted opt in. It’s just, we’ve never had the power to get it. But I would agree with you. I think you’re right philosophically, and I think before they use your data in any way, you ought to opt in for it. So I actually agree with you.

RS: OK, so just to let listeners know what we’re talking about, the difference is that if they know, they’ve learned what book you’re reading and how far you’ve read in that book, and who you had dinner with, and who you went home with, and all this detail–whether you’re Yelping or you’re Facebooking or you’re Googling, all this enormous data, this minutiae about you, and they’re going to use it for a different purpose than finding the restaurant, OK–they have to tell you. And they can tell you they don’t want to do it, but the reason they don’t want to do it is not that it’s difficult technologically; it’ll cut into their profit. Because the thing they’re selling to advertisers is your privacy. 

JS: But Bob, can I just make a point? Here’s the thing. I agree with you totally, but here’s the truth. We’re in the business of getting stuff done. I run the most important tech advocacy group in the United States, if not globally. And that’s only part of what we do. To pass legislation, we’ve had to compromise on certain things. And honestly, the difference between opt in and opt out is a big difference. But we could have never passed this landmark privacy legislation in California–which is the law of the land in the United States, essentially–without having made that compromise. But I agree with you personally–

RS: OK, but let me try to be optimistic–oh–

JS: –Bob, we have a law now. We didn’t have a law two years ago. [overlapping voices] Legislation requires some degree of compromise, even with some of the basic principles. But I agree with you philosophically, completely. I do. 

RS: All right. So let me give your cause an optimistic view of this. I believe very much in exploiting the contradictions of capitalism. Or the contradictions of any system; as Leonard Cohen said, there’s a crack in everything, that’s how the light gets through. OK, so we had some light recently, because not all of these companies have the same exact interest or profit model. And let me just give you the tension–I’m sure you’re aware of it–between Apple and Facebook that has surfaced recently. 

JS: Exactly.

RS: OK. Apple has embraced, pretty strenuously and consistently embraced a notion of privacy protection. Why? Because, in fact, people won’t trust their phones or their computers, particularly because their multinational sales are critical. Why should people in another country, whether in one of the European Union countries or in India or China, trust an Apple iPhone if Apple is just using their data, either by giving it to the U.S. government or using their data to market with other companies? So privacy becomes very important to people who make these products that we’re using. Apple has pushed back effectively on privacy, fighting against back doors by government, fighting to protect your privacy. Facebook sees it as fundamentally threatening its business model.

JS: Exactly.

RS: Because Facebook makes its profit by this. So we should recognize there are conflicts within these ruling giants of the internet world. And I’d think a company like Apple would support a much more extensive protection–

JS: Can I comment there? 

RS: Yes, of course, that’s why we’re doing the interview.

JS: Bob, you’re absolutely correct–Bob, you’re a wise man, indeed. You’re absolutely correct about this. And one of the things we told people when we passed the consumer privacy law in California–which, again, is the U.S. law, essentially–we passed it because we split the tech industry. Because Apple supported us. Because Salesforce supported us, and Microsoft supported us. And initially, Google and Facebook were on the other side. And then they wimped out at the end and joined in because we were going to get a unanimous vote in the California legislature. 

But you’re absolutely correct: you cannot think of the tech industry as monolithic when it comes to privacy. And you’re absolutely correct in highlighting Apple as having a completely different business–it’s all about the business model, stupid, as James Carville would say. And so if your business model is based on advertising, and about using my data, selling my data to other people, then you do not want an opt in standard, you don’t want privacy, you want to sell my data. But if you’re Apple or Microsoft or Salesforce, you’re OK. And that’s how we passed the law. So you’re absolutely correct about how this really works. And it is all about the business model, and that’s why I referred you to Shoshana Zuboff’s article, because I’d say she’s the preeminent scholar in the United States about surveillance capitalism, and how you monetize other people’s data. 

We have a ballot initiative on the ballot in 2020 here in California, Prop 24. It’s not a perfect ballot initiative, let me be clear. It doesn’t contain a number of things we would have liked to put in there, but it will pass. And the voters in our audience will vote for it, because it does improve privacy protections for consumers in California. And, same thing: if you are in the business of making all your money by, as Facebook does, by monetizing your and my data and selling it to advertisers, you oppose Prop 24. But if you’re Apple or Microsoft or Salesforce, you’re fine with it. And you have, as usual, Bob, hit the crux of the argument. It’s all about the business model. But we understand that. And that’s how we’ve been able to be so effective, even though we’re up against the largest companies in the world. 

RS: OK, so let me test the waters even further. [Laughs] And I appreciate–the biggest problem I have–not with your work; I think it’s great, and I think the book is important. I just don’t think–I think it goes–and this, I think, is a serious concern, not just in your approach. But when we had the Edward Snowden revelations about how easily our government, the U.S. government, was able to gather this data that we’re talking about, this private data that was collected in the market situation by Google, by Facebook, by Apple and others–and in clear violation of the Fourth Amendment to the U.S. Constitution. And after all, our protections in this country are primarily against government overreach, not, unfortunately, against private sector overreach. But the reason being that governments can kill you, arrest you, or threaten your freedom in a very fundamental way. 

So the wisdom of the founders was to try to control government, and these protections are sacred. What we have learned, thanks to whistleblowers like Edward Snowden and others, [is] that as bad as the private sector is in gathering this data, exploiting it, taking advantage of you–that doesn’t rise to constitutional protection of our system. You know, you can say, oh, they should be better actors and so forth. What rises to constitutional protection–and you’re the lawyer here, I’m not–but clearly is government grabbing that data. And what I found, frankly, disappointing in the book is that I really–I saw something about, a couple of people had essays about, you know, Russia interfering in our election, and you mentioned that as well. But the fact is it’s our–our government has set the model for the world in grabbing the data that private businesses collect, and using it, they say, to make us more secure. But they throw it–you know, what the NSA does, what the CIA does, and so forth. And again, I’m going to get to these contradictions of capitalism. Because these companies have to sell their products multinationally in other countries, that gives them a bad rep if they are just–if they have back doors for the CIA or the NSA or what have you. 

And so we’ve actually had pushback–and let me say something favorable about Facebook here–that Facebook, once the word was out that government was doing this, it cheapened their product internationally. So it did for Apple, so it did for Google. And there was actually, they have a joint website, there was pushback against NSA, against the CIA, against the FBI, including when cell phones are broken into and so forth. And as you know, the really most important decision was by John Roberts, you know, in Riley v. California, saying the police can’t grab your cell phone and grab that data, because it violates the Fourth Amendment. And to my mind, the Fourth Amendment in the U.S. Constitution is turning out to be the holy grail of freedom in the internet age. And I don’t–I think we really, if we’re going to talk about the private companies grabbing all this data, the far more ominous danger, whether it’s our government, the Egyptian government, which grabbed that data after the Arab Spring, the Chinese government, the Indian government–governments all over the world–the U.S. government has set the example. As bad as the private companies are, at least they’re restrained by consumers rejecting them. But when the government does it and lies about it, that is really the ominous threat that our Constitution was designed to guard against.

JS: OK, but so let me ask you this. What’s your question to me? I agree with what you just said. I am no fan of Edward Snowden, though. Let me be clear, Bob. I’m not going to get in here and support what Ed Snowden did. I am not a fan of his at all. But I obviously agree with you that government surveillance and the Fourth Amendment–as a con law professor at Stanford University for many years, I agree with you. But I would tell you that the issue about surveillance capitalism, and about the fact that Facebook in particular has just taken data and used it in so many different ways that are inimical to important aspects of our society, is an equally important problem. But I agree with you about the government issue. 

But I’ll tell you, I actually think in some ways the issue of what Facebook and others have done is a bigger deal today for the average citizen than even what the government’s doing. That’s worth debating. But I agree–it’s funny, Bob, I agree with much of what you’re saying, as usual. And since there are 50 incredible contributors to Which Side of History?, some of the most thoughtful and respected writers in the world on these topics, I didn’t control what they wrote. But I agree with you very much about the Fourth Amendment, about the incredible importance of protecting us from government surveillance and intrusion in our lives. But the role of the tech companies has been just phenomenally important, and not so good.

RS: Well, I certainly agree. And I said at the beginning, I think they need to be transparent, they need to be broken up–I mean, just the very idea that you could talk about the wonders of capitalism and deny what Adam Smith and Ricardo and all the great classical economists were talking about, and say that somehow now in the age of the internet, cartels and monopoly capitalism is the way to go. And by the way, the guy who’s advanced that most is one of your neighbors there, I still think, in–well, you’re not in Silicon Valley, and down there by Palo Alto. Yeah, but Peter Thiel, for example. A very good example of the crossover here–

JS: He’s ridiculous, he’s ridiculous. Come on–

RS: No, but don’t say he’s ridiculous, he’s very powerful–

JS: My wife was Peter Thiel’s RA at Stanford, Bob. Peter Thiel–

RS: OK–

JS: He’s a pure libertarian. He’s a pure libertarian.

RS: If he were a pure libertarian there’d be no problem, because at least then he’d be consistent and honest. And a pure libertarian would not believe in the concentration of capital, they would believe in the invisible hand and a free market. So let’s not–OK, I want to make my point, though, because this to me is very critical. Here you have a company, Palantir–if we talk about data being the new oil. The new oil, OK. Palantir just went public this week, and they were disappointed that they ended up selling, what, $20 billion worth of stock or something. They thought they would sell $40 billion. Palantir is a company that was started by the CIA. That’s not a conspiracy theory. 

And by the way, I can tell you, I can’t think of a person in the world right now I admire more than Edward Snowden. Because I wouldn’t be able to have this discussion now in an informed way–an informed way–if Edward Snowden had not told us, just like Daniel Ellsberg did in the Vietnam War when he revealed the Pentagon Papers, because I would be at the mercy of people saying, oh no, the government doesn’t grab data from Facebook and Google. But thanks to Edward Snowden and the volume of what he released, they can’t argue that. So they were revealed to be in cahoots with the government. But let me put that aside. Let me take Palantir as an example–

JS: Much as I respect you, Professor Scheer, I agree to disagree with you about Edward Snowden. Having said it, I agree with a lot of what you’re saying overall, as always. 

RS: OK. One of the great, I would say, actually the most important citizen we’ve had in the last 20 years. I just can’t tell you what courage and–and there were no channels. But let’s not go into that. OK. What I do want to say, I want to hold this up as an example, OK. Here’s a company started by a company called In-Q-Tel, a venture capital company started by the CIA, owned by the CIA, OK. That isn’t something we would know. That’s a secret, that’s dark. We’re not allowed to know it; you go to jail if you reveal it, OK. So this company called In-Q-Tel then provides some of the capital for Palantir when it starts. It becomes Palantir’s only client for three years, total secret. And for three years this private company, which just went public and earned $20 billion–and these people, once again, Peter Thiel, who invested in it, who came from PayPal and so forth, and was a big investor in Facebook and what have you–OK, this company was given access to the most secret data files of the CIA. And this company then developed its investigative tools working in the CIA for three years, in this super-secret agency. And that’s how they developed their whole technology of data mining and what have you. They then go out and they get other clients as well, but they also remain the main advisor, or one of the key advisors, to all of the security agencies in building their clouds and everything else, the NSA and what have you. This is–now we’re talking Orwell, OK? This is where I’m heading, OK.

So they are there, and this guy Peter Thiel that a lot of liberals don’t like, because he spoke at the Republican convention–by the way, one of the best things he’s done, because finally we had a gay man speak to Republicans, so that has to be good for human progress. But nonetheless, yes, he supports Trump and what have you. But this man, he knows everything about us, which is the title of my most recent book on this subject. And the fact of the matter is, that is what is really at stake here. We citizens are surrendering our private information to private companies. We have the ability to fire them–and one reason I am for breaking them up is that we shouldn’t have just one search engine. The other search engines should have a chance, and Google and Apple and Facebook should not be able to gobble up all their competitors. 

JS: I agree–

RS: It’s a clear violation of the free market. But I want you to agree with something I’m going to say right now. The Orwellian menace–and this is every society in the world; if you don’t believe it about our government, just think about all the information–the people in Egypt, there was a lot of cheering because Google, some Google executives were helping in Egypt during the Arab Spring. But those people most active in the Arab Spring were rounded up by the Egyptian military dictatorship. Some were killed, some tortured. Why? Because their government can get all the information that the U.S. government can get from this data, by breaking into coaxial cables and everything else. And Eric Schmidt of Google, in his book [unclear], admitted that even two-bit dictatorships can now take off the shelf stuff that J. Edgar Hoover couldn’t dream of. So we had the example in our own society where our FBI tried to destroy Martin Luther King and drive him to suicide–well, any government in the world now can get that information, or make up information, plant information about anyone. And the brilliance of Edward Snowden’s revelations was he told us our government was doing it when the best minds in our congressional committees that are supposed to monitor the government didn’t tell us, and they knew. The Senate Intelligence Committee, the House Intelligence Committee. And we were not forewarned, so–

JS: Bob.

RS: –ah, yes.

JS: My point is this. So I–look, as usual, you have a lot of really wise things to say. But here’s the thing. I agree with you, and I think that we–and we have to be concerned about government surveillance, whether it’s in the United States or in other–can you imagine saying “other authoritarian countries like the United States,” but you sort of feel that way right now. Having said it, these are big issues. Peter Thiel may have been right in this one area, but he’s been a huge problem in many other areas.

RS: No, no, I’m not saying he was right. He’s one of the owners of Palantir!

JS: I know he is. But the thing is, you’ve got to look, you’ve got–but you’ve got to understand from a consumer standpoint, and from like a kids’ and family standpoint–that’s the field that I work in and run the biggest organization in the country on. And look, the behavior, particularly of Facebook and Instagram, but also some of the other companies, including L.A.-based Snapchat, for example–are really, really problematic in this area. And I agree with you about the concerns, and I teach about this in my courses at Stanford, about government surveillance, if you will. But I also feel incredibly strongly about just the pure market-based surveillance that’s going on. And that’s why we were the founders of, co-founders of that Stop Hate for Profit effort against Facebook. It wasn’t just about the fact that they amplify hate speech and racism, and have been a gathering place for white supremacists, and all the far-right insane militia groups.

So these are really big issues, and you shouldn’t lose the forest for the trees. That’s all I would tell you, is there’s many more issues involved. And I actually think that Which Side of History?–again, I’m just the editor; I have my own piece in it about Section 230–but Which Side of History? is, you know, trying to paint all the big issues here. And they are big–you’re correct, these are really big issues facing our society because of the incredible pervasive impact of technology in our lives. So, you know, so all well and good in terms of limiting the impact of government surveillance. But let’s look at the bigger picture, too. And I agree with you on like 90% of what you’re saying. 

RS: OK, and I hope you can agree with this concluding idea–

JS: Not about Edward Snowden as a hero.

RS: Oh, well, you should reevaluate that. [overlapping voices] Was Daniel Ellsberg a hero?

JS: He’s not Daniel Ellsberg to me. I’m sorry, I don’t think he’s the same. But let’s move on, because we could spend too much time on this.

RS: But just make me a promise. Just think, when we’re not on the air–and we could talk later, because you know, I have a lot of respect for you. And just think about what is the real difference between Daniel Ellsberg and Edward Snowden. You don’t have to answer that now, that’s not the subject–

JS: Fair enough.

RS: OK, but let me try to conclude this with a point of unity. Because I do think your book is very important, and I think the work that you’ve done is probably the most important. I began by applauding your courage. 

JS: Thank you.

RS: You are–I mean, actually, I shouldn’t put down Stanford, because you have some of the best privacy advocates at Stanford. And you know, and by the way, this is why I respect libertarians like the Electronic Frontier Foundation for being consistent as libertarians. And if you’re libertarian, you believe in the free market, you don’t believe in monopolies and cartels. So let me defend honest libertarians. 

But I want to have a point of unity here. I think that Facebook should embrace your book. And let me explain–let me explain why. And again, it goes to the contradictions of capitalism, and multinational capitalism. Because Facebook, their profit is international. And everybody knows the American market is a market [that] is saturated; that’s why you have to get into China, you have to be credible in India, Brazil, everywhere in the world, and certainly with the European Union and so forth. Your book is valuable because it exposes–and your writing, up to now, your work. Absolutely invaluable, because you had the courage to expose the chicanery and the cynicism and the greed and the lust for profit of these companies. 

You deserve maybe more credit than anybody in exposing their corruption of purpose, their greed. OK, I just want to be clear about that, which is why I’m promoting your book. And I think that comes through. The very wide variety of people that you managed to put into this book–and we should give the title again, [Which Side of History?]. And the point is, either you are for protecting the key ingredient, the freedom–which is, again, encapsulated in our Fourth Amendment–if people don’t have private space, if they’re constantly manipulated and their privacy, whether in their home or anywhere else, is destroyed, they can’t be free, they can’t be independent, and they can’t have the courage of their convictions. They’re just exposed out there. 

So privacy is absolutely critical, and I applaud you for championing it. But I think the, again, quoting Leonard Cohen, the great songwriter and singer, there’s a crack in everything, that’s how the light gets [in]. The crack here is that multinational corporations are going to be undermined by their home government’s obsession with invading privacy and with a totalitarian impulse. Because they have to be credible all over the world. That’s their business model, even more important–and this is what drives Apple to take a very progressive position on this question. Because they have to be credible to everyone in the world. If there is a back door into your iPhone that the CIA can exploit, and get your information, why the heck should anybody in the world buy an Apple iPhone? They’ll buy one made by some European company if they’re in Europe; they’ll even feel better about buying it if they’re in China, at least the devil you know, and the other. 

And so what you have here is that, yes, as long as this was not known–revealed by courageous whistleblowers like Edward Snowden–but as long as it was not known, these companies could get away with it. Oh, we’re collecting your data, but it’s just to give you a better shopping experience, it’s to help you find a better restaurant, or these ads appear. Now those ads that appear on Facebook are creepy. How did they know I wanted a suit or a dress? How did they know I like Chinese food? How did they know this? And companies like Amazon and Apple, they’ve also specialized in that marketing. What is Amazon but getting all this–Amazon operated at a loss for many years, but they were gathering data. And the Amazon, much of their profit comes from their cloud, and the storing of data. And you’re right, data is the new oil, OK? But as long as it’s done by, not monopolies, but private enterprises that are kept reduced in size, as long as you have competition–then, say, a browser, and there are some that promise you a better level of privacy or allow opt in, they have a marketing tool. They can say, we’re more reliable. But if they suck up all the air and buy all the competition and form a monopoly or a cartel, you don’t have choice. 

So if we can do what you want to do and break up these companies or have competition, then at least consumers can find privacy-friendly companies, OK. But the only reason these companies are going to change is by the embarrassment of people all over the world knowing that they are pawns of their host government in the United States, or elsewhere. And that the governments that they’re living with can do the same thing. So the governments in the Emirates or Saudi Arabia, the government in Brazil–they can get that data. And so they have a stake in switching to the pro-privacy mode. Apple was the first; Microsoft dragged along. Apple, I must say, my hat’s off to them for embracing this as a marketing tool, privacy as a marketing tool. 

And the way we’re going to get some progress here–because we’re not going to suddenly nationalize all these companies or anything; that’d be worse, the government would get all the data directly. What we have to do is the thing you’re advocating, and I guess the congressional Democrats now are going to get behind it, is preventing them from gobbling up all the competition, so you do have competition. But the other thing is to keep government away from that data. And remember, even if you love your own government and think it’s never going to do anything bad–which is really the beginning of accepting tyranny–you’ve got to remember, these products are being sold all over the world. And so somebody buying it in China or Saudi Arabia has to feel that an Apple product has some privacy protection that will apply to them as well. 

And I can tell you right now, as a teacher, as you are, doing my classes online, I’m talking to students around the world who have gone home and yet are taking my USC classes. And I’m happy that you had our dean, Willow Bay at the Annenberg School, in your book, and I think it’s a really important contribution. And I respect a lot of what Annenberg, our school at USC, has done, same way you respect Stanford. But talking to students now who are using Zoom, for example–developed, actually, and I give him credit, by somebody who started out with an education in China. I’m not putting down Zoom; I think it’s been a very useful tool. But the fact is, our students now back in their host country–whether, again, it’s Brazil, it’s Saudi Arabia, it’s China, or what have you–suddenly, the protection of their information and privacy is critical to their safety. Their safety, OK? And the U.S. government has set the standard, the horrible standard, for intrusion on their privacy. That has to be recognized. Our government has been exposed as the greatest invader of this private space of internet communication. So I’ve got to put that in as sort of my own closing little editorial, but I’ll give you the last word.

JS: Well, here’s the thing, Bob. First of all, you are one of the great thinkers that I’ve known for many years, one of the really great investigative journalists, and married to a fine journalist as well, whom I’ve known for years. I think, Bob, you’re raising such important points. I mean, the reason I wrote Which Side of History? is because whether we like it or not, technology has this incredibly important impact on all of our lives–all of our lives. And it’s incredibly important that we recognize that as a society. Whether you’re a parent, whether you’re an educator, whether you’re a social advocate or a journalist like you and your bride. You know, technology has revolutionized virtually all aspects of our lives. 

And I happen to be fortunate to run what is the most important advocacy group in the field, in the United States and potentially globally, which also happens to provide a lot of services for parents and educators around the world, in addition to the advocacy work we do. And this is a discussion that we need to have. It is almost as important–I didn’t say as important, but almost as important as the presidential election that we’re all facing. I mean, this is an existential issue in terms of how much technology is reshaping our lives in our democracy, and which side of history these companies are going to be on, and how we’re going to hold them accountable. 

So if Which Side of History? can provoke an incredibly important societal discussion about those issues, and get people as smart as Bob Scheer and the other people who are reviewing this book and focusing on it, to really hold the tech companies accountable for their behavior, and ask really important questions of our government leaders–whether it’s at the federal level, the state level, or even at the local level where you’re based in Los Angeles–then this book will have succeeded. Because these are fundamental, existential questions that we as a society face. And it is fair to say, it really is, that the answers to those questions will have a big impact on your children and grandchildren, my kids and grandkids, and the rest of our society. 

So the opportunity to hear your sage and thoughtful input, Bob, and to be part of a conversation with you about which side of history the tech industry is on, is a real privilege for me. And I’m absolutely grateful to you for welcoming me to your program, and let’s have many more discussions like this. On college campuses around the country, in living rooms and dining rooms around the country. And ultimately, let’s shape the kind of world that we all believe in and want to have. 

RS: Yes, and let me just say, you mentioned my “bride,” so people might be confused. She’s Narda Zacchino, and she was the associate editor, or deputy editor, of the San Francisco Chronicle [and a top editor at] the L.A. Times. And she was an early supporter of what you’ve done, and she as a journalist thought you were really on the right track. And she’ll probably be furious at me for giving you a hard time about some of these issues. 

And let me end the way I began. I think this is an incredibly important book. And I think it’s indispensable reading, because what it does is it establishes the power–the power of this new technology, which we have all taken for granted. I say the internet is the best and worst of all worlds. And we wouldn’t be able to teach now–we’d probably have mayhem if we didn’t have the internet right now. We have correspondence, we have information. So we have the best–we can get good news out, we can learn, we can study, we can do a lot. And this pandemic, if you compare it to the great pandemic of 100 years ago, we’re managing it much better. On the other hand, it is also the worst of all worlds, for the very reason that you describe in your book, and your contributors. So, read the book. 

And with that, that’s it for this edition of Scheer Intelligence. Our technical engineer and leader is Christopher Ho, who puts these things up on KCRW; we thank him. Natasha Hakimi Zapata, who writes the introduction. Lucy Berbeo, who does the transcription. Joshua Scheer, the producer. And I want to give a special thanks to the JWK Foundation, which in the name of a great writer, Jean Stein, has given me some funding to help me do these broadcasts. I want to shout out, read Jean Stein’s legacy of work if you want to know why. And that’s it for this edition of Scheer Intelligence. See you next week.  

