
What Happens When Big Brother Meets Big Tech

Author and law professor Maurice Stucke warns that as fundamental privacy rights vanish, your personal data can and will be used against you.

By Lynn Parramore / Institute for New Economic Thinking

University of Tennessee law professor Maurice Stucke, author of “Breaking Away: How to Regain Control Over Our Data, Privacy, and Autonomy,” has long been critical of tech firms that have grown into giant “data-opolies” profiting from surveillance and manipulation. In a conversation with the Institute for New Economic Thinking, he warns that legislative inaction and wider government complicity in this surveillance are eroding fundamental rights to privacy, along with the ability of federal agencies to regulate Big Tech.

Lynn Parramore: Concern over privacy is increasing right now, with people worrying about different aspects of the concept. Can you say a bit about what privacy means in a legal context? With the digital revolution, privacy obviously means something different than it did 50 years ago.

MS: Yes, privacy is not a single unitary concept. There are different strands. There’s bodily privacy and decisional privacy – the right to make important decisions about one’s life, like whether to have a child or not, without governmental interference. Within the bucket of decisional privacy would also be marriage, contraception, and things of that nature. There’s intellectual privacy (such as what one reads, watches, or thinks) and associational privacy (such as the freedom to choose with whom one associates). Informational privacy is another strand, where you can control your personal information, including the purpose for which it is used.

There used to be the idea that data protection and privacy are fundamental human rights.

Numerous supporters of privacy rights have argued that the U.S. Constitution should protect an individual’s right to control his or her personal information. One of the earlier Supreme Court cases involving informational privacy [Whalen v. Roe] tested that belief. New York passed a law requiring doctors to disclose to the government their patients’ names, ages, and addresses whenever certain drugs were prescribed. All of this information was collected in a centralized database in New York. A group of patients and their prescribing doctors challenged the law, contending that it invaded their constitutionally protected privacy interests. The case was decided in 1977 — before the Internet and cloud computing. The Supreme Court, however, did not perceive any threat to privacy implicit in the accumulation of vast amounts of personal information in computerized data banks or other massive government files. The Court instead noted how the mainframe computer storing the data was isolated, not connected to anything else. Today, the data are not collected and maintained on some isolated mainframe. A torrent of data is being collected about us that we may not even have thought about. When you purchase gas at the local station, for example, you may not think of the privacy implications of that transaction. But there are powerful entities collecting vast reservoirs of first-party data from customers, as well as sources reselling it, like the data brokers.

Congress, unlike the Supreme Court, recognized in the 1970s that the privacy of an individual is directly affected by the government’s collection, use, and dissemination of personal information and that the government’s increasing use of computers and sophisticated information technology has greatly magnified the harm to individual privacy. The preamble of the Privacy Act of 1974, enacted by Congress, states that privacy is a fundamental right protected by the Constitution. It was a landmark law in seeking to provide individuals greater control over their personal data in government files.

But the Supreme Court, on two occasions when it had the opportunity, declined to hold that the Constitution protects informational privacy as a personal and fundamental right. A majority of the justices just punted. They said that even if one assumed that such a right existed, it did not prevent the government from collecting the information it sought in both cases. Justices Scalia and Thomas were blunter in their concurring opinion: they simply argued that there is no constitutional right to informational privacy.

LP: What are some of the ways we are most vulnerable to government intrusion into our personal data right now?

MS: Well, the state can tap into the surveillance economy.

There are significant concerns about virtual assistants like Alexa. There was a case in Arkansas where someone was murdered in a hot tub, and the defendant had an Alexa device in his house. The government sought from Amazon any audio recordings and transcripts that were created as a result of interactions with the defendant’s Alexa device. The government wanted access to the data collected by Alexa to see if there was any incriminating evidence. Alexa records what information you ask it to find. That’s the data it’s supposed to store. But there have also been concerns that Alexa may record more data than it was intended to, such as communications between family members.

Geolocation data is another big concern. Consider the Supreme Court’s decision in Carpenter v. United States [the 2018 decision requiring a warrant for police to access cell site location data from a cell phone company]. The Court said there’s a privacy interest in one’s geolocation data under the Constitution’s Fourth Amendment. Our movements, the Court noted, provide an intimate window into our lives, revealing not only where we go, but through them our “familial, political, professional, religious, and sexual associations.” So, how then did the U.S. Department of Homeland Security obtain millions of Americans’ location data without any warrant? The Trump administration simply tapped into the surveillance economy. It purchased access to a commercially-available database that maps our movements every day through our phones and the apps on our phones. Unless you turn off your phone or leave it at home, your phone is tracking you and potentially letting the authorities know where you’re going, how long you stayed there, when you came home, etc.

LP: So our geolocation data can actually be purchased by government officials, bypassing the need for a search warrant?

MS: Exactly. Now the government can just buy it, and that’s even scarier. The current Supreme Court does not appear to view the right to privacy as a personal and fundamental right protected by the Constitution. This is where Europe differs — its Charter of Fundamental Rights specifically recognizes privacy and data protection as fundamental human rights. Some U.S. states, including California, recognize privacy as a fundamental right as well, but not all. That’s one of the concerns with the Court’s overturning Roe v. Wade — it strips away privacy rights that had been inferred from multiple constitutional provisions. The Dobbs v. Jackson Women’s Health Organization decision really shows how a simple change in the composition of the Court can enable it to eliminate or chisel away what had been viewed as a fundamental privacy right. If the Court says you don’t have these rights, that these rights aren’t in the Constitution, you would then have to get a constitutional amendment to change it. What are the chances of that? I remember, growing up, the challenges of getting the states to ratify the Equal Rights Amendment. No one even talks about amending the Constitution anymore.

So now you have the states and federal government tapping into the surveillance economy. The government can be complicit in it and even benefit from the private surveillance economy because now it’s easier to prosecute these cases without getting a warrant.

LP: So far, we’ve been talking about what we do online, but you’ve pointed out that it doesn’t stop there because the line between the online world and the offline world is blurry.

MS: That’s right. For example, Baltimore has a very high per-capita murder rate and clears only 32 percent of its homicide cases. Even though Baltimore installed over 800 surveillance cameras and a network of license plate readers, the high crime rate persisted. So a private company offered three small airplanes equipped with surveillance cameras that could cover over 90 percent of the city at any moment. The pilot program tracked over 500,000 Baltimore residents during the daytime. Ultimately the program was struck down by the U.S. Court of Appeals for the Fourth Circuit as an unreasonable search and seizure [Leaders of a Beautiful Struggle v. Baltimore Police Dep’t]. But nothing stops this private company from going to other communities to institute the same surveillance program. Other courts might take the same view as the dissenting judges in the Baltimore case, namely, that people should not expect any privacy in their public movements, even if they are tracked for weeks or months. As a result, you might have extensive aerial surveillance, in addition to all the other surveillance tools being employed already.

The thing about the Baltimore case (and this is what the seven dissenting judges focused on) is that the company obscured the faces of the individuals, which were represented as “mere pixelated dots.” This was by choice. So what’s the invasion of privacy if the police can only see dots moving across the city? The majority opinion noted how the police could employ its other existing surveillance tools, such as on-the-ground surveillance cameras and license-plate readers, to identify those dots. Moreover, if you see a dot going into a house around 6 pm and emerging in the morning, you can assume that the person lives at that house. So the fact that the aerial surveillance depicts you as a dot is a red herring. The police could just cross-reference the dot with all the other technology that it is already using, like the license plate readers, the street surveillance cameras, and the facial recognition software, to identify who that dot is. That’s a key takeaway. You might think, oh, I can protect my privacy in one avenue, but then you have to think about all the other data that’s being collected about you.

Thus, we should take little comfort when Google says that it will delete entries from a person’s location history if it detects a visit to an abortion clinic. One need only piece together the other data that is currently being collected about individuals to determine whether they obtained an abortion.

LP: You’ve noted that the Supreme Court is doing away with any notion that there’s a fundamental right to privacy. Is this a sign of creeping authoritarianism?

MS: It could be. You could end up with either an authoritarian state model or a commercial surveillance economy that the government co-opts for its purposes.

We also run into another problem when you consider the Supreme Court’s recent West Virginia v. EPA decision. The Court cut back on the ability of federal administrative agencies to regulate absent a clear congressional mandate. That potentially affects a lot of areas, including privacy. For example, federal agencies under the Biden administration could regulate apps and Big Tech firms and tell them not to disclose health information to law enforcement. But those regulations could be challenged, with the Court’s EPA decision serving as a new weapon. The Court might very well strike down such regulations on the basis that privacy protection implicates major social and economic policy decisions, and that decisions of such magnitude and consequence rest primarily with Congress, not with the FTC or any other agency. And because Congress has been incapable of providing a comprehensive privacy framework, you are out of luck unless your state offers some privacy protections.

LP: What would you like to see Biden doing regarding data protection? You’ve noted the importance of behavioral advertising to this discussion – advertising which allows advertisers and publishers to display highly-personalized ads and messages to users based on web-browsing behavior.

MS: Behavioral advertising is why these companies are tracking us all across the web. We need to address the fundamental problem of behavioral advertising and the collection of all this data. One issue is to what extent can the FTC use its authority under the Federal Trade Commission Act of 1914 to promote privacy and give individuals greater control over their data after the Court’s recent EPA decision.

The second issue is how to create a robust framework that actually protects our privacy. If you just say, well, companies can’t collect certain kinds of data, it’s not going to be effective. Facebook, for example, can make many inferences about an individual just from their “likes.” It can discern their age, gender, sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, and use of addictive substances — so much information from something seemingly innocuous. The more things people “like,” the more accurate the inferences become and the more personal they can get, such that Facebook can know more about an individual than that individual’s closest friends. If you prohibit behavioral advertising, you’re — hopefully — going to lessen the company’s incentive to collect that data in order to profile you and manipulate your behavior.

Robust privacy protection means giving individuals greater control over what’s being collected, whether or not that data can be collected, and for what limited purposes it can be used.

LP: Can we really put the behavioral advertising genie back in the bottle now that it has become so pervasive, so key to the Big Tech business model?

MS: Absolutely. A business model can be changed. Most people are opposed to behavioral advertising, and that opposition is bipartisan. Senator Josh Hawley [R-MO], for example, introduced the Do Not Track Act, which centered on data collected for behavioral advertising.

There’s also bipartisan support for antitrust legislation to rein in these data-opolies. The House issued a detailed report on the risks that these data-opolies pose to our economy and democracy, and there were several bipartisan bills on updating our antitrust laws for the digital economy. All the bills had bipartisan support and made it out of committee. Unfortunately, they’re still being held up for a floor vote. There was even a recent John Oliver segment about two of the proposed bills, and the legislation still hasn’t gotten through. This is the fault of both Republican and Democratic leadership, including Schumer and Pelosi. Big Tech has spent millions of dollars lobbying against these measures and has come up with bogus commercials and bogus claims about how this legislation is going to harm our privacy.

In Europe, they’re getting this legislation through without these problems, but in the U.S., you’ve also got the Supreme Court and many lower courts chipping away at the right to privacy and the ability of the agencies to regulate in this area. The agencies can move faster than Congress in implementing privacy protections. But the status quo benefits these powerful companies because when there’s a legal void, these companies will exploit it to maximize profits at our expense.

Behavioral advertising is not about giving us more relevant ads. The data is not being used solely to profile us or predict our behavior. It’s being used to manipulate us. That is what the Facebook Files [an investigative series from the Wall Street Journal based on leaked documents] brought to the fore. Facebook already tells advertisers how it can target individuals who have just had a breakup, for example, with advertising for certain products. Advertisers can maximize profits not just by predicting what people might want but by manipulating them into emotional states in which they are more likely to make certain purchases. The Facebook Files showed how the company’s targeting contributed to eating disorders among teenage girls. It’s depressing when you think about it.

LP: People are increasingly thinking about how to protect themselves as individuals. What steps might be effective?

MS: There are some small steps. You can support a search engine that doesn’t track you, like DuckDuckGo. Cancel Amazon Prime. Avoid Facebook. But avoiding the surveillance economy is nearly impossible. If you don’t want to be tracked, don’t bring your phone with you. Of course, Carpenter v. United States is instructive on that point. The Court noted how “nearly three-quarters of smartphone users report being within five feet of their phones most of the time, with 12% admitting that they even use their phones in the shower.” It’s not realistic to force people to forgo their phones if they want their privacy. Realistically there are very few protections, and it’s very, very hard to opt out, because even seemingly benign bits of information that you wouldn’t think would incriminate you can be very telling when they are combined with other data.

New York did a study about how much health information is being transmitted every day to Facebook, and it’s staggering. Facebook receives approximately one billion user events per day from health apps alone, such as when someone opened the app, clicked, swiped, viewed certain pages, or placed items into a checkout cart. All of these health-related apps are continually sending data to Facebook, most likely without the individual’s knowledge. So, you might think you’re going to avoid Facebook, but if you’re on a popular app or using a smartwatch, it may very likely be sending detailed, highly sensitive information about you — including when you are menstruating or trying to get pregnant — to Facebook and the other data-opolies.

We’re moving into a situation where our every movement can be tracked. Just look at China. We don’t have to imagine what the counterfactual is: China is actively investing in the surveillance of its citizens. There it’s mostly the government. Here in the U.S., you could say, well, the government is not doing that. But here the government doesn’t have to. These powerful firms are already doing it, and some of the government agencies are complicit in that surveillance economy.

LP: So we’re really not as different from China as we might like to think.

MS: Right. The companies that are surveilling us are largely unaccountable. Google and Facebook have committed numerous privacy violations. As the technology improves, the invasiveness will get even creepier. You’re going to have technologies that read a person’s thoughts and decipher their emotions — and not even just decipher their emotions but predict and manipulate their emotions. To see what’s on the horizon, just look at the influx of patented technology. It’s scary.

After the initial reaction to the Supreme Court’s recent decisions has subsided, we need to consider the broader implications of these rulings. Hopefully people will, even if they don’t agree philosophically or ideologically with the dissenting justices, be concerned with what the majority is doing. Will the Court reach other personal decisions about me and my family? What is to stop some states and this Court from deciding whom I can marry? What birth control can I use, if any? To what extent are my rights, including the right to be left alone, protected? We’re seeing a steady erosion happening now. History teaches us that anything is possible. Germany was said to be the land of poets and thinkers — a nation that would never, ever accede to something like the Nazi Party. Totalitarianism was supposed to be beyond the realm of possibility.

Privacy legislation seems unlikely right now, and things are looking bad on so many levels. The economy has tanked. Inflation is eating away at our paychecks and savings. Gun violence. Global warming. Greater mistrust across political lines. Greater tribalism and rancor. No wonder most Americans believe that the country is heading in the wrong direction. It seems like we’re incapable of building or achieving anything. One wonders whether we are approaching the decline of civilization. But the thing about human events is that remarkable change can come from unexpected places. Consider the Berlin Wall. It was for decades a fact of life: people thought their children and grandchildren would have to live in a city and country divided by this physical and ideological wall. Then, all of a sudden, the wall was gone. It wasn’t the politicians who negotiated this. It was the thousands of Germans who had had enough of the Stasi, the surveillance state, and the repression of their freedoms. Meaningful privacy change requires people to say, I’m not going to tolerate what these companies are doing. I’m not going to tolerate the government engaging in surveillance.

I don’t want to seem defeatist. Just look at the California Consumer Privacy Act of 2018 and the California Privacy Rights Act of 2020. There, a real estate developer [Alastair Mactaggart] spearheaded a revolution in privacy legislation. California was the last place one would expect this to occur — the home of Google, Apple, and Facebook. But the developer spearheaded privacy reform by threatening a ballot initiative. And when the 2018 legislation proved insufficient, that same developer was able to get a ballot measure amending and strengthening the law, and a majority of Californians voted in favor of it. The 2020 statute is complex, over 50 pages long. There was a lot of lobbying by Big Tech against it, but the people got it done.

We don’t have to accept the status quo. We can change things, in small part, through our behavior. If you don’t like Google, then don’t use it. If you switch to DuckDuckGo, it may not seem as good at first, but as more people switch, it will get better through network effects. If you don’t like Facebook, then delete your account — but recognize that that alone is not sufficient. You’ve still got to support privacy legislation. Congress can get it done. It was able to pass other legislation, like requiring federal judges to disclose conflicts, rather quickly. There’s no reason it shouldn’t be able to do this, except for the lobbying and all the money being thrown around. If the people push it, it can be done.

People can have an awakening that things are not all right. Young people could have an awakening about just how precarious our rights are and not take them for granted. Maybe they will see that our democracy is not on cruise control and is not just operating on its own. It takes everybody getting involved on a local level and saying, I’m not going to take this any longer. Change can occur, but only if we demand it.

Lynn Parramore

Lynn Parramore is Senior Research Analyst at the Institute for New Economic Thinking. A cultural theorist who studies the intersection of culture and economics, she is Contributing Editor at AlterNet, where she received the Bill Moyers/Schumann Foundation fellowship in journalism for 2012. She is also a frequent contributor to Reuters, Al Jazeera, Salon, Huffington Post, and other outlets. Her first book of cultural history, Reading the Sphinx (Palgrave Macmillan), was named a “Notable Scholarly Book for 2008” by the Chronicle of Higher Education. A web entrepreneur, Parramore is co-founder of the Next New Deal (formerly New Deal 2.0) blog of the Roosevelt Institute, where she served as media fellow from 2009 to 2011; she is also co-founder of Recessionwire.com and founding editor of IgoUgo.com. Parramore received her doctorate from New York University in 2007. She has taught writing and semiotics at NYU and has collaborated with some of the country’s leading economists on her ebooks, including “Corporations for the 99%” with William Lazonick and “New Economic Visions” with Gar Alperovitz. In 2011, she co-edited a key documentary book on the Occupy movement: The 99%: How the Occupy Movement is Changing America.

7 comments

  1. This writer has so many blind spots.

    Go to Raul Diego or Alison McDowell.

    Stiglitz’s project for Sarkozy asserted that social well-being be prioritized. The era of bio-capitalism and investments in prescribed social behaviors was getting off the ground. Assisting Stiglitz in this effort was the Bengali economic theorist and Harvard professor Amartya Sen, whose career centered on development finance; social choice, whereby individual preferences are guided to generate collective decisions; and capabilities, taking into consideration people’s abilities to pursue their choices. Sen has served on Berggruen’s philosophy prize selection panel.

    Read this selection from their report, keeping the following factors in mind. Well-being metrics will quantify ALL our relationships, social and environmental. By assigning everything a numerical value, our lives are made visible to the machine. This information will be used to inform the actions of autonomous intelligent systems in mixed reality. Everything will be compelled to interact with these systems on the basis of the data that comprises unique digital identities, digital twins. Blockchain transactions connected to cyber-physical systems will enclose our lives in mechanical energies. This is the new rule of law – smart contract law. Global financial interests, guided by artificial intelligence prediction markets, will place bets on our ability to attain specified indicators of “success.” Submission to the machine is being framed as good for society and the planet, appealing to progressives, and economically efficient, appealing to free-market libertarians. This is how the Earth may be remade as a planetary computer. This is the “wellness” economy.

    https://wrenchinthegears.com/2022/07/14/wellness-metrics-teaching-machines-to-live-with-us-synthetic-pretenders-part-15d/?amp

    Social control through social impact investing, universal basic income gulags, and cash transfers, forcing youth, parents, and adults to follow that WEF rulebook. She isn’t even discussing the Fourth Industrial Revolution.

    More staid thinkers here on Sheared Off Post.

  2. Many political observers have cited Orwell in the last few years. However, rarely do they understand his work or connect the reader to the very real world we now live in…

    “We have cut the links between child and parent, and between man and man, and between man and woman. No one dares trust a wife or a child or a friend any longer”. Orwell, ‘1984’.

    ‘This is why, in part, many ‘leading’ commentators take the popular path of least resistance and quote the more simplistic, straightforward and obvious of Orwell’s writings but do so within the seemingly ‘safe’ but pyrrhic confines of social acceptance i.e. ‘within the safety the herd’. As such, by sub/consciously and cyclically self-censoring their own minds and writings, these ‘independent’ journalists not only blunt their ability for actual critical-thinking but most cripplingly of all, have abrogated themselves of that most basic, yet rarest of constructive qualities to be found in academics, politicians and ‘journalists’ today; *curiosity*.’

    ‘Cherrypicking Orwell: Perpetuating The Party’s ‘1984’ Global ‘Gasocracy’ (2022) https://wp.me/p94Aj4-3cZ

    Johnny McNeill
    #GaslightingGilligan (© 2017) 
    Twitter: @GasGilligan (*free download*)

  3. Question: “What Happens When Big Brother Meets Big Tech”?
    He loses his ‘front’ (as in façade) ‘teeth’, as well as his (duplicitous) smile!
    Or does one need magnifying glasses to see the truth of how effective true journalism can be, when a real journalist, like Julian Assange, applies ‘big tech’ techniques to beat them at their own game.
    They get murderously vindictive and will stop at nothing to hide their infamy in the sands of history, buried, sublimated and conveniently repressed!
    What more important headline is there today for ‘civilization’ than to at last begin to take a look at ‘self’ in the mirror, to reveal the truth of the lunacy in his purportedly erstwhile, historical inhumanity?

  4. Ahh, shucks, it’s just big biz and surveillance capitalism in its continuing criminal enterprises:

    Now now, is Amazon going to follow HIPAA values? Right.

    Read:

    Amazon (AMZN) on Thursday said it has entered an agreement to acquire primary healthcare company One Medical in an all-cash deal valued at approximately $3.9 billion.

    One Medical is a membership-based primary care service that promises customers “24/7 access to virtual care.” The company operates in a dozen major U.S. markets, according to its website, and works with over 8,000 companies to offer One Medical health benefits to their employees.

    The acquisition is just the latest example of the tech giant expanding its footprint in the healthcare industry. Amazon acquired PillPack, an online pharmacy, in 2018 and later launched its own digital pharmacy in the United States.

    Separately, Amazon partnered with JP Morgan Chase and Berkshire Hathaway on an effort to provide better healthcare services and insurance at a lower cost to workers and families at the three companies, and possibly other businesses, too. That effort, called Haven, shut down last year.

    +–+

    More robust thinking and reporting here: the old new colonizers, those racist Anglophiles.

    Building the Impact Finance Regime: Nigeria’s Civil War and the Road to Cyber-Colonialism
    by Raul Diego July 19, 2022

    Boasting three quarters of a century of “investing for development”, British International Investment recently unveiled its new name along with a five-year plan to pour billions of pounds into technology, climate and “inclusive finance” projects across Africa, South East Asia, and the Caribbean.

    Thick layers of marketing copy peddle the familiar themes of hyper optimistic innovation and economic growth so commonplace in 21st century corporate literature. From the document’s executive summary to its conclusion, the cheery, calculated vacantness of each paragraph leaves us with a sense of a promise waiting to be broken.

    British International’s CEO, Nick O’Donohoe, peppers his foreword with key buzzwords like “green, renewable” and “sustainable” or “inclusive”. As the co-founder of Big Society Capital (BSC) with Sir Ronald Cohen, O’Donohoe can claim to be one of the originators of this new ‘conscious’ capitalist lingo. BSC was the world’s very first venture capital firm dedicated exclusively to funding startups focused on social impact, and emerged out of the UK government’s own initiatives to foster this space.

    Now, the former Colonial Development Corporation (CDC), as British International Investment was once called, has become the UK’s primary vehicle for the propagation of the impact finance model. Repeatedly referring to itself as “the impact investor” in the paper, the wholly-owned property of the UK government estimates that £5 to £6 Billion will be invested throughout the Commonwealth over the next five years, and Africa in particular.

    https://siliconicarus.org/2022/07/19/building-the-impact-finance-regime-nigerias-civil-war-and-the-road-to-cyber-colonialism/

  5. Big tech drove covid fascism hysteria… New Zealand, where the dictator Ardern continues to impose covid fascist irrationalism—vaccines, masks—now has double the covid-attributed deaths of the UK and four times those of the USA.
    NZ—the one true hermit kingdom
