“Electrical information devices for universal, tyrannical womb-to-tomb surveillance are causing a very serious dilemma between our claim to privacy and the community’s need to know.”
—Marshall McLuhan, The Medium is the Massage
The past few years have ushered in a sea change in public perception of the tech industry. While Silicon Valley was once cherished by progressives for its social liberalism and championed by conservatives for its ruthless ingenuity, its luminaries are now under assault from all sides. On the left, figures like Elizabeth Warren are revitalizing the language of antitrust; on the right, pundits and politicians are up in arms over supposed biases in mass communication channels; and across the board, people are scrambling to protect their privacy and personal information from the likes of Facebook and Cambridge Analytica. The fears animating this backlash often center on the ills of private-sector surveillance, with the language of privacy rights and personal data collection entering mainstream conversations about civil liberties. But the dominant coalition now fighting the invasive oversteps of tech firms—an ideologically diverse cohort of legislators, activists, and nonprofit pressure groups—lacks a coherent diagnosis of the material dynamics of surveillance capitalism, as well as a meaningful platform for action. Proposed solutions are often limited to technocratic regulation in defense of individual privacy rights, antitrust action that would seek solutions through increased market competition, and a libertarian culture of “opting out,” or attempting to hide from the surveillance regime. But the diffuse network generating these problems operates at a mass social scale, and thus demands an organized and collective response. In this essay, I hope to sketch a foundation for diagnosing this ambient social malaise, understanding how the private-sector surveillance regime has renegotiated the relationship between leisure and work, and what it might mean to prepare a response.
If you own a smart car, its manufacturer probably knows a thing or two about how you drive. If you keep an Alexa at home, Amazon has likely recorded some of your private conversations. Last year, the Associated Press reported that Google stores location data from smartphone users who explicitly try to opt out. And if you use the internet with any regularity, Facebook almost certainly maintains an elaborate profile of your proclivities—even if you’ve never signed up for an account.
While the robust tech backlash that we see today is a relatively new phenomenon, the economic forces planting these sensors into our lives are not. Over the past two decades, an expanding matrix of intelligent consumer products—from thermostats and wristwatches to vibrators and electric toothbrushes—has penetrated many of what were once considered the most intimate spaces in our lives. The internet of things has produced a number of abstract social anxieties, from unease over the implications of mass automation to an urgent need to renegotiate traditional notions of individual privacy—but in more concrete terms, it has meant that the things around us are starting to pay attention.
Perceptive hardware makes our commodities more reactive, but it also animates consumer goods with double lives as productive capital, empowering firms to prune surplus value from human activities not traditionally legible as work. Bodily functions are refined into caches of valuable metadata, elaborate behaviors are boxed into standardized mediums, private moments are recast as legible inputs for profit-driven algorithms, and leisure time is increasingly available to firms seeking to predict and control our futures.
We routinely trade behavioral surplus, or “data exhaust,” for access to goods and services, but the exchange is rarely conscious. Complex and unilateral service agreements leave disparate consumer-producers without the leverage or information to negotiate, and since the labor involved in generating behavioral surplus is not recognized as work, there is no expectation that people ought to be paid for the data they create. And while the value of behavioral output is often difficult to quantify, the social costs of its extraction are even murkier. As developers battle to filter our leisure time through one interface or another, overstimulation and constant interconnection become persistent sources of anxiety. Rising smartphone use, for example, has been linked to increases in depression, anxiety, and loneliness. It is difficult, but necessary, to imagine a different arrangement.
At the beginning of this year, Harvard Business School Professor Emerita Shoshana Zuboff gave a widely cited name to this arrangement—“surveillance capitalism”—in her 700-page book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, as well as in her essay in New Labor Forum (see “Surveillance Capitalism and the Challenge of Collective Action,” Winter 2019) published around the same time. In both, she depicted the creep of personal smart tech as a new enclosure of the commons, a process through which private forces lay claim to resources previously understood as exterior to the economy and immune to ownership. Surveillance capitalists accumulate wealth by commodifying and claiming ownership of whatever passive human behaviors they can systematically observe.
Private-sector surveillance is generally associated with Silicon Valley and primarily enabled by the internet, but its logic transcends what we might recognize as tech firms, permeating traditional industries as well. So-called precision agriculture demands that farmers adopt smart sensors and connected equipment to stay competitive, and global supply chains are continuously refined and reworked according to the insights of artificial intelligence (AI). As the cost of computing dwindles and processing units are crammed into a diverse array of goods, hotel chains, farm equipment suppliers, and manufacturers of children’s toys have taken on side hustles as data merchants.
Under this regime, firms approach people not only as potential consumers of material goods, but also as potential producers of behavioral commodities, dealing in the prediction and actualization of possible futures. These futures are auctioned off in turn—to advertisers, for example, or to empowered political interests—and in this way the explosion of private surveillance ushers in a simultaneous shift toward mass social control, not in the interest of any recognizable human agenda, but of the profit motive: a blind imperative for growth.
Besides the proliferation of extractive surveillance apparatuses, Zuboff notes two novel features of surveillance capitalism: first, that it lacks market reciprocities between firms and consumers (commodities, in the form of predictive information, flow in one direction without explicit remuneration), and second, that it replaces reciprocity with dependency (users tacitly consent to this extraction because they lack alternatives). The resulting system looks less like a voluntary market and more like one in which experiences are flattened into raw material and harvested from passive producers. Zuboff charts an extensive vocabulary for cataloging the phenomena tied to this shift. “Shadow text” describes the behavioral surplus that users create in excess of what is needed to maintain the use values that they consume, and the “dispossession cycle” refers to the process of lifting this excess from temporal experience. But the most useful terms for charting a path against this exploitative dynamic are “instrumentarianism” and the “Big Other.” “Instrumentarianism” is a management philosophy that seeks to understand and guide society as a measurable collective rather than a web of individual actors, sorting subjects into standard categories and engineering behavior to conform to easily surveilled patterns. (Think of the dehumanizing, heavily quantified surveillance techniques used to manage Amazon warehouse workers, but applied to society at large.) And “Big Other” is the diffuse network of machines that observe, catalog, and translate human behavior into shadow text. It executes “instrumentarian power” in the interest of organizing behavior to optimize for certain outcomes.
If instrumentarianism is theory, Big Other is its worldly practitioner.
Big Other serves as a stand-in for an expanding, inhuman presence that both encodes quantifiable behaviorism into new facets of our lives and extracts the output of labor time captured in this way. Zuboff depicts it as “profoundly, infinitely, and . . . radically indifferent to our meanings and motives.” It is a force of alienation and exploitation, and if we are to address the unique features of surveillance capitalism, it is the enemy that must be tamed.
Until recently, the prevailing opposition to what Zuboff identifies as “instrumentarian power” has come from an eclectic patchwork of advocacy organizations like the Electronic Frontier Foundation (EFF), a non-profit digital rights group that offers legal representation and lobbies on privacy from a civil liberties standpoint. The privacy rights movement has long drawn upon a narrow, individual rights-focused framework—a logic that is visible in the work of many in the information security (infosec) community, as well as hackers interested in honing techniques to preserve anonymity and hide in walled gardens. With its obsessive focus on self-help tools, encryption software, consumer awareness, and the sanctity of the sovereign individual, this type of thinking is more likely to produce a protracted retreat into shrinking terrain than substantive structural change.
The desire to “opt out” of the surveillance regime makes sense at first glance: The individual is the most obvious political unit to contrast against an amorphous blob that flattens experience into a collective, bureaucratic, quantified mass—but a fragmented mass of self-determined survivalists will inevitably run into problems in trying to outflank a coordinated industry that earns material incentives for ever-more-aggressive forays into the lives of its consumers. As private surveillance techniques multiply and expand into physical space, hiding becomes increasingly difficult, cumbersome, and demanding of one’s time and attention. The perpetual retreat into flawed end-to-end encrypted communication channels, behind virtual private networks, and away from commonplace consumer goods becomes unsustainable for all but the most dedicated hobbyists.
And even if it were realistic to keep pace with the surveillance creep, it is often impossible for even the savviest consumers to opt out entirely. Companies can build sophisticated profiles of individuals based solely on the data-sharing choices of others, and often purchase information through third parties.
And yet, individual protection is widely perceived as the most aggressive option available. Progressives often hold up the European General Data Protection Regulation (GDPR), an E.U. regulation that aims to defend privacy and individual data, as a model to emulate. But the law does not protect much of the secondary data that is inferred by algorithms and used by online advertisers, and many of the discrete ways in which we are passively surveilled are difficult or impossible to regulate through a model of explicit consent.
So why do we care so much about the sanctity of “personal information”? In reality, our “personal” data only becomes meaningful—and gains value—as a commodity at scale. It is not necessarily helpful for a firm to know that you searched for a peanut butter cookie recipe at 2:15 pm, or that you drove 30 mph down a particular road, or that your heart rate increased shortly after clicking a link, until it can compare the resulting profile to those of different users. Most digital activity takes the form of interaction with others, and as a result, quantifiable behavioral surplus is something that we coproduce rather than generate individually. Big Other acts upon us as, and indeed seeks to shape us into, a collective mass. The proliferation of private surveillance devices thus not only converts our atomized leisure time into generative work, but into collective labor. It can only be confronted collectively.
Big Other is an abstracting agent, but it also erodes the separation between personal and private property, or the distinction between the means of production and non-capital goods. This complicates the nature of ownership: If you employ a smart watch as a use value, but it is also used to generate behavioral surplus for an outside firm, then who does it belong to? Resolving this contradiction is central to two intertwined tasks: first, preventing the encroaching commodification of moments that people want to keep private, and second, devising a just allocation of the behavioral surplus generated from those that are freshly brought into the realm of production.
Contrary to expectation, defending the sanctity of personal property may require collectivizing the interconnected, “smart” surveillance features that increasingly animate our possessions. Imagine, for example, a situation in which all driving data harvested from smart car owners—or Uber drivers, for that matter—was collectively owned and democratically governed by its producers, rather than silently extracted by capitalist firms. This would be distinct from private ownership of “personal” data because it both acknowledges the inherently collective nature of shadow text and operates at a scale that is both transparent and politically legible to Big Other. It would empower us to both control the ends to which our information is used, and forestall surveillance creep into areas of life that we would prefer to keep private—in other words, it would truly give us the tools we need for collective self-determination. Unless we begin to imagine a collective model for reclaiming the means of data extraction, our personal goods will continue to transfigure into private capital working for the ends of others.
As it stands, Big Other exists only to extract, irrespective of the social implications. This is a nonstarter in imagining a socialized surveillance superstructure: A more benevolent instrumentarianism would reorient in the service of more complex social ends than blind growth. Part of what makes this difficult to envision is that the operation of shadow text is defined, in part, by its incomprehensibility to humans. Information is culled from our lives and interrelated at such an immense scale that it takes a network of machines to realize its hidden uses, or rationalize its commodity form. But we could control for certain outcomes. For example, what if social algorithms sorted information to optimize for self-reported happiness and mental health scores, rather than profit and maximizing raw attention?
Reality is more complex than this, of course, and any arrangement would necessarily encode biases and bear unintended consequences. The best way to overcome them? Through constant democratic renegotiation. Since the surveillance matrix is so diffuse—embedded in buses, WiFi kiosks, refrigerators, and so forth—it is much easier to imagine this negotiation along the lines of how to govern the extraction and management of behavioral surplus in general, rather than under specific circumstances. A valorization of the invisible labor that now accompanies our leisure time, or a recognition that play is increasingly vital to capital production, is essential to this. The fact that we collectively produce shadow text through constant, active, productive work is the strongest foundation from which to govern its ends.
Rather than “opting out” of the extractive surveillance matrix, we should opt into collective, transparent, and democratic decision-making processes. This means more than checking a box to say that you understand the contract. It means demanding, and fighting for, a continuous say over the manner in which data exhaust is extracted, refined, commoditized, and reinvested into any service.
A growing number of activists and columnists have speculated that we should demand payment for personal data, but why stop there? That arrangement still presumes that private firms should be left in control of our networked information—a collective good. A fairer arrangement would leave us, the producers of behavioral surplus, to determine how it is put to use, and how its profits are reinvested. After all, we are just as responsible for producing the functionality of many of our commodities as the firms that profit from and ostensibly produce them.
To arrive at this state, the left must advocate for a recognition of the collective labor that was injected into our lives by Big Other’s pervasive presence. This means recasting concerns about individual privacy as structural problems of collective exploitation, and naming the injustice that accompanies any surveillance firm’s efforts to rip information from our hands. This is a precondition to exercising the collective power of productive leisure activities, which could in turn demand a say over how and when private surveillance can take place, and to what ends.
A growing climate of tech cynicism might lead one to believe that this path is on its way, but the expropriation of collective information is far from inevitable. Brandeisian trustbusters like Senator Elizabeth Warren and law professor Tim Wu, often considered radical tech critics in the American context, have been picking up steam, and their calls to break up monopolists like Facebook and Google have found boosters as unlikely as Facebook co-founder Chris Hughes, who called for the firm to be broken up in the New York Times opinion pages. But these self-styled firebrands advocate for little more than a mythical regulated market solution, and to quote a surprisingly agreeable Vox headline, “Facebook is a capitalism problem, not a Mark Zuckerberg problem.” Heightened competition might be able to rebalance market equilibriums in a traditional manufacturing industry, but when it comes to firms that traffic in the extraction of immaterial qualities in the form of behavioral surplus, it will do nothing to subdue the endless drive for growth into immaterial personal and emotional spaces.
The problems with surveillance capitalism are not particular to individual actors, or even specific firms. Rather, they are endemic to an economic system that allows for the wholesale enclosure of unrecognized and uncompensated labor. In turn, this is framed by how routinely workers are conditioned to expect little or no control over the proceeds yielded by their activities.
This is why Zuboff’s structural approach, whatever its flaws, remains useful. It serves as a corrective to the market-focused regulatory frameworks that dominate punditry on the topic, which often limit the debate to one between antitrust and state-driven market intervention. Both focus on reining in particular manifestations of the surveillance matrix, and on the excesses of specific firms or individuals.
It is true that people like Jeff Bezos and Mark Zuckerberg personally benefit from the development of surveillance capitalism, but shifting the focus from individual actors to a generalized Big Other means we do not need to worry about their personal morality or motivations. They benefit because of their relationship to the means of producing behavioral surplus, and as the dispossession cycle extends from what we commonly recognize as the “technology” sector and into other industries, this particular relation will become increasingly common to the capitalist class. Thus, the struggle to confront Big Other becomes inseparable from that against capital more broadly.
Thankfully, there is ample precedent for successful labor struggles against instrumentarian power—which will be necessary to ensure collective self-determination over commodity-forms of data exhaust. In West Virginia, teachers successfully organized against a proposed program to monitor and grade performance on health and wellness indicators that could have hit workers with arbitrary fees. So too, a successful UNITE HERE strike won contract language that protected Marriott workers from predatory workplace automation. These types of victories have largely been confined to traditional workplaces, but widespread activist opposition to state surveillance programs, such as the “Stop Watching Us” rally organized in the wake of the Snowden leaks, shows promising energy that could be turned against their private-sector counterparts.
It is tempting to imagine the state as a vehicle for expropriating and redesigning the surveillance matrix according to collectivist ends. It is ostensibly democratic, and arguably the only existing body with the recognized authority to overrule capitalists’ assumed claims to passive behavioral output. But the specter of nationalization swiftly poses problems. Not only would a state-owned surveillance superstructure threaten the civil liberties of U.S. citizens, it would almost certainly require laying claim to the behavioral surplus generated by non-citizens both at home and abroad. Dramatic ideological shifts between elections also mean that any safeguards would be vulnerable to repeal, leaving an unpredictable body in control of an excessively powerful instrumentarian network. And if Big Other were broken into a patchwork of nation-state-directed coalitions of behavioral producers, it would not only reaffirm the reactionary concept of national identity as the highest social ordering, but also reinforce postcolonial power imbalances by punishing countries with relatively little access to networked smart tech.
While the state might feasibly play a transitional role in handing over collective determination to surveilled populations, we would ultimately need a more democratic structure than it could realistically provide. We need one that not only allows for broader forms of solidarity, but empowers the surveilled to participate in directing their collective future. Confronting Big Other through an independent, non-state collective would require new kinds of organizing, particularly around the recognition of denied labor woven into our leisure time.
The disavowed labor that fuels surveillance capital is far from putting up a unified front, but over the past few years, a tech-worker organizing boom has proven the ability of organized labor to confront and reform industry practices for the better. In 2018, Google workers successfully pressured the company to abandon a project to supply drone-related AI products to the U.S. military, and later that year they walked out of offices across the world to protest dire mishandling of sexual harassment grievances. Meanwhile, thousands of tech workers signed a pledge not to develop tools to target Muslims and immigrants, and the Tech Workers Coalition has seen explosive growth, with chapters springing up across the country. By expanding our conception of the labor exploited by these firms, we might engender a large enough coalition to truly seize control of the fruits of our collective work.
Given the opportunity to collectively control for Big Other’s automated outcomes, there are a number of things we might hope to accomplish. The most coercive and socially unnecessary surveillance structures would likely be abolished, but we might find utility in tolerating others. For example, fitness apps could benefit public health without stimulating anxieties about how their owners would use our bodily information, search engines could pull up results tailored to our interests without exploiting them for attention and profit, and ticket vendors could suggest events based on our tastes without spamming our inboxes to satisfy the profit motive.
It’s difficult to predict what would happen in a successful struggle to win collective self-determination over our interconnected behavioral surplus. The core fact is that collective will must ground any effort to tame the surveillance infrastructure multiplying around us. Tepid, market-friendly regulations will never be able to keep pace—nor will misguided efforts to retreat from the surveillance matrix as individuals. Instead, we will need to establish a collectivized system to control for better, more democratic outcomes and ensure that they tend away from mere profit, growth, and material productivity. The first and most important step to reclaiming our leisure time will likely be in recognizing and laying claim to its productive elements.
Evan Malmgren is a freelance writer, researcher, and fact-checker who lives all over. He writes about technology and power for outlets like Logic, Dissent, The Nation, The Baffler, and Jacobin.
Jeff Plungis, “Who Owns the Data Your Car Collects?” Consumer Reports, May 2, 2018, available at https://www.consumerreports.org/automotive-technology/who-owns-the-data-your-car-collects/.
Matt Day, Giles Turner, and Natalia Drozdiak, “Amazon Workers Are Listening to What You Tell Alexa,” Bloomberg, April 10, 2019, available at https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio.
Ryan Nakashima, “AP Exclusive: Google Tracks Your Movements, Like It or Not,” Associated Press, August 13, 2018, available at https://www.apnews.com/828aefab64d4411bac257a07c1af0ecb.
David Ingram, “Facebook Fuels Broad Privacy Debate by Tracking Non-Users,” Reuters, April 15, 2018, available at https://www.reuters.com/article/us-facebook-privacy-tracking/facebook-fuels-broad-privacy-debate-by-tracking-non-users-idUSKBN1HM0DR.
Cameron Glover, “Your Vibrator Is a Spy,” Logic, July 2017.
 Erik Peper and Richard Harvey, “Digital Addiction: Increased Loneliness, Anxiety, and Depression,” NeuroRegulation 5 (2018): 3-8.
Jason Tatge, “The Land Grab for Farm Data,” TechCrunch, July 6, 2016, available at https://techcrunch.com/2016/07/06/the-land-grab-for-farm-data/.
Lisa C. Dunn, “How Big Data Is Changing Supply Chains,” Supply Chain Resource Cooperative, October 23, 2018, available at https://scm.ncsu.edu/scm-articles/article/how-big-data-is-changing-supply-chains.
 Shoshana Zuboff, “Surveillance Capitalism and the Challenge of Collective Action,” New Labor Forum 28 (2019): 10-19.
Lily Hay Newman, “Encrypted Messaging Isn’t Magic,” WIRED, June 14, 2018, available at https://www.wired.com/story/encrypted-messaging-isnt-magic/.
Chris Hughes, “It’s Time to Break Up Facebook,” The New York Times, May 9, 2019, available at https://www.nytimes.com/2019/05/09/opinion/sunday/chris-hughes-facebook-zuckerberg.html.
Katie Fitzpatrick, “None of Your Business,” The Nation, May 13, 2019, available at https://www.thenation.com/article/shoshana-zuboff-age-of-surveillance-capitalism-book-review/.
“UNITE HERE Local 5 Members Approve Historic Contract,” UNITE HERE Local 5, November 28, 2018, available at https://www.unitehere5.org/2018/11/unite-here-local-5-members-approve-historic-contract.
Ben Tarnoff, “Tech Workers versus the Pentagon,” Jacobin, June 6, 2018, available at https://jacobinmag.com/2018/06/google-project-maven-military-tech-workers.
Daisuke Wakabayashi, Erin Griffith, Amie Tsang, and Kate Conger, “Google Walkout: Employees Stage Protest over Handling of Sexual Harassment,” The New York Times, November 1, 2018, available at https://www.nytimes.com/2018/11/01/technology/google-walkout-sexual-harassment.html.
 Shirin Ghaffary, “Why These Young Tech Workers Spent Their Friday Night Planning a Rebellion Against Companies Like Google, Amazon, and Facebook,” Vox, January 18, 2019, available at https://www.vox.com/2019/1/18/18185842/tech-workers-friday-night-google-amazon-facebook.