What Is Surveillance Capitalism?
In our time, surveillance capitalism repeats capitalism’s “original sin” of primitive accumulation. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of claiming work (or land, or wealth) for the market dynamic as industrial capitalism once did, surveillance capitalism audaciously lays claim to private experience for translation into fungible commodities that are rapidly swept up into the exhilarating life of the market. Invented at Google and elaborated at Facebook in the online milieu of targeted advertising, surveillance capitalism embodies a new logic of accumulation. Like an invasive species with no natural predators, its financial prowess quickly overwhelmed the networked sphere, grossly disfiguring the earlier dream of digital technology as an empowering and emancipatory force. Surveillance capitalism can no longer be identified with individual companies or even with the behemoth information sector. This mutation quickly spread from Silicon Valley to every economic sector, as its success birthed a burgeoning surveillance-based economic order that now extends across a vast and varied range of products and services.
While the titanic power struggles of the twentieth century were between industrial capital and labor, the twenty-first century finds surveillance capital pitted against the entirety of our societies, right down to each individual member. The competition for surveillance revenues bears down on our bodies, our automobiles, our homes, and our cities, challenging human autonomy and democratic sovereignty in a battle for power and profit as violent as any the world has seen. Surveillance capitalism cannot be imagined as something “out there” in factories and offices. Its aims and effects are here … are us.
Just as surveillance capitalism can no longer be conflated with an individual corporation, neither should it be conflated with “technology.” Digital technologies can take many forms and have many effects, depending on the social and economic logics that bring them to life. The economic orientation is the puppet master; technology is the puppet. Thus, surveillance capitalism is not the same as algorithms or sensors, machine intelligence or platforms, though it depends on all of these to express its will. If technology is bone and muscle, surveillance capitalism is the soft tissue that binds the elements and directs them into action. Surveillance capitalism is an economic creation, and it is therefore subject to democratic contest, debate, revision, constraint, oversight, and may even be outlawed.
The primacy of economics over technology is not new, but capitalism has long found it useful to confound society by concealing itself within the Trojan horse of technology, in order that its excesses might be perceived as the inexorable expression of the machines it employs. Surveillance capitalists are no exception. For example, in 2009 the public first became aware that Google maintains search histories indefinitely. When questioned about these practices, the corporation’s former CEO Eric Schmidt explained, “… the reality is that search engines including Google do retain this information for some time.” In truth, search engines do not retain, but surveillance capitalism does. Schmidt’s statement is a classic of misdirection that bewilders the public by conflating commercial imperatives and technological necessity.
Surveillance capitalism is not inevitable but it is unprecedented. It operates through the instrumentation of the digital milieu, as it relies on the increasingly ubiquitous institutionalization of digital instruments to feed on, and even shape, every aspect of every human’s experience. Although it is easy to imagine the digital without surveillance capitalism, it is impossible to imagine surveillance capitalism without the digital. In pursuing these operations, surveillance capitalism is compelled by economic imperatives and “laws of motion,” which produce extreme asymmetries of knowledge and power. Together the new capitalism and its unique production of power are as untamed by law as were the capitalism and economic power of the Gilded Age, and its consequences, though wholly distinct, are just as dangerous.
A century ago, Americans learned to master new forms of collective action that leveraged their roles as workers and customers to challenge, interrupt, and outlaw the worst injustices of raw industrial capitalism. The full resources of our democracy were eventually brought to bear in new legislative and regulatory institutions that subordinated the laws of supply and demand to higher order laws aimed at fostering and defending the conditions of a more equal, fair, and humane society. Will existing forms of collective action be sufficient to tame, interrupt, or outlaw the unprecedented operations of surveillance capitalism? How might a deeper grasp of its mechanisms, imperatives, and production of power illuminate both its unique threats to people and democratic society as well as the novel challenges it presents to collective action in our age?
Surveillance Capitalism’s Origins and “Laws of Motion”
Borrowed from Newton’s laws of inertia, force, and equal and opposite reactions, “laws of motion” is a metaphor that has been used to describe the necessary and predictable features of industrial capitalism. Although surveillance capitalism does not abandon established capitalist “laws,” such as competitive production, profit maximization, productivity, and growth, these earlier dynamics now operate in the context of a new logic of accumulation that also introduces its own sui generis laws of motion, first discovered and honed in the early years of Google.
Most people credit Google’s success to its advertising model, but the discoveries that led to Google’s rapid rise in revenue and market capitalization are only incidentally related to advertising. Google’s success derives from its ability to predict the future––specifically the future of human behavior. From the start, Google had collected data on users’ search-related behavior as a by-product of query activity. Back then, these data logs were treated as waste, not even safely or methodically stored. Eventually, the young company came to understand that these logs could be used to teach and continuously improve its search engine. The problem was this: Serving users with effective search results “used up” all the value that users created when they inadvertently provided behavioral data. It was a complete and self-contained process in which users were ends-in-themselves. All the value that users created was reinvested in their experience in the form of improved search, a progression that I have called the behavioral value reinvestment cycle. In this interaction, there was nothing “left over,” no surplus for Google to turn into capital. In 2001 Google was remarkable, but it wasn’t yet capitalism––just one of many internet start-ups that boasted “eyeballs” but no revenue.
The year 2001 brought the dot-com bust and mounting investor pressures at Google. Back then advertisers selected the search term pages for their displays. Google decided to try to boost ad revenue by applying its already substantial analytical capabilities to the challenge of increasing an ad’s relevance to users––and thus its value to advertisers. Operationally this meant that Google would finally repurpose its growing cache of “useless” behavioral data. Now the data would be used to match ads with keywords, exploiting subtleties that only its access to behavioral data, combined with its analytical capabilities, could reveal.
It’s now clear that this shift in the use of behavioral data was an historic turning point. Behavioral data that were once discarded or ignored were rediscovered as what I call behavioral surplus: data reserves that are more than what is required for product and service improvements. Google’s dramatic success in “matching” ads to pages revealed the transformational value of this behavioral surplus as a means of generating revenue and ultimately turning investment into revenue.
Key to this formula was the fact that this new market exchange was not an exchange with users but rather with companies that understood how to make money from bets on users’ future behavior. In this new context, users were no longer ends-in-themselves. Instead they became a means to profits in new behavioral futures markets in which users are neither buyers nor sellers nor products. Instead, users are the human natural source of free raw material that feeds a new kind of manufacturing process designed to fabricate prediction products. These products are calculations that predict what individuals and groups will do now, soon, and later. The more raw materials that are fed into this new machine intelligence-based “means of production,” the more powerful are its prediction products. While these processes were initially aimed at online ad targeting, they are no more restricted to that application than mass production was restricted to the manufacture of automobiles, where it was first applied at scale.
Many of the facts I describe here are well known, but their significance has not been fully appreciated or adequately theorized. Google and other surveillance platforms are sometimes described as “two-sided” or “multisided” markets, but the mechanisms of surveillance capitalism suggest something different. Google had discovered a way to translate its non-market interactions with users into surplus raw material for the fabrication of products aimed at genuine market transactions with its real customers: advertisers. It was the translation of private human experience situated outside the market into behavioral data that circulates within the market that finally enabled Google to convert investment into revenue and capital. The corporation thus created out of thin air and at zero marginal cost an asset class of vital raw materials derived from users’ non-market online experience. At first those raw materials were simply “found,” a byproduct of users’ search action. Later those assets were hunted aggressively, procured, and accumulated— largely through unilateral operations designed to evade individual awareness and thus bypass individual decision rights––operations that are therefore best summarized as “surveillance.”
The behavioral surplus that became the defining element of Google’s success was well understood by its leaders. Google’s former CEO Eric Schmidt credits Hal Varian’s early development of the firm’s ad auctions with providing the eureka moment that clarified the true nature of Google’s business: “All of a sudden, we realized we were in the auction business,” referring to the automated behavioral futures markets deployed in ad targeting. But Larry Page is credited with a different and far more insightful answer to the question, “What is Google?” Former Google executive Douglas Edwards recounts a 2001 session with the founders that probed their answers to that precise query. It was Page who ruminated, “If we did have a category, it would be personal information….The places you’ve seen. Communications….Sensors are really cheap….Storage is cheap. Cameras are cheap. People will generate enormous amounts of data….Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
Page’s vision perfectly reflects the history of capitalism as a process of taking things that live outside the market sphere and declaring their new life as market commodities. In historian Karl Polanyi’s 1944 grand narrative of the “great transformation” to a self-regulating market economy, he described the origins of this translation process in three astonishing and crucial mental inventions that he called “commodity fictions.” The first was that human life could be subordinated to market dynamics and reborn as “labor” to be bought and sold. The second was that nature could be translated into the market and reborn as “land” or “real estate.” The third was that exchange could be reborn as “money.” Page grasped that human experience would be Google’s virgin wood––that it could be extracted at no extra cost online and at a low marginal cost out in the real world, where “sensors are really cheap,” thus producing a surplus as the basis of a wholly new class of market exchange. Surveillance capitalism originates in this act of digital dispossession, operationalized in the rendition of human experience as behavioral data. This is the lever that moved Google’s world and shifted it toward profit, changing the trajectory of information capitalism as it claimed undefended human experience for a market dynamic that would encounter no impediment in the lawless spaces of the internet.
The significance of behavioral surplus was quickly camouflaged, both at Google and eventually throughout the internet industry, with labels like “digital exhaust” and “digital breadcrumbs.” The extraordinary financial power of surveillance capitalism’s hidden inventions was only revealed when Google went public in 2004. At that time it became clear that on the strength of its secrets, the firm’s revenue had increased by 3,590 percent, from $86 million in 2001 to $3.2 billion in 2004.
In the case of surveillance capitalism, camouflage, euphemism, and other methodologies of secrecy aim to prevent interruption of critical supply chain operations that begin with the rendition of human experience and end with the delivery of behavioral data to machine intelligence-based production systems. These operations of secrecy-by-design turn us into exiles from our own behavior, denied access to or control over knowledge derived from our experience. Knowledge and power rest with surveillance capital for which we are merely “human natural” resources. We are the native peoples now whose tacit claims to self-determination have vanished from the maps of our own lives.
To be sure, there are always sound business reasons for hiding the location of your gold mine. In Google’s case, an explicit “hiding strategy” accrued to its competitive advantage, but there were other, more pressing reasons for concealment and obfuscation. Douglas Edwards writes about the corporation’s culture of secrecy. According to his account, Page and Brin were “hawks,” insisting on aggressive data capture and retention. “Larry opposed any path that would reveal our technological secrets or stir the privacy pot and endanger our ability to gather data.” Page questioned the prudence of the electronic scroll in the reception lobby that displays a continuous stream of search queries, and he “tried to kill” the annual Google Zeitgeist conference that summarizes the year’s trends in search terms.
What might the response have been back then if the public were told that Google’s magic derived from its exclusive capabilities in unilateral surveillance of online behavior and methods specifically designed to override awareness and thus individual decision rights? Secrecy was required in order to protect operations designed to be undetectable because they took things from users without asking and employed those illegitimately claimed resources to work in the service of others’ purposes.
That Google was able to choose secrecy is itself testament to the success of its own claims and an illustration of the difference between “decision rights” and “privacy.” Decision rights confer the power to choose whether to keep something secret or to share it. One can choose the degree of privacy or transparency for each situation. U.S. Supreme Court Justice William O. Douglas articulated this view of privacy in 1967: “Privacy involves the choice of the individual to disclose or to reveal what he believes, what he thinks, what he possesses…” Surveillance capitalism laid claim to these decision rights.
The typical complaint is that privacy is eroded, but that is misleading. In the larger societal pattern, privacy is not eroded but redistributed, as decision rights over privacy are claimed for surveillance capital. Instead of many people having the right to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism. Google discovered this necessary element of the new logic of accumulation: it must declare its rights to take the information on which its success depends. These operational necessities paved the way for what would eventually become the unprecedented asymmetries of knowledge over which surveillance capitalists now preside.
Fast forward two decades and these laws of motion are visible in every direction. So-called digital assistants like Google’s Home and Amazon’s Alexa are frontier examples. Disguised as engines of “personalization,” digital assistants operate as complex supply chains for continuous automatic extraction of behavioral surplus from human experience, its predictive value ultimately realized in markets for future behavior. Consider Amazon’s Alexa, intended to become the operating system for your life. The corporation aggressively opened Alexa to third-party developers in order to expand the “assistant’s” range of “skills,” such as reading a recipe or ordering a pizza. It also opened the Alexa platform to smart-home device makers from manufacturers of lighting systems to dishwashers, turning Alexa into the voice-interface for controlling home systems and appliances. In 2015 Amazon announced that Alexa would be sold as a service, known as “Amazon Lex,” enabling any company to integrate Alexa’s brain into its products. As Alexa’s senior vice president explained, “Our goal is to try to create a kind of open, neutral ecosystem for Alexa … and make it as pervasive as we possibly can.” “As pervasive as possible” explains why Amazon wants its Echo/Alexa device to also function as a home phone, able to make and receive calls; why it inked an agreement to install Echo in the nearly 5,000 rooms of the Wynn resort in Las Vegas; and why it is selling Alexa to call centers to automate the process of responding to live questions from customers by phone and text. By 2018 the corporation had inked deals with home builders, installing its Dot speakers directly into ceilings throughout the house as well as Echo devices and Alexa-powered door locks, light switches, security systems, doorbells, and thermostats.
Alexa’s skills, shape-shifting, and ubiquity produce more and more varied interfaces with human experience, which is then alienated from its source, translated into behavioral data, and claimed as behavioral surplus. In the process, Amazon acquires comprehensive data on people’s actual living habits, which it learns how to fabricate into behavioral predictions for sale in behavioral futures markets for real-world services, such as house cleaning, plumbing, or restaurant delivery. Amazon thus reproduces in the real world the same logic that Google perfected in the virtual world, where it learned to mine behavioral surplus from online search for predictions of click-through rates sold into behavioral futures markets for online ad targeting. Forward-looking Amazon patents already include the development of a “voice-sniffer algorithm” integrated into any device and able to respond to hot words, such as “bought,” “dislike,” or “love,” with product and service offers.
The lure of behavioral futures markets explains why the company joined Apple and Google in the contest for the automobile dashboard, forging alliances with Ford and BMW. The idea is to host behavioral futures markets in the front seat, “shopping from the steering wheel” as Alexa delivers restaurant recommendations or advice on where to get your tires checked.
The summary of these developments is that behavioral surplus can be considered as surveillance assets. These assets are critical raw materials in the pursuit of surveillance customers for the sake of surveillance revenues and their translation into surveillance capital. The entire logic of accumulation is most accurately understood as surveillance capitalism, which is the foundational framework for a surveillance-based economic order: a surveillance economy.
The accumulation of behavioral surplus is the master motion of surveillance capitalism from which key economic imperatives can be induced. The quality of prediction products depends on volume inputs to machine processes. Volume surplus is thus a competitive requirement. This dynamic establishes the extraction imperative, which expresses the necessity of economies of scale in surplus accumulation and depends on automated systems that relentlessly track, hunt, and induce more behavioral surplus. These systems, which began in the online environment and later spread to the “real” world, constitute an extraction architecture that has evolved in the direction of ubiquity, just as Larry Page anticipated in 2001. Under the lash of the extraction imperative, digital instrumentation has been transformed into a global, sensate, computational, connected architecture of behavioral surplus capture and analysis, fulfilling computer scientist Mark Weiser’s 1999 vision of “ubiquitous computing” memorialized in two legendary sentences: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
However, the volume of surplus became a necessary but not sufficient condition for success. Even the most sophisticated process of converting behavioral surplus into products that accurately forecast the future is only as good as the raw material available for processing. In the race for higher degrees of certainty, it became clear that the best predictions would have to approximate observation. The next threshold was defined by the quality, not just the quantity, of behavioral surplus. These pressures led to a search for new supplies of surplus that would more reliably foretell the future. This marks a critical turning point in the trial-and-error elaboration of surveillance capitalism and crystallizes a second economic imperative—the prediction imperative—as the expression of these competitive forces.
The first challenge of the prediction imperative is economies of scope. Behavioral surplus must be vast, and scale remains critical, but surplus must also be varied. These variations have developed along two dimensions. The first is the extension of extraction operations from the virtual world into the “real” world of embodied human experience. Surveillance capitalists understood that their future wealth would depend on new supply routes that extend to real life on the roads, among the trees, throughout the cities. Extension wants your bloodstream and your bed, your breakfast conversation, your commute, your run, your refrigerator, your parking space, your living room, your pancreas.
Economies of scope also proceed along a second depth dimension. The idea here is that more predictive, and therefore more lucrative, behavioral surplus can be plumbed from intimate patterns of the self. These supply operations rely on emergent rendition techniques trained on new forms of surplus from facial recognition and affective computing to voice, gait, posture, and text analysis that lay bare your personality, moods, emotions, lies, and vulnerabilities. As the prediction imperative drives deeper into the self, the value of these intimate sources of surplus becomes irresistible, and the competitive pressures to corner lucrative supplies escalate. It is no longer a matter of surveillance capital wringing surplus from what you search, buy, and browse. Surveillance capital wants more than your body’s coordinates in time and space. Now it violates the inner sanctum, as machines and their algorithms decide the meaning of your sighs, blinks, and utterances; the pattern of your breathing and the movements of your eyes; the clench of your jaw muscles; the hitch in your voice; and the exclamation points in a Facebook post once offered in innocence and hope.
Just as scale became necessary but insufficient for higher quality predictions, the demands of the prediction imperative eventually encountered the limitations of economies of scope. While behavioral surplus must be vast and varied, surveillance capitalists gradually came to understand that the surest way to predict behavior is to intervene at its source and shape it. The processes invented to achieve this goal are what I call economies of action.
Of course, advertisers and their clients have always tried to shape customer behavior through priming, suggestion, and social comparison. What distinguishes today’s efforts is that not only do they extend beyond advertising, but they employ a ubiquitous digital architecture––Page’s “cheap sensors”––that is finally able to automate the continuous comprehensive monitoring and shaping of human behavior with unprecedented accuracy, intimacy, and effectiveness. Economies of scale and scope are well-known industrial logics, but automated economies of action are distinct to surveillance capitalism and its digital milieu.
In order to achieve these economies of action, machine processes are configured to intervene in the state of play in the real world among real people and things. These interventions are designed to augment prediction products in order that they approximate certainty by “tuning,” “herding,” and conditioning the behavior of individuals, groups, and populations. These economies of action apply techniques that are as varied as inserting a specific phrase into your Facebook news feed, timing the appearance of a BUY button on your phone with the rise of your endorphins at the end of a run, shutting down your car engine when an insurance payment is late, or employing population-scale behavioral micro-targeting drawn from Facebook profiles. Indeed, the notorious manipulations of the data firm Cambridge Analytica, which scandalized the world in 2018, simply appropriated the means and methods that are now both standard and necessary operations in the surveillance capitalism arsenal.
As the prediction imperative gathers force, it gradually becomes clear that economies of scale and scope were the first phases of a more ambitious project. Economies of action mean that ubiquitous machine architectures must be able to know as well as to do. What began as an extraction architecture now doubles as an execution architecture through which hidden economic objectives are imposed on the vast and varied field of behavior. As surveillance capitalism’s imperatives and the material infrastructures that perform extraction and execution operations begin to function as a coherent whole, they produce a twenty-first-century means of behavioral modification to which the means of production is subordinated as merely one part of this larger cycle.
The means of behavioral modification does not aim to compel conformity to or compliance with social norms, as has been the case with earlier applications of the behaviorist paradigm. Rather, this new complex aims to produce behavior that reliably, definitively, and certainly leads to predicted commercial results for surveillance customers. The research director of Gartner, the respected business advisory and research firm, makes the point unambiguously when he observes that mastery of the “internet of things” will serve as “a key enabler in the transformation of business models from ‘guaranteed levels of performance’ to ‘guaranteed outcomes.’” This is an extraordinary statement, because there can be no such guarantees in the absence of the power to make it so. The wider complex of “the means of behavioral modification” is the expression of this gathering power. The prospect of businesses competing on the promise of guaranteed outcomes enabled by a global digital architecture alerts us to the force of the prediction imperative, which now demands that surveillance capitalists make the future for the sake of predicting it.
The conflation of economic imperatives and behavior modification at scale locates the surveillance capitalist project squarely in the paradigm of radical behaviorism associated with B.F. Skinner, which draws on formulations in early theoretical physics, especially the philosophical work of Max Planck. Following Planck, radical behaviorism insists on the reduction of human experience to observable measurable behavior purged of inwardness, thus establishing psychological science as the objective study of behaving objects comparable to the research paradigms of the natural sciences.
Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists are now locked in a cycle of continuous intensification of the means of behavioral modification. Although it is possible to imagine something like a ubiquitous connected sensate computational architecture without surveillance capitalism, the means of behavioral modification depend entirely on this pervasive networked architecture.
Economies of scale and scope ignored privacy norms and laws, relying on weak legitimation processes characteristic of meaningless mechanisms of notice and consent (privacy policies, end-user agreements, etc.) to accumulate decision rights in the surveillance capitalist domain. Economies of action go further. These new systems and procedures take direct aim at individual autonomy, systematically replacing self-determined action with a range of hidden operations designed to shape behavior at the source. Economies of action are constructed through systematic experimentation that began with apparent banalities like the A/B testing of webpage design elements and eventually progressed to more complex undertakings. One example is the secret manipulation of emotions demonstrated in Facebook’s vast experiments in shaping social behavior, about which the corporation’s researchers concluded, “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness….Online messages influence our experience of emotions, which may affect a variety of offline behaviors.” Another example is the population-scale social herding experiments popularized by the Google-incubated augmented reality application of Niantic Labs’ Pokémon Go, in which innocent players are herded to eat, drink, and purchase in the restaurants, bars, fast-food joints, and shops that pay to play in the company’s behavioral futures markets.
Ultimately behavioral modification capabilities are institutionalized in “innovative” commercial practices in which individuals are called on to fund their own domination. One finds digital tuning, herding, and conditioning embedded in such varied practices as the insurance industry’s embrace of “behavioral underwriting,” the gamification of retailing, the remote-control operations of automotive telematics, and the “personalized services” of the so-called digital assistants.
The means of behavioral modification are the subject of creative elaboration, experimentation, and application, but always outside the awareness of their human targets. For example, the chief data scientist for a national drugstore chain described how his company designs automated digital reinforcers to subtly tune customers’ behaviors: “You can make people do things with this technology. Even if it’s just five percent of people, you’ve made five percent of people do an action they otherwise wouldn’t have done, so to some extent there is an element of the user’s loss of self-control.” A software engineer specializing in the “internet of things” explained his company’s approach to conditioning: “The goal of everything we do is to change people’s actual behavior at scale … we can capture their behaviors and identify good and bad. Then we develop ‘treatments’ or ‘data pellets’ that select good behaviors.” Another recounted the operational mechanisms of herding: “We can engineer the context around a particular behavior and force change that way….We are learning how to write the music, and then we let the music make them dance.”
What these examples share is the explicit aim to produce planned behavioral outcomes with methods of behavioral modification that operate through unprecedented and proprietary digital architectures, while carefully circumventing the awareness of human targets. It is no longer enough to automate information flows about us; the goal now is to automate us. This phase of surveillance capitalism’s evolution finally strips away the illusion that the networked form has some kind of indigenous moral content––that being “connected” is somehow intrinsically pro-social, innately inclusive, or naturally tending toward the democratization of knowledge. Instead, digital connection is now a brazen means to others’ commercial ends. Such a self-authorizing power has no grounding in democratic legitimacy, usurping decision rights and eroding the processes of individual autonomy that are essential to the function of a democratic society. The coda here is simple: Once I was mine. Now I am theirs.
The Rise of Instrumentarian Power
There can be no behavioral modification without the power to make it so. But what is this power? Just as twentieth-century scholars of totalitarianism once looked to nineteenth-century imperialism to explain the violence of their time, it is we who now reach for the familiar vernaculars of twentieth-century power like lifesaving driftwood. Invariably we look to Orwell’s Big Brother and more generally the specter of totalitarianism as the lens through which to interpret today’s threats. The result is that Google, Facebook, and the larger field of commercial surveillance are frequently criticized as “digital totalitarianism.”
I admire those who have stood against the incursions of commercial surveillance, but I also suggest that the equation of its new power with totalitarianism and the Orwellian trope impedes our understanding as well as our ability to resist, neutralize, and ultimately vanquish its potency. Instead, we need to grasp the specific inner logic of a conspicuously twenty-first-century conjuring of power to which the past offers no adequate compass. Its aims are in many ways just as ambitious as those of totalitarianism, but they are also utterly and profoundly distinct. The work of naming a strange form of power unprecedented in the human experience must begin anew for the sake of effective resistance and the creative power to insist on a future of our own making.
As to the new species of power, I have suggested that it is best understood as instrumentarianism, defined as the instrumentation and instrumentalization of human behavior for the purposes of modification, prediction, monetization, and control. In this formulation, “instrumentation” refers to the ubiquitous, sensate, computational, actuating global architecture that renders, monitors, computes, and modifies, replacing the engineering of souls with the engineering of behavior. There is no brother here of any kind, big or little, evil or good—no family ties, however grim. Instead this new global apparatus is better understood as a Big Other that encodes the “otherized” viewpoint of radical behaviorism as a pervasive presence. “Instrumentalization” denotes the social relations that orient the puppet masters to human experience, as surveillance capital overrides long-standing reciprocities of market democracy, wielding its machines to transform us into the raw material for its own production.
Although he did not name it, Mark Weiser, the visionary of ubiquitous computing, foresaw the immensity of instrumentarian power as a totalizing societal project. He did so in a way that suggests both its utter lack of precedent and the danger of confounding it with what has gone before: “hundreds of computers in every room, all capable of sensing people near them and linked by high-speed networks have the potential to make totalitarianism up to now seem like sheerest anarchy.” In fact, all those computers are not the means to a digital hypertotalitarianism. They are, as I think Weiser sensed, the foundation of an unprecedented power that can reshape society in unprecedented ways. If instrumentarian power can make totalitarianism look like anarchy, then what might it have in store for us?
While all power yearns toward totality, instrumentarian power’s specific purposes and methods are not only distinct from totalitarianism, they are in many ways its precise opposite. Surveillance capitalists have no interest in murder or the reformation of our souls. Instrumentarian power, therefore, has no principle to instruct. There is no training or transformation for spiritual salvation, no ideology against which to judge our actions. It does not demand possession of each person from the inside out. It has no interest in exterminating or disfiguring our bodies and minds in the name of pure devotion. Totalitarianism was a political project that converged with economics to overwhelm society. Instrumentarianism is a market project that converges with the digital to achieve its own unique brand of social domination. Totalitarianism operated through the means of violence, but instrumentarian power operates through the means of behavioral modification, and this is where our focus must shift. What passes for social relations and economic exchange now occurs across the medium of this robotized veil of abstraction.
Instrumentarianism’s specific “viewpoint of observation” was forged in the controversial intellectual domain of “radical behaviorism.” Thanks to Big Other’s capabilities, instrumentarian power reduces human experience to measurable observable behavior, while remaining steadfastly indifferent to the meaning of that experience. It is profoundly, infinitely, and, following its behaviorist origins, radically indifferent to our meanings and motives. This epistemology of radical indifference produces observation without witness. Instead of an intimate violent political religion, Big Other’s way of knowing us yields the remote but inescapable presence of impenetrably complex systems and the interests that author them, carrying individuals on a fast-moving current to the fulfillment of others’ ends. Big Other has no interest in soiling itself with our excretions, but it may aggressively hunt data on the behavior of our blood and shit. It has no appetite for our grief, pain, or terror, although it welcomes the behavioral surplus that leaches from our anguish.
Trained on measurable action, Big Other cares only about observing what we do and ensuring that we do it in ways that are accessible to its ever-evolving operations of rendition, reinforcement, calculation, and monetization. Instrumentarianism’s radical indifference is operationalized in Big Other’s dehumanized methods of evaluation that produce equivalence without equality by reducing individuals to the lowest common denominator of sameness—an organism among organisms.
In the execution of economies of action, Big Other simulates the behaviorists’ “vortex of stimuli,” transforming “natural selection” into the “unnatural selection” of variation and reinforcement authored by market players and the competition for surveillance revenues. The gentle seductive voice crafted on the yonder side of this veil—Google, is that you?—herds us along the path that coughs up the maximum of behavioral surplus and the closest approximation to certainty.
The Challenge to Collective Action
How do they get away with it? Dozens of surveys conducted since 2008 attest to substantial majorities in the United States, the European Union, and around the world that reject the premises and practices of surveillance capitalism, yet it persists, succeeds, grows, and dominates, remaining largely uncontested by either existing or new forms of collective action. In other work I have detailed sixteen conditions that enabled this new logic of accumulation to root and flourish. Here I want to underscore two of these conditions: The first is the absence of organic reciprocities between surveillance capitalist firms and their populations. This absence produces the second condition, in which dependency replaces reciprocity as the fulcrum of this commercial project.
A first answer to the question “How do they get away with it?” concerns a novel structural feature of this market form that diverges sharply from the history of market democracy. For all the failings, injustice, and violence of earlier forms of modern capitalism, the necessity of organic reciprocities with its populations has been a mark of endurance and adaptability. Symbolized in the twentieth century by Ford’s five-dollar day, these reciprocities reach back to Adam Smith’s original insights into the productive social relations of capitalism, in which firms rely on people as employees and customers. Smith argued that price increases had to be balanced with wage increases “so that the laborer may still be able to purchase that quantity of those necessary articles which the state of the demand for labor … requires that he should have.” By the 1980s, globalization and neoliberal ideology, operationalized in the shareholder-value movement, went a long way toward destroying these centuries-old reciprocities between capitalism and its communities. Surveillance capitalism completes the job.
Instrumentarianism is a market project that converges with the digital to achieve its own unique brand of social domination.
First, surveillance capitalists no longer rely on people as consumers. Instead, the axis of supply and demand orients the surveillance capitalist firm to businesses intent on anticipating the behavior of populations, groups, and individuals. The result is that populations are conceptualized as undifferentiated “users,” who are merely the sources of raw material for a digital-age production process aimed at a new business customer. Where individual consumers continue to exist in surveillance capitalist operations—purchasing smart appliances, digital assistants, dolls that spy, or behavior-based insurance policies, just to name a few examples—social relations are no longer founded on mutual exchange. In these and many other instances, products and services are merely hosts for surveillance capitalism’s data extraction operations. For example, the concept of the “smart home” has become emblematic of this new asymmetry. By 2018 the global smart home market was valued at $36 billion and expected to reach $151 billion by 2023. The numbers betray an earthquake beneath their surface. Consider just one smart home device: the Nest thermostat, owned by Alphabet, the Google holding company, and merged with Google in 2018. The Nest thermostat collects data about its usage and environment. It uses motion sensors and computation to “learn” the behaviors of a home’s inhabitants. Nest’s apps can also gather data from other connected products such as cars, ovens, fitness trackers, and beds. Such systems can, for example, trigger lights if an anomalous motion is detected, signaling video and audio recording, and even sending notifications to homeowners or others. As a result of the merger with Google, the thermostat, like other Nest products, will be built with Google’s artificial intelligence capabilities, including its personal digital “Assistant.” The thermostat and its brethren devices create immense new stores of knowledge and therefore new power—but for whom?
The absence of consumer reciprocities is complemented by the absence of employment reciprocities. By historical standards the large surveillance capitalists employ relatively few people compared to their unprecedented computational resources. This pattern, in which a small, highly educated workforce leverages the power of a massive capital-intensive knowledge-production infrastructure, is called “hyperscale.” The historical discontinuity of the hyperscale business operation becomes apparent by comparing seven decades of General Motors (GM) employment levels and market capitalization to recent post-IPO (initial public offering) data from Google and Facebook. (I have confined the comparison here to Google and Facebook because both were pure surveillance capitalist firms even before their public offerings.)
Nest takes little responsibility for the security of the information it collects and none for how the other companies in its ecosystem will put those data to use.
From the time they went public to 2016, Google and Facebook steadily climbed to the heights of market capitalization, with Google reaching $532 billion by the end of 2016 and Facebook at $332 billion, without Google ever employing more than 75,000 people or Facebook more than 18,000. General Motors took four decades to reach its highest market capitalization of $225.15 billion in 1965, when it employed 735,000 women and men. Most startling is that GM employed more people during the height of the Great Depression than either Google or Facebook employs at their heights of market capitalization.
The GM pattern is the iconic story of the United States in the twentieth century, before globalization, neoliberalism, the shareholder-value movement, and plutocracy unraveled the public corporation and the institutions of what historian Karl Polanyi called “the double movement,” a network of “measures and policies … integrated into powerful institutions designed to check the action of the market relative to labor, land, and money.” Polanyi’s studies led him to conclude that the operations of a self-regulating market are profoundly destructive when allowed to run free of such countervailing laws and policies. It was the institutions of the double movement that tamed GM’s employment policies with fair labor practices, unionization, and collective bargaining, emblematic of stable reciprocities during the pre-globalization decades of the twentieth century. The societal result was predictable. In the 1950s, for example, 80 percent of adults said that “big business” was a good thing for the country, 66 percent believed that business required little or no change, and 60 percent agreed, “the profits of large companies help make things better for everyone who buys their products or services.”
[A]…survey in 2015 found 91 percent of respondents disagreeing that the collection of personal information “without my knowing” is a fair tradeoff for a price discount.
Although some critics blamed GM’s institutional reciprocities for its failure to adapt to global competition in the late 1980s, leading eventually to its bankruptcy in 2009, analyses have shown that chronic managerial complacency and doomed financial strategies bore the greatest share of responsibility for the firm’s legendary decline, a conclusion that is fortified by the successes of the German automobile industry in the twenty-first century, where strong labor institutions formally share decision-making authority.
Nearly seventy years later and in the absence of democratic checks on the power of surveillance capitalists, the picture is very different. For example, a major 2009 survey found that when Americans are informed of the ways that companies gather data for targeted online ads, 73 to 86 percent reject such advertising. Another substantial survey in 2015 found 91 percent of respondents disagreeing that the collection of personal information “without my knowing” is a fair tradeoff for a price discount. Fifty-five percent disagreed that it was a fair exchange for improved services. In 2016 Pew Research reported only 9 percent of respondents as very confident in trusting social media sites with their data and 14 percent very confident about trusting companies with personal data. More than 60 percent wanted to do more to protect their privacy and believed there should be more regulation to protect privacy.
Hyperscale firms have become emblematic of modern digital capitalism, and as capitalist inventions they present significant social and economic challenges, including their impact on employment and wages, industry concentration, and monopoly. In 2017 there were 24 hyperscale firms operating 320 data centers with anywhere between thousands and millions of servers (Google and Facebook are among the largest). One hundred more data centers are expected to be online by late 2018. Microsoft invested $20 billion in 2017, and in 2018 Facebook announced plans to invest $20 billion in a new hyperscale data center in Atlanta. According to one industry report, hyperscale firms are also building the world’s networks, especially subsea cables, which means that “a large portion of the global internet traffic is now running through private networks owned or operated by hyperscalers.” In 2016 Facebook and Google teamed up to build a new subsea cable between the United States and Hong Kong, described as the highest-capacity transpacific route to date. The surveillance capitalists who operate at hyperscale or outsource to hyperscale operations dramatically diminish any reliance on their societies as sources of employees, and the few for whom they do compete are largely drawn from the most rarefied strata of data science.
The absence of organic reciprocities with people as sources of either consumers or employees is a matter of exceptional importance in light of the historical relationship between market capitalism and democracy. In fact, the origins of democracy in both Britain and America have been traced to these very reciprocities. Even a brief glance at these histories can help us grasp the degree to which surveillance capitalism diverges from capitalism’s past, a divergence in which an extreme structural independence from people lays the foundation for surveillance capitalism’s unique approach to knowledge that we have called “radical indifference.”
In Britain, the rise of volume production and its wage-earning labor force in the nineteenth century contributed not only to workers’ economic power but also to a growing sense of labor’s political power and legitimacy. This produced a new sense of interdependence between ordinary people and elites. Economists Daron Acemoglu and James A. Robinson show that the rise of democracy in nineteenth-century Britain was inextricably bound to industrial capitalism’s dependency on “the masses” and their contribution to the prosperity made possible by the new organization of production.
Acemoglu and Robinson conclude that the “dynamic positive feedback” between “inclusive economic institutions” (i.e., institutions defined by reciprocities) and political institutions was critical to Britain’s substantial and non-violent democratic reforms. Inclusive economic institutions, they argue, “level the playing field,” especially when it comes to the fight for power, making it more difficult for elites to “crush the masses” rather than accede to their demands. Reciprocities in economics produced and sustained reciprocities in politics. “Clamping down on popular demands,” they write, “and undertaking a coup against inclusive political institutions would…destroy…[economic] gains, and the elites opposing greater democratization and greater inclusiveness might find themselves among those losing their fortunes from this destruction.”
The spread of democracy also depended on the reciprocities of consumption, and the American Revolution is the outstanding example of this dynamic. Historian T.H. Breen argues in his path-breaking book, The Marketplace of Revolution, that it was the violation of these reciprocities that set the American Revolution into motion, uniting disparate provincial strangers into a radical new patriotic force. Breen explains that American colonists had come to depend on the “empire of goods” imported from England, and that this dependency instilled the sense of a reciprocal social contract: “For ordinary people, the palpable experience of participating in an expanding Anglo-American consumer market” intensified their sense of a “genuine partnership” with England. Eventually, the British Parliament famously misjudged the rights and obligations of this partnership, imposing a series of taxes that turned imported goods such as cloth and tea into “symbols of imperial oppression.”
Breen describes the unprecedented inventiveness of a political movement originating in the shared experience of consumption, the outrage at the violation of essential producer–consumer interdependencies, and the determination to make “goods speak to power.” The translation of consumer expectations into democratic revolution occurred in three waves, beginning in 1765, when the Stamp Act triggered popular protests, riots, and organized resistance finally expressed in the “nonimportation movement.” (Today we would call it a consumer boycott.)
As Breen tells it, the details of the Act were less important than the colonists’ realization that England did not perceive them as political or economic equals bound in mutually beneficial reciprocities. “By compromising the Americans’ ability to purchase the goods they desired,” he writes, “Parliament had revealed an intention to treat the colonists like second-class subjects,” levying a heavy price “on the pursuit of material happiness.”
In the absence of the organic reciprocities between producers, customers, and employees that bind populations in a shared fate, “user” dependency is the fulcrum of the surveillance capitalist project. Surveillance capitalism spread across the internet just as digital communications became the salient means of social participation. A 2010 BBC poll found that 79 percent of people in twenty-six countries considered internet access to be a fundamental human right. Six years later in 2016, the United Nations Human Rights Council would adopt specific language on the importance of internet access. In the United States, many people call the emergency services number, 911, on those rare occasions when Facebook is down. Most people find it difficult to withdraw from these utilities, and many ponder if it is even possible. The result has been an involuntary merger of personal necessity and economic extraction, as the same channels that we rely on for daily logistics, social interaction, work, education, health care, access to products and services, and much more, now double as supply chain operations for surveillance capitalism’s surplus flows. The result is that effective social participation leads through the means of behavioral modification, eroding the choice mechanisms that once adhered to the private realm––exit, voice, and loyalty. There can be no exit from processes that are intentionally designed to bypass individual awareness and on which we must depend for effective daily life. Users lack reliable channels for voice. Loyalty is an empty suit, as participation is better explained in terms of necessity, dependency, helplessness, resignation, the foreclosure of alternatives, and enforced ignorance.
“User” dependency is thus a classic Faustian pact in which the felt needs for effective life vie against the inclination to resist instrumentarian power’s bold incursions. This conflict produces a psychic numbing that inures users to the realities of being tracked, parsed, mined, and modified. It disposes users to rationalize the situation in resigned cynicism, shelter behind defense mechanisms (“I have nothing to hide”), or find other ways to stick their heads in the sand, choosing ignorance out of frustration and helplessness. In this way, surveillance capitalism imposes a fundamentally illegitimate choice that twenty-first-century individuals should not have to make, and its normalization leaves users dancing in their chains.
These chains mark the frontier of twenty-first-century collective action. A historical parallel is instructive. Polanyi notes the “prophetic anticipation” of the early-nineteenth-century historian and social observer Harriet Martineau who in 1833 criticized “the vulgar error of the aristocracy of supposing only one class of society to exist below that wealthy one with which they are compelled by their affairs to have business.” This “error,” she argued, led to including in the single notion of “the lower classes,” “everybody below the wealthiest bankers—manufacturers, tradesmen, artisans, labourers, and paupers….” It would be decades until the distinct social, economic, and political interests of the “laborer,” and later “the working class,” emerged from the undifferentiated maw of the lower classes, distinctions that both enabled and resulted from collective action.
Now in the first decades of the twenty-first century the distinct social, political, and economic interests of “users” have yet to be carefully distinguished from the de facto conditions of experiential dispossession, datafication, control, and commodification introduced by surveillance capitalism, reified in its behavioral futures markets, and enforced by its unique and ever-widening instrumentarian power. Unless this latency is evoked into new forms of collective action, the trajectory of the digital future will be left to the new hegemon: surveillance capitalism and its unprecedented asymmetries of knowledge and power.
Industrial civilization flourished at the expense of nature and threatens to cost us the earth. An information civilization shaped by surveillance capitalism and its new instrumentarian power will thrive at the expense of human nature, especially the hard-won capacities associated with self-determination and moral autonomy that are essential to the very possibility of a democratic society. The industrial legacy of climate chaos fills us with dismay, remorse, and fear. If surveillance capitalism remains unchallenged as the dominant form of information capitalism in our time, what fresh legacy of damage and regret will be mourned by future generations? By the time you read these words, the reach of this new form will have grown, as more sectors, firms, start-ups, app developers, and investors mobilize around this one plausible version of information capitalism. This mobilization and the resistance it engenders will define a key battleground on which the next generation of collective action will be contested at the new frontier of power.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Shoshana Zuboff, Harvard Business School Professor Emerita, is the author of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
Notes
 Hannah Arendt, The Origins of Totalitarianism (New York: Schocken, 2004), 198. Karl Polanyi, The Great Transformation: The Political and Economic Origins of Our Time (Boston, MA: Beacon Press, 2001). David Harvey, The New Imperialism (New York: Oxford University Press, 2005).
 For readers who want to explore these themes more deeply, they are elaborated in Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
Jared Newman, “Google’s Schmidt Roasted for Privacy Comments,” PCWorld, December 11, 2009, http://www.pcworld.com/article/184446/googles_schmidt_roasted_for_privacy_comments.html.
 Ellen Meiksins Wood, The Origin of Capitalism: A Longer View (London: Verso, 2002), 76, 93, 125.
Jean-Charles Rochet and Jean Tirole, “Two-Sided Markets: A Progress Report,” RAND Journal of Economics 37, no. 3 (2006): 645–647, http://www.jstor.org/stable/25046265.
 Katherine J. Strandburg, “Free Fall: The Online Market’s Consumer Preference Disconnect,” Working Paper, New York University Law and Economics (New York University, October 1, 2013).
 Douglas Edwards, I’m Feeling Lucky (Boston: Houghton Mifflin Harcourt, 2011), 291.
 Polanyi, The Great Transformation, 75–76.
Edwards, I’m Feeling Lucky, 340–345.
 (Douglas 1967; Farahany 2012).
Kevin McLaughlin et al., “Bezos Ordered Alexa App Push,” Information, November 16, 2016, https://www.theinformation.com/bezos-ordered-alexa-app-push; “The Real Reasons That Amazon’s Alexa May Become the Go-to AI for the Home,” Fast Company, April 8, 2016, https://www.fastcompany.com/3058721/app-economy/the-real-reasons-that-amazons-alexa-may-become-the-go-to-ai-for-the-home.
 “Amazon Lex—Build Conversation Bots,” Amazon Web Services, February 24, 2017, https://aws.amazon.com/lex.
 “Dave Limp, Exec Behind Amazon’s Alexa.”
Ryan Knutson and Laura Stevens, “Amazon and Google Consider Turning Smart Speakers into Home Phones,” Wall Street Journal, February 15, 2017, https://www.wsj.com/articles/amazon-google-dial-up-plans-to-turn-smart-speakers-into-home-phones-1487154781; Kevin McLaughlin, “AWS Takes Aim at Call Center Industry,” Information, February 28, 2017, https://www.theinformation.com/aws-takes-aim-at-call-center-industry.
Aaron@theinformation.com et al., “Apple Loses Ground to Amazon in Smart Home Deals With Builders,” Information, accessed April 16, 2018, https://www.theinformation.com/articles/apple-loses-ground-to-amazon-in-smart-home-deals-with-builders.
Sapna Maheshwari, “Hey, Alexa, What Can You Hear? And What Will You Do With It?,” New York Times, March 31, 2018, sec. Media, https://www.nytimes.com/2018/03/31/business/media/amazon-google-privacy-digital-assistants.html.
“Alexa, Take the Wheel: Ford Models to Put Amazon in Driver Seat,” Bloomberg.com, January 5, 2017, https://www.bloomberg.com/news/articles/2017-01-05/steering-wheel-shopping-arrives-as-alexa-hitches-ride-with-ford.
 Mark Weiser, “The Computer for the 21st Century,” Scientific American, July 1999.
 Roland Marchand, Advertising the American Dream: Making Way for Modernity, 1920–1940 (Berkeley, CA: University of California Press, 1985).
 Christy Pettey, “Treating Information as an Asset,” Smarter with Gartner, February 17, 2016, http://www.gartner.com/smarterwithgartner/treating-information-as-an-asset/.
 Max Planck, “Phantom Problems in Science,” in Scientific Autobiography and Other Papers (New York: Philosophical Library, 2007), 52–79.
 Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences 111, no. 24 (June 17, 2014): 8788–8790, https://doi.org/10.1073/pnas.1320040111.
 For more on Pokémon Go and other examples of economies of action see the discussion in Zuboff, The Age of Surveillance Capitalism, 2019, chapter 10.
 Zuboff, The Age of Surveillance Capitalism, 2019, 295.
 See, for example, Peter S. Menell, “2014: Brand Totalitarianism,” UC Berkeley Public Law Research Paper (Berkeley, CA: University of California, September 4, 2013), http://papers.ssrn.com/abstract=2318492; “Move Over, Big Brother,” Economist, December 2, 2004, http://www.economist.com/node/3422918; Wojciech Borowicz, “Privacy in the Internet of Things Era,” Next Web, October 18, 2014, http://thenextweb.com/dd/2014/10/18/privacy-internet-things-era-will-nsa-know-whats-fridge; Tom Sorell and Heather Draper, “Telecare, Surveillance, and the Welfare State,” American Journal of Bioethics 12, no. 9 (2012): 36–44, https://doi.org/10.1080/15265161.2012.699137; Rhys Blakely, “‘We Thought Google Was the Future but It’s Becoming Big Brother,’” Times, September 19, 2014, http://www.thetimes.co.uk/tto/technology/internet/article4271776.ece; CPDP Conferences, Technological Totalitarianism, Politics and Democracy, 2016, http://www.internet-history.info/media-library/mediaitem/2389-technological-totalitarianism-politics-and-democracy.html; Julian Assange, “The Banality of ‘Don’t Be Evil,’” New York Times, June 1, 2013, https://www.nytimes.com/2013/06/02/opinion/sunday/the-banality-of-googles-dont-be-evil.html; Julian Assange, “Julian Assange on Living in a Surveillance Society,” New York Times, December 4, 2014, https://www.nytimes.com/2014/12/04/opinion/julian-assange-on-living-in-a-surveillance-society.html; Michael Hirsh, “We Are All Big Brother Now,” Politico, July 23, 2015, https://www.politico.com/magazine/story/2015/07/big-brother-technology-trial-120477.html; Cory Doctorow, “Unchecked Surveillance Technology Is Leading Us Towards Totalitarianism,” International Business Times, May 5, 2017, http://www.ibtimes.com/unchecked-surveillance-technology-leading-us-towards-totalitarianism-opinion-2535230; Martin Schulz, “Transcript of Keynote Speech at CPDP2016 on Technological Totalitarianism, Politics and Democracy,” Scribd, 2016, https://www.scribd.com/document/305093114/Keynote-Speech-at-Cpdp2016-on-Technological-Totalitarianism-Politics-and-Democracy.
 Weiser, “The Computer for the 21st Century,” 89.
 See, for example, Chris Jay Hoofnagle and Jennifer King, “Research Report: What Californians Understand About Privacy Offline” (SSRN Scholarly Paper, Rochester, NY: Social Science Research Network, May 15, 2008), http://papers.ssrn.com/abstract=1133075; Joseph Turow et al., “Americans Reject Tailored Advertising and Three Activities That Enable It,” Annenberg School for Communication, September 29, 2009, http://papers.ssrn.com/abstract=1478214; Joseph Turow, Michael Hennessy, and Nora Draper, “The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation,” Annenberg School for Communication, June 2015, https://www.asc.upenn.edu/news-events/publications/tradeoff-fallacy-how-marketers-are-misrepresenting-american-consumers-and; Lee Rainie, “Americans’ Complicated Feelings About Social Media in an Era of Privacy Concerns,” Pew Research Center (blog), March 27, 2018, http://www.pewresearch.org/fact-tank/2018/03/27/americans-complicated-feelings-about-social-media-in-an-era-of-privacy-concerns.
 Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
 Smith, The Wealth of Nations, 939–940.
 Reuters, “Global Smart Homes Market 2018 by Evolving Technology, Projections & Estimations, Business Competitors, Cost Structure, Key Companies and Forecast to 2023,” February 19, 2018, https://www.reuters.com/brandfeatures/venture-capital/article?id=28096.
 Ron Amadeo, “Nest Is Done as a Standalone Alphabet Company, Merges with Google,” Ars Technica, February 7, 2018, https://arstechnica.com/gadgets/2018/02/nest-is-done-as-a-standalone-alphabet-company-merges-with-google/; Leo Kelion, “Google-Nest Merger Raises Privacy Issues,” BBC News, February 8, 2018, sec. Technology, http://www.bbc.com/news/technology-42989073.
 Kelion, “Google-Nest Merger Raises Privacy Issues.”
 “Nest to Join Forces with Google’s Hardware Team,” Google, February 7, 2018, https://www.blog.google/topics/hardware/nest-join-forces-googles-hardware-team/.
 Grant Hernandez, Orlando Arias, Daniel Buentello, and Yier Jin, “Smart Nest Thermostat: A Smart Spy in Your Home,” Black Hat USA (2014), https://www.blackhat.com/docs/us-14/materials/us-14-Jin-Smart-Nest-Thermostat-A-Smart-Spy-In-Your-Home-WP.pdf.
 Guido Noto La Diega, “Contracting for the ‘Internet of Things’: Looking into the Nest,” Research Paper (London, UK: Queen Mary University of London, School of Law, 2016); Robin Kar and Margaret Radin, “Pseudo-Contract & Shared Meaning Analysis,” Legal Studies Research Paper (University of Illinois College of Law, November 16, 2017), https://papers.ssrn.com/abstract=3083129.
 Hernandez et al., “Smart Nest Thermostat: A Smart Spy in Your Home.”
 James Manyika and Michael Chui, “Digital Era Brings Hyperscale Challenges,” Financial Times, August 13, 2014, http://www.ft.com/intl/cms/s/0/f30051b2-1e36-11e4-bb68-00144feabdc0.html?siteedition=intl#axzz3JjXPNno5; “Hyperscalers Taking over the World at an Unprecedented Scale,” Data Economy (blog), April 11, 2017, https://data-economy.com/hyperscalers-taking-world-unprecedented-scale/; Paul McNamara, “What Is Hyperscale and Why Is It So Important to Enterprises?,” n.d., http://cloudblog.ericsson.com/digital-services/what-is-hyperscale-and-why-is-it-so-important-to-enterprises; Digital Realty, “What Is Hyperscale?,” Digital Realty, February 2, 2018, https://www.digitalrealty.com/blog/what-is-hyperscale/.
 These data are drawn from my own compilation of General Motors market capitalization and employment data from 1926 to 2008; Google from 2004 to 2016; and Facebook from 2012 to 2016. All market capitalization values are adjusted for inflation to 2016 dollars, as per the Consumer Price Index from Federal Reserve Economic Data, Economic Research Division, Federal Reserve Bank of St. Louis. The sources used to compile these data include Standard & Poor’s Capital IQ (Google Market Capitalization and Headcount), Wharton Research Data Services – CRSP (General Motors Market Capitalization), Standard & Poor’s Compustat (General Motors Headcount), Thomson Reuters Eikon (Facebook Market Capitalization), Company Annual Reports (General Motors Headcount), and SEC Filings (Facebook Headcount).
 Polanyi, The Great Transformation, 79.
 Opinion Research Corporation, “Is Big Business Essential for the Nation’s Growth and Expansion?,” ORC Public Opinion Index (USA, August 1954); Opinion Research Corporation, “Which of These Comes Closest to Your Impression of the Business Setup in This Country?,” ORC Public Opinion Index (USA, January 1955); Opinion Research Corporation, “Now Some Questions about Large Companies. Do You Agree or Disagree on Each of These? … Large Companies Are Essential for the Nation’s Growth and Expansion,” ORC Public Opinion Index (USA, June 1959); Louis Harris & Associates, “Which Two or Three Best Describe Most Business Corporation Leaders in the Country?,” Harris Survey (Connecticut, April 1966); Louis Harris & Associates, “Compared with What We Have Produced in the Past in This Country, Do You Feel That Our Present Leadership in the Field of Business Is Better, Worse or about the Same as We Have Produced in the Past?,” Harris Survey (Connecticut, June 1968); Louis Galambos, The Public Image of Big Business in America, 1880–1940: A Quantitative Study in Social Change (Baltimore: Johns Hopkins University Press, 1975).
 See, for example, Alfred D. Chandler, “The Enduring Logic of Industrial Success,” Harvard Business Review, March 1, 1990, https://hbr.org/1990/03/the-enduring-logic-of-industrial-success; Susan Helper and Rebecca Henderson, “Management Practices, Relational Contracts, and the Decline of General Motors,” Journal of Economic Perspectives 28, no. 1 (February 2014): 49–72, https://doi.org/10.1257/jep.28.1.49.
 Turow et al., “Americans Reject Tailored Advertising and Three Activities That Enable It.”
 Turow, Hennessy, and Draper, “The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation.”
 Rainie, “Americans’ Complicated Feelings about Social Media in an Era of Privacy Concerns.”
 David H. Autor et al., “The Fall of the Labor Share and the Rise of Superstar Firms,” SSRN Scholarly Paper (Rochester, NY: Social Science Research Network, May 22, 2017), https://papers.ssrn.com/abstract=2971352. See also Michael Chui and James Manyika, “Competition at the Digital Edge: ‘Hyperscale’ Businesses,” McKinsey Quarterly, March 2015.
 See “Hyperscalers Taking Over the World at an Unprecedented Scale,” Data Economy (blog), April 11, 2017, https://data-economy.com/hyperscalers-taking-world-unprecedented-scale; “Facebook, Google Partners in 12,800Km Transpacific Cable Linking US, China,” Data Economy (blog), October 13, 2016, https://data-economy.com/facebook-google-partners-in-12800km-transpacific-cable-linking-us-china; “Facebook Could Invest up to $20bn in a Single Hyperscale Data Centre Campus,” Data Economy (blog), January 23, 2018, https://data-economy.com/facebook-invest-20bn-single-hyperscale-data-centre-campus/.
 Daron Acemoglu and James A. Robinson, Why Nations Fail: The Origins of Power, Prosperity, and Poverty (New York: Crown Business, 2012).
 Historian Jack Goldstone observes that the magnitude of Britain’s parliamentary reforms defused the pressure for more violent change, creating a more durable and prosperous democracy. Like Acemoglu and Robinson, he concludes that “national decay” is typically associated with a social pattern in which elites do not identify their interests with those of the public, suggesting the danger of precisely the kind of structural independence enjoyed by surveillance capitalists. See Jack A. Goldstone, Revolution and Rebellion in the Early Modern World (Berkeley: University of California Press, 1993), 481, 487. See also Barrington Moore, Social Origins of Dictatorship and Democracy (Boston: Beacon, 1993), 3–39.
 BBC, “Internet Access ‘a Human Right,’” BBC News, March 8, 2010, sec. Technology, http://news.bbc.co.uk/2/hi/8548190.stm.
 “The Promotion, Protection and Enjoyment of Human Rights on the Internet” (United Nations Human Rights Council, June 27, 2016), https://www.article19.org/data/files/Internet_Statement_Adopted.pdf.
 “911 Calls about Facebook Outage Angers L.A. County Sheriff’s Officials,” Los Angeles Times, August 1, 2014, http://www.latimes.com/local/lanow/la-me-ln-911-calls-about-facebook-outage-angers-la-sheriffs-officials-20140801-htmlstory.html.
 Cecilie Schou Andreassen et al., “Development of a Facebook Addiction Scale,” Psychological Reports 110, no. 2 (April 2012): 501–517, https://doi.org/10.2466/02.09.18.PR0.110.2.501-517; Cecilia Cheng and Angel Yee-lam Li, “Internet Addiction Prevalence and Quality of (Real) Life: A Meta-Analysis of 31 Nations across Seven World Regions,” Cyberpsychology, Behavior and Social Networking 17, no. 12 (December 2014): 755–760, https://doi.org/10.1089/cyber.2014.0317; Adam Alter, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked (New York: Penguin Press, 2017); Silvia Casale and Giulia Fioravanti, “Satisfying Needs through Social Networking Sites: A Pathway towards Problematic Internet Use for Socially Anxious People?,” Addictive Behaviors Reports 1, no. Supplement C (June 1, 2015): 34–39, https://doi.org/10.1016/j.abrep.2015.03.008; Cecilie Schou Andreassen and Ståle Pallesen, “Social Network Site Addiction: An Overview,” Current Pharmaceutical Design 20, no. 25 (August 2014): 4053–4061, http://www.ingentaconnect.com/content/ben/cpd/2014/00000020/00000025/art00007; Claudia Dreifus, “Why We Can’t Look Away From Our Screens,” New York Times, March 6, 2017, sec. Science, https://www.nytimes.com/2017/03/06/science/technology-addiction-irresistible-by-adam-alter.html; Mark D. Griffiths, Daria J. Kuss, and Zsolt Demetrovics, “Social Networking Addiction,” in Behavioral Addictions (Elsevier, 2014), 119–141, https://doi.org/10.1016/B978-0-12-407724-9.00006-9.
 The phrase is from Roberto Mangabeira Unger, “The Dictatorship of No Alternatives,” in What Should the Left Propose? (London; New York: Verso, 2006), 1–11.
 Polanyi, The Great Transformation, 104–105.