Chapter 34: Surveillance Capitalism and Its Critics

"Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as 'machine intelligence,' and fabricated into prediction products that anticipate what you will do now, soon, and later."

— Shoshana Zuboff, The Age of Surveillance Capitalism (2019)

"The problem with surveillance capitalism is not that it is capitalism. The problem is that it is surveillance."

— Paraphrase of the Zuboff thesis


Opening: Jordan Reads a Book

Dr. Osei had assigned the reading two weeks ago: the introduction and first two chapters of Shoshana Zuboff's The Age of Surveillance Capitalism. "Read it carefully," she'd said. "Then read it critically. The argument is important. The argument is also incomplete."

Jordan had read it carefully. Now, three nights before the class discussion, they sat with the book and their notes and tried to figure out what Dr. Osei meant by incomplete.

The book was compelling. Zuboff, a professor emerita at Harvard Business School, had spent years arguing that Google's advertising business model represented something genuinely new — not just capitalism adapted to the internet but a new economic logic that used human experience itself as raw material. She called it "surveillance capitalism." Her writing was urgent and sweeping.

Jordan had highlighted two passages. The first:

"Surveillance capitalism's story begins with Google and quickly spreads to Facebook, then to an expanding roster of companies... The game is not to sell products to you, but to sell your future behavior to their actual customers: businesses."

The second, from Chapter 2:

"We are no longer the subjects of value realization. Nor are we, as some have insisted, the 'product.' We are the objects from which raw materials are extracted and expropriated for Google's prediction factories."

Jordan put the book down. Objects. Not customers, not products — objects. The warehouse flashed through their mind. The way the tracker on their vest logged movements to the nearest three feet. The way the algorithm set the pace for picking orders.

But then they thought about what Dr. Osei had said: incomplete. And they started to see the edges.

Zuboff focused on Google and Facebook. She said surveillance capitalism was new — a departure from capitalism's prior logic. But Jordan had read about plantation slavery in another class. About how enslaved people's bodies had been commodified, their reproductive capacity calculated as asset value, their time and energy extracted as raw material. Was that surveillance capitalism? It didn't involve behavioral data. But the structure — human beings as objects from which value is extracted without consent — seemed familiar.

Jordan wrote in their notes: Is the problem surveillance? Or is the problem extraction? Is Zuboff right that this is new? Or is she seeing a new form of something old?

They kept reading. This chapter explores those questions.


34.1 Zuboff's Thesis: The Full Architecture

Shoshana Zuboff's The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019) is the most comprehensive and widely read critical account of the data economy's political economy. Running to nearly 700 pages, it offers a theory, a history, a critique, and a warning. Understanding it — and understanding its critics — is essential to any serious analysis of surveillance in the contemporary world.

The Rendition Cycle

Zuboff describes what she calls the "rendition cycle" — the process through which human experience becomes behavioral data becomes prediction product:

  1. Extraction: Human behavior — every click, search, location ping, purchase, hover, pause, and expression — is observed and collected. Zuboff uses the word "rendition" deliberately: to render is to extract raw material (to render fat from meat), and also to surrender (to render oneself to authorities). We both surrender ourselves and are processed for raw material.

  2. Analysis: The extracted behavioral data is processed using machine learning and statistical analysis to identify patterns. These patterns allow prediction of future behavior — what you will click, buy, search for, watch, feel.

  3. Actuation: Prediction products — behavioral futures — are sold to advertisers, insurers, employers, financial institutions, and governments. The goal is not merely to predict behavior but to influence it toward commercially valuable outcomes. This is what Zuboff calls "behavior modification at scale."

Behavioral Surplus

A key concept in Zuboff's analysis is "behavioral surplus." When Google processes a search query to return relevant results, some of that behavioral data is useful for improving the service. The rest — information about the user's psychology, context, interests, relationships, and likely future behavior — is surplus. This surplus is not needed to provide search results. It is claimed by Google as a raw material for its prediction products.

The concept of behavioral surplus explains the incentive structure of surveillance capitalism: the more behavioral data collected, the more surplus, the more accurate the predictions, the more valuable the products. This creates an effectively unlimited appetite for data collection. There is no "enough" — every additional behavioral signal adds potential value.

The "Three Laws" of Surveillance Capitalism

Zuboff proposes three "laws" that describe surveillance capitalism's logic:

First Law: "Everything can be informated." Any aspect of human behavior can be translated into data. As the cost of sensors, computing, and data storage keeps falling, so do the practical barriers to informating human experience. What could not be collected yesterday can be collected today.

Second Law: "Surveillance capitalism claims human experience as free raw material for translation into behavioral data." The extraction is unilateral — users do not choose to provide behavioral surplus; it is taken from them as a condition of using services they depend on.

Third Law: "The prediction products that are derived from behavioral surplus are sold in behavioral futures markets." The customers of surveillance capitalism are not users — they are the companies, governments, and institutions that purchase predictions about user behavior.

📊 Real-World Application: The behavioral futures market is visible in the structure of advertising auctions. When you visit a webpage, before the page loads, an auction has already occurred in milliseconds: advertisers bid on the right to show you an ad, based on behavioral predictions about your likelihood to respond to specific types of advertising. The price of the auction slot reflects the value of the prediction — how confident the system is that you will click, buy, or convert. This auction, invisible to you, is the commercial core of surveillance capitalism.
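
To make the mechanics concrete, here is a minimal sketch of such an auction under a second-price rule, one common auction design. The bidder names, click probabilities, and dollar values are invented, and real exchanges layer far more machinery (identity syncing, floor prices, fraud checks) on top of this logic:

```python
# Minimal sketch of a real-time bidding auction. All names and numbers are
# invented for illustration; real ad exchanges run far more elaborate logic.

def expected_value_bid(predicted_click_prob: float, value_per_click: float) -> float:
    """A bidder's willingness to pay, driven by a behavioral prediction."""
    return predicted_click_prob * value_per_click

def second_price_auction(bids: dict) -> tuple:
    """Highest bidder wins but pays the second-highest bid (one common rule)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Each bid encodes a prediction about what *this* user is likely to do next.
bids = {
    "shoe_retailer": expected_value_bid(0.031, 1.20),
    "travel_site":   expected_value_bid(0.008, 2.50),
    "news_paywall":  expected_value_bid(0.012, 4.00),
}

winner, price = second_price_auction(bids)
print(f"{winner} wins the impression at ${price:.4f}")
```

The price paid for the slot is driven entirely by the predicted-behavior term — which is why more behavioral surplus translates directly into more valuable prediction products.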


34.2 The Historical Argument: Why Zuboff Calls This New

A central claim in Zuboff's analysis is that surveillance capitalism is genuinely novel — not simply an extension of industrial capitalism or consumer capitalism into digital space, but a new economic logic with new forms of power.

Her argument for novelty rests on several claims:

Novel raw material: Industrial capitalism extracts value from nature and labor. Consumer capitalism extracts value from consumers through commodity sale. Surveillance capitalism extracts value from experience itself — the behavioral traces of how people live their lives. Experience, as such, has not previously been a raw material of economic production.

Novel market form: Behavioral futures markets — markets in which predictions about future human behavior are bought and sold — are genuinely new. There have been markets in information about consumers (credit bureaus, marketing lists) but not markets in predictions about specific individuals' future behavior at this granularity and scale.

Novel power claims: Surveillance capitalism, Zuboff argues, claims the power to shape behavior toward commercially valuable outcomes — to influence what people do, buy, feel, and think, not just to predict what they will do. She calls this "instrumentarian power" — power that operates not through violence or law but through the behavioral modification capabilities of digital systems.

The Google Origin Story

Zuboff traces surveillance capitalism's origins to a specific historical moment: the years after the dot-com crash of 2000, when Google discovered that its enormous store of behavioral data could be used to target advertising with unprecedented precision.

Before this discovery, Google operated as a relatively traditional technology company: people used the search engine, advertisers paid for placement, and the targeting was crude. After 2001, Google engineers realized that the behavioral data they were collecting as a byproduct of search — what people searched for, in what order, from what locations, and how they responded to results — was extraordinarily predictive of advertising response.

The behavioral surplus that had been an incidental byproduct became the primary raw material. The search engine became the vehicle for collecting it. This transition — from service (search) to extraction (behavioral data collection) — is the origin of surveillance capitalism as Zuboff understands it.

💡 Intuition Checkpoint: Google's search is "free" to use. The actual product — what Google is selling — is predictions about your behavior. Who are Google's actual customers? What are they actually buying? How does answering these questions change how you understand your relationship to Google's services?


34.3 The Chilling Effect at Scale: Reality Mining and Behavior Modification

One of Zuboff's most disturbing claims concerns the behavioral modification capabilities that surveillance capitalism enables. This is not merely about predicting behavior — it is about shaping it.

Reality Mining

"Reality mining" — a term from academic research at MIT's Media Lab — refers to the use of behavioral data to understand patterns of human life at a granularity previously impossible. Sandy Pentland, who coined the term, developed it in the context of mobile phone data: by analyzing call patterns, location traces, and sensor data from phones, researchers could predict with remarkable accuracy what someone would do tomorrow, or next week, based on what they had done before.

Surveillance capitalism companies operationalize reality mining at scale. Netflix uses viewing behavior to predict what you will watch next and what content will keep you watching. Spotify uses listening behavior to predict mood and attention. Facebook uses posting, liking, and engagement behavior to predict emotional state, political leanings, and consumer preferences.

The knowledge generated by reality mining is used not just for passive prediction but for active intervention. Netflix's algorithm doesn't just predict what you'd watch — it shapes what options are most prominently displayed. Spotify's algorithm doesn't just predict your mood — it shapes your listening experience. Facebook's algorithm doesn't just predict your political leanings — it shapes what political content you see.

The 2012 Facebook Emotional Contagion Study

In 2014, it emerged that Facebook had conducted a secret experiment in 2012 on approximately 689,000 users. The experiment manipulated the emotional valence of content in users' news feeds — showing some users more positive content, others more negative content — to test whether emotional states could be induced through feed manipulation.

The study found that it could: users exposed to more negative content in their feeds produced more negative posts; users exposed to more positive content produced more positive posts. Emotional states were "contagious" through social media, and the feed algorithm could be used to induce specific emotional states.
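
A deliberately crude sketch of the mechanism — not the study's actual method, which classified posts with the LIWC sentiment lexicon at Facebook's scale — shows how little machinery valence-based feed curation requires. The word lists, omission rate, and posts below are invented:

```python
# Deliberately simplified illustration of valence-based feed curation.
# Word lists, omission rate, and posts are invented; this is not the study's code.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def valence(post):
    """Classify a post as positive, negative, or neutral by word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def curate_feed(posts, suppress, omission_rate, seed=0):
    """Probabilistically withhold posts of one emotional valence from the feed."""
    rng = random.Random(seed)
    return [p for p in posts
            if not (valence(p) == suppress and rng.random() < omission_rate)]

feed = ["had a great day", "feeling sad today", "meeting moved to noon", "this week was awful"]
print(curate_feed(feed, suppress="negative", omission_rate=0.9))
```

The point of the sketch is not the code but the asymmetry: nothing in the user's experience reveals that posts were withheld.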

The experiment was conducted without user knowledge or consent. The paper describing it, published in Proceedings of the National Academy of Sciences, prompted outcry about the ethics of experimentation on social media users — both the lack of consent and the manipulation of emotions for research purposes.

For Zuboff, this experiment is evidence of surveillance capitalism's "actuation" capability: not just predicting behavior but modifying it through algorithmic intervention in ways users cannot detect or consent to.

🎓 Advanced Note: The Facebook emotional contagion study raises questions about research ethics that extend beyond surveillance capitalism. Was the experiment covered by the Common Rule (federal regulations for human subjects research)? Facebook argued it was not, because the experiment used existing platform functions (feed curation). Critics disagreed. The study prompted significant debate about the ethical obligations of tech companies conducting behavioral experiments at massive scale without IRB oversight.


34.4 Primary Sources: Zuboff in Her Own Voice

The following extended excerpt from The Age of Surveillance Capitalism captures Zuboff's central argument in her own framing:

"Surveillance capitalism originated at Google, then spread to Facebook and eventually to a wide range of businesses across the economy... The game is not to sell products to you. Surveillance capitalists discovered that the most predictive behavioral data comes from intervening in the state of play in order to nudge, coax, tune, and herd behavior toward profitable outcomes. Modification, monetization: surveillance capitalism's formula for turning our goals and purposes into its means."

And:

"Surveillance capitalism relies on unprecedented asymmetries of knowledge and the power they confer. Surveillance capitalists know everything about us, whereas their operations are designed to be unknowable to us. They accumulate vast domains of new knowledge from us, but not for us. They predict our futures for the sake of others' gain, not ours."

And on the concept of "instrumentarian power":

"Instrumentarian power... does not claim your soul. It does not need to. In this new scenario, it is enough to influence behavior. Instrumentarianism is indifferent to meaning, and it is certainly indifferent to your meaning... It is not interested in dispossessing you of your inner life. It is merely indifferent to it. The soul is irrelevant. Behavior is all."

These passages are essential primary source material. They convey not just the argument but Zuboff's moral alarm — her sense that something genuinely threatening to human self-determination is at stake.


34.5 What Zuboff Gets Right

Zuboff's analysis captures genuinely important and accurate features of the data economy.

The novelty of behavioral futures markets: Markets in predictions about specific individuals' future behavior, conducted at millisecond speed across billions of data points, are genuinely new. Nothing in prior capitalism resembled the millisecond advertising auction that precedes the loading of an ad-supported web page.

The asymmetry of knowledge: Surveillance capitalists know vastly more about users than users know about surveillance capitalists. The opacity of data collection, data use, and algorithmic decision-making is real, persistent, and deliberately maintained.

The actuation capability: The evidence that behavioral data is used not just to predict but to modify behavior — through feed algorithms, recommendation systems, and targeted messaging — is substantial. Zuboff is right that this represents a qualitative change from advertising that merely tried to persuade.

The political risks: Zuboff's chapters on "instrumentarian power" and the risks to democracy from behavioral modification at scale — including the chapter on the 2016 election and Cambridge Analytica — identify real political risks that subsequent events have amplified.

The extraction without consent: The basic point that behavioral surplus is extracted without meaningful consent is accurate. The opt-out framing of most privacy mechanisms, and the "consent" represented by clicking through terms of service, do not constitute genuine voluntary participation in behavioral data extraction.


34.6 What Zuboff Gets Wrong (or Incomplete)

Dr. Osei told Jordan that the argument was important and incomplete. Let's examine the incompleteness.

Critique 1: Surveillance Capitalism as Novelty

Zuboff's claim of novelty has been challenged on historical grounds. Scholars of race and empire have argued that the extraction of behavioral data — and more broadly, the treatment of human beings as raw material from which value is extracted — is not new. It has precedents in:

  • Slavery: The commodification of human beings as property, with their labor, reproduction, and physical existence treated as raw material for economic production. Enslaved people were under constant surveillance — their movements, communications, relationships, and bodies monitored and controlled for the benefit of enslavers. The behavioral data collected (who was trying to escape, who was forming relationships, who was organizing) was used to modify behavior through violence.

  • Colonial administration: European colonial powers developed detailed surveillance systems for monitoring and controlling colonized populations — census data, identity documents, movement registration — in service of extraction of resources and labor.

  • Scientific management (Taylorism): Frederick Taylor's time-and-motion studies of factory workers in the early twentieth century used observation and data collection to optimize worker behavior for economic output. The warehouse where Jordan works is a contemporary manifestation of Taylorist logic, implemented through digital tracking.

Nick Couldry and Ulises Mejias, in The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism (2019), argue that Zuboff's novelty claim misses the colonial logic of data capitalism. They describe "data colonialism" — the extraction of data from populations in the Global South, often under conditions where those populations have no meaningful ability to refuse, and the use of that data to generate value for corporations in the Global North. This is structurally analogous to the extraction of natural resources under colonialism, with human behavioral data as the extracted material.

Critique 2: The Problem Is Capitalism, Not Surveillance Capitalism

A second critique: Zuboff's argument implies that there was a prior capitalism — "industrial capitalism," "consumer capitalism" — that was acceptable, and that surveillance capitalism is a departure from it. This framework has been challenged by scholars who argue that capitalism, in all its forms, has depended on surveillance, extraction, and coercion.

If the problem is not the surveillance but the capitalism — the economic logic that reduces everything to raw material for profit — then Zuboff's analysis leads not toward regulation of data collection practices but toward more fundamental transformation of economic relations.

This critique connects to abolitionist arguments about predictive policing (Chapter 33) and facial recognition (Chapter 35): if the problem is structural, incremental reforms of the worst practices may legitimate the underlying system rather than challenge it.

Critique 3: Agency and Resistance

Critics have argued that Zuboff's framework leaves little room for human agency. Users appear in her account primarily as victims — objects of extraction, subjects of behavioral modification. The extensive practices of counter-surveillance, collective organization, and legal resistance documented in Chapters 31-33 suggest a more complicated picture: people are not simply passive in the face of surveillance capitalism.

This is partly a question of emphasis. Zuboff's goal was to expose the mechanism, not catalog the resistance. But the theoretical implication is that resistance is either impossible (the system is too powerful) or beside the point (the structural problem requires structural solutions). Neither conclusion is well-supported by evidence.

Critique 4: The Tech Exceptionalism Problem

Zuboff focuses almost exclusively on Google, Facebook, and the advertising-tech sector. But the logic she describes — collecting behavioral data, using it to predict and modify behavior, selling predictions — also characterizes financial institutions, insurance companies, health care systems, governments, and employers. The behavioral futures markets that concern Zuboff are embedded in social institutions that predate Google.

Ben Green, in The Smart Enough City (2019), argues that technology companies are too often treated as if they invented both the problems they claim to solve and the tools for solving them, when both predate them. Zuboff's Google origin story may miss the ways in which the behavioral surveillance logic she describes was already present in direct marketing, credit scoring, and actuarial insurance before the internet made it digital and automated.

⚠️ Common Pitfall: Reading Zuboff as the definitive account risks missing the critiques — data colonialism, the argument that capitalism rather than surveillance is the root problem, the charge of tech exceptionalism, the observation that prior capitalism was not so different — that clarify and complicate her analysis. Zuboff is essential reading, not because her argument is complete, but because it is the most fully developed version of a perspective that any serious student of surveillance must engage with.


34.7 Alternative Frameworks

Data Colonialism: Couldry and Mejias

Nick Couldry and Ulises Mejias's The Costs of Connection (2019) offers the most sustained alternative framework. They argue that the extraction of data from human life is best understood as a new form of colonialism:

  • Like territorial colonialism, data colonialism involves the seizure of resources (data) from people who have no meaningful ability to refuse
  • Like territorial colonialism, data colonialism generates value for the powerful (corporations in the global technology industry) through extraction from the less powerful (users, particularly those in the Global South)
  • Like territorial colonialism, data colonialism uses the language of development and benefit (connectivity, services, digital inclusion) to justify the extraction
  • Like territorial colonialism, data colonialism reshapes the colonized environment to facilitate extraction — digital infrastructure becomes the equivalent of railroad networks built to extract colonial resources

The colonial frame extends Zuboff's analysis in two ways: it provides historical depth (data colonialism has precedents in territorial colonialism) and it centers global power asymmetries (the extraction operates along existing North-South and rich-poor axes).

🌍 Global Perspective: The data colonialism framework is particularly important for understanding surveillance in the Global South. Facebook's Free Basics program — which provided free internet access in developing countries, but only to Facebook and a limited set of services it selected — is an example of the colonial logic: providing connectivity as a vector for data extraction, with the "connected" populations having no meaningful ability to choose connectivity without data extraction.

Ben Green: The Smart Enough City

Ben Green's The Smart Enough City (2019) argues for a more pragmatic political economy of urban technology — one that resists both the techno-optimist claims of "smart city" boosters and the techno-pessimist sweeping critiques of Zuboff.

Green's argument: technology is not inherently good or bad. The question is always: who controls it, for whose benefit, and through what accountability mechanisms? Rather than asking whether digital technology should be used in cities (it will be, he argues), the productive question is how to design and govern it to serve public rather than private interests.

This leads to practical recommendations: public data infrastructure (rather than private), democratic governance of algorithmic systems, community participation in design, and transparency requirements. Green is less interested in indicting surveillance capitalism as a system than in identifying the levers of reform within existing political and institutional contexts.


34.8 Regulation Debates: What Should Be Done?

If Zuboff's diagnosis is correct — or even largely correct — what follows for policy?

Break Up Big Tech?

The antitrust argument: Google, Meta, Amazon, and Apple have achieved monopoly or near-monopoly positions in their respective markets. Their dominance in search, social media, e-commerce, and mobile operating systems creates barriers to entry that prevent competition from challenging their surveillance capitalism business models.

The antitrust response: require divestiture (Facebook must sell Instagram and WhatsApp), mandate interoperability (social media platforms must allow users to move data and connections to competing platforms), and prevent acquisitions that eliminate potential competition.

Critics note that antitrust enforcement focuses on market competition, not privacy. Breaking up Facebook into three companies (Facebook, Instagram, WhatsApp) would create three surveillance capitalism operations rather than one. The underlying business model — behavioral data extraction and sale of behavioral futures — would be unchanged.

Property Rights in Data?

Some advocates propose that individuals should have property rights in their personal data — that you own the data generated by your behavior, can license its use, and must be compensated when it is used commercially.

This approach has intuitive appeal (it gives individuals control and compensation) but faces serious objections:

  • It commodifies rather than protects privacy. If your data is property you can sell, then corporations will simply pay for it — establishing a market in surveillance. The poor will sell their data; the rich will protect their privacy. Privacy becomes a luxury good.
  • It doesn't address the structural problem. Individual data rights don't change the incentive structure that generates behavioral data collection; they just create a market for it.
  • Data is not like other property. Once data is shared, it can be used, combined, analyzed, and re-identified in ways that property rights don't adequately capture.

Public Data Infrastructure?

An alternative proposal: treat data infrastructure as public rather than private. Just as roads, water systems, and electrical grids are treated as public utilities in most democracies, data infrastructure — the networks through which behavioral data flows and the systems through which it is analyzed — could be publicly owned, operated in the public interest, and governed democratically.

This would not require eliminating private companies but would change the terms on which they operate: behavioral data collected through public infrastructure is public, governed by democratic institutions, used for public purposes. Private companies could provide services on top of public infrastructure but would not own the infrastructure or the data it carries.

This is a radical proposal — it challenges the private ownership of the internet's commercial infrastructure. But it has precedents in how other critical infrastructure has been governed, and it addresses the structural problem rather than individual practices.

Abolition vs. Reform

The debate between abolition and reform within surveillance studies mirrors the debate in criminal justice: Is the goal to eliminate surveillance capitalism (abolition) or to make it less harmful through better regulation (reform)?

Abolitionists argue that surveillance capitalism's fundamental logic — treating human experience as raw material — is incompatible with human dignity and democratic governance. Reforms that mitigate specific harms while accepting the underlying logic legitimate the system rather than challenging it.

Reformists argue that abolition is politically unrealistic in the near term and that meaningful reform — strong data protection law, algorithmic transparency, limits on behavioral surplus collection, antitrust enforcement — can reduce harms substantially while more fundamental change is built.

This is not an abstract debate. It determines what advocates fight for, what they're willing to accept as partial victory, and how they relate to policymakers and tech companies who are willing to negotiate reforms but not abolition.

📝 Note: The debate maps onto the broader political economy of capitalism: reformists (social democrats, progressives) and abolitionists (socialists, transformationists) disagree not just about tactics but about the nature of the problem. Whether surveillance capitalism can be reformed from within is a version of whether capitalism can be reformed from within — a question with a long history and no settled answer.


34.9 The "Techlash" and Its Limits

Beginning in the years after the 2016 U.S. election — with the Cambridge Analytica scandal, the start of GDPR enforcement in 2018, and increasing attention to platform content moderation, algorithmic discrimination, and the mental health effects of social media — a "techlash" emerged: public and regulatory skepticism about major technology companies after a period of largely uncritical celebration.

The techlash has produced:

  • Congressional hearings with tech CEOs (widely criticized as revealing congressional ignorance of how the technology works)
  • GDPR enforcement (discussed in Chapter 31)
  • Increased academic research on algorithmic bias, surveillance, and platform power
  • Investigative journalism (the New York Times series on facial recognition, the Wall Street Journal's "Facebook Files," etc.)
  • Some legislative action (COPPA enforcement, state privacy laws, some antitrust action)

But the techlash has also revealed the limits of public pressure on structural problems:

  • Facebook's stock price and user numbers largely weathered Cambridge Analytica
  • Congressional hearings produced awareness but little legislation
  • GDPR enforcement, while significant, has not fundamentally changed the behavioral advertising business model
  • Google's and Meta's revenue from behavioral advertising has grown through the techlash period

The techlash has changed the discourse — surveillance capitalism is now a recognized public concern in a way it was not in 2010. Whether discourse change translates into structural change depends on political will, organizational capacity, and economic incentives that the techlash alone has not been sufficient to redirect.


34.10 Can Surveillance Capitalism Reform Itself? The Google/Facebook Privacy Pivot

Google and Meta have both made significant public commitments to privacy in recent years:

  • Google announced plans to phase out third-party cookies in Chrome, replacing them with a Privacy Sandbox system that would conduct behavioral targeting within the browser rather than through cross-site tracking
  • Meta launched "Privacy Checkup" tools and committed to end-to-end encryption for Messenger
  • Apple's App Tracking Transparency required apps to obtain user permission before tracking across apps and websites — a significant reduction in cross-app tracking that dramatically reduced Meta's advertising revenue

Are these privacy pivots evidence that surveillance capitalism can reform from within?

The skeptical reading: these moves are primarily competitive positioning rather than genuine privacy protection. Apple's ATT hurt its competitors (particularly Meta) while positioning Apple's own advertising products as more privacy-respecting. Google's Privacy Sandbox replaces third-party tracking with a system that still enables behavioral targeting but centralizes it within Google's infrastructure — potentially increasing Google's competitive advantage while claiming privacy improvement.

The sympathetic reading: competitive pressure can produce genuine privacy improvements. If Apple's ATT reduces cross-app tracking (which it has), users genuinely benefit regardless of Apple's motivations. Market forces and regulatory pressure together can move the industry in privacy-protective directions even without abolishing the underlying business model.

The realistic assessment: partial reforms that reduce specific surveillance practices while preserving the fundamental logic of behavioral data extraction and sale of behavioral futures are not what Zuboff's analysis indicates is needed. They are better than nothing. They are not sufficient.

🔗 Connection: This debate returns to the abolition vs. reform argument (Section 34.8) and to the question of whether individual actions, even at corporate scale, can change structural problems. The answer depends on what you believe the structural problem is — whether it is surveillance capitalism specifically (in which case corporate privacy pivots might address it) or capitalism more broadly (in which case they cannot).


34.11 Jordan's Reflection: Between the Library and the Warehouse

Jordan finished the Zuboff chapters on a Tuesday. On Wednesday they went to the warehouse.

The tracker on their vest logged every movement. The picking algorithm assigned tasks at a pace calibrated to the 75th percentile of picker efficiency — fast enough to keep the warehouse profitable, slow enough that most people could sustain it. If Jordan's efficiency rating dropped below a threshold, the system flagged it. If it dropped far enough, a supervisor would appear.

Jordan thought about Zuboff. Human experience as raw material. The warehouse wasn't harvesting Jordan's search behavior or social media posts. It was harvesting Jordan's labor — their physical movement through space, their decision-making speed, their capacity for sustained attention.

And then Jordan thought about what Zuboff missed. The warehouse logistics company was not a surveillance capitalist in Zuboff's sense — it didn't sell behavioral predictions. It used behavioral data (Jordan's movement logs) to optimize extraction of labor value. This was industrial capitalism with digital instruments. The surveillance was not a departure from prior capitalism — it was prior capitalism's logic applied to digital tracking.

Maybe, Jordan thought, the problem is the extraction. The surveillance makes the extraction more efficient, more granular, more total. But the extraction came first. The desire to treat human beings as raw material — as means rather than ends — was not invented by Google.

They thought about the workers around them. Most of them didn't know what surveillance capitalism was. They knew their bodies hurt. They knew the pace was designed by someone who had never tried to sustain it. They knew the tracker knew where they were and the supervisor showed up when the tracker said something was wrong.

Was the solution to understand surveillance capitalism theoretically? Or was it to organize a union?

Jordan thought Yara would say: both. Theory without action is just description. Action without theory keeps aiming at the wrong targets.

They picked up the pace for a moment, watched the tracker update, then returned to their normal rhythm. A small act of obfuscation. A tiny disruption in the data.

It wasn't enough. But it was something.


34.12 Chapter Summary

This chapter examined Shoshana Zuboff's surveillance capitalism thesis and the critiques that complicate and extend it.

Zuboff's thesis: Surveillance capitalism is a new economic logic that treats human behavioral experience as raw material, extracts "behavioral surplus" from users of digital platforms, uses that surplus to produce prediction products, and sells those products in behavioral futures markets. The result is an unprecedented asymmetry of knowledge and power between surveillance capitalists and their subjects, and an actuation capability that can modify behavior at scale.

What Zuboff gets right: The novelty of behavioral futures markets, the knowledge asymmetry, the actuation capability, the extraction without meaningful consent, and the political risks to democracy are all real and important insights.

What Zuboff gets wrong or leaves incomplete: The novelty claim ignores structural precursors in slavery, colonial administration, and scientific management. The focus on Google and Facebook misses the surveillance logic embedded in insurance, finance, employment, and government. The treatment of users as passive objects misses substantial evidence of resistance and agency.

Alternative frameworks: Data colonialism (Couldry and Mejias) centers global power asymmetries and historical continuities with territorial colonialism. Green's pragmatic political economy focuses on governance and accountability rather than systemic condemnation.

Policy debates: Break-up, data property rights, public data infrastructure, and strong privacy regulation represent different responses to the structural problem. The abolition vs. reform debate is genuine and consequential.

The techlash: Public and regulatory backlash has changed the discourse without yet producing structural change. Corporate privacy pivots are better than nothing and insufficient by themselves.


Next: Chapter 35 examines facial recognition as a specific, urgent manifestation of the surveillance architecture — its technical workings, documented harms, and contested regulation.