In This Chapter
- Opening: The Dashboard and the Mirror
- 20.1 The Quantified Self: Origins and Philosophy
- 20.2 The Therapeutic Appeal
- 20.3 Foucault's Technologies of the Self
- 20.4 Wearable Technology and the Data It Generates
- 20.5 Continuous Glucose Monitors and Insurance Implications
- 20.6 When Voluntary Becomes Obligatory: Wellness Programs
- 20.7 The Quantified Self and Capitalism
- 20.8 Social Self-Surveillance: Posting Your Life
- 20.9 Jordan's Employer, the Fitbit, and the Choice That Isn't
- 20.10 Structural Analysis of the Quantified Self
- Chapter Summary
- Key Terms
- Discussion Questions
Chapter 20: Self-Surveillance: The Quantified Self and Voluntary Monitoring
Opening: The Dashboard and the Mirror
Jordan Ellis has, in the past six months, begun tracking things. It started with running: a friend convinced Jordan to sign up for a 5K, and Jordan downloaded a running app that tracked pace, distance, and calories. The app showed a graph of improvement over time, and the graph was satisfying.
Then Jordan downloaded a sleep tracker. Then a mood journal. Then — after a difficult month at the warehouse — a productivity app that tracked how Jordan spent their phone's screen time. The apps multiplied quietly, and Jordan found something unexpected in them: a feeling of legibility. The chaos of a hard semester, a demanding job, a social life that was sometimes complicated — all of it could be reduced to data, graphed, analyzed, maybe improved.
One evening Jordan is looking at their phone and realizes they have, at this moment, five apps actively logging their behavior: location (still), sleep (via the phone's motion sensor), mood (they've been logging three times a day), running (in background), and screen time (continuous). They have become, without quite deciding to, a monitored entity. The difference from the surveillance examined in previous chapters is that the monitor is themselves.
Or is it? The running data is on the app company's servers. The sleep data is stored by another company. The mood journal syncs to the cloud. The screen time data goes to Apple. The location data, as Jordan now knows viscerally, goes everywhere.
This chapter examines the Quantified Self — the cultural movement and personal practice of comprehensive self-measurement — and asks what it means when the surveillance is something you do to yourself, for yourself, with your own eyes and your own tools. And then it asks: is that actually what is happening?
20.1 The Quantified Self: Origins and Philosophy
Gary Wolf, Kevin Kelly, and the Movement's Founding
The "Quantified Self" (QS) as a named movement and cultural phenomenon was articulated by Gary Wolf and Kevin Kelly, editors at Wired magazine, who coined the term in a 2007 article. Wolf and Kelly described a community of people who were using technology to systematically measure and analyze their own physiological and behavioral data in order to understand themselves better and improve their lives.
The movement's founding philosophy had several key elements:
Self-knowledge through numbers: The QS ethos held that the body and behavior were knowable in new ways through systematic measurement. Where traditional introspection was unreliable and biased, data could provide objective self-knowledge. "What can I know about myself that I couldn't know before?" was the movement's central question.
Individual agency: QS was presented as a tool for individual empowerment — by understanding your own patterns, you could make more effective choices. The individual, not their doctor or employer, was the analyst and the beneficiary of the self-data.
Open sharing: The early QS community had a strong norm of sharing self-quantification results — at meetups, conferences, and online. The "show and tell" format of QS talks (here's what I measured, here's what I found, here's what I changed) created a culture of personal data transparency within the movement.
The Maker ethos: Early Quantified Self practitioners often built their own measurement devices and analysis tools, reflecting a broader maker/hacker culture of DIY innovation.
The community that formed around these ideas was, in demographic terms, homogeneous: predominantly white, male, technically educated, and economically secure. This demographic profile is important context for understanding the assumptions built into the movement's philosophy — particularly the assumption that self-quantification is a tool of individual empowerment rather than institutional control.
💡 Intuition Check: Gary Wolf's original formulation of the Quantified Self asked "what can I know about myself?" The corporate smart wearable market asks a different question: "what can we know about you?" Consider how shifting the pronoun changes the surveillance relationship. When you track your own steps, who owns the data? When your employer's wellness program tracks your steps, who owns the data, and what are they allowed to do with it?
What People Track
The domains of self-tracking have expanded dramatically from the movement's early years. Today, people track:
Physical health: Steps (pedometers, accelerometers), heart rate (continuous and during exercise), sleep (duration, stages, quality), blood oxygen levels, skin temperature, menstrual cycles, blood glucose (continuous glucose monitors), weight, body composition.
Mental and emotional states: Mood (daily or multiple-times-daily logging), anxiety levels, energy, focus, stress (often inferred from heart rate variability).
Productivity and time: Hours worked, time on specific tasks (Toggl, Harvest), computer screen time by application, reading speed and volume, goals tracked against completion.
Financial: Daily spending by category, net worth changes, investment returns.
Social: Number of social interactions, quality of conversations (self-assessed), time with specific people.
Consumption: Food (calories, macros, nutrients), alcohol, caffeine, medications, supplements.
Environmental: Sleep environment temperature, light exposure, air quality, noise levels.
The breadth of this list reflects the movement's ambition: the entire lived experience, reduced to data, analyzed for patterns, optimized. This ambition connects to a cultural moment in which quantification has become the dominant mode of legitimation — if it can be measured, it is real; if it can be improved, it is your responsibility to improve it.
20.2 The Therapeutic Appeal
Why People Track
Self-tracking is not primarily driven by corporate mandate or external pressure. Many people find genuine value in it. Understanding why is essential for analyzing what the Quantified Self movement reveals about contemporary culture.
Self-knowledge: Many trackers describe the experience of seeing patterns in their data as revelatory. "I didn't know I was getting six hours of sleep on average until the app showed me" is a common formulation. The data externalizes knowledge that subjective experience obscures.
Agency and motivation: Tracking creates a feedback loop that many people find motivating. The running graph that shows improvement, the step counter that celebrates hitting a goal, the mood chart that makes visible the correlation between sleep and emotional state — these feedback loops can be genuinely useful for behavior change.
Accountability: For people working toward health or behavior goals, tracking creates a form of accountability to oneself. Logging food before eating it can reduce impulsive consumption. Tracking exercise frequency makes it harder to tell yourself "I exercised a lot this week" when you didn't.
Illness and chronic condition management: For people with chronic conditions — diabetes, heart disease, autoimmune disorders — continuous tracking can provide clinically valuable data and enable more effective management. Continuous glucose monitors (CGMs) have transformed diabetes management by providing real-time blood sugar information that allows immediate behavioral adjustments.
Curiosity: Many people track simply because they are curious about themselves. The QS movement's original "show and tell" ethos reflected genuine curiosity about what self-data would reveal.
These are real benefits. Acknowledging them is important for a structural analysis that does not lapse into reflexive skepticism of any form of self-measurement. The question is not whether self-tracking has value — it often does — but what structural conditions surround that value and what happens when voluntary self-tracking meets institutional demands.
📊 Real-World Application: A 2019 Pew Research Center survey found that 53% of American adults reported tracking some aspect of their health data regularly — exercise, diet, weight, or other metrics. Among adults with chronic conditions, the proportion was significantly higher. The survey found that tracking was most common among college-educated, higher-income Americans — the same demographic profile as the early QS movement. This demographic pattern matters for thinking about who self-tracking serves and who it disadvantages.
20.3 Foucault's Technologies of the Self
Self-Surveillance as Normalization
In Chapter 2, we examined Michel Foucault's analysis of the panopticon as a mechanism of disciplinary power — an architecture that produces self-regulating subjects through the possibility of observation. Foucault's later work, particularly his lectures at the Collège de France in the early 1980s, examined what he called "technologies of the self" — practices through which individuals act on themselves in order to conform to norms, improve themselves, or achieve desired states.
Technologies of the self include practices like confession (examining one's behavior and thoughts, disclosing them to an authority, accepting judgment), diary writing (recording and reviewing one's inner life), meditation (training attention), and — as Foucault's framework is extended by later scholars — self-tracking.
The Foucauldian analysis of the Quantified Self argues that self-tracking is not primarily a tool of individual liberation but a technology of normalization: a practice through which individuals internalize social norms about what a "good" or "healthy" or "productive" body and life should look like, and work on themselves to conform to those norms.
When you track your steps, you are not simply measuring neutral data. You are participating in a social discourse about what an "adequate" number of steps is (10,000, established by a Japanese pedometer marketing campaign in the 1960s, not clinical research), what "fitness" means, and who is and is not responsible for their own health. The step counter is not objective; it encodes specific norms about physical activity, health, and individual responsibility.
Similarly, productivity tracking encodes norms about what constitutes sufficient work, proper time allocation, and the relationship between activity and value. The mood journal encodes norms about emotional regulation — what emotional states are appropriate, what fluctuations are problematic. Every self-tracking metric encodes assumptions about what "normal" and "good" look like.
🎓 Advanced Concept: The Normalization of Productivity
The productivity tracking market — apps like Toggl, RescueTime, and Harvest — reflects a broader cultural normalization of the idea that human time and attention can and should be optimized as productive resources. This framing, which sociologist Judy Wajcman has analyzed in her work on time and technology, positions every moment as potentially productive or wasteful, and positions the individual as the manager of their own human capital.
Foucault's analysis of technologies of the self would characterize productivity tracking as a practice through which individuals internalize and apply to themselves the managerial logic that Taylor's scientific management (Chapter 4) applied externally. The difference is not one of surveillance versus freedom; it is one of who administers the disciplinary gaze. The managed worker is observed by the manager; the productivity tracker is observed by themselves — but in service of the same productive optimization that capitalist management has always sought.
20.4 Wearable Technology and the Data It Generates
Fitbit, Apple Watch, Oura Ring
The consumer wearable market transformed self-tracking from a niche practice of technically sophisticated individuals to a mainstream consumer category. Key devices:
Fitbit (acquired by Google in 2021): The pioneering consumer fitness tracker. Basic models track steps, distance, and sleep. Advanced models add heart rate, blood oxygen, and skin temperature monitoring. The Fitbit ecosystem includes a social platform where users can share stats and compete.
Apple Watch: The most commercially successful smartwatch. Beyond fitness tracking, it monitors heart rate continuously, can perform electrocardiogram (ECG) readings, detects falls, monitors blood oxygen, tracks sleep, and — through third-party apps — supports extensive self-quantification.
Oura Ring: A discreet form factor (a ring) providing continuous heart rate, temperature, blood oxygen, and sleep monitoring, with a focus on recovery and readiness scores that aggregate multiple metrics into a single daily indicator.
Whoop: Subscription-based wearable focused on athlete recovery, tracking heart rate variability, sleep, and activity strain. Adopted widely by professional athletes and, through employer wellness programs, by office workers.
Samsung Galaxy Watch, Garmin devices: Additional major players in the consumer health wearable market with overlapping feature sets.
These devices collect data that is substantially more intimate than a step counter: continuous heart rate (which varies with stress, exercise, and health), skin temperature (which can indicate illness, menstrual cycle stage, or metabolic state), sleep patterns (including movements and, in newer devices, breathing patterns and potential sleep apnea detection), and ECG readings.
Who Owns Your Health Data
The data generated by consumer wearables is held by device manufacturers, and the terms governing its use are — as with all the data relationships examined in this part — more complex than the consumer experience suggests.
Fitbit/Google: Following Google's acquisition of Fitbit in 2021, Fitbit's data practices came under sustained scrutiny. Google committed to the European Commission, as a condition of the Commission's approval of the acquisition, not to use Fitbit health data to inform Google advertising. Whether Google can use the data for other purposes — including training health AI models, improving other Google health products, or sharing with health insurance affiliates — is governed by the commitment's specific terms, which critics have argued contain significant ambiguities.
Apple: Apple has positioned itself as a privacy-oriented health data custodian, emphasizing on-device processing (by default, health data is processed on the iPhone rather than uploaded to Apple's servers). Apple has faced scrutiny, however, over what happens when users share Apple Health data with third-party apps: that data is governed by the apps' own privacy policies, not Apple's.
Oura: Oura's privacy policy at the time of this writing disclosed sharing of data with "partners" for research purposes, with opt-out options. The scope of these partnerships is not always clearly disclosed.
The fundamental issue is that health data — particularly continuous, longitudinal health monitoring — is among the most sensitive data that can be collected about a person. It can reveal chronic conditions, mental health status, reproductive health, substance use, and patterns associated with specific diseases. Its commercial value is significant: to insurers, to pharmaceutical companies, to employers, to data brokers. The consumer wearable ecosystem has been built on the premise that users want to share this data (with the device manufacturer) in order to receive tracking services. What happens to the data after that is less clearly disclosed and understood.
⚠️ Common Pitfall: Students sometimes assume that health data generated by consumer wearables is protected by HIPAA (the Health Insurance Portability and Accountability Act). It is not. HIPAA protects health data held by covered entities — healthcare providers, health insurers, and their business associates. Fitbit, Apple, and Oura are consumer technology companies, not healthcare providers. Their data practices are governed by their privacy policies and (in California and a few other states) by consumer privacy law, but not by HIPAA's more stringent protections.
20.5 Continuous Glucose Monitors and Insurance Implications
The CGM as Data Producer
Continuous glucose monitors (CGMs) represent one of the most consequential frontiers in health self-tracking. CGMs measure blood glucose levels continuously — every one to five minutes — and transmit the data to a smartphone app. For people with diabetes or prediabetes, CGMs provide clinically valuable real-time information. For the growing population of people without diabetes who use CGMs for metabolic monitoring and optimization, they represent an extreme form of physiological self-surveillance.
The data a CGM produces over weeks and months is a comprehensive metabolic profile: the blood glucose response to every meal, every exercise session, every stressful event, and every night's sleep. This profile reveals information about diet, activity, metabolic health, and — through patterns over time — potential chronic disease risk factors.
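The sheer volume of this sampling is easy to underestimate. A minimal sketch of the reading counts at the interval range mentioned above, assuming an idealized sensor with no gaps or warm-up periods:

```python
def cgm_readings(interval_min, days):
    """Glucose readings an idealized CGM produces: one reading every
    `interval_min` minutes, with no sensor gaps or warm-up periods."""
    per_day = (24 * 60) // interval_min
    return per_day * days

# At the 5-minute end of the range: 288 readings/day, 8,640 per 30 days.
# At the 1-minute end: 1,440 readings/day, 43,200 per 30 days.
```

Even at the slowest sampling rate, a single month of wear yields thousands of time-stamped physiological data points — the raw material of the metabolic profile described above.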
This data has obvious value to health insurers. Insurers base premiums and coverage decisions on health risk assessments; a comprehensive metabolic profile is a detailed risk assessment tool. The question of whether CGM data can be used by insurers — either through voluntary sharing incentivized by premium discounts or through less transparent mechanisms — is one of the most important emerging questions in health insurance law.
Current protections: The ADA (Americans with Disabilities Act) and the Genetic Information Nondiscrimination Act (GINA) provide some protection against insurance discrimination. GINA specifically prohibits health insurers from using genetic information in coverage and premium decisions. Health status information from CGMs is not genetic information; GINA does not directly apply. The ADA's protections for people with disabilities apply to employment decisions but have limited application to health insurance underwriting in markets where individual underwriting is permitted.
The wellness exception: As discussed in the next section, employer wellness programs create a significant gap in health data protections, enabling employers to incentivize health data sharing in ways that effectively condition employment benefits on health disclosure.
20.6 When Voluntary Becomes Obligatory: Wellness Programs
The Employer Wellness Market
Corporate wellness programs — employer-sponsored initiatives that encourage employees to engage in health-promoting behaviors — have become a significant feature of the American employment landscape. Wellness programs take many forms, from simple health risk assessments to comprehensive behavioral monitoring. The surveillance-significant tier of wellness programs involves wearable technology: employers that provide employees with fitness trackers (or subsidize their purchase) and reward employees for meeting activity targets.
The financial structure of these programs is straightforward: health insurance is a major employer cost; healthier employees cost less; therefore, incentivizing health behaviors that reduce insurance costs produces economic benefit for the employer. Wellness programs are sold to employers as cost reduction tools.
From the employee's perspective, the program presents as: an employer offering a benefit (a free or subsidized Fitbit, a premium discount on health insurance) in exchange for sharing health data and meeting activity targets. The program is described as "voluntary."
📊 Real-World Application: Under rules finalized by the Equal Employment Opportunity Commission (EEOC) in 2016 (later revised), "voluntary" wellness programs may offer financial incentives of up to 30% of the cost of employee-only health insurance coverage. A premium discount of 30% for wearing a fitness tracker and meeting step goals represents, for many employees, a significant financial pressure to participate. Low-income workers, for whom a 30% premium discount represents a substantial proportion of take-home pay, face more financial pressure to participate than higher-income workers — creating a system in which the employees least able to afford the health insurance penalty for non-participation are most likely to be compelled, in practice, to share health data.
This is not "voluntary" in the sense of genuinely free choice. It is voluntary in the legal sense that no one is physically compelled to share data. But the financial structure — offer health data or pay more for health insurance — presents a coercive choice to lower-wage workers in ways that the "voluntary" label obscures.
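The arithmetic behind this coercive structure can be made concrete. The sketch below uses hypothetical premium and income figures (the dollar amounts are illustrations, not figures from the EEOC rule):

```python
def incentive_pressure(monthly_premium, discount_rate, monthly_take_home):
    """Annual value of a wellness-program premium discount, and that value
    as a share of annual take-home pay. All inputs are hypothetical."""
    annual_saving = monthly_premium * discount_rate * 12
    share_of_pay = annual_saving / (monthly_take_home * 12)
    return annual_saving, share_of_pay

# Hypothetical $500/month employee-only premium at the 30% incentive cap:
annual, low_wage_share = incentive_pressure(500, 0.30, 2400)
_, high_wage_share = incentive_pressure(500, 0.30, 8000)
# The same $1,800/year is 6.25% of the lower-paid worker's take-home pay
# but under 2% of the higher-paid worker's.
```

The point of the sketch is the asymmetry: a flat-dollar incentive exerts more pressure the less a worker earns, which is precisely the structural claim made above.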
The Wellness-Surveillance Pipeline
Employer wellness programs that use wearable technology create a data pipeline from the employee's body to the employer's analytics infrastructure:
- Employee wears fitness tracker (Fitbit, Apple Watch, Whoop)
- Device collects heart rate, steps, sleep, and other physiological data
- Data transmits to device manufacturer's cloud
- Employer wellness program vendor (Vitality, Virgin Pulse, Wellable) aggregates data from multiple employees
- Employer receives aggregate analytics (and in some cases individual data) about workforce health
- Employer uses data for insurance pricing, wellness program design, and — in some cases — employment decisions
The scope of what employers can see varies by program design. Some programs show employers only whether an employee met activity targets; others provide more granular data. Legal protections for employees in this context are limited and inconsistently enforced.
The employee who wears a Fitbit provided by their employer's wellness program may genuinely believe they are participating in a personal health initiative. They are also generating data about their body that flows into a corporate analytics system. The distinction between self-surveillance and employer surveillance, in this context, collapses.
🔗 Connection to Chapter 29: The employer wellness surveillance discussed here is a preview of the workforce monitoring examined in depth in Chapter 29, which examines HR analytics, productivity monitoring, and the surveillance of remote workers. The wellness program creates a data pipeline from the employee's body; Chapter 29's topics extend that pipeline to include behavioral and productivity monitoring that overlaps with, and is connected to, the health surveillance infrastructure.
20.7 The Quantified Self and Capitalism
You Are Your Own Data Laborer
The Quantified Self movement's founding philosophy positioned self-tracking as individual empowerment — a tool that gave the individual new insight and agency. The critical sociology of the Quantified Self, developed by scholars including Deborah Lupton and Dawn Nafus, offers a different analysis: self-tracking makes individuals into unpaid data laborers whose behavioral data generates commercial value for corporations.
When Jordan uses a running app, they generate data about running routes, pacing, improvement trajectories, and behavioral patterns associated with running. This data has commercial value: it can be sold to data brokers, used to train AI models, deployed in advertising, licensed to health insurers, or used to develop new features. Jordan receives the app's navigation and tracking services in exchange. The exchange is not equitable: Jordan generates continuous, longitudinal, intimate physiological and behavioral data; Jordan receives a running log.
This data labor analysis is significant for several reasons:
It reveals the real economic structure of free health apps. The services are not free; the data is the payment. The "free" framing obscures what is being exchanged.
It challenges the empowerment narrative. Individual empowerment through self-knowledge is real for the individual. But the aggregate of millions of individuals' self-tracking data produces commercial value that accrues entirely to corporations, not to the individuals who produced it.
It connects self-surveillance to exploitation. Labor theorists use the term "data labor" to describe the unpaid production of commercial value through digital activity. Self-tracking is a particularly intense form of data labor: the individual actively and continuously produces detailed physiological and behavioral data, which is then extracted by the platform.
It identifies the alignment of interests between self-surveillance and surveillance capitalism. The surveillance capitalism logic examined in Chapter 11 depends on extracting behavioral data from individual subjects. The Quantified Self movement — by normalizing continuous self-monitoring and data sharing — produces subjects who are maximally legible to surveillance capitalism's data extraction mechanisms.
🌍 Global Perspective: The relationship between self-tracking and insurance is not uniform globally. In the United States, where health insurance is primarily employer- or individually-purchased and where premiums can reflect health status in some markets, wellness program data creates financial pressure and risk. In countries with national health systems (United Kingdom, Canada, most of Europe), the link between individual health data and insurance pricing is weaker or absent, reducing the financial coercion dimension of wellness tracking. However, the commercial data broker dimensions — data about wearable users flowing to commercial buyers — are global, following data wherever the devices are sold.
20.8 Social Self-Surveillance: Posting Your Life
The Performance of the Quantified Life
Self-tracking does not remain private. A significant feature of the Quantified Self ecosystem is social self-surveillance — the sharing of self-tracking data on social media platforms, within apps' social features, and in dedicated communities.
Fitbit's social features allow users to compare step counts, engage in step challenges, and celebrate each other's achievements. Apple's Activity app enables sharing of workout data with friends. Strava — a GPS activity tracking app with social features — allows users to follow each other's runs, rides, and workouts. Nike Run Club, Peloton, and most major fitness apps have social layers that transform personal tracking into social performance.
This social dimension of self-tracking adds an important layer to the surveillance analysis: the individual not only monitors themselves but performs self-monitoring for an audience. The tracked workout is simultaneously a measurement (personal) and a presentation (social). The social feedback — likes, congratulations, competition rankings — transforms self-surveillance into a form of social validation seeking.
The sociology of social self-surveillance draws on Erving Goffman's dramaturgical theory of social interaction (developed in The Presentation of Self in Everyday Life, 1959), which analyzes social life as performance. On social fitness platforms, individuals curate their self-tracking data for presentation: sharing their best workouts, not their skipped ones; sharing weight loss milestones, not plateaus; sharing the meditation streak, not the anxiety that required it.
Social self-surveillance is simultaneously:
- Voluntary (no one requires you to post workouts)
- Coerced (social norms in certain communities make not posting feel abnormal)
- Generating data for platforms (your shared workout data is commercial data for Strava, Apple, Nike)
- Normalizing surveillance (making continuous self-monitoring seem like a positive social activity)
📝 Note: The normalization function of social self-surveillance connects to a theme running through this part: surveillance becomes harder to resist when it is associated with positive social experiences (community, validation, achievement). Fitness apps with social features have taken the disciplinary self-monitoring examined in the previous section and added a reward structure that makes it feel like something other than surveillance.
20.9 Jordan's Employer, the Fitbit, and the Choice That Isn't
Jordan's employer at the warehouse has announced a new benefit: employees who wear a fitness tracker and meet weekly activity targets will receive a 15% discount on their monthly health insurance premium.
The math is not abstract for Jordan. Fifteen percent of a monthly premium at Jordan's age and income is about $28 per month — $336 per year. That is not a trivial amount. Jordan has already looked at whether their running app's step data could substitute for the tracker; it cannot — the employer's wellness vendor requires a specific certified device from an approved list.
Jordan stands in the employee break room, looking at the signup form. Marcus would probably say: it's a good deal, you're already tracking yourself, just sign up and save the money. Yara would probably say: your employer is paying you to give them your body data, and you should know that before you sign up.
Jordan thinks about what they have learned in the past few months. They think about the data brokers who aggregate location data. They think about Google's comprehensive knowledge of their movements. They think about the Fitbit that will go to a wellness vendor that will share data with the employer that will have, as a result, a detailed record of Jordan's physiological patterns.
Jordan also thinks about $336 per year and their current bank balance.
"Voluntary," Dr. Osei had said in seminar, "is a word that does a lot of work in surveillance analysis. It often means 'we didn't use a gun.'"
Jordan signs up. Then they go home and read the wellness vendor's privacy policy, which they have never done before with anything.
The policy is 4,200 words long. Jordan reads all of it.
The data is shared with the employer in aggregate. Individual data is retained by the vendor for three years. The vendor may use "de-identified" data for research and product development. The vendor may share individual data with the employer if the employer has "reasonable business need." "Reasonable business need" is not defined.
Jordan sends Yara a text: "I just read a 4,200-word privacy policy and I feel like I understand less than I did before I started."
Yara replies: "That's how they want you to feel."
📝 Note for Students: Jordan's situation — accepting wellness surveillance because the financial cost of refusal is real — is increasingly common. Research by the Kaiser Family Foundation has found that employer wellness programs increasingly use financial incentives large enough to constitute meaningful economic pressure on lower-wage workers. The "voluntary" character of these programs does not change the power dynamics that produce the decision Jordan faces.
20.10 Structural Analysis of the Quantified Self
What the Movement Reveals
The Quantified Self movement, examined through the structural lens of this textbook, reveals several things about surveillance in contemporary capitalism:
The normalization function: The QS movement normalized continuous self-monitoring — the idea that one's body and behavior should be subject to ongoing measurement, analysis, and optimization. This normalization prepared the cultural ground for the employer wellness surveillance, the insurance incentive programs, and the commercial health data markets that followed. What the early QS community did as a voluntary, enthusiast practice has been generalized into a corporate expectation and a market infrastructure.
The individual responsibility frame: Self-tracking is premised on the idea that health, productivity, and well-being are primarily individual responsibilities — measurable, improvable, and within individual control. This frame erases structural determinants of health (access to healthy food, safe housing, medical care, economic security, environmental factors) and attributes outcomes to individual measurement and behavior. The person who cannot meet their step goals because they work two jobs and have no time for exercise is not failing at self-optimization; they are experiencing a structural problem that no amount of self-tracking can address.
The data labor relationship: The commercial ecology of self-tracking — free apps, manufacturer-owned data, commercial data sales, insurance and employer integration — is a data labor relationship in which individuals produce value that accrues to corporations. The therapeutic benefit is real but does not change the economic structure.
The surveillance capitalism integration: Consumer wearables are not merely surveillance devices in isolation; they are nodes in the surveillance capitalism infrastructure examined in Chapter 11. The data they generate flows into commercial data pipelines that connect health behavior to insurance pricing, employer decision-making, and advertising targeting.
When Voluntary Monitoring Loses Its Voluntariness
A central thesis of this chapter is that the distinction between voluntary self-tracking and imposed surveillance is not stable. The trajectory of self-monitoring technology follows a pattern:
1. Enthusiast adoption: Technically sophisticated, economically secure early adopters develop self-tracking practices for personal benefit.
2. Mass consumer adoption: Consumer wearables bring the practice to a broader audience; commercial data infrastructure develops around the data.
3. Institutional integration: Employers and insurers develop programs that incentivize or require the sharing of self-tracking data.
4. Structural coercion: The financial stakes of non-participation become high enough that "voluntary" participation becomes practically mandatory for lower-wage workers.
5. Full normalization: Self-tracking becomes an expected feature of responsible adult behavior; not tracking becomes the deviation that requires explanation.
The United States is currently somewhere between stages 3 and 4 for health and fitness tracking, with considerable variation by industry and employment context. The trajectory is not inevitable — regulatory intervention can alter it — but the direction of the current path is clear.
Chapter Summary
The Quantified Self movement began as an enthusiast community exploring self-knowledge through data. It has expanded into a commercial ecosystem in which consumer wearables collect intimate physiological data and feed it into commercial pipelines, while institutional integration through employer wellness programs and insurance incentives creates growing financial pressure to participate.
Foucault's technologies of the self provide the conceptual framework: self-tracking is not simply a neutral measurement practice but a technology through which individuals internalize and apply to themselves the norms of a surveillance-capitalist society — norms about health, productivity, and what constitutes a "good" body and life.
The critical sociology of the Quantified Self challenges the movement's empowerment narrative: individuals who track themselves are generating commercial value for corporations without adequate compensation or transparency. The "voluntary" character of self-tracking is real but increasingly constrained by financial incentives that make non-participation costly, particularly for lower-wage workers.
For Jordan, the decision to sign up for the wellness program — made with full awareness of what it means, after reading the privacy policy — is a microcosm of the textbook's central argument. Surveillance is not merely imposed; it is accepted, accommodated, even embraced, because the alternatives are costly. Understanding the architecture of that acceptance — and the structures that produce it — is what this book has been building toward.
Part 5 will take the analysis outside the home and into the workplace, the city, and the systems of governance that structure public life. What Jordan has learned in Part 4 — about the surveillance of the intimate, the personal, the bodily — will frame everything that follows.
Key Terms
Quantified Self: The cultural movement and personal practice of systematically measuring and analyzing one's own physiological and behavioral data. The term was coined by Gary Wolf and Kevin Kelly in 2007.
Technologies of the self: Foucault's term for practices through which individuals act on themselves to conform to social norms, improve themselves, or achieve desired states. Self-tracking is analyzed as a contemporary technology of the self.
Wellness surveillance: The use of wearable devices and health data by employers and insurers to monitor employee health behaviors, typically through financial incentive structures attached to wellness programs.
Data labor: The production of commercial value by individuals through their digital activity, without compensation proportionate to the value generated. Self-tracking is analyzed as a particularly intensive form of data labor.
Social self-surveillance: The practice of sharing self-tracking data on social media platforms, within app social features, and in dedicated communities, transforming personal monitoring into social performance.
Voluntary monitoring: Self-tracking undertaken at the individual's own initiative, without explicit external mandate. The chapter analyzes the conditions under which "voluntary" monitoring loses its genuinely voluntary character through financial incentivization and social normalization.
Discussion Questions
- The chapter argues that self-tracking "normalizes" continuous self-monitoring in ways that prepare the cultural ground for institutional surveillance. Do you find this normalization argument persuasive? What evidence supports or challenges it?
- Jordan decides to sign up for the employer wellness program after calculating the financial impact of not doing so. Using the "consent as fiction" framework, evaluate whether Jordan's decision constitutes meaningful consent to the wellness surveillance the program involves.
- Foucault's analysis characterizes self-tracking as a technology through which individuals apply social norms to themselves. Identify a specific self-tracking practice and analyze what norms it encodes. Who benefits from those norms? Who is disadvantaged by them?
- The chapter argues that the commercial ecology of self-tracking makes individuals into "data laborers." What would fair compensation for this data labor look like? Is "data ownership" — the idea that individuals should own and control data about themselves — a viable alternative to the current commercial model?
- Part 4 has examined surveillance in residential spaces, in relationships, in pockets, and in bodies. What is the relationship between all these forms of surveillance? Is there a single underlying logic, or are these separate phenomena that happen to share some features?