In This Chapter
- Opening: The Conversation Jordan and Yara Had
- 32.1 What Counter-Surveillance Is
- 32.2 Encryption Fundamentals
- 32.3 Why Encryption Matters: Beyond "Nothing to Hide"
- 32.4 The "Going Dark" Debate
- 32.5 VPNs: What They Do and Don't Do
- 32.6 Tor and the Onion Routing Network
- 32.7 Browser Privacy: Extensions and Configuration
- 32.8 Mobile Privacy: Hardening Your Smartphone
- 32.9 Anonymization and Re-Identification: The Limits of "Anonymized" Data
- 32.10 Obfuscation as Counter-Surveillance Strategy
- 32.11 Metadata Hygiene: Signal, ProtonMail, and Tails OS
- 32.12 The Limits of Individual Counter-Surveillance
- 32.13 The 10-Step Practical Action Checklist
- 32.14 Jordan and Yara: Setting Up Signal
- 32.15 Chapter Summary
Chapter 32: Counter-Surveillance: Encryption, Anonymization, and Obfuscation
"The internet interprets censorship as damage and routes around it."
— John Gilmore (1993)
"Arguing that you don't care about the right to privacy because you have nothing to hide is no different from saying you don't care about free speech because you have nothing to say."
— Edward Snowden
Opening: The Conversation Jordan and Yara Had
It started with a text message.
Jordan had been telling Yara — over standard SMS — about the digital rights meeting they'd attended, about the data broker forms they'd filled out, about the growing discomfort they felt knowing that their employer could probably access this conversation through the carrier's servers, that surveillance advertising companies could correlate their movements through app location tracking, that their advocacy work was being conducted over infrastructure designed to maximize disclosure, not protect it.
Yara's reply was brief: Let's move this to Signal.
Jordan had heard of Signal — Marcus had mentioned it several times in the context of privacy nerd enthusiasm — but had never actually installed it. They downloaded it in the break room at the warehouse, created an account, and texted Yara the link to connect.
When they resumed the conversation on Signal, something felt different — not in the content, but in the quality of the space. Jordan knew, because they had spent weeks learning it, that the conversation was now end-to-end encrypted: the Signal app on their phone, and Yara's phone, held the cryptographic keys. No one in the middle — not Signal's servers, not their carrier, not any government agency with a subpoena to Signal — could read the messages. Signal itself couldn't read them. The conversation was private in a way that their previous messages had not been.
"Is this enough?" Jordan typed.
"It's a start," Yara wrote back. "Not a solution."
That distinction — between a start and a solution — is the organizing tension of this chapter. Counter-surveillance tools are real. They provide genuine protection. They are the product of remarkable technical achievement and sustained civil liberties advocacy. And they are not sufficient by themselves. Understanding both what they do and what they don't do is essential for anyone who takes surveillance seriously.
32.1 What Counter-Surveillance Is
Counter-surveillance is not one thing. It is a cluster of practices — technical, behavioral, legal, and artistic — that share the goal of reducing or undermining the asymmetry between watcher and watched.
The field has no single name. Privacy technologists talk about "privacy-enhancing technologies" (PETs). Security researchers talk about "operational security" (opsec). Activists talk about "security culture." Artists talk about "camouflage" and "obfuscation." Lawyers talk about "rights assertion." Each frame captures something real; none captures everything.
For this chapter, we will organize counter-surveillance into four categories:
- Technical: Encryption, anonymization networks, VPNs, secure operating systems, browser extensions, device hardening
- Behavioral: Metadata hygiene, compartmentalization, operational security practices
- Legal: Rights assertion (Chapter 31), freedom of information requests, legal challenges to surveillance programs
- Artistic/social: Obfuscation, CV Dazzle, AdNauseam, culture jamming (Chapter 33)
This chapter focuses primarily on the technical and behavioral. Chapter 33 addresses artistic and activist approaches.
💡 Intuition Checkpoint: Before reading further, think about your current communication practices. If someone gained access to your phone and email accounts, what would they learn about you? Now think: who has access to your communications infrastructure that you haven't thought about? Your carrier? Your email provider? App makers? The answer to the second question determines the scope of the problem that technical counter-surveillance addresses.
32.2 Encryption Fundamentals
Encryption is the mathematical transformation of readable information (plaintext) into unreadable scrambled data (ciphertext) that can only be decoded by someone with the right key. It is the foundational technology of digital privacy.
Symmetric and Asymmetric Encryption
Symmetric encryption uses the same key to encrypt and decrypt. It's fast and efficient — ideal for encrypting stored files or communications between systems that already share a key. The problem: how do you securely share the key in the first place? If you're communicating with someone you've never met, you can't call them up and read the key to them without someone intercepting it.
Asymmetric encryption (also called public-key encryption) solves the key distribution problem elegantly. Each party has two mathematically related keys:
- A public key that anyone can have — used to encrypt messages to you
- A private key that only you hold — used to decrypt messages encrypted with your public key
If Jordan wants to send Yara a message that only Yara can read, Jordan encrypts it with Yara's public key. Only Yara's private key can decrypt it. Jordan can give that message to anyone — even hostile eavesdroppers — without compromising its contents. The private key never leaves Yara's device.
This sounds like magic, and in some sense it is — it depends on mathematical problems (like factoring very large numbers) that are computationally infeasible to solve without the private key, even with enormous computing resources.
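The public/private key mechanics can be sketched with textbook RSA in a few lines of Python. This is a toy with tiny primes, purely to show the shape of the math; real deployments use keys thousands of bits long plus careful padding schemes:

```python
# Toy RSA with tiny textbook primes -- illustrative only, never secure in practice.
p, q = 61, 53
n = p * q                  # public modulus (part of both keys)
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

def encrypt(m: int, public_key=(e, n)) -> int:
    """Anyone holding the public key can encrypt."""
    exp, mod = public_key
    return pow(m, exp, mod)

def decrypt(c: int, private_key=(d, n)) -> int:
    """Only the holder of the private key can decrypt."""
    exp, mod = private_key
    return pow(c, exp, mod)

message = 42                    # messages here must be integers smaller than n
ciphertext = encrypt(message)
print(ciphertext)               # 2557 with these toy parameters: nothing like 42
print(decrypt(ciphertext))      # 42 again
```

Note that `encrypt` uses only public values (`e`, `n`); the eavesdropper's problem is recovering `d`, which requires factoring `n`. With 61 and 53 that is trivial; with primes hundreds of digits long, it is computationally infeasible.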
🎓 Advanced Note: The security of most asymmetric encryption systems (RSA, Diffie-Hellman) depends on the computational difficulty of certain mathematical problems. Quantum computers, if they achieve sufficient scale, could potentially solve these problems efficiently — "breaking" current encryption. Post-quantum cryptography is an active research area; NIST standardized several post-quantum algorithms in 2024. For communications requiring long-term security (where adversaries might record encrypted messages now to decrypt later when quantum computers are available), this is a serious concern.
End-to-End Encryption: The Signal Protocol
End-to-end encryption (E2EE) means that messages are encrypted on the sender's device and decrypted only on the recipient's device. No one in the middle — not the service provider, not the network, not anyone with a subpoena to the service provider — can read the content.
The Signal Protocol, developed by Moxie Marlinspike and Trevor Perrin at Open Whisper Systems (whose work is now stewarded by the Signal Foundation), is widely regarded as the gold standard for E2EE messaging. It combines:
- Asymmetric key exchange (the X3DH protocol) for establishing secure sessions
- Symmetric encryption (AES-256) for message content
- The Double Ratchet algorithm for forward secrecy — each message uses a different key, derived from the previous one, so compromising one key doesn't compromise past or future messages
- Sealed sender for hiding metadata about who is communicating with whom
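The forward-secrecy idea behind the Double Ratchet can be sketched as a hash-based key chain. This is a simplification of the symmetric half only (the real protocol also mixes fresh Diffie-Hellman outputs into the chain), and the seed string is a hypothetical stand-in for an X3DH session secret:

```python
import hashlib
import hmac

def ratchet(chain_key: bytes):
    """Derive the next chain key plus a one-time message key from the current one."""
    next_ck = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    msg_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    return next_ck, msg_key

# Stand-in for a session secret agreed via X3DH key exchange.
ck = hashlib.sha256(b"shared-session-secret").digest()

message_keys = []
for _ in range(3):
    ck, mk = ratchet(ck)       # the old chain key is overwritten (deleted)
    message_keys.append(mk)

# Every message gets its own key. Because the hash is one-way, stealing today's
# chain key reveals nothing about the keys that protected earlier messages.
assert len(set(message_keys)) == 3
```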
The Signal Protocol is also used by WhatsApp (for messages), iMessage (partially), and many other applications. But there's an important distinction: using the Signal Protocol and using the Signal app are different things. WhatsApp uses the Signal Protocol for message encryption but WhatsApp is owned by Meta and collects extensive metadata — who you talk to, when, how often. Signal, the app, collects minimal metadata by design.
📊 Real-World Application: When a grand jury subpoenaed Signal in 2016 for information about a user, Signal could provide only: the date the account was created, and the date it last connected to Signal's servers. That was the totality of information Signal held. This is what "minimal data collection" means in practice — the company cannot hand over what it doesn't have.
32.3 Why Encryption Matters: Beyond "Nothing to Hide"
The most common dismissal of encryption is the "nothing to hide" argument: if you're not doing anything wrong, why do you need to hide your communications?
This argument fails for several reasons.
First, it presupposes a permanent and correct definition of "wrong." What is legal today may be illegal tomorrow; what is illegal in one jurisdiction may be legal in another. People living under authoritarian governments — or in democracies with expanding surveillance powers — face real risks from communications that are perfectly lawful. Before Roe v. Wade's reversal, discussing abortion access was legal; in some states afterward, it became legally complex. Discussing union organizing is legal; employers surveilling those discussions have used them to retaliate.
Second, it confuses privacy with secrecy. Privacy is not about hiding wrongdoing. It is about controlling who has access to your thoughts, relationships, and activities. You have a private conversation with your doctor not because you're doing something wrong, but because that information belongs to you and your doctor. You close the bathroom door not because you're committing crimes, but because some things are not for general observation.
Third, it ignores the chilling effect. Research consistently shows that people behave differently when they know they're being observed (recall the panopticon from Chapter 2). If your communications are monitored, you may avoid topics not because they're wrong but because they're sensitive, uncertain, or merely personal. The result is a narrowing of thought and expression — exactly what surveillance is designed to produce in populations under authoritarian control, and a genuine cost even in democracies.
Fourth, it ignores who actually needs encryption. Domestic abuse survivors hiding from abusive partners. Journalists protecting sources. Activists coordinating in countries where their causes are illegal or dangerous. Whistleblowers documenting institutional wrongdoing. LGBTQ+ people in contexts where their identity is criminalized or endangers them. HIV-positive people concerned about medical privacy. Encryption for all of these people is not about hiding wrongdoing — it is about basic safety.
🔗 Connection: The "nothing to hide" argument is a variant of the consent-as-fiction problem examined in Chapter 31 and throughout this textbook. It treats the absence of obvious secrets as consent to surveillance. But privacy is a prior condition of autonomy, not merely a protection for secrets.
32.4 The "Going Dark" Debate
Law enforcement and intelligence agencies have argued for decades that encryption "goes dark" — makes their lawful surveillance impossible because they can't intercept communications even with a court order. The FBI, in particular, has pushed for "exceptional access" — backdoors or key escrow mechanisms that would allow government access to encrypted communications under warrant.
This debate has recurred since the 1990s "Crypto Wars" (when the government tried to mandate the Clipper Chip, a backdoored encryption standard) and has intensified with the widespread adoption of E2EE.
The law enforcement argument:
- Criminals and terrorists use encrypted communications to plan attacks and coordinate activity
- Lawful interception with court authorization has always been a cornerstone of law enforcement
- Strong encryption that even providers can't break prevents legitimate oversight
- The public has a safety interest in allowing law enforcement access
The counter-argument (made by cryptographers, civil liberties groups, and the security community):
- There is no backdoor only for the good guys. Any cryptographic weakness that allows government access can be discovered and exploited by malicious actors, foreign intelligence services, and criminals. A backdoor is a backdoor.
- We are not "going dark" — we're in a golden age of surveillance. More metadata, device data, cloud backups, and other information is available to law enforcement than ever before. The FBI's "going dark" testimony is disputed by the evidence. (See the Berkman Klein Center's "Don't Panic" report, cited in Chapter 31's further reading.)
- The math doesn't allow for selective vulnerability. Cryptographers' consensus is that there is no way to build a system that is secure against everyone except authorized governments. Systems are either secure or they're not.
- The global deployment of U.S. encryption standards matters. If the U.S. mandates backdoors in encryption, it weakens encryption globally — including for dissidents, human rights workers, and businesses operating in authoritarian countries.
⚠️ Common Pitfall: The "going dark" debate is often framed as law enforcement safety vs. privacy. The actual trade-off is more complex: law enforcement's ability to intercept one class of communications vs. everyone's security against criminal hackers, foreign intelligence services, and abusive governments. Treating this as a simple safety vs. privacy trade-off misses the structural security costs of weakening encryption.
32.5 VPNs: What They Do and Don't Do
Virtual Private Networks (VPNs) are one of the most commonly used and commonly misunderstood privacy tools.
What a VPN Does
A VPN creates an encrypted tunnel between your device and a VPN server operated by the VPN provider. Your internet traffic appears to originate from the VPN server's IP address rather than your own. Two primary effects:
- Your internet service provider (ISP) cannot see the content of your traffic — they see an encrypted connection to the VPN server
- Websites and services you access cannot see your real IP address — they see the VPN server's address
This provides genuine privacy benefits in specific circumstances:
- Prevents your ISP from monitoring and selling your browsing history (which ISPs in the US can legally do)
- Prevents networks (coffee shop WiFi, employer networks) from seeing your traffic
- Can circumvent geographic restrictions on content (streaming services, censored websites)
- Protects against some forms of tracking that rely on IP address consistency
What a VPN Does NOT Do
A VPN is not anonymity software. It does not make you untraceable. Specifically:
- It shifts trust from your ISP to your VPN provider. The VPN provider can now see your traffic. If the VPN provider keeps logs and is subpoenaed, or if the VPN provider is run by or cooperates with surveillance agencies, your traffic is exposed.
- It does not prevent browser fingerprinting or tracking by cookies and other persistent identifiers
- It does not prevent Google, Facebook, or other logged-in services from knowing who you are
- It does not protect against malware on your device
- It does not hide your identity from the websites you visit — they see the VPN server's IP, but if you're logged into accounts, they know who you are
✅ Best Practice: Evaluating VPN providers:
- Does the provider have a verified no-logs policy? (Look for independent audits, not just marketing claims)
- Where is the provider headquartered, and what data retention laws apply?
- Does the provider's business model depend on selling your data? (Free VPNs very often do)
- Have the provider's no-logs claims been tested by subpoenas or court orders?
Recommended providers with strong track records: Mullvad (accepts cash, does not require email for signup), ProtonVPN (Swiss-based, open source), IVPN. Avoid free VPNs as a category — if you're not paying, your traffic is likely the product.
32.6 Tor and the Onion Routing Network
Tor (The Onion Router) provides a level of anonymity that VPNs cannot. Originally developed by the U.S. Naval Research Laboratory for secure communications and now operated by the nonprofit Tor Project, Tor routes your traffic through a series of three relays (nodes), each of which knows only the hop immediately before and after it — so no single relay can link your identity to your final destination.
The analogy: imagine sending a letter inside three nested envelopes. The postal carrier who delivers the first envelope knows your address but not what's inside or where it's ultimately going. The carrier who handles the second envelope knows it came from the first but not your original address or destination. The carrier who delivers the final envelope knows the destination but not who originally sent it. No single carrier has the full picture.
How Tor Works in Practice
- Traffic is encrypted in three layers (hence "onion")
- Each relay decrypts one layer and passes the traffic to the next
- The exit node — the last relay — sends traffic to the final destination but doesn't know who originally sent it
- The entry node knows who you are but not where you're going
- No single relay has both pieces of information
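The nested-envelope analogy translates almost directly into code. In this sketch the XOR "cipher" and the relay keys are toys standing in for real cryptography; only the layering logic is the point:

```python
import hashlib
from itertools import cycle

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy one-layer 'encryption': XOR with a hash-derived keystream.
    Purely illustrative -- never use XOR like this for real security."""
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(keystream)))

relay_keys = [b"entry-relay", b"middle-relay", b"exit-relay"]
message = b"hello destination"

# The client wraps the message in three layers, innermost layer for the exit node.
onion = message
for key in reversed(relay_keys):
    onion = xor_layer(onion, key)

# Each relay peels off exactly one layer; only after the exit node's layer
# comes off is the plaintext visible.
hop = onion
for key in relay_keys:
    assert hop != message      # still wrapped when this relay receives it
    hop = xor_layer(hop, key)

assert hop == message
```

Because XOR with the same keystream is its own inverse, each relay "decrypts" its layer with the same operation the client used to add it, which keeps the sketch short while preserving the onion structure.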
Tor's Strengths
- Genuine anonymity against network-level surveillance, ISP monitoring, and most government adversaries
- Bypass censorship — used extensively by journalists, activists, and ordinary people in countries with internet censorship (China, Iran, Russia)
- Access to .onion services — hidden services that operate entirely within the Tor network, making both the user and the server difficult to identify
- Strong track record — has provided genuine protection for journalists, whistleblowers, and activists for over two decades
Tor's Limitations
- Slower than regular browsing — routing through three relays adds latency
- Some sites block Tor exit nodes
- Not immune to all attacks — correlation attacks (if an adversary controls both the entry and exit nodes) can deanonymize users; this is mainly a risk from nation-state adversaries
- Not protection against application-layer mistakes — logging into your Google account over Tor defeats the anonymity Tor provides; your behavior within websites can still identify you
- Not protection against malware — a compromised device reveals your activity regardless of what network you're using
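The correlation-attack risk in the list above can be illustrated with a toy timing analysis: an adversary who observes both the entry and exit of an anonymity network can match flows by their packet rhythm alone, without breaking any encryption. All names and numbers here are hypothetical:

```python
import random

random.seed(2)

def flow(start: float, n: int):
    """Packet timestamps for one flow: one packet per second, with jitter."""
    return [start + i + random.uniform(-0.05, 0.05) for i in range(n)]

# Entry side: the adversary sees who is sending, but traffic is encrypted.
entry = {"alice": flow(0.0, 20), "bob": flow(5.0, 20)}

# Exit side: same flows shifted by network latency, identities unknown.
latency = 0.3
exit_side = {"flow_x": [t + latency for t in entry["bob"]],
             "flow_y": [t + latency for t in entry["alice"]]}

def gaps(ts):
    """Inter-packet gaps: the timing 'fingerprint' of a flow."""
    return [b - a for a, b in zip(ts, ts[1:])]

def correlate(entry_flows, exit_flows):
    """Match each entry flow to the exit flow whose rhythm fits best."""
    return {who: min(exit_flows,
                     key=lambda name: sum(abs(g1 - g2) for g1, g2
                                          in zip(gaps(ts), gaps(exit_flows[name]))))
            for who, ts in entry_flows.items()}

print(correlate(entry, exit_side))   # {'alice': 'flow_y', 'bob': 'flow_x'}
```

Real correlation attacks must cope with padding, batching, and cover traffic, but the principle stands: controlling both ends of the circuit defeats the anonymity the middle provides, which is why this threat is mainly associated with nation-state adversaries.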
🎓 Advanced Note: Operation Onymous (2014), in which law enforcement agencies took down dozens of dark web sites including Silk Road 2.0, used a combination of operational mistakes by site operators (not Tor vulnerabilities) and some exploitation of software vulnerabilities in the sites themselves. The Tor protocol itself was not compromised. This distinction matters: Tor is a strong tool that users sometimes defeat through their own mistakes.
32.7 Browser Privacy: Extensions and Configuration
The web browser is one of the most surveillance-intensive applications most people use. Browser tracking includes:
- Third-party cookies — tracking across sites
- Browser fingerprinting — identifying your device through its unique configuration
- Tracking pixels — tiny images embedded in web pages and emails that report when they're loaded
- Supercookies/evercookies — persistent tracking mechanisms that survive clearing normal cookies
- Canvas fingerprinting, WebRTC leaks — more sophisticated identification techniques
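The identifying power of fingerprinting comes from combining attributes. Here is a back-of-the-envelope sketch using made-up prevalence numbers and an independence assumption (see the EFF's Cover Your Tracks project for real measurements):

```python
import math

# Hypothetical fraction of users who share your value for each attribute.
# Illustrative numbers only, not real measurements.
attribute_share = {
    "user_agent": 0.015,
    "screen_resolution": 0.05,
    "timezone": 0.12,
    "installed_fonts": 0.002,
    "canvas_hash": 0.0005,
}

# Surprisal in bits per attribute, assuming independence (a simplification;
# real attributes are correlated, so this overestimates somewhat).
bits = {name: -math.log2(p) for name, p in attribute_share.items()}
total = sum(bits.values())
print(f"combined: {total:.1f} bits, enough to single out 1 user in ~{2**total:,.0f}")
```

Roughly 33 bits is enough to distinguish one person among billions, which is why no single attribute needs to be unique for the combination to be.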
Key Privacy Extensions
uBlock Origin is the most important privacy extension for most users. It is an efficient, open-source content blocker that blocks:
- Advertising networks and their tracking scripts
- Known malware domains
- Third-party trackers
- Fingerprinting scripts
Unlike AdBlock Plus, which allows "acceptable ads" (paid by advertisers), uBlock Origin blocks by default and receives no advertising revenue. It is free, open source, and maintained by a small team without commercial conflicts.
Privacy Badger (developed by the EFF) uses a learning approach: it observes which domains track you across multiple sites and blocks them. It complements uBlock Origin by catching trackers that pattern-matching approaches might miss, though uBlock Origin alone already provides strong protection for most users.
HTTPS Everywhere (EFF) ensured that connections were made using HTTPS when available, preventing network-level interception. This extension is now largely obsolete because modern browsers default to HTTPS and warn users about HTTP connections.
Firefox Containers allow you to isolate different browsing activities — your Facebook browsing happens in a Facebook container that can't see your other tabs. The Facebook Container extension, specifically, prevents Facebook from tracking your activity on other sites.
Browser Choice
Firefox with default settings plus uBlock Origin is the recommendation for most users — it provides strong privacy features, is open source, and doesn't depend on a business model tied to advertising.
Brave is a Chromium-based browser with built-in tracking protection and ad blocking. It has a controversial integrated cryptocurrency system (BAT tokens) but its privacy defaults are strong.
Tor Browser provides the strongest privacy but at the cost of speed and some usability limitations. Use for sensitive browsing.
Chrome and Edge should be used with caution for privacy-sensitive activities, as they are produced by companies (Google and Microsoft) with significant advertising and data collection interests.
📝 Note on search engines: Google's search engine is one of the largest data collection operations on the internet. DuckDuckGo and Startpage offer privacy-respecting search alternatives. DuckDuckGo does not track searches or build profiles; Startpage shows Google results without giving Google information about who is searching.
32.8 Mobile Privacy: Hardening Your Smartphone
Chapter 18 examined how smartphones are among the most powerful surveillance devices in everyday life. The same devices that enable location tracking, contact monitoring, and behavioral profiling can be configured to reduce — though not eliminate — that exposure.
iOS vs. Android
iOS (Apple iPhone) ships with stronger privacy defaults than Android:
- App Tracking Transparency requires explicit permission for cross-app tracking
- Location access can be set to "while using app" rather than always-on background access
- iMessage provides E2EE for Apple-to-Apple messages
- iCloud backups are encrypted, but Apple holds the keys unless the optional Advanced Data Protection feature is enabled; by default, Apple can give governments access to iCloud backup content
Android varies significantly by manufacturer. Stock Android (Google Pixel) has improved privacy features; manufacturer-modified Android (Samsung, etc.) varies. Android's openness allows for more customization but also means less reliable defaults.
GrapheneOS is a hardened, privacy-focused Android distribution that runs on Pixel phones. It provides: no Google services by default, stronger sandbox isolation for apps, and can run Google Play apps in an isolated container if needed. It is the gold standard for mobile privacy but requires technical comfort.
Practical Smartphone Hardening
✅ Best Practice: 10 Steps to Harden Your Smartphone
- Use a strong, unique lock screen PIN (6+ digits; avoid patterns and fingerprint-only unlock on devices where fingerprint can be compelled)
- Audit app permissions regularly — location, microphone, camera, contacts. Revoke permissions that apps don't need
- Disable advertising ID and tracking — iOS: Settings > Privacy > Tracking. Android: Settings > Privacy > Ads > Delete Advertising ID
- Use Signal for messaging — E2EE, minimal metadata collection, disappearing messages
- Use ProtonMail or Tutanota for sensitive email — encrypted email providers
- Disable location services for apps that don't need it — many apps request location without functional need
- Keep OS and apps updated — security patches close vulnerabilities that surveillance tools exploit
- Use a VPN on untrusted networks (coffee shops, airports, hotels)
- Be cautious about apps from unknown developers — many free apps monetize through data collection
- Consider what you back up and where — iCloud and Google backup contain extensive personal data; understand the privacy implications
32.9 Anonymization and Re-Identification: The Limits of "Anonymized" Data
A common claim from companies and governments defending data collection: "We anonymize the data, so your privacy is protected." This claim requires scrutiny.
True anonymization means removing identifying information so that re-identification is not feasible. Genuine anonymization is much harder to achieve than most organizations acknowledge.
The Re-Identification Problem
Research has repeatedly demonstrated that "anonymized" datasets can often be re-identified:
- Netflix Prize dataset (2008): Researchers Arvind Narayanan and Vitaly Shmatikov showed that the "anonymized" movie rating dataset Netflix released for a machine learning competition could be de-anonymized by cross-referencing with public IMDb ratings. A person's movie ratings, even without a name attached, create a pattern distinctive enough to identify them.
- AOL search data (2006): AOL released "anonymized" search logs. A New York Times reporter identified User 4417749 as Thelma Arnold of Lilburn, Georgia, from search queries alone ("landscapers in Lilburn, Ga" + queries about 60-year-old single women's health concerns).
- Location data: Research by de Montjoye et al. (2013) found that four spatio-temporal points are enough to uniquely identify 95% of individuals in a mobility dataset.
- Credit card transactions (2015): Research showed that three or four metadata points from credit card transactions (merchant and time) are sufficient to identify individuals in a dataset of 1.1 million people.
The mathematical intuition: modern datasets are high-dimensional — they contain many attributes for each person. Even if each attribute is shared with many people, the combination of attributes is nearly unique. This is the aggregation problem that Warren and Brandeis sensed but couldn't have articulated in 1890: combining individually innocuous information creates profiles that are intensely identifying.
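A small simulation makes the intuition concrete. With synthetic records and just three coarse attributes (echoing Latanya Sweeney's famous finding that ZIP code, birth date, and sex uniquely identify most Americans), nearly every record is unique:

```python
import random
from collections import Counter

random.seed(0)

# Synthetic "anonymized" records: no names, just three coarse attributes.
records = [(random.randint(10000, 99999),    # ZIP code
            random.randint(1940, 2005),      # birth year
            random.choice("MF"))             # recorded sex
           for _ in range(10_000)]

counts = Counter(records)
unique = sum(1 for c in counts.values() if c == 1)
print(f"{unique / len(records):.1%} of records are singled out "
      "by just these three fields")
```

None of the three fields is identifying on its own; the near-uniqueness comes entirely from the combination, which is the high-dimensionality point in miniature.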
⚠️ Common Pitfall: The term "anonymized" in corporate privacy policies and government justifications for data collection should not be taken at face value. "Anonymized data" in practice usually means "pseudonymized data with identifying fields removed" — which is often reversible. The only genuine anonymization involves not collecting the data in the first place, or applying strong techniques like differential privacy that add mathematical noise to limit re-identification.
🎓 Advanced Note: Differential privacy, developed by cryptographer Cynthia Dwork, provides a mathematically rigorous definition of privacy that allows statistics about a population to be computed without exposing information about individuals. Apple and Google use differential privacy in some data collection contexts. It represents a genuine technical advance over simply removing names. But differential privacy involves trade-offs between privacy and accuracy — more privacy protection means less accurate statistics — and its implementation can be flawed.
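The flavor of differential privacy can be seen in randomized response, a 1960s survey technique that is an ancestor of the modern definition. Each individual's answer is deniable, yet the population statistic survives; the population here is synthetic:

```python
import random

random.seed(3)

def randomized_response(truth: bool) -> bool:
    """Flip a coin: on heads answer honestly, on tails answer with a second
    coin flip. No single answer proves anything about the respondent."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

# 30% of this synthetic population has the sensitive attribute.
population = [True] * 300 + [False] * 700
answers = [randomized_response(t) for t in population]

observed = sum(answers) / len(answers)
# E[observed] = 0.5 * true_rate + 0.25, so invert the known bias:
estimate = (observed - 0.25) / 0.5
print(f"estimated rate: {estimate:.2f} (true rate: 0.30)")
```

The privacy/accuracy trade-off mentioned above is visible here: the coin flips that give respondents deniability also add noise, so the estimate is only approximately 0.30 and gets worse as samples shrink.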
32.10 Obfuscation as Counter-Surveillance Strategy
Beyond protecting your own data, obfuscation involves generating misleading information to pollute surveillance datasets. The goal: make your actual behavior harder to identify by surrounding it with noise.
AdNauseam
AdNauseam is a browser extension that clicks every ad it blocks. The effect: your advertising profile is flooded with contradictory data. You appear to be interested in everything — fishing gear and feminist literature, insurance and extreme sports, country music and avant-garde jazz. The profile that advertisers build on you becomes useless for targeting.
AdNauseam was developed by academics Daniel Howe, Mushon Zer-Aviv, and Helen Nissenbaum as a combination of ad-blocking and obfuscation. Nissenbaum, who developed the concept of "obfuscation" as a privacy strategy with Finn Brunton, argues that when you cannot opt out of surveillance, making the surveillance data worthless is a legitimate form of resistance.
Note: Google has removed AdNauseam from the Chrome Web Store. It is available for Firefox. This itself illustrates the limits of obfuscation: Google's interest in advertising data leads it to suppress tools that undermine that data.
Decoy Searches and Location Spoofing
More broadly, obfuscation strategies include:
- Decoy internet searches — searching for things you're not actually interested in to pollute your profile
- Location spoofing — tools that present a false location to apps that request it
- Randomized browsing — browser extensions that generate random browsing activity in the background
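A minimal sketch of profile dilution, with hypothetical interests and decoy topics: even modest noise makes the genuine signal a small fraction of what a tracker observes.

```python
import random

random.seed(1)

genuine = ["privacy law", "union organizing"]
decoy_pool = ["fly fishing", "opera tickets", "motorcycle insurance",
              "vegan recipes", "stamp collecting", "kayak rental"]

def obfuscated_log(real_visits: int, decoys_per_visit: int):
    """Interleave each genuine query with random decoys, diluting the profile."""
    log = []
    for _ in range(real_visits):
        log.append(random.choice(genuine))
        log.extend(random.choice(decoy_pool) for _ in range(decoys_per_visit))
    return log

log = obfuscated_log(real_visits=10, decoys_per_visit=9)
signal = sum(q in genuine for q in log) / len(log)
print(f"genuine interest is only {signal:.0%} of what the tracker observes")
```

The 10% figure follows directly from the 1:9 mixing ratio; a sophisticated adversary could still try to filter the noise statistically, which is the arms-race concern discussed below in the document.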
The Ethics and Limits of Obfuscation
Nissenbaum and Ryan Calo have debated the ethics of obfuscation. Arguments in favor:
- When surveillance is nonconsensual, resistance through any available means is legitimate
- Obfuscation imposes costs on surveillance systems proportional to the invasiveness of the surveillance
- Pollution of surveillance databases creates positive externalities — it degrades the accuracy of systems used to target, discriminate, and control
Arguments against:
- Obfuscation pollutes data that may have legitimate uses (aggregate statistics, security research)
- It may provide false confidence — generating fake data doesn't prevent genuine surveillance
- It invites arms-race dynamics — sophisticated adversaries will develop de-obfuscation techniques
The deeper point: obfuscation works as strategy when widely adopted. Individual obfuscation is a drop in a data ocean. Collective obfuscation — by communities, by activist networks, by coordinated groups — could meaningfully degrade surveillance accuracy. This connects to the theme that individual counter-surveillance is limited by its individualism.
32.11 Metadata Hygiene: Signal, ProtonMail, and Tails OS
Even with content encrypted, metadata reveals enormous amounts about you. Recall from Chapter 9: the NSA's "we kill people based on metadata" quote captures the operational reality. Who you communicate with, when, how often, from where — this pattern of life data is often more valuable to surveillance analysts than content.
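A few lines over hypothetical call records, containing no content at all, show how much pattern-of-life structure metadata yields:

```python
from collections import Counter

# Hypothetical call-record metadata: (caller, callee, hour of day). No content.
records = [
    ("jordan", "yara", 22), ("jordan", "yara", 23), ("jordan", "yara", 22),
    ("jordan", "union_hotline", 12), ("yara", "union_hotline", 12),
    ("jordan", "clinic", 9),
]

# Relationship structure: who does Jordan contact most?
contacts = Counter(callee for caller, callee, hour in records if caller == "jordan")

# Timing patterns: repeated late-night calls suggest a close relationship.
late_night = [(a, b) for a, b, hour in records if hour >= 22]

# Shared contacts reveal common affiliations without reading a single message.
shared = set(c for a, c, _ in records if a == "jordan") & \
         set(c for a, c, _ in records if a == "yara")

print(contacts.most_common())
print(late_night)
print("shared contacts:", shared)
```

From six records an analyst can infer a close relationship, a shared organizational affiliation, and a medical contact, which is why metadata minimization matters even when every message body is encrypted.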
Signal: Minimizing Metadata
Signal is designed not just to encrypt content but to minimize metadata. Specific features:
- Sealed sender: The Signal server doesn't know who sent a message to whom (only that some encrypted data was delivered to some account)
- Disappearing messages: Messages can be set to auto-delete after a configurable period on both sender and recipient devices
- No advertising, no monetization of data: Signal is a nonprofit; its business model is donations, not data
- Open source: The code can be audited by security researchers
- Note to Self: The "Note to Self" feature allows you to use Signal as a private notes app
As the 2016 grand jury subpoena described in Section 32.2 demonstrated, when a U.S. government agency subpoenas Signal, Signal can provide only the date an account was created and the date it last connected. This is not corporate rhetoric — it has been tested in real legal proceedings.
ProtonMail and Tutanota: Encrypted Email
Standard email is not private. Your email provider can read your emails, is subject to government subpoenas, and frequently scans email for advertising and other purposes.
ProtonMail (based in Switzerland) provides end-to-end encryption for emails sent between ProtonMail accounts, and encrypts all emails at rest using keys only you hold. Emails to non-ProtonMail recipients are not E2EE unless you use password-protected messages. Swiss law provides some additional protections against U.S. government data requests.
Tutanota (based in Germany) provides similar E2EE email, with strong privacy defaults and open-source code.
Limitations: email metadata (sender, recipient, subject line, timestamp) is difficult to encrypt while preserving email's routing functionality. E2EE email services protect content, but metadata remains exposed.
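The limitation is easy to see in code. In this sketch (the addresses are hypothetical), the routing headers that every relaying mail server must be able to read stay in the clear even if the body were a ciphertext:

```python
from email.message import EmailMessage

# Build a hypothetical message. Even if the body were end-to-end
# encrypted, these headers must stay readable for servers to route it.
msg = EmailMessage()
msg["From"] = "source@example.org"
msg["To"] = "reporter@example.org"
msg["Subject"] = "Meeting location"
msg.set_content("<ciphertext would go here>")

# What any relaying server can see, regardless of body encryption:
visible = {k: msg[k] for k in ("From", "To", "Subject")}
print(visible)
```

This is why the chapter treats encrypted email as a content protection, not a metadata protection: the "envelope" is inherently public to the infrastructure that delivers it.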
Tails OS: The Amnesia Operating System
Tails (The Amnesic Incognito Live System) is an operating system you run from a USB drive. It routes all internet traffic through Tor and leaves no trace on the computer it runs on — when you shut it down, nothing is written to the host machine's storage.
Tails is used by journalists, activists, whistleblowers, and security researchers who need to work on sensitive material without leaving forensic traces. Edward Snowden used Tails (along with PGP encryption) to communicate with journalists before the NSA revelations.
Tails is not for everyday use — it is a specialized tool for situations where leaving no trace is essential. Its limitations include: slower browsing through Tor, difficulty with some web applications, and the need to carry and protect the USB drive.
32.12 The Limits of Individual Counter-Surveillance
Having surveyed the technical toolkit, it's essential to be honest about what it cannot do.
The Structural Problem
Every technical tool described in this chapter addresses one aspect of one layer of the surveillance architecture. Encryption protects message content from interception but doesn't address the social sorting that happens with metadata. Tor anonymizes network location but doesn't address employer monitoring. Privacy browser extensions block advertising trackers but don't address CCTV, biometric databases, or physical surveillance.
The surveillance architecture documented in this textbook — from employment monitoring to national security interception to advertising ecosystems — is structural. It is built into the design of systems that most people depend on for work, communication, commerce, and civic participation. Technical counter-surveillance tools, used by individuals, address symptoms rather than causes.
📊 Real-World Application: A journalist using Signal, ProtonMail, Tails, and a VPN is genuinely more protected than one using Gmail and standard SMS. But that journalist still works within institutions (news organizations with their own surveillance), travels through spaces with CCTV, uses credit cards that create financial records, and operates in a social world where their sources and contacts are visible to surveillance systems even if the journalist's own communications are not.
The Adversary Model Problem
Counter-surveillance advice is only as good as the adversary model it assumes. The appropriate tools depend on who you're protecting against:
- Advertising trackers: uBlock Origin, privacy browser, VPN provide strong protection
- ISP monitoring: VPN provides strong protection
- Local network attacker: VPN and HTTPS provide strong protection
- Government metadata collection: Signal, minimal-metadata tools help significantly; Tor adds anonymity
- Nation-state adversary with significant technical resources: Tails, strong opsec, physical security awareness; no tool provides certainty
- Physical surveillance, covert operatives: Technical tools provide minimal protection; behavioral and social approaches more relevant
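The adversary-to-tool mapping above can be restated as a simple lookup. The tool names come from this chapter; the dictionary structure itself is illustrative, not an authoritative taxonomy.

```python
# Illustrative mapping of adversary model to the tools this chapter
# recommends. An empty list means technical tools offer little here.
ADVERSARY_TOOLS = {
    "advertising trackers": ["uBlock Origin", "privacy browser", "VPN"],
    "isp monitoring": ["VPN"],
    "local network attacker": ["VPN", "HTTPS"],
    "government metadata collection": ["Signal", "Tor"],
    "nation-state adversary": ["Tails", "strong opsec", "physical security"],
    "physical surveillance": [],
}

def recommended_tools(adversary: str) -> list[str]:
    """Return the chapter's suggested tools for a named adversary."""
    return ADVERSARY_TOOLS.get(adversary.lower(), [])

print(recommended_tools("ISP monitoring"))
```

The useful habit the lookup encodes: name the adversary first, then pick tools, never the reverse.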
⚠️ Common Pitfall: The most dangerous mistake in counter-surveillance is overconfidence — believing that using encrypted apps or Tor makes you invisible when your adversary is monitoring at the application layer, has physical access to your devices, or is targeting the people you communicate with rather than your communications directly.
The Community/Collective Problem
The utility of privacy tools depends partly on how widely they're used. Signal is most useful when the people you want to communicate with also use Signal. If you use Signal but your colleagues use SMS, you're encrypting your half of the conversation and exposing theirs.
This network effect cuts against individual counter-surveillance: you can't protect yourself without also protecting (or being protected by) your community. Advocacy for widespread adoption of privacy tools is itself a form of counter-surveillance — it changes the environment in which surveillance operates.
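A toy model (an assumption made for illustration, not a claim from the chapter) shows how sharply the network effect bites: if a fraction p of a community uses an E2EE messenger and conversation partners pair at random, only about p times p of conversations are protected end to end.

```python
# Toy network-effect model: with adoption rate p and random pairing,
# both ends of a conversation are covered with probability p * p.
def protected_fraction(p: float) -> float:
    return p * p

for p in (0.1, 0.5, 0.9):
    print(f"{p:.0%} adoption -> {protected_fraction(p):.0%} of pairs protected")
```

Under this model, 50% adoption protects only a quarter of conversations, which is why advocacy for widespread adoption is itself counter-surveillance.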
32.13 The 10-Step Practical Action Checklist
This checklist is organized by effort and impact. Start with the high-impact, low-effort steps; move toward the high-effort steps as circumstances warrant.
✅ Practical Counter-Surveillance Checklist
Tier 1: High Impact, Low Effort (Do These First)
- Install Signal. Replace SMS for personal communications. Enable disappearing messages (set to 1 week or shorter for sensitive conversations). Use it as your default messaging app for everyone who has it.
- Install uBlock Origin. Add it to every browser you use. It blocks the vast majority of tracking with minimal configuration required.
- Update software regularly. Most successful device compromises exploit known vulnerabilities that have already been patched. Update your OS, browser, and apps promptly.
- Use strong, unique passwords and a password manager. Reusing passwords means one breach compromises all your accounts. Bitwarden (open source, free) and 1Password are recommended. Use the password manager to generate and store unique random passwords.
- Enable two-factor authentication on important accounts. Prefer app-based 2FA (Authy, Google Authenticator) over SMS-based 2FA (SIM swapping attacks can bypass SMS 2FA).
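The password-manager step can be sketched in a few lines. This is a minimal illustration of what a manager's generator does internally (the 20-character default is an assumption for the example); real managers add options like excluding ambiguous characters.

```python
import secrets
import string

# Minimal sketch of a password generator: draw characters from the
# OS cryptographic RNG via the `secrets` module -- never `random`,
# which is predictable and unsuitable for security purposes.
def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

A 20-character password over this ~94-symbol alphabet has far more entropy than anything memorable, which is exactly why it should live in a manager rather than your head.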
Tier 2: Meaningful Protection, Moderate Effort
- Switch to Firefox as your default browser. Add uBlock Origin. Consider Firefox Multi-Account Containers to isolate different browsing contexts.
- Use a reputable VPN on untrusted networks (and consider always-on). Mullvad, ProtonVPN, or IVPN.
- Audit your app permissions. Go through every app on your phone and revoke permissions it doesn't functionally need. Especially: location (set to "only while using"), microphone, camera, contacts.
- Opt out of advertising identifiers. iOS: Settings > Privacy > Tracking (disable for all apps). Android: Settings > Privacy > Ads > Delete Advertising ID.
- Use DuckDuckGo or Startpage as your default search engine. This breaks Google's ability to build a search-query profile on you.
Tier 3: Significant Protection, Higher Effort (For Higher-Risk Situations)
- Create a ProtonMail or Tutanota account for sensitive correspondence.
- Use Tor Browser for sensitive research and browsing where you don't want your IP address known.
- Submit data broker opt-out requests (see Chapter 31 practical guide). Use DeleteMe or do it manually; repeat quarterly.
- Separate devices/accounts for sensitive activities — a separate browser profile or device for activism, journalism, or other high-risk activities.
- Learn about Tails OS and consider using it if you regularly work with sensitive material.
32.14 Jordan and Yara: Setting Up Signal
Jordan and Yara sat at the kitchen table with their phones, Yara walking Jordan through the setup.
"Signal is free," Yara said. "Open source. The code is public, so security researchers can check it. And it's run by a nonprofit, so nobody's monetizing your conversations."
Jordan downloaded the Signal app, created an account with their phone number, and looked through the settings. Yara showed them how to enable disappearing messages — two weeks for the general conversation, one week for anything sensitive.
"What about my number?" Jordan asked. "It's tied to my real identity."
"Your carrier knows your number. Anyone you give your number to already knows it. Signal's sealed sender protects who you're talking to on Signal's servers. But if you're really concerned about linking your number to your Signal account, you can use a VoIP number or a second SIM."
Jordan absorbed this. "So Signal isn't perfect."
"Nothing is perfect," Yara said. "Your threat model matters. Are you worried about Signal employees reading your messages? No — they can't. Are you worried about a specific person you gave your number to? Different problem. Are you worried about a nation-state adversary with subpoena power trying to identify who you're communicating with? That requires a different level of opsec."
"Threat model," Jordan repeated. "We talked about that in Dr. Osei's class. You have to know who you're protecting against."
"Right. For most of what we're doing — advocacy work, organizing, talking about things the university and your employer might not like — Signal is excellent protection. We're not doing anything illegal. We're not evading law enforcement. We're protecting our private conversations from surveillance capitalism and institutional snooping. Signal is the right tool for that."
Jordan set up disappearing messages and sent Yara a test message: Testing. This will disappear.
"Now," Yara said, "let's talk about what Signal doesn't do."
Marcus, sitting in the corner pretending to do homework, looked up with interest.
"Signal doesn't protect your metadata if someone is watching your network traffic closely and can see you're using Signal at all. It doesn't protect your conversations if someone has physical access to your phone. It doesn't protect you from screenshots. And it doesn't protect the people you're talking to if their devices are compromised."
"So the weakest link is the endpoint," Jordan said.
"Always. The encryption is strong. The endpoints — the devices — are where you're vulnerable."
Jordan thought about this. The encryption was solid, mathematical, reliable. The human beings on either end were the unpredictable variable. Surveillance, ultimately, found the human.
"We need to talk about opsec," Yara said. "But first — you should know your rights if your devices are ever seized."
That was for Chapter 31. Jordan already knew their rights. Now they were learning the tools. And beginning, slowly, to feel less like an object of surveillance and more like a person with agency within it.
32.15 Chapter Summary
Counter-surveillance encompasses technical, behavioral, legal, and artistic practices that reduce the asymmetry between watcher and watched. No single approach addresses the full architecture of surveillance; all approaches are limited by the structural nature of contemporary surveillance systems.
Encryption: End-to-end encryption, exemplified by the Signal Protocol, provides mathematically strong protection for message content. The "nothing to hide" argument fails because privacy is not about hiding wrongdoing but about controlling access to one's own thoughts, relationships, and activities.
The going dark debate: Law enforcement arguments for backdoors in encryption are opposed by the security community's consensus that there is no such thing as a backdoor only authorized parties can use — weakening encryption for government access weakens it for everyone.
VPNs: Shift trust from ISP to VPN provider; provide meaningful protection against ISP monitoring and local network surveillance; do not provide anonymity.
Tor: Provides genuine anonymity through onion routing; slower than regular browsing; not immune to all attacks; the gold standard for anonymous browsing.
Browser privacy: uBlock Origin is the most important single extension; Firefox with privacy extensions provides strong protection; browser fingerprinting remains a persistent challenge.
Mobile hardening: Ten practical steps reduce smartphone surveillance exposure without requiring device abandonment.
Anonymization and re-identification: "Anonymized" data can frequently be re-identified through aggregation; genuine anonymization is technically difficult and rarely achieved by organizations claiming to anonymize data.
Obfuscation: AdNauseam and similar tools pollute surveillance databases; most effective as a collective strategy; limited as individual practice.
Metadata hygiene: Signal, ProtonMail, and Tails OS minimize metadata collection; metadata remains a significant surveillance vulnerability even when content is encrypted.
Limits: Individual counter-surveillance addresses symptoms rather than structural causes. The appropriate tool set depends on your threat model. Community adoption matters more than individual perfection.
Next: Chapter 33 examines art and activism as forms of counter-surveillance — the cultural and political dimensions of resistance to the surveillance architecture.