
Chapter 19: Relationship Surveillance: Stalkerware, Parental Controls, and Trust

Opening: The Text That Wasn't Sent

Priya had been planning to break up with her boyfriend for three weeks. She knew she wanted to, but something stopped her every time she tried: a vague unease she couldn't quite name. It wasn't fear of his reaction, exactly, though that was part of it. It was something about the feeling that he always seemed to know things — where she'd been, who she'd been with, what she was planning — before she had told him.

She started testing it. She mentioned to her mother, in a text message, that she was thinking about going to her friend Jess's place on Friday. She did not tell her boyfriend. He mentioned Jess's name on Thursday.

She downloaded a security app recommended by a friend — a digital forensics tool that scanned installed apps against a database of known stalkerware products. It found something: a monitoring application installed on her phone, disguised under a generic system-service name, that was transmitting her text messages, call logs, GPS location, and keystroke history to an email address she did not recognize.

The email address was her boyfriend's.

He had been reading her texts. He had been tracking her location for months — for all of their eight-month relationship, as far as she could determine. Every text she had drafted and discarded. Every location she had visited. Every call.

The question Priya kept returning to in the weeks after she ended the relationship and changed her phone was: how had he gotten the app on her phone? The answer, eventually, was a weekend when he'd borrowed her phone "to make a call" because his battery was dead. Eight minutes. That was all it took.


Priya's story is not unusual. The Coalition Against Stalkerware — a nonprofit coalition of cybersecurity companies, digital rights organizations, and domestic violence advocates — documented thousands of reports of stalkerware use in the years following its founding in 2019. Security companies that scan for stalkerware have consistently found it on devices belonging to domestic violence survivors. The technology is commercially available, marketed as legitimate software, and used pervasively in intimate partner abuse contexts.

This chapter examines surveillance within relationships — intimate partnerships, parent-child relationships, and the blurred space in between. It is the most personal chapter in this part, because the surveillance it describes happens between people who know each other, who have or had obligations of care and trust, and who are navigating one of the most charged questions in human relationships: how much do we have the right to know about the people we love?


19.1 The Spectrum of Relationship Surveillance

A Continuum, Not a Binary

Before examining the most harmful end of relationship surveillance — stalkerware, tech-facilitated coercive control — it is important to recognize that surveillance within relationships exists on a spectrum. Not every form of monitoring is abusive. Not every concern about a partner's or child's location is an exercise of control. The challenge in this chapter is to understand where safety-motivated monitoring becomes control-motivated surveillance, without collapsing the distinction between them.

The spectrum runs from:

Agreed mutual transparency: Both partners choose to share location with each other for mutual convenience — coordinating pickup times, knowing when someone is getting home safely. Both have access; both consent meaningfully; either can opt out without consequence.

Asymmetric agreed monitoring: A parent monitors a teen child's location for safety. The child knows about the monitoring and has some degree of (perhaps limited) input into its scope. The asymmetry is acknowledged and (ideally) explained.

Surveillance without disclosure: One partner monitors the other without their knowledge. Even if motivated by concern rather than control, the lack of disclosure creates a fundamental dishonesty in the relationship.

Coercive monitoring: Surveillance used as a tool of control — to restrict movement, to enforce compliance, to gather information used for confrontation, threats, or violence.

Stalkerware-facilitated abuse: Technology deliberately installed to enable monitoring without any prospect of consent, integrated into a pattern of coercive control.

The line between concern and control is not always clearly visible from outside a relationship. What matters structurally is the consent framework, the power dynamic, and the consequences of being monitored — especially the consequences of being caught out.

💡 Intuition Check: Think about a close relationship in your life — a partnership, a family relationship, a close friendship. Is there any form of monitoring in that relationship (location sharing, read receipts, knowledge of social media activity)? Who has access to whose information? Is it reciprocal? Would either party feel comfortable raising concerns about the monitoring? Your answers are a starting point for thinking about where surveillance falls on this spectrum in your own experience.


19.2 Stalkerware: Definition, Technology, and Prevalence

What Stalkerware Is

Stalkerware — also called spouseware, creepware, or monitoring software — refers to applications designed to covertly monitor a device, typically to track a specific individual without their knowledge or consent. Stalkerware typically:

  • Is installed on a device without the device owner's knowledge
  • Operates invisibly (no icon, no notification, designed to avoid detection)
  • Transmits data (location, messages, calls, keystrokes, photos) to a remote party
  • Is marketed for use in intimate relationships ("monitor your spouse," "track your partner") or with a fig-leaf framing as parental control or employee monitoring software

Stalkerware is distinct from legitimate parental monitoring software (which is disclosed to the child and serves developmentally appropriate safety functions, at least in its better forms), from legitimate workplace device management (which is disclosed to employees and applied to employer-owned devices), and from consensual location sharing (which is mutual and revocable).

The distinction between stalkerware and "legitimate" monitoring software is often one of marketing and social framing rather than technical function. The same application may be marketed as a parental control tool and used as a spousal monitoring tool. This dual-use character is fundamental to the stalkerware industry's commercial survival and its legal protection.
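The detection approach used by the scanner in Priya's story — comparing installed app identifiers against a database of known stalkerware products — can be sketched in a few lines. This is a simplified illustration under stated assumptions, not a real tool: the package identifiers below are hypothetical placeholders, and production scanners (Malwarebytes, Kaspersky, Lookout) maintain curated signature databases and add behavioral heuristics precisely because stalkerware vendors rotate identifiers to evade signature lists.

```python
# Hypothetical signature database. Real scanners use curated, regularly
# updated databases; these package names are illustrative only.
KNOWN_STALKERWARE_IDS = {
    "com.example.sysservice",    # hypothetical: disguised as a system service
    "com.example.devicehealth",  # hypothetical: disguised as a utility app
}

def scan_installed_packages(installed):
    """Return installed package identifiers that match known signatures."""
    return sorted(set(installed) & KNOWN_STALKERWARE_IDS)

# A package disguised under a generic name is flagged only if its identifier
# is already in the database -- the core limitation of signature scanning.
flagged = scan_installed_packages([
    "com.android.chrome",
    "com.example.sysservice",
])
print(flagged)  # a non-empty list indicates a known stalkerware match
```

The limitation visible in the sketch is the reason detection is an arms race: a signature list catches only what it already knows about, which is why coalition-wide sharing of detection criteria (discussed in section 19.3) matters.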

📊 Real-World Application: The antivirus company Kaspersky, which maintains a database of stalkerware signatures, published reports in 2020 and 2021 documenting that it detected stalkerware on tens of thousands of devices globally. The Coalition Against Stalkerware's reports have documented year-over-year increases in reported stalkerware incidents correlating with increases in domestic violence reports more broadly. Research by the University of Zürich found that stalkerware was present on approximately 5% of Android devices in a sample of domestic violence survivors — a significant prevalence rate given how invisible the software is designed to be.

How Stalkerware Gets on Phones

Stalkerware installation typically requires one of the following:

Physical access: The most common installation method. An abuser who has brief physical access to a device can install stalkerware in as little as one to three minutes. The app is installed, hidden from the app drawer, and configured to transmit to the abuser's account before the phone is returned. This is how Priya's boyfriend installed the software in the opening scenario.

Phishing: A link sent via text or email that, when clicked, prompts the target to install the stalkerware or, more rarely, exploits a vulnerability in the device's operating system to install it silently. This method does not require physical access, but it requires the target to click the link and, in most cases, to approve an installation.

Carrier or account access: In some cases, an abuser who has access to a shared carrier account or a shared Google/Apple account can enable location tracking or data access through official account features, without installing any third-party app. Google Family Link, Apple's Family Sharing, and carrier "family" tracking plans were designed for family coordination but can be used for monitoring without meaningful consent.

Factory reset + configuration: An abuser who purchases a new phone for a partner and sets it up before giving it to them can configure the device with monitoring built in from the start, using legitimate manufacturer tools.

The Commercial Industry

Stalkerware does not exist in a legal grey zone because it is hard to detect or prosecute. It exists in a legal grey zone because the companies that produce it have deliberately constructed marketing and legal frameworks that obscure its actual use.

Companies like FlexiSPY, mSpy, Spyic, and dozens of others sell their products explicitly by citing use cases that include monitoring intimate partners. Before regulatory and media pressure forced some product changes, several of these companies had advertising copy that explicitly promoted spousal monitoring ("Is your husband cheating? Find out."). Under pressure from the Coalition Against Stalkerware and the FTC, many have removed the most explicit spousal monitoring language while retaining the technical functionality.

The FTC took action against SpyFone in 2021, banning the company and its CEO from the surveillance business and requiring deletion of the data the company had collected. It was the first major federal enforcement action against a stalkerware company, but it did not address the broader market.

🎓 Advanced Concept: Dual-Use Technology and Regulatory Challenges

Stalkerware represents the sharpest version of the dual-use technology problem in surveillance law. The same software that an abusive partner installs covertly on a victim's phone might be legitimately installed by a parent on a child's phone (with the child's knowledge and for age-appropriate safety purposes) or by an employer on a company-owned device (with disclosed policies). Making the software itself illegal would eliminate legitimate uses. Making only certain uses illegal creates enforcement challenges because the software itself does not display intent — the same binary must be interpreted as legitimate or harmful based on the relationship between installer and monitored person, and the consent (or its absence) between them.

This challenge illustrates a general principle in technology regulation: when harmful use is primarily distinguished from legitimate use by social context rather than technical function, technical regulation is insufficient. Addressing stalkerware requires addressing coercive control as a legal category, strengthening enforcement of abuse laws, and providing survivors with practical tools for detection and removal — not only regulating the software itself.


19.3 The Coalition Against Stalkerware

Formation and Mission

The Coalition Against Stalkerware (CAS) was founded in 2019 by cybersecurity companies (including Kaspersky and Malwarebytes), digital rights organizations (including the Electronic Frontier Foundation), and domestic violence advocacy organizations. Its founding represented a recognition that stalkerware was not merely a technical problem (malicious software to be detected and removed) but a social problem (a tool of intimate partner violence) that required collaboration between the tech community and the advocacy community.

The coalition's approach has several components:

Technical: Establishing shared definitions and detection criteria for stalkerware, enabling antivirus companies to detect and flag stalkerware consistently across platforms. Before the coalition, different security companies had inconsistent approaches to stalkerware — some flagged it as malware, others treated it as legitimate commercial software.

Survivor-centered: Developing protocols for what security companies should do when they detect stalkerware on a device. This is more complex than it sounds: simply removing stalkerware without telling the survivor can be dangerous, because an abuser who suddenly loses access may escalate violence. The coalition developed guidance for notifying survivors without triggering that escalation.

Advocacy: Working with technology platforms (Google, Apple, app stores) to remove stalkerware from official distribution channels. Both major app stores have policies against stalkerware, but enforcement has been inconsistent — products that would clearly qualify as stalkerware under strict interpretation have remained on official stores for extended periods.

Legal: Providing resources for law enforcement and prosecutors to recognize stalkerware as a tool of abuse and to build cases accordingly.

Tech Industry Response

Google and Apple have both taken steps to address stalkerware in their ecosystems:

App store policies: Both major app stores prohibit apps designed to track or monitor a person without their knowledge. Enforcement has been imperfect. Apps with stalkerware functionality have remained in official stores by using legitimate-sounding descriptions ("family safety," "child monitoring") and technical workarounds. Both stores have improved enforcement over time, but complete elimination of stalkerware from official distribution channels has not been achieved.

OS-level protections: iOS 14 (2020) introduced indicators when apps access the microphone or camera. Both iOS and Android have improved location permission disclosures. iOS 16 introduced a "Safety Check" feature specifically designed for abuse survivors to audit and revoke permissions and account access. Android has introduced similar features.

Account access controls: Both Apple and Google have improved the visibility of shared account access — making it easier to see what devices and accounts have access to your data — in part as a response to the stalkerware problem.

These improvements are meaningful but incomplete. Physical access to a device, combined with willingness to exploit that access, remains the primary stalkerware installation vector, and operating system improvements do not prevent physical access.

Best Practice: Stalkerware Detection and Response

If you suspect stalkerware is on your device:

  1. Use a secondary device for communications while investigating — a friend's phone, a library computer. Assume any messages sent on the suspected device are monitored.

  2. Contact a domestic violence hotline. The National Domestic Violence Hotline (1-800-799-7233) and the National Network to End Domestic Violence's Safety Net project have specific guidance on technology safety. They can provide advice on how to proceed safely given your specific situation.

  3. Use a safety-aware scanning tool. Lookout Security, Malwarebytes, and Kaspersky all scan for known stalkerware. Use these on a device connected to a Wi-Fi network you control, not one shared with the suspected abuser.

  4. Do NOT immediately remove stalkerware without safety planning. If an abuser loses remote access suddenly, they may escalate. Work with an advocate before taking action.

  5. Consider your safety first, privacy second. In acute danger, getting to safety is more important than preserving evidence. Document what you can safely document, but physical safety takes priority.

  6. iOS Safety Check: On an iPhone (iOS 16+), Settings > Privacy & Security > Safety Check provides a consolidated interface for reviewing and revoking location sharing, app access, and account access.


19.4 Parental Monitoring Apps: Safety, Anxiety, and Control

The Parental Monitoring Market

Commercial parental monitoring apps are a significant market, with products including:

Life360: A family location-sharing app with tens of millions of users. Beyond location sharing, Life360 offers driving behavior monitoring (speed, phone use while driving), notifications when family members arrive at or leave specific locations, and (in premium tiers) "Crash Detection" and "SOS" features. Life360 has also been documented as a major source of brokered location data, having sold location data from family members, including minors, to commercial data brokers.

mSpy: Marketed primarily as a parental control tool, mSpy enables monitoring of texts, calls, location, social media, browser history, and keystroke logging. The same product marketed to parents is also marketed (more discreetly) to partners who want to monitor spouses.

Bark: A monitoring service that uses AI to analyze a child's communications across platforms and flag content that suggests danger (bullying, predatory contact, self-harm, explicit content) without giving parents access to all communications. Bark's design explicitly attempts to provide protection without total surveillance — it does not give parents a full read receipt on every message.

Google Family Link and Apple Screen Time: Built-in operating system tools that enable content filtering, screen time management, and location monitoring. These are integrated into the OS and are therefore more difficult for tech-savvy children to circumvent.

Find My (Apple) / Google Family Sharing: Location-sharing features built into the major operating systems' family account structures, used for both legitimate coordination and, in some cases, monitoring without meaningful consent.

📊 Real-World Application: Life360 made news in 2021 when reporting by The Markup revealed that the company sold precise location data from its users — including children — to location data brokers. The data was sold as part of the same commercial ecosystem examined in Chapter 18. Parents who had installed Life360 for family safety had, without awareness, enrolled their families (including their minor children) as subjects of commercial location surveillance. Life360 subsequently announced changes to its data sales practices following the reporting, but the incident illustrated how safety-marketed family technology connects to the broader commercial surveillance ecosystem.

The Developmental Argument

The ethical debate about parental monitoring of children often turns on a developmental argument: children are developing beings who are not yet fully capable of the judgment required to navigate a digital environment safely. Parental oversight — including monitoring — serves the child's developmental interest in ways that adult monitoring of adults does not. This is the strongest argument for asymmetric, non-consensual parental surveillance.

The developmental argument is valid but limited. Its validity depends on:

  • The child's age and developmental stage. The case for monitoring a 10-year-old is categorically stronger than the case for monitoring a 17-year-old, who is approaching adult autonomy.
  • The nature of what is monitored. Filtering explicit content is different from reading every private message a teenager sends.
  • The transparency of the monitoring. Disclosed monitoring — where the child knows the monitoring occurs and understands why — serves different developmental functions than covert monitoring.
  • The consequences for getting caught. If monitoring is used as an opportunity for punishment rather than for safety, it damages rather than supports the relationship.

Research on parental monitoring and adolescent development presents a complex picture. Moderate, disclosed parental monitoring correlates with better adolescent outcomes on some measures. High levels of monitoring — particularly covert monitoring — correlate with reduced adolescent autonomy, reduced trust in parent-child relationships, and in some studies, worse outcomes on risk behaviors, possibly because covert monitoring damages the relationship quality that is actually protective.

⚠️ Common Pitfall: Parents and educators sometimes assume that because children cannot meaningfully consent (due to their developmental stage and power asymmetry), consent considerations do not apply to parental monitoring. This is a significant error. Children have developing but real privacy interests. Treating children as subjects with no relevant privacy claims is both ethically problematic and practically counterproductive — it damages the relationship quality that actually protects children. The question is not "should children have privacy from their parents" but "what scope of monitoring is appropriate at what developmental stage, and how should it be practiced to support rather than undermine the relationship."

🔗 Connection to Chapter 37: The full analysis of children's privacy — including their rights under COPPA (Children's Online Privacy Protection Act), the psychological effects of surveillance on development, and the question of at what age digital privacy should be treated as an autonomy right — appears in Chapter 37. This chapter focuses on parental monitoring as a form of relationship surveillance; Chapter 37 treats children's privacy more broadly.


19.5 Partner Surveillance: Location Sharing, Read Receipts, and the Trust Question

The Technology of Intimate Monitoring

Between consenting adults in intimate relationships, surveillance technology presents a spectrum of uses that are genuinely complex to evaluate. Some of these are so normalized that their surveillance character is invisible:

Read receipts: The notification that a message has been "read" creates continuous, ambient accountability for communication response. Both parties can see when the other has read their messages. This is surveillance in a technical sense — one party monitoring the other's communication behavior. Its widespread normalization makes it seem like a neutral feature rather than a monitoring mechanism.

Location sharing: Apps like Life360, Google Maps' location sharing feature, and Find My (Apple) allow one person to track another's location in real-time. In many relationships, mutual location sharing is experienced as convenience (coordinating pickups, knowing when a partner is en route) and safety (knowing a partner arrived home safely). In other relationships, the same technology is experienced as surveillance and pressure.

Social media monitoring: Following a partner's social media activity, checking who they interact with, reviewing their location check-ins. This is largely invisible surveillance — no notification reaches the monitored party.

Camera and microphone access: In a shared home, knowing that a partner has smart home devices in common areas creates a form of ambient monitoring.

The research on location sharing in intimate relationships shows a pattern that mirrors the coercive control literature: women are more likely than men to feel surveilled (rather than safely connected) by location sharing; partners in relationships with controlling dynamics are more likely to report using location sharing as a monitoring tool rather than a coordination tool; and the experience of being asked to share location — especially when the request comes with emotional pressure to comply — correlates with other markers of relationship control.

Monitoring vs. Controlling: Where Is the Line?

The distinction between monitoring and controlling is both important and contested. It is important because not all forms of partner awareness constitute abuse; it is contested because people in controlling relationships often genuinely believe they are simply "caring" or "concerned."

Useful markers of the distinction:

Reciprocity: Do both parties have equivalent access to the same information? Mutual location sharing (both can see both) is structurally different from one-way monitoring (A can see B; B cannot see A).

Consent and revocability: Does the monitored party have a genuine, consequence-free ability to withdraw consent? If saying "I'd rather not share my location" would result in anger, punishment, or escalation, the consent to share is not free.

Confrontation: Is monitoring data used for confrontation — "I see you were at the bar until midnight, you said you were working" — in ways that create accountability and pressure? Confrontation based on monitoring data is a sign that the monitoring is functioning as control.

Scope creep: Does monitoring expand over time, with requests for more access following initial compliance? This is a classic pattern in coercive control relationships.

The response to not knowing: Does a partner react with anxiety, anger, or escalation when they cannot monitor? The emotional valence of not having information reveals whether monitoring is serving safety or control needs.

🌍 Global Perspective: The relationship between intimate partner surveillance and cultural norms about privacy within relationships varies significantly across societies. In contexts where marriage is understood as creating a duty of total transparency, location monitoring of a spouse may be more culturally normalized — and resistance to it more socially costly — than in societies with stronger norms of individual autonomy within intimate relationships. This does not make surveillance more ethical in such contexts; it makes the pressure to submit to surveillance stronger. The Coalition Against Stalkerware operates globally and has documented how cultural norms about relationship surveillance affect survivors' ability to recognize and name what is happening to them.


19.6 Coercive Control and the Law

What Coercive Control Is

Coercive control, a concept developed by sociologist Evan Stark in his 2007 book of the same name, describes a pattern of behavior in abusive relationships that uses surveillance, isolation, monitoring, and restriction to create a condition of ongoing subordination — distinct from, and often present without, physical violence.

Stark's insight — which has significantly influenced domestic violence law and practice — is that many of the most harmful aspects of abusive relationships are not episodic violent incidents but the persistent structures of control that constitute the relationship's ongoing character. A partner who monitors location data, controls financial access, reads all communications, and restricts social contact is exercising coercive control regardless of whether physical violence has occurred.

Technology-facilitated coercive control extends this framework into the digital dimension. Research by the National Network to End Domestic Violence (NNEDV) has documented that in a large majority of domestic violence cases, technology was used as a tool of abuse — including location tracking, account access, surveillance through home devices, and stalkerware.

Coercive control has achieved legal recognition in several jurisdictions:

United Kingdom: England and Wales made coercive control a criminal offense under the Serious Crime Act 2015, punishable by up to five years in prison. The offense covers a pattern of controlling behavior that "has a serious effect" on the victim's life. Scotland passed similar legislation in 2018. UK courts have applied this to technology-facilitated control, including surveillance through smartphones and smart home devices.

United States: As of this writing, the United States has no federal coercive control statute. Several states — including California, Connecticut, and Hawaii — have enacted coercive control legislation, generally as enhancements to domestic violence law rather than standalone offenses. The legislation varies significantly in scope and has been unevenly prosecuted.

The stalkerware-specific legal gap: Even where coercive control is criminalized, stalkerware specifically presents evidentiary challenges. Demonstrating that software was installed without consent, that the installer knew the installation was without consent, and that the installation was part of a pattern of control — all of which may be necessary for prosecution — requires technical evidence that many law enforcement agencies lack the expertise to gather.


19.7 Jordan's Friend

It comes up on a Friday evening in November, three months after Jordan downloaded their Google Takeout data and started thinking differently about surveillance. Jordan is at dinner with Yara and two other friends, including Malik, who is quiet and seems off.

Later, walking home, Malik tells Jordan what's happening. His ex-girlfriend — they broke up two months ago — seems to know things she shouldn't. She texted him about a job interview he hadn't told her about. She commented on a restaurant he went to with his new date. She showed up at a coffee shop he'd been frequenting since the breakup, saying she "happened to be in the area."

Jordan thinks about what they've learned. They ask Malik: did she ever have your phone? Not borrowed, but like, had it for a while?

Malik thinks. Three weeks into the relationship, he'd dropped his phone in a lake on a camping trip. She'd given him her old phone to use for two weeks while he waited for insurance to replace his.

Two weeks. She'd had it. She'd set it up for him.

Jordan goes home and looks up the Coalition Against Stalkerware. They find the detection tool list. They text Malik with a link to Malwarebytes' free scanner and a message: "I don't want to freak you out but I want you to run this."

An hour later: "It found something."

Jordan sits with this for a long time. They think about the app on Malik's phone, running quietly for eight months, transmitting his location and messages to someone who had once said she loved him. They think about how the surveillance was, in a technical sense, the same kind of surveillance that Google performs or that location brokers enable. But it was directed specifically at Malik, by someone who knew him, for the purpose of control and — they were beginning to understand — the purpose of not letting go.

The difference between ambient corporate surveillance and targeted relationship surveillance is not just scale. It is the presence of a human will behind the watching — a will that knows you, that can use what it sees against you, that is in the room with you.

📝 Note for Students: Jordan's response — providing Malik with technical resources — is appropriate in the circumstances. If someone discloses that they believe they are being monitored by a partner or ex-partner, the most important first step is connecting them with a domestic violence resource (hotline, advocate) who can guide them through safety planning before taking any action on the device. Technical remediation without safety planning can escalate danger. The resources at the end of this chapter provide guidance.


19.8 The Normalization of Intimate Surveillance

How Monitoring Becomes Expected

One of the most significant dynamics in relationship surveillance is the normalization of monitoring as an expression of care. The logic runs: if you love someone, you want to know they are safe. If you want to know they are safe, you want to know where they are. If you want to know where they are, you should be able to check. Resistance to being checked suggests something to hide.

This logic — which erases the distinction between care and control — is actively cultivated by commercial products. Life360's marketing presents family location sharing as an expression of closeness and love, not as surveillance. Parental control products present comprehensive child monitoring as responsible parenting. In the worst cases, stalkerware products present covert monitoring as evidence of romantic investment ("you care enough to know").

The normalization of monitoring-as-care has real consequences for how people evaluate their own relationships. Survivors of tech-facilitated coercive control consistently report difficulty naming their experience as abuse, in part because the monitoring was framed as love — and in part because some of the monitoring (casual location sharing, visible read receipts) was identical to what their friends experienced in non-abusive relationships.

The Panopticon in Intimate Space

The panopticon's power, as discussed in Chapter 2, lies not in constant observation but in the awareness that observation might occur at any time. In relationships with monitoring, this dynamic is acutely present. A partner who knows they are being tracked cannot know when their tracker is looking at the data; they can only know that the data is always available. This produces the characteristic panoptic effect: the internalization of the gaze. The monitored partner begins to regulate their behavior based on what they imagine the watcher might think — not because they are doing anything wrong, but because they cannot un-know that they are watched.

In abusive relationships, this internalized gaze is a primary mechanism of control. The abuser does not need to monitor constantly; the monitored partner's awareness that monitoring is possible is sufficient to produce compliance. The surveillance is coercive not only through its use but through its existence.


19.9 Practical Ethics of Surveillance in Relationships

A Framework for Evaluation

For students thinking about surveillance in their own relationships — whether as the monitoring party or the monitored party — the following framework provides guidance:

Consent: Does the monitored party know about the monitoring? Do they have a genuine, consequence-free ability to withdraw consent? Monitoring without consent is not ethically neutral simply because it is technically easy or legally permitted in some jurisdictions.

Reciprocity: Do both parties have equivalent access to equivalent information? Asymmetric monitoring — one party watching the other without equivalent access in return — encodes a power differential.

Purpose: Is the monitoring serving the monitored party's interests (safety, well-being) or the monitoring party's interests (reassurance, control)? These purposes are not always easily distinguished, but the distinction matters.

Proportionality: Is the scope of monitoring proportionate to the concern it addresses? Tracking a child's location is proportionate to age-appropriate safety concerns; reading every message a teenage child sends is not proportionate to typical safety concerns.

Openness to discussion: Can the monitoring arrangement be discussed openly? If raising concerns about monitoring produces defensiveness, anger, or emotional pressure, that reaction reveals something about whether the monitoring is about safety or control.

The abuse escalation question: Any monitoring arrangement should be evaluated with the question: in the worst case, if this relationship deteriorated, could this monitoring infrastructure be used against me? The technology that coordinates pickup times can also be used to enforce isolation. Evaluating surveillance arrangements in best-case terms alone is insufficient.

Best Practice: Having the Monitoring Conversation

If you are considering introducing monitoring into a relationship (parental monitoring of a child, location sharing with a partner):

  1. Disclose fully before implementing. Tell the person what you want to monitor and why, before installing anything.
  2. Explain the purpose. "I want to share location so we can coordinate pickups easily" is a different conversation than "I want to know where you are at all times."
  3. Offer reciprocity. If you are asking someone to share location with you, offer to share your location with them.
  4. Define scope and limits. "I'll check the location sharing when we're coordinating" sets different expectations than unrestricted access.
  5. Create an exit mechanism. Be explicit that either party can withdraw consent without negative consequence.
  6. Revisit as the relationship changes. Parental monitoring appropriate for a 12-year-old may not be appropriate for a 17-year-old. Check in about whether the arrangement still makes sense.

Chapter Summary

Relationship surveillance occupies one of the most ethically complex territories in this textbook. At its most benign, it involves the mutual location sharing that millions of families and couples use for coordination and safety. At its most harmful, it involves stalkerware-enabled coercive control — a pattern of technology-facilitated abuse that affects millions of people and is structurally connected to intimate partner violence.

The chapter has traced several key themes:

Stalkerware is commercial, prevalent, and deliberately designed to exploit the dual-use character of monitoring software — disguised as parental control or employee monitoring while marketed for intimate partner surveillance. The Coalition Against Stalkerware represents the most effective current response, but technical detection is not sufficient without safety-centered protocols.
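The detection approach described in the chapter opening — scanning installed apps against a database of known stalkerware products — can be sketched as a simple indicator-of-compromise (IOC) match. This is a minimal illustration, not the implementation of any real security tool: the package identifiers and the blocklist below are hypothetical, and real scanners combine such lists with permission and network-traffic heuristics.

```python
# Minimal sketch of blocklist-based stalkerware detection: compare the
# package identifiers of installed apps against a database of known
# indicators of compromise (IOCs). All names here are hypothetical.

# Hypothetical IOC database of known stalkerware package identifiers.
KNOWN_STALKERWARE = {
    "com.example.sysservice",    # disguised under a generic system name
    "com.example.phonemonitor",
}

def scan_installed_apps(installed: list[str]) -> list[str]:
    """Return the installed package identifiers that match the IOC list."""
    return sorted(pkg for pkg in installed if pkg in KNOWN_STALKERWARE)

if __name__ == "__main__":
    installed_apps = [
        "com.android.settings",
        "com.example.sysservice",    # the covertly installed monitor
        "org.mozilla.firefox",
    ]
    print(scan_installed_apps(installed_apps))  # ['com.example.sysservice']
```

Note that, per the safety-centered protocols discussed above, running such a scan — let alone removing what it finds — should follow, not precede, safety planning with a domestic violence advocate.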

Parental monitoring technology occupies a genuinely complex space, where legitimate safety concerns intersect with developmental privacy interests. The design of monitoring technology matters enormously: tools that provide safety without total surveillance (like Bark's AI-flagging approach) serve children's interests differently than tools that provide parents with complete access to all communications.

Partner surveillance exists on a spectrum from convenience to coercive control, and the key variables — consent, reciprocity, purpose, proportionality, and openness to discussion — provide a framework for evaluating where specific arrangements fall. The technology of intimate monitoring does not determine its meaning; the relationship context does.

The panopticon's logic, as described in Chapter 2, has its most intimate application in relationships: the awareness of being monitored by someone who knows you and can use the monitoring against you produces the internalized gaze in its most personally damaging form.

For Jordan, Malik's experience closes a circle that started with the abstract surveillance concepts of Chapter 1 and passed through the corporate data infrastructure of the previous chapters. The surveillance that matters most is not always the most technically sophisticated or the most structurally powerful. Sometimes it is one person with eight minutes and a phone.


Key Terms

Stalkerware: Software designed to covertly monitor a device without the owner's knowledge or consent, transmitting the collected data to a remote party. Functionally distinct from legitimate monitoring software by virtue of its covert installation and its use in controlling relationships.

Coercive control: A pattern of behavior (a concept developed theoretically by Evan Stark) that uses surveillance, isolation, monitoring, and restriction to create ongoing subordination in an intimate relationship, distinct from episodic physical violence.

Dual-use technology: Technology with both legitimate and harmful applications, where the technical function is the same and the distinction between uses is a matter of social context (primarily consent).

Coalition Against Stalkerware (CAS): A nonprofit coalition of cybersecurity companies and domestic violence advocates that develops standards for stalkerware detection, survivor-centered response protocols, and advocacy for regulatory and platform action.

Tech-facilitated abuse: The use of digital technology — including stalkerware, location tracking, smart home devices, and account access — as tools of intimate partner abuse.

Monitoring vs. controlling: The distinction between surveillance that serves the safety or well-being of the monitored person (monitoring) and surveillance that serves the control needs or interests of the monitoring party (controlling).


Discussion Questions

  1. The chapter describes a spectrum of relationship surveillance from "agreed mutual transparency" to "stalkerware-facilitated abuse." Identify a monitoring arrangement that you believe falls at each end of this spectrum, and explain what features place each arrangement where you locate it.

  2. Life360 is marketed as a family safety app. The Markup investigation revealed that it was simultaneously selling location data, including from children, to commercial brokers. What does this dual function reveal about the relationship between safety marketing and commercial surveillance?

  3. The UK criminalized coercive control in 2015; the U.S. has no federal equivalent. Evaluate the arguments for and against a federal coercive control statute. Does the tech-facilitated nature of much contemporary coercive control make federal legislation more or less necessary?

  4. Bark monitors children's communications for safety signals without giving parents full access to all messages. Does this design represent an adequate balance between parental safety interest and child privacy? At what age would even Bark-style monitoring become inappropriate?

  5. Jordan provides Malik with technical resources to detect stalkerware but notes that safety planning should come before technical remediation. Why is this sequencing important, and what does it reveal about the relationship between technical and social responses to tech-facilitated abuse?