Case Study 27-1: Crossover WorkSmart and the Extreme Monitoring Model
Overview
Crossover for Work — a company that connects employers with remote contract workers, particularly in software development and knowledge work — markets itself as having developed the definitive solution to remote work productivity management. Its proprietary platform, WorkSmart, represents perhaps the most comprehensive and philosophically explicit surveillance model in the commercial remote monitoring market. Examining Crossover and WorkSmart in detail provides insight into what intensive remote work surveillance looks like at its most developed — and what its proponents argue in its defense.
Crossover's Business Model and Philosophy
Crossover's business proposition is distinctive: it recruits highly skilled knowledge workers — software engineers, product managers, sales professionals — globally, primarily from countries where talent costs are lower than in the U.S. or Western Europe, and places them with U.S.-based or multinational employers. The employers pay Crossover; Crossover pays the workers.
The company's founder and CEO, Andy Tryba, has been explicit and unapologetic about the intensive monitoring model. In a 2020 Business Insider interview, he argued: "We measure output, not hours. But to measure output fairly, we need to know what people are doing." The WorkSmart platform, in his framing, creates the transparency that makes fair assessment of remote work possible.
This framing — surveillance as fairness, monitoring as transparency — is worth examining carefully, because it represents the most sophisticated ideological defense of intensive remote monitoring.
What WorkSmart Does
WorkSmart's documented capabilities, based on the company's own product documentation and reporting by journalists including Mia Sato at The Verge, include:
Keystroke and mouse tracking: The platform records keystrokes per hour and mouse clicks per hour, generating an "activity" metric that is used as one input to the overall productivity score.
Screenshot capture: WorkSmart takes screenshots of the worker's screen approximately every ten minutes. These screenshots are uploaded and visible to supervisors and Crossover's own quality assurance team.
Webcam captures: The platform takes periodic photos of the worker's face using their laptop's webcam, approximately every ten minutes, synchronized with the screenshot captures. These photos are used to verify that the person at the workstation is the enrolled worker; facial recognition technology confirms the identity.
"Focus" and "intensity" scoring: WorkSmart analyzes the data streams — keystrokes, mouse activity, screenshots, webcam — and produces a composite "focus score" and "intensity score" for the worker. These scores are Crossover's proprietary algorithmic assessment of worker engagement.
Active hours requirement: Workers on the Crossover platform are generally required to achieve a minimum number of "active hours" as measured by the platform — typically 40 hours per week of verified high-activity working time.
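To make the mechanics of activity-based scoring concrete, the sketch below shows how a composite score of this general kind could be computed from keystroke counts, mouse clicks, and webcam verification. The weights, targets, ten-minute window, and formula are invented for illustration; Crossover's actual scoring model is proprietary and undisclosed.

```python
from dataclasses import dataclass

@dataclass
class ActivitySample:
    """One ten-minute monitoring window (hypothetical granularity)."""
    keystrokes: int
    mouse_clicks: int
    face_verified: bool  # webcam capture matched the enrolled worker

def composite_score(samples: list[ActivitySample],
                    keystroke_target: int = 300,
                    click_target: int = 100) -> float:
    """Return a 0-100 'focus'-style score.

    The targets and 0.5/0.3/0.2 weighting are assumptions made
    for this sketch, not Crossover's real methodology.
    """
    if not samples:
        return 0.0
    total = 0.0
    for s in samples:
        keys = min(s.keystrokes / keystroke_target, 1.0)
        clicks = min(s.mouse_clicks / click_target, 1.0)
        presence = 1.0 if s.face_verified else 0.0
        total += 100 * (0.5 * keys + 0.3 * clicks + 0.2 * presence)
    return total / len(samples)

# A window spent thinking (no input) scores near zero even though
# real work may be happening -- the failure mode the developer
# quoted later in this case study describes.
thinking = ActivitySample(keystrokes=0, mouse_clicks=2, face_verified=True)
typing = ActivitySample(keystrokes=450, mouse_clicks=120, face_verified=True)
score = composite_score([thinking, typing])
```

Even this toy version exhibits the proxy problem discussed below: any linear combination of input counts rewards visible activity, not output, regardless of how the weights are tuned.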
The Worker Experience
Journalism and worker accounts provide a portrait of what working under WorkSmart feels like.
Workers describe the system as psychologically exhausting in ways that are distinct from normal work pressure. The requirement to maintain high activity scores — not just to work, but to be visibly active at all times — creates what workers describe as an inability to think, to read, to plan, or to engage in any form of work that doesn't generate keystrokes and mouse movement.
One software developer who worked under the Crossover system told a reporter: "The best programming is often spent thinking. You might stare at the screen for twenty minutes working through a problem, then write ten lines of code that solve everything. The WorkSmart system would have flagged those twenty minutes as idle. I spent more time making sure I was typing than I spent thinking."
The webcam capture feature — periodic photographs of the worker's face — raises an additional dimension of intimacy. These photographs are not screenshots; they are images of a person's body, in their home, at work. They document not merely what is on the screen but what the worker looks like at that moment: their expression, their attire, their apparent alertness or fatigue.
Workers have described the experience of knowing they are being photographed periodically throughout the day as distinctly more invasive than screen monitoring. "I can't stop thinking about my face," one worker told a journalist. "Am I making a weird expression? Am I too tired-looking? There's this constant self-consciousness."
The Ideological Defense and Its Problems
Crossover's defense of WorkSmart has several components:
The transparency argument: Monitoring creates clarity about what is expected and how performance is assessed. Workers know exactly what the system values (activity, focus) and can calibrate their behavior accordingly. Opacity creates uncertainty and unfairness; measurement creates accountability.
Problem: The transparency argument assumes that what the system measures (keystrokes, mouse movement, facial presence) accurately reflects what the employer actually values (productive work output). If the measurements are poor proxies for productivity — and the evidence suggests they often are — then the transparency is misleading. Workers who understand the system and optimize for its metrics may be producing exactly the behaviors Crossover's scoring system rewards while underperforming on the actual work outcomes Crossover's clients care about.
The global fairness argument: Because Crossover works with talent globally, it needs standardized metrics to ensure fair assessment across different cultural and work contexts. Monitoring provides the common standard.
Problem: The global fairness argument assumes that the metrics are culturally neutral, which they are not. Activity scores reward a particular style of work — high keyboard-and-mouse activity, visible engagement with screens, formal work presentation via webcam — that reflects specific cultural norms about what working "looks like." Workers whose cultures, health conditions, or work styles differ may be systematically disadvantaged by metrics designed to validate a specific model of visible productivity.
The worker welfare argument: WorkSmart protects workers, Crossover argues, because it provides objective evidence of their productivity. If a supervisor wants to dismiss a worker unfairly, the monitoring data provides a record of actual activity that can serve as a defense.
Problem: This argument assumes that workers have access to the monitoring data and can use it to contest unfair treatment. In practice, Crossover controls the data, interprets the algorithms, and makes the assessments. Workers who disagree with their scores have limited ability to contest the methodology.
Regulatory Attention and Legal Context
Crossover's model has attracted regulatory scrutiny in several jurisdictions. In the Netherlands, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) opened an investigation into whether WorkSmart's facial recognition and activity monitoring violated GDPR requirements for necessity and proportionality. The investigation concluded that while some monitoring was permissible, facial recognition and webcam captures without specific opt-in consent created GDPR compliance risks.
Crossover subsequently modified its practices for EU-based workers, offering an opt-out from webcam monitoring. The modification illustrates the regulatory leverage that GDPR provides — leverage that U.S. workers lack.
Discussion Questions
- Crossover argues that WorkSmart creates "transparency" and "fairness." Is intensive surveillance a form of transparency? For whom is it transparent?
- The software developer quoted in this case study describes optimizing for typing activity rather than thinking. This is a textbook instance of Goodhart's Law. What would a monitoring system that incentivized genuine cognitive work rather than visible activity look like?
- Workers who accept Crossover positions understand the monitoring model in advance. Does this advance knowledge change the ethical analysis? Why or why not?
- Crossover's model works partly because it sources talent from countries where regulatory protections are weaker than in the EU. What does this reveal about the relationship between labor regulation and surveillance intensity?
- The facial recognition component of WorkSmart — photographs of workers' faces every ten minutes — raises concerns that go beyond standard productivity monitoring. What specific harms could this practice create beyond the privacy concerns discussed in the main chapter?