Case Study: Uber Drivers and Data Asymmetry

"They call us partners, but a partner has access to the books." — Uber driver, Chicago, quoted in Sofia Reyes's DataRights Alliance investigation

Overview

Uber Technologies, Inc., is one of the world's most valuable technology companies, operating in over 10,000 cities across 72 countries and facilitating approximately 28 million rides per day as of 2024. The company has also become the paradigmatic example of how data asymmetry structures power in the gig economy.

Uber drivers generate enormous quantities of data through their labor: GPS traces, ride completion records, customer ratings, acceptance patterns, driving speed, braking behavior, route choices, and idle time. This data is used to train algorithms, optimize operations, set prices, and evaluate drivers. Yet drivers have virtually no access to the data they produce, no understanding of the algorithms that manage them, and no meaningful ability to contest the automated decisions that determine their income.

This case study examines the data asymmetry at the heart of the Uber driver-platform relationship, its consequences for drivers, and the emerging legal and activist responses — connecting directly to Sofia Reyes's investigation in Section 33.7 and the broader framework of data rights as labor rights.

Skills Applied:

  • Analyzing data asymmetry across multiple dimensions
  • Evaluating the Consent Fiction in platform labor
  • Connecting data governance to worker economic outcomes
  • Assessing emerging legal and collective responses


The Situation

The Data Uber Collects

Uber's driver app collects a comprehensive set of data about every aspect of driving activity:

Location data. GPS tracks the driver's location continuously while the app is active — not just during rides but during the waiting and searching periods between rides. This data feeds Uber's demand prediction and surge pricing algorithms.

Trip data. For each ride: pickup location, dropoff location, route taken, distance, duration, traffic conditions, and fare. Uber uses this data to optimize pricing, predict demand, and evaluate driver performance.

Behavioral data. Uber's system collects data on driving behavior — speed, acceleration, braking, phone usage — ostensibly for safety purposes. This data feeds Uber's "Safety Score" and can be used to identify drivers for deactivation.

Rating data. After each ride, passengers rate the driver on a five-star scale and may leave written comments. These ratings are aggregated into the driver's overall rating, which determines future ride allocation and eligibility to continue driving.

Acceptance and cancellation data. The system records which ride requests a driver accepts, rejects, or cancels, and their response times. Drivers with low acceptance rates may receive fewer ride offers or be penalized through reduced algorithmic visibility.

Earnings data. Uber records every fare, tip, bonus, quest completion, and deduction for each driver — data that is essential for understanding the economics of driving for the platform.

What Drivers Do Not Know

Despite producing all of this data, drivers operate with minimal information about the system that manages them:

They do not know the fare before accepting a ride (in most markets). Uber shows a destination estimate and an approximate earnings range, but the actual fare is calculated after the trip. Drivers make the decision to accept or reject a ride without the information most relevant to that decision.

They do not know how the algorithm allocates rides. When multiple drivers are available near a pickup location, how does the algorithm choose which driver receives the request? Is it purely proximity-based? Does the driver's rating, acceptance rate, or behavioral data influence allocation? Uber has not fully disclosed this logic.

They do not know how surge pricing is calculated. Uber's dynamic pricing algorithm adjusts fares based on real-time supply and demand. Drivers can see that surge pricing is active in a given area, but they do not know the formula, the inputs, or how the surge multiplier translates to their pay (Uber's take rate may also vary).

They do not know the deactivation threshold. Drivers with ratings below a certain threshold are deactivated — effectively fired. But the specific threshold may vary by market, and drivers receive no advance notice of how close they are to the line.

They cannot see aggregate data. A driver cannot compare their earnings, rating, or ride allocation to those of other drivers. Without aggregate data, patterns of discrimination or algorithmic bias are invisible at the individual level.

The Rating System: Power Without Accountability

Uber's five-star rating system deserves particular attention because it illustrates data asymmetry in its most personal form.

How it works. After each ride, the passenger rates the driver from one to five stars. Drivers cannot see individual ratings — only their rolling average. They cannot see who rated them or when. They cannot respond to or contest a specific rating.

Why it matters. Ratings determine a driver's future on the platform. Below a market-specific threshold (typically around 4.6 out of 5.0), drivers face deactivation warnings and eventually lose access to the platform. In a system where 4.6 out of 5.0 constitutes failing, the margin for error is extraordinarily thin.
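The arithmetic behind that thin margin is easy to make concrete. The sketch below is illustrative only: Uber's actual aggregation window and market-specific thresholds are not public, so the 500-trip window and the 4.6 cutoff here are assumptions.

```python
from collections import deque

DEACTIVATION_THRESHOLD = 4.6  # illustrative; actual thresholds vary by market
WINDOW = 500                  # assumed window of most recent rated trips

class DriverRating:
    """Rolling average over a driver's most recent rated trips."""

    def __init__(self, window=WINDOW):
        self.ratings = deque(maxlen=window)  # old ratings fall off the window

    def add(self, stars):
        self.ratings.append(stars)

    @property
    def average(self):
        return sum(self.ratings) / len(self.ratings)

    def at_risk(self):
        return self.average < DEACTIVATION_THRESHOLD

# Even a small share of low ratings pushes a driver toward the line:
r = DriverRating()
for _ in range(95):
    r.add(5)
for _ in range(5):
    r.add(1)          # 5% one-star ratings
print(round(r.average, 2))  # → 4.8

# Ten one-star ratings out of 100 would drop the average to exactly 4.6:
# under these assumptions, roughly one one-star per ten trips is the line.
```

Under these assumptions, a driver with 95% perfect scores still loses 0.2 stars to a handful of one-star ratings; roughly one one-star rating in ten trips is enough to reach the deactivation line.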

The bias problem. Research has documented racial bias in ride-hailing. In a field experiment, Ge et al. (2020) found that Black passengers faced longer wait times and higher cancellation rates than white passengers, and scholars have raised parallel concerns that customer ratings of drivers absorb the same biases. If the deactivation algorithm uses ratings as a primary input, racial bias in customer evaluations translates directly into racially disparate deactivation rates. The algorithm does not create the bias; customers do. But the algorithm encodes and amplifies that bias into consequential decisions.

The asymmetry. Passengers rate drivers, but the consequences flow in one direction. A low rating from a passenger costs the driver access to future work. A low rating from a driver to a passenger may affect the passenger's ability to get a ride — but passengers face no equivalent threat to their livelihood. The rating system creates a surveillance relationship that is structurally asymmetric.


Analysis Through Chapter Frameworks

The Five Dimensions of Data Asymmetry

Sofia Reyes's five-dimensional framework (Section 33.3.2) maps precisely onto the Uber driver experience:

| Dimension | What Uber Knows | What the Driver Knows |
|-----------|-----------------|-----------------------|
| Earnings  | Every driver's earnings, market trends, revenue splits, take rates | Own earnings only; cannot compare or benchmark |
| Ratings   | All ratings, rater identity, patterns, anomalies | Own average only; no individual ratings, no rater identity |
| Algorithm | Full logic for ride allocation, pricing, surge, deactivation | Outputs only; no access to logic, parameters, or thresholds |
| Market    | Real-time supply/demand across the entire market | Own local area, and only what Uber chooses to display |
| Behavior  | GPS traces, speed, braking, idle time, route choices | Own behavior, in part; how it is scored is opaque |

This asymmetry means that Uber and the driver are in fundamentally unequal positions in every negotiation, every dispute, and every decision. Uber has the complete picture; the driver has fragments. This is not a market between equals — it is a managed relationship in which one party holds all the information.

The Consent Fiction

Uber drivers "consent" to the platform's data practices by accepting the terms of service. But the Consent Fiction operates at multiple levels:

No negotiation. The terms are non-negotiable. Drivers accept them or do not drive.

No comprehension. Uber's terms of service exceed 9,000 words. Sofia Reyes's investigation found that no driver she interviewed had read them. The terms are written in legal language that assumes expertise drivers do not have.

No meaningful alternative. In many markets, Uber and Lyft dominate ride-hailing. A driver who objects to Uber's terms may find that Lyft's are functionally identical. The "choice" between platforms is a choice between similar data regimes.

Dynamic terms. Uber changes its algorithms, pricing, and policies regularly — without driver input or, in many cases, advance notice. "Consent" given at signup does not cover future modifications that the driver cannot predict or evaluate.

Economic coercion. For drivers who depend on Uber as primary income, rejecting the terms means losing their livelihood. This economic dependency transforms "consent" from a voluntary choice into an economic necessity.

Algorithmic Wage Discrimination

Veena Dubal's research (Section 33.3.3) documented personalized pay — different rates offered to different drivers for comparable work, based on the platform's prediction of what each driver will accept.

The mechanism works because the data asymmetry is total. Drivers cannot see what other drivers are paid, cannot access the algorithm determining their pay, and cannot verify whether differences reflect legitimate variables (traffic, demand) or behavioral predictions (this driver tends to accept lower rates). The platform has the data to personalize; the driver lacks the data to detect personalization.
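Dubal's detectability point can be made concrete: an individual driver cannot see personalization, but pooled data makes it visible as rate dispersion on comparable trips. A minimal sketch, using hypothetical record fields and invented rates (no real Uber data):

```python
from statistics import mean, pstdev

# Hypothetical pooled offers for the *same* trip profile
# (pickup zone, dropoff zone, time block). Field names and
# per-mile rates are invented for illustration.
offers = {
    ("Airport", "Downtown", "Fri 17:00"): {
        "driver_A": 1.05,   # offered $/mile
        "driver_B": 1.05,
        "driver_C": 0.82,   # same trip profile, lower offer
    },
}

def dispersion_report(pooled):
    """Flag trip profiles where comparable work draws unequal offers."""
    report = {}
    for profile, by_driver in pooled.items():
        rates = list(by_driver.values())
        report[profile] = {
            "mean": round(mean(rates), 3),     # average offered rate
            "spread": round(pstdev(rates), 3), # dispersion across drivers
        }
    return report

report = dispersion_report(offers)
# A nonzero spread on an identical trip profile is the signal an
# individual driver can never see from their own screen alone.
```

The point of the sketch is structural: the statistic is trivial, but computing it requires exactly the comparative data the platform withholds.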

This practice raises fundamental questions about fair compensation. If an employer paid two employees different wages for the same work based on a prediction of what each would accept, it would violate equal pay principles in most jurisdictions. The classification of drivers as independent contractors, combined with algorithmic opacity, shields the same practice from legal challenge.


Emerging Responses

Worker classification. Courts and regulators in multiple jurisdictions have challenged Uber's classification of drivers as independent contractors:

  • The UK Supreme Court ruled in Uber BV v. Aslam (2021) that Uber drivers are "workers" (a category between employee and contractor in UK law), entitled to minimum wage, holiday pay, and other protections.
  • California's AB5 (2019) established a three-part test for independent contractor classification that would have reclassified most gig workers as employees — but Proposition 22 (2020), backed by $200 million in platform spending, created a carve-out exempting gig companies.
  • The EU's Platform Work Directive (proposed 2021, adopted 2024) establishes a presumption of employment for platform workers, placing the burden on platforms to prove independent contractor status.

Data access. Drivers in the EU have used GDPR data subject access requests to obtain data Uber holds about them. In the Netherlands, drivers obtained their "fraud scores" (Uber's internal assessment of the likelihood that a driver was engaged in fraudulent behavior) through GDPR requests — revealing a surveillance system drivers had not known existed. A court in Amsterdam subsequently ordered Uber to disclose its algorithms for fraud detection and ride allocation.

Collective Action

Worker data cooperatives. Sofia Reyes's DataRights Alliance partnered with gig worker organizations in Chicago to pilot a worker data cooperative — a collective structure through which drivers pooled their individual earnings data to create an aggregate dataset. For the first time, drivers could compare their per-mile earnings by time, location, and platform. The aggregate data revealed patterns invisible at the individual level: systematic differences in per-mile earnings between neighborhoods, consistent with the digital redlining patterns examined in Chapter 32.
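The cooperative's core computation is simple once the data is pooled. A minimal sketch of the aggregation, using hypothetical trip records and field names (real platform data exports vary):

```python
from collections import defaultdict

# Hypothetical pooled trip records contributed by cooperative members.
# Neighborhoods, fares, and mileages are invented for illustration.
trips = [
    {"driver": "A", "neighborhood": "Loop",       "fare": 18.0, "miles": 6.0},
    {"driver": "A", "neighborhood": "South Side", "fare": 9.0,  "miles": 5.0},
    {"driver": "B", "neighborhood": "Loop",       "fare": 21.0, "miles": 7.0},
    {"driver": "B", "neighborhood": "South Side", "fare": 8.5,  "miles": 5.0},
]

def per_mile_by_area(records):
    """Aggregate pooled earnings into a per-mile rate for each pickup area."""
    fares = defaultdict(float)
    miles = defaultdict(float)
    for t in records:
        fares[t["neighborhood"]] += t["fare"]
        miles[t["neighborhood"]] += t["miles"]
    return {area: fares[area] / miles[area] for area in fares}

rates = per_mile_by_area(trips)
# Loop: (18 + 21) / (6 + 7) = 3.00 per mile
# South Side: (9 + 8.5) / (5 + 5) = 1.75 per mile
```

No single driver's records can show this gap; it emerges only from the pooled dataset, which is why the aggregation itself, and not any sophisticated analysis, is the cooperative's central contribution.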

Driver-led research. Organizations like Rideshare Drivers United (California) and the Independent Drivers Guild (New York) have conducted driver surveys, documented working conditions, and used the resulting data to advocate for policy changes. This is counter-data in action — communities producing the data that platforms and governments fail to provide.

The Amsterdam Algorithm Cases. In a series of rulings between 2021 and 2023, Dutch courts required Uber and other gig platforms to disclose algorithmic decision-making processes to drivers under the GDPR. These cases established a legal precedent: workers have a right to understand the automated systems that manage them, and "trade secret" claims cannot override fundamental data protection rights.


Connection to Chapter Themes

The Power Asymmetry

The Uber-driver relationship is a textbook case of the Power Asymmetry: one party (the platform) holds virtually all the data, controls the algorithm, sets the terms, and determines the consequences — while the other party (the driver) generates the data, experiences the algorithm's outputs, accepts the terms under economic pressure, and bears the consequences with no recourse.

The Accountability Gap

When a driver is deactivated, who is accountable? The algorithm made the decision, but the algorithm has no identity, no phone number, no office. The company designed the algorithm, but the company maintains that the deactivation was an objective algorithmic output, not a subjective human decision. The customer whose biased rating triggered the deactivation is anonymous and faces no consequences. The Accountability Gap is not a feature of negligent governance — it is a structural feature of algorithmic management.

Data Rights as Labor Rights

The Uber case demonstrates why data governance is inseparable from labor relations. Every aspect of the driver's working life — what they earn, how they are evaluated, whether they keep their job — is determined by data and algorithms. You cannot address the labor issues (fair pay, job security, safe conditions) without addressing the data issues (access, transparency, contestability). Data rights are labor rights in the algorithmic workplace.


Discussion Questions

  1. The classification question. Should Uber drivers be classified as employees, independent contractors, or a new category? How does data asymmetry inform your answer? Would reclassification as employees solve the data asymmetry problem, or would it persist regardless of classification?

  2. The rating system. Uber's rating system turns customers into unpaid, unaccountable managers. Design an alternative evaluation system that maintains quality accountability without exposing drivers to racially biased customer assessments and without creating the one-directional surveillance dynamic of the current system.

  3. The Amsterdam precedent. Dutch courts have required Uber to disclose its algorithms to drivers. Evaluate this approach: Is full algorithmic disclosure a workable solution? What are the risks (gaming, competitive harm, complexity)? Is there a middle ground between full disclosure and full opacity?

  4. The collective data solution. Sofia Reyes's worker data cooperative in Chicago enabled drivers to compare earnings for the first time. If scaled nationally, how would access to aggregate data change the power dynamics between platforms and workers? Would platforms resist? How?


Your Turn: Mini-Project

Option A: Data Access Experiment. If you use a gig platform as a worker (or know someone who does), submit a data access request under the CCPA, GDPR, or equivalent law. Document: (1) the request process, (2) the response time, (3) the format and completeness of the data provided, (4) what was missing or unusable, and (5) what you learned about the platform's data practices. Write a one-page report.

Option B: Comparative Rating Analysis. Research the rating systems used by at least three gig platforms (Uber, DoorDash, Airbnb, Fiverr, Upwork). Compare: (1) who rates whom, (2) what consequences ratings have, (3) whether ratings can be contested, (4) what data about ratings workers can access, and (5) what evidence exists about bias in each system. Write a two-page comparative analysis.

Option C: Worker Data Rights Legislation. Draft a model statute establishing data rights for gig workers. Your statute should include: (1) a right to access all data collected through the platform, (2) a right to explanation of algorithmic decisions, (3) a right to contest automated deactivation, (4) a right to aggregate data access for collective bargaining, and (5) enforcement mechanisms. Write a two-page legislative proposal with explanatory notes.


References

  • Dubal, Veena. "On Algorithmic Wage Discrimination." Columbia Law Review 123, no. 7 (2023): 1929-1992.

  • Ge, Yanbo, et al. "Racial Discrimination in Transportation Network Companies." Journal of Public Economics 190 (2020): 104205.

  • Rosenblat, Alex. Uberland: How Algorithms Are Rewriting the Rules of Work. Berkeley: University of California Press, 2018.

  • Rosenblat, Alex, and Luke Stark. "Algorithmic Labor and Information Asymmetries: A Case Study of Uber's Drivers." International Journal of Communication 10 (2016): 3758-3784.

  • Uber BV v. Aslam [2021] UKSC 5 (UK Supreme Court).

  • Court of Amsterdam, Cases C/13/687315 and C/13/692003 (Uber algorithmic disclosure), 2021-2023.

  • California Assembly Bill 5 (AB5), 2019. California Proposition 22, 2020.

  • European Commission. "Proposal for a Directive on Improving Working Conditions in Platform Work." COM(2021) 762, December 2021.