Case Study 2.1: The Open Campus — Surveillance Architecture at an American University
Overview
This case study examines how a mid-sized American research university manages its surveillance infrastructure — from physical cameras to digital monitoring to behavioral tracking systems — and how this infrastructure reflects and reproduces panoptic dynamics in an institution ostensibly committed to academic freedom and the free exchange of ideas.
The case draws on documented practices at multiple universities, investigative reporting on campus surveillance, and published policies. It is representative of conditions at institutions similar to Hartwell University, Jordan's fictional home campus.
Estimated Reading and Analysis Time: 75–90 minutes
Background: The University as Panoptic Institution
Universities have long been sites of surveillance. The lecture hall concentrates students before a professor who can observe all of them. The examination room enforces silence and individual effort under proctorial gaze. The attendance register marks presence and absence. The grade transcript provides a permanent record of academic performance available to future employers, graduate schools, and professional licensing bodies.
These traditional surveillance mechanisms are familiar enough to have become invisible — normalized into the background of academic life in exactly the way Chapter 1 described for surveillance more broadly.
What has changed in the past two decades is the digital intensification and extension of these traditional mechanisms. Universities now possess surveillance capabilities that would have been technically impossible a generation ago, and they are deploying them in ways that raise significant questions about academic freedom, student privacy, and the chilling effect in educational contexts.
The Physical Layer: Cameras, Access Cards, and Presence Detection
Most American universities now operate extensive closed-circuit television networks. A large research university may have several thousand cameras across its campus, monitoring building entrances, corridors, parking lots, recreation facilities, and dining halls. These cameras are typically operated by campus security or a contracted security firm, with footage retained for varying periods (often 30–90 days) and available to campus security, university administration, and — with appropriate process — law enforcement.
Access control systems using electronic key cards or student ID cards track entry and exit from residence halls, academic buildings, libraries, and recreational facilities. A student's ID card generates a data trail of their physical movements through campus — every door they badge through, timestamped, logged in a central system.
At some universities, this access data is used for purposes beyond security. Reported uses include:

- Confirming that athletes attended mandatory study hall sessions
- Verifying that students on academic probation attended tutoring appointments
- Investigating complaints of policy violations
Passive presence detection represents a newer development. Several universities have piloted systems in which students' mobile phones are detected by Bluetooth or Wi-Fi sensors to record attendance in large lecture courses — eliminating the traditional attendance sheet while dramatically increasing the granularity of attendance data. SpotterEDU and Degree Analytics are two commercial providers of such systems.
The SpotterEDU Case
SpotterEDU's "Course Attendance" system was adopted by several universities beginning in 2018. The system uses Bluetooth technology embedded in the student's smartphone to detect presence within a classroom's sensor range, recording timestamps accurate to the minute.
A Washington Post investigation in 2019 found that SpotterEDU provided this attendance data to academic advisors and coaches — but also that some institutions were unaware of the full extent of what the system logged, and that the privacy policies disclosed to students were inadequate. Students who downloaded the app (sometimes as a requirement of enrollment in a class or degree program) had, in the app's terms of service, consented to location tracking that extended beyond classroom hours.
The Digital Layer: Learning Management System Analytics
The university's learning management system (LMS) — Canvas, Blackboard, Moodle, or their equivalents — is perhaps the least visible and most extensive surveillance system in students' academic lives.
When a student logs into Canvas, the system records:

- Login time and session duration
- Every page accessed, with time spent on each
- When assignments are opened (not just submitted)
- Quiz attempts, including incorrect answers and time per question
- Discussion board activity, including posts read as well as posts written
- File download times
This data is available to instructors and, in many systems, to academic advisors and administrators with appropriate access. Sophisticated analytics dashboards present this data in visual form: engagement "heat maps" showing when students are active, performance predictors that flag students at risk of course failure, and comparative data showing how a given student's engagement compares to class averages.
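A minimal sketch of this kind of clickstream, and the per-student summary a dashboard might compute from it, looks like the following. The field names and event types are invented for illustration; they are not Canvas's actual schema.

```python
from collections import Counter
from datetime import datetime

# Hypothetical raw event log for one student session.
events = [
    {"user": "jordan", "ts": datetime(2024, 9, 3, 21, 0), "action": "login", "target": None},
    {"user": "jordan", "ts": datetime(2024, 9, 3, 21, 1), "action": "view", "target": "syllabus"},
    {"user": "jordan", "ts": datetime(2024, 9, 3, 21, 4), "action": "open_assignment", "target": "essay-1"},
    {"user": "jordan", "ts": datetime(2024, 9, 3, 21, 40), "action": "submit", "target": "essay-1"},
]

def session_summary(events):
    """Aggregate raw events into the per-student metrics an
    analytics dashboard might display to an instructor or advisor."""
    ts = [e["ts"] for e in events]
    actions = Counter(e["action"] for e in events)
    return {
        "session_minutes": (max(ts) - min(ts)).seconds // 60,
        "pages_viewed": actions["view"],
        "assignments_opened": actions["open_assignment"],
        "assignments_submitted": actions["submit"],
    }

print(session_summary(events))
# {'session_minutes': 40, 'pages_viewed': 1, 'assignments_opened': 1, 'assignments_submitted': 1}
```

The point of the sketch is the asymmetry it makes visible: the student experiences one evening of coursework; the system retains a timestamped record of every step, indefinitely queryable by anyone with dashboard access.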
Predictive Analytics and the Early Alert System
Many universities now operate "early alert" or "early warning" systems that use LMS data, grade data, attendance data, and sometimes demographic data to generate risk scores for individual students — predictions of which students are likely to fail or drop out.
The stated purpose is benevolent: identify at-risk students early so advisors can intervene with support. The implementation raises panoptic concerns:
- Visibility asymmetry: Students generate the data through their academic behavior; they typically cannot see their own risk scores or the algorithm that generates them.
- Individuation: Each student receives an individualized risk score, creating a classified population sorted by algorithmic prediction.
- Normalization: Students who engage more frequently with the LMS, submit work early rather than at the deadline, and complete optional practice activities generate behavioral profiles that score as lower-risk. This produces pressure toward a particular style of academic engagement regardless of whether that style is actually associated with learning.
- Function creep: Data collected for one purpose (grade assessment) is repurposed for another (dropout prediction, advisor intervention, possibly future policy decisions about which students to admit).
The External Dimension: University Surveillance and the Police
University surveillance systems do not exist in isolation from external law enforcement. Campus police departments — which exist at most American four-year universities — have access to campus camera systems and, depending on jurisdiction and institutional policy, may share data with municipal, county, and federal agencies.
Documented cases at other universities include:

- Campus police sharing student protest footage with municipal police departments investigating demonstrations
- University network monitoring data provided to FBI investigations of students for suspected terrorism-related activities
- Social media monitoring of student activist groups by campus security offices
The public-private surveillance partnership that Chapter 1 analyzed in the Ring doorbell context operates here too: the university's commercial surveillance infrastructure (LMS data, access card data, camera footage) becomes potentially available to state surveillance actors through legal process or voluntary cooperation.
Applying the Panopticon Framework
Bentham's Design at the Institutional Scale
Bentham proposed the panopticon for "instructing the willing" as readily as for imprisoning the incorrigible. The university campus realizes Bentham's ambition in distributed form: rather than a single circular building, it is a network of overlapping surveillance systems — cameras, access cards, LMS analytics, attendance trackers — each covering a different dimension of student life and behavior.
The integration of these systems means, in principle, that a university administrator with appropriate access could reconstruct a given student's day with considerable precision: when they woke (badge out of dorm), where they went (badge and camera), whether they attended class (attendance tracker), how long they studied (LMS session data), what they studied and when (page-level LMS analytics), and whether they socialized (badge data at recreation facilities).
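The reconstruction described above is, technically, just a merge of independently collected, timestamp-sorted event streams. The sketch below is hypothetical: the streams, locations, and event labels are invented, and real systems would sit in separate databases rather than Python lists, but the join is exactly this simple.

```python
import heapq
from datetime import datetime

# Hypothetical event streams from three separate campus systems.
badge_events = [
    (datetime(2024, 9, 3, 8, 12), "badge", "dorm exit"),
    (datetime(2024, 9, 3, 8, 31), "badge", "library entry"),
    (datetime(2024, 9, 3, 18, 2), "badge", "rec center entry"),
]
lms_events = [
    (datetime(2024, 9, 3, 9, 5), "lms", "opened reading: ch. 2"),
    (datetime(2024, 9, 3, 9, 57), "lms", "logout"),
]
attendance_events = [
    (datetime(2024, 9, 3, 10, 1), "attendance", "SOC 210 detected present"),
]

def reconstruct_day(*streams):
    """Merge timestamp-sorted event streams from separate systems
    into one chronological timeline for a single student."""
    return list(heapq.merge(*streams))

for ts, system, detail in reconstruct_day(badge_events, lms_events, attendance_events):
    print(ts.strftime("%H:%M"), system, detail)
```

No single system was designed to produce this timeline; it emerges from the mere co-existence of the systems plus an administrator's query access — which is the sense in which the integration, not any one component, constitutes the panoptic capability.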
The Normalizing Gaze in the LMS
The most significant panoptic effect in the university context may be the LMS analytics system's normalizing gaze — and specifically the way it defines and rewards a particular style of academic engagement.
The student who logs in to the LMS at predictable times, opens readings before class, completes all practice activities, and submits assignments early will generate a behavioral profile that early alert algorithms treat as low-risk and high-engagement. The student who works differently — binge-reading large amounts of material in concentrated bursts, submitting work at the deadline rather than early, preferring non-digital research — will generate a profile that looks like disengagement.
This is not a minor technical point. It is a mechanism by which a surveillance system encodes a specific behavioral norm — the digitally regularized, continuously engaged student — as the model of the successful student, and flags deviations from that norm as risk. Students who learn in different ways, or who have legitimate reasons for irregular engagement patterns (employment, caregiving responsibilities, disability), may be flagged as at-risk not because they are struggling academically but because they do not fit the algorithm's normal.
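The mechanism can be made concrete with a toy linear model. Everything here is invented for illustration — real early-alert vendors do not publish their models — but it shows how two students with identical grades can receive very different risk classifications purely on behavioral style.

```python
# Toy risk model: weights, baseline, and threshold are fabricated
# for illustration; they are not any vendor's actual parameters.
WEIGHTS = {
    "logins_per_week": -0.3,       # frequent logins lower the score
    "days_before_deadline": -0.5,  # early submission lowers the score
    "optional_activities": -0.4,   # practice work lowers the score
}
BASELINE = 5.0
THRESHOLD = 2.5  # scores above this are flagged "at risk"

def risk_score(profile):
    return BASELINE + sum(WEIGHTS[k] * v for k, v in profile.items())

# Both students earn the same grades; only their behavior differs.
steady = {"logins_per_week": 6, "days_before_deadline": 3, "optional_activities": 4}
burst = {"logins_per_week": 2, "days_before_deadline": 0, "optional_activities": 0}

print(risk_score(steady))              # ~0.1, well below threshold: not flagged
print(risk_score(burst))               # ~4.4, above threshold: flagged
print(risk_score(burst) > THRESHOLD)   # True
```

Note that academic performance appears nowhere in the feature set: the model scores conformity to an engagement style, and the "burst" student — who may be employed, a caregiver, or simply someone who reads in long concentrated sessions — is flagged for deviating from it.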
Jordan's Experience
Jordan checks their LMS every day. They have noticed — with a kind of low-grade unease they could not previously name — that their assignment-opening timestamps feel like evidence. When Jordan opens an assignment the day before the deadline rather than the day it was posted, there is a sense of being seen doing something slightly wrong, even though no rule has been violated.
After reading Chapter 2, Jordan recognizes what this unease is. The LMS's logging architecture has produced a normalizing gaze that Jordan has internalized — they are not just doing their coursework, they are performing their coursework for an invisible evaluator, and performing it in the style that the system's underlying model rewards.
This recognition is not comfortable. But it is the beginning of a more critical relationship with the institutions Jordan inhabits — which is, ultimately, what the course Dr. Osei is teaching them aims to produce.
Discussion Questions
- Legitimacy and Purpose: Universities justify their surveillance systems on the grounds that they help students succeed — early alerts prevent dropouts, attendance tracking ensures students benefit from instruction. How much of this justification do you find convincing? Does the purpose change your assessment of the surveillance's legitimacy?
- Consent and Alternatives: Students at most universities "consent" to campus surveillance as a condition of enrollment. Analyze this consent using the "consent as fiction" framework from Chapter 1. At what points does the consent become most fictional? What would genuine informed consent look like in this context?
- The Normalizing Gaze: The case study argues that LMS analytics define a specific style of academic engagement as normal and flag deviations as risk. Who is most likely to be flagged by this system — what kinds of students have engagement patterns that look "risky" by algorithm but may reflect legitimate differences in learning style, life circumstances, or prior educational experience?
- Panopticism vs. Support: The early alert system is designed to identify students who need help before they fail. Can the same system serve both a panoptic disciplinary function (classifying students into risk categories) and a genuine supportive function (connecting struggling students with resources)? Are these functions compatible or contradictory?
- The Police Dimension: The case study notes that university surveillance data can flow to law enforcement. How does this possibility affect your analysis of the university's surveillance? Does it change the character of the surveillance from pastoral oversight to something more concerning?
- Student Agency: What responses are available to students who object to campus surveillance? Consider both individual-level responses (opting out, using different devices, adjusting behavior) and collective-level responses (organizing, policy advocacy, public disclosure). Which are more likely to be effective, and why?
- Design Alternative: If you were tasked with maintaining student academic support services (identifying students at risk of dropping out, connecting them with resources) without the panoptic surveillance architecture described, how would you do it? What information would you collect, from whom, with what consent process, and with what limits on use?
Extension Research
Look up the privacy policy and acceptable use policy for the learning management system used at your institution (or, if you prefer, at a publicly documented example such as the University of California system's Canvas implementation).
- What data does the system collect about student activity?
- Who has access to that data?
- For how long is data retained?
- Under what circumstances is data shared with parties outside the institution?
- What rights do students have to access, correct, or delete their own data?
Write a 300-word assessment of whether these policies adequately address the panoptic concerns raised in this case study.
Chapter 2 | Case Study 2.1 | Part 1: Foundations | The Architecture of Surveillance