Case Study: From Classroom to Career — Pathways in Data Governance

"There is no single path to data governance. There are only paths — and each one starts with a decision to take responsibility." — Dr. Adeyemi, final office hours

Overview

The field of data governance, ethics, and privacy is young, rapidly growing, and remarkably diverse. It draws on computer science, law, philosophy, sociology, political science, public policy, and community organizing. It operates in corporations, government agencies, nonprofits, academic institutions, and community organizations. There is no single credential, no standard career trajectory, and no established hierarchy.

This case study profiles five professionals who entered data governance through different pathways, each illustrating a different relationship between the knowledge gained in a course like this one and the work of data governance in practice.

Skills Applied:
  - Mapping career pathways to course content
  - Understanding how different professional roles address different governance challenges
  - Connecting academic preparation to professional practice
  - Developing a personal career strategy in data governance


Five Pathways

Pathway 1: The Data Protection Officer

Profile: Elena Vasquez

Elena studied law and worked for three years as a corporate attorney before pivoting to data protection. She obtained the IAPP's Certified Information Privacy Professional (CIPP/E) credential and took a role as Data Protection Officer (DPO) at a mid-size health technology company.

What the job involves: Elena is responsible for ensuring the company's compliance with the GDPR, HIPAA, and other applicable data protection regulations. She reviews data processing activities, conducts data protection impact assessments (DPIAs), advises product teams on privacy-by-design, responds to data subject access requests, and serves as the contact point for data protection authorities.

Connection to the course: Elena's daily work draws on Chapters 7-12 (privacy foundations), Chapters 20-25 (regulatory landscape), and Chapters 26-28 (corporate governance). She uses the concept of privacy by design (Chapter 10) every time she reviews a new product feature. She applies GDPR provisions (Chapter 20) to every data processing decision. And she navigates the tension between compliance and genuine ethical practice (Chapter 26) constantly.

The hardest part: "Compliance is not ethics. I can make the company legally compliant, but that doesn't mean we're doing the right thing. The law sets a floor, not a ceiling. My constant challenge is pushing the organization to go beyond compliance — to ask not just 'Is this legal?' but 'Is this right?' And doing that while keeping the respect of the engineers and the trust of the board is a daily balancing act."

Elena's advice: "Get the credential — CIPP, CIPM, or equivalent. It gets you in the door. But the credential teaches you the law. This course teaches you the ethics. You need both. The DPO who only knows the law will make the company compliant. The DPO who knows the ethics will make the company trustworthy."


Pathway 2: The Algorithmic Auditor

Profile: James Okafor

James studied statistics and machine learning, then worked as a data scientist for two years at a financial services firm. He became interested in algorithmic fairness after discovering that a model he helped build was producing disparate outcomes across racial groups. He transitioned to an independent algorithmic auditing firm.

What the job involves: James conducts independent assessments of algorithmic systems for bias, fairness, and compliance with emerging regulations (including the EU AI Act). He works with a team of statisticians, social scientists, and legal experts. A typical audit involves:
  - accessing the algorithm and its training data
  - testing for disparate impact across protected classes
  - evaluating the fairness metrics used (and whether they're appropriate)
  - assessing transparency and explainability
  - producing a report with findings and recommendations
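The disparate impact testing described above can be made concrete with a minimal sketch. This is not James's actual audit methodology; it applies the widely used four-fifths rule (from US EEOC guidance), which flags any protected group whose favorable-outcome rate falls below 80% of the most-favored group's rate. The group names and outcome data are invented for illustration.

```python
# Minimal disparate-impact check (four-fifths rule).
# decisions: dict mapping group -> list of 0/1 outcomes (1 = favorable).

def selection_rates(decisions):
    """Favorable-outcome rate for each group."""
    return {g: sum(o) / len(o) for g, o in decisions.items()}

def disparate_impact(decisions, threshold=0.8):
    """Return each group's rate ratio vs. the most-favored group,
    plus the list of groups falling below the threshold."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    ratios = {g: r / best for g, r in rates.items()}
    flagged = [g for g, ratio in ratios.items() if ratio < threshold]
    return ratios, flagged

# Invented audit data: 8 decisions per group.
outcomes = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],   # 75% favorable
    "group_b": [1, 0, 0, 1, 0, 0, 1, 0],   # 37.5% favorable
}
ratios, flagged = disparate_impact(outcomes)
print(ratios)   # group_b's ratio is 0.5, well below 0.8
print(flagged)  # ['group_b']
```

A real audit would go well beyond this: significance testing on much larger samples, intersectional group definitions, and scrutiny of whether the outcome variable itself encodes historical bias.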

Connection to the course: James's work is directly rooted in Chapters 13-19 (algorithmic systems and AI ethics), particularly the fairness metrics of Chapter 15, the accountability frameworks of Chapter 17, and the transparency analysis of Chapter 16. He also draws on the VitraMed thread — having seen how a well-intentioned model can produce biased outcomes, he knows that good intentions are not a substitute for rigorous testing.

The hardest part: "The clients who hire us are usually the ones who already care. The companies with the worst bias problems don't commission audits. And even when we find problems, the audit report is only as powerful as the institution's willingness to act on it. I've written reports that changed systems. I've also written reports that sat in a drawer."

James's advice: "Learn the math, but don't stop at the math. The hardest conversations in algorithmic auditing aren't about statistical significance — they're about whose definition of fairness we're using, whose experience of the algorithm we're centering, and who gets to decide what 'fair enough' means. That's ethics, not statistics."
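James's point that the hard question is "whose definition of fairness we're using" can be shown in a few lines. The sketch below (invented data, illustrative only) evaluates the same hypothetical decisions against two common metrics: demographic parity (equal favorable-prediction rates across groups) and equal opportunity (equal true-positive rates among the genuinely qualified). Here the first is satisfied while the second is violated, so which group is treated "fairly" depends entirely on the metric chosen.

```python
# (y_true, y_pred) pairs per group; invented numbers for illustration.
# y_true = 1 means the person was genuinely qualified;
# y_pred = 1 means the model gave a favorable decision.
groups = {
    "group_a": [(1, 1), (1, 1), (0, 0), (0, 0)],
    "group_b": [(1, 1), (1, 0), (0, 1), (0, 0)],
}

def positive_rate(pairs):
    """Demographic parity view: fraction receiving a favorable decision."""
    return sum(yhat for _, yhat in pairs) / len(pairs)

def true_positive_rate(pairs):
    """Equal opportunity view: favorable decisions among the qualified."""
    qualified = [yhat for y, yhat in pairs if y == 1]
    return sum(qualified) / len(qualified)

for g, pairs in groups.items():
    print(g, positive_rate(pairs), true_positive_rate(pairs))
# Both groups have a 0.5 positive rate (demographic parity holds),
# but qualified members of group_b are approved only half as often
# as qualified members of group_a (equal opportunity fails).
```

Choosing between such metrics is a normative decision, not a statistical one, which is exactly the ethics-beyond-math point James makes.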


Pathway 3: The Policy Analyst

Profile: Sofia Reyes (the textbook's character, now in her career)

Sofia grew up near the US-Mexico border, studied political science and human rights, and now works at the DataRights Alliance — a civil society organization advocating for data rights.

What the job involves: Sofia analyzes technology policies for their impact on marginalized communities, drafts policy proposals, testifies before legislative committees, coordinates advocacy campaigns, and builds coalitions between community organizations, academics, and sympathetic legislators. She specializes in the intersection of surveillance, immigration, and data governance.

Connection to the course: Sofia's work draws on every part of the textbook — from the foundational concepts (Chapters 1-6) that frame data as a social system, to the surveillance analysis (Chapters 8, 36) that informs her understanding of border technology, to the Global South perspectives (Chapter 37) that connect US policy to global dynamics. She uses the four-thread diagnostic constantly: "Every policy I analyze, I ask: Who has the power? Is consent meaningful? Who is accountable? What debt is accumulating?"

The hardest part: "Policy advocacy is slow, and the other side has more money. Tech companies spend millions on lobbying. We have a staff of twelve and a budget that wouldn't cover one Big Tech lobbyist's annual salary. But we have something they don't: the stories of the people affected. A community member who testifies about being denied housing because of an algorithm — that changes a legislator's mind in ways a lobbyist's talking points never will."

Sofia's advice: "Don't wait for the perfect policy. Engage now. Submit comments during regulatory proceedings. Attend public hearings. Write op-eds. Build relationships with community organizations. Policy is shaped by who shows up, and most of the time, the people most affected by data governance don't show up — because nobody told them they could."


Pathway 4: The AI Ethics Researcher

Profile: Dr. Kwame Mensah

Kwame earned a PhD in computer science with a dissertation on fairness in natural language processing. He is now a research scientist at a university-affiliated AI ethics institute, publishing research on bias detection methods, transparency tools, and the social impacts of large language models.

What the job involves: Kwame conducts original research on AI ethics challenges, publishes in academic venues, advises policymakers and companies, teaches courses on responsible AI, and engages with public debates about technology governance. His current project examines how AI systems perform differently across languages, with implications for global equity.

Connection to the course: Kwame's research agenda spans the algorithmic chapters (13-19), the governance chapters (20-25), and the emerging technologies chapter (38). His cross-linguistic AI work connects to Chapter 37's Global South perspectives — the observation that AI systems developed in English-speaking contexts may perform poorly (or harmfully) when deployed in multilingual Global South contexts.

The hardest part: "Academic incentives don't always align with ethical impact. I'm pressured to publish in top venues, which rewards novelty over practical utility. The most impactful work I do — advising community organizations, contributing to policy processes, building open-source tools — doesn't count as much toward tenure as a paper in a prestigious conference. The academic system undervalues the work that matters most."

Kwame's advice: "If you go the academic route, choose your institution carefully. Find a place that values interdisciplinary work and real-world impact. AI ethics research that stays in the lab is a form of ethical performance — it lets researchers feel good about themselves without changing anything. The research that matters reaches the people and institutions that build, deploy, and govern AI systems."


Pathway 5: The Community Data Advocate

Profile: Eli Okonkwo (the textbook's character, two years after the course)

Eli continued his work in Detroit after graduating, turning his Community Data Governance Charter into a template used by communities in multiple cities. He works part-time at DataRights Alliance and part-time at a community development organization, building data governance capacity in neighborhoods affected by smart city technology.

What the job involves: Eli facilitates community meetings about data governance, helps neighborhoods understand what data is being collected about them, trains community members in exercising their data rights, supports the formation of community data governance councils, and advocates for policy changes at the municipal level.

Connection to the course: Eli's work is the direct application of Chapters 32 (digital divide and data justice), 37 (Global South perspectives, adapted to US urban contexts), 39 (participatory governance and data cooperatives), and 40 (the Strategic Advocate archetype). Every concept he learned in Dr. Adeyemi's class has a direct application in his work — but the application is always mediated by the specific needs, capacities, and priorities of the community he serves.

The hardest part: "This work doesn't pay well. Community organizations are under-resourced. The people who most need data governance advocates are the people least able to fund them. I've had months where I wasn't sure I could pay rent. But I've also had moments — like when Mrs. Patterson from the neighborhood association told me that for the first time, she understood what the sensors on her street were doing — that make it worth it."

Eli's advice: "Don't romanticize this work. It's not glamorous. It's slow. The victories are small. But it's the most important data governance work there is — because it puts governance where it belongs: in the hands of the people who live with the consequences."


Connecting the Pathways

These five pathways represent different points of intervention in the data governance ecosystem:

| Pathway                  | Point of Intervention | Scale                  | Primary Tools                               |
|--------------------------|-----------------------|------------------------|---------------------------------------------|
| Data Protection Officer  | Corporate compliance  | Organizational         | Law, policy, institutional design           |
| Algorithmic Auditor      | System accountability | System-level           | Statistics, fairness metrics, audit methods |
| Policy Analyst           | Regulatory framework  | National/international | Policy analysis, advocacy, coalition building |
| AI Ethics Researcher     | Knowledge production  | Disciplinary/global    | Research methods, publication, advising     |
| Community Data Advocate  | Community capacity    | Local/community        | Facilitation, organizing, education         |

No single pathway is sufficient. Data governance requires all five — and the connections between them. Elena's compliance work is more effective when informed by James's audits. James's audits are more meaningful when shaped by Sofia's policy analysis. Sofia's advocacy is more credible when backed by Kwame's research. And Kwame's research is more relevant when grounded in Eli's community knowledge.


Discussion Questions

  1. Which pathway resonates most with your current interests and skills? What specific steps could you take in the next year to begin pursuing it?

  2. The five professionals describe different "hardest parts" of their work. Which hardest part concerns you most? What could you do to prepare for it?

  3. All five professionals emphasize the importance of interdisciplinary knowledge. How does this course contribute to each pathway? Which additional knowledge or skills would each pathway require?

  4. Elena says "compliance is not ethics." James says "don't stop at the math." Sofia says "engage now." Kwame says "choose your institution carefully." Eli says "don't romanticize this work." Which piece of advice is most important for you personally, and why?

  5. Imagine these five professionals working together on a single data governance challenge — for example, the deployment of a facial recognition system in a public housing complex. How would each contribute? How would their perspectives complement or conflict with each other?


Your Next Steps

Based on the pathways described in this case study and your own interests, skills, and values:

  1. Identify the pathway (or combination of pathways) that most interests you.
  2. List three specific actions you will take in the next 6 months to move toward that pathway.
  3. Identify one person — a professional, a professor, a community leader — you will contact to learn more about the work.