Bibliography
Organized by chapter; sources within each chapter are alphabetized by first author's surname. Works cited in more than one chapter are listed under each relevant chapter.
Chapter 1: The Data All Around Us
boyd, d., & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662-679. https://doi.org/10.1080/1369118X.2012.678878
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
Gitelman, L. (Ed.). (2013). "Raw data" is an oxymoron. MIT Press.
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. SAGE Publications.
Lohr, S. (2015). Data-ism: The revolution transforming decision making, consumer behavior, and almost everything else. Harper Business.
Lupton, D. (2016). The quantified self: A sociology of self-tracking. Polity Press.
Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 2: A Brief History of Data and Society
Aly, G., & Roth, K. H. (2004). The Nazi census: Identification and control in the Third Reich (A. Blunden, Trans.). Temple University Press.
Anderson, M. J. (2015). The American census: A social history (2nd ed.). Yale University Press.
Black, E. (2001). IBM and the Holocaust: The strategic alliance between Nazi Germany and America's most powerful corporation. Crown.
Cortada, J. W. (2012). The digital flood: The diffusion of information technology across the U.S., Europe, and Asia. Oxford University Press.
Desrosières, A. (1998). The politics of large numbers: A history of statistical reasoning (C. Naish, Trans.). Harvard University Press.
Dirks, N. B. (2001). Castes of mind: Colonialism and the making of modern India. Princeton University Press.
Hacking, I. (1990). The taming of chance. Cambridge University Press.
Porter, T. M. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton University Press.
Scott, J. C. (1998). Seeing like a state: How certain schemes to improve the human condition have failed. Yale University Press.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 3: Who Owns Your Data?
Birch, K. (2020). Technoscience rent: Toward a theory of rentiership for technoscientific capitalism. Science, Technology, & Human Values, 45(1), 3-33. https://doi.org/10.1177/0162243919829567
Cohen, J. E. (2019). Between truth and power: The legal constructions of informational capitalism. Oxford University Press.
Hummel, P., Braun, M., & Dabrock, P. (2021). Own data? Ethical reflections on data ownership. Philosophy & Technology, 34(3), 545-572. https://doi.org/10.1007/s13347-020-00404-9
Lanier, J. (2013). Who owns the future? Simon & Schuster.
Lessig, L. (2006). Code: Version 2.0. Basic Books.
Litman, J. (2000). Information privacy/information property. Stanford Law Review, 52(5), 1283-1313.
Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. Cambridge University Press.
Posner, R. A. (1981). The economics of privacy. American Economic Review, 71(2), 405-409.
Skloot, R. (2010). The immortal life of Henrietta Lacks. Crown.
Varian, H. R. (2009). Economic value of Google. Presentation at American Economic Association annual meeting, San Francisco, CA.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 4: The Attention Economy
Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. Penguin Press.
Bucher, T. (2018). If...then: Algorithmic power and politics. Oxford University Press.
Citton, Y. (2017). The ecology of attention (B. Norman, Trans.). Polity Press.
Eyal, N. (2014). Hooked: How to build habit-forming products. Portfolio/Penguin.
Harris, T. (2016). How technology is hijacking your mind: From a magician and Google's design ethicist. Thrive Global. https://medium.com/thrive-global/how-technology-hijacks-peoples-minds-from-a-magician-and-google-s-design-ethicist-56d62ef5edf3
Hwang, T. (2020). Subprime attention crisis: Advertising and the time bomb at the heart of the internet. FSG Originals.
Simon, H. A. (1971). Designing organizations for an information-rich world. In M. Greenberger (Ed.), Computers, communications, and the public interest (pp. 37-72). Johns Hopkins University Press.
Williams, J. (2018). Stand out of our light: Freedom and resistance in the attention economy. Cambridge University Press.
Wu, T. (2016). The attention merchants: The epic scramble to get inside our heads. Alfred A. Knopf.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 5: Power, Knowledge, and Data
Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity Press.
Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. MIT Press.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
D'Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press.
Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Vintage Books. (Original work published 1975)
Foucault, M. (1980). Power/knowledge: Selected interviews and other writings, 1972-1977 (C. Gordon, Ed.; C. Gordon, L. Marshall, J. Mepham, & K. Soper, Trans.). Pantheon Books.
Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.
Lukes, S. (2005). Power: A radical view (2nd ed.). Palgrave Macmillan.
Mann, M. (1986). The sources of social power: Volume 1, A history of power from the beginning to AD 1760. Cambridge University Press.
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121-136.
Chapter 6: Ethical Frameworks for the Data Age
Beauchamp, T. L., & Childress, J. F. (2019). Principles of biomedical ethics (8th ed.). Oxford University Press.
Bynum, T. W. (2010). Philosophy in the information age. Metaphilosophy, 41(3), 420-442. https://doi.org/10.1111/j.1467-9973.2010.01656.x
Floridi, L. (2013). The ethics of information. Oxford University Press.
Floridi, L., & Taddeo, M. (2016). What is data ethics? Philosophical Transactions of the Royal Society A, 374(2083), 20160360. https://doi.org/10.1098/rsta.2016.0360
Friedman, B., & Hendry, D. G. (2019). Value sensitive design: Shaping technology with moral imagination. MIT Press.
Kant, I. (1998). Groundwork of the metaphysics of morals (M. Gregor, Trans.). Cambridge University Press. (Original work published 1785)
Mill, J. S. (2001). Utilitarianism (G. Sher, Ed.). Hackett Publishing Company. (Original work published 1863)
Rawls, J. (1971). A theory of justice. Harvard University Press.
Sen, A. (2009). The idea of justice. Harvard University Press.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford University Press.
Chapter 7: What Is Privacy? Definitions and Debates
Allen, A. L. (2011). Unpopular privacy: What must we hide? Oxford University Press.
DeCew, J. W. (1997). In pursuit of privacy: Law, ethics, and the rise of technology. Cornell University Press.
Gavison, R. (1980). Privacy and the limits of law. Yale Law Journal, 89(3), 421-471.
Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
Prosser, W. L. (1960). Privacy. California Law Review, 48(3), 383-423.
Solove, D. J. (2008). Understanding privacy. Harvard University Press.
Solove, D. J. (2011). Nothing to hide: The false tradeoff between privacy and security. Yale University Press.
Thomson, J. J. (1975). The right to privacy. Philosophy & Public Affairs, 4(4), 295-314.
Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193-220.
Westin, A. F. (1967). Privacy and freedom. Atheneum.
Chapter 8: Surveillance: From Panopticon to Platform
Ball, K., Haggerty, K. D., & Lyon, D. (Eds.). (2012). Routledge handbook of surveillance studies. Routledge.
Browne, S. (2015). Dark matters: On the surveillance of blackness. Duke University Press.
Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Vintage Books. (Original work published 1975)
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605-622. https://doi.org/10.1080/00071310020015280
Lyon, D. (2007). Surveillance studies: An overview. Polity Press.
Lyon, D. (2018). The culture of surveillance: Watching as a way of life. Polity Press.
Marx, G. T. (2016). Windows into the soul: Surveillance and society in an age of high technology. University of Chicago Press.
Snowden, E. (2019). Permanent record. Metropolitan Books.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 9: Data Collection and Consent
Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509-514. https://doi.org/10.1126/science.aaa1465
Barocas, S., & Nissenbaum, H. (2014). Big data's end run around anonymity and consent. In J. Lane, V. Stodden, S. Bender, & H. Nissenbaum (Eds.), Privacy, big data, and the public good (pp. 44-75). Cambridge University Press.
Cate, F. H. (2006). The failure of fair information practice principles. In J. K. Winn (Ed.), Consumer protection in the age of the "information economy" (pp. 341-378). Ashgate Publishing.
McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3), 543-568.
Obar, J. A., & Oeldorf-Hirsch, A. (2020). The biggest lie on the internet: Ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 23(1), 128-147. https://doi.org/10.1080/1369118X.2018.1486870
Richards, N. M., & Hartzog, W. (2019). The pathologies of digital consent. Washington University Law Review, 96(6), 1461-1503.
Solove, D. J. (2013). Introduction: Privacy self-management and the consent dilemma. Harvard Law Review, 126(7), 1880-1903.
Waldman, A. E. (2021). Industry unbound: The inside story of privacy, data, and corporate power. Cambridge University Press.
Chapter 10: Privacy by Design and Data Minimization
Cavoukian, A. (2009). Privacy by design: The 7 foundational principles. Information and Privacy Commissioner of Ontario.
Dwork, C. (2006). Differential privacy. In M. Bugliesi, B. Preneel, V. Sassone, & I. Wegener (Eds.), Automata, languages and programming (ICALP 2006, Lecture Notes in Computer Science, Vol. 4052, pp. 1-12). Springer. https://doi.org/10.1007/11787006_1
Dwork, C., & Roth, A. (2014). The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 9(3-4), 211-407. https://doi.org/10.1561/0400000042
Gürses, S., Troncoso, C., & Diaz, C. (2011). Engineering privacy by design. Paper presented at the Computers, Privacy & Data Protection (CPDP 2011) conference, Brussels, Belgium.
Langheinrich, M. (2001). Privacy by design – principles of privacy-aware ubiquitous systems. In G. D. Abowd, B. Brumitt, & S. Shafer (Eds.), Ubicomp 2001: Ubiquitous computing (Lecture Notes in Computer Science, Vol. 2201, pp. 273-291). Springer.
Narayanan, A., & Shmatikov, V. (2008). Robust de-anonymization of large sparse datasets. In Proceedings of the 2008 IEEE Symposium on Security and Privacy (pp. 111-125). IEEE. https://doi.org/10.1109/SP.2008.33
Rubinstein, I. S., & Good, N. (2013). Privacy by design: A counterfactual analysis of Google and Facebook privacy incidents. Berkeley Technology Law Journal, 28(2), 1333-1413.
Spiekermann, S., & Cranor, L. F. (2009). Engineering privacy. IEEE Transactions on Software Engineering, 35(1), 67-82. https://doi.org/10.1109/TSE.2008.88
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(5), 557-570. https://doi.org/10.1142/S0218488502001648
Chapter 11: The Economics of Privacy
Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the 5th ACM Conference on Electronic Commerce (pp. 21-29). ACM. https://doi.org/10.1145/988772.988777
Acquisti, A., Taylor, C., & Wagman, L. (2016). The economics of privacy. Journal of Economic Literature, 54(2), 442-492. https://doi.org/10.1257/jel.54.2.442
Laudon, K. C. (1996). Markets and privacy. Communications of the ACM, 39(9), 92-104. https://doi.org/10.1145/234215.234476
Ponemon Institute. (2023). Cost of a data breach report 2023. IBM Security.
Posner, R. A. (1981). The economics of privacy. American Economic Review, 71(2), 405-409.
Shapiro, C., & Varian, H. R. (1999). Information rules: A strategic guide to the network economy. Harvard Business School Press.
Spiekermann, S., Acquisti, A., Böhme, R., & Hui, K.-L. (2015). The challenges of personal data markets and privacy. Electronic Markets, 25(2), 161-167. https://doi.org/10.1007/s12525-015-0191-0
Stigler, G. J. (1980). An introduction to privacy in economics and politics. Journal of Legal Studies, 9(4), 623-644.
Varian, H. R. (1997). Economic aspects of personal privacy. In Privacy and self-regulation in the information age. U.S. Department of Commerce.
Chapter 12: Health Data, Genetic Data, and Biometric Privacy
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency (pp. 77-91). PMLR.
Gates, K. A. (2011). Our biometric future: Facial recognition technology and the culture of surveillance. NYU Press.
Green, R. C., & Farahany, N. A. (2014). Regulation: The FDA is overcautious on consumer genomics. Nature, 505(7483), 286-287. https://doi.org/10.1038/505286a
Gymrek, M., McGuire, A. L., Golan, D., Halperin, E., & Erlich, Y. (2013). Identifying personal genomes by surname inference. Science, 339(6117), 321-324. https://doi.org/10.1126/science.1229566
Joly, Y., Dupras, C., Pinkesz, M., Shore, S. A., & Bhatt, S. (2020). Genetic discrimination still casts a long shadow in 2020. European Journal of Human Genetics, 28(10), 1327-1328. https://doi.org/10.1038/s41431-020-0670-0
Lunshof, J. E., Chadwick, R., Vorhaus, D. B., & Church, G. M. (2008). From genetic privacy to open consent. Nature Reviews Genetics, 9(5), 406-411. https://doi.org/10.1038/nrg2360
Skloot, R. (2010). The immortal life of Henrietta Lacks. Crown.
Stark, L. (2012). Behind closed doors: IRBs and the making of ethical research. University of Chicago Press.
Terry, N. P. (2012). Protecting patient privacy in the age of big data. University of Missouri-Kansas City Law Review, 81(2), 385-415.
Wachter, S. (2020). Affinity profiling and discrimination by association in online behavioural advertising. Berkeley Technology Law Journal, 35(2), 367-430.
Chapter 13: How Algorithms Shape Society
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016, May 23). Machine bias. ProPublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1-13. https://doi.org/10.1080/1369118X.2016.1216147
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167-194). MIT Press.
Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14-29. https://doi.org/10.1080/1369118X.2016.1154087
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1-12. https://doi.org/10.1177/2053951717738104
Chapter 14: Bias in Data, Bias in Machines
Barocas, S., & Selbst, A. D. (2016). Big data's disparate impact. California Law Review, 104(3), 671-732.
Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity Press.
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency (pp. 77-91). PMLR.
Crawford, K. (2017). The trouble with bias [Conference keynote]. Neural Information Processing Systems (NeurIPS) 2017, Long Beach, CA.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems, 14(3), 330-347. https://doi.org/10.1145/230538.230561
Hanna, A., Denton, E., Smart, A., & Smith-Loud, J. (2020). Towards a critical race methodology in algorithmic fairness. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 501-512). ACM.
Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447-453. https://doi.org/10.1126/science.aax2342
Sweeney, L. (2013). Discrimination in online ad delivery. Communications of the ACM, 56(5), 44-54. https://doi.org/10.1145/2447976.2447990
Chapter 15: Fairness: Definitions, Tensions, and Trade-offs
Berk, R., Heidari, H., Jabbari, S., Kearns, M., & Roth, A. (2018). Fairness in criminal justice risk assessments: The state of the art. Sociological Methods & Research, 50(1), 3-44. https://doi.org/10.1177/0049124118782533
Binns, R. (2018). Fairness in machine learning: Lessons from political philosophy. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency (pp. 149-159). PMLR.
Chouldechova, A. (2017). Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big Data, 5(2), 153-163. https://doi.org/10.1089/big.2016.0047
Corbett-Davies, S., & Goel, S. (2018). The measure and mismeasure of fairness: A critical review of fair machine learning. arXiv preprint arXiv:1808.00023.
Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. (2012). Fairness through awareness. In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (pp. 214-226). ACM.
Hardt, M., Price, E., & Srebro, N. (2016). Equality of opportunity in supervised learning. In Advances in Neural Information Processing Systems 29 (pp. 3315-3323). NeurIPS.
Kleinberg, J., Mullainathan, S., & Raghavan, M. (2017). Inherent trade-offs in the fair determination of risk scores. In Proceedings of the 8th Innovations in Theoretical Computer Science Conference (ITCS 2017, pp. 43:1-43:23). Leibniz International Proceedings in Informatics.
Mitchell, S., Potash, E., Barocas, S., D'Amour, A., & Lum, K. (2021). Algorithmic fairness: Choices, assumptions, and definitions. Annual Review of Statistics and Its Application, 8, 141-163. https://doi.org/10.1146/annurev-statistics-042720-125902
Rawls, J. (1971). A theory of justice. Harvard University Press.
Selbst, A. D., boyd, d., Friedler, S. A., Venkatasubramanian, S., & Vertesi, J. (2019). Fairness and abstraction in sociotechnical systems. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 59-68). ACM.
Chapter 16: Transparency, Explainability, and the Black Box Problem
Burrell, J. (2016). How the machine "thinks": Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 1-12. https://doi.org/10.1177/2053951715622512
Doshi-Velez, F., & Kim, B. (2017). Towards a rigorous science of interpretable machine learning. arXiv preprint arXiv:1702.08608.
Edwards, L., & Veale, M. (2017). Slave to the algorithm? Why a "right to an explanation" is probably not the remedy you are looking for. Duke Law & Technology Review, 16(1), 18-84.
Guidotti, R., Monreale, A., Ruggieri, S., Turini, F., Giannotti, F., & Pedreschi, D. (2019). A survey of methods for explaining black box models. ACM Computing Surveys, 51(5), 1-42. https://doi.org/10.1145/3236009
Lipton, Z. C. (2018). The mythos of model interpretability. Queue, 16(3), 31-57. https://doi.org/10.1145/3236386.3241340
Lundberg, S. M., & Lee, S.-I. (2017). A unified approach to interpreting model predictions. In Advances in Neural Information Processing Systems 30 (pp. 4765-4774). NeurIPS.
Ribeiro, M. T., Singh, S., & Guestrin, C. (2016). "Why should I trust you?" Explaining the predictions of any classifier. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1135-1144). ACM.
Rudin, C. (2019). Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead. Nature Machine Intelligence, 1(5), 206-215. https://doi.org/10.1038/s42256-019-0048-x
Selbst, A. D., & Barocas, S. (2018). The intuitive appeal of explainable machines. Fordham Law Review, 87(3), 1085-1139.
Wachter, S., Mittelstadt, B., & Floridi, L. (2017). Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation. International Data Privacy Law, 7(2), 76-99. https://doi.org/10.1093/idpl/ipx005
Chapter 17: Accountability and Audit
Barocas, S., Hardt, M., & Narayanan, A. (2019). Fairness and machine learning: Limitations and opportunities. fairmlbook.org. https://fairmlbook.org/
Brown, S., Davidovic, J., & Hasan, A. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8(1), 1-8. https://doi.org/10.1177/2053951720983865
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press.
Kroll, J. A., Huey, J., Barocas, S., Felten, E. W., Reidenberg, J. R., Robinson, D. G., & Yu, H. (2017). Accountable algorithms. University of Pennsylvania Law Review, 165(3), 633-705.
Metcalf, J., Moss, E., & boyd, d. (2019). Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics. Social Research: An International Quarterly, 86(2), 449-476.
Raji, I. D., & Buolamwini, J. (2019). Actionable auditing: Investigating the impact of publicly naming biased performance results of commercial AI systems. In Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society (pp. 429-435). ACM.
Raji, I. D., Smart, A., White, R. N., Mitchell, M., Gebru, T., Hutchinson, B., Smith-Loud, J., Theron, D., & Barnes, P. (2020). Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 33-44). ACM.
Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2014). Auditing algorithms: Research methods for detecting discrimination on internet platforms. In Data and discrimination: Converting critical concerns into productive inquiry (pp. 1-23). New America Foundation.
Vecchione, B., Levy, K., & Barocas, S. (2021). Algorithmic auditing and social justice: Lessons from the history of audit studies. In Proceedings of the 1st ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (pp. 1-9). ACM.
Chapter 18: Generative AI: Ethics of Creation and Deception
Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623). ACM.
Chesney, R., & Citron, D. K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753-1820.
Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Floridi, L., & Chiriatti, M. (2020). GPT-3: Its nature, scope, limits, and consequences. Minds and Machines, 30(4), 681-694. https://doi.org/10.1007/s11023-020-09548-1
Liang, P., Bommasani, R., Lee, T., Tsipras, D., Soylu, D., Yasunaga, M., ... & Koreeda, Y. (2023). Holistic evaluation of language models. Transactions on Machine Learning Research. https://doi.org/10.48550/arXiv.2211.09110
Solaiman, I., Brundage, M., Clark, J., Askell, A., Herbert-Voss, A., Wu, J., Radford, A., & Wang, J. (2019). Release strategies and the social impacts of language models. arXiv preprint arXiv:1908.09203.
Weidinger, L., Mellor, J., Rauh, M., Griffin, C., Uesato, J., Huang, P.-S., Cheng, M., Glaese, M., Balle, B., Kasirzadeh, A., Kenton, Z., Brown, S., Hawkins, W., Stepleton, T., Biles, C., Birhane, A., Haas, J., Rimell, L., Hendricks, L. A., ... Gabriel, I. (2021). Ethical and social risks of harm from language models. arXiv preprint arXiv:2112.04359.
Whittaker, M. (2021). The steep cost of capture. Interactions, 28(6), 50-55. https://doi.org/10.1145/3488666
Chapter 19: Autonomous Systems and Moral Machines
Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., Bonnefon, J.-F., & Rahwan, I. (2018). The moral machine experiment. Nature, 563(7729), 59-64. https://doi.org/10.1038/s41586-018-0637-6
Coeckelbergh, M. (2020). AI ethics. MIT Press.
Floridi, L., & Sanders, J. W. (2004). On the morality of artificial agents. Minds and Machines, 14(3), 349-379. https://doi.org/10.1023/B:MIND.0000035461.63578.9d
Foot, P. (1967). The problem of abortion and the doctrine of the double effect. Oxford Review, 5, 5-15.
Gunkel, D. J. (2012). The machine question: Critical perspectives on AI, robots, and ethics. MIT Press.
Johnson, D. G. (2006). Computer systems: Moral entities but not moral agents. Ethics and Information Technology, 8(4), 195-204. https://doi.org/10.1007/s10676-006-9111-5
Nyholm, S. (2020). Humans and robots: Ethics, agency, and anthropomorphism. Rowman & Littlefield.
Thomson, J. J. (1985). The trolley problem. Yale Law Journal, 94(6), 1395-1415.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford University Press.
Wallach, W., & Allen, C. (2008). Moral machines: Teaching robots right from wrong. Oxford University Press.
Chapter 20: The Regulatory Landscape: A Global Survey
Bennett, C. J., & Raab, C. D. (2006). The governance of privacy: Policy instruments in global perspective (2nd ed.). MIT Press.
Bradford, A. (2020). The Brussels effect: How the European Union rules the world. Oxford University Press.
Bygrave, L. A. (2014). Data privacy law: An international perspective. Oxford University Press.
European Parliament and Council. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council (General Data Protection Regulation). Official Journal of the European Union, L119, 1-88.
Greenleaf, G. (2021). Global data privacy laws 2021: Despite COVID delays, 145 laws show GDPR dominance. Privacy Laws & Business International Report, 169, 3-5.
Hoofnagle, C. J. (2014). Federal Trade Commission privacy law and policy. Cambridge University Press.
Mattoo, A., & Meltzer, J. P. (2018). International data flows and privacy: The conflict and its resolution. Journal of International Economic Law, 21(4), 769-789. https://doi.org/10.1093/jiel/jgy044
Schwartz, P. M., & Peifer, K.-N. (2017). Transatlantic data privacy law. Georgetown Law Journal, 106(1), 115-179.
Solove, D. J., & Hartzog, W. (2014). The FTC and the new common law of privacy. Columbia Law Review, 114(3), 583-676.
Chapter 21: The EU AI Act and Risk-Based Regulation
Bradford, A. (2020). The Brussels effect: How the European Union rules the world. Oxford University Press.
Ebers, M., Hoch, V. R. S., Rosenkranz, F., Ruschemeier, H., & Steinrötter, B. (2021). The European Commission's proposal for an Artificial Intelligence Act – A critical assessment by members of the Robotics and AI Law Society (RAILS). Multidisciplinary Scientific Journal, 4(4), 589-603.
European Commission. (2021). Proposal for a regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). COM(2021) 206 final.
European Parliament and Council. (2024). Regulation (EU) 2024/1689 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). Official Journal of the European Union, L 2024/1689.
Floridi, L. (2021). The European legislation on AI: A brief analysis of its philosophical approach. Philosophy & Technology, 34(2), 215-222. https://doi.org/10.1007/s13347-021-00460-9
Malgieri, G., & Niklas, J. (2020). Vulnerable data subjects. Computer Law & Security Review, 37, 105415. https://doi.org/10.1016/j.clsr.2020.105415
Smuha, N. A. (2021). From a "race to AI" to a "race to AI regulation": Regulatory competition for artificial intelligence. Law, Innovation and Technology, 13(1), 57-84. https://doi.org/10.1080/17579961.2021.1898300
Veale, M., & Zuiderveen Borgesius, F. (2021). Demystifying the Draft EU Artificial Intelligence Act. Computer Law Review International, 22(4), 97-112. https://doi.org/10.9785/cri-2021-220402
Chapter 22: Data Governance Frameworks and Institutions
Abraham, R., Schneider, J., & vom Brocke, J. (2019). Data governance: A conceptual framework, structured review, and research agenda. International Journal of Information Management, 49, 424-438. https://doi.org/10.1016/j.ijinfomgt.2019.07.008
Alhassan, I., Sammon, D., & Daly, M. (2016). Data governance activities: An analysis of the literature. Journal of Decision Systems, 25(sup1), 64-75. https://doi.org/10.1080/12460125.2016.1187397
DAMA International. (2017). DAMA-DMBOK: Data management body of knowledge (2nd ed.). Technics Publications.
Janssen, M., Brous, P., Estevez, E., Barbosa, L. S., & Janowski, T. (2020). Data governance: Organizing data for trustworthy artificial intelligence. Government Information Quarterly, 37(3), 101493. https://doi.org/10.1016/j.giq.2020.101493
Micheli, M., Ponti, M., Craglia, M., & Berti Suman, A. (2020). Emerging models of data governance in the age of datafication. Big Data & Society, 7(2), 1-15. https://doi.org/10.1177/2053951720948087
OECD. (2019). Recommendation of the Council on Artificial Intelligence (OECD/LEGAL/0449). OECD Publishing.
Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. Cambridge University Press.
Verhulst, S. G., & Young, A. (2017). Open data in developing economies: Toward building an evidence base on what works and how. African Minds.
Weber, K., Otto, B., & Österle, H. (2009). One size does not fit all – A contingency approach to data governance. Journal of Data and Information Quality, 1(1), 1-27. https://doi.org/10.1145/1515693.1515696
Chapter 23: Cross-Border Data Flows and Digital Sovereignty
Aaronson, S. A. (2019). What are we talking about when we talk about digital protectionism? World Trade Review, 18(4), 541-577. https://doi.org/10.1017/S1474745618000198
Bauer, M., Ferracane, M. F., & van der Marel, E. (2016). Tracing the economic impact of regulations on the free flow of data and data localization. ECIPE Occasional Paper No. 02/2016. European Centre for International Political Economy.
Bradford, A. (2020). The Brussels effect: How the European Union rules the world. Oxford University Press.
Chander, A., & Le, U. P. (2015). Data nationalism. Emory Law Journal, 64(3), 677-739.
Court of Justice of the European Union. (2020). Data Protection Commissioner v. Facebook Ireland Limited, Maximillian Schrems (Case C-311/18, Schrems II).
Farrell, H., & Newman, A. L. (2019). Weaponized interdependence: How global economic networks shape state coercion. International Security, 44(1), 42-79. https://doi.org/10.1162/isec_a_00351
Kuner, C. (2013). Transborder data flows and data privacy law. Oxford University Press.
Pohle, J., & Thiel, T. (2020). Digital sovereignty. Internet Policy Review, 9(4), 1-19. https://doi.org/10.14763/2020.4.1532
Chapter 24: Sector-Specific Governance: Finance, Health, Education
Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973-989. https://doi.org/10.1177/1461444816676645
Baker, T., & Dellaert, B. (2018). Regulating robo advice across the financial services industry. Iowa Law Review, 103(2), 713-750.
Cohen, I. G., Evgeniou, T., Gerke, S., & Minssen, T. (2020). The European artificial intelligence strategy: Implications and challenges for digital health. The Lancet Digital Health, 2(7), e376-e379. https://doi.org/10.1016/S2589-7500(20)30112-6
Gerrish, S. (2018). How smart machines think. MIT Press.
Klosowski, T. (2021, March 24). The state of consumer data privacy laws in the US (and why it matters). Wirecutter, The New York Times.
Price, W. N., II, & Cohen, I. G. (2019). Privacy in the age of medical big data. Nature Medicine, 25(1), 37-43. https://doi.org/10.1038/s41591-018-0272-7
Roberts, H., Cowls, J., Morley, J., Taddeo, M., Wang, V., & Floridi, L. (2021). The Chinese approach to artificial intelligence: An analysis of policy, ethics, and regulation. AI & Society, 36(1), 59-77. https://doi.org/10.1007/s00146-020-00992-2
Williamson, B. (2017). Big data in education: The digital future of learning, policy and practice. SAGE Publications.
Ziewitz, M. (2016). Governing algorithms: Myth, mess, and methods. Science, Technology, & Human Values, 41(1), 3-16. https://doi.org/10.1177/0162243915608948
Chapter 25: Enforcement, Compliance, and the Limits of Law
Bamberger, K. A., & Mulligan, D. K. (2015). Privacy on the ground: Driving corporate behavior in the United States and Europe. MIT Press.
Bennett, C. J. (2008). The privacy advocates: Resisting the spread of surveillance. MIT Press.
Citron, D. K. (2008). Technological due process. Washington University Law Review, 85(6), 1249-1313.
Kaminski, M. E. (2019). The right to explanation, explained. Berkeley Technology Law Journal, 34(1), 189-218.
Lessig, L. (2006). Code: Version 2.0. Basic Books.
Reidenberg, J. R. (1998). Lex informatica: The formulation of information policy rules through technology. Texas Law Review, 76(3), 553-593.
Rubinstein, I. S. (2018). The future of self-regulation is co-regulation. In E. Selinger, J. Polonetsky, & O. Tene (Eds.), The Cambridge handbook of consumer privacy (pp. 503-523). Cambridge University Press.
Solove, D. J., & Hartzog, W. (2014). The FTC and the new common law of privacy. Columbia Law Review, 114(3), 583-676.
Yeung, K. (2019). A study of the implications of advanced digital technologies (including AI systems) for the concept of responsibility within a human rights framework (Council of Europe Study DGI(2019)05). Council of Europe.
Chapter 26: Building a Data Ethics Program
Hagendorff, T. (2020). The ethics of AI ethics: An evaluation of guidelines. Minds and Machines, 30(1), 99-120. https://doi.org/10.1007/s11023-020-09517-8
Jobin, A., Ienca, M., & Vayena, E. (2019). The global landscape of AI ethics guidelines. Nature Machine Intelligence, 1(9), 389-399. https://doi.org/10.1038/s42256-019-0088-2
Metcalf, J., Moss, E., & boyd, d. (2019). Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics. Social Research: An International Quarterly, 86(2), 449-476.
Mittelstadt, B. (2019). Principles alone cannot guarantee ethical AI. Nature Machine Intelligence, 1(11), 501-507. https://doi.org/10.1038/s42256-019-0114-4
Morley, J., Floridi, L., Kinsey, L., & Elhalal, A. (2020). From what to how: An initial review of publicly available AI ethics tools, methods and research to translate principles into practices. Science and Engineering Ethics, 26(4), 2141-2168. https://doi.org/10.1007/s11948-019-00165-5
Moss, E., & Metcalf, J. (2020). Ethics owners: A new model of organizational responsibility in data-driven technology companies. Data & Society Research Institute.
Raji, I. D., Smart, A., White, R. N., Mitchell, M., Gebru, T., Hutchinson, B., Smith-Loud, J., Theron, D., & Barnes, P. (2020). Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 33-44). ACM.
Selbst, A. D. (2021). An institutional view of algorithmic impact assessments. Harvard Journal of Law & Technology, 35(1), 117-191.
Whittaker, M., Crawford, K., Dobbe, R., Fried, G., Kaziunas, E., Mathur, V., West, S. M., Richardson, R., Schultz, J., & Schwartz, O. (2018). AI Now Report 2018. AI Now Institute.
Chapter 27: Data Stewardship and the Chief Data Officer
Brous, P., Janssen, M., & Vilminko-Heikkinen, R. (2016). Coordinating decision-making in data management activities: A systematic review of data governance principles. In H. J. Scholl, O. Glassey, M. Janssen, B. Klievink, I. Lindgren, P. Parycek, E. Tambouris, M. A. Wimmer, T. Janowski, & D. Sá Soares (Eds.), Electronic Government (pp. 115-125). Springer.
DAMA International. (2017). DAMA-DMBOK: Data management body of knowledge (2nd ed.). Technics Publications.
Gebru, T., Morgenstern, J., Vecchione, B., Vaughan, J. W., Wallach, H., Daumé, H., III, & Crawford, K. (2021). Datasheets for datasets. Communications of the ACM, 64(12), 86-92. https://doi.org/10.1145/3458723
Lee, Y. W., Madnick, S. E., Wang, R. Y., Wang, F., & Zhang, H. (2014). A cubic framework for the chief data officer: Succeeding in a world of big data. MIS Quarterly Executive, 13(1), 1-13.
Micheli, M., Ponti, M., Craglia, M., & Berti Suman, A. (2020). Emerging models of data governance in the age of datafication. Big Data & Society, 7(2), 1-15. https://doi.org/10.1177/2053951720948087
Mitchell, S., Potash, E., Barocas, S., D'Amour, A., & Lum, K. (2021). Algorithmic fairness: Choices, assumptions, and definitions. Annual Review of Statistics and Its Application, 8, 141-163. https://doi.org/10.1146/annurev-statistics-042720-125902
Redman, T. C. (2008). Data driven: Profiting from your most important business asset. Harvard Business Press.
Stilgoe, J., Owen, R., & Macnaghten, P. (2013). Developing a framework for responsible innovation. Research Policy, 42(9), 1568-1580. https://doi.org/10.1016/j.respol.2013.05.008
Chapter 28: Privacy Impact Assessments and Ethical Reviews
Clarke, R. (2009). Privacy impact assessment: Its origins and development. Computer Law & Security Review, 25(2), 123-135. https://doi.org/10.1016/j.clsr.2009.02.002
De Hert, P. (2012). A human rights perspective on privacy and data protection impact assessments. In D. Wright & P. De Hert (Eds.), Privacy impact assessment (pp. 33-76). Springer.
Article 29 Data Protection Working Party. (2017). Guidelines on data protection impact assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/679 (WP 248 rev.01).
Mantelero, A. (2018). AI and big data: A blueprint for a human rights, social and ethical impact assessment. Computer Law & Security Review, 34(4), 754-772. https://doi.org/10.1016/j.clsr.2018.05.017
Metcalf, J. (2017). Ethics codes in the digital age: The production and policing of data ethics. Data & Society Research Institute.
Reisman, D., Schultz, J., Crawford, K., & Whittaker, M. (2018). Algorithmic impact assessments: A practical framework for public agency accountability. AI Now Institute.
Selbst, A. D. (2021). An institutional view of algorithmic impact assessments. Harvard Journal of Law & Technology, 35(1), 117-191.
Wright, D. (2012). The state of the art in privacy impact assessment. Computer Law & Security Review, 28(1), 54-61. https://doi.org/10.1016/j.clsr.2011.11.007
Wright, D., & De Hert, P. (Eds.). (2012). Privacy impact assessment. Springer.
Chapter 29: Responsible AI Development
Amershi, S., Begel, A., Bird, C., DeLine, R., Gall, H., Kamar, E., Nagappan, N., Nushi, B., & Zimmermann, T. (2019). Software engineering for machine learning: A case study. In Proceedings of the 41st International Conference on Software Engineering: Software Engineering in Practice (pp. 291-300). IEEE.
Gebru, T., Morgenstern, J., Vecchione, B., Vaughan, J. W., Wallach, H., Daumé, H., III, & Crawford, K. (2021). Datasheets for datasets. Communications of the ACM, 64(12), 86-92. https://doi.org/10.1145/3458723
Holstein, K., Wortman Vaughan, J., Daumé, H., III, Dudík, M., & Wallach, H. (2019). Improving fairness in machine learning systems: What do industry practitioners need? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-16). ACM.
Mitchell, M., Wu, S., Zaldivar, A., Barnes, P., Vasserman, L., Hutchinson, B., Spitzer, E., Raji, I. D., & Gebru, T. (2019). Model cards for model reporting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 220-229). ACM.
Morley, J., Floridi, L., Kinsey, L., & Elhalal, A. (2020). From what to how: An initial review of publicly available AI ethics tools, methods and research to translate principles into practices. Science and Engineering Ethics, 26(4), 2141-2168. https://doi.org/10.1007/s11948-019-00165-5
Raji, I. D., Smart, A., White, R. N., Mitchell, M., Gebru, T., Hutchinson, B., Smith-Loud, J., Theron, D., & Barnes, P. (2020). Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 33-44). ACM.
Rakova, B., Yang, J., Cramer, H., & Chowdhury, R. (2021). Where responsible AI meets reality: Practitioner perspectives on enablers for shifting organizational practices. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1-23. https://doi.org/10.1145/3449081
Sculley, D., Holt, G., Golovin, D., Davydov, E., Phillips, T., Ebner, D., Chaudhary, V., Young, M., Crespo, J.-F., & Dennison, D. (2015). Hidden technical debt in machine learning systems. In Advances in Neural Information Processing Systems 28 (pp. 2503-2511). NeurIPS.
Chapter 30: When Things Go Wrong: Breach Response and Crisis Ethics
Acquisti, A., Friedman, A., & Telang, R. (2006). Is there a cost to privacy breaches? An event study. In Proceedings of the 27th International Conference on Information Systems (pp. 1563-1580). AIS.
Citron, D. K. (2014). Hate crimes in cyberspace. Harvard University Press.
Kerr, I. R., & Earle, J. (2013). Prediction, preemption, presumption: How big data threatens big picture privacy. Stanford Law Review Online, 66, 65-72.
Martin, K. D., Borah, A., & Palmatier, R. W. (2017). Data privacy: Effects on customer and firm performance. Journal of Marketing, 81(1), 36-58. https://doi.org/10.1509/jm.15.0497
Ponemon Institute. (2023). Cost of a data breach report 2023. IBM Security.
Romanosky, S. (2016). Examining the costs and causes of cyber incidents. Journal of Cybersecurity, 2(2), 121-135. https://doi.org/10.1093/cybsec/tyw001
Solove, D. J. (2021). The myth of the privacy paradox. George Washington Law Review, 89(1), 1-51.
Solove, D. J., & Citron, D. K. (2018). Risk and anxiety: A theory of data-breach harms. Texas Law Review, 96(4), 737-786.
Zou, Y., Mhaidli, A. H., McCall, A., & Schaub, F. (2018). "I've got nothing to lose": Consumers' risk perceptions and protective actions after the Equifax data breach. In Proceedings of the Fourteenth Symposium on Usable Privacy and Security (pp. 197-216). USENIX.
Chapter 31: Misinformation, Disinformation, and Platform Governance
Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.
Chesney, R., & Citron, D. K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753-1820.
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the "post-truth" era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008
Persily, N., & Tucker, J. A. (Eds.). (2020). Social media and democracy: The state of the field, prospects for reform. Cambridge University Press.
Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton University Press.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559
Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making (DGI(2017)09). Council of Europe.
Zuckerman, E. (2017). Digital cosmopolitans: Why we think the internet connects us, why it doesn't, and how to rewire it. W. W. Norton.
Chapter 32: Digital Divide, Data Justice, and Equity
Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity Press.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
D'Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.
Gangadharan, S. P. (2012). Digital inclusion and data profiling. First Monday, 17(5). https://doi.org/10.5210/fm.v17i5.3821
Heeks, R. (2017). Information and communication technology for development (ICT4D). Routledge.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
Robinson, L., Cotten, S. R., Ono, H., Quan-Haase, A., Mesch, G., Chen, W., Schulz, J., Hale, T. M., & Stern, M. J. (2015). Digital inequalities and why they matter. Information, Communication & Society, 18(5), 569-582. https://doi.org/10.1080/1369118X.2015.1012532
Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2), 1-14. https://doi.org/10.1177/2053951717736335
Warschauer, M. (2004). Technology and social inclusion: Rethinking the digital divide. MIT Press.
Chapter 33: Labor, Automation, and the Gig Economy
Autor, D. H. (2015). Why are there still so many jobs? The history and future of workplace automation. Journal of Economic Perspectives, 29(3), 3-30. https://doi.org/10.1257/jep.29.3.3
Casilli, A. A. (2019). En attendant les robots: Enquête sur le travail du clic [Waiting for robots: An inquiry into click work]. Éditions du Seuil.
Cherry, M. A. (2016). Beyond misclassification: The digital transformation of work. Comparative Labor Law & Policy Journal, 37(3), 544-577.
Dubal, V. B. (2019). The drive to precarity: A political history of work, regulation, & labor advocacy in San Francisco's taxi & Uber economies. Berkeley Journal of Employment and Labor Law, 38(1), 73-135.
Frey, C. B., & Osborne, M. A. (2017). The future of employment: How susceptible are jobs to computerisation? Technological Forecasting and Social Change, 114, 254-280. https://doi.org/10.1016/j.techfore.2016.08.019
Gray, M. L., & Suri, S. (2019). Ghost work: How to stop Silicon Valley from building a new global underclass. Houghton Mifflin Harcourt.
Irani, L. (2015). The cultural work of microwork. New Media & Society, 17(5), 720-739. https://doi.org/10.1177/1461444813511926
Rosenblat, A. (2018). Uberland: How algorithms are rewriting the rules of work. University of California Press.
Vallas, S. P., & Schor, J. B. (2020). What do platforms do? Understanding the gig economy. Annual Review of Sociology, 46, 273-294. https://doi.org/10.1146/annurev-soc-121919-054857
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Chapter 34: Environmental Data Ethics and Climate
Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Edwards, P. N. (2010). A vast machine: Computer models, climate data, and the politics of global warming. MIT Press.
Hao, K. (2019, June 6). Training a single AI model can emit as much carbon as five cars in their lifetimes. MIT Technology Review.
Henderson, P., Hu, J., Romoff, J., Brunskill, E., Jurafsky, D., & Pineau, J. (2020). Towards the systematic reporting of the energy and carbon footprints of machine learning. Journal of Machine Learning Research, 21(248), 1-43.
Kaack, L. H., Donti, P. L., Strubell, E., Kamiya, G., Creutzig, F., & Rolnick, D. (2022). Aligning artificial intelligence with climate change mitigation. Nature Climate Change, 12(6), 518-527. https://doi.org/10.1038/s41558-022-01377-7
Luccioni, A. S., Viguier, S., & Ligozat, A.-L. (2023). Estimating the carbon footprint of BLOOM, a 176B parameter language model. Journal of Machine Learning Research, 24(253), 1-15.
Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., & Dean, J. (2021). Carbon emissions and large neural network training. arXiv preprint arXiv:2104.10350.
Rolnick, D., Donti, P. L., Kaack, L. H., Kochanski, K., Lacoste, A., Sankaran, K., Ross, A. S., Milojevic-Dupont, N., Jaques, N., Waldman-Brown, A., Luccioni, A., Maharaj, T., Sherwin, E. D., ... Bengio, Y. (2022). Tackling climate change with machine learning. ACM Computing Surveys, 55(2), 1-96. https://doi.org/10.1145/3485128
Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 3645-3650). ACL.
Chapter 35: Children, Teens, and Digital Vulnerability
Barassi, V. (2020). Child data citizen: How tech companies are profiling us from before birth. MIT Press.
boyd, d. (2014). It's complicated: The social lives of networked teens. Yale University Press.
Kidron, B. (2018). The rights of children in the digital age. InterMEDIA, 46(3), 10-13.
Livingstone, S. (2009). Children and the internet: Great expectations, challenging realities. Polity Press.
Livingstone, S., & Third, A. (2017). Children and young people's rights in the digital age: An emerging agenda. New Media & Society, 19(5), 657-670. https://doi.org/10.1177/1461444816686318
Lupton, D., & Williamson, B. (2017). The datafied child: The dataveillance of children and implications for their rights. New Media & Society, 19(5), 780-794. https://doi.org/10.1177/1461444816686328
Montgomery, K. C. (2007). Generation digital: Politics, commerce, and childhood in the age of the internet. MIT Press.
Stoilova, M., Livingstone, S., & Nandagiri, R. (2020). Digital by default: Children's capacity to understand and manage online data and privacy. Media and Communication, 8(4), 197-207. https://doi.org/10.17645/mac.v8i4.3407
UNICEF. (2017). Children in a digital world: The state of the world's children 2017. UNICEF.
Chapter 36: National Security, Intelligence, and Democratic Oversight
Ackerman, S. (2013, November 15). NSA goes on 60 Minutes: The definitive facts behind CBS's flawed report. The Guardian.
Bigo, D. (2012). Security, surveillance and democracy. In K. Ball, K. D. Haggerty, & D. Lyon (Eds.), Routledge handbook of surveillance studies (pp. 277-284). Routledge.
Donohue, L. K. (2008). The cost of counterterrorism: Power, politics, and liberty. Cambridge University Press.
Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA, and the U.S. surveillance state. Metropolitan Books.
Lyon, D. (2014). Surveillance, Snowden, and big data: Capacities, consequences, critique. Big Data & Society, 1(2), 1-13. https://doi.org/10.1177/2053951714541861
Mayer, J. (2016). What Brennan missed. The New Yorker. https://www.newyorker.com/news/news-desk/what-brennan-missed
Schneier, B. (2015). Data and Goliath: The hidden battles to collect your data and control your world. W. W. Norton.
Snowden, E. (2019). Permanent record. Metropolitan Books.
Solove, D. J. (2011). Nothing to hide: The false tradeoff between privacy and security. Yale University Press.
Chapter 37: Global South Perspectives on Data Governance
Arora, P. (2019). The next billion users: Digital life beyond the West. Harvard University Press.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
Kwet, M. (2019). Digital colonialism: US empire and the new imperialism in the Global South. Race & Class, 60(4), 3-26. https://doi.org/10.1177/0306396818823172
Milan, S., & Treré, E. (2019). Big data from the South(s): Beyond data universalism. Television & New Media, 20(4), 319-335. https://doi.org/10.1177/1527476419837739
Mungiu-Pippidi, A. (2015). The quest for good governance: How societies develop control of corruption. Cambridge University Press.
Nkrumah, K. (1965). Neo-colonialism: The last stage of imperialism. Thomas Nelson & Sons.
Sambasivan, N., Kapania, S., Highfill, H., Akrong, D., Paritosh, P., & Aroyo, L. M. (2021). "Everyone wants to do the model work, not the data work": Data cascades in high-stakes AI. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-15). ACM.
Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2), 1-14. https://doi.org/10.1177/2053951717736335
Thatcher, J., O'Sullivan, D., & Mahmoudi, D. (2016). Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space, 34(6), 990-1006. https://doi.org/10.1177/0263775816633195
Chapter 38: Emerging Technologies and Anticipatory Governance
Bostrom, N. (2014). Superintelligence: Paths, dangers, strategies. Oxford University Press.
Collingridge, D. (1980). The social control of technology. Frances Pinter.
Floridi, L., Cowls, J., Beltrametti, M., Chatila, R., Chazerand, P., Dignum, V., Luetge, C., Madelin, R., Pagallo, U., Rossi, F., Schafer, B., Valcke, P., & Vayena, E. (2018). AI4People -- An ethical framework for a good AI society: Opportunities, risks, principles, and recommendations. Minds and Machines, 28(4), 689-707. https://doi.org/10.1007/s11023-018-9482-5
Guston, D. H. (2014). Understanding "anticipatory governance." Social Studies of Science, 44(2), 218-242. https://doi.org/10.1177/0306312713508669
Jasanoff, S. (2005). Designs on nature: Science and democracy in Europe and the United States. Princeton University Press.
Jasanoff, S. (2016). The ethics of invention: Technology and the human future. W. W. Norton.
Marchetti, G. E. (2022). The ethics of quantum computing: A preliminary framework. Science and Engineering Ethics, 28(3), 1-23. https://doi.org/10.1007/s11948-022-00372-3
Owen, R., Bessant, J., & Heintz, M. (Eds.). (2013). Responsible innovation: Managing the responsible emergence of science and innovation in society. Wiley.
Stilgoe, J., Owen, R., & Macnaghten, P. (2013). Developing a framework for responsible innovation. Research Policy, 42(9), 1568-1580. https://doi.org/10.1016/j.respol.2013.05.008
Chapter 39: Designing Data Futures: Participation, Imagination, and Hope
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press.
D'Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press.
Escobar, A. (2018). Designs for the pluriverse: Radical interdependence, autonomy, and the making of worlds. Duke University Press.
Haraway, D. J. (2016). Staying with the trouble: Making kin in the Chthulucene. Duke University Press.
Jasanoff, S. (2015). Future imperfect: Science, technology, and the imaginations of modernity. In S. Jasanoff & S.-H. Kim (Eds.), Dreamscapes of modernity: Sociotechnical imaginaries and the fabrication of power (pp. 1-33). University of Chicago Press.
Mazzucato, M. (2021). Mission economy: A moonshot guide to changing capitalism. Harper Business.
Morozov, E. (2013). To save everything, click here: The folly of technological solutionism. PublicAffairs.
Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. Cambridge University Press.
Srnicek, N. (2017). Platform capitalism. Polity Press.
Zuboff, S. (2022). Surveillance capitalism or democracy? The death match of institutional orders and the politics of knowledge in our information civilization. Organization Theory, 3(3), 1-79. https://doi.org/10.1177/26317877221129290
Chapter 40: Your Responsibility: From Knowledge to Action
Arendt, H. (1963). Eichmann in Jerusalem: A report on the banality of evil. Viking Press.
Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity Press.
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press.
Floridi, L. (2019). Translating principles into practices of digital ethics: Five risks of being unethical. Philosophy & Technology, 32(2), 185-193. https://doi.org/10.1007/s13347-019-00354-x
Jonas, H. (1984). The imperative of responsibility: In search of an ethics for the technological age. University of Chicago Press.
Latour, B. (2004). Why has critique run out of steam? From matters of fact to matters of concern. Critical Inquiry, 30(2), 225-248. https://doi.org/10.1086/421123
Sen, A. (2009). The idea of justice. Harvard University Press.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford University Press.
Winner, L. (1986). The whale and the reactor: A search for limits in an age of high technology. University of Chicago Press.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.