Chapter 31: Quiz — The Environmental Cost of AI
20 questions. Mix of multiple choice, true/false, and short answer.
Multiple Choice
1. The 2023 study by Li et al. ("Making AI Less 'Thirsty'") estimated that ChatGPT consumes approximately how much water per conversation?
A) 50 milliliters (about 1.7 oz) B) 500 milliliters (about half a liter) C) 5 liters (about 1.3 gallons) D) 50 liters (about 13 gallons)
Answer: B — Li et al. estimated approximately 500 mL of freshwater per ChatGPT conversation, primarily through evaporative cooling in data centers.
2. The Patterson et al. (2021) study estimated the training carbon cost of GPT-3 at approximately:
A) 55 metric tons CO2 equivalent B) 552 metric tons CO2 equivalent C) 5,520 metric tons CO2 equivalent D) 55,200 metric tons CO2 equivalent
Answer: B — Patterson et al. estimated GPT-3 training at approximately 552 metric tons CO2e, roughly equivalent to driving a car around the Earth 70 times.
3. The Jevons paradox (rebound effect) predicts that efficiency improvements in AI computing will:
A) Proportionally reduce total energy consumption B) Have no effect on total energy consumption C) Likely lead to increased total energy consumption as lower costs enable expanded AI deployment D) Reduce energy consumption only if accompanied by regulatory limits
Answer: C — The Jevons paradox holds that efficiency improvements reduce per-unit cost, enabling greater total use and potentially increasing total resource consumption — a pattern historically observed in energy technologies.
4. In the three-scope carbon accounting framework, which scope covers the carbon emissions from manufacturing the GPUs used to train an AI model?
A) Scope 1 B) Scope 2 C) Scope 3 D) None — hardware manufacturing emissions are not covered by the GHG Protocol
Answer: C — Hardware manufacturing emissions are supply chain emissions, covered by Scope 3 (indirect value chain emissions). Most AI company reporting covers only Scope 1 and 2.
5. Which cooling technology for data centers consumes the most freshwater?
A) Air cooling (using fans and ambient air only) B) Evaporative cooling (cooling towers) C) Liquid immersion cooling D) Geothermal cooling
Answer: B — Evaporative cooling uses water evaporation to remove heat; a significant portion of the water used evaporates and must be replaced with fresh water. Air cooling uses no water; liquid immersion typically uses closed-loop systems with minimal water loss.
6. The Green Software Foundation's Software Carbon Intensity (SCI) metric is designed to:
A) Measure the water consumption of software systems B) Provide a methodology for measuring and comparing the carbon intensity of software C) Certify AI models as carbon neutral D) Regulate greenhouse gas emissions from data centers
Answer: B — The SCI is a measurement standard — analogous to miles per gallon for automobiles — that enables consistent comparison of software's carbon intensity across different systems.
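The underlying formula is simple enough to sketch. Per the Green Software Foundation specification, SCI = ((E × I) + M) per R, where E is operational energy (kWh), I is grid carbon intensity (gCO2e/kWh), M is embodied hardware emissions amortized to the measurement window, and R is the functional unit (per request, per user, etc.). The service figures below are hypothetical, chosen only to show the calculation:

```python
def sci(energy_kwh: float, intensity_g_per_kwh: float,
        embodied_g: float, functional_units: int) -> float:
    """Software Carbon Intensity: ((E * I) + M) / R, in gCO2e per functional unit."""
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / functional_units

# Hypothetical service: 120 kWh/day of operational energy, 400 gCO2e/kWh grid,
# 50,000 gCO2e/day of amortized hardware emissions, 1,000,000 requests/day.
per_request = sci(120, 400, 50_000, 1_000_000)
print(round(per_request, 3))  # gCO2e per request -> 0.098
```

Because R normalizes by useful work delivered, SCI rewards efficiency rather than offsets: buying carbon credits does not change the score, but serving the same requests with less energy does.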
7. Google's 24/7 carbon-free energy commitment is more demanding than traditional "100% renewable energy" claims primarily because:
A) It applies to all Google data centers globally B) It requires matching electricity consumption to carbon-free sources in the same hour and location, not just on an annual average basis C) It includes Scope 3 supply chain emissions D) It prohibits the use of any fossil fuel backup power
Answer: B — The 24/7 standard requires temporal and geographic matching — carbon-free electricity at the time and place of consumption — rather than the annual averaging that allows renewable energy certificates to cover non-renewable consumption during some hours.
8. The EU Corporate Sustainability Reporting Directive (CSRD) requires environmental disclosure on a "double materiality" basis, which means:
A) Companies must report both Scope 1 and Scope 2 emissions B) Materiality must be assessed both for how sustainability risks affect the company financially and for how the company's activities affect society and the environment C) Two independent auditors must verify all sustainability disclosures D) Sustainability disclosures must be filed twice annually
Answer: B — Double materiality requires assessment of both financial materiality (how environmental factors affect the company) and impact materiality (how the company affects the environment) — a significantly more comprehensive standard than single financial materiality.
9. Approximately what percentage of global cobalt production comes from the Democratic Republic of Congo, making the AI hardware supply chain dependent on conditions in that country?
A) 20% B) 40% C) 70% D) 95%
Answer: C — Approximately 70% of global cobalt production comes from the DRC, where mining conditions have been documented as involving child labor, unsafe practices, and environmental contamination.
10. The Strubell et al. (2019) paper "Energy and Policy Considerations for Deep Learning in NLP" was significant primarily because:
A) It proposed regulations requiring AI companies to disclose carbon emissions B) It was the first paper to make AI training carbon costs visible to the research community, catalyzing the Green AI movement C) It demonstrated that AI training was more carbon-intensive than commercial air travel D) It showed that renewable energy could eliminate AI's carbon footprint
Answer: B — The Strubell et al. paper made AI training energy costs visible to the ML research community for the first time, sparking the "Green AI" research agenda and the broader conversation about AI's environmental impact.
True/False
11. Training a large language model like GPT-3 is the dominant source of that model's lifetime carbon emissions, exceeding inference costs over the model's deployment period.
Answer: False — While training is a large one-time cost, inference (running the model for users) is cumulative. For widely deployed models serving hundreds of millions of daily queries, cumulative inference carbon typically overtakes the one-time training cost within weeks or months of deployment.
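The crossover can be estimated with back-of-envelope arithmetic. Every figure below except the Patterson et al. training estimate is an illustrative assumption (per-query energy, grid intensity, and query volume all vary widely in practice):

```python
# Illustrative, hypothetical figures -- not measured values.
training_tco2e = 552                 # one-time training cost (Patterson et al. estimate)
energy_per_query_kwh = 0.003         # assumed per-query inference energy
grid_intensity_kg_per_kwh = 0.4      # assumed grid carbon intensity
queries_per_day = 100_000_000        # assumed query volume

daily_inference_tco2e = (queries_per_day * energy_per_query_kwh
                         * grid_intensity_kg_per_kwh) / 1000  # kg -> metric tons
crossover_days = training_tco2e / daily_inference_tco2e
print(f"{daily_inference_tco2e:.0f} tCO2e/day; crossover in {crossover_days:.1f} days")
```

Under these assumptions, daily inference emits about 120 tCO2e and exceeds the entire training budget in under a week, which is why lifetime accounting is dominated by inference for popular models.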
12. A single ChatGPT query consumes approximately 10 times as much energy as a standard Google search query.
Answer: True — Standard estimates place a Google search query at approximately 0.0003 kWh and an AI-assisted query at approximately 0.003 kWh, a roughly 10x difference, though both figures are approximations and vary with query complexity and hardware efficiency.
13. Microsoft's commitment to become "carbon negative" by 2030 means it will have no carbon emissions from its operations by that date.
Answer: False — "Carbon negative" means removing more carbon from the atmosphere than the company emits — through carbon capture, nature-based solutions, and other methods — not eliminating emissions. The company's emissions would still be positive; they would be more than offset by removals.
14. Air cooling for data centers eliminates freshwater consumption but typically requires more energy than evaporative cooling in hot climates.
Answer: True — Air cooling moves heat to the ambient air without water evaporation; it requires more energy in hot climates because hot air is less effective at absorbing heat than evaporating water. The trade-off is reduced water use at the cost of higher energy (and therefore potentially higher carbon) consumption.
15. The EU AI Act requires AI companies to disclose specific energy consumption figures for individual AI models at defined quantitative thresholds.
Answer: False — The EU AI Act requires high-risk AI systems to document computational resource consumption and requires energy efficiency consideration in design, but does not mandate specific quantitative energy disclosure at the model level with defined thresholds. It directs the European Commission to develop measurement methodologies, acknowledging current frameworks are insufficient.
Short Answer
16. Explain why data center evaporative cooling is particularly environmentally problematic when data centers are sited in arid, drought-prone regions, and describe one alternative approach that could reduce this impact.
Model Answer: Evaporative cooling works by exposing hot water to ambient air, causing some water to evaporate and carrying heat away as latent heat of evaporation. The evaporated water is lost from the system and must be replaced with fresh water. The technology is particularly effective in hot, dry climates because dry air can absorb more evaporated water than humid air; that greater cooling capacity comes precisely from greater evaporation, and therefore greater water consumption. Hot, dry climates are also typically water-stressed: they receive little rainfall, have limited surface water, and often rely on groundwater or long-distance water transport. Data centers in these climates therefore compete for scarce freshwater resources with agricultural users, municipalities, and ecosystems. In the Phoenix, Arizona, area, one of the most significant US data center hubs, this creates direct competition with Colorado River water allocations already under mandatory cutback.
One alternative approach is air cooling: using fans to move ambient air across server components without any water evaporation. Air cooling eliminates freshwater consumption entirely but requires more energy in hot climates because hot air is less effective as a cooling medium than water evaporation. In practice, air cooling may be combined with a smaller evaporative cooling component for peak temperature management. Another alternative is closed-loop liquid cooling at the chip level, where liquid coolant directly absorbs heat from processor chips and is then cooled in a closed system that releases heat to the external environment without evaporating water — significantly reducing (though not eliminating) water consumption.
17. What is the "Jevons paradox" and how does it apply to AI efficiency improvements?
Model Answer: The Jevons paradox (also called the rebound effect) observes that improvements in resource use efficiency tend to increase rather than decrease total resource consumption, because lower cost per unit of output enables and encourages greater total use. William Stanley Jevons observed in 1865 that improvements in steam engine fuel efficiency led to increased total coal consumption — more efficient engines enabled more and cheaper steam-powered applications, driving up aggregate demand.
Applied to AI: as advances in model compression, hardware efficiency, and architecture design reduce the energy required per unit of AI capability, the cost barrier to AI deployment falls. Organizations that previously could not afford to run AI on all their data because the computing cost was prohibitive can now do so. Tasks that previously required expensive frontier models can now be handled by cheaper smaller models — freeing resources to run frontier models on previously uneconomic tasks. Queries that previously cost too much to route through an LLM are now routed through one. Each of these responses to lower cost per AI unit increases total AI use. The net effect on total energy consumption depends on whether the efficiency improvement percentage exceeds the deployment expansion percentage — and current evidence suggests expansion is outpacing efficiency improvement, producing growing absolute energy consumption despite improving efficiency per unit.
18. What is "Scope 3" carbon accounting, and why is it significant for AI companies' environmental reporting?
Model Answer: Scope 3 covers indirect greenhouse gas emissions in an organization's value chain — both upstream (suppliers, raw material extraction, component manufacturing) and downstream (product use by customers, end-of-life treatment). Unlike Scope 1 (direct emissions from owned operations) and Scope 2 (indirect emissions from purchased electricity), Scope 3 captures the full lifecycle carbon associated with an organization's activities.
For AI companies, Scope 3 emissions are significant and largely undisclosed for several reasons: Hardware manufacturing — the carbon embedded in producing GPUs, servers, and data center infrastructure, primarily through semiconductor fabrication processes that are energy-intensive and chemically complex — represents a substantial upstream supply chain emission that AI companies purchase from TSMC, Samsung, and other manufacturers. Customer inference — when AI companies sell API access to their models, the energy consumed by customer inference queries generates downstream Scope 3 emissions. When AI models are downloaded and run on customer hardware, those emissions are also downstream Scope 3. Rare earth mineral extraction — the supply chain for specialty metals in AI chips generates Scope 3 emissions at extraction and processing stages. Most AI company sustainability reporting covers only Scope 1 and 2, systematically understating their total climate impact. Full Scope 3 disclosure would require supply chain engagement, customer use data, and hardware lifecycle accounting that current voluntary practices do not achieve.
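The reporting gap described above can be illustrated with a hypothetical inventory; every figure below is invented for the example, not drawn from any company's disclosures:

```python
# Hypothetical annual inventory (tCO2e) for an AI company -- illustrative only.
inventory = {
    "scope1_direct": 2_000,            # owned operations (backup generators, fleet)
    "scope2_electricity": 180_000,     # purchased data-center electricity
    "scope3_hardware": 250_000,        # embodied GPU/server manufacturing
    "scope3_customer_inference": 90_000,
}

reported = inventory["scope1_direct"] + inventory["scope2_electricity"]
full = sum(inventory.values())
print(f"Scope 1+2 only: {reported:,} tCO2e ({reported / full:.0%} of full inventory)")
```

In this sketch, a Scope 1+2-only report captures roughly a third of the full inventory, which is the sense in which omitting Scope 3 systematically understates total climate impact.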
19. What is the "Green AI" research agenda, and what specific changes does it propose for AI research publication norms?
Model Answer: The Green AI agenda, articulated by Roy Schwartz and colleagues in a 2020 Communications of the ACM paper building on Strubell et al. (2019), argues that AI research has been systematically incentivized toward larger, more computationally intensive models without accountability for their energy and environmental cost. The research community celebrates accuracy improvements without measuring or reporting the computational cost at which those improvements were achieved, creating an incentive structure in which a 1% accuracy gain using 10x more computation is as publishable as a 1% gain using the same computation — even if the former is much less valuable per unit of environmental cost.
Specifically, the Green AI agenda proposes: mandatory reporting of computational cost (measured in FLOPs or GPU-hours) in AI research publications, alongside accuracy metrics; development of "efficiency leaderboards" that rank models on the basis of accuracy per unit of computation, not just absolute accuracy; evaluation frameworks that give academic credit for achieving similar performance at lower computational cost; explicit research investment in model compression, knowledge distillation, efficient architecture design, and other approaches that produce competitive performance at lower computational cost; and community norms, enforced by conference and journal editors, that treat computational cost as a material methodological fact that must be disclosed.
20. Describe two specific ways that AI could contribute to addressing the climate crisis, and explain the conditions under which these contributions would produce genuine net climate benefit rather than merely offsetting some of AI's own environmental cost.
Model Answer: Two significant potential positive contributions:
Energy grid optimization: AI systems can optimize electricity grid dispatch — predicting demand patterns, managing variable renewable energy inputs (wind, solar), reducing transmission losses, and improving the economic integration of storage into grid management. DeepMind explored grid-balancing applications with the UK's National Grid, and its machine-learning forecasts of Google's wind farm output reportedly increased the economic value of that wind energy by roughly 20%. For these applications to produce genuine net climate benefit, the conditions include: the energy consumed by the AI system itself (training and inference) must be substantially less than the energy savings enabled; the grid optimization must demonstrably increase renewable energy penetration rather than just improving existing grid economics; and the optimization benefit must be demonstrated through rigorous before-after or controlled comparison, not merely asserted.
Materials discovery for clean energy: AI tools can accelerate discovery of new materials for solar cells, batteries, and electrolyzers by screening vast chemical spaces and predicting material properties. AlphaFold's demonstration in protein structure prediction establishes that AI-accelerated scientific discovery is achievable. For materials discovery to produce genuine net climate benefit, the discovered materials must reach commercial deployment — not merely exist as laboratory results; the manufacturing of those materials at scale must itself be feasible and low-carbon; and the timescale of impact must be considered (materials discovery to commercial deployment typically takes 10–20 years). In both cases, the condition for genuine net benefit is rigorous measurement of both the AI system's own environmental cost and the climate benefit attributed to its outputs, compared against the counterfactual (what climate outcome would have occurred without the AI system), with appropriate uncertainty quantification.