Chapter 31: Key Takeaways — The Environmental Cost of AI

Core Concepts

1. AI Has Two Distinct Energy Cost Components: Training and Inference
Training — building the model — is a one-time (or periodic) large energy expenditure. Inference — running the model for users — is a continuous, cumulative expenditure that typically exceeds training costs over a model's deployment lifetime. Both matter for environmental accounting; inference is often underweighted because it is less dramatic and harder to attribute to specific systems.

2. Carbon Disclosure by AI Companies Is Voluntary, Inconsistent, and Insufficient
Major AI companies — OpenAI, Anthropic, Google DeepMind, Meta AI — do not publish model-level or application-level carbon costs. Aggregate corporate sustainability reporting obscures the specific environmental impact of individual AI systems. The public cannot assess whether the benefits of AI justify its carbon costs because the costs are not disclosed.

3. A ChatGPT Query Is Estimated to Consume ~10x More Energy Than a Google Search
The difference between AI-generated responses and traditional search indexing is substantial in energy terms. As AI is integrated into search, assistants, and productivity tools at global scale, this energy differential multiplies across billions of daily queries into a meaningful fraction of national electricity demand.
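The scale effect can be sketched with a back-of-envelope calculation. The per-query figures (0.3 Wh for a traditional search, 3 Wh for an AI-generated response) and the daily query volume below are illustrative assumptions for the sketch, not measured or disclosed values.

```python
# Back-of-envelope: aggregate energy cost of replacing traditional search
# with AI-generated responses. All inputs are illustrative assumptions.

SEARCH_WH = 0.3          # assumed energy per traditional search query (Wh)
AI_QUERY_WH = 3.0        # assumed energy per AI response (Wh), ~10x search
QUERIES_PER_DAY = 9e9    # assumed global daily query volume

def annual_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Annual electricity use in terawatt-hours for a given per-query cost."""
    wh_per_year = wh_per_query * queries_per_day * 365
    return wh_per_year / 1e12  # Wh -> TWh

baseline = annual_twh(SEARCH_WH, QUERIES_PER_DAY)
ai_case = annual_twh(AI_QUERY_WH, QUERIES_PER_DAY)
print(f"baseline: {baseline:.1f} TWh/yr, AI-everywhere: {ai_case:.1f} TWh/yr")
```

Under these assumptions the per-query differential turns a roughly 1 TWh/year baseline into roughly 10 TWh/year, which is why the chapter treats the differential as material at global scale.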

4. Water Consumption Is AI's Hidden Environmental Dimension
Evaporative cooling at large data centers can consume millions of gallons of freshwater per day. The Li et al. (2023) estimate of roughly 500 mL of water per ChatGPT conversation — while approximate — reveals that AI inference at scale is a significant water consumer. Water disclosure receives far less attention than carbon disclosure despite water being a more immediately acute resource constraint in many deployment regions.
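The Li et al. (2023) per-conversation figure can be scaled up in a short sketch; the daily conversation volume assumed here is an illustrative placeholder, not a disclosed figure.

```python
# Back-of-envelope: daily water footprint of AI inference at scale, using
# the approximate Li et al. (2023) per-conversation estimate. The daily
# conversation volume is an illustrative assumption.

ML_PER_CONVERSATION = 500       # Li et al. (2023) estimate, approximate
CONVERSATIONS_PER_DAY = 100e6   # assumed daily conversation volume

def daily_water_liters(ml_per_conv: float, convs_per_day: float) -> float:
    """Total daily freshwater use in liters."""
    return ml_per_conv * convs_per_day / 1000  # mL -> L

liters = daily_water_liters(ML_PER_CONVERSATION, CONVERSATIONS_PER_DAY)
olympic_pools = liters / 2.5e6  # an Olympic pool holds ~2.5 million liters
print(f"{liters:.0f} L/day, roughly {olympic_pools:.0f} Olympic pools")
```

Even with a conservative volume assumption, the sketch lands in the tens of millions of liters per day, which is the sense in which inference is "a significant water consumer."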

5. Data Center Siting in Water-Stressed Regions Creates Local Justice Problems
Data centers concentrated in water-stressed areas (Phoenix, Arizona; drought-prone Chilean regions; western Netherlands) compete for finite water resources with agricultural users, indigenous communities, and residential users who have the least power to defend their interests in regulatory processes.

6. The Rebound Effect May Offset Efficiency Gains
As AI makes computation cheaper, organizations use more computation. The Jevons paradox — efficiency improvements encourage greater total resource use — is likely operating in AI: per-unit energy consumption is improving, but total energy consumption is growing because AI deployment is expanding faster than efficiency improvements. Efficiency gains without regulatory constraints on total resource use may not produce net environmental benefit.
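The arithmetic behind the rebound effect is simple compounding, and a short sketch makes it concrete. The 20% annual efficiency gain and 50% annual demand growth below are illustrative rates chosen to show the mechanism, not measured industry figures.

```python
# Sketch of the rebound effect: per-unit efficiency improves each year,
# but total deployment grows faster, so total energy use still rises.
# Both annual rates are illustrative assumptions.

def total_energy(years: int, efficiency_gain: float, demand_growth: float,
                 base_energy: float = 1.0) -> float:
    """Relative total energy after `years`, given annual per-unit
    efficiency improvement and annual growth in compute demand."""
    per_unit = (1 - efficiency_gain) ** years   # energy per unit of compute
    units = (1 + demand_growth) ** years        # units of compute demanded
    return base_energy * per_unit * units

# 20%/yr efficiency gain vs. 50%/yr demand growth: net energy still grows.
after_5y = total_energy(5, efficiency_gain=0.20, demand_growth=0.50)
print(f"relative total energy after 5 years: {after_5y:.2f}x")
```

Under these rates, per-unit energy falls by two-thirds over five years, yet total energy roughly 2.5x's, which is exactly the pattern the Jevons paradox describes.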

7. AI's Hardware Supply Chain Has Serious Environmental and Human Rights Dimensions
The minerals required for AI chips (cobalt from the DRC, lithium from South America), the heavy water use of semiconductor fabrication, and the e-waste from rapid hardware upgrade cycles represent environmental costs that do not appear in AI companies' energy and carbon reporting. Scope 3 supply chain emissions for AI hardware companies are substantial and largely undisclosed.

8. AI Also Has Genuine Climate Potential — But Evidence for Net Benefit Is Limited
AI applications in grid optimization, climate modeling, materials discovery, and precision agriculture represent genuine opportunities to reduce climate impact. Whether any specific AI application produces net climate benefit requires rigorous cost-benefit analysis that is rarely conducted; the narrative of "AI for climate" should not substitute for such analysis.

9. Green AI Research Offers Genuine Technical Progress
Model compression, knowledge distillation, efficient architecture design, and mixture-of-experts approaches demonstrate that AI capabilities can be achieved at substantially lower computational cost than brute-force scaling. The research community and industry should invest in and reward efficient AI alongside capable AI.

10. The Justice Dimension Is Not Peripheral
The communities most exposed to AI's climate and water costs are not the communities capturing its benefits. Carbon emissions fall on globally distributed communities bearing climate consequences with the least adaptive capacity. Water withdrawals fall on local communities adjacent to data centers with the least political power to challenge them. Ethical AI development requires accounting for these externalities rather than externalizing them.


Key Frameworks

The Energy Hierarchy for AI Deployment
When evaluating AI deployment, apply an energy hierarchy: (1) Is the AI task necessary, or can the objective be achieved with less computation? (2) Can a smaller, more efficient model achieve adequate quality? (3) Is the inference infrastructure powered by genuinely low-carbon electricity? (4) Is water use minimized through efficient cooling? (5) Is the environmental cost disclosed and accounted for?
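Because the hierarchy is ordered, it behaves like a short-circuiting checklist: a deployment that fails an early question should not proceed to later ones. A minimal sketch, in which the data structure and function are illustrative rather than part of the framework itself:

```python
# Sketch of the energy hierarchy as an ordered pre-deployment checklist.
# The question wording mirrors the framework; the representation as a
# list of booleans is an illustrative assumption.
from typing import List, Optional

HIERARCHY = [
    "Is the AI task necessary, or achievable with less computation?",
    "Can a smaller, more efficient model achieve adequate quality?",
    "Is inference powered by genuinely low-carbon electricity?",
    "Is water use minimized through efficient cooling?",
    "Is the environmental cost disclosed and accounted for?",
]

def first_unmet(answers: List[bool]) -> Optional[int]:
    """Return the 1-based index of the first unmet criterion, or None."""
    for i, ok in enumerate(answers, start=1):
        if not ok:
            return i
    return None

# Example: a deployment that clears every step except disclosure.
step = first_unmet([True, True, True, True, False])
print(f"first unmet criterion: {step} -> {HIERARCHY[step - 1]}")
```

The ordering matters: question (1) can eliminate the deployment's entire footprint, while questions (3) through (5) only mitigate or account for a footprint already accepted.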

The Double Materiality Test for AI Environmental Disclosure
Drawn from the EU Corporate Sustainability Reporting Directive: evaluate both (a) how environmental risks (carbon regulation, water scarcity, resource costs) affect AI businesses financially, and (b) how AI businesses affect the environment through emissions, water use, and supply chain impacts. Both dimensions are material; currently, only the first is consistently reported.

Green AI Principles
Following Schwartz et al. (2020): (1) Report the computational cost of AI alongside accuracy metrics; (2) Prefer efficient models over more computationally intensive ones where performance is comparable; (3) Measure and disclose inference energy costs for widely deployed systems; (4) Invest in efficiency research as a first-class research priority.
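Principle (1) can be sketched as a reporting structure that puts energy cost on equal footing with accuracy. The model names, accuracy figures, and energy figures below are hypothetical placeholders, and accuracy-per-watt-hour is one possible efficiency metric, not one prescribed by Schwartz et al.

```python
# Sketch of Green AI principle (1): report computational cost alongside
# accuracy metrics. All model names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class ModelReport:
    name: str
    accuracy: float        # task accuracy, 0..1
    inference_wh: float    # measured energy per inference (Wh)

    def accuracy_per_wh(self) -> float:
        """One possible efficiency metric: accuracy per watt-hour."""
        return self.accuracy / self.inference_wh

reports = [
    ModelReport("large-model", accuracy=0.92, inference_wh=3.0),
    ModelReport("distilled-model", accuracy=0.89, inference_wh=0.4),
]
for r in sorted(reports, key=ModelReport.accuracy_per_wh, reverse=True):
    print(f"{r.name}: acc={r.accuracy:.2f}, {r.inference_wh} Wh, "
          f"acc/Wh={r.accuracy_per_wh():.2f}")
```

In this hypothetical comparison the distilled model gives up three points of accuracy for a 7.5x reduction in per-inference energy, which is exactly the trade-off principle (2) asks practitioners to weigh.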


Practical Guidance for Business Professionals

Organizations that deploy AI should:
- Include the energy and carbon cost of AI inference in their Scope 2 and Scope 3 emissions accounting
- Prefer AI service providers that disclose energy consumption and use verified renewable energy
- Evaluate whether smaller, more efficient models can achieve adequate performance for specific tasks
- Anticipate that AI environmental disclosure requirements will become mandatory and begin building measurement capacity now
- In procurement of AI services, ask vendors to disclose model-level and service-level energy and carbon metrics
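The first recommendation reduces to a simple location-based calculation: measured (or estimated) inference energy multiplied by the carbon intensity of the supplying grid. The annual energy figure and grid intensity below are illustrative assumptions.

```python
# Sketch: folding AI inference energy into emissions accounting using a
# location-based estimate (energy x grid carbon intensity). The energy
# figure and grid intensity are illustrative assumptions.

def inference_co2_kg(energy_kwh: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimated emissions (kg CO2e) from a given amount of inference energy."""
    return energy_kwh * grid_kg_co2_per_kwh

# Assumed: 12,000 kWh/yr of AI inference on a grid at 0.4 kg CO2e/kWh.
annual = inference_co2_kg(12_000, 0.4)
print(f"estimated inference emissions: {annual:.0f} kg CO2e/yr")
```

Whether this lands in Scope 2 (self-hosted inference on purchased electricity) or Scope 3 (inference purchased as a service from a vendor) depends on who operates the infrastructure, which is why the guidance asks vendors for service-level energy metrics.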

The absence of current mandatory disclosure requirements does not eliminate environmental impact. It eliminates visibility — which is a different thing, and a temporary one.