Case Study 2: The IPCC Climate Assessment Reports and Scientific Reporting at Global Scale
The Intergovernmental Panel on Climate Change (IPCC) is a UN body that periodically produces comprehensive assessment reports synthesizing the current state of climate science. Each assessment report (AR) runs to thousands of pages, includes hundreds of figures, cites thousands of papers, and involves hundreds of scientists from dozens of countries. The production process takes years and is one of the most complex scientific reporting efforts in history. The IPCC reports are the opposite of the automated morning email in Case Study 1 — they are painstakingly crafted, reviewed, and negotiated, and they represent the high end of scientific reporting. But automation still plays a role, and examining where and how is instructive.
The Situation: Reporting the State of Climate Science
The IPCC was created in 1988 by the World Meteorological Organization and the United Nations Environment Programme. Its mandate was to provide an objective scientific view of climate change for governments and the public. The IPCC does not conduct original research; instead, it assesses and synthesizes the vast scientific literature on climate change, producing comprehensive reports every 5-7 years.
The IPCC has completed six assessment cycles:
- AR1 (1990): ~400 pages total.
- AR2 (1996): ~800 pages.
- AR3 (2001): ~2,000 pages.
- AR4 (2007): ~2,800 pages.
- AR5 (2013-2014): ~4,000 pages.
- AR6 (2021-2022): ~4,500+ pages, plus special reports.
Each AR has three main "Working Group" reports (WG1 on physical science, WG2 on impacts and adaptation, WG3 on mitigation) plus a synthesis report. Each Working Group report has dozens of chapters, hundreds of figures, and tens of thousands of references. The production is coordinated by a small IPCC Secretariat in Geneva, but the actual writing is done by hundreds of volunteer scientists from around the world.
The reports are influential. They are cited in policy debates, climate treaties, legal proceedings, and scientific papers. Governments negotiate the summary language line by line. The reports shape how the world understands and responds to climate change.
The Production Process
Producing an IPCC assessment report takes about 5-7 years from start to finish. The process:
Year 1: Scoping. The IPCC Bureau and expert panels decide the scope and outline of the report. This includes the chapter structure, the high-level questions to address, and the timeline for production.
Year 2: Author selection. For each chapter, lead authors are nominated by national governments and scientific institutions. The IPCC selects Coordinating Lead Authors (2-3 per chapter) and Lead Authors (10-20 per chapter), balancing expertise, geography, and experience. Review Editors are also appointed.
Year 2-3: First draft. Authors read the literature, synthesize findings, and write the first draft of their chapters. They meet in person several times during this period to coordinate, resolve disagreements, and review each other's work. Each chapter is typically 100-200 pages with ~50 figures and ~500 references.
Year 4: First review. The first draft is released for expert review by a larger community of scientists. Hundreds to thousands of reviewers submit comments. The authors respond to every comment in writing, explaining whether they accepted the change and why.
Year 4-5: Second draft. The authors revise the chapters based on the review comments. The resulting "second-order draft" is usually substantially different from the first — tighter, better referenced, more carefully hedged.
Year 5: Second review. The second-order draft goes out for another round of review, this time to both expert reviewers and government representatives. More comments, more responses.
Year 5-6: Final draft. Authors finalize the chapters. The scientific content is essentially locked at this point.
Year 6: Summary for Policymakers (SPM) drafting. Authors draft the "Summary for Policymakers" — a 30-50 page executive summary of the full report's findings. The SPM is the most-read part of the report.
Year 6: Government approval. The SPM goes to an intergovernmental approval session where government representatives from around the world negotiate the wording line by line. Every sentence must be agreed to by consensus. This process takes a week of round-the-clock meetings and produces a document that is scientifically accurate but diplomatically acceptable to all participating governments.
Year 6: Publication. The approved SPM is released alongside the full report. The report is published as a series of PDFs and hosted on the IPCC website (www.ipcc.ch).
Year 6+: Dissemination. The IPCC Secretariat, authors, and national academies promote the report through press conferences, policy briefings, lectures, and educational materials.
Where Automation Fits
The IPCC reports are mostly hand-crafted — humans write the text, humans draw the figures, humans negotiate the SPM. Automation is not the primary mode of production. But automated tools play critical supporting roles.
Literature search and citation management. Authors use tools like Zotero, Mendeley, or EndNote to track references. For a chapter with 500 references, automated citation management is essential. The authors also use literature search engines (Google Scholar, Web of Science) to find relevant papers.
Figure generation. Many figures are produced with matplotlib, ncview, Panoply, or other Python/scientific tools. The authors have the raw data (from climate models, observational datasets, or previous studies) and generate figures programmatically. The AR6 figures, in particular, were mostly produced with Python — matplotlib, xarray, cartopy, and seaborn are standard in the climate science community.
Data provenance. Each figure in an IPCC report is expected to be reproducible. The underlying data is published alongside the report (often on the IPCC Data Distribution Centre) with metadata about sources and methods. For AR6, this included Jupyter notebooks showing how each figure was generated.
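A figure's provenance record can be pictured as a small metadata sidecar published alongside the data. The schema below is invented for illustration — it is not the IPCC's actual format — but it captures the kind of information (source datasets, generating code, environment) that reproducibility requires:

```json
{
  "figure": "Figure 3.2 (hypothetical)",
  "datasets": [
    {"name": "HadCRUT5", "role": "observed surface temperature"}
  ],
  "code": "notebooks/fig_3_2.ipynb",
  "environment": "Python 3.10, matplotlib 3.5, xarray 2022.03",
  "license": "CC BY 4.0"
}
```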
Review comment tracking. The review process produces thousands of comments per chapter. The IPCC uses custom tracking systems to manage comments, author responses, and revision histories. This is automation applied to a manual process — the commenting itself is human, but the tracking is technical.
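The core of such a tracker is a simple record linking each comment to its written response. A minimal sketch, assuming invented field names — the IPCC's real system is custom and its schema is not public:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the record a review-comment tracker might keep.
# Field names and the ID format are invented for illustration only.

@dataclass
class ReviewComment:
    comment_id: str                   # e.g. "WG1-Ch03-0142" (invented format)
    chapter: str
    reviewer: str
    text: str
    response: str = ""                # the authors' written response
    accepted: Optional[bool] = None   # None until the authors decide

def unresolved(comments):
    """Return comments the authors have not yet responded to."""
    return [c for c in comments if c.accepted is None]
```

The value of the automation is exactly this kind of bookkeeping: with thousands of comments per chapter, "which comments still lack a response" must be a query, not a manual audit.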
Translation. The IPCC reports are translated into multiple UN languages (English, French, Spanish, Russian, Chinese, Arabic). Professional translators do the work, but translation memory tools track terms and ensure consistency across chapters.
Website and media. The IPCC website is built from the source content with modern web tools. The press materials are generated from the SPM and the full report using templating (similar to Jinja2). The visual identity is consistent across materials because a brand template is applied automatically.
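The templating idea above can be sketched with the standard library's `string.Template` as a stand-in for a Jinja2-style engine. The field names and sample values are illustrative, not the IPCC's actual press-release template:

```python
from string import Template

# Jinja2-style templating sketch using only the standard library.
# The template fields and the sample values below are invented for
# illustration; the real press materials use a richer template.

press_release = Template(
    "$title\n"
    "Released: $date\n\n"
    "Headline finding: $finding\n"
)

text = press_release.substitute(
    title="Working Group I Report",
    date="9 August 2021",
    finding="Recent changes in the climate are widespread and rapid.",
)
print(text)
```

The design point is separation of concerns: the approved SPM text is the single source of truth, and every derived artifact (press release, web page, briefing) is rendered from it rather than copied by hand.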
Data visualizations for outreach. Beyond the formal figures in the reports, the IPCC produces interactive dashboards, animated videos, and infographics for public communication. These use tools like D3.js, Plotly, and custom interactive frameworks. Some of the more recent outreach materials use Streamlit or Dash for interactive exploration.
So while the core writing and figure design are human, the production infrastructure around the reports is heavily automated. Without that automation, the sheer volume of content (thousands of pages, thousands of figures, tens of thousands of citations) would be unmanageable.
The Challenges of Massive Scientific Reports
The IPCC reports face several challenges that are unusual even for large scientific documents:
Multi-author coordination. A chapter with 20 authors writing together needs strong coordination tools. Google Docs, LaTeX with Git, and custom document management systems all get used. No single tool dominates; different working groups use different approaches.
Consensus building. The SPM negotiation is famously grueling. Scientists and governments have different priorities — scientists want to represent the literature accurately, governments want to protect their political positions. The negotiated text is a compromise, and some critics argue it weakens the scientific message. Others argue that the consensus is what gives the SPM its authority.
Uncertainty communication. IPCC reports use a specific vocabulary for uncertainty: "virtually certain" (>99% probability), "very likely" (>90%), "likely" (>66%), "about as likely as not" (33-66%), "unlikely" (<33%), "very unlikely" (<10%), "exceptionally unlikely" (<1%). This vocabulary is standardized across all reports so readers can compare statements. Automating the translation from "the paper says X" to "the IPCC says Y with [likelihood]" is subtle and requires careful calibration.
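The calibrated vocabulary itself is mechanical enough to encode. A minimal sketch that maps a numeric probability to the terms quoted above — the function name and the decision to return the most specific applicable term are my own choices, not an IPCC tool:

```python
# Map a probability in [0, 1] to the IPCC calibrated likelihood vocabulary.
# Thresholds follow the scale quoted in the text; because the bands are
# nested (a ">90%" finding is also ">66%"), we return the most specific term.

def likelihood_term(p: float) -> str:
    """Return the IPCC calibrated-language term for probability p."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    if p > 0.99:
        return "virtually certain"
    if p > 0.90:
        return "very likely"
    if p > 0.66:
        return "likely"
    if p < 0.01:
        return "exceptionally unlikely"
    if p < 0.10:
        return "very unlikely"
    if p < 0.33:
        return "unlikely"
    return "about as likely as not"
```

Encoding the vocabulary is the easy part; the genuinely hard, unautomatable step is deciding what probability the underlying body of literature actually supports.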
Visualization consistency. With hundreds of figures across multiple Working Groups, maintaining visual consistency is hard. AR6 made an effort to produce figures with a common style, using consistent colormaps, typography, and panel labels. This required a shared matplotlib style sheet and coordination across author teams. The result is the most visually cohesive IPCC report to date.
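A shared style sheet of the kind described above is just a plain-text matplotlib configuration file. A minimal hypothetical example (the file name and every value are invented for illustration):

```
# ipcc_shared.mplstyle -- hypothetical shared style; values illustrative
figure.dpi      : 300
font.family     : sans-serif
font.size       : 9
axes.titlesize  : 10
axes.labelsize  : 9
image.cmap      : viridis
savefig.format  : pdf
```

Author teams would then apply it with `plt.style.use("ipcc_shared.mplstyle")` at the top of each figure script, so consistency comes from a shared artifact rather than from every team remembering the rules.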
Global accessibility. The reports must work for readers in dozens of countries, with varying English fluency and scientific backgrounds. Text is written in plain but precise English, figures are labeled with care, and translations are commissioned immediately after publication.
Theory Connection: The Limits of Automation
The IPCC reports illustrate the limits of what automation can do for high-stakes scientific reporting. The most important parts of the process — synthesizing the literature, writing the text, negotiating the SPM — cannot be fully automated without losing the authority that makes the reports valuable. Automation helps with the mechanical parts (figure generation, citation management, translation memory, website production) but not with the intellectual parts.
This is a specific case of a general pattern: automation shifts work rather than eliminating it. The IPCC does not spend less time on reports because of automation; it spends the time differently. Figure production that used to take weeks now takes days, freeing authors to spend more time on scientific interpretation. Citation management that used to be nightmarish is now smooth, freeing authors to focus on reading. The total effort is similar, but the effort is allocated differently.
For practitioners, the lesson is to automate the mechanical parts and leave the judgment parts to humans. In your own report pipelines, identify which tasks are truly mechanical (data loading, chart building, layout, delivery) and automate them. Identify which tasks require judgment (interpretation, emphasis, commentary, audience tailoring) and keep humans in the loop. This hybrid approach produces better results than either full automation or fully manual work.
The morning email from Case Study 1 and the IPCC AR report from Case Study 2 bracket the range of automated reporting. The morning email is maximum automation (content and delivery). The IPCC report is minimum automation (only infrastructure). Most real report pipelines sit between these extremes, and the right balance depends on the context. For a daily operational report, automate everything. For a high-stakes scientific synthesis, automate the infrastructure and let humans write.
Discussion Questions
- On consensus in science. The IPCC's consensus-based process produces reports that all participating governments agree to. Does this improve the authority of the reports, or does it weaken the scientific message?
- On automation limits. What parts of the IPCC process could be further automated? What parts cannot?
- On visualization consistency. AR6 made an effort to produce visually cohesive figures across working groups. What tools and processes enabled this?
- On the SPM negotiation. The line-by-line negotiation of the Summary for Policymakers is famously painful. Is there a better process, or is pain inherent to consensus-building?
- On your own scientific reports. If you were writing a multi-chapter scientific report with many collaborators, what tools and processes would you use?
- On the IPCC vs. the morning email. These two case studies represent opposite ends of the automation spectrum. Where does most real-world reporting fall?
The IPCC reports are one of the most ambitious scientific reporting efforts in history. They are produced mostly by hand, with automation supporting rather than replacing human work. The contrast with the morning email (Case Study 1) is instructive: different contexts call for different balances of automation and human effort. When you build your own report pipelines, think about which end of the spectrum you are on, and automate accordingly. For most business and scientific contexts, you will be somewhere in the middle — some automation for efficiency, some human input for judgment. The tools in this chapter support both ends of the spectrum.