
Chapter 23: Weather Surveillance and Climate Monitoring

Opening: The First Surveillance Network

There is a temptation, in a textbook about surveillance, to treat it as a modern phenomenon — something that emerged with CCTV cameras in the 1970s, became expansive with the internet in the 1990s, and became omnipresent with smartphones after 2007. This temptation should be resisted. Systematic observation, the collection and analysis of data from distributed sources, the integration of that data into decision-making by institutions with power over people's lives — this is surveillance, and it is very old.

The oldest continuous surveillance network in the modern sense is not a police database or a credit bureau. It is the weather observation system.

Human beings have been systematically monitoring atmospheric conditions since at least the seventeenth century. The Medici network of weather stations, established in Italy in 1654 under Grand Duke Ferdinand II, placed barometers and thermometers at multiple locations across Europe and required regular, standardized observations to be recorded and transmitted to a central location in Florence for analysis. This was, by any reasonable definition, a surveillance network: distributed sensors, standardized data collection, transmission to a center, and analysis for institutional purposes.

The Grand Duke wanted to know, before it was apparent locally, whether weather patterns were changing. He wanted advance warning. He wanted the view from above — not from a literal elevation but from an information position that no single local observer could occupy.

This desire — to see more than any individual can see, to detect patterns across space and time, to know before you would otherwise know — is the fundamental motivation of surveillance. Weather observation expressed it centuries before CCTV or the internet existed.


23.1 The History of Weather Observation: Building the Network

The modern global weather observation system developed over three centuries through a series of technological advances and institutional agreements. Understanding this history reveals weather monitoring as a paradigm case of surveillance infrastructure development — and illuminates why it has remained relatively uncontroversial while other, structurally similar, surveillance systems are intensely contested.

Early Instrumentation: Creating the Measurable Atmosphere

Before instruments, weather could be observed and described but not measured. Temperature was hot or cold; wind was still or stormy; rain was light or heavy. These descriptions were qualitative, non-standardized, and not comparable across locations or observers.

The invention of measurable atmospheric instruments transformed observation into data:

  • Thermometer (thermoscope ca. 1600, Galileo among others; sealed liquid-in-glass thermometers by the mid-seventeenth century): Provided standardized temperature measurement, enabling comparison across observers and locations
  • Barometer (Evangelista Torricelli, 1643): Measured atmospheric pressure, which proved to be the key variable for weather prediction — falling pressure typically precedes storms
  • Rain gauge: Standardized precipitation measurement
  • Anemometer (Leon Battista Alberti, ca. 1450; standardized forms later): Measured wind speed

The Medici network was among the first to systematically deploy these instruments across multiple locations with standardized protocols. But the network was limited in scale and survived only while institutionally supported — when the Medici political power waned, the network dissolved.

The Telegraph and the Synoptic Map

The fundamental problem of early weather networks was time. A barometer reading in Paris was interesting; the same reading integrated with simultaneous readings from Brussels, London, Frankfurt, and Madrid was revolutionary. But collecting simultaneous observations from across a continent required rapid communication.

The electric telegraph, commercially available by the 1840s, solved this problem. For the first time, atmospheric measurements could be transmitted faster than weather moves. An observation made in Paris at 8:00 AM could be on the desk of a London meteorologist at 8:15 AM — before the weather system it documented had reached the Channel.

The synoptic map — a map showing weather conditions simultaneously across a large area — became possible only with the telegraph. "Synoptic" means "seeing together," which is precisely the surveillance aspiration: to see what no single observer could see, assembled from the reports of many.

The United States established the Signal Corps weather service (later the Weather Bureau, later the National Weather Service) in 1870, operating a telegraph-linked network of weather stations across the country. Europe developed similar national networks. By the late nineteenth century, regular exchange of weather data between national services was occurring across Europe, and the first international meteorological conference (Vienna, 1873) established the principles of international data sharing that still govern the World Meteorological Organization today.

📝 Note: International Weather Data as Commons

One of the most remarkable features of the global weather observation system is that it operates on a principle of mandatory data sharing between nations. Under World Meteorological Organization (WMO) agreements, national weather services are required to share basic observational data in near-real time with all other member nations. This means that data collected by the U.S. National Weather Service is immediately available to Russia, China, North Korea, and every other WMO member — and vice versa. Weather data is, in this sense, a genuine global commons. This stands in stark contrast to intelligence satellite imagery or epidemiological data, which are jealously guarded by national institutions. The contrast raises a question: what made weather data a commons, and what would it take to extend that principle to other forms of surveillance data?

Radiosondes: Taking the Third Dimension

Surface weather stations measure what is happening on the ground. But weather is a three-dimensional phenomenon — the conditions at 10,000 feet matter enormously for understanding and predicting what will happen at the surface. Routine upper-air measurement was impractical until the development of the radiosonde in the early twentieth century.

A radiosonde is an instrument package carried aloft by a weather balloon. As the balloon rises through the atmosphere, the radiosonde continuously measures and transmits:

  • Temperature
  • Relative humidity
  • Atmospheric pressure (used to calculate altitude)
  • Wind speed and direction (calculated from GPS tracking of the balloon's horizontal drift; a minimal sketch of this calculation follows below)
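To make the wind calculation concrete, here is a minimal sketch, assuming two GPS fixes a couple of seconds apart; the coordinates, interval, and function name are illustrative, and a flat-Earth approximation stands in for the geodesic math a real ground-station processor would use.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def wind_from_drift(lat1, lon1, lat2, lon2, dt_s):
    """Estimate horizontal wind from the balloon's GPS drift between
    two fixes (in degrees) taken dt_s seconds apart.

    Uses an equirectangular (flat-Earth) approximation, adequate for
    the few hundred metres a balloon drifts between fixes.
    """
    lat_mid = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(lat_mid) * EARTH_RADIUS_M  # east
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M                      # north
    speed = math.hypot(dx, dy) / dt_s                                    # m/s
    # Meteorological convention: direction the wind blows FROM,
    # measured clockwise from north.
    direction = (math.degrees(math.atan2(dx, dy)) + 180) % 360
    return speed, direction

# Example: balloon drifts ~49 m east-northeast in 2 s -> ~24 m/s wind
print(wind_from_drift(39.0000, -77.0000, 39.0002, -76.9995, 2.0))
```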

The balloon rises until it bursts, typically at altitudes of 20–30 km, and the radiosonde descends on a small parachute. Only a minority of released radiosondes — roughly 20 percent in the United States — are recovered and returned to weather services; the rest become litter.

The global radiosonde network comprises approximately 900 stations worldwide, each launching two balloons per day at the same times (00:00 UTC and 12:00 UTC) to enable simultaneous global upper-air observations. This coordinated, standardized, global data collection is one of the most systematic scientific observation programs in human history.

Process Diagram: The Radiosonde Launch and Data Collection Cycle

Radiosonde instrument package prepared
        ↓
Attached to latex weather balloon filled with helium or hydrogen
        ↓
Balloon released from surface station
        ↓
During ascent (typically 90-120 minutes):
  Temperature, humidity, pressure measured every 1-2 seconds
  GPS position tracked continuously → wind calculated from drift
  Data transmitted to ground station via radio signal
        ↓
Balloon reaches burst altitude (20-30 km)
        ↓
Balloon bursts, radiosonde descends on parachute
        ↓
Data transmitted to national weather service
        ↓
Shared with World Meteorological Organization global exchange
        ↓
Incorporated into numerical weather prediction models

23.2 Doppler Radar: Seeing the Storm's Anatomy

Weather balloons measure atmospheric conditions at specific locations and times. They cannot track the detailed structure of precipitation, severe weather, or wind patterns across a landscape in real time. This is the domain of Doppler radar.

How Doppler Radar Works

Weather radar works by transmitting pulses of microwave energy and measuring the time and intensity of their return echo from precipitation (rain, snow, hail). From the return echo, radar systems can determine:

  • Precipitation location: Where rain or snow is falling, mapped across the radar's range (typically 250-460 km for long-range systems)
  • Precipitation intensity: The stronger the return echo, the heavier the precipitation
  • Precipitation type: Dual-polarization radar (introduced to the U.S. network in 2013) can distinguish rain, snow, hail, and mixed precipitation by comparing vertical and horizontal radar returns

The "Doppler" component — which distinguishes modern weather radar from earlier systems — adds velocity measurement. By analyzing the slight frequency shift in the return echo (the Doppler effect, the same principle that makes a passing siren shift in pitch), Doppler radar measures the speed at which precipitation is moving toward or away from the radar antenna. This enables:

  • Wind speed mapping: Doppler velocity products show wind structure across the radar's coverage area
  • Rotation detection: Wind shear (adjacent parcels of air moving in different directions) creates characteristic Doppler signatures that indicate tornado potential. The rotation that spawns most tornadoes is now detected by Doppler radar before the tornado touches the ground, providing life-saving advance warning
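The underlying arithmetic is simple. The sketch below computes radial velocity from the frequency shift; the 2.85 GHz carrier is an assumed round number within the S-band range WSR-88D radars use, not the exact frequency of any particular station.

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(freq_shift_hz, carrier_hz=2.85e9):
    """Radial velocity of precipitation from the Doppler frequency
    shift of the return echo.

    The factor of 2 appears because the shift is applied twice: once
    on the way out to the moving target, once on the way back.
    A positive shift means the target is moving toward the radar.
    """
    return C * freq_shift_hz / (2 * carrier_hz)

# A 950 Hz shift at S-band corresponds to ~50 m/s toward the radar
print(radial_velocity(950))  # ~49.97 m/s
```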

Process Diagram: How Doppler Radar Detects Severe Weather

Radar antenna rotates 360°, transmitting microwave pulses at multiple elevation angles
        ↓
Each pulse returns echoes from precipitation
  Return timing → distance to precipitation
  Return intensity → precipitation rate
  Return frequency shift → precipitation velocity (toward/away from radar)
        ↓
Data from all directions assembled into:
  Reflectivity product: Where is it raining/snowing?
  Velocity product: How is precipitation moving?
  Dual-polarization products: What type of precipitation?
        ↓
Algorithms process data:
  Detect rotation signatures → mesocyclone/tornado alerts
  Calculate precipitation accumulation
  Identify melting layer (bright band)
        ↓
Products distributed to:
  National Weather Service offices → human meteorologists interpret
  Broadcast meteorologists
  Emergency managers
  Automated alert systems (NWS Wireless Emergency Alerts)

The WSR-88D Network

The United States operates 159 Weather Surveillance Radar-1988 Doppler (WSR-88D) stations — commonly known as NEXRAD (Next Generation Radar) — which together provide nearly complete radar coverage of the contiguous United States. Each station runs continuously, 24 hours a day, 365 days a year, producing complete volumetric scans of the surrounding atmosphere every 4-6 minutes.

This is remarkable surveillance infrastructure: 159 sensors, each continuously monitoring a volume of atmosphere out to a radius of 300+ km, generating data that is processed automatically, distributed in real time, and archived indefinitely. The aggregate data volume from the NEXRAD network is enormous — and all of it is publicly available (a sketch of accessing the public archive follows below).
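As one illustration of that openness, archived NEXRAD Level II data is hosted as a public dataset on AWS. The sketch below lists a few archive files anonymously; the bucket name and YYYY/MM/DD/SITE key layout follow the public archive's documented convention, but treat the specific date and station prefix as an illustrative example.

```python
# A minimal sketch of listing publicly archived NEXRAD Level II files
# from the AWS Open Data bucket; no credentials are required.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

resp = s3.list_objects_v2(
    Bucket="noaa-nexrad-level2",
    Prefix="2016/05/09/KTLX/",  # Oklahoma City radar, one day of scans
    MaxKeys=5,
)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```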

💡 Intuition: NEXRAD as a Model Surveillance Network

Consider NEXRAD's properties: it is distributed (159 stations), continuous (never stops), comprehensive (covers the entire country), standardized (all stations use the same hardware and protocols), networked (data is aggregated at central servers in near-real time), open (data is publicly accessible), and perpetually archived. If any other surveillance system had these properties — 159 distributed sensors providing continuous, comprehensive coverage of the entire country, with all data publicly available and archived indefinitely — it would be considered an extraordinary surveillance apparatus, with implications for civil liberties and privacy law. NEXRAD has all of these properties and generates essentially no civil liberties concern whatsoever. Why? Because its subjects are raindrops, not people.


23.3 Weather Satellites: The Atmospheric View from Orbit

Satellite imagery for weather observation predates commercial remote sensing by decades. The first meteorological satellite, TIROS-1 (Television Infrared Observation Satellite), was launched by NASA in April 1960 — the same year as the first successful Corona spy satellite mission.

Weather satellites occupy two primary orbital regimes:

Geostationary Satellites (GEO)

Geostationary satellites orbit at approximately 35,786 km altitude, exactly matching Earth's rotation rate and thus appearing to hang stationary over a fixed point on the equator. This allows them to continuously monitor the same large area of Earth without interruption.
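The 35,786 km figure is not arbitrary: it falls out of Kepler's third law for an orbit whose period equals one sidereal day. A short derivation sketch, using standard physical constants:

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY_S = 86_164.1   # one Earth rotation relative to the stars
EARTH_EQ_RADIUS_M = 6_378_137.0

# Kepler's third law: an orbit with period T has semi-major axis
# a = (mu * T^2 / (4 * pi^2))^(1/3)
a = (MU_EARTH * SIDEREAL_DAY_S**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - EARTH_EQ_RADIUS_M) / 1000

print(f"geostationary altitude: {altitude_km:,.0f} km")  # ~35,786 km
```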

The GOES (Geostationary Operational Environmental Satellite) series has been operated by NOAA since 1975. Current operational satellites are GOES-East (GOES-16, monitoring the Western Hemisphere from 75°W longitude) and GOES-West (GOES-18, monitoring the Americas and Pacific from 137°W). Together they provide continuous, high-temporal-resolution imagery of the Americas with image refresh rates as fast as 30 seconds for severe weather events.

Key GOES capabilities:

  • Visible imagery: High-resolution cloud images (at 0.5 km resolution for the primary channel) updated continuously
  • Infrared imagery: Cloud top temperature measurement, enabling detection of thunderstorm intensity and fog
  • Water vapor imagery: Direct measurement of water vapor in the upper troposphere, essential for tracking large-scale atmospheric flow
  • Lightning mapping: The Geostationary Lightning Mapper on GOES-16/18 detects individual lightning flashes across the Americas, providing early warning of convective intensification

Polar Orbiting Satellites (LEO)

Polar orbiting weather satellites orbit at lower altitudes (approximately 850 km) and pass over every location on Earth approximately twice daily. Because they fly lower, they can carry instruments with finer resolution and higher sensitivity than geostationary satellites. The primary U.S. polar orbiting weather satellites are operated under the JPSS (Joint Polar Satellite System) program.

Polar orbiting satellites are particularly important for:

  • Atmospheric soundings: Temperature and humidity profiles derived from microwave and infrared instruments, equivalent to a radiosonde but without launching physical balloons
  • Soil moisture and sea surface temperature: Passive microwave sensors detect radiation emitted by Earth's surface
  • Sea ice extent: Passive microwave imagery enables year-round monitoring of Arctic and Antarctic sea ice regardless of cloud cover or darkness

🌍 Global Perspective: The WMO Data Commons in Practice

Under World Meteorological Organization Resolution 40 (1995), basic meteorological data — including weather satellite imagery, upper-air soundings, and surface observations — must be shared freely and without restriction between WMO member nations. This means that NOAA's GOES imagery is available to any nation's weather service, as is European Meteosat data, Japanese Himawari data, and so on. The global weather observation system is thus a genuine data commons — the most successful example in history of a large-scale, institutionalized commitment to open data sharing across national boundaries. The contrast with surveillance data, which is typically classified and hoarded, is instructive.


23.4 The Modern Weather Network: Surface Stations, Buoys, and ASOS

Weather satellites and Doppler radar are the most visible components of the weather observation system, but the backbone of operational meteorology is the surface observation network — thousands of standardized instruments measuring conditions at ground level.

The ASOS Network

The Automated Surface Observing System (ASOS) comprises more than 900 automated weather stations at airports and other locations across the United States. Each ASOS station measures:

  • Temperature and dew point
  • Wind speed, direction, and gusts
  • Visibility (using a forward-scatter visibility sensor)
  • Precipitation type and rate
  • Sky condition (ceiling and cloud layers, using a ceilometer laser)
  • Atmospheric pressure (altimeter setting)

ASOS stations record conditions every minute and transmit standardized coded weather observations (METARs) hourly — or more frequently, as special reports, when conditions change significantly. The METAR format is internationally standardized, enabling direct comparison of observations from reporting stations worldwide; a minimal decoding sketch follows below.
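As an illustration of the format, the sketch below decodes the wind and temperature groups of a METAR report; the sample string and regular expressions are a simplified illustration, not a complete decoder (real METARs carry many more groups and edge cases).

```python
import re

# An illustrative (not real) METAR: station, time, wind, visibility,
# cloud, temperature/dew point, altimeter setting.
metar = "KJFK 091651Z 18012G20KT 10SM FEW250 24/12 A3002"

wind = re.search(r"\b(\d{3})(\d{2,3})(?:G(\d{2,3}))?KT\b", metar)
temp = re.search(r"\b(M?\d{2})/(M?\d{2})\b", metar)

def celsius(group):
    # METAR encodes below-zero temperatures with an 'M' prefix
    return -int(group[1:]) if group.startswith("M") else int(group)

print("wind from", wind.group(1), "deg at", wind.group(2), "kt,",
      "gusting", wind.group(3) or "n/a")
print("temperature", celsius(temp.group(1)), "C,",
      "dew point", celsius(temp.group(2)), "C")
```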

Buoys and Ocean Monitoring

Ocean weather observations present unique challenges: there are no land surfaces to mount instruments, and maintaining human-staffed ships across the global ocean is prohibitively expensive. The solutions are moored buoys (anchored to the ocean floor in fixed locations) and drifting buoys (carried by ocean currents).

The National Data Buoy Center (NDBC) operates more than 1,000 buoys and Coastal-Marine Automated Network (C-MAN) stations, providing weather observations from offshore locations that would otherwise have no surface data. Buoys measure:

  • Air temperature, pressure, and humidity
  • Wind speed and direction
  • Wave height, period, and direction
  • Ocean surface temperature
  • In some cases, ocean current speed and direction

The ARGO float program extends ocean observation below the surface: approximately 4,000 autonomous profiling floats cycle between the surface and 2,000 m depth, measuring temperature and salinity at multiple depths and transmitting data via satellite when they surface. This is perhaps the most remarkable example of environmental surveillance — autonomous devices drifting throughout the world's oceans, continuously measuring conditions, and reporting to central databases, operated by dozens of nations as a global commons.


23.5 Climate Surveillance: The Long View

Operational weather forecasting uses surveillance data to answer the question "what will happen in the next few days?" Climate science uses surveillance data to answer the question "how is the Earth changing over decades and centuries?" The distinction is one of timescale, not fundamental methodology.

CO2 and Greenhouse Gas Monitoring

The most consequential form of climate monitoring is the measurement of atmospheric greenhouse gas concentrations. The foundational dataset is the Keeling Curve — continuous measurements of atmospheric CO2 concentration taken at the Mauna Loa Observatory in Hawaii since 1958 by Charles David Keeling and his successors.

The Keeling Curve is now the most important graph in environmental science: it shows the monotonic increase in atmospheric CO2 from 315 parts per million in 1958 to more than 420 ppm by 2023, with a characteristic seasonal oscillation overlaid on the long-term trend (Northern Hemisphere vegetation draws down CO2 in summer and releases it in winter, creating annual "breathing" cycles).
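The standard way to describe the curve quantitatively is to fit a smooth growth trend plus an annual harmonic. The sketch below does this with ordinary least squares on synthetic data generated to mimic the real record; the coefficients are illustrative, not the published Mauna Loa values.

```python
import numpy as np

t = np.arange(1958, 2023, 1 / 12)          # monthly time axis, in years
co2 = (315                                  # synthetic ppm series:
       + 1.2 * (t - 1958)                   #   linear growth
       + 0.006 * (t - 1958) ** 2            #   accelerating growth
       + 3.0 * np.sin(2 * np.pi * t))       #   annual "breathing" cycle

# Design matrix: quadratic trend plus one annual sine/cosine pair
X = np.column_stack([
    np.ones_like(t), t - 1958, (t - 1958) ** 2,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
])
coef, *_ = np.linalg.lstsq(X, co2, rcond=None)

growth_now = coef[1] + 2 * coef[2] * (2022 - 1958)  # ppm per year
seasonal_amp = np.hypot(coef[3], coef[4])           # ppm amplitude
print(f"recent growth ~{growth_now:.2f} ppm/yr, "
      f"seasonal amplitude ~{seasonal_amp:.1f} ppm")
```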

The Global Atmosphere Watch (GAW) network, coordinated by the WMO, maintains approximately 30 "global" baseline stations measuring greenhouse gases, aerosols, and other atmospheric constituents at locations specifically chosen for their distance from local pollution sources. This network provides the gold standard atmospheric composition data that underpins international climate assessments.

Ice Cores and Proxy Climate Records

Monitoring atmospheric change over historical timescales requires sources that predate instrumental records. Ice cores — samples drilled from glaciers and ice sheets in Greenland, Antarctica, and other cold regions — contain air bubbles trapped at the time the ice formed, preserving atmospheric composition from thousands to hundreds of thousands of years ago.

Analysis of ice core air bubbles provides direct measurements of past CO2 and methane concentrations. Analysis of the ice itself (specifically, the ratio of oxygen isotopes) provides temperature estimates. The combination produces a climate history extending 800,000 years into the past — revealing the relationship between greenhouse gas concentrations and global temperature across multiple glacial cycles.
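For readers interested in the mechanics, the sketch below shows the delta notation used to report isotope ratios and a linear temperature conversion; the calibration slope and intercept are illustrative placeholders (real calibrations are site-specific and actively debated), and only the VSMOW reference ratio is a standard value.

```python
R_VSMOW = 2005.2e-6  # 18O/16O ratio of the VSMOW reference standard

def delta_18o(r_sample):
    """delta-18O in per mil: relative deviation of the sample's
    18O/16O ratio from the VSMOW standard, multiplied by 1000."""
    return (r_sample / R_VSMOW - 1) * 1000

def temp_estimate(d18o, slope=0.67, intercept=13.7):
    # Illustrative linear proxy calibration: T = (d18O + intercept) / slope.
    # Real calibrations vary by site and epoch.
    return (d18o + intercept) / slope

d = delta_18o(1938.0e-6)  # a strongly depleted (glacial-like) sample
print(f"d18O = {d:.1f} per mil -> T ~ {temp_estimate(d):.1f} C")
```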

Ice core analysis is not "surveillance" in the conventional sense, but it is monitoring over time — systematic observation of past conditions to understand trends and trajectories. The methodological continuity with operational atmospheric monitoring is direct.

🎓 Advanced: Evidence Evaluation Framework for Climate Data Claims

When evaluating claims about climate change based on monitoring data, apply the following framework:

TICS Framework (Type, Independence, Convergence, Scale)

  • Type: What type of data supports the claim? Direct measurement, proxy record, model output, or a combination?
  • Independence: Are the supporting datasets independent of each other, or do they share common sources of systematic error?
  • Convergence: Do multiple independent data types converge on the same conclusion? (Ice cores, direct measurements, satellite data, and ocean records all agree on warming trends — this convergence is strong evidence.)
  • Scale: What spatial and temporal scales does the evidence cover? A local temperature record is less compelling than a global network; a 20-year trend is less compelling than a 100-year one

Climate change science scores highly on all four dimensions: the evidence base includes multiple independent data types (direct atmospheric measurements, ice cores, ocean heat content, satellite-derived surface temperature, sea level rise) that converge on consistent conclusions across scales from local to global and from decade to century.


23.6 Private Weather: The Commodification of Atmospheric Surveillance

The global weather observation system was built by governments as a public good — understanding weather and providing forecasts is a matter of public safety, affecting agriculture, aviation, emergency management, and daily life for every person. But the twentieth and twenty-first centuries have seen the progressive privatization of weather data and forecasting services.

The Weather Company and IBM

The Weather Company, originally the Weather Channel (launched 1982), built a media and data business on top of the public observation infrastructure of NOAA and the National Weather Service. Using publicly collected weather data, the company developed proprietary modeling, forecasting, and data products sold to businesses and individuals. Acquired by IBM in 2016 for approximately $2 billion, the Weather Company now provides weather risk analytics to insurance companies, retail chains (weather drives foot traffic), utilities, and logistics companies.

The business model is instructive: the underlying surveillance infrastructure (weather observation networks) is publicly funded and freely available. The Weather Company's value is in the analytical layer built on top of that public infrastructure — proprietary models, decision-support tools, and sector-specific applications. This is a common pattern in the privatization of public surveillance infrastructure: the data collection is a public good; the valuable derivative products are private.

ClimaCell (Tomorrow.io) and Hyperlocal Forecasting

Tomorrow.io (formerly ClimaCell) represents a newer approach: using unconventional data sources — cellular network signal strength, GPS satellite signal delays, commercial aircraft data — to supplement and improve on traditional weather observation. The insight is that the cellular network is, in effect, a distributed weather sensor: when it rains, it affects the strength of microwave signals between cell towers in characteristic ways that can be detected and used to estimate precipitation intensity.
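The physics behind this is a power-law relationship between rain rate and the attenuation of a microwave link. The sketch below inverts it; the k and alpha coefficients are illustrative round numbers (real values depend on frequency and polarization, as tabulated in ITU-R recommendation P.838), and the function name is hypothetical.

```python
def rain_rate_mm_per_hr(attenuation_db, path_km, k=0.12, alpha=1.1):
    """Invert the specific-attenuation power law gamma = k * R**alpha,
    where gamma is the rain-attributable attenuation per km along the
    tower-to-tower path and R is the rain rate in mm/h."""
    gamma = attenuation_db / path_km   # dB/km attributable to rain
    return (gamma / k) ** (1 / alpha)

# A 9 dB signal drop over a 5 km link -> roughly 12 mm/h of rain
print(f"{rain_rate_mm_per_hr(9.0, 5.0):.1f} mm/h")
```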

This is a clear example of surveillance repurposing: cell tower signal data collected primarily for network management is reanalyzed as a weather observation dataset. The cell towers become, without being designed for the purpose, nodes in a precipitation monitoring network.

⚠️ Common Pitfall: The Public/Private Infrastructure Confusion

A common confusion in discussions of weather privatization is between who operates the observation infrastructure and who profits from weather data. The core observation infrastructure — NEXRAD radars, GOES satellites, ASOS stations, radiosondes — is operated by government agencies (NOAA, NWS) and funded by taxpayers. The data these systems produce is freely available. Private companies do not own this infrastructure. What private companies own is the analytical and service layer built on top of the free data: proprietary forecasting models, API access, decision-support applications, and sector-specific products. The surveillance infrastructure remains public; the value extraction from that infrastructure is increasingly private.


23.7 When Climate Monitoring Meets Human Surveillance: Carbon Footprint Tracking

Weather and climate monitoring are generally regarded as uncontroversial — monitoring the atmosphere does not directly harm any individual or group, and the benefits (weather forecasts, climate science) are broadly shared. But at the edge of climate monitoring, there is a category of applications that begins to look more like the surveillance of people than the surveillance of weather.

Carbon Footprint Tracking

The concept of an individual "carbon footprint" — the amount of greenhouse gas emissions attributable to a single person's activities — was popularized in the early 2000s (and, significantly, by a BP advertising campaign designed to shift responsibility for climate change from corporations to individuals). Carbon footprint tracking takes this concept and applies surveillance logic to it: continuously monitoring an individual's activities to estimate their greenhouse gas emissions.

Several applications and services have been developed for personal carbon tracking (a minimal sketch of the spend-based approach follows this list):

  • Financial transaction monitoring: Some credit card companies and banking apps analyze spending patterns to estimate associated carbon emissions (e.g., Aspiration, Doconomy)
  • Travel tracking: Apps that monitor transportation mode (car, train, flight) and estimate associated emissions
  • Home energy monitoring: Smart meters and home energy management systems that track consumption and convert it to carbon equivalents
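A minimal sketch of the spend-based method, assuming made-up emission factors — real services use proprietary, category-specific factor tables:

```python
EMISSION_FACTORS = {          # kg CO2e per USD, illustrative only
    "gasoline": 1.0,
    "air_travel": 0.9,
    "groceries": 0.3,
    "electronics": 0.2,
}

def footprint(transactions):
    """Sum estimated emissions over (category, amount_usd) records,
    falling back to a generic factor for unknown categories."""
    return sum(EMISSION_FACTORS.get(cat, 0.25) * usd
               for cat, usd in transactions)

txns = [("gasoline", 60.0), ("groceries", 120.0), ("air_travel", 400.0)]
print(f"estimated footprint: {footprint(txns):.0f} kg CO2e")  # ~456 kg
```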

These tools represent a form of behavioral surveillance in which the stated purpose is environmental — helping individuals understand and reduce their carbon impact. But the data collected — transaction records, location history, energy consumption — is detailed personal data with uses well beyond carbon tracking.

🔗 Connection: Carbon Surveillance and the Individual vs. Systemic Debate

The focus on individual carbon footprints has been critiqued by climate justice advocates as a strategy for diverting attention from systemic corporate and government-level emissions reductions. A related critique applies to carbon tracking apps: if individuals are being surveilled for their environmental behavior while corporations are not subject to equivalent transparency, the surveillance is being deployed asymmetrically — more intensive monitoring of individuals than of the institutional actors responsible for a much larger share of emissions. This pattern recurs throughout surveillance history: the intensive monitoring of individuals (workers, consumers, citizens) while corporate and government activity remains relatively opaque.

Insurance and Climate Risk

The insurance industry has become a major user of climate and weather data for what is essentially individual risk surveillance (a sketch of the actuarial logic follows this list):

  • Actuarial pricing: Property insurers use detailed flood, wildfire, and storm risk maps derived from weather and climate data to set premiums at the property level. A home in a flood zone is not just riskier — it receives a precisely calculated risk score based on its specific elevation, distance from water, and historical flood frequency
  • Climate risk disclosure: Financial regulators in the EU and, increasingly, the U.S. are requiring companies to disclose their exposure to climate risk — a form of mandated climate surveillance at the corporate level
  • Satellite-based claims verification: Insurance companies increasingly use satellite imagery and weather data to verify property damage claims — cross-referencing reported damage with historical weather records, radar data, and post-event satellite imagery
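A minimal sketch of the actuarial logic, with hypothetical probabilities, damages, and loading factor:

```python
def expected_annual_loss(scenarios):
    """scenarios: list of (annual_probability, damage_usd) pairs."""
    return sum(p * damage for p, damage in scenarios)

flood_scenarios = [
    (0.01, 250_000),   # "100-year" flood: major structural damage
    (0.04, 40_000),    # "25-year" flood: ground-floor damage
    (0.20, 3_000),     # nuisance flooding
]
eal = expected_annual_loss(flood_scenarios)
premium = eal * 1.4    # 40% loading for expenses and profit, illustrative
print(f"EAL ${eal:,.0f}/yr -> indicated premium ${premium:,.0f}/yr")
```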

This insurance use of climate monitoring data is not neutral: it determines who can afford coverage, which neighborhoods are insurable, and which properties become financially stranded as climate risk increases. The surveillance of climate risk is simultaneously a surveillance of community vulnerability — documenting, and potentially exacerbating, the inequalities that climate change creates.


23.8 The Surveillance of Nature and the Nature of Surveillance

The weather observation system is the oldest and most thoroughly normalized surveillance network in existence. It collects data from thousands of sensors, transmits it in real time to central repositories, applies automated analysis, and distributes the results to a vast institutional apparatus that uses them to make decisions affecting millions of people — agricultural planting schedules, aviation routing, emergency evacuation orders, crop insurance payouts.

And yet it generates essentially no controversy about surveillance, privacy, or civil liberties. Why?

The absence of human subjects: Weather monitoring's targets are atmospheric phenomena. No individual human is the subject of observation. The data does not represent or expose any person.

The legitimacy of the purpose: Weather forecasting and climate monitoring serve obvious, broadly shared interests. The benefits are diffuse (everyone benefits from better weather forecasts) rather than concentrated in particular institutional interests.

The transparency of the infrastructure: NEXRAD radar data, GOES satellite imagery, and ASOS station data are all publicly available in near-real time. The surveillance network is visible and its data is open.

Historical normalization: Weather monitoring has been institutionalized for so long, across such a wide range of societies, that it is treated as natural rather than as a product of specific institutional choices. We do not think of choosing to accept weather monitoring in the way we might think of choosing to accept CCTV cameras — the former predates living memory while the latter is recent and visible.

These contrasts are instructive. They suggest that if the conditions that make weather monitoring acceptable (non-human subjects, universally shared benefits, open data, historical normalization) could be extended to other forms of surveillance, those other forms might also become less controversial. Conversely, when surveillance does not meet these conditions — when subjects are human, benefits are concentrated, data is secret, and the practice is recent — it generates the controversy that weather monitoring avoids.

💡 Intuition: The "Natural" Surveillance

The fact that weather monitoring feels natural and inevitable while CCTV feels intrusive and contestable is not because weather monitoring is actually different in kind. It is because weather monitoring has been normalized over generations while CCTV has not — or not yet. Normalization is a process, not a feature of technologies. The surveillance technologies that feel most natural today were once novel and contested; the surveillance technologies that feel most novel and contested today may be fully normalized within a generation. Understanding this process should make us cautious about treating any surveillance practice as simply "how things are."


23.9 Summary

Weather surveillance is the oldest institutionalized monitoring system in existence. From the Medici weather station network of 1654 through the global NEXRAD, GOES, radiosonde, and ARGO float systems of today, atmospheric monitoring has developed into a global surveillance infrastructure of remarkable sophistication, scale, and openness.

The system works because its targets are non-human (atmospheric phenomena), its benefits are universal (everyone needs weather forecasts), its data is largely open (WMO Resolution 40 mandates international sharing), and its legitimacy has been normalized over centuries. These features distinguish it from contested surveillance systems while illuminating what makes any surveillance system more or less acceptable.

The privatization of weather data services represents a case study in the capture of public surveillance infrastructure by private interests — the data collection remains public, but the value added to that data is captured privately. Carbon footprint tracking and climate risk insurance demonstrate how climate monitoring logic extends into individual surveillance, with familiar patterns of asymmetric power and concentrated effect on marginalized communities.

Chapter 24 examines epidemiological surveillance — the monitoring of population health — which shares many features with weather monitoring (legitimate public purpose, historical normalization, professional scientific infrastructure) while raising more acute concerns about individual privacy because its subjects are unmistakably human.


Key Terms

Doppler radar: Weather radar that measures not only the presence and intensity of precipitation but also its velocity, using the Doppler effect to detect rotation and wind shear associated with severe weather.

Radiosonde: An instrument package carried aloft by a weather balloon, measuring and transmitting temperature, humidity, pressure, and wind data throughout the atmosphere.

NEXRAD (WSR-88D): The network of 159 Doppler radar stations operated jointly by the National Weather Service, the FAA, and the U.S. Air Force, providing continuous weather radar coverage of the United States.

GOES: Geostationary Operational Environmental Satellite; NOAA's geostationary weather satellite system providing continuous monitoring of the Western Hemisphere from orbit.

ASOS: Automated Surface Observing System; network of more than 900 standardized automated weather stations at airports and other locations across the United States.


Keeling Curve: The continuous record of atmospheric CO2 concentration at Mauna Loa Observatory, Hawaii, beginning in 1958; the foundational dataset of climate science.

ARGO float: An autonomous profiling float that cycles between the ocean surface and 2,000 m depth, measuring temperature and salinity; approximately 4,000 floats currently operating globally as a commons.

TICS framework: Type, Independence, Convergence, Scale — an evidence evaluation framework for assessing climate and environmental monitoring data claims.