" ... without models there are no data."

The second paragraph of the Introduction to the recent book "A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming" by Paul N. Edwards reads:

' ... without models there are no data. I'm not talking about the difference between "raw" and "cooked" data. I mean this literally. Today, no collection of signals or observations -- even from satellites, which can "see" the whole planet -- becomes global in time and space without passing through a series of models.'


A few pages on, Edwards summarizes: "... everything we know about the world's climate -- past, present, and future -- we know through models."

The book distinguishes three types of computer models:

1. Simulation models built on physical theory. These use the equations of atmospheric physics to compute large-scale atmospheric motion numerically and predict the weather.

2. Reanalysis models, which grew out of weather forecasting. These also simulate the weather, but they check their results against actual weather observations.

3. Data analysis models, which are collections of mathematical techniques, algorithms, and empirically derived adjustments to instrument readings (a toy sketch of this idea follows below).
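
To make that third category concrete, here is a minimal, hypothetical sketch of what "data passing through models" can look like. Everything in it is invented for illustration: the station readings, the constant bias correction, and the inverse-distance-weighting interpolation are deliberately crude stand-ins for the far more elaborate adjustment and gridding models Edwards describes.

```python
import math

# Hypothetical station readings: (latitude, longitude, raw reading in deg C).
# All values are invented for illustration.
stations = [
    (40.7,  -74.0, 11.2),
    (51.5,   -0.1,  9.8),
    (-33.9, 151.2, 18.4),
    (35.7,  139.7, 14.1),
]

# Step 1: an empirically derived instrument adjustment -- here just a
# made-up constant bias correction.
BIAS_CORRECTION = -0.3  # deg C

def adjusted(reading):
    return reading + BIAS_CORRECTION

# Step 2: interpolate the sparse, adjusted readings onto a grid with
# inverse-distance weighting, a crude stand-in for real gridding models.
def idw(lat, lon, points, power=2.0):
    num = den = 0.0
    for slat, slon, raw in points:
        d = math.hypot(lat - slat, lon - slon)
        if d < 1e-9:
            return adjusted(raw)  # grid point coincides with a station
        w = 1.0 / d ** power
        num += w * adjusted(raw)
        den += w
    return num / den

# A very coarse "global" temperature field: every value in it is a
# model output, not a direct observation.
grid = [
    [idw(lat, lon, stations) for lon in range(-180, 181, 60)]
    for lat in range(-90, 91, 30)
]

for row in grid:
    print(" ".join(f"{t:6.1f}" for t in row))
```

Change the bias correction or swap in a different interpolation scheme and the "global" numbers change with them, which is exactly Edwards' point: the gridded dataset is inseparable from the models that produced it.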

In addition to these models, the socio-technical systems developed to observe weather and climate make up a climate knowledge infrastructure. Just as important, both this infrastructure and the models used to provide and analyze our weather and climate data are constantly changing.

I have just started reading Edwards' book, but I am already taking one lesson from it: I need to understand how data and models inform what I accept as information and knowledge. As my friend Dennis Hamilton reminded me, "there are no uninterpreted data." Knowing how models and data yield information I can evaluate and use is one of those required 21st-century skills.