
Having just finished the book, I find myself “just thinking about the weather” (like 10,000 Maniacs).
Every decade, we have added one day to our forward weather forecast. So, today’s weekly forecast is as accurate as the 2-day forecast in the 1970s. In the first book on weather prediction — 100 years ago — Lewis Fry Richardson prophesied “perhaps someday in the dim future it will be possible to advance the computation faster than the weather advances, and at a cost less than the savings to mankind due to the information gained. But that is a dream.”
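The one-day-per-decade claim can be sanity-checked with trivial arithmetic (the decade endpoints here are my assumption for illustration, not the book's):

```python
# Back-of-the-envelope check of the "one day per decade" rate of
# improvement: a 2-day-accurate forecast circa 1970 plus one day of
# skill per decade lands at a ~7-day useful horizon by the 2020s.
horizon_1970s = 2                                  # days of useful skill
decades_elapsed = (2020 - 1970) // 10              # five decades
horizon_today = horizon_1970s + decades_elapsed    # 7-day forecast
print(horizon_today)
```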
What a setup for Moore’s Law! More on that later.
Weather prediction started with telegraph networks in the 1860s. News of a storm front could arrive by electrical signals faster than the wind itself. Those physical networks were interrupted by the Civil War and the world wars. The observation stations fed short-term forecasts based on simple pattern matching; meteorologists flipped through maps of prior patterns to find one that looked similar, missing the nuances in the complex network of interactions.
What was needed was a theory, a mathematics derived from first principles of the physics of atmospheric flows. Those equations, a collection of interlocking partial differential equations across a matrix of pressure, temperature, air density, wind vectors and such, were first published in 1904 (and are the subject of the thick textbook below). They are practically unsolvable, but can be approximated with a variety of numerical and graphical methods and simplifying assumptions (hydrostatic, anelastic, autobarotropic shallow fluid, etc.). New weather prediction models were then back-tested on historical data, an iterative feedback cycle of learning from past to present.
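To give a flavor of what "approximating the unsolvable" looks like in practice, here is a minimal sketch of a first-order upwind finite-difference scheme for 1-D linear advection — a toy stand-in for one transport term in the primitive equations. Every number (grid spacing, wind speed, step count) is an illustrative assumption of mine, not a value from the book:

```python
import numpy as np

# Upwind finite differences for the 1-D linear advection equation
#   du/dt + c * du/dx = 0
# This is the simplest numerical method of the family real weather
# models descend from: replace derivatives with grid differences and
# march the state forward in small time steps.

nx, dx, c = 200, 1000.0, 10.0      # 200 cells of 1 km; 10 m/s "wind"
dt = 0.8 * dx / c                  # time step keeping CFL = 0.8 < 1 (stable)
x = np.arange(nx) * dx
u = np.exp(-0.5 * ((x - 50_000.0) / 5_000.0) ** 2)   # a Gaussian "front"
mass0, peak0 = u.sum(), u.max()

for _ in range(500):               # advance the state 500 steps in time
    # upwind difference: each cell looks at its upstream neighbor;
    # np.roll makes the domain periodic (the front wraps around)
    u -= c * dt / dx * (u - np.roll(u, 1))
```

The scheme conserves the total "mass" but numerically diffuses the front, smearing it out as it advects — one reason production models use higher-order and spectral methods instead.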
The weather became important to ship traffic and battle planning, and forecasts were weaponized in wartime. The terminology of weather “fronts” traces to the martial vernacular of WW I. The Germans were at a distinct meteorological disadvantage, with storms coming from areas controlled by the Allied powers. Siemens developed automatic weather stations with NiCad batteries and radios that could be dropped off by plane in remote locations. With 200 submarines trying to maintain a blockade of England, the Germans desperately needed weather predictions for the North Atlantic. In 1943, they sent U-537 to an uninhabited part of North America, and set up a weather station on a local peak, with a long range 30-ft. diameter antenna to beam weather data back to Germany. To evade detection, they hand-painted “Canada Meteor Service” on the side and scattered American cigarette packs about. It remained there until discovered in 1981. Yes, the only known incursion by the Nazis onto North American soil was for the weather.
Then came the rockets. The first U.S. launch of a V-2 rocket brought back from Germany snapped a picture of the cloud cover as had never been seen before, with a quarter of the U.S. in a single frame. In 1954, an Aerobee rocket camera captured the first clear image of a tropical storm swirling in the Gulf of Mexico, and it became a full-page spread in Life magazine. (I have an Aerobee nose cone, fin can, and engine on display at work.)
The first weather satellite, TIROS 1, launched in 1960, and in Kennedy’s famous speech that launched the Apollo program, he also called for “at the earliest possible time, a satellite system for worldwide weather observation.” It was overshadowed a bit by the whole man on the moon thing.
Today, the polar-orbiting LEO satellites raster scan the Earth (like Planet Labs) and “contribute the most quantitative data to the weather models. When it comes to meaningful impacts on forecasting, they are the champs.” (p.81). We have hundreds of LEO and GEO birds with a variety of weather instruments (optical, IR, radar) providing global coverage.
It’s a torrent of data, feeding supercomputers that are upgraded every two years. About half of the supercomputers on Earth are working on the weather. The European Centre for Medium-Range Weather Forecasts (ECMWF) has two supercomputers the size of volleyball courts with 260,000 processor cores (in 2019). They maintain the current champion model for forecasting. They devote 50% of their compute cycles to iterating on model improvements (and the other 50% to running the latest model for the world). They have improved their forecasts continuously for 40 years straight.
To build a global model, there are global sensors from many nations, all contributing to a public good. “WMO estimates put the economic value of weather services in excess of $100 billion annually, while the cost of providing them is a tenth of that.” (p.175) Still, a big number for a public good. “The weather machine is a last bastion of international cooperation.” (p.181)
P.S. The book is not nearly as gripping as the history of ammonia, and it ends abruptly without painting a picture of what’s next for Sim-Earth… with a proliferation of networked sensors and machine learning in the mix.
