WeatherWhys Blog

Numerical Weather Prediction

Knowing what the weather will do has been a practical necessity for agriculture and transportation for thousands of years. For most of history, simply watching the sky to see what was coming in the next 15-30 minutes pushed the envelope of our capability. Our relentless march toward the computer predictions of today was slow to begin: everything that now makes up a sophisticated suite of models first required a vast infrastructure spanning several disparate branches of science and technology.

People like Isaac Newton and Pierre-Simon Laplace, working in the 17th and 18th centuries, provided mathematical building blocks for applications they couldn’t possibly have foreseen. A worldwide network of weather observation sites had to be preceded by working definitions of temperature, humidity, wind and pressure: the individual variables that describe the state of the atmosphere. At least a cursory understanding and classification of clouds was needed. Over the course of about 200 years, with the encouragement of key individuals whose curiosity about weather thrust them to the forefront of innovation, that required infrastructure started to emerge. Military engagements, such as World War II, also accelerated technological development that moved meteorology into the future.

Over time, scientists gained a detailed understanding of geophysical properties. The atmosphere, like any fluid, can be described by partial differential equations. Incoming shortwave solar radiation is absorbed by the earth and re-radiated upward as longwave radiation. The complex absorption, reflection and scattering of this energy can be quantified mathematically, but precise measurement would take decades to perfect. Understanding the exact relationships among energy, fluid motion and conservation of momentum was a daunting task. Then, in 1904, a young Norwegian meteorologist borrowed mathematical ingredients from the emerging realm of geophysics and developed what would become the backbone of modern numerical weather prediction. As if gathering ingredients for a sumptuous meal, Vilhelm Bjerknes pulled together seven critically important mathematical expressions to form the Primitive Equations.

These equations, rooted in the Navier-Stokes equations of fluid dynamics, came in four parts: 1) differential equations for horizontal momentum, 2) the hydrostatic equation (describing the balance between gravity and the vertical pressure gradient that keeps the atmosphere from escaping to space), 3) continuity equations for mass and moisture, together with the equation of state linking pressure, density and temperature, and 4) a thermodynamic energy equation reflecting the 1st law of thermodynamics.
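
For readers who want to see them, one common textbook rendering of the seven equations is sketched below in height coordinates, with friction (F), moisture sources and sinks (S_q) and diabatic heating (Q) written schematically; exact forms vary with the text and the coordinate system chosen.

```latex
\begin{align*}
\frac{du}{dt} &= -\frac{1}{\rho}\frac{\partial p}{\partial x} + fv + F_x && \text{(east-west momentum)} \\
\frac{dv}{dt} &= -\frac{1}{\rho}\frac{\partial p}{\partial y} - fu + F_y && \text{(north-south momentum)} \\
\frac{\partial p}{\partial z} &= -\rho g && \text{(hydrostatic balance)} \\
\frac{\partial \rho}{\partial t} &= -\nabla \cdot (\rho \mathbf{v}) && \text{(mass continuity)} \\
\frac{dq}{dt} &= S_q && \text{(moisture continuity)} \\
p &= \rho R T && \text{(equation of state)} \\
c_p \frac{dT}{dt} &- \frac{1}{\rho}\frac{dp}{dt} = Q && \text{(thermodynamic energy)}
\end{align*}
```

Seven equations, seven unknowns: the three components of the wind, pressure, density, temperature and moisture.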

It was a monumental accomplishment: at last the workings of the atmosphere could be seen with a detail and comprehension never before possible. Now the ocean of air above us could be precisely defined and, conceivably, predicted into the future. All that was required was a three-dimensional analysis of the entire region, followed by systematic calculation with the Primitive Equations to carry the solution into the future. Of course, in the early 20th century, there were many hurdles to overcome before a usable, practical forecast of sensible weather could be expected. Indeed, the computer models we take for granted today took geniuses entire lifetimes to attain.

In order to measure the upper atmosphere, weather balloons were sent skyward. By the 1930s a crude radiosonde had been developed to transmit upper-air measurements back to earth, and as World War II loomed, a network of balloon releases began to take shape. Today there are over 800 locations worldwide that send up radiosondes twice a day. A global network of routine surface weather observations grew as well, encompassing observations from land, sea, air and space. And all along, as weather data started flowing in, there still needed to be a way to organize and compute it. In the early days, a few experiments were made to calculate by hand, with paper and pencil, what the weather conditions would be hours into the future; most famously, Lewis Fry Richardson’s arduous attempt took weeks to complete. Obviously, a machine would need to replace hand calculation.

In 1943, construction began on the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania for wartime applications. In 1950 it produced the first successful computer weather forecast, though with a drastically simplified barotropic model rather than the full Primitive Equations. By 1951, the Universal Automatic Computer (UNIVAC) was running experimental weather models faster. All of these systems assigned values of temperature, humidity, wind and pressure to equidistant grid points on a map, based on observed data interpolated to the grid locations. Each point on the map was actually 6 points, 1 at the surface and 5 vertically above it at various elevations, so computation into the future proceeded along the x, y and z axes. By 1961, a new generation of UNIVAC computers was providing “barotropic” projections of upper-air patterns, that is, extrapolating current weather systems without thermodynamic changes. By the late 60s, “baroclinic” models could forecast the deepening of troughs and the amplifying of ridges. Routine computer model output began in the 1950s and reached meteorologists via facsimile machine.
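
To make the grid-point idea concrete, here is a minimal, hypothetical sketch of the kind of analysis step described above: scattered station reports are interpolated onto an equidistant grid with simple inverse-distance weighting. The station locations, pressure values and grid spacing are invented for illustration, and the real analysis schemes of that era (and since) were considerably more sophisticated.

```python
import numpy as np

def analyze_to_grid(obs_xy, obs_vals, grid_x, grid_y, radius=500.0):
    """Interpolate scattered station reports onto a regular grid using
    inverse-distance weighting -- a crude stand-in for the objective
    analysis that seeded the early grid-point models."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    field = np.full(gx.shape, np.nan)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d = np.hypot(obs_xy[:, 0] - gx[i, j], obs_xy[:, 1] - gy[i, j])
            near = d < radius                 # use only nearby stations
            if near.any():
                w = 1.0 / np.maximum(d[near], 1.0) ** 2
                field[i, j] = np.sum(w * obs_vals[near]) / np.sum(w)
    return field

# Hypothetical stations (x, y in km) reporting sea-level pressure (hPa)
stations = np.array([[120.0, 200.0], [400.0, 350.0], [250.0, 800.0],
                     [700.0, 600.0], [850.0, 150.0]])
pressures = np.array([1012.0, 1008.5, 996.3, 1003.1, 1015.2])
grid = analyze_to_grid(stations, pressures,
                       np.arange(0, 1000, 100), np.arange(0, 1000, 100))
```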

The Limited Area Fine Mesh (LFM) model of the early 1970s brought increased resolution and accuracy into daily use, with a routine forecast issued twice a day that extended 48 hours into the future. Along with the LFM came a pioneering technique that blended raw numerical output from the model with the climatological statistics of a particular site, using regression equations. This new Model Output Statistics (MOS) approach became a reliable guidance tool. By the mid 1980s, the Nested Grid Model (NGM) proved to be an important innovation, with 3 domains (a synoptic-scale domain “nested” within 2 larger hemispheric-scale grids). Resolution and accuracy became impressive.
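
A toy illustration of the MOS idea, with invented numbers: fit a regression that nudges raw model output toward what was actually observed, using the site’s climatology as a second predictor. Operational MOS equations use many more predictors and far longer training samples.

```python
import numpy as np

# Hypothetical training data for one station: raw model forecasts of
# 2 m temperature, the day's climatological normal, and what was
# actually observed (all in deg F).
X_model = np.array([34.0, 41.0, 28.0, 50.0, 45.0, 38.0])   # raw model output
X_climo = np.array([36.0, 36.5, 37.0, 37.5, 38.0, 38.5])   # site climatology
y_obs   = np.array([32.0, 40.0, 30.0, 47.0, 44.0, 37.0])   # verifying obs

# MOS-style multiple linear regression: y = b0 + b1*model + b2*climo
A = np.column_stack([np.ones_like(X_model), X_model, X_climo])
coeffs, *_ = np.linalg.lstsq(A, y_obs, rcond=None)

# Apply the fitted equation to a new raw model forecast
new_model, new_climo = 42.0, 39.0
mos_forecast = coeffs @ np.array([1.0, new_model, new_climo])
print(f"MOS-adjusted forecast: {mos_forecast:.1f} F")
```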

Computer models now account for topography and special conditions within the planetary boundary layer. They time-step into the future, making complete forecasts for the entire horizontal and vertical domain at short intervals (1-2 minutes), interval by interval, up to the limit of the specific model (384 hours into the future for the Global Forecast System, GFS). Billions of individual calculations are made with each model run. It is only with modern supercomputers, in the United States and several other countries around the world, that such incredibly detailed forecasts can now be made.
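
Reduced to one equation in one dimension, that time-stepping loop looks something like the hypothetical sketch below: apply a finite-difference form of the governing equation at every grid point, advance one minute, and repeat until the forecast horizon is reached. A real model does this simultaneously for every prognostic variable across a three-dimensional global domain.

```python
import numpy as np

nx, dx = 100, 10_000.0     # 100 grid points at 10 km spacing
dt = 60.0                  # one-minute time step
u = 20.0                   # constant 20 m/s wind for this toy case
hours = 48                 # forecast horizon for this toy run
steps = int(hours * 3600 / dt)

x = np.arange(nx) * dx
T = 15.0 + 5.0 * np.sin(2 * np.pi * x / (nx * dx))   # initial temperatures

for _ in range(steps):
    # One governing equation, dT/dt = -u dT/dx, via an upwind finite
    # difference on a periodic domain; each pass advances the whole
    # forecast field by a single time step.
    T = T - u * dt / dx * (T - np.roll(T, 1))
```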

Today’s numerical weather prediction models are constantly being upgraded with improved physics. Output is generally available 4 times a day via the internet. Faster machines are allowing finer resolution and ultimately more accurate forecasts. New ensemble techniques are also being introduced to improve predictions. Specialized domains and physics are making tropical forecasts better. And accuracy, which used to be confined to the initial 6 hours, is now being verified farther into the future. Today’s meteorologists lean on modern technology for a reliable peek into the future. It is sometimes amazing to consider how far we have come from the “Art of Meteorology” our predecessors practiced to the precision craft we employ every day in the 21st century. Computer weather models are an extraordinary human accomplishment of insight, trial and error, and tenacity that has benefited us all.
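
The ensemble idea can be shown in miniature with the same toy advection model sketched earlier: run it many times from slightly perturbed initial conditions and treat the ensemble mean as the forecast and the spread as its uncertainty. The member count and perturbation size here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_forecast(T, steps=2880, u=20.0, dt=60.0, dx=10_000.0):
    """Advance the 1-D upwind advection model `steps` time steps
    (2880 one-minute steps = a 48-hour toy forecast)."""
    T = T.copy()
    for _ in range(steps):
        T = T - u * dt / dx * (T - np.roll(T, 1))
    return T

x = np.arange(100) * 10_000.0
T0 = 15.0 + 5.0 * np.sin(2 * np.pi * x / 1_000_000.0)   # control analysis

# 20 members, each started from a slightly perturbed analysis
members = np.array([toy_forecast(T0 + rng.normal(0.0, 0.5, T0.size))
                    for _ in range(20)])
ens_mean = members.mean(axis=0)      # best single estimate
ens_spread = members.std(axis=0)     # flow-dependent uncertainty
```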