
A look at how weather forecasts have drastically improved over the years

Advances in computing have led to much more accurate weather forecasts.
Credit: KWES

ODESSA, Texas — Weather forecasts have made tremendous improvements in accuracy over the last century, especially in the past 30 years.

A forecast that goes seven days out is accurate about 80% of the time; in the 1990s, that was true only for a three-day forecast.

Most of the improvement stems from advancements in computing power. Computers run complex atmospheric models several times each day to calculate what the weather should be in the future.

This works by taking real-time atmospheric data observed by weather stations and balloons and feeding it into the models as the starting point for the calculations.

Hundreds of variables are used, including familiar ones like temperature and humidity, as well as ones you probably wouldn't think about, such as incoming solar flux and ozone concentration.
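As a rough illustration only, a single station's observation at one moment could be pictured as a small record of numbers like the one below. The field names and values are made up for the example and are not the actual format the models ingest.

```python
# Hypothetical single-station observation record; field names and values are
# illustrative only, not the real input format used by weather models.
observation = {
    "station_id": "KODO",        # example station identifier
    "temp_c": 31.5,              # air temperature
    "humidity": 0.22,            # relative humidity (0-1)
    "pressure_hpa": 1012.4,      # surface pressure
    "wind_speed_ms": 6.8,        # wind speed
    "solar_flux_wm2": 850.0,     # incoming solar radiation
    "ozone_ppb": 48.0,           # ozone concentration
}
```

Each model run starts from enormous numbers of records like this, gathered from stations, balloons and satellites around the globe.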

The collected data is then processed by supercomputers performing up to 30 quadrillion operations every second.

By operation, we mean a mathematical calculation that applies the known laws of physics to predict what the atmosphere will look like one time step into the future.

The output from that first step is then run through the same equations to produce the next set of forecast data at t=2, and the process repeats, one step at a time, out to the end of the forecast period.
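A minimal sketch of that loop, using a deliberately toy "model" in place of the real physics, might look like this. Real models solve fluid-dynamics equations over a 3-D grid covering the whole atmosphere; the update rule below is invented purely to show how each step's output becomes the next step's input.

```python
# Toy time-stepping sketch; the "physics" here is made up for illustration.
def step(state, dt_hours=6.0):
    """Advance the toy state by one time step (not real atmospheric physics)."""
    new_temp_c = state["temp_c"] - 0.1 * dt_hours                 # pretend radiative cooling
    new_humidity = min(1.0, state["humidity"] + 0.01 * dt_hours)  # pretend moistening
    return {"temp_c": new_temp_c, "humidity": new_humidity}

# Start from the state built out of observations, then feed each step's
# output back in as the next step's input: t=1, t=2, t=3, ...
state = {"temp_c": 28.0, "humidity": 0.35}
for t in range(1, 5):
    state = step(state)
    print(f"t={t}: {state}")
```

Running it prints the toy state at t=1, t=2 and so on, mirroring how each forecast hour is built from the one before it.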

The output of these models is what you typically see on TV when meteorologists show things like future winds, temperatures and precipitation amounts.

There are two main models: the GFS (Global Forecast System) and the ECMWF (European Centre for Medium-Range Weather Forecasts) model. The GFS forecasts 384 hours into the future, while the ECMWF goes out 240 hours.

The ECMWF, or the European model, is generally regarded as more accurate than the GFS, although the competition is close.

In the future, both models will improve as computing capacity increases, meaning that one day we could have month-long forecasts that would be just as accurate as contemporary weekly outlooks.
