Downscaling

Downscaling results. At left, the original CRU data at 0.5 x 0.5 degrees. At right, the same CRU data downscaled to 2 x 2 km.

Delta method schematic.

Climate scientists downscale climate data to generate locally relevant data from GCMs. The goal is to connect global-scale projections with regional dynamics to create regionally specific forecasts.

SNAP uses a statistical downscaling approach known as the delta method because:

  • It involves just subtraction and division, which makes downscaling results easy to interpret and explain
  • Its low computational demand makes it efficient to downscale many GCMs and emission scenarios over centuries

Download SNAP downscaling scripts (.zip)

Delta method strategy

There are two basic steps for using the delta method:

  1. Calculate changes in projected monthly temperature or precipitation in relation to the model’s average climate during the time period for which PRISM data are available.
  2. Interpolate those changes to the PRISM resolution, then add them to (for temperature) or multiply them by (for precipitation) the PRISM averages. A minimal sketch of these two steps follows this list.
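
Here is a minimal sketch of those two steps in Python. The function and variable names are illustrative rather than taken from the SNAP scripts; the arrays are assumed to be NumPy-style grids for a single calendar month, and the interpolation that moves the anomaly onto the PRISM grid is covered in the worked example below.

```python
def compute_anomaly(gcm_future_month, gcm_climatology_month, variable):
    """Step 1: projected change relative to the GCM's own reference
    climatology, computed at the coarse GCM resolution."""
    if variable == "temperature":
        return gcm_future_month - gcm_climatology_month    # absolute anomaly (deg C)
    if variable == "precipitation":
        return gcm_future_month / gcm_climatology_month    # proportional anomaly (ratio)
    raise ValueError("variable must be 'temperature' or 'precipitation'")


def apply_anomaly(anomaly_on_prism_grid, prism_climatology_month, variable):
    """Step 2: combine the anomaly, after it has been interpolated to the
    PRISM grid, with the PRISM baseline climatology."""
    if variable == "temperature":
        return prism_climatology_month + anomaly_on_prism_grid
    return prism_climatology_month * anomaly_on_prism_grid
```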

Delta method example

Downscale GCM data with PRISM data as the baseline climate:

  1. Standardize GCM and PRISM data

    • Rotate the grid and set latitude and longitude values to the standard WGS84 geographic coordinate system. This places north at the top of the grid and converts the original longitude values from 0–360 to -180 to 180 (see the standardization sketch after this list).
    • Convert GCM units to more common units:

      • Convert temperature from Kelvin to Celsius
      • Convert precipitation values from kg/m²/sec to mm/month
  2. Calculate GCM climatologies to determine projected changes in climate and the amount of model bias inherent in that change. Climatologies are GCM mean monthly values across a reference period (usually 30 years) from the 20c3m scenario outputs. The values represent modeled data and contain an expected model bias that we adjust for later in the process.

    • Determine a reference state of the climate according to the GCMs by using 20th-century (20c3m) scenario GCM values to calculate climatologies for the same time span as the high-resolution data we are downscaling to.
    • We do this calculation for a worldwide extent at the coarse GCM spatial resolution (range: 1.875°–3.75°).
  3. Calculate anomalies: monthly absolute (for temperature) or proportional (for precipitation) anomalies for a worldwide extent at the coarse GCM spatial resolution.

    • Temperature: future monthly value (e.g., May 2050, A1B scenario) - 20c3m climatology for that month
    • Precipitation: future monthly value (e.g., May 2050, A1B scenario) / 20c3m climatology for that month (see the climatology and anomaly sketch after this list)
  4. Downscale and remove bias

    • Interpolate temperature and precipitation anomalies with a first-order bilinear spline technique across an extent larger than our high-resolution climatology dataset. We use the larger extent to account for climatic variability outside the bounds of our final downscaled extent (see the interpolation sketch after this list).
    • The GCM anomaly datasets are now at the same spatial resolution as our high-resolution climatology dataset.
  5. Add interpolated temperature anomalies to or multiply interpolated precipitation anomalies by the high-resolution climatology data (e.g., PRISM). This step effectively downscales the data and removes model biases by using observed data values as the baseline climate.

    • The final products are high-resolution (2 km or 771 m for PRISM) data.
  6. Validate our product: Although the baseline climate data used in our downscaling procedure have been peer reviewed and accepted by the climate community, we take the final step of vetting our own products to increase our confidence in them.

    • While it is impossible to validate future data, we can compare downscaled 20th-century scenario (20c3m) GCM data to actual weather station data (see the comparison sketch after this list).
    • A committee of climate experts plots and inspects all of our projected future monthly output data. See plots of these under each dataset on our Data Downloads page.
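
The standardization in step 1 can be sketched as follows. This is a minimal NumPy illustration, not the actual SNAP scripts: the function names, the simplified month lengths (leap days ignored), and the assumption that latitude and longitude are the last two array axes are all choices made here for clarity.

```python
import numpy as np

DAYS_PER_MONTH = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])


def kelvin_to_celsius(temp_k):
    """Convert GCM temperature output from Kelvin to degrees Celsius."""
    return temp_k - 273.15


def precip_flux_to_mm_per_month(precip_kg_m2_s, month_index):
    """Convert precipitation flux (kg m^-2 s^-1) to a monthly total in mm.

    1 kg of water over 1 m^2 is a depth of 1 mm, so the flux only needs to
    be multiplied by the number of seconds in the month (simplified
    calendar; leap days are ignored here).
    """
    seconds_in_month = DAYS_PER_MONTH[month_index] * 24 * 3600
    return precip_kg_m2_s * seconds_in_month


def rotate_to_pm180(data, lon):
    """Shift a global grid from 0-360 longitudes to -180..180 (WGS84 style).

    `data` has longitude as its last axis; `lon` holds the original
    0-360 longitude values.
    """
    lon_pm180 = np.where(lon >= 180, lon - 360, lon)  # wrap the eastern half
    order = np.argsort(lon_pm180)                     # reorder columns west to east
    return data[..., order], lon_pm180[order]


def north_up(data, lat):
    """Flip the latitude axis if needed so that north is at the top of the grid.

    Assumes `data` has shape (..., nlat, nlon).
    """
    if lat[0] < lat[-1]:                              # latitudes run south to north
        return data[..., ::-1, :], lat[::-1]
    return data, lat
```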
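
Steps 2 and 3 amount to averaging the 20c3m monthly series over the reference period and then differencing (temperature) or dividing (precipitation) the projected values against those means. A minimal sketch, assuming monthly series stored as NumPy arrays of shape (months, lat, lon); the function names are illustrative:

```python
import numpy as np


def monthly_climatology(reference_series):
    """Mean monthly values over the 20c3m reference period.

    `reference_series` has shape (n_years * 12, nlat, nlon) and covers the
    reference span (e.g., 30 years); the result has shape (12, nlat, nlon),
    one mean field per calendar month.
    """
    n_months = reference_series.shape[0]
    assert n_months % 12 == 0, "reference series must cover whole years"
    by_year = reference_series.reshape(n_months // 12, 12, *reference_series.shape[1:])
    return by_year.mean(axis=0)


def monthly_anomalies(future_series, climatology, variable):
    """Anomalies of projected monthly values relative to the 20c3m climatology.

    Temperature uses absolute differences; precipitation uses ratios
    (grid cells with a zero precipitation climatology would need special
    handling, which is omitted here). `future_series` has shape
    (n_months, nlat, nlon), with months counted from January.
    """
    month_index = np.arange(future_series.shape[0]) % 12
    if variable == "temperature":
        return future_series - climatology[month_index]
    return future_series / climatology[month_index]
```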
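
Steps 4 and 5 interpolate the coarse anomalies onto the PRISM grid and then combine them with the PRISM climatology. The sketch below uses SciPy's RectBivariateSpline with kx=1, ky=1 as one way to implement a first-order bilinear spline; the grid variable names are illustrative, and coordinates are assumed to be in ascending order as the routine requires.

```python
from scipy.interpolate import RectBivariateSpline


def interpolate_anomaly(anomaly, gcm_lat, gcm_lon, prism_lat, prism_lon):
    """Interpolate a coarse GCM anomaly field onto the PRISM grid.

    Uses a first-order (bilinear) spline. `anomaly` has shape
    (len(gcm_lat), len(gcm_lon)), and all coordinate arrays must be strictly
    ascending. The GCM extent should be larger than the PRISM extent so the
    spline never has to extrapolate at the edges.
    """
    spline = RectBivariateSpline(gcm_lat, gcm_lon, anomaly, kx=1, ky=1)
    return spline(prism_lat, prism_lon)


def downscale_month(anomaly, gcm_lat, gcm_lon,
                    prism_climatology, prism_lat, prism_lon, variable):
    """Delta-method downscaling for one month: interpolate, then add or multiply."""
    anomaly_hires = interpolate_anomaly(anomaly, gcm_lat, gcm_lon,
                                        prism_lat, prism_lon)
    if variable == "temperature":
        return prism_climatology + anomaly_hires   # absolute anomaly added
    return prism_climatology * anomaly_hires       # proportional anomaly multiplied
```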
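
For step 6, one generic way to compare downscaled 20c3m output against station records is to pull the nearest grid cell to each station and summarize the errors. The structure of `station_records` below is an assumption made for illustration, and a numeric summary like this complements rather than replaces the expert plot review described above.

```python
import numpy as np


def station_comparison(downscaled, lat, lon, station_records):
    """Compare downscaled 20c3m output with weather-station observations.

    `downscaled` has shape (n_months, nlat, nlon); `station_records` is a
    list of (station_lat, station_lon, observed_series) tuples, where each
    observed series has length n_months. Returns mean bias and mean absolute
    error per station, using the nearest grid cell to each station.
    """
    stats = {}
    for s_lat, s_lon, observed in station_records:
        i = np.abs(lat - s_lat).argmin()            # nearest grid row
        j = np.abs(lon - s_lon).argmin()            # nearest grid column
        modeled = downscaled[:, i, j]
        stats[(s_lat, s_lon)] = {
            "bias": float(np.mean(modeled - observed)),
            "mae": float(np.mean(np.abs(modeled - observed))),
        }
    return stats
```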