There are two basic steps for using the delta method:
Calculate changes in projected monthly temperature or precipitation relative to the model’s average climate during the time period for which PRISM data are available.
Interpolate those changes to the PRISM resolution, then add them to (for temperature) or multiply them by (for precipitation) the PRISM averages.
Delta method example
Downscale GCM data with PRISM data as the baseline climate:
Standardize GCM and PRISM data
Rotate the grid and set latitude and longitude values to the standard WGS84 geographic coordinate system. This places north at the top of the grid and converts the original longitude values from 0–360 to -180 to 180.
Convert GCM units to more common units:
Convert temperature from Kelvin to Celsius
Convert precipitation values from kg/m²/sec to mm/month
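The standardization steps above can be sketched in Python (a minimal illustration; the function names are ours, not part of any GCM toolchain; note that 1 kg/m² of water equals 1 mm of depth, so the precipitation conversion only needs the number of seconds in the month):

```python
import calendar

def wrap_longitude(lon_deg):
    """Convert a 0-360 longitude to the -180..180 (WGS84) convention."""
    return ((lon_deg + 180.0) % 360.0) - 180.0

def kelvin_to_celsius(t_k):
    """Convert temperature from Kelvin to degrees Celsius."""
    return t_k - 273.15

def flux_to_mm_per_month(pr_flux, year, month):
    """Convert precipitation from kg/m^2/sec to mm/month.

    Since 1 kg/m^2 of water is 1 mm of depth, we multiply the flux by
    the number of seconds in the given month.
    """
    days = calendar.monthrange(year, month)[1]
    return pr_flux * 86400.0 * days
```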
Calculate GCM climatologies to determine projected changes in climate and the amount of model bias inherent in that change. Climatologies are GCM mean monthly values across a reference period (usually 30 years) from the 20c3m scenario outputs. The values represent modeled data and contain an expected model bias that we later adjust.
Determine the reference state of the climate according to the GCMs by using 20th-century (20c3m) scenario GCM outputs to calculate climatologies over the same time span covered by the high-resolution data we are downscaling to.
We do this calculation for a worldwide extent at the coarse GCM spatial resolution (range: 1.875°–3.75°).
Calculate anomalies: monthly absolute (for temperature) or proportional (for precipitation) anomalies for a worldwide extent at the coarse GCM spatial resolution.
Temperature: future monthly value (e.g. May 2050 A1B scenario) - 20c3m climatology
Precipitation: future monthly value (e.g. May 2050 A1B scenario) / 20c3m climatology
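For a single grid cell, the two anomaly formulas reduce to the following (a minimal sketch; the function names are hypothetical):

```python
def temperature_anomaly(future_monthly, climatology):
    """Absolute anomaly: future monthly value minus the 20c3m climatology."""
    return future_monthly - climatology

def precipitation_anomaly(future_monthly, climatology):
    """Proportional anomaly: future monthly value divided by the 20c3m climatology."""
    return future_monthly / climatology
```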
Reasoning behind using a proportional anomaly for precipitation
A proportional anomaly for precipitation reduces the possibility of projecting negative precipitation in the future. Negative projections could occur with absolute anomalies if a wet bias in the model produced an absolute reduction in precipitation greater than the observed average precipitation.
Complications can arise with the proportional method in arid areas (where climatological precipitation is close to zero), because dividing by a very small number could produce unreasonably large precipitation increases. In the rare event that this does occur, we truncate the top 0.5% of anomaly values to the 99.5th percentile value for each anomaly grid.
This results in:
no change for the bottom 99.5% of values,
little change for the top 0.5% in grids where the top 0.5% of values are not extreme, and
substantial change only when actually needed, such as cases where a grid contains one or more cells with unreasonably large values resulting from dividing by near-zero.
We don’t omit precipitation anomaly values above a certain fixed magnitude. Instead, we use a quantile based on the data distribution to truncate the most extreme values. The 99.5% cutoff was chosen after evaluating how well various quantiles capture extreme outliers. This approach allows the truncation value to differ for each grid, because it is based on the distribution of values across that grid.
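The quantile-based truncation can be sketched as follows (a simplified illustration using a nearest-rank percentile convention; actual implementations may compute the 99.5th percentile differently):

```python
import math

def truncate_extremes(values, quantile=0.995):
    """Cap the top (1 - quantile) fraction of anomaly values at the
    quantile's value, leaving the rest of the grid unchanged."""
    ordered = sorted(values)
    # Nearest-rank convention: the smallest value with at least
    # quantile * n of the values at or below it.
    k = max(0, math.ceil(quantile * len(ordered)) - 1)
    cutoff = ordered[k]
    return [min(v, cutoff) for v in values]
```

Grids whose top 0.5% of values are not extreme are barely changed, while a cell holding an unreasonably large value from a near-zero divisor is pulled down to the cutoff.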
Downscale and remove bias
Interpolate temperature and precipitation anomalies with a first-order bilinear spline technique across an extent larger than our high-resolution climatology dataset. We use a larger extent to account for climatic variability outside the bounds of our final downscaled extent.
Our GCM anomaly datasets are now at the same spatial resolution as our high resolution climatology dataset.
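The interpolation step can be illustrated with plain bilinear resampling on a regular grid (a minimal sketch only; our procedure uses a first-order bilinear spline over a larger extent, and these function names are hypothetical):

```python
def bilinear(grid, x, y):
    """Bilinearly interpolate a coarse grid at fractional indices (x, y).

    grid is a list of rows; x indexes columns, y indexes rows.
    """
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid[0]) - 1)
    y1 = min(y0 + 1, len(grid) - 1)
    fx, fy = x - x0, y - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
    bottom = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

def refine(grid, factor):
    """Resample a coarse anomaly grid onto a grid `factor` times finer."""
    nrows = (len(grid) - 1) * factor + 1
    ncols = (len(grid[0]) - 1) * factor + 1
    return [[bilinear(grid, c / factor, r / factor) for c in range(ncols)]
            for r in range(nrows)]
```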
Add the interpolated temperature anomalies to, or multiply the interpolated precipitation anomalies by, the high-resolution climatology data (e.g., PRISM). This step effectively downscales the data and removes model bias by using observed data values as the baseline climate.
The final products are high-resolution data (2 km or 771 m for PRISM).
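Per grid cell, this final combination of baseline climate and interpolated anomaly reduces to the following (a minimal sketch with hypothetical names):

```python
def downscale_cell(baseline_value, anomaly, variable):
    """Combine a high-resolution baseline value (e.g., a PRISM climatology
    cell) with an interpolated GCM anomaly: add for temperature,
    multiply for precipitation."""
    if variable == "temperature":
        return baseline_value + anomaly
    if variable == "precipitation":
        return baseline_value * anomaly
    raise ValueError("unknown variable: " + variable)
```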
Validate our product: Although the baseline climate data used in our downscaling procedure have been peer reviewed and accepted by the climate community, we take the final step of vetting our own products to increase our confidence in them.
While it is impossible to validate future data, we can compare downscaled 20th-century scenario (20c3m) GCM data to actual weather station data.
A committee of climate experts plots and inspects all of our projected future monthly output data. See plots of these under each dataset on our Data Downloads page.