Updated 3/27/2015 with this addition from Variable-Variability.blogspot.com
Two new reviews of the homogenization methods used to remove non-climatic changes
By coincidence, this week two initiatives have been launched to review the methods used to remove non-climatic changes from temperature data. One initiative was launched by the Global Warming Policy Foundation (GWPF), a UK free-market think tank. The other by the Task Team on Homogenization (TT-HOM) of the Commission for Climatology (CCl) of the World Meteorological Organization (WMO). Disclosure: I chair the TT-HOM.
The WMO is one of the oldest international organizations and has the meteorological and hydrological services of almost all countries of the world as its members. The international exchange of weather data has always been important for understanding the weather and for making weather predictions. The WMO's main role is to provide guidance and define standards to make collaboration easier. The CCl coordinates climate research, especially when it comes to data measured by the national weather services.
The review on homogenization, which the TT-HOM will write, is thus mainly aimed at helping national weather services produce better quality datasets to study climate change. This will allow the weather services to provide better climate services to help their nations adapt to climate change.
Homogenization is necessary because much has happened in the world: the French and Industrial Revolutions, two world wars, the rise and fall of communism, and the start of the internet age. Inevitably, many changes have occurred in climate monitoring practices as well. Many global datasets start in 1880, the year toilet paper was invented in the USA and three decades before the Model T Ford.
As a consequence, the instruments used to measure temperature have changed, the screens that protect the sensors from the weather have changed, and the surroundings of the stations have often changed, with stations being moved in response. These non-climatic changes in temperature have to be removed as well as possible to make more accurate assessments of how much the world has warmed.
Removing such non-climatic changes is called homogenization. For the land surface temperature measured at meteorological stations, homogenization is normally performed using relative statistical homogenization methods, in which a station is compared to its neighbours. If a neighbour is sufficiently nearby, both stations should show about the same climatic changes. Strong jumps or gradual increases that occur at only one of the stations indicate a non-climatic change.
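The idea of comparing a station to a neighbour can be illustrated with a toy sketch. This is not one of the operational homogenization methods (such as SNHT or PRODIGE); it is a minimal illustration, with made-up synthetic data, of how subtracting a nearby neighbour removes the shared climatic signal so that a non-climatic jump at one station stands out.

```python
import random

random.seed(42)

# Synthetic example: two nearby stations share the same climatic
# trend; the candidate station also has a non-climatic jump.
n = 100
climate = [0.01 * t for t in range(n)]  # shared warming signal
neighbour = [c + random.gauss(0, 0.2) for c in climate]
candidate = [c + random.gauss(0, 0.2) + (0.8 if t >= 60 else 0.0)
             for t, c in enumerate(climate)]  # jump of +0.8 at t = 60

# The difference series cancels the shared climatic signal,
# leaving noise plus the non-climatic jump.
diff = [c - nb for c, nb in zip(candidate, neighbour)]

def detect_break(series, margin=5):
    """Find the split point that maximizes the difference
    between the mean before and the mean after the split."""
    best_k, best_jump = None, 0.0
    for k in range(margin, len(series) - margin):
        left = sum(series[:k]) / k
        right = sum(series[k:]) / (len(series) - k)
        if abs(right - left) > abs(best_jump):
            best_k, best_jump = k, right - left
    return best_k, best_jump

k, jump = detect_break(diff)
print(k, round(jump, 2))  # break detected near t = 60, size near +0.8
```

Real methods additionally test whether the detected jump is statistically significant, use several neighbours rather than one, and handle multiple breaks per series, but the core principle of working on difference series is the same.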
If there is a bias in the trend, statistical homogenization can reduce it. How well trend biases can be removed depends on the density of the network. In industrialised countries, a large part of the bias can be removed for the last century. In developing countries and in earlier times, removing biases is more difficult and a large part may remain. Because many governments unfortunately limit the exchange of climate data, the global temperature collections can also only remove part of the trend biases.
Some differences

Some subtle differences . . . Continue at: http://variable-variability.blogspot.com/2015/04/two-new-reviews-of-homogenization-methods.html
To facilitate this challenge, I've made annualized data available for the eight stations as a spreadsheet file.
- A GHCN (Global Historical Climatology Network) station report browser. GHCN provides graphical reports on the adjustments made to each station record, but you need to know the station ID to find them. I have created an interactive map to make this easier.
- The Berkeley Earth station browser. The Berkeley Earth station reports provide additional information to help you understand why particular adjustments have been made.
- The Skeptical Science temperature record calculator. This allows you to construct your own version of the temperature record, using either adjusted or unadjusted data for both the land and sea surface temperatures.
- The GHCN station data. For daily average temperature data, look for the 'tavg' files. The 'qca' files are adjusted temperatures, and the 'qcu' files are unadjusted temperatures. The files are stored in '.tar.gz' archives, and may be extracted using most freely available unzip software.
- The Hadley sea surface temperature data. Look for the HadSST '.median.zip' file, or the '.unadjusted.zip' file for the unadjusted data.
- Menne et al (2009) The U.S. historical climatology network monthly temperature data, version 2.
- Böhm et al (2010) The early instrumental warm-bias: a solution for long central European temperature series 1760–2007.
- Brunet et al (2010) The minimization of the screen bias from ancient Western Mediterranean air temperature records: an exploratory statistical analysis.
- Ellis (1890) On the difference produced in the mean temperature derived from daily maximum and minimum readings, as depending on the time at which the thermometers are read
- Fiddling with global warming conspiracy theories while Rome burns
- Telegraph wrong again on temperature adjustments