The article is long and fairly technical, but to boil it down: the GHCN made "adjustments" to "homogenize" the raw data taken from various weather stations, supposedly to remove bad readings caused by things like thermometer stations being moved around. The basic algorithm is to average the five closest stations and then check whether the readings from the station in question vary drastically from that average.
Source
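To make that concrete, here is a rough Python sketch of that kind of neighbour comparison. The function name, the plain-list data layout and the 3 degree threshold are illustrative assumptions, not GHCN's actual code.

def flag_suspect_years(station, neighbours, threshold=3.0):
    """Flag years where a station departs sharply from the average of its
    five nearest neighbours.

    station    -- list of annual mean temperatures for the station under test
    neighbours -- list of five equally long lists, one per nearby station
    threshold  -- departure (deg C) treated as "drastic"
    """
    flagged = []
    for year_index, value in enumerate(station):
        neighbour_mean = sum(n[year_index] for n in neighbours) / len(neighbours)
        if abs(value - neighbour_mean) > threshold:
            flagged.append((year_index, value, neighbour_mean))
    return flagged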
Another example is Australia. NASA [GHCN] only presents 3 stations covering the period 1897-1992. What kind of data is the IPCC Australia diagram based on?
If there is any trend, it is a slight cooling. However, if a shorter period (1949-2005) is used, the temperature has increased substantially. The Australians have many stations and have published more detailed maps of changes and trends.
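The period sensitivity is easy to see with a simple least-squares slope restricted to a start and end year. This is only an illustration; the numbers you would feed in here are not the Australian station data.

def linear_trend(years, temps, start, end):
    """Least-squares slope, in degrees per decade, over the window [start, end]."""
    pairs = [(y, t) for y, t in zip(years, temps) if start <= y <= end]
    n = len(pairs)
    mean_y = sum(y for y, _ in pairs) / n
    mean_t = sum(t for _, t in pairs) / n
    numerator = sum((y - mean_y) * (t - mean_t) for y, t in pairs)
    denominator = sum((y - mean_y) ** 2 for y, _ in pairs)
    return 10.0 * numerator / denominator

# The same series can give a flat or slightly negative slope over 1897-1992 and a
# clearly positive one over 1949-2005, which is the point being made above.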
One programmer highlighted the error of relying on computer code that, if it generates an error message, continues as if nothing untoward ever occurred. Another debugged the code by pointing out why the output of a calculation that should always generate a positive number was incorrectly generating a negative one. A third concluded: “I feel for this guy. He’s obviously spent years trying to get data from undocumented and completely messy sources.”
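For anyone who hasn't seen that failure mode, here is a small Python sketch of it (illustrative only, not the CRU code): the error is printed and execution simply carries on, so later calculations are quietly built on an empty or partial series.

def read_station(path):
    try:
        with open(path) as f:
            return [float(line) for line in f]
    except (OSError, ValueError) as err:
        print("error reading", path, ":", err)  # message printed...
        return []                               # ...but processing continues

# The safer pattern is to let the failure stop the run instead of feeding an
# empty series into everything downstream:

def read_station_strict(path):
    with open(path) as f:
        return [float(line) for line in f]      # bad input halts the run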
Programmer-written comments inserted into CRU’s Fortran code have drawn fire as well. The file briffa_sep98_d.pro says: “Apply a VERY ARTIFICAL correction for decline!!” and “APPLY ARTIFICIAL CORRECTION.” Another, quantify_tsdcal.pro, says: “Low pass filtering at century and longer time scales never gets rid of the trend – so eventually I start to scale down the 120-yr low pass time series to mimic the effect of removing/adding longer time scales!”
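As a rough idea of what "scaling down" the low-pass component of a series could look like in practice, here is a generic Python sketch: split the series with a simple moving average and damp the slow part before recombining. The window and scale factor are assumptions; this is not the quantify_tsdcal.pro routine itself.

def moving_average(series, window=11):
    """Crude low-pass filter: centred moving average with shrinking edges."""
    half = window // 2
    smoothed = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        smoothed.append(sum(series[lo:hi]) / (hi - lo))
    return smoothed

def scale_low_frequencies(series, factor=0.5, window=11):
    """Damp the slow (low-pass) part of the series by `factor`, keep the rest."""
    low = moving_average(series, window)
    high = [s - l for s, l in zip(series, low)]
    return [factor * l + h for l, h in zip(low, high)]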
Source
[P]ut this in the context of what else we know from the CRU data dump:
1. They didn’t want to release their data or code, and they particularly weren’t interested in releasing any intermediate steps that would help someone else.
2. They clearly have some history of massaging the data — hell, practically water-boarding the data — to get it to fit their other results. Results they can no longer even replicate on their own systems.
3. They had successfully managed to restrict peer review to what we might call the “RealClimate clique” — the small group of true believers they knew could be trusted to say the right things.
As a result, it looks like they found themselves trapped. They had the big research organizations, the big grants — and when they found themselves challenged, they discovered they’d built their conclusions on fine beach sand.
Source
Just a selection of the source code forgeries:
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
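In general terms, "compute the regression, stopping in 1960" means fitting the proxy against instrumental temperature only over a calibration window and then applying that fit to the whole proxy series. The Python below is a hedged sketch with hypothetical variable names, not the CRU calibration code.

def calibrate(proxy, temps, years, end_year=1960):
    """Fit proxy to temperature up to end_year, then apply the fit everywhere."""
    pairs = [(p, t) for p, t, y in zip(proxy, temps, years) if y <= end_year]
    n = len(pairs)
    mean_p = sum(p for p, _ in pairs) / n
    mean_t = sum(t for _, t in pairs) / n
    slope = (sum((p - mean_p) * (t - mean_t) for p, t in pairs)
             / sum((p - mean_p) ** 2 for p, _ in pairs))
    intercept = mean_t - slope * mean_p
    return [slope * p + intercept for p in proxy]  # reconstruction for all years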
Here is the link to the forged sections of code:
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
Source
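The kind of post-1960 adjustment that note describes, pulling proxy values toward the instrumental record after a cut-off year, would look something like this generic sketch. The cut-off and the weight are assumptions; this is not the routine that printed the note.

def blend_after(proxy, observed, years, cutoff=1960, weight=0.75):
    adjusted = []
    for p, o, y in zip(proxy, observed, years):
        if y > cutoff:
            adjusted.append((1 - weight) * p + weight * o)  # moved toward observations
        else:
            adjusted.append(p)                              # left as-is
    return adjusted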
How can people still possibly believe that climate change is in any way man-made?