Friday, May 15, 2009

Global Warming GIGO

The world wants the US to lead the effort on climate change, and America's arsenal for that effort is grounded in the U.S. Historical Climatology Network (USHCN). USHCN data are widely used as a baseline for climate models created by major scientific centers, including:
• NASA Goddard Institute for Space Studies (GISS), directed by Dr. James Hansen

• Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory

• Climatic Research Unit (CRU) at the University of East Anglia in the UK, directed by Dr. Phil Jones

• National Climatic Data Center (NCDC), directed by Mr. Thomas Karl

• Intergovernmental Panel on Climate Change (IPCC), a joint project of the World Meteorological Organization and the United Nations Environment Programme
But a team led by a long-time meteorologist recently revealed that "[w]ith only 11 percent of surveyed stations being of acceptable quality, the raw temperature data produced by the USHCN stations are not sufficiently accurate to use in scientific studies or as a basis for public policy decisions." How could this happen?

Background: Two years ago, former television meteorologist Anthony Watts decided to double-check the United States government. Specifically, although surface measurement records collected by the National Oceanic and Atmospheric Administration's (NOAA) U.S. Historical Climatology Network (USHCN) were commonly cited to show increased temperatures, Watts realized that "very little physical site survey data exist[ed]" about such stations. He set up a website, called for volunteers, and started investigating.

This was an innovative approach to one well-known issue--the potential contamination of temperature data by the "urban heat island" effect. (Briefly stated, cities are warmer than rural areas; much of the evidence of warming comes from surface measurements in cities, while satellite and Southern Hemisphere data show substantially less warming.) But from the start, Watts also focused on a related but distinct issue--whether the placement of measurement instruments vitiated the reliability or comparability of station data.

Standards: NOAA says temperature data must come from terrain typical of the overall area:
The ground over which the shelter [radiation] is located should be typical of the surrounding area. A level, open clearing is desirable so the thermometers are freely ventilated by air flow. Do not install the sensor on a steep slope or in a sheltered hollow unless it is typical of the area or unless data from that type of site are desired. When possible, the shelter should be no closer than four times the height of any obstruction (tree, fence, building, etc.). The sensor should be at least 100 feet from any paved or concrete surface.
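The quoted guidance reduces to two numeric rules--clearance of at least four times an obstruction's height, and at least 100 feet from any paved or concrete surface. As a rough illustration only (the function and its arguments are hypothetical, not an NOAA tool), a site survey could flag violations like this:

```python
# Illustrative check of the two numeric siting rules quoted above.
# The function and its arguments are hypothetical, not NOAA software.

def siting_violations(obstruction_height_ft: float,
                      dist_to_obstruction_ft: float,
                      dist_to_pavement_ft: float) -> list:
    """Return a list of the quoted siting rules a station breaks."""
    problems = []
    if dist_to_obstruction_ft < 4 * obstruction_height_ft:
        problems.append("closer than 4x obstruction height")
    if dist_to_pavement_ft < 100:
        problems.append("within 100 ft of a paved/concrete surface")
    return problems

# Example: a shelter 20 ft from a 10-ft fence and 30 ft from a parking lot
print(siting_violations(10, 20, 30))
# -> ['closer than 4x obstruction height', 'within 100 ft of a paved/concrete surface']
```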
Watts wondered whether stations were properly "sited" under such standards. NOAA itself shared this concern, saying in 2002 (at 1):
The research community, government agencies, and private businesses have identified significant shortcomings in understanding and examining long-term climate trends and change over the U.S. and surrounding regions. Some of these shortcomings are due to the lack of adequate documentation of operations and changes regarding the existing and earlier observing networks, the observing sites, and the instrumentation over the life of the network. These include inadequate overlapping observations when new instruments were installed and not using well-maintained, calibrated high-quality instruments. These factors increase the level of uncertainty when government and business decision-makers are considering long-range strategic policies and plans.
In the context of improving the quality of its data, NOAA recognized (at 6) that atypical siting was likely to inject errors into the temperature measurements:
Classification for Temperature/Humidity

Class 1: Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19°). Grass/low vegetation ground cover <10 cm high. Sensors located at least 100 meters (m) from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation >3 degrees.

Class 2: Same as Class 1 with the following differences. Surrounding vegetation <25 cm. No artificial heating sources within 30 m. No shading for a sun elevation >5°.

Class 3 (error 1°C): Same as Class 2, except no artificial heating sources within 10 m.

Class 4 (error ≥2°C): Artificial heating sources <10 m.

Class 5 (error ≥5°C): Temperature sensor located next to/above an artificial heating source, such as a building, rooftop, parking lot, or concrete surface.
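To make the class thresholds concrete, here is a minimal sketch that assigns a station one of the five classes based on the distance-to-heat-source criterion the survey emphasizes. The thresholds come from the quoted classification, but the function itself is illustrative and omits the vegetation and shading refinements:

```python
# Illustrative only -- a rough encoding of the quoted siting classes,
# keyed to the distance-to-heat-source criterion the report emphasizes.
# (Vegetation height and shading refinements are omitted for brevity.)

def crn_class(dist_to_heat_source_m: float, on_or_above_heat_source: bool = False) -> int:
    """Return the siting class (1 = best, 5 = worst) for a station."""
    if on_or_above_heat_source:
        return 5            # expected error >= 5 C
    if dist_to_heat_source_m < 10:
        return 4            # expected error >= 2 C
    if dist_to_heat_source_m < 30:
        return 3            # expected error about 1 C
    if dist_to_heat_source_m < 100:
        return 2
    return 1                # >= 100 m clearance, the "gold standard"

# A sensor 8 meters from a parking lot, as at many surveyed stations, is Class 4:
print(crn_class(8.0))       # -> 4
```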
Analysis: Earlier this year, Watts' group issued its first report, based on evidence gathered at "854 of 1221 stations . . . in the USHCN network" located throughout the conterminous United States. (They've added some sites since publication.) Titled with a question--"Is The U.S. Surface Temperature Record Reliable?"--the paper gives a succinct answer: "no." The report itself includes photos and data sets from more than 100 stations (the remainder are available on the web).

The evidence suggests systemic weaknesses in site placement. For example, the Marysville, California station (USHCN 45385) sits between the firehouse and its paved parking lot, surrounded by heat sources:

[photo of the Marysville station -- source: Surface Station report at 6]
Temperatures recorded at Marysville may have increased, but reflections from walls and asphalt could be creating a "micro-climate" unrepresentative of the surrounding community and unconnected with any global, greenhouse gas-caused climate change.

The station at Drain, Oregon (USHCN 352406) abuts a sewage treatment plant:

[photo of the Drain station -- source: Surface Station report at 12]
Yet wastewater purification often is a warm process, meaning the Drain station is parked next to the sort of artificial heating source NOAA knows introduces measurement errors. Capping carbon won't correct that.

There are scores of further examples. The report also details (at 4-5) an upward bias of as much as 2.5 degrees F caused by the 1979 change from whitewash to latex paint on the wooden shelters housing the instruments. And it affirms (at 13-14) a problem I've pointed to: NASA's and NOAA's penchant for retroactively "adjusting" the raw data up--almost always up--by over 0.5 degrees F.

Answer: The summary result (at 16) is easy to understand:
Each station has been assigned a CRN rating based on the quality rating system provided by NOAA. We found only 3 percent of the stations surveyed meet the requirements of Class 1, while an additional 8 percent meet the requirements of Class 2. Stations that don’t qualify as Class 1 or 2 have artificial heating sources closer than 10 meters to the thermometer, a far cry from the gold standard of 100 meters. This means 89 percent -- nearly 9 of 10 -- of the stations surveyed produce unreliable data by NOAA’s own definition.

Twenty percent of stations were rated as Class 3, 58 percent as Class 4, and 11 percent as Class 5. Recall that a Class 3 station has an expected error greater than 1°C, Class 4 stations have an expected error greater than 2°C, and Class 5 stations have an expected error greater than 5°C. These are enormous error ranges in light of the fact that climate change during the entire twentieth century is estimated to have been only 0.7°C. In other words, the reported increase in temperature during the twentieth century falls well within the margin of error for the instrument record.

This project has shown that the vast majority of the temperature stations in the USHCN network have proximity to biasing elements that make them unreliable. Figure 27 offers a visual representation of how low-quality stations greatly outnumber high-quality stations.


[Figure 27 -- source: Surface Station report at 16]
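For readers who want to check the arithmetic, the quoted percentages tally directly; the figures below are the report's, and the comparison simply restates its point that expected siting error dwarfs the estimated 0.7°C twentieth-century trend:

```python
# Reproducing the report's arithmetic from the quoted class percentages.
class_share = {1: 3, 2: 8, 3: 20, 4: 58, 5: 11}          # percent of surveyed stations
min_error_c = {1: 0.0, 2: 0.0, 3: 1.0, 4: 2.0, 5: 5.0}   # expected error floor per class

acceptable = class_share[1] + class_share[2]
unreliable = 100 - acceptable
print(f"Acceptable (Class 1-2): {acceptable}%")            # 11%
print(f"Unreliable by NOAA's own definition: {unreliable}%")  # 89%

# Share of stations whose expected siting error alone exceeds the
# ~0.7 C warming estimated for the entire twentieth century:
over_trend = sum(pct for cls, pct in class_share.items() if min_error_c[cls] > 0.7)
print(f"Stations with expected error above the century trend: {over_trend}%")  # 89%
```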
Impact: As Roger Pielke Sr. says, this is "an outstanding, clearly written report." And it appears to be effective--the buzz is building.

Kudos to Anthony Watts--read the whole thing, especially his executive summary (at 1): "The conclusion is inescapable: The U.S. temperature record is unreliable." That makes much warming "science" similarly unreliable.

The scientists should have known that starting next to sewage would generate garbage.

4 comments:

OBloodyHell said...

Nice work on this piece.

Bob Cosmos said...

Look how this plays right into the liberal agenda: They can spend trillions on 'fixing global warming' then recalibrate the temperature readings from the last 50 years and claim success. That way their programs don't really have to do anything -- they can show 'progress' in global warming and take credit for capping greenhouse gasses by moving around a few weather stations.

It stinks of Rathergate -- "fake but accurate".

OBloodyHell said...

> It stinks of Rathergate -- "fake but accurate".

Except that the TANG memos were the former and blatantly not the latter.

I've always found it hilarious that Rather destroyed his reputation trying to harp on a point that was utterly irrelevant. Kerry's service record was relevant to the campaign. Not only did it reflect upon his credentials to lead and his behavior as an individual, but he also kept mentioning the subject at every opportunity, such that it became a running joke that if Kerry was in the room for fifteen seconds five of them would be spent mentioning his Vietnam service.

By 2004, though, Bush's creds as a C-I-C were already founded on something other than his service record, and found to be adequate. Hence, any trivial matter such as that in the TANG memos -- even had it been true -- would not have meant squat to most people. It would have been Bush's version of "I partook but I didn't inhale".

So Rather's own rabid liberalism is what did him in, and he, and it, did it pursuing a "ghost dog" to boot.

.

@nooil4pacifists said...

I agree with OBH: The problem with the USHCN is that it's real but inaccurate.