
Releasing data
Will this be the start of a flood of data releases from organisations desperate not to be tarred with the CRU brush, I wonder? How many fudge factors will they find in the Met Office code?
The Met Office plans to re-examine 160 years of temperature data after admitting that public confidence in the science on man-made global warming has been shattered by leaked e-mails.
The new analysis of the data will take three years, meaning that the Met Office will not be able to state with absolute confidence the extent of the warming trend until the end of 2012.
The Met Office database is one of three main sources of temperature data analysis on which the UN’s main climate change science body relies for its assessment that global warming is a serious danger to the world. This assessment is the basis for next week’s climate change talks in Copenhagen aimed at cutting CO2 emissions.
But this paragraph is the most amazing, and, given my previous post, not at all surprising:
The Government is attempting to stop the Met Office from carrying out the re-examination, arguing that it would be seized upon by climate change sceptics. (source)
Yep, that’s right. Move along. Nothing to see here. Don’t want the sceptics looking at the data, they might find something wrong with it! Sadly for the oafish Gordon Brown, the Met Office has in fact gone one step further, and will release the data into the public arena:
The Met Office has announced plans to release, early next week, station temperature records for over one thousand of the stations that make up the global land surface temperature record.
…
This subset is not a new global temperature record and it does not replace the HadCRUT, NASA GISS and NCDC global temperature records, all of which have been fully peer reviewed. We are confident this subset will show that global average land temperatures have risen over the last 150 years. [Well of course it will. We all know temperatures have risen in that period. What it doesn’t prove is man-made warming – Ed]
This subset release will continue the policy of putting as much of the station temperature record as possible into the public domain.
We intend that as soon as possible we will also publish the specific computer code that aggregates the individual station temperatures into the global land temperature record.
As soon as we have all permissions in place we will release the remaining station records – around 5000 in total – that make up the full land temperature record. We are dependent on international approvals to enable this final step and cannot guarantee that we will get permission from all data owners. (source)
I assume this will include the “data” that they used for these howlers:
Summer 2009 Prediction: “Barbecue summer” with high temperatures and no more than average amounts of rainfall.
Result: The wettest July in almost a century.
Winter 2008 Prediction: A mild and dry season, with only a few cold snaps.
Result: The eighth-warmest January ever recorded.
Summer 2007 Prediction: High temperatures and no indications that it would be a particularly wet summer.
Result: The summer of 2007 was one of the wettest since records began. June and July saw major flooding across parts of England and Wales.
Will this data be “fully baked” or just lightly roasted?