Computer glitch hits climate prediction project
Tiny software mistake sets results back by months.
A software error has hit one of the world's most sophisticated climate simulations. Participants in the BBC-sponsored project, which uses spare time on home computers to predict Britain's climate in 2080, will have to wait longer than expected to see their work on television.
The results, due to be presented as part of the BBC's "Climate Chaos" television season, will be delayed by about two months as experts at climateprediction.net, the research project based at the University of Oxford, reset the software.
Some 200,000 volunteers had pledged their computers to the effort, making it possibly the largest mass-participation climate experiment ever. The model aims to simulate the British climate from 1920 to 2080. But users found that the program was mysteriously crashing at 2013. What's more, the models were predicting far greater global warming up to that point than experts expected.
Researchers at climateprediction.net tracked the problem down to a mistake in how the predicted levels of sulphate aerosols — tiny particles that block sunlight and cause 'global dimming', cooling the Earth — were entered into the model. They told confused users that they would have to start again from scratch.
Reset to zero
The project, launched in February 2006, was set to reveal its predictions next month. "We'll still be doing it, but it will be going out later," a spokeswoman for the BBC's factual programming department confirmed. "It's very important and very interesting - but we can't do it without the results."
The mistake probably happened because of the complexity of the model, which was designed at the Hadley Centre for Climate Prediction and Research in Exeter, UK, says Myles Allen, the Oxford physicist who leads the project. "These climate models are some of the most complicated pieces of software in the world," he says. "This model was designed to be the world's best, not to be an easy piece of software to run on a PC."
The simulation takes about three months to run on a home computer. The problem has therefore affected only "dedicated front-runners with souped-up machines", Allen says. "The vast majority will be re-running only a couple of decades," he adds.
Allen insists that the work done thus far has not been in vain. "Your crunching to date has absolutely not been wasted," he has told the project's volunteers. "In essence, what your models have done is show how much the world would have warmed up over the twentieth century if it weren't for the masking effect of global dimming."
Reactions on the project's message board range from "I can't believe that this program wasn't completely tested before being released to thousands of people," to "Whoops! Still, that's science for you...."
Aerosol emissions climbed steadily through the twentieth century, but are predicted to level off or fall in the future. A coding error meant that the model's aerosol data began much later than the proper start date of 1920, so when computers reached 2013, they found they had run out of aerosol data.
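The failure mode can be sketched with a toy example (hypothetical code, not the project's actual model): if a forcing series meant to cover 1920–2080 is loaded with its early decades missing, a year-indexed lookup that still assumes a 1920 start runs off the end of the data partway through the simulation.

```python
# Toy sketch (hypothetical; not climateprediction.net code) of the failure:
# a forcing series covering 1920..2080 is loaded with its first decades
# missing, so a year-indexed lookup runs out of data mid-simulation.

FULL_SERIES = [0.1 * i for i in range(161)]   # placeholder forcings, 1920..2080

def aerosol_forcing(year, series, start_year=1920):
    """Return the forcing value for `year`, indexing from `start_year`."""
    idx = year - start_year
    if not 0 <= idx < len(series):
        raise IndexError(f"no aerosol data for year {year}")
    return series[idx]

# The bug: the loader drops the first 68 entries, so the series effectively
# begins decades after 1920 -- but the lookup still assumes a 1920 start.
buggy_series = FULL_SERIES[68:]

last_good = None
try:
    for year in range(1920, 2081):
        aerosol_forcing(year, buggy_series)
        last_good = year
except IndexError:
    pass   # the run dies once the truncated data is exhausted

print(last_good)   # → 2012: the simulation crashes on reaching 2013
```

The offset of 68 years here is chosen purely so the toy run fails at 2013, matching the behaviour users reported; the real model's data layout is not described in that detail.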
The data thus generated gives a picture of the world without taking into account the true impact of global dimming. This could be useful in evaluating the separate contribution of greenhouse gases to the climate, Allen says. "What we've done - albeit by mistake - is to evaluate the different climate forcings separately," he adds.
Previous studies by climateprediction.net have reported a possibility of more extreme global warming than seen with other simulations. A study involving some 90,000 users last year concluded that a doubling of carbon dioxide could result in a warming of anything from 2 °C to 11 °C (see 'Internet project forecasts global warming').
And although a two-month delay might be frustrating for television viewers, it's a blip in the world of climate number-crunching. "In the progress of science, believe me, it is the blink of an eye," Allen says.