Running climate models from home

By Bryan • environment • 10 Jun 2008

My latest geeky addition to my already overworked, desperately-awaiting-retirement computer. Pretty sleek, huh?

Back when I was doing my undergrad, the hip thing circulating geekdom was participating in distributed computing networks, the most popular being SETI@home. Basically, you downloaded a small program which would analyze small packets of radio-telescope data for certain patterns and then return the results. Being a wannabe-geek at the time, I eagerly hopped on the bandwagon and began running the program in the background. I think I participated for about a year, and after one of the many HD crashes I’ve had, I never bothered to reinstall the software. That was way back in 2000–2001.


Memories of this little program resurfaced today as I stumbled across an article on wired.com discussing the apparently spiritual nature and crusade-like properties of the SETI project. I didn’t bother to read it, but thought firing up the ol’ SETI screensaver would be another procrastination tool for me to play with. So I went searching for the software, only to find that the whole thing has been replaced with an all-in-one distributed computing software package from Berkeley. Acronymed as BOINC (Berkeley Open Infrastructure for Network Computing), this program acts as a hub, allowing one’s computer to simultaneously participate in dozens of SETI-like projects.

After browsing through a few of the projects (which included SETI), I decided that the climateprediction.net effort sat most closely with my personal interests…and the screensaver/visual representation just looked plain cool. Essentially, the job of my computer is to run a climate model from what appears to be 1810 to 2050.

The aim of the climateprediction.net project is to investigate the uncertainties in various parameterizations that have to be made in state-of-the-art climate models (see “Modelling The Climate“). The model is run thousands of times with slight perturbations to various physics parameters (a ‘large ensemble‘), and the project examines how the model output changes. These parameters are not known exactly, and the variations are kept within what is subjectively considered to be a plausible range. This will allow the project to improve understanding of how sensitive the models are to small changes, and also to things like changes in carbon dioxide and the sulphur cycle. In the past, estimates of climate change have had to be made using one or, at best, a very small ensemble (tens rather than thousands) of model runs. By using participants’ computers, the project will be able to improve understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists.
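To make the “large ensemble” idea concrete, here’s a minimal sketch of how a perturbed-parameter ensemble can be generated. The parameter names and ranges below are made up for illustration — they are not the actual climateprediction.net parameter set — but the idea is the same: each ensemble member gets its own draw of every uncertain parameter from a plausible range.

```python
import random

# Hypothetical parameters and plausible ranges (illustrative only --
# not the real climateprediction.net parameter set).
PARAM_RANGES = {
    "cloud_fraction_threshold": (0.5, 0.9),
    "ice_fall_speed": (0.5, 2.0),       # m/s
    "entrainment_rate": (1.0, 9.0),
}

def sample_member(rng):
    """Draw one ensemble member: every uncertain parameter is
    perturbed uniformly within its plausible range."""
    return {name: rng.uniform(lo, hi)
            for name, (lo, hi) in PARAM_RANGES.items()}

def build_ensemble(n_members, seed=0):
    """Generate n_members independent parameter sets; each one
    would drive a separate run of the climate model."""
    rng = random.Random(seed)
    return [sample_member(rng) for _ in range(n_members)]

ensemble = build_ensemble(1000)
```

Each participant’s computer then runs the full model with one of these parameter sets, and the project compares the outputs across all of them to see which parameters the model is most sensitive to.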

After about a day and a half of operation my computer has managed to push through 0.043% of the model, and it is estimated that completion of this particular model will require approximately 477 hours of computing time. Given that the model only runs when my computer is running, this could take a while (in regard to the possible climatic effect of energy use…and my pocketbook…I don’t run my computer at night).
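As a back-of-the-envelope check, here’s the arithmetic behind that “this could take a while.” The ~10 hours/day of uptime is my own assumption (the machine is off at night), not anything the client reports.

```python
# Back-of-the-envelope completion estimate, using the client's own
# figures: ~477 hours of total computing time, 0.043% done so far.
total_hours = 477.0
done_fraction = 0.043 / 100  # client reports percent complete

remaining_hours = total_hours * (1 - done_fraction)

# Assuming ~10 hours of uptime per day (my guess -- the machine
# doesn't run at night), days until the model run finishes:
days_to_finish = remaining_hours / 10
```

In other words, barely a dent so far — essentially the full 477 hours still lie ahead, which at part-time uptime stretches out to a month and a half or more.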

*regardless of your stance on climate change…it kicks the crap out of your average screensaver.

*the image from my desktop shows the program analyzing data from 26/01/1811 0830

