Tuesday, December 9, 2008

The future of climate change is in Linux's hands

The Millennium Simulations, an earth modeling venture at the Max Planck Institute for Meteorology, will allow scientists to model the changes in the world's climate over the last millennium as well as centuries into the future.

By factoring in human influences on carbon, including changes in land use, as well as natural phenomena including volcanic activity, the Millennium Simulations will provide an insight into how the earth's climate will change over the coming decades and centuries.

It's this information that will feed into the next assessment report of the Intergovernmental Panel on Climate Change, the body whose findings are fed to the highest levels of government to help them make decisions on the environment.

The Millennium Simulations' Earth System Model contains sub-models for atmosphere, ocean, land-surface, and ocean-biogeochemistry and features an "interactive carbon cycle".
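To make the idea of an "interactive carbon cycle" concrete, here is a purely illustrative toy in Python: the coupled components exchange carbon every simulated year instead of reading CO2 from a prescribed scenario table. Every name, number and uptake fraction below is invented for illustration and bears no relation to the real MPI-M model code.

    # Toy sketch of an "interactive" carbon cycle: atmosphere, ocean and land
    # exchange carbon each step rather than the model reading CO2 from a fixed
    # scenario table. All names and coefficients are invented for illustration.

    PREINDUSTRIAL_PPM = 280.0

    def ocean_uptake(co2_ppm):
        """Ocean absorbs a small fraction of the CO2 excess (toy value)."""
        return 0.02 * max(co2_ppm - PREINDUSTRIAL_PPM, 0.0)

    def land_uptake(co2_ppm):
        """Land biosphere absorbs a smaller fraction of the excess (toy value)."""
        return 0.01 * max(co2_ppm - PREINDUSTRIAL_PPM, 0.0)

    co2_ppm = PREINDUSTRIAL_PPM
    for year in range(1000):                 # a "millennium" of annual steps
        co2_ppm += 0.1                       # hypothetical emissions and land-use change
        co2_ppm -= ocean_uptake(co2_ppm)     # carbon handed back to the ocean...
        co2_ppm -= land_uptake(co2_ppm)      # ...and the land, interactively, every step

    print(f"CO2 after 1,000 simulated years: {co2_ppm:.1f} ppm")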

With scientists across Europe and beyond contributing information to the project and analyzing the results, the Millennium Simulations needs a lot of bandwidth - bandwidth provided by GÉANT2, the high-speed network linking national research and education networks across the continent.

So far, the simulations have racked up more than 50 terabytes of data, Reinhard Budich, project manager at the Max Planck Institute for Meteorology, told silicon.com.

"One of the simulation runs we do of, say, 1,000 years produced something like 2.5 terabytes of data and these data have to be shared. We do not only one run, we do many runs because we have to adjust the models to the different forcings [forcing factors - elements that can affect the climate models] we have and then we do model runs with different forcings switched off and on." Using Gant2, the Institute can make the simulations available to earth scientists, meteorologists, historians and other interested parties around the world.

According to Budich, the contribution of GÉANT2 - which is also used by scientists working on the Large Hadron Collider - cannot be overstated.

"We, as a project, depend on Gant... Such an infrastructure initiative like Gant is really indispensible for us." The data from the Millennium Simulations is stored at the World Datacenter for Climate, housed in the German Climate Computing Centre in Hamburg - the largest Linux-based database in the world, Budich told silicon.com, holding 400 terabytes of data. The link between the Max Planck Institute and the German Climate Computing Center is a theoretical 655Mpbs.

"This is what we can afford so far - it's a matter of cost as well. How much of that [bandwidth] you really get from that for your little project is one of those things where I'd like to see improvements," Budich said.

"We as researchers have to share bandwidth between us - there are about 400 registered users at this computing centre so you can imagine the bandwidth we get for our transfer. It depends really on time of day, usage of others, etc. What I'd like to see in the long run is that I can reserve bandwidth for an application, I can say: 'Today between noon and 1.30 I want to transfer this much data to somewhere. Give me that bandwidth'. That's the quality of service I'd like to see."

It's an option Budich believes vendors are open to, and one they could adopt as a business model in the future.
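To see why such a reservation matters, the sketch below works out how much bandwidth would be needed for the scenario Budich describes. The helper function and the 90-minute window (reading "noon to 1.30" as one transfer slot) are assumptions for illustration; the 655Mbps link speed and 2.5TB run size are the article's figures.

    # Illustrative arithmetic for the bandwidth-reservation idea; the helper
    # function and the 90-minute window are hypothetical, the link speed and
    # run size come from the article.

    def required_bandwidth_mbps(data_tb, window_minutes):
        """Mbps needed to move `data_tb` terabytes within `window_minutes`."""
        bits = data_tb * 1e12 * 8
        return bits / (window_minutes * 60) / 1e6

    LINK_MBPS = 655   # theoretical Max Planck <-> climate computing centre link

    # "Between noon and 1.30 I want to transfer this much data" - a 90-minute window
    needed = required_bandwidth_mbps(data_tb=2.5, window_minutes=90)
    print(f"Needed: ~{needed:,.0f} Mbps vs {LINK_MBPS} Mbps theoretical link")
    # Even a full reservation of today's link would fall well short for a whole
    # run; smaller slices or longer windows would be needed.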

Storage too has proved a bugbear for the climate researchers: retrieving data is usually a matter of minutes, but scientists occasionally have to wait hours for information.

"Many of those data have to reside on tape because we just can't afford enough disk space for that... Those data that are asked for more often the database keeps on disc basically but those asked for seldom are offloaded onto tape."
