Is it truly possible to model 30 years of worldwide weather measurements and come up with one simple prediction for climate change on a global scale? Dr. Frédéric Laliberté from the Physics Department at the University of Toronto and his colleagues may have done just that.

Published in the journal Science, Laliberté’s research employs a remarkable method using high-frequency data, which, in the world of climate modelling, means observations of weather taken every six hours. Six-hourly readings might not sound significant on their own, except that measurements of sufficient quality for modelling have been collected by scientists and meteorologists for the past 30 years. Keep in mind also that weather is a local phenomenon: in addition to being different every six hours, it is different in every part of the world.

By combining this host of data from various sources, including satellites, rain gauges, and temperature readings taken at ground level, through a method of producing datasets that atmospheric physicists call reanalysis, scientists can assemble information totalling tens of terabytes. The interesting part is using that amount of data all at once, rather than breaking it into smaller chunks to draw conclusions on a smaller scale. Laliberté reports that his team used almost 20 terabytes of data for their modelling.
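
To get a sense of how those terabytes add up, here is a rough back-of-the-envelope calculation in Python. The grid resolution, number of variables, and vertical levels below are illustrative assumptions, not the specifications of the reanalysis actually used in the study.

```python
# Rough, illustrative estimate of how a six-hourly global reanalysis
# grows to tens of terabytes over 30 years. All resolution figures are
# assumptions for the sake of the example, not the study's actual dataset.

obs_per_day = 24 // 6                 # one snapshot every six hours
snapshots = obs_per_day * 365 * 30    # 30 years of snapshots

lat_points = 361                      # ~0.5-degree global grid (assumed)
lon_points = 720
levels = 40                           # assumed number of vertical levels
variables = 6                         # e.g. temperature, humidity, winds (assumed)
bytes_per_value = 4                   # 32-bit floating point

values_per_snapshot = lat_points * lon_points * levels * variables
total_terabytes = values_per_snapshot * snapshots * bytes_per_value / 1e12

print(f"Snapshots: {snapshots:,}")
print(f"Approximate volume: {total_terabytes:.1f} TB")
```

Even with these modest assumed resolutions, the total lands around 11 TB, the same order of magnitude as the nearly 20 terabytes the team reports.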

“Most climate scientists will use simple data to come to a simple analysis because it’s easier to understand,” says Laliberté, “so what we’ve tried to do is use a lot of data and come to a very simple description.”

Although the data used may have been complex and enormous in size, the physical foundations on which the model was built are among the most fundamental. Using basic theoretical principles of thermodynamics, the researchers modelled the global atmosphere as a simple thermal structure called a “heat engine.” In a heat engine, heat in a system (in this case, the atmosphere) is converted into mechanical power, and that power needs an outlet, such as a giant storm. Those of you who have taken high school physics may recognize in this description the principle of conservation of energy, one of the most basic laws of nature known to science.
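
As a rough sketch of the heat-engine idea, and not the actual model in the paper, the snippet below applies the first-law energy balance: the mechanical work the engine can deliver is the heat it takes in minus the heat it gives off, and the ratio of work to heat input is its efficiency. The numbers are placeholders chosen only to make the arithmetic concrete.

```python
# Toy heat-engine energy balance (illustrative, not the study's model).
# First law of thermodynamics: work output = heat absorbed - heat rejected.

def heat_engine(q_in: float, q_out: float) -> tuple[float, float]:
    """Return (mechanical work, efficiency) for heat fluxes in W/m^2."""
    work = q_in - q_out          # conservation of energy
    efficiency = work / q_in     # fraction of incoming heat turned into motion
    return work, efficiency

# Placeholder heat fluxes for an atmosphere-like engine, in W/m^2 (assumed).
work, eff = heat_engine(q_in=240.0, q_out=238.0)
print(f"Mechanical output: {work:.1f} W/m^2, efficiency: {eff:.1%}")
```

In this picture, the work output is what powers winds and storms; the question the study asks is what happens to that output as the planet warms.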

What happens when you take this model that combines our most basic understanding of physics with one of the most extensive weather datasets at our disposal, and apply global warming?

In Laliberté’s own words, “We’ve looked at the atmosphere as a heat engine, and it allowed us to on the one hand identify the impact of global warming on the heat engine, so we concluded that the overall descending and ascending motions associated with storms would be constrained.”

What this means is that global warming may bring a future with fewer but more intense storms, at the expense of the smaller, more common ones. Less energy is put into circulating the global atmosphere; instead, the atmosphere builds up energy over a long period of time and then releases it in the form of superstorms.
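
To make that trade-off concrete (my own toy framing, not a calculation from the paper): if the work budget available to power storms stays roughly fixed while the energy carried by each storm goes up, the number of storms the budget can sustain has to come down.

```python
# Toy illustration of a fixed work budget shared among storms.
# All numbers are arbitrary units chosen for illustration only.

def storms_supported(work_budget: float, energy_per_storm: float) -> float:
    """Number of storms a given mechanical work budget can power."""
    return work_budget / energy_per_storm

budget = 100.0                                          # work available per year (arbitrary)
print(storms_supported(budget, energy_per_storm=1.0))   # many smaller storms -> 100.0
print(storms_supported(budget, energy_per_storm=5.0))   # fewer, stronger storms -> 20.0
```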

These results caused a ruckus by coinciding conveniently with Winter Storm Juno, the blizzard you may remember for dumping a heap of snow on the East Coast of North America. When asked if his model’s prediction is already coming true, Laliberté responds, “The way I’ve been approaching this, and I think the safest way to link weather to climate, is to use weather to understand climate and not the other way around.”

He explains that even though Juno, a large storm carrying a great deal of moisture, fits the kind of storm his research describes, it may be too soon to attribute its occurrence to the change in climate.
“It might be the one that’s really favoured in the study,” he says, “but then — it’s impossible to say.”