Coinciding with the celebration in Lima of the XX Conference on Climate Change (COP 20), the WMO published a press release about the behavior of the first ten months of 2014. Among other interesting information, 2014 was on its track to become the hottest (or nearly the hottest) year ever recorded.
The press release also stated that this is largely due to the high temperatures of the ocean surface layers:
Evolution of the heat content in the oceans in the 0-700 meter layer. Source: NOAA
The Met Office also published a report that reaches essentially the same conclusions and presents some very interesting graphics:
Ranking of the warmest and coldest years. Source: Met Office
All this is the result of a huge research effort on what has been called the "detection of climate change": determining whether the observational series of different atmospheric variables, particularly temperature and precipitation, show a statistically significant trend in one direction or another. The results are in, and they are very significant.
However, what concerns the general public and the media is determining whether this warming is responsible for adverse weather events and whether we can blame anthropogenic greenhouse gas emissions.
Thus, whenever a significant weather event occurs, and especially if it has had an adverse effect on the population, the question journalists repeat most often is whether the event is related to anthropogenic climate change. Meteorologists' answer has always been the same: an isolated phenomenon can never be scientifically attributed to climate change, because it may fall within the natural variability of the atmosphere. And even if climate change did play a role, the exact proportion could not be known. They normally add: it is the sustained trend, an increase or decrease in this type of phenomena, that could be related to climate change.
The journalist agrees that this is the correct scientific answer, but... that is not a newspaper headline, as it does not point to climate change as responsible for the specific adverse meteorological situation that affected us. And that is what the general public is interested in. But things are now changing.
Aware of both the importance of the matter (its value for communication and public awareness) and its intrinsic scientific interest, the scientific community intensified its efforts from 2009-2010 onwards to try to answer the million-dollar question. This is how many "attribution" projects started, seeking to establish whether a particular phenomenon can be attributed to natural variability or not.
A first approach is purely statistical and is based on the study of precipitation and temperature series. It examines the variability over the years and asks to what extent the episode under study is statistically consistent with that variability or falls well outside it. If it falls outside, it would follow that some new forcing has possibly contributed to its generation. Naturally, this type of work requires long statistical series, and any conclusion about the "weirdness" of the episode does not by itself establish that its cause is anthropogenic.
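As a minimal sketch of this statistical idea, one can ask how far an observed episode sits from the historical distribution of a variable. All the numbers below are synthetic and purely illustrative, not from any real dataset:

```python
import numpy as np

# Hypothetical example: is a seasonal temperature anomaly unusual compared
# with a long observational series? The "series" here is synthetic noise
# standing in for ~100 years of observed anomalies (in degrees C).
rng = np.random.default_rng(42)
series = rng.normal(loc=0.0, scale=0.5, size=100)  # synthetic historical anomalies
episode = 1.8                                      # the episode under study

# Empirical view: how many past years reached this value?
exceedances = int(np.sum(series >= episode))
print(f"Years at or above {episode} degC: {exceedances} of {series.size}")

# Parametric view: distance from the mean in standard deviations
z = (episode - series.mean()) / series.std(ddof=1)
print(f"z-score: {z:.1f} standard deviations above the mean")
```

A very large z-score (or an empty exceedance count) only says the episode is rare under past variability; as the text notes, it does not by itself identify an anthropogenic cause.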
A much more "direct" method is based on climate models, in which the type of situation or episode is simulated either leaving the atmospheric dynamics "on their own", without any external conditioning, or applying a "forcing" by introducing current concentrations of greenhouse gases into the atmosphere. This way we can see whether the situation can be explained as a result of natural variability alone, or whether it could only happen when such gas concentrations are present.
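A common way to summarize such paired ensembles is the "fraction of attributable risk" (FAR), comparing how often the event threshold is exceeded in natural-only runs versus all-forcings runs. The ensemble counts below are made up for illustration:

```python
# Hypothetical ensemble counts: events exceeding a threshold in 1000
# "natural-only" model runs versus 1000 "all-forcings" runs.
natural_exceedances = 4    # assumed count, natural variability only
forced_exceedances = 40    # assumed count, with greenhouse gas forcing
n_runs = 1000

p_natural = natural_exceedances / n_runs   # probability without forcing
p_forced = forced_exceedances / n_runs     # probability with forcing

risk_ratio = p_forced / p_natural          # how many times more likely
far = 1.0 - p_natural / p_forced           # fraction of risk attributable to forcing

print(f"Risk ratio: {risk_ratio:.0f}x")    # -> 10x
print(f"FAR: {far:.2f}")                   # -> 0.90
```

Statements like "twenty times more likely" in the attribution literature are risk ratios of exactly this kind.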
In this context, the American Meteorological Society (AMS) launched an interesting initiative in 2012: publishing every year a special issue of its famous Bulletin compiling articles on the attribution of different adverse situations that occurred during the previous year in various parts of the world. The first issue, from 2012, collected some very interesting findings from 2010 and 2011, such as:
- Texas heat wave: it was related to "La Niña". It is now twenty times more likely during "La Niña" years than it was in the 1960s.
- Very cold December 2010 in the UK: its chances of occurrence have been halved as a result of anthropogenic climate change.
- Very warm November 2011 in the UK: its occurrence is now sixty times more likely than in the 1960s.
- Severe flooding in Thailand: no demonstrable relation to global warming.
If we turn to the 2013 report, we also find interesting information, such as:
- Heat waves in Australia, Europe, China, Japan and Korea: anthropogenic influence increased their likelihood of occurrence or their intensity.
- Cold spring in the UK: anthropogenic influence reduces its likelihood and intensity.
- Heavy rainfall in central and southern Europe: no anthropogenic influence was found, or at least it remains unclear.
- Heavy snowfall in the Pyrenees area: unclear anthropogenic influence.
However, the "problem" with these reports is that they publish their results several months later, once media interest has already faded and, with it, the chance of raising major social and political awareness... This is why the Met Office report mentioned above is so important. They have initiated a process (Christidis et al., 2014) to quickly determine the possible attribution of a specific temperature record to anthropogenic climate change. In fact, the Met Office states that the current average global temperatures would be very unlikely to occur without human influence on the climate.
Anyway, what is really important is that a way has been established to study the influence of climate change on extreme weather events around the world. Besides its great scientific scope, this represents an important step forward in raising awareness of a phenomenon which is already irreversible but could still be partially mitigated.
In this way, meteorologists can start answering the million-dollar question more accurately.