Some years ago I was sceptical about anthropogenic climate change. I now have to admit that I was wrong. Unfortunately, there is strong evidence that the Earth is warming very rapidly, that this change is largely attributable to human activity, and that it is already having and will continue to have very dangerous consequences. This post will set out why I have become convinced that climate change is a terrifying reality, with reference to the scientific evidence. I am not a climatologist, and I claim no authority of any kind on the subject; readers are invited to investigate the science for themselves. I welcome correction of any errors in the facts and figures.
We know that global warming is happening: Earth’s global average surface temperature has risen 0.55°C since the 1970s, and 0.8°C over the course of the twentieth century as a whole. It is important to understand that this is a long-term trend: weather is not climate, and a short span of cold weather does not mean that the Earth is not warming in the long term.
We also have good evidence that the present warming trend is unprecedented and anomalous in comparison with naturally caused climate changes in past centuries. Although direct global temperature records are only available from the mid-nineteenth century onwards – the “instrumental period” – temperatures before this time can be estimated using a number of proxy measurements. One is dendroclimatology, the analysis of tree rings to obtain data about historic temperatures; another is the measurement of the relative amounts of different isotopes of oxygen (oxygen-16 and oxygen-18) in ice cores and deep-sea sediments.
Contrary to a popular myth, we know that the warming trend evident from the data is not primarily attributable to the “urban heat islands” resulting from growing urbanization. Climatologists have long recognized the effect of urbanization on local temperature measurements, and account for it by homogenizing anomalous urban temperature trends with those from neighbouring rural stations, a process described in some detail in Hansen et al (2001). Hausfather et al (2013) investigated the effect of urbanization on temperature measurements in the contiguous United States: they found that no more than 21% of the increases in unadjusted minimum temperatures in the United States since 1895, and no more than 9% of those since 1960, were attributable to urbanization, and that homogenization of the data removed most of this apparent urban bias. In another study, Jones et al (2008) investigated the effect of urbanization on temperature measurements in China, where there has been rapid urban development in the past several decades: they found that only 0.1°C of warming in China from 1959 to 2004 was attributable to urbanization, with 0.81°C attributable to climatic factors. Their paper is behind a paywall, but the methodology is explained briefly at RealClimate.org. In short, the “urban heat island” effect is already accounted for in the data, and does not explain the warming trend of the past century.
We also know that, contrary to another popular myth, the recent temperature increase isn’t attributable to solar activity: in fact, solar output has declined slightly since 1960, during the same period in which temperatures on Earth have been rising. Natural forcings have certainly caused dramatic climate changes in the past – Milankovitch cycles, slight periodic variations in the Earth’s orbit, are believed to have been a dominant factor in causing past ice ages and thaws – but that is not what is happening in this case. Rather, there is evidence that the present warming trend is primarily attributable to human activity.
We know that human activity has released unprecedented amounts of carbon dioxide into the atmosphere since 1850, and that this has been happening faster than the oceans and vegetation can absorb it. The concentration of carbon dioxide in the atmosphere has increased from 280 to nearly 380 parts per million. We also know that the increase in atmospheric carbon dioxide levels is attributable mainly to burning fossil fuels and to deforestation. Among other evidence, this is supported by the observation that the ratio of carbon-13 to carbon-12 isotopes in the atmosphere has been falling: plants preferentially take up the lighter carbon-12 isotope, so fossil fuels, formed from ancient plant matter, contain a lower carbon-13 to carbon-12 ratio than does the carbon naturally occurring in the atmosphere.
We know that carbon dioxide, along with some other gases occurring in the atmosphere such as water vapour and methane, is a greenhouse gas: that is, it absorbs and emits infrared radiation at certain wavelengths, and therefore prevents some of the infrared radiation from the Earth’s surface from escaping into space. The natural greenhouse effect, caused by the natural concentrations of greenhouse gases in the atmosphere, is in part what makes Earth warm enough to be habitable in the first place. An increase in the concentration of greenhouse gases in the atmosphere will, all other things being equal, cause the Earth to get warmer. All of this is taught in school science in most countries, and is not itself controversial. We know therefore that an increase in atmospheric carbon dioxide affects radiative forcing, which is “a measure of how the energy balance of the Earth-atmosphere system is influenced when factors that affect climate are altered. The word radiative arises because these factors change the balance between incoming solar radiation and outgoing infrared radiation within the Earth’s atmosphere.” In other words, it is the change in net energy flow at the top of the atmosphere: more energy flowing in than flowing out means that the system will warm, and the reverse means that the system will cool. Radiative forcing is measured at the tropopause – the top of the troposphere, which is the lowest layer of the atmosphere – and is conventionally expressed in watts per square metre.
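The dependence of radiative forcing on carbon dioxide concentration is approximately logarithmic, and a commonly used simplified expression (from Myhre et al (1998)) is ΔF ≈ 5.35 × ln(C/C₀) watts per square metre, where C₀ is a reference concentration. As a rough sketch – not a substitute for the full radiative-transfer calculations climatologists actually use – this lets us put approximate numbers on the concentrations mentioned above:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate radiative forcing (W/m^2) from a change in CO2
    concentration, using the simplified logarithmic expression
    dF = 5.35 * ln(C / C0) from Myhre et al (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing from the rise of ~280 to ~380 parts per million:
print(round(co2_forcing(380, 280), 2))  # roughly 1.63 W/m^2

# Forcing from a doubling of CO2, which reproduces the ~3.7 W/m^2
# consensus figure used by the IPCC:
print(round(co2_forcing(560, 280), 2))  # roughly 3.71 W/m^2
```

Note that 5.35 × ln(2) ≈ 3.7, which is how the widely quoted figure for a doubling of carbon dioxide arises.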
Climate sensitivity measures the extent of the effect of changes in radiative forcing on surface temperatures: for each increase of 1 watt per square metre of radiative forcing, how much will temperatures on the Earth’s surface increase? This is a complicated question, and one which has been extensively studied by climatologists. According to Knutti and Hegerl (2008), reviewing a range of studies, climate sensitivity – here defined as the effect on global surface temperatures if the amount of carbon dioxide in the atmosphere were to double – is likely to lie between 2°C and 4.5°C. This is based on a variety of different observations, both on observations from the instrumental period – the time during which temperatures have been systematically measured, roughly the past 150 years – and on paleoclimate data. Some recent studies, such as Hargreaves et al (2012), use data from the cooling during the last glacial period.
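The two ways of stating climate sensitivity – degrees per watt per square metre, and degrees per doubling of carbon dioxide – are linked by the forcing for a doubling, roughly 3.7 watts per square metre. A back-of-the-envelope conversion, assuming a simple linear response ΔT = λ × ΔF (a standard first approximation, not the full treatment in the studies cited):

```python
F_DOUBLING = 3.7  # W/m^2, consensus forcing for a doubling of CO2

def sensitivity_per_watt(warming_per_doubling_c):
    """Convert climate sensitivity expressed in degrees C per CO2
    doubling into degrees C per W/m^2 of radiative forcing,
    assuming a linear response dT = lambda * dF."""
    return warming_per_doubling_c / F_DOUBLING

# The 2-4.5 degrees C per doubling range from Knutti and Hegerl (2008):
low, high = sensitivity_per_watt(2.0), sensitivity_per_watt(4.5)
print(f"{low:.2f} to {high:.2f} degrees C per W/m^2")
# roughly 0.54 to 1.22 degrees C per W/m^2
```

So the likely range corresponds to roughly half a degree to one and a quarter degrees of eventual warming for each watt per square metre of sustained forcing.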
Any uncertainty about climate sensitivity is eagerly exploited by those who deny that climate change is happening: Viscount Monckton of Brenchley, a vocal climate change sceptic who is not himself a climatologist, has published a non-peer-reviewed paper arguing that climate sensitivity is far lower than the consensus estimate. In response, Gavin Schmidt of NASA has accused him of a number of egregious errors, chief among them that Monckton has no coherent justification for reducing by two-thirds the accepted estimate of the radiative forcing attributable to carbon dioxide. I am not a climatologist and am not in a position to pronounce on a question this technical, but Schmidt is an eminent climatologist at NASA’s Goddard Institute for Space Studies, and he relies upon a substantial body of peer-reviewed research on the subject of radiative forcing. The estimates of radiative forcing used by the IPCC, derived from a number of studies, give a consensus figure of around 3.7 watts per square metre for the radiative forcing which would result from a doubling of atmospheric carbon dioxide. The accuracy of the models has been confirmed both by measurements of the increased amount of infrared radiation reaching the Earth’s surface at the wavelengths absorbed by carbon dioxide, and by satellite measurements of the reduced amount of infrared radiation escaping into space at those same wavelengths. Monckton’s claims fly in the face of well-established science.
In short, the evidence is clear that the Earth is warming at a relatively rapid rate, and that this warming is in large part the result of human activity since the beginning of industrial times – large-scale deforestation and the burning of fossil fuels – which has led to a growing concentration of greenhouse gases, in particular carbon dioxide, in the atmosphere. And there is evidence, too, of catastrophic consequences, including a growing risk of droughts, floods and heat waves, dealt with in some depth by Smith et al (2008). This is a social justice issue too. In our unjust global economic system, the adverse consequences of climate change fall overwhelmingly on people in developing countries, while it is the developed world which is responsible for most of the world’s carbon dioxide emissions.
My present position was not easily reached. Most of us are instinctively inclined, with good reason, to be sceptical about predictions of catastrophe, especially when those predictions require us to make radical changes to our ways of life. And there have certainly been media scares in the past which bore little relation to the scientific evidence. But global warming is not a media scare: it is a reality, established by a substantial corpus of peer-reviewed scientific research.