Tuesday, April 9, 2013

Broken Hockey stick?


    Apparently the average temperature has been flat for a couple of years. Below, Fred Pearce at Yale Environment 360 explains current theories.
     This provides an opportunity for "go slow" arguments. See the Economist article below:

Probing the Reasons Behind The Changing Pace of Warming

by Fred Pearce, originally published by Yale Environment 360
A consensus is emerging among scientists that the rate of global warming has slowed over the last decade. While they are still examining why, many researchers believe this phenomenon is linked to the heat being absorbed by the world’s oceans.
Whatever happened to global warming? Right now, that question is a good way of starting a heated argument. Some say it is steaming ahead. But others say it has stalled, gone into reverse, or never happened at all — and they don’t all run oil companies or vote Republican.

So what is going on?

First, talk of global cooling is palpable nonsense. This claim relies on the fact that no year has yet been hotter than 1998, an exceptional year with a huge planet-warming El Nino in the Pacific Ocean. Naysayers pretend that 1998 was typical, when it was anything but, and that temperatures have been declining since, which is statistical sleight of hand.
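The "statistical sleight of hand" can be illustrated with synthetic data (this sketch is mine, not from the article, and the numbers are assumed for illustration): fit a least-squares trend to a series with steady underlying warming plus one exceptionally hot year, and the trend measured from that hot year onward comes out much smaller than the trend over the full record.

```python
# Illustrative sketch with made-up numbers: starting a trend line at an
# exceptionally hot year (like 1998) understates the underlying warming.

def ols_slope(ys):
    """Ordinary least-squares slope of ys against the index 0..n-1."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Assumed: 0.02 C/yr underlying warming 1990-2012, plus a 0.3 C
# El Nino spike in 1998 alone.
years = list(range(1990, 2013))
temps = [0.02 * (y - 1990) + (0.3 if y == 1998 else 0.0) for y in years]

full_trend = ols_slope(temps)                         # close to the true 0.02
post_1998_trend = ols_slope(temps[years.index(1998):])  # noticeably smaller
print(full_trend > post_1998_trend > 0)  # True: trend shrinks but stays positive
```

Treating the anomalous start year as the baseline shrinks the measured trend without the underlying warming rate having changed at all.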

Meanwhile consider this. According to the National Oceanic and Atmospheric Administration (NOAA), all 12 years of the new century rank among the 14 warmest since worldwide record-keeping began in 1880. The second-warmest year on record, after 1998, was 2010. This is not evidence of cooling.

But there is a growing consensus among temperature watchers that the pace of warming in the atmosphere, which began in earnest in the 1970s and seemed to accelerate in the 1990s, has slackened, or stalled, or paused, or whatever word you choose. It may turn out to be a short blip; but it is real. “Although the first decade of the 21st century was the warmest on record, warming has not been as rapid since 2000,” says Pete Stott, head of climate monitoring and attribution at the UK’s Met Office, one of the leading keepers of the global temperature. He calls it a “hiatus” in warming.

In a blog last week, James Hansen, the retiring head of NASA’s Goddard Institute for Space Studies (GISS), agreed that “the rate of global warming seems to be less this decade than it has been during the prior quarter century.”

Something is going on. With heat-trapping greenhouse gases accumulating in the atmosphere ever faster, we might expect accelerated warming. So it needs explaining.

There are a number of theories. Hansen suggested that extra emissions of particles in Asian smogs could be shading the Earth and camouflaging the greenhouse effect. In a February post on RealClimate, his Goddard Institute colleague Gavin Schmidt instead pointed to fewer warming El Ninos and more cooling La Ninas. He suggested that adjusting for their influence produced an unbroken pattern of warming.

Schmidt’s analysis certainly hints at a role for the oceans in all this. And most researchers on the case argue that, one way or another, the most likely explanation for the heating hiatus is that a greater proportion of the greenhouse warming has been diverted from the atmosphere into heating the oceans. A new study from Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado, published online in Geophysical Research Letters, found that ocean warming has been accelerating over the last 15 years.

Richard Allan of the University of Reading in England says simply: “Warming over the last decade has been hidden below the ocean surface.” If you take the oceans into account, he says, “global warming has actually not slowed down.”

This should not come as a surprise, notes Chris Rapley of University College London. The oceans are the planet’s main heat sinks. More than 90 percent of the extra heat trapped in the atmosphere by greenhouse gases ends up there. But, while climate models are good at calculating atmospheric processes, they are poorer at plumbing the ocean-atmosphere interactions that determine how fast and how regularly this happens.

That makes those interactions a big source of uncertainty about atmospheric global warming, especially over the short term. If oceans grab a bit more heat one year, they can shut down that year’s warming. Equally, if they release a bit more they can accelerate atmospheric warming. This matters. “The way the ocean distributes the extra energy trapped by rising greenhouse gases is critical... [to] global surface temperatures,” says Allan. For forecasters trying to figure out the next decade or so, oceans could screw it all up.

Some bits of the puzzle have been known for a while. For instance, during El Nino years, warm water spreads out across the equatorial Pacific and the ocean releases heat into the air, warming the air measurably. That is what happened in 1998.

But while El Ninos come and go within a year or so, there are other cycles in heat distribution and circulation of the oceans that operate over decades. They include the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation (AMO), both of which have been implicated in climate fluctuations in the 20th century. So have these or other ocean cycles been accelerating the uptake of heat by the oceans?

Virginie Guemas of the Catalan Institute of Climate Sciences in Barcelona believes so. In a paper published in Nature Climate Change this week, she claims to provide the first “robust” evidence linking ocean uptake of heat directly to what she calls the recent “temperature plateau” in the atmosphere.

By plugging detailed measurements of recent atmospheric and sea temperatures into EC-Earth, a European model of interactions between atmosphere, oceans, ice and land surfaces, Guemas found that about 40 percent of the take-up was in the tropical Pacific, and another 40 percent in the tropical and North Atlantic.

She told me that it seems likely the changing thermohaline ocean circulation, which starts in the North Atlantic, plus the cycles of El Nino and perhaps the AMO, may play a prominent role. She thinks her model could have predicted the recent slowdown of atmospheric warming ahead of time.

That would be a breakthrough, but nobody has done it yet. Meanwhile, the climate modellers are skating on thin ice when they make predictions that play out over the timescales of a decade or so on which ocean cycles seem to operate. These forecasters can claim that, all things considered, they have done pretty well. But the forecasts remain hostages to fortune.

If anything, the recent pause shows the model forecasts in a good light. Myles Allen, a climate modeller at Oxford University in England, reported in Nature Geoscience last month on an audit of one of his own forecasts, which he made in 1999. He had predicted a warming of a quarter-degree Celsius between the decade that ended in 1996 and the decade that ended in 2012. He found that, in the real world, temperatures got too warm too soon during the 1990s; but the slackening pace since had brought the forecast right back on track.

That shows the forecast is performing well so far, but Allen admitted it might not stay that way. If temperatures flat-line out to 2016, his model’s prediction for that year will look no better than a forecast based on a series of random fluctuations.

Some in the mainstream climate community privately admit that they were caught out by the slackening pace of warming in the past decade or so. Back in the 1990s, some suggested — or at least went along with — the idea that all the warming then was a result of greenhouse gases. They were slow to admit that other factors might also be at work, and later failed to acknowledge the slowdown in warming. As Allen pointed out earlier this year: “A lot of people were claiming in the run-up to the Copenhagen 2009 [climate] conference that warming was accelerating. What has happened since then has demonstrated that it is foolish to extrapolate short-term climate trends.”

Not surprisingly they have been taken to task for this hubris. Roger Pielke Jr., an environmental studies professor at the University of Colorado at Boulder, who enjoys baiting the mainstream, told me last month: “It is good to see climate scientists catching up with the bloggers. They should ask why it took so long to acknowledge what has been apparent to most observers for some time.”

But modellers are now responding more actively to the new real-world temperature data. For instance, the Met Office’s Stott reported last month that global temperatures were following the “lower ranges” of most model forecasts, and that higher projections were now “inconsistent” with the temperature record.

And last December, the Met Office downgraded its best guess for temperatures in the five years to 2017 from 0.54 degrees C above the late-20th-century average to 0.43 degrees above it. It said the new forecast was the first output of its latest climate model, HadGEM3, which incorporates new assessments of natural cycles.

But the problem is that these cycles are not well integrated into most climate models. Natural cycles could switch back to warming us again at any time, admits Stott. But he has no clear idea when.

The stakes for the climate forecasting community are high. It may be unfair, but the brutal truth is that if the climatologists get their forecasts for the coming decade badly wrong, then a great many in the public will simply not believe what they have to say about 2050 or 2100 – even though those forecasts may well be more reliable.

Forecasters badly need a way to forecast the ocean fluctuations, and it could just be that Guemas’s new study will help them do that. She claims that her findings open the way to the future delivery of “operational decadal climate predictions.” For now she is cautious about making firm predictions, but told me she believes that “the heat that has been absorbed recently by the ocean might very well be released back to the atmosphere soon. This would be the scenario of highest probability. It would mean an increased rate of [atmospheric] warming in the next decade.”

It would indeed. If natural cycles start pushing towards strong warming, they will add to the continued inexorable upward push from rising concentrations of heat-trapping greenhouse gases. In that case, we would see climate change returning to the rapid pace of the 1990s. Whatever happened to global warming? The odds may be that by 2020 it will have come roaring back.

Climate science

A sensitive matter

The climate may be heating up less in response to greenhouse-gas emissions than was once thought. But that does not mean the problem is going away

OVER the past 15 years air temperatures at the Earth’s surface have been flat while greenhouse-gas emissions have continued to soar. The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010. That is about a quarter of all the CO₂ put there by humanity since 1750. And yet, as James Hansen, the head of NASA’s Goddard Institute for Space Studies, observes, “the five-year mean global temperature has been flat for a decade.”
Temperatures fluctuate over short periods, but this lack of new warming is a surprise. Ed Hawkins, of the University of Reading, in Britain, points out that surface temperatures since 2005 are already at the low end of the range of projections derived from 20 climate models (see chart 1). If they remain flat, they will fall outside the models’ range within a few years.
The mismatch between rising greenhouse-gas emissions and not-rising temperatures is among the biggest puzzles in climate science just now. It does not mean global warming is a delusion. Flat though they are, temperatures in the first decade of the 21st century remain almost 1°C above their level in the first decade of the 20th. But the puzzle does need explaining.
The mismatch might mean that—for some unexplained reason—there has been a temporary lag between more carbon dioxide and higher temperatures in 2000-10. Or it might be that the 1990s, when temperatures were rising fast, was the anomalous period. Or, as an increasing body of research is suggesting, it may be that the climate is responding to higher concentrations of carbon dioxide in ways that had not been properly understood before. This possibility, if true, could have profound significance both for climate science and for environmental and social policy.
The insensitive planet
The term scientists use to describe the way the climate reacts to changes in carbon-dioxide levels is “climate sensitivity”. This is usually defined as how much hotter the Earth will get for each doubling of CO₂ concentrations. So-called equilibrium sensitivity, the commonest measure, refers to the temperature rise after allowing all feedback mechanisms to work (but without accounting for changes in vegetation and ice sheets).
Carbon dioxide itself absorbs infra-red at a consistent rate. For each doubling of CO₂ levels you get roughly 1°C of warming. A rise in concentrations from preindustrial levels of 280 parts per million (ppm) to 560ppm would thus warm the Earth by 1°C. If that were all there was to worry about, there would, as it were, be nothing to worry about. A 1°C rise could be shrugged off. But things are not that simple, for two reasons. One is that rising CO₂ levels directly influence phenomena such as the amount of water vapour (also a greenhouse gas) and clouds that amplify or diminish the temperature rise. This affects equilibrium sensitivity directly, meaning doubling carbon concentrations would produce more than a 1°C rise in temperature. The second is that other things, such as adding soot and other aerosols to the atmosphere, add to or subtract from the effect of CO₂. All serious climate scientists agree on these two lines of reasoning. But they disagree on the size of the change that is predicted.
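The "roughly 1°C per doubling" relationship above is logarithmic, so it can be written as a one-line formula: the warming scales with the number of doublings of concentration, log₂(C/C₀). A minimal sketch (my illustration, not from the article; the per-doubling sensitivity is the knob the whole debate is about):

```python
import math

def warming_from_co2(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=1.0):
    """Temperature response (deg C) to a CO2 rise from c0_ppm to c_ppm,
    assuming the response scales with the number of doublings."""
    doublings = math.log2(c_ppm / c0_ppm)
    return sensitivity_per_doubling * doublings

# CO2 alone, no feedbacks: 280 -> 560 ppm is one doubling, about 1 C
print(warming_from_co2(560))  # 1.0
# With the IPCC best-estimate equilibrium sensitivity of 3 C per doubling:
print(warming_from_co2(560, sensitivity_per_doubling=3.0))  # 3.0
```

The logarithm is why each extra ppm matters a little less than the one before: going from 280 to 560 ppm buys the same warming as going from 560 to 1,120 ppm.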
The Intergovernmental Panel on Climate Change (IPCC), which embodies the mainstream of climate science, reckons the answer is about 3°C, plus or minus a degree or so. In its most recent assessment (in 2007), it wrote that “the equilibrium climate sensitivity…is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C and is very unlikely to be less than 1.5°C. Values higher than 4.5°C cannot be excluded.” The IPCC’s next assessment is due in September. A draft version was recently leaked. It gave the same range of likely outcomes and added an upper limit of sensitivity of 6°C to 7°C.
A rise of around 3°C could be extremely damaging. The IPCC’s earlier assessment said such a rise could mean that more areas would be affected by drought; that up to 30% of species could be at greater risk of extinction; that most corals would face significant biodiversity losses; and that there would be likely increases of intense tropical cyclones and much higher sea levels.
New Model Army
Other recent studies, though, paint a different picture. An unpublished report by the Research Council of Norway, a government-funded body, which was compiled by a team led by Terje Berntsen of the University of Oslo, uses a different method from the IPCC’s. It concludes there is a 90% probability that doubling CO₂ concentrations will increase temperatures by only 1.2-2.9°C, with the most likely figure being 1.9°C. The top of the study’s range is well below the IPCC’s upper estimates of likely sensitivity.
This study has not been peer-reviewed; it may be unreliable. But its projections are not unique. Work by Julia Hargreaves of the Research Institute for Global Change in Yokohama, which was published in 2012, suggests a 90% chance of the actual change being in the range of 0.5-4.0°C, with a mean of 2.3°C. This is based on the way the climate behaved about 20,000 years ago, at the peak of the last ice age, a period when carbon-dioxide concentrations leapt. Nic Lewis, an independent climate scientist, got an even lower range in a study accepted for publication: 1.0-3.0°C, with a mean of 1.6°C. His calculations reanalysed work cited by the IPCC and took account of more recent temperature data. In all these calculations, the chances of climate sensitivity above 4.5°C become vanishingly small.
If such estimates were right, they would require revisions to the science of climate change and, possibly, to public policies. If, as conventional wisdom has it, global temperatures could rise by 3°C or more in response to a doubling of emissions, then the correct response would be the one to which most of the world pays lip service: rein in the warming and the greenhouse gases causing it. This is called “mitigation”, in the jargon. Moreover, if there were an outside possibility of something catastrophic, such as a 6°C rise, that could justify drastic interventions. This would be similar to taking out disaster insurance. It may seem an unnecessary expense when you are forking out for the premiums, but when you need it, you really need it. Many economists, including William Nordhaus of Yale University, have made this case.
If, however, temperatures are likely to rise by only 2°C in response to a doubling of CO₂ concentrations (and if the likelihood of a 6°C increase is trivial), the calculation might change. Perhaps the world should seek to adjust to (rather than stop) the greenhouse-gas splurge. There is no point buying earthquake insurance if you do not live in an earthquake zone. In this case more adaptation rather than more mitigation might be the right policy at the margin. But that would be good advice only if these new estimates really were more reliable than the old ones. And different results come from different models.
One type of model—general-circulation models, or GCMs—uses a bottom-up approach. These divide the Earth and its atmosphere into a grid which generates an enormous number of calculations in order to imitate the climate system and the multiple influences upon it. The advantage of such complex models is that they are extremely detailed. Their disadvantage is that they do not respond to new temperature readings. They simulate the way the climate works over the long run, without taking account of what current observations are. Their sensitivity is based upon how accurately they describe the processes and feedbacks in the climate system.
The other type—energy-balance models—are simpler. They are top-down, treating the Earth as a single unit or as two hemispheres, and representing the whole climate with a few equations reflecting things such as changes in greenhouse gases, volcanic aerosols and global temperatures. Such models do not try to describe the complexities of the climate. That is a drawback. But they have an advantage, too: unlike the GCMs, they explicitly use temperature data to estimate the sensitivity of the climate system, so they respond to actual climate observations.
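The "few equations" of an energy-balance model can be made concrete. The simplest version treats the whole planet as a single box whose temperature change is the imbalance between radiative forcing and outgoing feedback, C·dT/dt = F − λT. The sketch below is an illustration of that idea only; the parameter values (λ, heat capacity, the 3.7 W/m² doubling forcing) are conventional ballpark assumptions, not fitted to data:

```python
# Minimal zero-dimensional energy-balance model: C * dT/dt = F(t) - lambda * T
# Parameter values are illustrative assumptions, not fitted estimates.

def run_ebm(forcings, lam=1.2, heat_capacity=8.0, dt_years=1.0):
    """Step a one-box climate forward under a sequence of annual forcings
    (W/m^2). lam is the feedback parameter (W/m^2 per deg C); heat_capacity
    (W*yr/m^2 per deg C) roughly represents an ocean mixed layer.
    Returns the temperature anomaly (deg C) after each step."""
    temp = 0.0
    temps = []
    for f in forcings:
        temp += (f - lam * temp) * dt_years / heat_capacity
        temps.append(temp)
    return temps

# A sustained forcing of 3.7 W/m^2 (roughly a CO2 doubling) relaxes toward
# the equilibrium response F / lambda = 3.7 / 1.2, about 3.1 deg C.
temps = run_ebm([3.7] * 200)
print(round(temps[-1], 2))  # 3.08
```

Fitting λ (and the ocean term) to observed temperatures and forcings is exactly how the energy-balance studies discussed below derive their sensitivity estimates, which is why they respond to the recent flat decade in a way the GCMs do not.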
The IPCC’s estimates of climate sensitivity are based partly on GCMs. Because these reflect scientists’ understanding of how the climate works, and that understanding has not changed much, the models have not changed either and do not reflect the recent hiatus in rising temperatures. In contrast, the Norwegian study was based on an energy-balance model. So were earlier influential ones by Reto Knutti of the Institute for Atmospheric and Climate Science in Zurich; by Piers Forster of the University of Leeds and Jonathan Gregory of the University of Reading; by Natalia Andronova and Michael Schlesinger, both of the University of Illinois; and by Magne Aldrin of the Norwegian Computing Centre (who is also a co-author of the new Norwegian study). All these found lower climate sensitivities. The paper by Drs Forster and Gregory found a central estimate of 1.6°C for equilibrium sensitivity, with a 95% likelihood of a 1.0-4.1°C range. That by Dr Aldrin and others found a 90% likelihood of a 1.2-3.5°C range.
It might seem obvious that energy-balance models are better: do they not fit what is actually happening? Yes, but that is not the whole story. Myles Allen of Oxford University points out that energy-balance models are better at representing simple and direct climate feedback mechanisms than indirect and dynamic ones. Most greenhouse gases are straightforward: they warm the climate. The direct impact of volcanoes is also straightforward: they cool it by reflecting sunlight back. But volcanoes also change circulation patterns in the atmosphere, which can then warm the climate indirectly, partially offsetting the direct cooling. Simple energy-balance models cannot capture this indirect feedback. So they may exaggerate volcanic cooling.
This means that if, for some reason, there were factors that temporarily muffled the impact of greenhouse-gas emissions on global temperatures, the simple energy-balance models might not pick them up. They will be too responsive to passing slowdowns. In short, the different sorts of climate model measure somewhat different things.
Clouds of uncertainty
This also means the case for saying the climate is less sensitive to CO₂ emissions than previously believed cannot rest on models alone. There must be other explanations—and, as it happens, there are: individual climatic influences and feedback loops that amplify (and sometimes moderate) climate change.
Begin with aerosols, such as those from sulphates. These stop the atmosphere from warming by reflecting sunlight. Some heat it, too. But on balance aerosols offset the warming impact of carbon dioxide and other greenhouse gases. Most climate models reckon that aerosols cool the atmosphere by about 0.3-0.5°C. If that underestimated aerosols’ effects, perhaps it might explain the lack of recent warming.
Yet it does not. In fact, it may actually be an overestimate. Over the past few years, measurements of aerosols have improved enormously. Detailed data from satellites and balloons suggest their cooling effect is lower (and their warming greater, where that occurs). The leaked assessment from the IPCC (which is still subject to review and revision) suggested that aerosols’ estimated radiative “forcing”—their warming or cooling effect—had changed from minus 1.2 watts per square metre of the Earth’s surface in the 2007 assessment to minus 0.7W/m² now: ie, less cooling.
One of the commonest and most important aerosols is soot (also known as black carbon). This warms the atmosphere because it absorbs sunlight, as black things do. The most detailed study of soot was published in January and also found more net warming than had previously been thought. It reckoned black carbon had a direct warming effect of around 1.1W/m². Though indirect effects offset some of this, the effect is still greater than an earlier estimate by the United Nations Environment Programme of 0.3-0.6W/m².
All this makes the recent period of flat temperatures even more puzzling. If aerosols are not cooling the Earth as much as was thought, then global warming ought to be gathering pace. But it is not. Something must be reining it back. One candidate is lower climate sensitivity.
A related possibility is that general-circulation climate models may be overestimating the impact of clouds (which are themselves influenced by aerosols). In all such models, clouds amplify global warming, sometimes by a lot. But as the leaked IPCC assessment says, “the cloud feedback remains the most uncertain radiative feedback in climate models.” It is even possible that some clouds may dampen, not amplify, global warming—which may also help explain the hiatus in rising temperatures. If clouds have less of an effect, climate sensitivity would be lower.
So the explanation may lie in the air—but then again it may not. Perhaps it lies in the oceans. But here, too, facts get in the way. Over the past decade the long-term rise in surface seawater temperatures seems to have stalled (see chart 2), which suggests that the oceans are not absorbing as much heat from the atmosphere.
As with aerosols, this conclusion is based on better data from new measuring devices. But it applies only to the upper 700 metres of the sea. What is going on below that—particularly at depths of 2km or more—is obscure. A study in Geophysical Research Letters by Kevin Trenberth of America’s National Center for Atmospheric Research and others found that 30% of the ocean warming in the past decade has occurred in the deep ocean (below 700 metres). The study says a substantial amount of global warming is going into the oceans, and the deep oceans are heating up in an unprecedented way. If so, that would also help explain the temperature hiatus.
Double-A minus
Lastly, there is some evidence that the natural (ie, non-man-made) variability of temperatures may be somewhat greater than the IPCC has thought. A recent paper by Ka-Kit Tung and Jiansong Zhou in the Proceedings of the National Academy of Sciences links temperature changes from 1750 to natural changes (such as sea temperatures in the Atlantic Ocean) and suggests that “the anthropogenic global-warming trends might have been overestimated by a factor of two in the second half of the 20th century.” It is possible, therefore, that both the rise in temperatures in the 1990s and the flattening in the 2000s have been caused in part by natural variability.
So what does all this amount to? The scientists are cautious about interpreting their findings. As Dr Knutti puts it, “the bottom line is that there are several lines of evidence, where the observed trends are pushing down, whereas the models are pushing up, so my personal view is that the overall assessment hasn’t changed much.”
But given the hiatus in warming and all the new evidence, a small reduction in estimates of climate sensitivity would seem to be justified: a downwards nudge on various best estimates from 3°C to 2.5°C, perhaps; a lower ceiling (around 4.5°C), certainly. If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch. But it would not yet be downgraded.
Equilibrium climate sensitivity is a benchmark in climate science. But it is a very specific measure. It attempts to describe what would happen to the climate once all the feedback mechanisms have worked through; equilibrium in this sense takes centuries—too long for most policymakers. As Gerard Roe of the University of Washington argues, even if climate sensitivity were as high as the IPCC suggests, its effects would be minuscule under any plausible discount rate because it operates over such long periods. So it is one thing to ask how climate sensitivity might be changing; a different question is to ask what the policy consequences might be.
For that, a more useful measure is the transient climate response (TCR), the temperature you reach after doubling CO₂ gradually over 70 years. Unlike the equilibrium response, the transient one can be observed directly; there is much less controversy about it. Most estimates put the TCR at about 1.5°C, with a range of 1-2°C. Isaac Held of America’s National Oceanic and Atmospheric Administration recently calculated his “personal best estimate” for the TCR: 1.4°C, reflecting the new estimates for aerosols and natural variability.
That sounds reassuring: the TCR is below estimates for equilibrium climate sensitivity. But the TCR captures only some of the warming that those 70 years of emissions would eventually generate because carbon dioxide stays in the atmosphere for much longer.
As a rule of thumb, global temperatures rise by about 1.5°C for each trillion tonnes of carbon put into the atmosphere. The world has pumped out half a trillion tonnes of carbon since 1750, and temperatures have risen by 0.8°C. At current rates, the next half-trillion tonnes will be emitted by 2045; the one after that before 2080.
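The rule of thumb above is simple enough to check with one line of arithmetic. A quick sketch (my illustration; the 1.5°C-per-trillion-tonnes figure is the article's, everything else follows from it):

```python
# Back-of-the-envelope check of the cumulative-carbon rule of thumb.
WARMING_PER_TRILLION_TONNES_C = 1.5  # deg C per 1e12 tonnes of carbon (article)

def warming_from_cumulative_carbon(tonnes_carbon):
    """Implied warming (deg C) for a given cumulative carbon emission."""
    return WARMING_PER_TRILLION_TONNES_C * tonnes_carbon / 1e12

# Half a trillion tonnes emitted since 1750:
print(warming_from_cumulative_carbon(0.5e12))  # 0.75 (observed rise: ~0.8 C)
# The second half-trillion, projected by 2045, doubles the cumulative total:
print(warming_from_cumulative_carbon(1.0e12))  # 1.5
```

The rule's implied 0.75°C for the first half-trillion tonnes sits close to the observed 0.8°C rise, which is why it is a useful shorthand despite the uncertainty in sensitivity.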
Since CO₂ accumulates in the atmosphere, this could increase temperatures compared with pre-industrial levels by around 2°C even with a lower sensitivity and perhaps nearer to 4°C at the top end of the estimates. Despite all the work on sensitivity, no one really knows how the climate would react if temperatures rose by as much as 4°C. Hardly reassuring.


