Honesty About Dangerous Climate Change

Just in case you’re in a good mood, we offer you this exploration of a most difficult subject — What is the science actually telling us, and how should we pass it on? It doesn’t directly address “the equity issue,” but it helps, we think, to lay the foundation upon which real climate equity will have to be built…

I. The 2C line and its Implications

The last several years have seen a substantial quickening in the discussion of “dangerous climate change.” Unfortunately, as it has progressed, the science has pulled us ever further away from any sort of easy confidence. Sure, we have the technology we need to begin decarbonizing the economy, but first we have to break the fossil cartel’s political lock. And we don’t have forever. Indeed, we now know that the situation really is quite dire.

Once, even climate radicals, faced on one side by denialists and on the other by liberal environmentalist calls for targets of, say, 550 ppm CO2, could appeal to 450 ppm CO2 as an appropriate and reasonably safe target. It wasn’t going to be easy, not compared to business as usual, but it was clearly possible. And hopefully it was safe. Even as a precautionary target, it was good enough, or seemed to be.

No more. Today, large numbers of articles, reports and recommendations on the subject of dangerous climate change have drawn a “2 degree line” (no more than a 2C increase in global mean surface temperature above preindustrial levels), and argued that it defines the limit that really matters. And this time, the line is not being arbitrarily drawn. This time, a growing mass of high-quality science indicates that, beyond 2C, dangerous climate change could well become a full-blown ecological and civilizational crisis.

One milestone in this literature comes from the Climate Action Network (the largest international coalition of NGOs working the climate issue). CAN called, back at COP-8 in 2002, for anthropogenic warming to be held below 2C and then rapidly drawn down. And CAN’s position paper, or the slightly condensed version included in an excellent briefing packet called Building on Kyoto, is still well worth your time. If you want to know the dangers, from the suffering already being visited upon the poor and the badly born, to the probable, terrible, incremental impacts of continued warming – threats of species and ecosystem loss, risks to water supplies and food production, increases in droughts and floods – this is a good first place to go. As for the greater risks – up to and including potentially catastrophic possibilities like a substantial change in the thermohaline circulation, a mass die-off of the Amazon, or the melting of the Greenland or Antarctic ice sheets – you’ll find them here as well.

Indeed, we have little to add to CAN’s analysis of dangerous climate change, which is itself based on the reports of the IPCC and the broader scientific literature on climate impacts. Save for one thing: We hasten to add that, read carefully, the scientific literature implies that any further climate change is dangerous. And certainly, despite all, this is the essential bottom line. Every human life lost is an irreversible harm, as is the extinction of a culture or a species. It’s a primitive truth, but even today, with needless death all around us, it bears repeating.

As does this one: we’re already committed to a significant level of harm; what we’re discussing now is how much more we’re willing to tolerate. Moreover, and significantly, we’re doing so in a situation that is transparently, patently, unjust. Many of those who are most likely to be harmed – or destroyed – did little or nothing to bring us to this point. Nor are they fairly represented in the debates and the decisions. Hell, the poor might as well be polar bears, for all the voice they have in the halls of power.

Still though, what choice do we have but to soldier on? Enter the discussion about where to draw the line, and how, and you enter the lands of “realism.” We’d all vote to stop climate change immediately, if we only believed that doing so would be so cheap that no country or bloc of countries could effectively object. But we do not so believe. Thus we’re forced to start trading away lives and species in order to advocate a “reasonable” definition of “dangerous.”

That’s the game.

And we play it because we have no choice. Because we cannot afford to be purists. Because, come what may, we have to get the ball rolling. Because the emissions pathways implied by even a compromise target like 2C are (and are recognized to be) far beyond the willingness to pay of today’s most powerful global actors, whether they be identified as individuals, countries, or corporations.

Here, then, is the awful truth: the worse the situation, and the faster we have to act, the more it’s going to cost. This, really, is the problem, and these days it’s not an altogether comfortable one to be aggressively pointing out. So it’s no surprise that, as we will show, the advocates of precautionary temperature targets strain to soft-pedal their messages, typically by linking 2C of warming to CO2 concentration targets that can be straightforwardly shown to actually imply a larger, and sometimes much larger, probable warming.

Many of these people are our friends, so we want to be very clear here. Climate activists soft-pedal the truth because they think it will help, and perhaps they are even right. Who are we to know? Nevertheless, we also believe that the waffling is becoming dangerous, that it threatens, if continued, to critically undermine the coherence of our emerging understanding. That it delays difficult, but necessary, conclusions.

We believe, too, that this risk isn’t worth taking. The science is coming out now, and we might as well face it quickly. If the situation is worse than we had hoped, and the implied costs of transition greater, well then, so be it. Costs are political matters, and they define a debate that we’re going to have to have, seriously, and as quickly as possible.

The next step, in any case, is to be clear about what the science is telling us. And here we should try to be old fashioned, to return to the time before we knew that science was irreducibly political, to try, if only briefly, to keep politics to the side. For one thing, the effort is a bracing one. It forces us to think, precisely, about what we’re willing to accept, and for whom. And it obligates the scientists, in particular, to leave the framing to others, and to tell it like it really is.

At least that’s how we see it. Others disagree, and the core of the disagreement is the strategic sense – immensely widespread these days – that just now it’s unwise to point out how small the remaining carbon budget really is. Here too we understand the logic, and here too we beg to disagree. Because while the carbon budget is small, the scope for decarbonization is large – enough so that we actually still have reasonably fair and attractive paths open to us.

The situation, in other words, is not hopeless, though we’re going to have to think pretty clearly to get out of it. We should start by being honest, first of all with ourselves.

II. Thinking probabilistically

In what follows, we provide a brief sketch of the science needed to correlate emissions stabilization targets with the equilibrium temperatures that they most probably imply. As climate policy often does, this requires that we make some fairly technical arguments, and condemns us to write for those already familiar with the key concepts. We’ve tried to manage this situation by hiding all of the (fairly extensive) technical background material behind hyperlink citations. If you want to minimize your contact with the technical argument, just ignore them. Also, if you’re really prepared to trust us, even when we say things like “The equilibrium temperature resulting from a greenhouse-gas concentration of 550 ppm CO2-equivalent has only a 10-25% chance of being under 2C,” you can just skip ahead a bit to The Scary Results, below.

Here goes…

The general relationship between greenhouse gas concentrations and temperature change is well understood. To a first order, equilibrium temperature change is a simple function of net radiative forcing and climate sensitivity. Climate sensitivity is defined as the increase in global mean surface temperature at equilibrium resulting from an increase in forcing equivalent to a doubling of CO2. Notably, the radiative forcing of a given CO2 concentration is known relatively precisely. (See CO2 Forcing.) Much less certain are the radiative forcings (both positive and negative) of non-CO2 gases and aerosols (as well as other effects such as the albedo effects of soot on snow and ice), both in the present and the future. (See Non-CO2 Forcings.)

The climate sensitivity itself is also very uncertain, one might say notoriously so; the IPCC’s Third Assessment Report (TAR), published in 2001, says only that it probably lies between 1.5C and 4.5C, and pointedly declines to estimate the probability density function (hereafter PDF) of its possible values. Also, the historic “best estimate” of 2.5C (from the earlier Second Assessment Report) has been dropped as no longer a justifiable consensus. And since the TAR, studies have reported a significant probability that the climate sensitivity exceeds 4.5C. (See Climate Sensitivity PDFs.)

In any case, it is quite straightforward, given a climate sensitivity PDF, to calculate the probable equilibrium temperature response associated with a given greenhouse-gas concentration. If the concentration target is specified in CO2-equivalent terms (in ppm), or in terms of net radiative forcing (in watts per square meter), the calculation is particularly simple. (See Probability Calculations.) If you want to do a similar calculation for a stabilization level of CO2 only, you have to make an additional assumption about the forcings from non-CO2 greenhouse gases, aerosols and other factors. Reasonable guesses for these forcings in the coming decades range from an optimistic 50 ppm CO2-equivalent to 100 ppm CO2-equivalent or more.

The math here isn’t quite so simple, because of the non-linearity of CO2 forcing; the same amount of forcing from non-CO2 gases (measured in Wm-2) adds a different value of CO2-equivalent (in ppm), depending on whether it is added to (say) 450 ppm or 550 ppm of CO2. (See CO2 Equivalence.) However, it’s still reasonable and informative to estimate the temperature consequences of CO2-only stabilization by adding, say, 50 or 100 ppm of CO2-equivalent. So, for example, a 450 ppm concentration of just CO2 can be modeled as 500 or 550 ppm of CO2-equivalent, and a 550 ppm concentration of CO2 can be modeled as 600 or 650 ppm of CO2-equivalent.
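To make the link between concentrations, forcings and probabilities concrete, here is a minimal sketch in Python (our own illustration, assuming SciPy is available; the log-normal sensitivity PDF, with median 2.6C and a 5-95% range of roughly 1.5-4.5C, is an approximate reconstruction of one of the published distributions discussed in the technical notes, and F = 5.35 ln(C/278) is the standard simplified approximation for CO2 forcing):

    import math
    from scipy import stats

    F2X = 5.35 * math.log(2.0)  # forcing of a doubling of CO2, roughly 3.71 Wm-2

    def forcing_from_ppm(ppm_co2_eq, preindustrial=278.0):
        # Standard simplified approximation: F = 5.35 * ln(C / C0), in Wm-2.
        return 5.35 * math.log(ppm_co2_eq / preindustrial)

    # Illustrative sensitivity PDF: log-normal, median 2.6C, 5-95% range ~1.5-4.5C
    # (after Wigley and Raper 2001; see the technical notes).
    sensitivity = stats.lognorm(s=0.33, scale=2.6)

    def prob_below(temp_limit, ppm_co2_eq):
        # T_eq = (F / F2X) * S, so T_eq < limit exactly when S < limit * F2X / F.
        forcing = forcing_from_ppm(ppm_co2_eq)
        return sensitivity.cdf(temp_limit * F2X / forcing)

    # 450 ppm of CO2 plus 50-100 ppm CO2-equivalent of non-CO2 forcing:
    for total_ppm in (500, 550):
        print(total_ppm, "ppm CO2-eq:", round(prob_below(2.0, total_ppm), 2),
              "chance of staying under 2C")

With this particular PDF, the sketch gives roughly a 39% chance of staying under 2C at 500 ppm CO2-equivalent and a 23% chance at 550 ppm; other published PDFs shift these numbers, which is exactly why the results below are quoted as ranges.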

Did you get that?

The Scary Results

Here, then, in a few simple probabilities, are some reasons why an honest appraisal of stabilization targets is urgently necessary.

As we suggested, a 450 ppm CO2 scenario is reasonably comparable to a 500 or 550 ppm CO2-equivalent scenario.

  • At the low (optimistic) end, where 450 ppm CO2 means 500 ppm CO2-equivalent, there’s still only a roughly 20-40% chance that the equilibrium temperature will be below 2C. And there’s a 15-30% chance that it will actually exceed 3C. (See Calculation Results.)
  • If, however, the non-CO2 forcings turn out to add up to about 100 ppm CO2-equivalent, and this is a distinct possibility, then 450 ppm CO2 means 550 ppm CO2-equivalent. In this case there is only a 10-25% chance that the temperature will be below 2C. There’s a 20-35% chance it will exceed 3C, and a 5-15% chance that it will exceed 4C.
  • Similar calculations can be done for a 550 ppm CO2 stabilization level, by estimating the equilibrium temperature associated with 600 ppm CO2-equivalent (about a 5-15% chance of staying under 2C and a 15-30% chance of exceeding 4C) or 650 ppm CO2-equivalent (about a 3-10% chance of staying under 2C and a 25-40% chance of exceeding 4C).
  • The forcing that gives you (say) a 90% chance of staying below 2C depends on the climate sensitivity PDF or PDFs used. For most published PDFs, which have a 90th percentile in the range 3.5-5C, this means the maximum forcing is roughly half a doubling. This turns out to be about 400 ppm CO2-equivalent, which is, alas, almost equal to today’s actual concentration of CO2 alone (about 380 ppm).
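The arithmetic behind that last bullet is easy to reproduce. Here, as a quick check of our own in Python (again using the standard logarithmic approximation and a preindustrial concentration of 278 ppm), is the maximum CO2-equivalent concentration for a 90% chance of staying under 2C, for several assumed 90th-percentile sensitivities:

    import math

    F2X = 5.35 * math.log(2.0)  # forcing for a doubling of CO2, ~3.71 Wm-2

    # If the 90th percentile of the sensitivity PDF is S90, a 90% chance of
    # staying under 2C means forcing must be held to (2 / S90) of a doubling.
    for s90 in (3.5, 4.0, 5.0):
        fraction = 2.0 / s90
        ppm = 278.0 * math.exp(fraction * F2X / 5.35)  # equals 278 * 2**fraction
        print(f"90th percentile {s90}C -> about {ppm:.0f} ppm CO2-equivalent")

The answers run from roughly 370 to 410 ppm CO2-equivalent, which is where the “about 400 ppm” figure comes from.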

The complexity of non-CO2 forcings is a dissertation-length topic, but for the moment two points should suffice. First, in addition to CO2, there’s already about 100 ppm CO2-equivalent of other “well-mixed greenhouse gases” (primarily CH4, N2O, and CFCs) in the atmosphere. Second, there is, today, a significant “masking” from the negative forcing of aerosols and other “cooling pollutants,” much of which will be reduced rapidly as fossil emissions are curtailed and traditional air pollution is brought under control. For much more on this, see Non-CO2 Forcings.

Put simply, all this means that, even if greenhouse-gas concentrations stopped rising today, we might still already be committed to a temperature increase greater than 2C.

III. Ambiguity and its Discontents

In light of these numbers, it’s instructive to look at a few recent articles and speeches in which inconsistent temperature and concentration targets are simultaneously proposed.

Bob Watson

An important example comes from a recent policy opinion in Science by Robert Watson, the highly-respected former chair of the IPCC. Watson writes:

Governments should then consider setting a long-term target based either on a greenhouse gas stabilization level (between 450 and 550 ppm) or on limits for both the absolute magnitude of global temperature change (less than 2 to 3C) and the rate of temperature change (less than 0.2C per decade).

(Watson, R. T. 2003. “Climate change: The political situation.” Science 302: 1925-1926.)

First, note the ambiguity with regard to which gas or gases are covered by Watson’s proposed stabilization targets, and whether the temperature increase is relative to current or pre-industrial temperatures. From context, it’s most likely that he means CO2 only, and temperature change relative to pre-industrial. In either case, he omits any discussion of the likelihood that his stabilization targets will meet his temperature targets, which, as we showed above, is fairly low.

John Browne

Another important contribution to the “dangerous climate change” discussion has been made recently by John Browne, Chairman and CEO of the oil giant BP. Browne’s and BP’s recent public endorsements of a relatively stringent mitigation target are of enormous political significance. But his advice is even more fraught with ambiguities than Watson’s, and goes beyond them to actual contradictions. In a speech to the Council on Foreign Relations in New York on June 24, 2004, Browne said:

There is a very strong case for precautionary action and I believe the aim of that action should be to limit any increase in the world’s temperature to around 2 degrees Celsius. That translates into a stabilization of greenhouse gases in the atmosphere at around 500 to 550 ppm sometime early next century.

This statement is very ambiguous, first of all as to whether Browne is advocating a target of 2C above present or preindustrial temperatures, and also as to whether his proposed 500-550 ppm stabilization target is CO2 or CO2-equivalent.

In contrast, in a published article in the July/August issue of Foreign Affairs, Browne writes:

A sober strategy would ensure that any increase in the world’s temperature is limited to between 2 or 3 degrees Celsius above the current level in the long run. Focused on that goal, a growing number of governments and experts have concluded that policy should aim to stabilize concentrations of carbon dioxide in the atmosphere in the range from 500 to 550 ppm over the next century, which is less than twice the pre-industrial level.

In yet a third statement, this one at a Pew Climate Center conference in July, Chris Mottershead, a “Distinguished Advisor” to John Browne, said:

We believe that anthropogenic human-induced climate change has to be kept below around 2 degrees, that the consequences of changes above 2 degrees are so dreadful if they were to occur – and it may still be only a maybe occur – that we need to avoid that. If you choose to keep your temperature below 2 degrees then you have to stabilize atmospheric concentrations at somewhere between 500 and 550 parts per million.

Here the default reading would seem to be that the goal is 2C total anthropogenic climate change, not 2C above current. And although the stabilization target is again ambiguous, Mottershead told an audience member after his talk that he was referring to all greenhouse gases, not just CO2. This contradicts Browne’s Foreign Affairs article, but is slightly more internally consistent, since stabilization of CO2 proper at between 500 ppm and 550 ppm, with any significant non-CO2 forcings, has only a small chance of keeping temperature change below 2C.

James Hansen

A third important example can be found in the work of James Hansen, Director of NASA’s Goddard Institute for Space Studies and certainly one of the most important climate scientists on the planet. Based on the need to protect the Greenland Ice Sheet, Hansen argues for a temperature target of no more than 1C above present temperatures. He frames his GHG stabilization recommendation in terms of additional radiative forcing rather than a CO2 concentration target, but this is no obstacle to probabilistic analysis. His recommendation to hold net radiative forcing to no more than an additional 1 watt per square meter (equivalent to adding about 80 ppm of CO2 to the current level) would leave only a roughly 30-50% chance of staying below his stated temperature goal. (See Hansen Calculations.)
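The 80 ppm equivalence is easy to verify with the standard logarithmic forcing approximation; a rough check of our own in Python, taking today’s CO2 concentration as about 380 ppm:

    import math

    current_ppm = 380.0
    # Concentration at which CO2 forcing is 1 Wm-2 above its current value,
    # using F = 5.35 * ln(C / C0):
    new_ppm = current_ppm * math.exp(1.0 / 5.35)
    print(round(new_ppm - current_ppm))  # ~78 ppm, i.e. roughly the 80 ppm cited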

Now, there’s nothing inherently wrong with choosing a policy which only meets your target under optimistic assumptions; if the costs of a more stringent policy are less acceptable than the consequences of missing your target, it’s quite a defensible strategy. The problem is that these authors, all of whom (particularly Watson and Hansen) should know the actual probabilities they’re dealing with, do not lay them out in the course of their arguments. Any of them, for example, could have said, “we ought to be seriously considering stabilization targets of 400 ppm CO2-equivalent if we want to have a high probability of staying below 2C and a very low probability of exceeding 3C.” As we showed above, such numbers are fully supported by transparent scientific calculations based on reasonable assumptions about the values of the most uncertain variables. Instead of such transparent argumentation, however, they suggest stabilization targets of at least 450 ppm CO2-equivalent, and as high as 550 ppm CO2, without even nodding to the low probability that such concentrations would keep warming within the temperature limits they themselves advocate.

We had reasons for picking these examples, but they were hardly the only ones we had to choose from. For example, the Climate Action Network, in the paper we cited at the start of this article, has this to say about precautionary emissions pathways:

Nevertheless a plausible range of parameters indicates that CO2 concentration would have to peak no higher than 450 ppmv and probably somewhat lower. As a consequence of the need to reduce the warming, arrest the thermal sea level rise and minimize the risk of ice sheet decay or collapse cited above, the CO2 concentration would then have to be reduced.

Only in a footnote are we told that, with about 100 ppm of CO2-equivalent in non-CO2 greenhouse gases, a 450 ppm peak CO2 path, “mid-range” climate sensitivities and a concentration level that drops after it peaks, temperatures “would almost certainly approach and could exceed 2C.”

Also interesting is the recent report of the South-North Dialogue on Equity in the Greenhouse, organized by the Wuppertal Institute in Germany and The Energy Research Centre of the University of Cape Town, South Africa. This panel of academic and policy experts strongly endorsed the 2C target, and, in doing so, took care to include an outstanding graphic from Caldeira et al. (2003, “Climate sensitivity uncertainty and the need for energy without CO2 emission.” Science 299: 2052-2054) that lays out four possible 2C emissions paths, consistent with climate sensitivities of 1.5, 2.0, 3.0 and 4.5C.

Here, FYI, is that figure:

But even after showing this amazing figure, one that succinctly displays the reality of our situation, the South-North group then goes on to pick a 450 ppm CO2 path, which they describe as being consistent with 2C “providing the climate sensitivity turns out to be at the low end of its range of uncertainty.” Which, alas, it is not very likely to be.

Again, the caveats are there if you read the text carefully, look at all the figures carefully, and read the footnotes. But the target pathway always seems to slip upwards, and never do you get an actual argument for what a precautionary path with a high probability of staying under 2C (even if it’s less than our 90% suggestion) might actually look like, or why we should take it.

The reasoning in all these cases seems to be two-fold. First, since the costs associated with even 450 ppm CO2 seem to be beyond the demonstrated willingness to pay of the global community, there’s no point in suggesting an even more “unrealistic” target. To seriously advocate (say) 350 ppm CO2 is to be outside – far, far outside – the mainstream policy discourse. Furthermore, within the realm of 450 to 550 ppm CO2, it’s at least possible to honestly imagine that the global emissions budget is large enough to be shared without any fatally unrealistic North to South redistribution of emissions rights. Given all this, why raise unnecessary alarms? Isn’t this enough? And isn’t the best strategy to just get the regulatory camel’s nose under the tent, set a price on carbon, and get the incentives in place for the necessary and inevitable technological revolution?

Perhaps so. But there’s a problem: time.

IV. Towards a New Realism

Time is not on our side, for several reasons. The first is atmospheric inertia – the more CO2 we emit before we seriously begin to reduce emissions, the harder it will be to meet any stringent target. The second is economic inertia – the more coal plants and SUVs we and the developing countries build in the next decade, the more expensive it will be to reduce emissions. The third is the wild-card of aerosol emissions – if and when (and possibly before) fossil fuel emissions (particularly from coal) are substantially reduced, there will be a corresponding reduction in sulfate emissions, which actually cool the planet. The full extent of the negative forcing currently “masking” the positive forcing from anthropogenic greenhouse gases is poorly known, but it may be that cooling pollutants are actually offsetting 150 ppm of CO2-equivalent warming, or even more! (See, again, Non-CO2 Forcings) This is a terrifying prospect, because aerosols have extremely short atmospheric lifetimes relative to CO2. When they’re cleaned up, and they will be, their negative forcings will rapidly disappear. And we may find that we’re in hotter water than we thought.

The point in all this is that the global warming problem is much more urgent than is generally admitted. We may wish that it wasn’t, and we may hope that our children don’t live to see the worst case crystallize around them, but if we’re to have anything more than a snowball’s chance of making it in under the two degree target, we definitely need a new realism about what it will require.

To wit: It will require a hedging strategy that actually keeps the two degree target in reach. Which means early action, and lots of it. Which in turn means the prospect of a global climate accord that is fair enough to motivate serious global action. Which means a fair sharing of the now-scarce global greenhouse sinks, and, as if that isn’t enough, real consideration of the right to protection from climate harm, more-than-token funding for adaptation, and actual compensation for the now-inevitable damages.

Taking Hedging Seriously

Even skeptics of a negotiated temperature or concentration target, such as Jonathan Pershing and Fernando Tudela (see their essay in Beyond Kyoto: Advancing the International Effort against Climate Change), suggest a hedging policy designed to keep low stabilization targets within reach, just in case the climate sensitivity comes in on the high end of its probable range. However, these discussions have rarely advanced beyond the argument that, if you’re not sure whether you want to hit 450 ppm CO2 or allow as much as 550 ppm, your “optimal” hedging strategy is to stay on a trajectory between the “optimal” paths for the upper and lower targets. And, unfortunately, given what we now know, such a strategy looks pretty inadequate. In our view, a serious hedging strategy must, almost by definition, take account of the most restrictive limits or targets, the ones we’re going to have to try to make it to if the climate sensitivity turns out to be high.

This is not hard to understand. If you might someday decide 400 ppm CO2 or even 350 ppm CO2 is the right stabilization level, you probably don’t want to go too far above 400 ppm CO2 before you have to make that decision. In short, if we’re not honest about just how low the stabilization targets associated with a temperature target like 2C actually are, then even if we’re hedging while we debate a target, we won’t hedge enough.

In any case, we now face a situation in which hedging strategies must be discussed in some detail. Post-Kyoto discussions are well underway in both academic and policy circles, and many of them proceed by specifying emissions targets through 2020. Similarly, governments and utility companies are planning investments in energy systems that will have lifetimes of thirty years or more. Given all this, we can’t afford not to be thinking, quite concretely, about what will happen if the climate sensitivity turns out to be as high as 4-4.5C – or even higher.

The debate that needs to take place is the one in which we decide what probability of disaster we’re prepared to accept. Unfortunately, we’re ill prepared for this kind of debate. Let’s face it: no one among us would willingly board a plane that had a 1 in 100 risk of crashing. How then can we treat risks of long-term climate catastrophe – say a 10% chance of a 5 to 10 meter rise in sea level – as acceptable? Or, if not acceptable, as inevitable? (See Catastrophic Risks.)

Equity and Development Rights

The real issue, in all this, is that hedging strategies designed to keep low temperature targets within reach require us to assume that our remaining carbon budget is very small. A precautionary carbon budget associated with even a 400 ppm CO2 stabilization target – which is by no means “safe” – is probably only about 400 gigatonnes of carbon, ±50 GtC, for the whole century. (See Carbon Cycle Uncertainty.)

This small budget inescapably means that not only emissions growth, but total emissions as well, must peak very soon and then decline, in the South as well as the North. This, in turn, means that substantial investments in low-emissions technology will soon be required, in both developing countries and the North, investments far in excess of those that would be considered “economically optimal” in the absence of severe emissions constraints.

If the developing countries must fund this additional investment themselves, their consumption and economic growth rates could be significantly reduced. Thus, as is now beginning to be recognized, it would be quite unfair to expect any but the richest developing countries to pay the full costs of the decarbonization investment that is needed, even within their borders, in the next few decades. Just as significantly, it would be unwise, for it is unlikely to happen.

The North became rich in a world without carbon limits, and few Southern diplomats are going to forget this anytime soon. Moreover, the rich world adds to its already large carbon debt every year that its emissions continue to exceed its fair share of the remaining atmospheric space. Given this, most developing countries may quite justly demand that much, if not all, of the cost of decarbonizing their economies be carried by the wealthier countries.

Not to put too fine a point on this, but the greenhouse crisis represents a real “limit to growth”, and the argument over how to divide the costs associated with that limit is in fact an argument over rights to benefit from the economic value of the global greenhouse commons. Eventually, if there’s to be real progress, developing countries must receive credible guarantees that greenhouse mitigation will not compromise their development. As is – assuming that “development” is properly conceived – indeed their right.

The knot here is a Gordian one, but it can be cut. The key, we think, is greenhouse development rights, conceived as equal rights to the benefits of emitting carbon into the atmospheric commons, calculated over time and on a per capita basis.

This does not mean equal cumulative per capita emissions. The issue here is benefits, and per capita developmental space. A ton of carbon emitted by a British industrialist in the early days of the industrial revolution is not equivalent to a ton of carbon emitted in today’s far more carbon-efficient economy. And it will be even less equivalent tomorrow, particularly if the decarbonization revolution really takes off.

The point of all this should be explicit. First of all, and most importantly, there is hope, even hope of greenhouse justice, and this for the simple reason that decarbonized development is actually possible. Second, and almost as crucially, the time is past when simple schemes like Contraction and Convergence could plausibly claim to be either fair or workable. It’s time now, past time really, to get more sophisticated about what greenhouse justice would actually mean.

Not that we can know in any absolute sense, for the future of justice is, like the future in general, obscure. That said, we actually know a great deal. We know that the “equity issue” is now, finally, in play. That a fair accord must take history, and national circumstances, and technological change all into account. That, as Contraction and Convergence helped to teach us, it must be strongly founded in per-capita rights. That, at the end of the day, it must yield both contraction and convergence, and a truly sustainable form of development.

Adaptation and Compensation

One other crucial conclusion follows from this analysis of stabilization targets: that we need a serious discussion of financial liability – to pay for adaptation to climate change that will not be avoided, and as compensation for climate damages that will actually occur.

Even if we keep the warming below 2C, there will be a range of serious climate harms. For just this reason, the Climate Action Network advocates an “adaptation track” as part of a framework for a multi-stage, global Post-Kyoto regime. It’s not going to be easy to fund. In fact, it will be extremely difficult, and measures of historical responsibility will likely be key to doing so. To be sure, the difficulty of precisely attributing climate damages to specific causes makes the attribution of responsibility a real challenge, but it’s not an impossible one. Some damages (such as those caused by rising sea levels) are clearly tied to anthropogenic global warming, and statistical measures of increasing impacts offer a reasonable basis for attributing other types of damages as well. As the damages rise, we’ll see plenty of ideas for navigating through this maze, though few of them are likely to be attractive to the rich. (See Liability.)

Realism would seem to imply that liability, the bulk of which must necessarily fall on the industrialized countries, will never be taken seriously in a world of sovereign nation states. Certainly the industrialized countries have successfully resisted any such discussion to date. But given the severity of the coming storm, it’s also pretty clear that continuing this kind of refusal will condemn us to a chaotic and bitter future, one in which the cooperation that adequate action depends upon can never actually materialize. The very fact that the damages are going to sharply rise, and that so many of them will fall on the developing world, virtually guarantees this.

It’s a tough issue, but it’s not going away.

V. Conclusion

This discussion does not address the question of how the needed “willingness to pay” will be won. This is, after all, an open question, and one that draws us from the dynamics of the climate system to the politics of honesty and justice, and to a very different set of problems.

Still, this is the central question, so we can hardly conclude without noting that, within the common frames of economic and political “rationality,” it’s almost impossible to take a genuinely precautionary approach to climate change. This is true for two reasons. First, within the myopia of conventional economic frames, it’s simply not “economically rational” for the current generation to pay to prevent harms that will occur far in the future, not, at least, if that future is being discounted at the typical rate of 3 to 5 percent a year. Second, it’s simply not “rational” for sovereign nation-states to pay to prevent climate damages in other countries. Individual politicians – “statesmen,” they would be called – may even want to do so, but they face almost insuperable obstacles, not the least of which is that politicians who expect to be reelected must work to maximize their own country’s economic receipts. (See Economic Rationality.)

There are, fortunately, other ways to approach this problem. The one we recommend asserts an ethical realism in which both equity and sustainability take precedence over the short-term maximization of national income, and this for the most pragmatic of reasons: Because, without such a realism, we have no real hope of building the international coalition needed to prevent a truly dangerous degree of climate change.

From this perspective, the imperative to prevent dangerous climate change is entirely unambiguous, and the range of cost estimates typically cited for low stabilization targets – from 0 to 5 percent of global economic activity – is only the cost of bringing our global economy back within the bounds of sustainability. (See Stabilization Costs). As for the need for the already wealthy countries to pay the vast majority of the bill, whatever it turns out to be, this appears as a straightforward consequence of the “ecological debt” accrued by those countries, a debt manifest in both their greater responsibility for the climate crisis and their greater capacity to do something about it.

The fact is that, notwithstanding the evident “victory” of economists in the old debate over the “limits to growth,” the climate crisis proves the ultimate inevitability of limits. The economists’ argument, simply put, was that the response of economic actors to price signals would ensure that resources did not actually run out. However, for reasons that are well known, even to economists, there is no effective price signal for the damage caused by climate change, and there will be none unless powerful countries choose to create one.

Neither future generations, nor poor countries today, are able to purchase climate protection from the polluters. As for us in the present generation, if we shrink from paying to prevent climate change, if we do so, in fact, before even knowing what the costs are, and before making serious efforts to minimize them (by, for example, eliminating fossil-fuel subsidies), then what do we do but make a mockery of our claims to seek sustainability? What do we do but make our own worst fears come true?

Of course we want all this to be inexpensive. The idea that we might actually have to spend 5 percent of our income solving the carbon problem is nearly unthinkable. But wishing doesn’t make it so, and it’s manifestly absurd, perhaps even suicidal, to allow our current “unwillingness to pay” to bound our thinking about precaution and sustainability. The fact is that, given all its many and predictable benefits, it would be no surprise to find that, all things considered, rapid decarbonization was actually cheap. In any case, we’ll soon see that we have no choice. The bills are coming due, and one way or another, we will pay them.

Two degrees is already a compromised target, one with which we’ve already negotiated away thousands of species and, probably, millions of lives. Still, we suspect – along with many others – that advocating a maximum 2C target may be the best strategic move available. But let’s be realistic. The arguments for 2C, and for the emissions reductions that are going to be necessary to keep the warming below 2C, take us far beyond the bounds of conventional climate policy discourse. In fact, they demand a rather brave new synthesis of scientific realism, ethical clarity, and political ambition. And it’s coming time now to admit it.

— September 16, 2004

Acknowledgements

The authors are grateful to Barbara Haya and Michael Mastrandrea, who provided helpful comments on the manuscript and technical notes. All remaining errors of fact, style, or opinion are our own.

Technical Notes and References

Dangerous Climate Change

In addition to the Climate Action Network paper we refer to, a substantial number of articles and reports addressing dangerous climate change have appeared in the last few years. Here are a few of the most interesting and important:

1) O’Neill, B. C. and M. Oppenheimer (2002). “Climate change – Dangerous climate impacts and the Kyoto protocol.” Science 296(5575): 1971-1972.

In this high-profile policy editorial, O’Neill and Oppenheimer recommend a limit of 1C beyond 1990 temperatures to protect coral reefs, 2C to protect the Greenland and West Antarctic ice sheets, and 3C to protect the thermohaline circulation.

2) Grassl, H., J. Kokott, et al. (2003). Climate Protection Strategies for the 21st Century: Kyoto and Beyond. Berlin: German Advisory Council on Global Change (WBGU). Also Hare, W. (2003). Assessment of Knowledge on Impacts of Climate Change – Contribution to the Specification of Art. 2 of the UNFCCC. Berlin: German Advisory Council on Global Change (WBGU), and Nakicenovic, N. and K. Riahi (2003). Model runs with MESSAGE in the context of the further development of the Kyoto-Protocol.

The German Government’s Advisory Council on Global Change (WBGU) recently issued this report reviewing the evidence of potential risks of climate damages. They conclude that there are very good reasons to keep the temperature increase below 2C, and urge the adoption of Contraction and Convergence as a global reductions framework. Note, though, that the supplemental report by Nakicenovic and Riahi is skeptical about the Contraction and Convergence approach. The supplemental report by Bill Hare gives greater detail on the climatic, ecological and health risks of increasing temperatures.

3) Hansen, J. (2004). “Defusing the global warming time bomb.” Scientific American 290(3): 68-77. A similar article is available online at http://pubs.giss.nasa.gov/docs/2003/2003_Hansen.pdf

Hansen, Director of NASA’s Goddard Institute for Space Studies, advocates a limit to further temperature increase of 1C as prudent with regard to preventing sea-level rise from the break-up of the Greenland Ice Sheet. We discuss Hansen’s calculations elsewhere in this article.

4) Parry, M., N. Arnell, et al. (2001). “Millions at risk: defining critical climate change threats and targets.” Global Environmental Change-Human and Policy Dimensions 11(3): 181-183.

Parry et al. report that additional temperature increases of one to two degrees C will likely put millions to tens or hundreds of millions of people at additional risk from water shortages, food insecurity, increases in vector borne diseases, and storm-related damages in this century. Their modeling provides a substantial component of the analysis used by Hare, the WBGU, CAN and others.

5) Mastrandrea, M. D., and S. H. Schneider. 2004. “Probabilistic integrated assessment of “dangerous” climate change.” Science 304: 571-575.

Mastrandrea and Schneider take a sophisticated approach to an uncertainty analysis of “dangerous climate change”. They use a version of the classic DICE model to show that accounting for uncertainty in climate damages or discount rates increases the optimal carbon tax, and can greatly reduce the risk of dangerous climate change.

6) Azar, C., and H. Rodhe. 1997. “Targets for stabilization of atmospheric CO2.” Science 276: 1818-1819.

In addition to the recent contributions mentioned above, we draw your attention to this under-appreciated contribution from Christian Azar and Henning Rodhe, whose 1997 policy editorial in Science made many of the same arguments we make here concerning the low stabilization targets implied by low temperature targets. They also called for global warming not to exceed 2C, and noted that, given the uncertainty in climate sensitivity, stabilization targets of 350 to 400 ppm CO2 were appropriately precautionary.

Back to text

Radiative Forcing

Radiative forcing, measured in Watts per square meter (Wm-2), is the change in the Earth’s energy balance due to anthropogenic (or in some cases, like volcanoes, non-anthropogenic) changes in atmospheric composition or land surface cover. The temperature of the Earth is maintained near equilibrium by a combination of the reflection of short-wave radiation by the Earth’s atmosphere and surface and the long-wave re-radiation of absorbed energy, which together balance incoming solar radiation (about 342 Wm-2). Greenhouse gases (GHGs) like CO2 trap this long-wave radiation near the Earth’s surface. The preindustrial concentrations of greenhouse gases maintain the temperature at the Earth’s surface 30-35C higher than it would be in their absence. Anthropogenic increases in GHGs trap more of this long-wave radiation near the surface, increasing average surface temperature. A doubling of CO2 amounts to an increase in radiative forcing of about 3.7 Wm-2. Changes in albedo (reflectivity of the Earth) from aerosols or land cover change can augment or counteract increased forcing from anthropogenic GHGs; currently they are believed to add a significant negative forcing, masking part of the positive anthropogenic forcing. (See Non-CO2 Forcings and also the IPCC’s Third Assessment Report, Working Group I.)

Back to text

Climate Sensitivity

The climate sensitivity is defined as the equilibrium response of global mean surface temperature to a doubling of CO2 from the pre-industrial level (278 ppm). Straightforward physics suggests that such an increase in radiative forcing (about 3.7 Wm-2) should raise the earth’s surface temperature by about 1.2C; however, because the climate sensitivity is an estimate of the response of the whole global system, various feedbacks such as changes in atmospheric water content, cloud cover, and the extent of snow and ice must be taken into account. It is the uncertainty in these and other feedbacks that produces the range of estimates for the climate sensitivity found in various general circulation models (GCMs). Additionally, it is important to note that, due to the slow circulation and large heat content of the ocean, the equilibrium temperature will be approached asymptotically over a period of hundreds of years. (TAR WGI p. 93, also Appendix 6.1, p. 405)

For simplified, first-order calculations, the climate sensitivity can simply be considered to be the equilibrium response to an increase in average global radiative forcing equal to a doubling of CO2; however, in reality the expected response to the same average level of forcing could be quite different, depending on the mix of gases contributing to the total and their spatial distribution, as well as the rate at which they accumulate.

Back to text

CO2 Forcing

Because the effectiveness of CO2 at trapping outgoing longwave radiation is dependent on the amount of CO2 already in the atmosphere, the radiative forcing associated with a given increase in atmospheric CO2 does not increase linearly. The standard formulation used in the IPCC and elsewhere is a function of the log of CO2, such that the increase in radiative forcing is the same for each doubling of CO2 (i.e., 278 to 556, and 556 to 1112 ppm). This nonlinearity is shown graphically in the figure below, which shows the increase in radiative forcing for each additional 50 ppm of CO2. Note that the “bricks” of 50 ppm each are significantly thicker at the bottom than at the top.

Note also the line indicating forcing equivalent to a doubling of CO2 above preindustrial levels. This value, 3.7 Wm-2 in the IPCC’s Third Assessment Report, was revised downward from 4.0 Wm-2 in the Second Assessment Report, and is still considered uncertain to within about 10%. In the remainder of the calculations in this paper, this uncertainty is ignored.
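The numbers behind those “bricks” are easy to recompute from the standard simplified expression F = 5.35 ln(C/278); here is a quick sketch of our own in Python:

    import math

    def forcing(ppm, preindustrial=278.0):
        # CO2 radiative forcing in Wm-2, logarithmic (TAR-style) approximation.
        return 5.35 * math.log(ppm / preindustrial)

    # Incremental forcing added by each successive 50 ppm of CO2:
    for low in range(300, 600, 50):
        print(f"{low}-{low + 50} ppm: +{forcing(low + 50) - forcing(low):.2f} Wm-2")

The increment falls from about 0.8 Wm-2 for the step from 300 to 350 ppm to less than 0.5 Wm-2 for the step from 550 to 600 ppm.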

Back to text

Non-CO2 Forcings

Non-CO2 forcings include several types: other well-mixed GHGs such as CH4, N2O, and CFCs; aerosols (microscopic liquid or solid particles) and their various effects; spatially variable GHGs such as ground-level ozone; and albedo effects from changes in land cover or in the extent and reflectivity of snow and ice. These effects can be positive or negative, including offsetting positive and negative effects for the same aerosols. The uncertainty range of these effects, both individually and collectively, is quite large, as shown in the figure below, reproduced from the IPCC’s Third Assessment Report. The IPCC declined in the TAR to come up with a single uncertainty range for net forcings, but others have done so, including Hansen and Sato (2001), whose estimate of 1.6 ± 1.1 Wm-2 is derived by summing the estimates for the various individual components, and Knutti et al. (2002), who use a Monte Carlo analysis with a simple observationally-constrained climate model, and estimate net current forcings to be between 1.5 and 2.5 Wm-2 (5-95% confidence interval).

Figure 9 from the Technical Summary of TAR WGI. A full explanation is available at http://www.grida.no/climate/ipcc_tar/wg1/figts-9.htm.

The negative forcing from sulfates and other aerosols is one of the largest uncertainties in net forcing. In addition to the reflective or absorptive effects of the aerosols themselves, they have forcing effects from their influence on cloud properties, including both the size of cloud particles and their duration. These indirect effects are not yet well understood, and the net indirect forcing of aerosols is poorly constrained between 0 and -2 Wm-2 (see Anderson et al., 2003, as well as the TAR).

In addition, although they are non-anthropogenic, changes in solar irradiation and in mean volcanic aerosol levels are sometimes counted in the net forcing balance because they do in fact contribute to the overall change since pre-industrial times, and (if positive) must be compensated by a reduction in anthropogenic forcings to keep total forcing (and thus the rate and extent of temperature change) to a desired level. See Chapter 6 of TAR WGI.

Anderson, T. L., R. J. Charlson, S. E. Schwartz, R. Knutti, O. Boucher, H. Rodhe, and J. Heintzenberg. 2003. “Climate forcing by aerosols – a hazy picture.” Science 300: 1103-1104.

Hansen, J. E., and M. Sato. 2001. “Trends of measured climate forcing agents.” Proceedings of the National Academy of Sciences of the United States of America 98: 14778-14783.

Back to Thinking Probabilistically

Back to Scary Results

Back to Towards a New Realism

Climate Sensitivity PDFs

The probability distribution of the climate sensitivity – that is, the likelihood that it lies in any particular range – can be defined mathematically as a probability density function, or PDF. A PDF is the logical extension of a discrete probability distribution (which can be represented as a histogram) into a continuous function. The classic normal distribution, a bell-shaped curve defined by its mean and standard deviation, is a very commonly used PDF, and indicates that values close to the mean are more likely than outlying values; the uniform distribution, defined by a constant level of likelihood between a fixed upper and lower bound, is commonly used when there is no strong evidence that some values in a range are more likely than others.

As we noted in the text, the IPCC declined in the TAR to ascribe a shape to the probability distribution of the climate sensitivity or provide any quantifiable likelihood information, saying only that it probably lies between 1.5 and 4.5C. In previous assessment reports, the IPCC gave a “best guess” of 2.5C, but that was dropped from the TAR; indeed, the mean of the climate sensitivities of the GCMs reported in the TAR was 3.5C, with a range from 2.0 to 5.1C (TAR WG1, Table 9.4).

On the basis of this information alone, at least three general-form PDFs are plausible for the climate sensitivity:

1) a uniform distribution (an equal probability of every temperature between 1.5 and 4.5C, and zero probability outside that range);

2) A normal distribution with a mean of 3C and standard deviation such that some small fraction (e.g., 5% or 10%) of the distribution lies outside the 1.5-4.5C range (similar to Hansen 2004);

3) A log-normal distribution, with parameters such that, again, a small fixed percentage of the distribution lies outside the 1.5-4.5C range (this implies a median of about 2.6C). A log-normal distribution is similar to a normal distribution, except that the natural logarithm of the value in question (here, climate sensitivity) has a normal distribution. (After Wigley and Raper, 2001)

Figure: climate sensitivity PDFs based on the IPCC range. “Normal 1” has mean 3.0C and standard deviation 0.75; the Wigley and Raper distribution has median 2.6C and a 5-95% range of 1.5-4.5C.
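For concreteness, here is one way those three candidate distributions might be set up in code (a sketch of our own in Python, assuming SciPy; the parameters, especially the log-normal’s sigma of about 0.33, are our reconstructions from the ranges quoted above rather than the published authors’ own values):

    from scipy import stats

    # 1) Uniform between 1.5 and 4.5C:
    uniform_pdf = stats.uniform(loc=1.5, scale=3.0)

    # 2) Normal with mean 3.0C and standard deviation 0.75C, so that roughly
    #    5% of the distribution lies outside the 1.5-4.5C range:
    normal_pdf = stats.norm(loc=3.0, scale=0.75)

    # 3) Log-normal with median 2.6C and a 5-95% range of about 1.5-4.5C
    #    (after Wigley and Raper 2001); sigma chosen so that
    #    exp(ln(2.6) + 1.645 * sigma) = 4.5, i.e. sigma of about 0.33:
    lognormal_pdf = stats.lognorm(s=0.33, scale=2.6)

    for name, pdf in [("uniform", uniform_pdf), ("normal", normal_pdf),
                      ("log-normal", lognormal_pdf)]:
        print(name, "P(sensitivity > 4.5C) =", round(1 - pdf.cdf(4.5), 3))

Even among these three, note how differently the upper tail behaves; that tail is exactly what matters for precautionary targets.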

A variety of additional PDFs for the climate sensitivity have been published in the scientific literature. Some of these have “tails” with a large fraction (e.g., 10-25%) of the distribution lying above 4.5C (e.g., Andronova and Schlesinger 2001, or Forest et al., 2002). Although the highest values of these PDFs are very unlikely on the grounds of other (particularly paleoclimatic) evidence, they do suggest that the IPCC range does not describe all of the existing uncertainty.

Using PDFs in calculations requires making judgments about which ones to use and why. Indeed, the simple math that connects a temperature target to a level of forcing requires a unique PDF for climate sensitivity to produce a unique answer; and even producing a range with a median requires deciding how many and which PDFs to use. Thus the heterogeneity of the PDFs that exist for climate sensitivity poses a problem for policy-relevant recommendations.

To preview the calculations we make later in this paper, the problem can be demonstrated this way: If you were to use a recently published PDF from the Hadley Centre (Murphy et al., 2004), you would conclude that (say) an 80% chance of keeping the equilibrium temperature increase below 2C would require forcing to be kept below about 1.75 Wm-2, or about 390 ppm CO2-equivalent. However, if you were to use the lognormal PDF used by Wigley and Raper (2001), that same 80% probability of staying below 2C would imply stabilization at 2.15 Wm-2 (about 420 ppm CO2-equivalent); while for the lower of the PDFs published by Forest et al. (2002), that rises to 2.4 Wm-2 (about 440 ppm CO2-equivalent).
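To see where numbers like these come from, the calculation for the Wigley and Raper style log-normal can be sketched in a few lines (our own re-derivation, using the illustrative parameters given above; the published PDF differs in detail):

    import math
    from scipy import stats

    F2X = 5.35 * math.log(2.0)                      # ~3.71 Wm-2 per doubling of CO2
    sensitivity = stats.lognorm(s=0.33, scale=2.6)  # log-normal, median 2.6C

    # An 80% chance of staying below 2C limits forcing to
    # 2C / (80th percentile of the sensitivity) of a doubling:
    s80 = sensitivity.ppf(0.80)
    max_forcing = (2.0 / s80) * F2X
    max_ppm = 278.0 * math.exp(max_forcing / 5.35)
    print(round(max_forcing, 2), "Wm-2, about", round(max_ppm), "ppm CO2-equivalent")

This reproduces the roughly 2.15 Wm-2 (about 420 ppm CO2-equivalent) figure quoted above.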

What are we to make of this range? What good does it do a policy-maker to be told, “well, if you believe W, you should do X, but if you believe Y, you should do Z”? Certainly the policy-maker has no scientific grounds for preferring W to Y, so if X is cheaper for her or him, why not choose to believe W?

These questions point to the crux of the problem: any choice of what recommendations to draw from these models will have serious implications for the distribution of costs and risks. The ambiguity of evidence will make it easy for actors of all kinds to choose to emphasize scientific storylines that support their preferences and, to be blunt, their interests. Scientists will be challenged to explain why society should make very large investments (with significant redistributional implications) on the basis of mere probabilities about probabilities, or worse, preferences about probabilities.

The problem here shouldn’t be underestimated. Literally millions of lives and trillions of dollars are potentially at stake, on the basis of extremely esoteric scientific issues like “Bayesian priors.” The scientific community itself is only in the early stages of discussion of the handling of these kinds of cascading uncertainties. However, in support of the arguments in this essay, I would like to advance the following hypothesis: given what we know about the climate sensitivity at this point, a precautionary policy must accept that there is a very significant probability (i.e., 10 to 20%) that the climate sensitivity is above 4C, and thus that, given a 2C target, precautionary targets for the stabilization of CO2 itself must be at or below the current concentration of about 380 ppm. How far below will depend, among other things, on what level of forcing from non-CO2 gases we believe is achievable.

How CO2 concentrations can be reduced below current levels, and over what time frame, are the subject of a critical emerging area of research (low-emissions scenarios) and a different essay. Similarly, there is a tremendous need to explore in detail the tools available for managing non-CO2 greenhouse gases and aerosols, particularly because negative forcings from sulfate and other aerosols will quickly disappear as we reduce other GHGs.

NOTE: The first author is collaborating with Michael Mastrandrea and Malte Meinshausen on a spreadsheet-based tool that collects numerous published PDFs and presents them in a standardized and comparable format, as well as some simple tools for the types of calculations described here. For a current version, please contact pbaer@ecoequity.org.

Andronova, N. G., and M. E. Schlesinger. 2001. “Objective estimation of the probability density function for climate sensitivity.” Journal of Geophysical Research-Atmospheres 106: 22605-22611.

Forest, C. E., P. H. Stone, A. P. Sokolov, M. R. Allen, and M. D. Webster. 2002. “Quantifying uncertainties in climate system properties with the use of recent climate observations.” Science 295: 113-117.

Hansen, J. (2004). “Defusing the global warming time bomb.” Scientific American 290(3): 68-77. A similar article is available online at http://pubs.giss.nasa.gov/docs/2003/2003_Hansen.pdf

Murphy, J. M., D. M. H. Sexton, D. N. Barnett, G. S. Jones, M. J. Webb, and M. Collins. 2004. “Quantification of modelling uncertainties in a large ensemble of climate change simulations.” Nature 430: 768-772.

Wigley, T. M. L., and S. C. B. Raper. 2001. “Interpretation of high projections for global-mean warming.” Science 293: 451-454.

Back to text

Probability Calculations

One of the properties of a PDF is that, if there is only one stochastic (uncertain) variable in a mathematical function, there is an exact correspondence between percentile thresholds in the input and output distributions. To use our case as an example, if we specify the forcing in Wm-2 and treat the climate sensitivity as uncertain, the equilibrium temperature increase T_eq is defined as

T_eq = (F / 3.71) × T_2x

where F is the increase in radiative forcing in Wm-2, 3.71 is the forcing in Wm-2 for a doubling of CO2, and T_2x is the climate sensitivity, represented by a PDF. For any specified value of F, the median value of T_eq is precisely the value of the equation using the median value of the PDF for T_2x. Similarly, the 10th percentile value of the predicted temperature increase is calculated by using the 10th percentile value of the climate sensitivity.

Concretely, suppose you know that radiative forcing will be precisely equal to a doubling of CO2. Then, the likely distribution (PDF) of equilibrium temperature is precisely the same as the PDF for climate sensitivity. If the median value of the climate sensitivity PDF is 3.0C, then by definition you have a 50% chance that equilibrium temperature will be under 3.0C. Similarly for any other percentile of the climate sensitivity PDF; if the 80th percentile is 4.0C, then there is an 80% chance that equilibrium temperature will be under that level (and a 20% chance it will exceed it).

Now consider an increase of radiative forcing equal to half a doubling of CO2, or about 1.85 Wm-2. Using the standard assumption that the relationship between forcing and equilibrium temperature is, to a first order, linear, the PDF for equilibrium temperature looks like the PDF for climate sensitivity divided by two. That is, whereas the median value for a 3.7 Wm-2 increase was 3.0C, for an increase of 1.85 Wm-2 the median equilibrium temperature is 1.5C. Similarly for the 80th percentile: if it is 4C for a doubling, then for 1.85 Wm-2 of forcing there is an 80% chance of staying below 2C, and a 20% chance of exceeding that level.

Similarly, if you want to find the forcing level consistent with a given probability of staying below a temperature threshold, you can do a sort of “inverse” calculation. Start with a particular PDF for climate sensitivity, in which (say) the 90th percentile is 4.5C. If you’re interested in a 90% chance of staying below 3C, the ratio of the target forcing to a doubling of CO2 is the same as the ratio of the target temperature (3C) to the 90th percentile of the climate sensitivity (4.5C), or 2:3. Thus forcing must be held to 2/3 of a doubling, which is about 2.5 Wm-2 or about 450 ppm CO2-equivalent.
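As a concrete illustration, the forward and “inverse” calculations just described can be written down in a few lines. The percentile values below (a 3.0C median, a 4.0C 80th percentile and a 4.5C 90th percentile) are simply the examples used in the text; any other climate sensitivity PDF percentiles could be substituted.

```python
# Sketch of the percentile-scaling logic described above, using the
# illustrative numbers from the text (3.71 Wm-2 per doubling; example
# sensitivity percentiles of 3.0C, 4.0C and 4.5C).

F_2X = 3.71  # radiative forcing (Wm-2) for a doubling of CO2

def teq_percentile(forcing_wm2, sensitivity_percentile_c):
    """Equilibrium-temperature percentile for a given forcing, using the
    matching percentile of the climate-sensitivity PDF (valid because the
    relation is monotonic in the sensitivity)."""
    return (forcing_wm2 / F_2X) * sensitivity_percentile_c

def forcing_for_threshold(target_temp_c, sensitivity_percentile_c):
    """'Inverse' calculation: the forcing (Wm-2) at which the chance of
    staying below target_temp_c equals the percentile supplied."""
    return F_2X * target_temp_c / sensitivity_percentile_c

# Forward: half a doubling (1.85 Wm-2) of forcing
print(teq_percentile(1.85, 3.0))        # ~1.5C median equilibrium warming
print(teq_percentile(1.85, 4.0))        # ~2.0C at the 80th percentile

# Inverse: a 90% chance of staying below 3C if the 90th percentile is 4.5C
print(forcing_for_threshold(3.0, 4.5))  # ~2.5 Wm-2, i.e. 2/3 of a doubling
```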

For more complex equations with multiple stochastic parameters, it is typical to use Monte Carlo analysis, in which a random number generator is used to evaluate an equation hundreds or thousands of times. For each stochastic variable, on each “run” of the model (equation), a value is “picked” from the specified PDF, and an output value is calculated. The result is an output distribution sensitive to the shapes of the input PDFs. Some of the calculations reported later in this paper are based on Monte Carlo calculations in which both the climate sensitivity and the net non-CO2 forcings are treated as PDFs.

Primers on Monte Carlo analysis are available in any university library.
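To make the mechanics concrete, here is a minimal Monte Carlo sketch. The fixed CO2 forcing and both input distributions are placeholders chosen for illustration only, not the PDFs actually used in our calculations; the point is simply how draws from two input PDFs combine into an output PDF.

```python
# Minimal Monte Carlo sketch: two stochastic inputs (climate sensitivity and
# net non-CO2 forcing) combined into an output PDF for equilibrium warming.
# All numbers here are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
F_2X = 3.71      # Wm-2 for a doubling of CO2
F_CO2 = 2.0      # illustrative fixed forcing from CO2 alone (Wm-2)

sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=N)  # C per doubling
non_co2 = rng.normal(loc=0.5, scale=0.5, size=N)                  # Wm-2

t_eq = (F_CO2 + non_co2) / F_2X * sensitivity   # one output value per "run"

print("P(T_eq < 2C):", (t_eq < 2.0).mean())
print("P(T_eq > 3C):", (t_eq > 3.0).mean())
```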

Back to text

CO2 Equivalence

Because, to a first order, all different forcing agents have (for a given amount of forcing in Wm-2) an equivalent effect on climate, it is convenient to compare other gases to CO2 in terms of “ppm CO2-equivalent.” Thus an increase in forcing equivalent to a doubling of CO2 – that is, to about 550-560 ppm CO2-equivalent – will have roughly the same effect regardless of the mix of forcing agents (positive and negative) that lead to it. And thus it is consistent to say that, if we want to keep the equilibrium temperature below a given level, we must keep GHG concentrations below (say) 400 ppm CO2-equivalent, or 450 ppm CO2-equivalent, etc., without regard to what gases comprise the total.

However, when you are considering different forcing agents individually, it begins to matter that the forcing from a given amount of CO2 is not constant. Look at the figure below, in which identical amounts of non-CO2 forcings (values from Hansen and Sato 2001) are stacked in reverse order. On the left, where tropospheric ozone is added to a high level of other forcings, it is equivalent to more than 50 ppm of CO2; whereas on the right, where it’s added to a low level of other forcings, it amounts to only about 30 ppm of CO2. So a unit of CO2-equivalent doesn’t have a unique equivalent in radiative forcing. This problem could of course be solved by defining a unit of CO2-equivalent as, say, the amount of forcing added by one additional ppm of CO2 on top of the preindustrial concentration (about 0.02 Wm-2). But then a unit of CO2 would no longer be consistently equivalent to a unit of CO2-equivalent!

The solution to this problem is to refer to individual forcings in their “native” unit, Watts per square meter (Wm-2). While this is even less intuitive a measure than ppm, most of what is important can be handled by keeping in mind that a doubling of CO2-equivalent is about 4 Wm-2.
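To see the effect numerically, the sketch below uses the standard simplified expression F = 5.35 × ln(C/C0) Wm-2, which reproduces the roughly 3.7 Wm-2 per doubling used here; the 278 ppm preindustrial baseline and the 0.35 Wm-2 increment are our assumptions for illustration.

```python
# Why a fixed forcing corresponds to different numbers of "ppm CO2-equivalent"
# depending on the baseline it is added to. Uses the standard simplified
# expression F = 5.35 * ln(C / C0) Wm-2 (consistent with ~3.7 Wm-2 per
# doubling); the 278 ppm preindustrial baseline is an assumption.
import math

def forcing(ppm, baseline=278.0):
    """Radiative forcing (Wm-2) of a given CO2-equivalent concentration."""
    return 5.35 * math.log(ppm / baseline)

def ppm_equivalent(delta_forcing, starting_ppm):
    """How many ppm of CO2-equivalent a forcing increment amounts to when it
    is added on top of a given starting concentration."""
    return starting_ppm * (math.exp(delta_forcing / 5.35) - 1.0)

print(forcing(556))                 # ~3.7 Wm-2: a doubling of CO2
print(ppm_equivalent(0.35, 450.0))  # the same 0.35 Wm-2 is ~30 ppm here...
print(ppm_equivalent(0.35, 550.0))  # ...but ~37 ppm on a higher baseline
```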

Hansen, J. E., and M. Sato. 2001. “Trends of measured climate forcing agents.” Proceedings of the National Academy of Sciences of the United States of America 98: 14778-14783.

Back to text

Calculation Results

The probability ranges reported here are based on the three interpretations of the IPCC range (uniform, normal and lognormal) that are described and graphed under Climate Sensitivity PDFs. We’ve chosen to report results using only this very restricted set of PDFs, based on the IPCC range, to avoid the criticism that “you included this PDF but excluded that one.” Using all available PDFs extends the range considerably. For example, for 500 ppm CO2-equivalent, across all the PDFs mentioned in Climate Sensitivity PDFs, the probability of the equilibrium temperature being below 2C ranges from 0-58% (as against our 20-40%), while the probability of it being greater than 3C ranges from 11-62% (as against our 15-30%).
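As a rough check on the method, the 500 ppm figure can be approximately reproduced under the uniform interpretation alone. The constants assumed below (a 278 ppm baseline and the 5.35 × ln(C/C0) forcing expression) are ours, so the result is only indicative.

```python
# Rough reproduction of one number above, under the uniform interpretation of
# the IPCC 1.5-4.5C sensitivity range. The preindustrial baseline and the
# simplified forcing expression are assumptions, so the result is indicative.
import math

F_2X = 3.71
f_500 = 5.35 * math.log(500 / 278)   # forcing at 500 ppm CO2-equivalent (Wm-2)
s_threshold = 2.0 * F_2X / f_500     # sensitivity that yields exactly 2C

# P(sensitivity < threshold) for a uniform PDF on [1.5, 4.5]
p_below_2c = (s_threshold - 1.5) / (4.5 - 1.5)
print(round(p_below_2c, 2))          # ~0.29, within the 20-40% range reported
```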

As also discussed under Climate Sensitivity PDFs, there is no obviously correct way to use, or choose between, different PDFs. Others may use the same available PDFs to draw different conclusions. We believe, however, that our approach is very reasonable, and that the policy conclusions that follow from it are fairly robust. In particular, we have excluded many plausible “high” PDFs with higher means or longer tails, which imply lower concentration targets for equivalent levels of precaution, making our conclusions in this sense quite “conservative.”

It could also be argued that, by excluding the few “low” PDFs with lower means or shorter tails, we are biasing our results towards more stringent reductions. However, we believe that in the current situation, in which there is still no decisive evidence for the plausibility or implausibility of the various PDFs, the existence of “low” PDFs does not yield a strong argument for less precautionary emissions targets.

Back to text

Hansen Calculations

Because Hansen specifies additional radiative forcing as the policy variable in his discussion, predicted equilibrium temperature is a function of the climate sensitivity and the current radiative forcing, the latter of which, as discussed above in Non-CO2 Forcings, is also quite uncertain. Hansen himself (2004) estimates current radiative forcing as 1.6 ± 1 Wm-2; this implies a normal distribution with a roughly 67% chance that the “true” value is within 1 Wm-2 on either side of the mean. As noted above, another recent published estimate (Knutti et al. 2002) gave a 5-95% confidence interval of 1.5 to 2.5 Wm-2. As also noted above, the IPCC stated that such calculations are still quite speculative, but these ranges are certainly plausible.

One can then add a fixed 1 Wm-2 to the value selected from such a PDF for current forcing, and run a Monte Carlo analysis using one or more PDFs for climate sensitivity to produce an output PDF for equilibrium temperature. This is how we estimated the probabilities of staying under Hansen’s temperature goal of 1C above the present. Calculation details are available from the first author (pbaer@ecoequity.org) on request.
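For concreteness, a sketch of this calculation might look like the following. The lognormal sensitivity PDF is a placeholder rather than the PDFs actually used, and no adjustment for warming already realized is attempted, so the numbers it prints are purely illustrative of the procedure.

```python
# Sketch of the Monte Carlo just described: current forcing drawn from a
# normal PDF (1.6 +/- 1 Wm-2, as above), a fixed 1 Wm-2 added, each draw
# combined with a draw from a climate-sensitivity PDF. The sensitivity PDF
# below is a placeholder, so the printed numbers are purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
F_2X = 3.71

current_forcing = rng.normal(loc=1.6, scale=1.0, size=N)          # Wm-2
sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=N)  # C

t_eq = (current_forcing + 1.0) / F_2X * sensitivity  # equilibrium warming (C)

print("median:", np.median(t_eq))
print("10th and 90th percentiles:", np.percentile(t_eq, [10, 90]))
```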

Hansen, J. 2004. “Defusing the global warming time bomb.” Scientific American 290(3): 68-77. A similar article is available online at http://pubs.giss.nasa.gov/docs/2003/2003_Hansen.pdf

Knutti, R., T. F. Stocker, F. Joos, and G. K. Plattner. 2002. “Constraints on radiative forcing and future climate change from observations and climate model ensembles.” Nature 416: 719-723.

Back to text

Catastrophic Risks

For example, using the same methods described in Calculation Results, one can calculate that stabilization targets of 550 ppm CO2, hardly the highest targets advocated, imply risks on the order of 5 – 25% that equilibrium temperature increase would exceed 5C. This is roughly the same degree of warming that has occurred since the peak of the last ice age, and one that is very likely to melt the Greenland and West Antarctic Ice Sheets, resulting in sea-level rise of 5-10 meters or more over one to many centuries.

Research on the risks of abrupt or catastrophic climate change is becoming more widespread; the definitive summary was published by the National Academy Press (National Research Council 2002) as Abrupt Climate Change: Inevitable Surprises. But meaningful debate about precautionary approaches is still sparse.

Back to text

Carbon Cycle Uncertainty

The amount of CO2 emitted by humans that remains in the atmosphere over time is determined by the global carbon cycle – the processes by which CO2 is exchanged between the atmosphere, oceans, plants and soils. (Animals have a negligible effect; humans matter through the burning of fossil fuels and changes in land cover, not our breathing!) The current uptake of CO2 by the oceans is relatively well known, at roughly 2 ± 0.5 GtC annually. Net uptake by the biosphere is also relatively well constrained, but it is composed of two components – emissions from the biosphere from deforestation and other land-use changes, and sequestration in the biosphere from natural or managed processes. The TAR estimated net terrestrial uptake at 1.4 ± 0.7 GtC annually in the 1990s, but said there was insufficient data to estimate the balance between emissions from land-use change and locally or regionally increasing carbon storage. Previous IPCC estimates put emissions from land-use change at about 1.6-1.7 GtC/yr, with an uncertainty range of about 1 GtC.

The uncertainty in the current terrestrial fluxes matters because we hope to stop deforestation quickly. If what we stop turns out to be a large amount of emissions, then (since net terrestrial uptake is relatively well constrained) the gross terrestrial sink must be relatively large, and may remain large; but if actual land-use emissions are low, the terrestrial sink will be smaller over the next century.

Future changes in the carbon cycle can be expected due to changes in ocean chemistry and biology (which regulate ocean uptake from the atmosphere), changes in human land use, and the influence of changing temperature, CO2 concentration and water availability on plant growth and decomposition. The net result of these uncertainties is quite a large range of possible future values for annual uptake from the atmosphere. For example, in the TAR, the range of average annual uptake for different interpretations of the same scenario (e.g., the SRES B1 scenario, with cumulative emissions of about 900 GtC over 100 years) is between about 4 and 6.5 GtC per year; this is based on a very simple model calibrated to match a small number of more complex models, and doesn’t capture the full range of uncertainty.

A 400 ppm CO2 concentration target means that only about 45 more GtC of carbon can be allowed to accumulate in the atmosphere (1 GtC is about 0.47 ppm). So while the sink may well be as high as 6 GtC annually over the coming century, it might easily be as low as 4 GtC, or even 3 GtC or lower, and a precautionary target based on these lower values would give a range of about 340 to 440 GtC total allowable CO2 emissions through 2100.
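The arithmetic behind that range is simple enough to show explicitly; the "allowance plus sink times years" accounting below is our own back-of-the-envelope illustration of the numbers in the text.

```python
# Back-of-the-envelope version of the budget arithmetic above: an atmospheric
# allowance of roughly 45 GtC (for 400 ppm, starting from ~380 ppm) plus
# roughly a century of ocean/land uptake at the lower sink estimates.
PPM_PER_GTC = 0.47                                   # 1 GtC ~ 0.47 ppm
atmospheric_allowance = (400 - 380) / PPM_PER_GTC    # ~43 GtC

for sink_gtc_per_yr in (3.0, 4.0):
    total_allowable = atmospheric_allowance + sink_gtc_per_yr * 100
    print(sink_gtc_per_yr, round(total_allowable, -1))  # ~340 and ~440 GtC
```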

Back to text

Contraction and Convergence

Contraction and Convergence (C&C) is the name given by Aubrey Meyer and his Global Commons Institute to a particular formulation of the equal rights argument. Under C&C, total global emissions contract to a sustainable level (e.g., about half of today’s emissions) even as the allocation of tradable national emissions permits converges from today’s unequal per capita levels to fully equal annual allocations at some negotiable year in the future. A variety of formulas for the rate of this convergence have been offered, including some which allow developing country per capita emissions to rise above those of developed countries before dropping again to converge to pure per capita equality, but the standard formulation, the one you see in today’s graphs and wall charts, features a linear convergence rate from grandfathering to pure equality.

C&C appears to have real traction in the UK and the EU more broadly, particularly within parliamentary circles and among elites. Moreover, it has been useful, politically and pedagogically, and for an excellent reason: per capita atmospheric rights make strong intuitive sense. But when it comes to the strategic question that proponents of per-capita rights must answer – When would equal allocations be less than fair? – C&C cannot provide an answer. Indeed, it does not even allow the question.

C&C does not provide for convergence of cumulative emissions, to say nothing of the cumulative developmental benefits of emissions. It cannot do so, for it rules “historical accountability” off the table from the very beginning. Given the convergence years that C&C’s proponents usually use – typically between 2030 and 2100 – cumulative per-capita shares, and per-capita benefits, ultimately remain vastly different between North and South. Translated into economic terms, developing countries actually get much less than a fair share of the cumulative developmental space associated with the global GHG sinks.

On another note, Contraction and Convergence is often criticized by climate activists for its reliance on global emissions trading. The issues here are deadly real, and must be taken seriously, but they are not in any way particular to C&C. Any burden (or resource) sharing system that relies upon emissions trading must absolutely ensure that it is conducted in a manner that is transparent, well-regulated, and fair. The “Enronization” of global carbon markets, in particular, would spell death for any trading-reliant climate stabilization regime.

Back to text


Liability

The principle of liability for harm caused by pollution of a “life support commons” is ethically unavoidable, and is already reflected in national pollution regulations and (rhetorically) in international law in the Stockholm Declaration and elsewhere. Both the ethics of the “life support commons” and the financial implications of establishing legal liability for adaptation and compensation are addressed in a forthcoming book chapter. (Baer, In Press, “Adaptation to Climate Change: Who Pays Whom?” In Fairness in Adaptation to Climate Change, ed. W.N. Adger and J. Paavola. Cambridge, MA: MIT Press. Also recommended: Tol, R. S. J., and R. Verheyen. 2004. “State responsibility and compensation for climate change damages – a legal and economic assessment.” Energy Policy 32: 1109-1130.)

Back to text

Economic Rationality

These problematic notions of “rationality” underlie both global cost-benefit analysis of climate change, such as the famous DICE model of William Nordhaus (1994), and a wide range of models of the international negotiations offered by political scientists and economists (see for example the work of Scott Barrett). The question of the discount rate in particular has been discussed exhaustively, including by the IPCC itself (Arrow et al., 1996), with no resolution of the underlying disagreements on the horizon. Our opinion is that any model that suggests that it’s “optimal” – and hence “rational” – to increase the global temperature by 3.2C (Nordhaus 1994) or thereabouts by 2100, with atmospheric concentrations still rising, must be missing something fundamental. For interesting critiques, see Azar (1998) or Funtowicz and Ravetz (1994).

Arrow, K.J., W.R. Cline, K-G. Maler, M. Munasinghe, R. Squitieri, and J.E. Stiglitz. 1996. “Intertemporal Equity, Discounting, and Economic Efficiency.” In Climate Change 1995: Economic and Social Dimensions of Climate Change, ed. James P. Bruce, Hoesung Lee, and Erik F. Haites, 125-144. Cambridge: Cambridge University Press.

Azar, C. 1998. “Are optimal CO2 emissions really optimal? Four critical issues for economists in the greenhouse.” Environmental & Resource Economics 11: 301-315.

Barrett, Scott. 2003. Environment and Statecraft: The Strategy of Environmental Treaty Making. Oxford: Oxford University Press.

Funtowicz, Silvio O., and Jerome R. Ravetz. 1994. “The worth of a songbird: ecological economics as a post-normal science.” Ecological Economics 10: 197-207.

Nordhaus, William D. 1994. Managing the Global Commons: The Economics of Climate Change. Cambridge, MA: The MIT Press.

Back to text

Stabilization Costs

A variety of methods exist for estimating the costs of reaching a given atmospheric stabilization target, and they are well reviewed in the TAR by the IPCC’s Working Group III. Estimates of the cost of reaching 450 ppm stabilization vary from essentially zero to as much as 5% of gross world product (GWP), depending on the baseline scenario and a wide range of other model assumptions. Few cost estimates have been made for stabilization levels below 450 ppm.

The wide range of these results, and the broad range of practical and theoretical problems with long-term economic modeling, suggest we should be cautious with these estimates. Sharply varying assumptions with very different consequences can all be quite legitimately defended. Most people familiar with the issues would agree that a strict mitigation target could well cost between 0 and 5% of GWP, or more. And the fear that it might be on the high side is certainly legitimate.

As noted by Azar and Schneider (2002), however, whether even 5% is a large amount within a world economy that may quadruple in GWP during this century is a relative question. They point out that if annual growth is only 2% per year, a 4% reduction, over a century, amounts to only a two-year delay in reaching a given level of GWP. It’s not a difference that makes a difference. And we’d like to think that most people, asked bluntly if they’d accept such a sacrifice in order to preserve the stability of the Earth and its climate for their grandchildren, or even someone else’s, wouldn’t waste a lot of time agonizing about the decision.
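Their point is easy to verify with a couple of lines of arithmetic; the calculation below is our own worked version of the example described in the text.

```python
# The Azar and Schneider point as arithmetic: at 2% annual growth, how many
# years of growth does a 4% loss of GWP amount to?
import math

growth_rate = 0.02
gwp_loss = 0.04
delay_years = math.log(1 / (1 - gwp_loss)) / math.log(1 + growth_rate)
print(round(delay_years, 1))  # ~2.1 years
```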

Azar, C., and S. H. Schneider. 2002. “Are the economic costs of stabilising the atmosphere prohibitive?” Ecological Economics 42: 73-80.

Back to text
