
Four Steps For A Climate Policy Beyond Scenarios And Fear

A very interesting review by Tim Lewens in the London Review of Books makes (explicit) reference to a “new” way to select a rational climate policy, beyond the usual soup of scenarios designed around the worst case and, more generally, of applying the precautionary principle in order to stifle innovation and institutionalize killjoyfulness. In summary:

  1. We should aim for “concrete recommendations that are thoroughly in accordance with precautionary thinking in remaining humble about our state of knowledge, while taking into account the full range of scientific evidence”
  2. However, the precautionary principle on its own is no guidance to policy decisions when facing great complexity and uncertainty, as both action and inaction might lead to disaster
  3. Cost-benefit analysis is not much better, as it simply collapses complexity and provides “a bland expression of uncertainty” that strongly depends on the (lack of) knowledge and understanding of the system at hand
  4. Instead, the first step of a good policy is to “examine how our proposed interventions will fare under a range of different plausible scenarios for the unfolding of a complex system, picking the strategy which has a satisfactory outcome across the largest range of future scenarios”
  5. The second step is to “assume that the world may not behave in a manner we expect it to, and therefore make sure that the strategy we choose can be undone or altered with reasonable ease”
  6. Another problem for a good policy is to avoid falling victim to “optimism bias” (overestimating the likelihood of outcomes one favours) and “affiliation bias” (the dependency of a researcher’s results on his/her affiliation)
  7. The third step is therefore “to be attentive to the institutional sources of the data”, in order to understand and perhaps even remove the biases from the policymaking “picture”
  8. The fourth step goes even further for the same aim, and involves “broad public participation”

In short: know your science, know its limits, know its biases, involve as many people as possible, and pick a policy that looks best across many scenarios and can easily be changed.
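The selection rule in the first step can be sketched in a few lines of code. Everything below is invented for illustration: the policy names, the payoff numbers and the “satisfactory” threshold are assumptions, not anything from Lewens or Mitchell.

```python
# Toy robust-satisficing sketch: prefer the policy that clears a minimum
# acceptable outcome in the largest number of plausible scenarios.
# All names and numbers are hypothetical.

scenarios = ["mild warming", "moderate warming", "severe warming", "cooling"]

# Outcome score of each candidate policy under each scenario (higher = better).
payoffs = {
    "aggressive mitigation": [2, 6, 9, -3],
    "adaptation only":       [7, 5, 1,  6],
    "flexible mix":          [5, 5, 5,  4],
}

SATISFACTORY = 3  # assumed minimum acceptable outcome

def robust_choice(payoffs, threshold):
    """Return the policy that is satisfactory in the most scenarios."""
    def scenarios_ok(policy):
        return sum(score >= threshold for score in payoffs[policy])
    return max(payoffs, key=scenarios_ok)

print(robust_choice(payoffs, SATISFACTORY))  # "flexible mix": satisfactory in all 4
```

Note that the winner changes if the threshold moves or a new scenario is added, which is one reason the second step (choosing strategies that can be undone or altered) matters.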

Now, it is pretty easy to argue that the IPCC has failed on all fronts: by fixating on worst-case analysis thereby restricting the range of scenarios; by not assuming that the world may not behave as expected, steering quite clear of providing any sign of being humble about anything; by refusing to consider the bias of its own authors and editors, through its flawed review system; and by consistently trying to keep the public at bay, with countless elitist “summits” only good for people on expenses and/or without a day job.

It will be interesting to compare the above with whatever Roger Pielke Jr has written in “The Climate Fix” (also with “Look Inside”), a book I bought a few days ago.

======

And now for some quotes from Lewens’ review of “Unsimple Truths: Science, Complexity and Policy” by Sandra Mitchell, ISBN 978 0 226 53262 (available at Amazon.com with the “Look Inside” feature enabled):

[…] on the important matter of what decision-makers can do to handle complexity […] Mitchell’s book is at its best. Nearly all the systems we care about – the global climate, the human body, the international financial system – exhibit the various forms of complexity she dissects.

[…] A typical reaction, displayed in many policy documents, is that when dealing with scientific uncertainty in relation to important systems, policy-makers should adopt a precautionary approach. […] Both unintentional vandalism and irresponsible dithering can lead to disaster. Those who oppose precautionary thinking often argue that it becomes incoherent or dangerous when spelled out in detail. The problem is that precautionary thinking is supposed to help in situations of uncertainty; that is, in situations where we lack knowledge, or where our knowledge is imprecise. But since decisions under such conditions tend to have the potential for grave outcomes whichever option we choose, we need guidance on how to err on the side of caution.

High-profile opponents of the precautionary principle, such as Barack Obama’s new regulation tsar, Cass Sunstein, have argued [for] a form of cost-benefit analysis as the best way to ensure that the potential costs and benefits of all courses of regulatory action – including inaction – are placed ‘on screen’.

Mitchell’s critique of cost-benefit analysis is a familiar one. It is suitable for well-understood systems, unfolding over short time periods, where we can assign probabilities with confidence. But the probability of a given outcome – financial profit, the extinction of species, an increase in sea levels, high blood pressure – in whatever system we are analysing will often vary significantly with small changes in the starting conditions, with our assumptions about the causal interactions within the system, and with variation in background conditions as the system evolves over long periods of time. Our estimates of these conditions will often be imprecise, or thoroughly conjectural, in spite of the apparent precision of the cost-benefit methodology. The question is how to turn uncertainty of this sort into trustworthy policy recommendations.

Mitchell’s stance on these matters is not new […] but her way of justifying it is particularly crisp and compelling. Simple cost-benefit analysis will tend to collapse a rich understanding of the complexity of a system into a single set of all-things-considered probability estimates for its likely end-states. In so doing, Mitchell says, we mask our grasp of complexity, and replace it with a bland expression of uncertainty.
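Mitchell’s point about collapsing complexity into “a bland expression of uncertainty” can be shown with a toy calculation (all figures invented): two policies can share the same expected cost while differing wildly in worst-case exposure, and a single cost-benefit number hides the difference.

```python
# Hypothetical numbers: same expected cost, very different risk profiles.
probs  = [0.5, 0.3, 0.2]       # assumed scenario probabilities
cost_a = [10, 10, 10]          # policy A: the same cost in every scenario
cost_b = [1, 4, 41.5]          # policy B: cheap usually, disastrous rarely

def ev(costs):
    """Probability-weighted expected cost, the single 'bland' summary number."""
    return sum(p * c for p, c in zip(probs, costs))

print(ev(cost_a), ev(cost_b))   # both come to roughly 10.0
print(max(cost_a), max(cost_b)) # 10 vs 41.5: the worst case the summary masks
```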

[…] once we do acknowledge complexity, two strategies become available. First, we can examine how our proposed interventions will fare under a range of different plausible scenarios for the unfolding of a complex system, picking the strategy which has a satisfactory outcome across the largest range of future scenarios. Second, we can assume that the world may not behave in a manner we expect it to, and therefore make sure that the strategy we choose can be undone or altered with reasonable ease. The end result should be a set of concrete recommendations that are thoroughly in accordance with precautionary thinking in remaining humble about our state of knowledge, while taking into account the full range of scientific evidence.

[…] The question of how good a particular outcome would be, were it to arise, should be wholly independent of the question of how likely that outcome is. And yet it turns out that we tend to overestimate the likelihood of outcomes we favour, while underestimating the likelihood of outcomes we don’t want. This is known as ‘optimism bias’. And ‘affiliation bias’ results in (for example) the conclusions of studies on the effects of passive smoking varying according to the authors’ affiliation with the tobacco industry. Needless to say, these psychological results suggest that policy-makers need to be attentive to the institutional sources of the data they use. And this, in turn, underlines a long-standing theme of work among social scientists, who have claimed that broad public participation in risk planning may increase the quality of risk analysis. Mitchell’s stance on policy isn’t complete, but perhaps that is to be expected in a complex world.

  1. Fay Tuncay
    2010/11/28 at 15:08

    I guess Greenpeace will eventually drop CAGW and just focus on biodiversity, but would all their funding from foundations dry up with the death of carbon trading? I think so. Such a funding crisis would lead to a welcome shrinking of the green bureaucracy. It would no longer be a career option and then grass roots could play a more productive role, by dealing with realistic, achievable goals to protect the environment.

  2. Fay Tuncay
    2010/11/27 at 20:42

    Greenpeace mentioned this research, and are shifting their message away from alarm. Just thought you should know; it is worth a read:

    Dire messages about global warming can backfire, new study shows
    By Yasmin Anwar, Media Relations | 16 November 2010

    “BERKELEY — Dire or emotionally charged warnings about the consequences of global warming can backfire if presented too negatively, making people less amenable to reducing their carbon footprint, according to new research from the University of California, Berkeley.

    “Our study indicates that the potentially devastating consequences of global warming threaten people’s fundamental tendency to see the world as safe, stable and fair. As a result, people may respond by discounting evidence for global warming,” said Robb Willer, UC Berkeley social psychologist and coauthor of a study to be published in the January issue of the journal Psychological Science….”
    http://www.berkeley.edu/news/media/releases/2010/11/16_globalwarming_messaging.shtml

  3. 2010/11/24 at 03:44

    Another step would be to ensure the money goes to the cause rather than just expanding the green bureaucracy. As with charities, a percentage figure should be published showing how much of donations, grants etc. actually gets spent on the real cause.

    Additionally, investing in grass-roots organisations rather than green corporates would be beneficial.

