No doubt we will shortly see a lot more about the ‘precautionary principle’ as a reason for embarking on carbon emission taxes. In Australia, the publication of the final report of the Garnaut Review later this month is sure to bring out a lot more rhetoric about urgency, precaution and ‘running out of options’. We already see this line of argument from some industry organizations such as the Minerals Council of Australia, which is, no doubt, having a difficult time managing differences among its members over whether the ‘carbon pollution’ plans of the Australian government make economic sense.
We can anticipate still stronger appeals to the ‘precautionary principle’ because it is often used to put an end to debate: ‘Look, we’ll commit to carbon reductions without detailed reasons (on which we cannot agree), just in case the worst case turns out to be true’. I don’t know if that is happening inside the MCA. But it is clear that Ross Garnaut has already nailed his colours to the mast of the precautionary principle (he called it ‘Pascal’s wager’), and it is likely that we’ll hear more of the same from the Rudd government as they approach the point where they must turn their Green Paper into legislation.
So what’s wrong with the ‘precautionary principle’? Many people respond very positively to the idea of taking action to avoid an unquantified risk because they’re naturally inclined to anticipate the worst outcome from any risk considered on its own (see the section of that paper on what does explain risk aversion: it isn’t a rational utility function). Individuals recognize in their own behavior that they act with precaution, and they consider it just common sense: ‘prudence’. But government is not the action of an individual. We need to look at the idea of precaution more carefully when we talk about collective action and ask just what the ‘principle’ implies.
A ‘precautionary’ action is taken without the sort of justification that we would otherwise require for a given decision. It is an expression of our concern about the unknown and, we fear, unlimited risk we may face. We take actions that we believe will avert some or all of this risk, although we don’t know how much risk these actions will avoid because we cannot or have not evaluated the actual risks—that is, the probabilities—involved. The so-called ‘principle’ of precaution originated in European environmental regulations and had its classical, convoluted, expression in “Principle 15” of the 1992 Rio Declaration of the United Nations Conference on Environment and Development:
“Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”
Many commentators then and since have pointed out that this statement is no help in dealing with uncertainty. ‘Scientific certainty’ is a very rare animal, so the principle aspires to be a quasi-universal guideline. The obvious question is how well, in the absence of scientific knowledge, can we evaluate claims of ‘serious or irreversible damage’ and, consequently, how confidently can we make decisions about which measures are ‘cost-effective’? The answer is: ‘not very well’ and ‘not confidently’.
The ‘precautionary principle’ appears to be a justification for action, but it’s really only an explanation of our reasons for taking an action; a psychological statement about our state of mind. It cannot be a justification in the way we normally use that word. A ‘justification’ is a statement of a rational balance between ends and means. Rational, here, literally means ‘measured’. So a justification is a balance (it might be wrong; justifications can be in error) struck between what we pay and the value of what we pay for; or between the crime and the punishment; or between God’s ways and man’s hopes (remember Milton).
An explanation is a very different thing from a justification. We can readily explain Mohamed Atta’s actions on 11 September, 2001 in driving a plane full of people into the North Tower of the World Trade Center. He held a distorted vision of the world that demonized the United States and the West as the enemy of Islam. He held an insane hope that his actions would lead to the destruction of the U.S. and an end to the perceived threat. But this psychological explanation is not a justification. We can find no justification for killing thousands of innocent people. The explanation of Atta’s psychology is completely unsatisfying as a justification for his actions because it offers no rationality. (Unfortunately, President Bush did not seem to grasp that while the action had no possible justification, this did not mean that the explanation could be ignored).
What happens when a government takes an action that has a (psychological) explanation but not a justification; that is, a ‘precautionary’ action? Some people are inclined to accept that governments should do so because they place themselves in the role of the government and think, ‘Now how would I act when faced with this threat?’ They answer as they would answer personally: I’ll make sure I never face the threat, whatever it might be. That’s the personally prudent course. But what’s OK for an individual is not OK for a government.
Acting without justification on a risk is exactly what gambling is. It’s what happens when you put the roulette chip on the red 32 instead of on the black 33. It’s just a wager. We don’t allow gambling with public stakes (our GDP, billions of tax dollars) in a democracy because democracy is all about deliberation and consent; it’s why democracies have parliaments that are supposed at least to have a rational debate and, in theory, to reach reasoned decisions (except that they are all too often overridden by the dictatorship of the Executive in our current democracy). It’s why we demand a free press and free speech. Not to explain our phobias but to promote the reasoned debate that helps us to take balanced decisions on the basis of evidence, not influenced by personal agendas or subjective evaluation.
The temptation to gamble is more common than you may think. Governments are frequently asked to make decisions that rely on a scientific assessment in conditions of scientific uncertainty. It’s not an unusual phenomenon and there have been many examples over the years—especially since environmental protection issues began to be prominent public concerns in the 1970s. Decisions based on uncertain science happen every day in the management of quarantine risk, drought funding, or the release into commercial use of foods or pharmaceuticals with novel compounds.
There may be no completely satisfactory rule of thumb for these decisions except to say that there is a world of difference between acting on dread and acting from prudence on the basis of partial assessments. In order to make good decisions, decision makers need to understand the nature of scientific uncertainty and to be aware of its well-attested tendency (see the references in the Peel article, below) to encourage subjective judgements—and personal or interest-group motivations—to creep into decision making. We see plenty of such subjectivism in the policy debates surrounding the uncertain climate science.
None of this is to say that governments should not act prudently when offered partial information about risk. Prudence usually means taking measures proportional to the harm that you can quantify and continuing to seek better information on the risks that you cannot quantify. It almost certainly means making no provision at all for undefined fears or ‘worst case outcomes’ for which you have no reliable probabilities. As your mother always told you: your worst fears will turn out to be groundless. She was right.
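The distinction drawn above — prudence weighs a measure against the quantified harm it averts; precaution cannot — can be put in plain expected-value terms. Here is a minimal sketch with entirely invented numbers (nothing below comes from the Garnaut Review or any actual climate model); it only illustrates why the comparison requires a probability:

```python
def expected_loss(prob_of_harm: float, harm: float,
                  mitigation_cost: float, effectiveness: float) -> float:
    """Expected total cost if we pay for a measure that reduces the harm.

    All inputs are hypothetical, for illustration only.
    """
    residual_harm = harm * (1.0 - effectiveness)
    return mitigation_cost + prob_of_harm * residual_harm

# Prudence: the probability of harm is (at least partially) quantified,
# so a proportional measure can be justified by comparing expected losses.
do_nothing = 0.10 * 100.0                              # expected loss, no action
act_prudently = expected_loss(0.10, 100.0, 4.0, 0.5)   # modest, proportional spend

print(do_nothing)     # 10.0
print(act_prudently)  # 9.0 -- the measure averts more than it costs

# Precaution: the probability is unknown, so neither side of that
# comparison can be computed. Any spend 'just in case' cannot be shown
# to be proportional to anything; it is a wager, not a justification.
```

The point of the toy calculation is not the numbers but the structure: without a probability, `expected_loss` has no first argument, and the balance between ends and means that the essay calls a ‘justification’ cannot be struck.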
For more on the shades of difference between prudence and precaution, I strongly recommend an article from the 2004 volume of the Melbourne Journal of International Law by Jacqueline Peel that draws on the treatment of the ‘precautionary principle’ in international tribunals. Peel’s last sentence reaches a well-founded conclusion from a decade of international experience of the ‘precautionary principle’ that should stand as a warning to all who, like the Garnaut Review in my view, want to make a wager based on inconclusive climate science:
“Precaution should not require decision-makers to achieve the impossible and reach the ‘right’ decision in advance, regardless of uncertainties. Rather, the best chance for the international community to prevent serious environmental degradation in the future lies in imposing particular procedural constraints on regulatory decision-making that are designed to ensure scientific uncertainty is factored into the process and that science itself is not extended beyond the limits of its utility and capacity to inform decisions on risk regulatory measures.”