“I knew it all along…” – as volcanologists, we need to be careful not to fall into the many traps that come from retrospectively looking at, and indeed commenting on, crises or catastrophes such as the recent eruption of Ontake.
There is a fantastic book you might want to read: Thinking, Fast and Slow by Daniel Kahneman, which synthesises a huge
body of research about how and why we make the decisions we make, particularly
when it comes to risk and uncertainty. Many readers of this blog will be
familiar with Kahneman’s papers, notably the 1974 “Heuristics and Biases” work with Amos Tversky. Others will be familiar with some of the work by his former
PhD student Baruch Fischhoff on (among other things) risk communication*. I was
planning on writing a short review of Thinking, Fast and Slow from the perspective of what volcanologists can learn from cognitive psychology, but the eruption in Japan has got me thinking about one particular cognitive trap – the ‘hindsight bias’ or the ‘I knew it all along’ principle, first investigated by Baruch Fischhoff.
The key message is that, as a group, we must be very careful when looking back at past eruptions, particularly when eyeballing monitoring data post-hoc, not to make pronouncements about “missed warning signs” simply because we interpret things with the benefit of hindsight.
It turns out that it is very difficult for
a human mind to reconstruct what we thought about something once we adopt a new
belief about it. This leads us to believe that we understand the past, overstating the accuracy of the beliefs we held (or would have held) at the time, because they are corrupted by what we now know. Kahneman suggests that if
we are surprised by an unpredicted event, we adjust our view of the world to
accommodate that surprise. Thus when we look back, we forget the state of mind
or understanding that we had at the time, and simply think about what we know
now.
What hindsight bias can do is lead us to judge the quality of a decision (such as the recommendation for some kind of mitigative action) by whether the outcome was positive or negative, rather than by whether or not the decision-making process was sound. This bias leads us to a) overstate our expertise post-hoc, b) neglect the role of luck (or lack of it) in a particular outcome, and c) suppress any memory of the effect that uncertainty will have had on our own or other people’s interpretations and decisions.
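To see why judging the process rather than the outcome matters, here is a minimal sketch in Python – with entirely invented numbers, not a model of any real volcano – of how a decision that is sound in expectation can still look ‘wrong’ in hindsight most of the time:

```python
import random

# Entirely invented numbers for illustration: suppose monitoring suggests
# a 20% chance of eruption, evacuating costs 1 unit, and an eruption with
# no evacuation costs 10 units.
P_ERUPTION = 0.20
COST_EVACUATE = 1.0
COST_ERUPTION_UNMITIGATED = 10.0

# The decision process is sound: evacuation has the lower expected cost.
expected_cost_evacuate = COST_EVACUATE                        # 1.0
expected_cost_wait = P_ERUPTION * COST_ERUPTION_UNMITIGATED   # 2.0

# But any single outcome is governed by chance. In roughly 80% of runs
# no eruption happens, and hindsight brands the evacuation 'unnecessary'.
random.seed(42)
runs = 10_000
no_eruption = sum(random.random() > P_ERUPTION for _ in range(runs))
print(f"Expected cost: evacuate {expected_cost_evacuate:.1f} vs "
      f"wait {expected_cost_wait:.1f}")
print(f"Runs with no eruption: {100 * no_eruption / runs:.0f}%")
```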
Our natural tendency is to criticise decision-making on risk issues when an outcome is negative, and to neglect to recognise or praise decision-making when the outcome is good; this ‘outcome bias’ (a facet of hindsight) affects our interpretation of past events far more than we might realise. When considering what might happen at a volcano, a simplistic framing is that we consider the probability of an eruption (A) happening given some monitoring signal (B), i.e. P(A|B). But after an event has occurred…it’s quite different! It is no longer an event that could happen (a chance or likelihood) but a certainty. So when we re-interpret past events, hindsight bias makes it very difficult for us, in our present state of certainty, to acknowledge the attendant uncertainty before the eruption occurred. We find it very difficult to reconstruct or understand what our past belief would have been.
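To make that asymmetry concrete, here is a toy Bayes’ rule calculation – again with purely illustrative numbers of my own invention – contrasting the pre-event and post-event views:

```python
# A = eruption occurs, B = a monitoring signal is observed.
# All numbers below are invented purely to illustrate the point.
P_A = 0.05             # prior probability of eruption in some time window
P_B_given_A = 0.80     # chance of seeing the signal if an eruption is coming
P_B_given_notA = 0.30  # chance of seeing the same signal anyway

# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)
P_B = P_B_given_A * P_A + P_B_given_notA * (1 - P_A)
P_A_given_B = P_B_given_A * P_A / P_B
print(f"Before the eruption: P(A|B) = {P_A_given_B:.2f}")   # ~0.12

# After the eruption, P(A) = 1: the event is a certainty. Hindsight bias
# quietly substitutes that certainty into our memory of the pre-event
# assessment, so an ambiguous ~12% comes to feel like an obvious warning.
```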
Kahneman suggests that these biases make it
“almost impossible to evaluate a decision
in terms of the beliefs that were reasonable when the decision was made”.
In fact, research suggests that the worse
or more shocking a catastrophe is, the more acute hindsight bias becomes (think
back to reactions in the aftermath of 9/11). This – in the case of Ontake – is
reflected by language such as “failed to forecast” used in many** news
articles.
So what does this mean for volcanologists
in the wake of a tragedy such as the eruption of Ontake? Well, the first thing we should be aware
of is that our opinions post-hoc, about what monitoring data may or may not
have shown, or what decisions should or shouldn’t have been made, are prone to
huge biases. So, we should be very careful what we voice about these
events…particularly to the media! If we are going to retrospectively look at
something, let’s do it in a robust and sensible way, such as the work by Thea Hincks, Willy Aspinall and others on the 1976 eruption of La Soufrière de Guadeloupe.
Another point is that from afar – not being a Japanese volcanologist working on Ontake – the availability of information for us to be able to make an informed opinion is surely very limited (what Kahneman refers to as the ‘availability bias’, and his ‘what you see is all there is’ principle). So, just as we should be very cautious about talking about ‘missed signs’, we should also be aware that when we say things like ‘it’s impossible/very difficult to predict such eruptions’ or ‘there were no precursors’, our opinions are perhaps based on very sparse evidence (of course we can draw on examples from other cases – but hopefully you get my point). In essence, maybe we could do with waiting for a little more information before passing comment.
Hopefully you get the idea: if you haven’t yet read Thinking, Fast and Slow, then please do. It’s very difficult to overcome the various heuristics and biases that affect our opinions and decisions (even Kahneman admits to relentlessly struggling with this)…but being aware of them is an excellent first step.
** Not all articles/commentaries fall foul of the hindsight bias – if you want to read some measured and not overly opinionated articles by volcanologists about the Ontake eruption, you might want to look here (Becky Williams) and here (Eruptions blog).
Comment: Thanks: this was a well-informed and impartial summary of prediction of volcanic risk, past and present. Very useful for a geology student interested in this area!
Reply: Thanks Ailsa – I’m glad that you enjoyed it.