Sunday, September 26, 2010

Risk and Heuristics

The perception of risk affects how people behave. People tend to simplify the world, using simple heuristics to help them understand risk and decide how to behave in the face of it. These simple rules affect how much insurance they buy, where they live and how dangerous they believe modern life to be. Quoting facts and figures may do little to alter people’s behaviour or the hold of these heuristics. Studying the types of heuristics people apply to risk has been a fruitful area of research since the 1970s, when Tversky and Kahneman undertook their early studies (e.g. 1974, Judgment under uncertainty, Science, 185, 1124-1131). Amongst the numerous heuristics that have been researched I want to consider just three in this blog – representativeness, availability and anchoring – in the light of environmental risk.

Representativeness refers to people’s tendency to view risk in one area as comparable to risk in another if the two areas, at least to them, resemble each other. Crime, whatever its complexion, may provide a convenient category for people to fear, even if the causes of terrorism are different from those of bag snatching. The classing of both as crimes may connect the different activities as comparable in people’s minds. A previous blog discussed the media hype behind the BP oil spill. Media reports kept comparing the spill to the Exxon Valdez, forming a comparability connection in people’s minds. Both are oil spills, so they must be comparable. A closer examination of the causes and characteristics of each casts some doubt on their comparability. One was a tanker spill, the other a massive, destructive blowout; one occurred in a confined water body, the other in a dynamic ocean; one was associated with stark and immediate images of dying wildlife, the other with less obvious and less visually striking losses of livelihoods. Yet calling each an oil spill implies similarities in nature and similarities in response. Pointing out differences may do little to make people think that the things are different.

The ongoing floods in Pakistan are another example. ‘Third World floods, again’ may be the immediate response of some readers and viewers. The same sort of floods seems to happen every year, somewhere over there; surely by now they should know what to do? Classifying the event as a flood brings with it the risk of comparison with other events in the same class. By comparison the death toll seems small, by comparison the event seems slow, by comparison it happens a long way away. Such comparisons can become a convenient shorthand for explaining or justifying a lack of action or the vigour of a response. Classing an event may help us to understand it, but there is also a danger that, because it is a member of that sort of event, we assume we know what it should do and how we should behave towards it. At the crudest level, for example, we ask how many people should be dead to make it an important flood, rather than looking at the individuality of each event. Floods differ in causes, consequences and solutions; one size fits all is as inappropriate for environmental hazards as it is for understanding most things.

The flood example is also an illustration of availability bias. Availability bias refers to the tendency for people to respond to risks more vigorously when examples of that type of risk are readily available to them. Availability may be from individual or community memory, from the media, from their beliefs about the world and any number of other sources. The Pakistan floods are compared to the impact of other floods we call to mind most readily whatever their cause. Similarly, the BP oil spill is contrasted in the media with the Exxon Valdez, as the latter is viewed as a key environmental event and so a sort of benchmark for other events, however inappropriate or appropriate the comparison might be.

On a more personal level, the fact that you may have experienced a flood of your home in the last two or three years may make you more wary of the flood risk and so more likely to purchase insurance, or at least to try to, as insurance companies, using the same bias, may raise premiums to match the increased perception of risk in your local area. Statistically, the local flood may not alter the probability of future flooding by much, if at all, but does it feel like that to you as you wade through your sodden possessions?

Anchoring refers to an individual’s or community’s starting point for assessing risk. Usually people start from a particular value that they believe is associated with a particular type of risk or event and then adjust their estimation of the risk (or its seriousness) in the light of further information. The adjustment will, however, always be in relation to that initial starting value. In other words, for the same physical risk or event, two individuals, one with a low initial estimate, the other with a high one, will interpret any further information about the risk or event in the light of their initial starting values. After the event, it is likely that both will have moved from their initial estimates, but the person who started low will still have a lower estimation of the risk than the person who started high.
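As a toy numerical sketch (my own illustration, not a model from the heuristics literature), anchored adjustment can be written as moving an estimate only a fraction of the way toward each new piece of evidence. Two observers seeing identical evidence then converge toward it, but the one who anchored low always stays lower:

```python
# Toy model of anchoring: each observer starts from a different anchor
# and only partially adjusts toward each new piece of evidence.

def adjust(estimate, evidence, rate=0.3):
    """Move an estimate a fraction of the way toward the evidence."""
    return estimate + rate * (evidence - estimate)

evidence_stream = [60, 60, 60]  # the same 'seriousness' signal for both observers

low, high = 20.0, 90.0          # two different anchor points
for e in evidence_stream:
    low = adjust(low, e)
    high = adjust(high, e)

print(round(low, 2), round(high, 2))  # both drift toward 60, but low stays lower
```

The adjustment rate and anchor values are of course invented; the point is only that under partial adjustment the starting value never fully washes out.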

Once again, the two recent disasters of the BP oil spill and the floods in Pakistan can be interpreted as examples of anchoring. How do you judge the impact of the BP oil spill? Initial estimates by the company and environmental groups varied: BP tried to downplay the incident, while some environmental groups proclaimed nightmarish scenarios for the future of the Gulf. As the event has unfolded, how has each side changed its rhetoric? BP has slowly admitted the spill was worse than initially thought, at least in terms of the amount of oil released into the ocean. Images of environmental annihilation of the Gulf have not emerged. So do you adjust your assessment of the damage wrought by the oil spill up or down as evidence and opinion have accumulated? Does it depend on where you started – as a committed environmentalist or as a company supporter? Does it really matter where you start; doesn’t the evidence speak for itself? Evidence is always interpreted, so these heuristics are important.

Aid for the floods in Pakistan may have suffered from an anchoring effect. The areal extent of the disaster is huge and the impact and suffering caused by the floods are both massive and real, but the initial death toll seemed minor in comparison to other disasters in recent memory, such as the Haitian earthquake or the Boxing Day tsunami. It may be simplistic, but impact and death toll may be related in people’s minds, and a low death toll anchors the flood disaster relatively low down in a mental pecking order of recent disasters. Subsequent media coverage, celebrity appeals and governmental and UN urging for aid may all be interpreted in the light of this initial anchor point.

As an additional thought, what is your individual anchor point in the ongoing ‘discussions’ about the need to reduce public spending to clear public deficits? The debate seems to have moved beyond ‘do we need to?’; the only question now seems to be ‘how severely do we need to?’ Accepting the need is as much an anchor as setting an amount. I may be overly cynical, but if leaks suggest a 40% cut in the spending of a government department and a review finally recommends only 30%, then you can’t help but feel a little relieved that it is lower than you expected. Anchoring is a very strong tool in setting agendas, both for environmental issues and for politics in general.

Sunday, September 12, 2010

Measuring Resilience

The recent BBC-commissioned research by Experian on resilience in England (factsheets are available online) has produced some interesting results and highlights, or confirms, depending on your viewpoint, a clear north-south divide in terms of the vulnerability and resilience of areas to the current economic climate, in particular to the expected government cuts in spending. The northeast is identified as the most vulnerable or least resilient area, with explanations focusing on the number of people employed in public sector jobs, the history of closures among large traditional industrial employers such as iron and steel, and the lack of likely job creation in the private sector. The vulnerability of the northeast is contrasted with locations such as St Albans, where the entrepreneurial spirit and the knowledge-based industry focus are highlighted.

Any attempt to measure vulnerability and resilience is a good idea. Trying to put numbers to these issues is useful in debate and in identifying factors that may be important in the ability of an area to resist major events, which an economic downturn and a round of spending cuts are, even if they are economic and political rather than geophysical events. Care needs to be taken, however, to understand that these figures, or any figures that try to capture such a complex and slippery set of concepts, will never succeed in illuminating every conceptual facet. It should also be borne in mind that merely having the figures can begin to set the agenda for the debate by focusing it on the particular metrics used in compiling them. Any data collection has some model of reality behind it, some idea of what determines resilience and vulnerability, and so will always collect data in the light of that model. Experian have published their methodology in detail on the BBC website, as well as the Excel data files upon which the analysis is based. This is, in my view, an excellent idea, as it allows everyone to understand how the final indices are created and provides hints about the model of reality behind the numbers.

Experian view economic resilience as describing the ability of an area to withstand and respond to shocks in an external environment. Economic resilience is composed of four themes: business, community, place and people. Each is weighted differently in terms of its contribution to the overall index, with the business variables accounting for 50% and the other themes accounting for about 17% each. Each theme is described by a series of questions. Business, for example, contains questions such as: how strong is the local business base? Is it dependent on sectors that have been hit by recession? Are businesses dependent on local markets, or do they export? People contains questions about the age of the population, their jobs and earnings. Community contains questions about life expectancy, whether neighbours look out for each other, and long-term unemployment. Lastly, place contains questions on house prices, green space and GCSE attainment rates. Experian use 33 variables to create the index and do state that the variables used were dictated, at least in part, by the data available at local authority level.
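A minimal sketch of how such a weighted composite might be computed. The weights follow the 50/17/17/17 split described above (taken as 1/2 and 1/6 so they sum to one); the theme scores themselves are invented for illustration and are not Experian’s figures:

```python
# Weighted composite index: business counts for half, the other
# three themes for roughly a sixth each. Scores are on a 0-100 scale.
weights = {"business": 0.5, "community": 1/6, "place": 1/6, "people": 1/6}

def resilience_index(theme_scores):
    """Combine per-theme scores into a single weighted index."""
    return sum(weights[theme] * score for theme, score in theme_scores.items())

# Invented theme scores for two contrasting areas
st_albans = {"business": 80, "community": 70, "place": 75, "people": 72}
northeast = {"business": 40, "community": 65, "place": 55, "people": 50}

print(round(resilience_index(st_albans), 1))  # the heavy business weight dominates
print(round(resilience_index(northeast), 1))
```

Note how the 50% business weight means a weak business theme drags the composite down even where community and place scores are respectable, which is exactly the modelling choice worth questioning.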

The actual variables used are listed for each theme and range from insolvency rates and business density to the percentage of the population who are corporate managers, deprivation and crime rates.

This is the basic outline of the analysis and its methodology, so from the viewpoint of environmental geography what can be asked of and interpreted from this analysis? Firstly, is the unit of analysis used, the local authority area, the most appropriate for what they are trying to analyse, economic resilience? The local authority may be a unit for which there is lots of data, as it is an administrative unit, but is it the unit within which businesses operate? Do businesses locate within local authorities, or within business parks (maybe with incentives from local authorities), or where successful businesses, or businesses like them, are already located? Similarly, is the local authority really an appropriate area within which to judge community? Are communities this big? You probably know your own local authority area: do you think it is that homogeneous, are all parts of it really that similar, or are there ‘good’ and ‘bad’ bits? The appropriate unit of analysis is a continual problem in geography and, to be fair to Experian, they have taken a unit for which a lot of data is collected. This data is not collected for the type of analysis Experian have undertaken, so it doesn’t exactly match what they would like to be available, but virtually every survey has this problem as well. The important thing to bear in mind is that whatever spatial unit you select there will always be problems, and there will always be a geography below or above the unit you select. It is down to you to decide if these other geographies are more appropriate.

Secondly, the method of ranking local authorities for different themes and then combining ranks can be misleading. A small, fractional percentage difference between two local authorities may be the difference between being ranked 5th and 6th, whilst a 10% difference may result in being ranked 320th and 321st. You do not get any feel for the magnitude of the differences between areas. This is compounded once you combine ranks. So a drop from 1st to 25th in the business theme may reflect only a slight difference in the absolute, numerical values of the variables, yet is represented by a large number of places in the rankings, while a drop from 200th to 201st may represent the same absolute difference in the magnitude of the variables as separates the first 25 areas. The raw data is needed to assess how the rankings and absolute changes match up. It should be noted, however, that Experian do state that they have considered the correlation between variables in their analysis, another key issue in deriving a meaningful statistic.
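The point is easy to demonstrate with invented scores: a 0.01-point gap and a 10-point gap each cost exactly one rank place, so rankings alone carry no information about magnitude.

```python
# Ranks discard the size of the gaps between areas.
# Scores here are invented purely to make the point.
scores = {"A": 90.02, "B": 90.01, "C": 60.0, "D": 50.0}

ranked = sorted(scores, key=scores.get, reverse=True)
ranks = {area: position + 1 for position, area in enumerate(ranked)}

# A and B differ by 0.01 points; C and D differ by 10 points.
# Both pairs are separated by exactly one rank place.
print(ranks)  # {'A': 1, 'B': 2, 'C': 3, 'D': 4}
```

Combining such ranks across themes then adds numbers whose scale has no fixed relationship to the underlying data, which is why the raw Excel files matter.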

Thirdly, what do the variables used tell us about the type of economy that Experian view as resilient and as vulnerable? The business variables depict resilience as being determined by a strong private sector, one that is not reliant upon its local market for its existence. Innovation, a knowledge base and the ability to draw on a reserve pool of skilled labour seem to be other important determinants of a strong business theme. Does this match your image of a strong, contemporary economy? What about the provision of infrastructure – or is that down to the private sector as well? Public services are viewed as a weakness in the local economy, despite the need for such services to support the private sector. If the aim is to convert public to private, then areas where private enterprise is strong are likely to be the areas that rapidly take on public services once aspects of them are privatised. Likewise, the people variables focus on levels of qualification and the share of corporate managers in the population, once again highlighting the potential for a privatisation of the public. The focus on knowledge-based services may also suggest a preference for small firms as drivers of the local economy rather than large employers.

Fourthly, the community and place variables once again have a model of what is ‘good’ and ‘bad’ behind them. Indices such as long-term unemployment, claimant count and income vulnerability (likely to affect the lower paid, but increasingly the middle incomes as well) all point to a relatively inflexible workforce in terms of skill development relevant to the sectors identified as important in the business variables, e.g. knowledge-based services. Place variables are a more mixed bunch, ranging from educational attainment (related to skills again) through to crime rates and house prices. Do you feel these variables really reflect the feelings and strengths of communities and their networks of relations? The variables seem very static and unable to grasp the dynamic nature of such relations.

Overall, the Experian map is a great starting point in the debate over resilience and vulnerability to economic events and, potentially, other events as well. But it is only a starting point and like any analysis it is limited by the data available and will have underlying assumptions about what is ‘good’ and ‘bad’ based on a particular view of how the economy functions or should function. At least there is now a basis for discussion but the survey should not become an anchor point that sets the agenda without question.

Friday, September 10, 2010

BP Oil Spill: Content of the Accident Investigation Report

In my previous blog I looked at the context of the report; in this blog I want to look at its content. The investigation identified eight key causes that combined to produce the incident. The eight are:
  • The annulus cement barrier did not isolate the hydrocarbons
  • The shoe track barriers did not isolate the hydrocarbons
  • The negative-pressure test was accepted although well integrity had not been established
  • Influx was not recognised until hydrocarbons were in the riser
  • Well control response actions failed to regain control of the well
  • Diversion to the mud gas separator resulted in gas venting onto the rig
  • The fire and gas system did not prevent hydrocarbon ignition
  • The blowout preventer (BOP) emergency mode did not seal the well
For each of these causes some articles assign blame to BP and its contractors (e.g. the BBC and Guardian reports). But how did the team investigate the causes within their terms of reference (TOR)?

Appendix I of the report outlines the method used: fault tree analysis. Fault tree analysis (FTA) is a standard method of analysing technical failures of systems, using Boolean logic to combine a series of lower-level or prior events. Originally developed in 1962 to analyse ICBM launch control systems, the analysis starts with the undesired event at the top of the tree, breaks the possible causes down into subsystems and assesses how these prior causes or initiators could arise. The analysis relies upon experts being able to identify how subsystems and their components fail and how these failures can build up to produce the top event, the undesired event (Figure 1). Once identified, each subsystem can be analysed to assess if it was likely to be the source of failure in the cascade that results in the top event. Failure of a lower subsystem can be prevented from producing a cascade of failures up to the top event if some intervening subsystem does not fail. The number of possible ways failure can occur increases as the number of subsystems increases. As the number of subsystems reduces towards the top event, providing fail-safe systems becomes increasingly important, as there are fewer and fewer pathways to failure. The process of creating and analysing a fault tree is systematic and logical and, where information is available, probabilities can even be assigned to specific events within the branches of the tree so that a likelihood of an incident can be calculated.

Figure 1 Illustration of Fault Tree Analysis
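A fault tree of this kind reduces to Boolean logic: an OR-gate fails if any input fails, an AND-gate only if every input fails. The miniature tree below is my own illustrative structure, loosely echoing the causes listed above; it is not the tree from the report:

```python
# Minimal fault tree: basic events are booleans (True = that component
# or procedure failed), combined upward through gates to the top event.
# The structure is invented for illustration, not taken from the BP report.

def or_gate(*events):
    return any(events)   # gate fails if ANY input has failed

def and_gate(*events):
    return all(events)   # gate fails only if ALL inputs have failed

# Basic events
cement_barrier_failed = True
shoe_track_failed = True
pressure_test_misread = True
bop_sealed_well = False          # the last-ditch barrier also failed to act

# Intermediate events
hydrocarbons_in_well = and_gate(cement_barrier_failed, shoe_track_failed)
influx_undetected = or_gate(pressure_test_misread)

# Top event: the blowout requires every defensive layer to fail
blowout = and_gate(hydrocarbons_in_well, influx_undetected, not bop_sealed_well)
print(blowout)  # True; set any barrier back to working and it becomes False
```

Ruling out a branch, as the investigation did for parts of Figure 2, amounts to fixing one of these basic events to False on the evidence and seeing whether the top event can still be reached.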

Within Appendix I there are four fault trees, based on the four critical factors identified by the investigation team: well integrity; hydrocarbons entering the well undetected and loss of well control; hydrocarbons igniting on Deepwater Horizon; and the blowout preventer not sealing the well. These four fault trees are the focus of the investigation and are the context in which all evidence is collected and evaluated. The team assigned each box as a possible contributing factor to be investigated, designating, where possible, whether the box represented a ‘possible immediate cause’ or a ‘possible system cause’. Simplistically, ‘possible immediate cause’ can be equated with mechanical or technical failure, whilst ‘possible system cause’ can be equated with failures of communication, human mistakes of interpretation and procedures. In addition, within each box there was either a reference to a specific section of the report for further discussion, a statement that evidence ruled out that cause, or a statement that the evidence was inconclusive for that cause.

Figure 2 Illustration of branches of fault tree associated with well integrity

Figure 2 illustrates a subsection of the fault tree for well integrity. This subsection of the FTA shows that more details are available in the appropriate section of the full report, but for this branch of the fault tree all the possible causes can be ruled out based on the evidence collected. Figure 3 shows the end branches for a section of the fault tree, and in this case the interaction between ‘possible immediate causes’ and ‘possible system causes’ illustrates that there is not a simple answer of either mechanical or system failure, but more likely a complicated combination of both as you analyse the branches. These figures are chosen not to point to the most important cause but rather to illustrate the reasoning behind the conclusions and recommendations of the investigation team.

Figure 3 Illustration of end branches of fault tree showing possible immediate and possible system causes

The investigation team used the Swiss-cheese model to illustrate how the four critical factors and eight causes were related (Figure 4). The barriers are the defensive physical and operational barriers that were meant to prevent an incident. Although the figure makes the key relationships easier to understand, it does not show the intricate web of relations that tied all the actants, physical and human, together in the complex system that produced the event. The figure shows neither the web behind the barriers nor how the barriers are defined and set up in the first place. As I said in an earlier blog, experience tends to influence what is seen as important for operation and for prevention; a new incident can alter this perception and so alter what is regarded as important for different barriers, and may even identify new barriers to consider in new environments or contexts. Many of the recommendations made are aimed at improving the links and flow of information between the human actants in the system, to ensure that information derived about the physical actants, such as well pressure, is interpreted in a consistent and appropriate manner and that it is clear what actions should be taken and when. Likewise, the investigation highlighted that information flows about the state of these actants, such as the condition of critical components in the yellow and blue control pods for the BOP, need to be improved so that the components are maintained at the standard required for them to operate correctly.

Figure 4 Illustration of Swiss cheese model of hazards analysis based on Deepwater Horizon report
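If, as a crude simplification, each barrier is treated as failing independently with some probability, the Swiss-cheese picture implies that the chance of an incident is the product of the individual failure probabilities: tiny while the barriers are sound, but quick to grow as they degrade together. The probabilities below are invented, and independence is itself a strong assumption the model glosses over:

```python
# Swiss-cheese sketch: an incident occurs only when the holes in every
# defensive barrier line up. Assuming independent barriers (a strong
# assumption), multiply the individual failure probabilities.
barrier_failure_probs = [0.1, 0.05, 0.2, 0.1]  # invented values

p_incident = 1.0
for p in barrier_failure_probs:
    p_incident *= p
print(p_incident)  # roughly 1 in 10,000 while the barriers are sound

# Let maintenance slip so every barrier is twice as likely to fail:
p_degraded = 1.0
for p in barrier_failure_probs:
    p_degraded *= 2 * p
print(p_degraded / p_incident)  # doubling four barriers multiplies risk 16-fold
```

The multiplicative structure is the point: small, unremarkable degradations in several barriers compound, which is one reason the recommendations dwell on maintenance and information flow rather than any single component.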


BP Oil Spill: Accident Investigation Report

BP released the report of its internal investigation team on the Deepwater Horizon accident on 8th September 2010. Media coverage of the report has made much of the alleged attempts to divert blame for the accident onto other companies, the contractors involved in the operation and maintenance of the oil rig (e.g. ‘Who’s blamed by BP for the Deepwater Horizon oil spill’, ‘BP oil spill report: the Deepwater Horizon blame game’ and ‘BP oil spill: US reaction to the BP report’). Robert Peston, the BBC’s business editor, even dubbed BP as standing for ‘Blame Placing’. This blog looks at the report in context; a second blog will deal with the findings themselves.

The report itself is at pains to point out its own limitations. The second paragraph of the executive summary, for example, states:
“In preparing this report, the investigation team did not evaluate evidence against legal standards, including but not limited to standards regarding causation, liability, intent and the admissibility of evidence in court or other proceedings.”
Deepwater Horizon Accident Investigation Report: Executive Summary, 2010, p.2.

The report notes it had to work with the information available to it and draw interpretations from sometimes contradictory, unclear and uncorroborated evidence using the ‘best judgement’ of the team, from which others might draw different conclusions. The report even finishes with a section on what the team could not analyse. So is the report a PR exercise, an attempt to deflect blame, or a genuine attempt to provide some rapid, informative answers to the questions about what caused a major environmental disaster?

The report needs to be considered in the light of how industries operate in the contemporary economic environment. I am not concerned with legal definitions of responsibility, nor do I intend to get into such a debate, as I am sure heated deliberations will ensue once money comes to the fore. Robert Peston’s blog is useful in illustrating the ‘hollowing out’ aspect of modern large companies such as BP. Companies no longer do everything; contracting out aspects of their industry that they are either not good at, or that other companies can do better or more cheaply, has become a common practice. BP may be an oil company, but it does not undertake every aspect of the oil industry in house.

Robert Peston’s blog provides a good analogy of a dodgy chicken tikka masala bought from a supermarket. If you are ill after eating the meal, do you blame the supermarket or its contracted manufacturers? He states that most people would hold the supermarket accountable, although the contracted company may have had the sloppy hygiene standards that produced the dodgy meal. He does point out that BP were the named party on the relevant oil lease and so assumed to exercise sufficient oversight.

My view is that the example is a little too simplistic to grasp the complexity of relationships that define a modern business enterprise. Imagine instead that you want to get to work every day to do what you are good at. You are not good at driving, nor do you want the expense of owning a car, so you contract out both, hiring a driver and leasing a car. You specify that you need a driver who can take orders and a car that is reasonable for your status. You tell the driver you leave at 08:10 and must be at work at 08:30. Everything seems to run smoothly: the driver is well turned out, the car is comfortable and you get to work on time. One day there is an accident as the car overturns taking a corner – who is to blame? It may seem simple: the driver is to blame, he was driving – he is the person immediately, obviously involved in the accident, its cause. BUT you specified the time; he has to drive to ensure you get there on time. Is it the pressure you put him under that caused the accident? Further investigation points to some mechanical problems with the brakes. Not enough on their own to cause the accident, but a possible contributing cause. The car is maintained by the leasing company, who are good at leasing but not at maintenance, so they contract that out. But you specified only a standard maintenance contract; you didn’t specify that there would be undue wear on the brakes, as you don’t drive and so don’t know how different driving styles affect brake wear. The subcontractor states it is nothing to do with them, as they maintained the car to the standard specified. Where does the cause lie? With mechanical problems, with your communication with your contractors, with your ability to specify exactly what you require, or with your understanding of the context?

I hope you can start to see the problem. Such a complex web of relationships requires careful and thoughtful planning and overseeing. Relations and specifications need to be established carefully and maintained. Importantly, you may not realise there is a problem with the relations or specification until there is a problem. The problem itself highlights the errors, by which time it is too late. This does not absolve you of blame it just shows how difficult it is to pin down exactly who or what is the cause. Causation and blame may be different things entirely.

It is within this context of devolved tasks that the investigation team undertook the report. Central to this report, in fact any report, are the terms of reference (TOR), found in Appendix A. The scope of the report is defined as finding the facts surrounding the uncontrolled release of hydrocarbons, and the efforts to contain that release, aboard the Transocean drillship Deepwater Horizon. More specifically, the team was to determine the actual physical conditions, controls and operational regime related to the incident in order to understand a) the sequence of events, b) the reasons for the initial release, c) the reasons for the fire, and d) the efforts to control flow at the initial event. As well as establishing a timeline for the event itself, the team were tasked to describe the event and identify critical factors, both immediate causes and system causes. As with any TOR, the terms are narrower than you might want if trying to understand the event in its totality and, as is common with such an event, the key focus is on the technical and procedural. The team are not tasked to apportion blame within their TOR; they are merely seen as reporting ‘the facts’. Clearly, ‘the facts’, as in people’s actions and recollections, depend on what the team are told and upon who tells them, and on what hidden agendas each person might have. Instruments and equipment, where available, tell another set of stories, which may at first seem more objective but, once different experts begin to interpret the information, may become almost as ambiguous as the recollections of fallible humans.

The focus on the initial release and the events leading up to the explosion of necessity spotlights the actions of individuals in the decision making at that time. Despite this, a number of issues concerning equipment, maintenance and instructions are highlighted as requiring improvement suggesting that systemic factors may be more important. In other words, the communication and relations between companies is as much at the heart of the event as the faulty decisions made at the time.

Interestingly, the investigation team had five specific terms of reference associated with administration, including the sanctioning of all activities by a team leader, the requirement for a BP person to be present at each interview, and a requirement that no questions or tasks be put to BP contractors without BP approval. The impact of such administrative arrangements on the nature or scope of the questions asked is not discussed. How are these administrative requirements to be interpreted? As a standard implementation of policy in such investigations, as a check on the team adhering to the TOR, or as ensuring the TOR were clarified to the team when required? Your interpretation may depend on the degree of belief or trust you have in the internal report in the first place.

The complexity of the task of assigning causation and blame is highlighted by the team in the Executive Summary:
'The team did not identify any single action or inaction that caused this accident. Rather, a complex and interlinked series of mechanical failures, human judgments, engineering design, operational implementation and team interfaces came together to allow the initiation and escalation of the accident. Multiple companies, work teams and circumstances were involved over time.'
Deepwater Horizon Accident Investigation Report: Executive Summary, 2010, p.5.

But why produce and release the internal report to the public now? There are other reports in the pipeline, not least the official report into the incident that will presumably form the basis for blame, responsibility and, one would assume, compensation claims. BP may be trying to show themselves as a responsible company, but there is also the possibility that they are putting the report out there as a marker, an anchor for further reports. Whatever the status of the BP internal report, it is now known and available; it provides information and interpretations against which any other report will be compared. BP have provided an anchor, a starting point for expectations. Other reports will need to refer to it, to agree or disagree with it, to confirm or reject its findings and assertions. BP might not have defined the agenda for the debate over responsibility that will develop, but they have defined the starting points and details that all other reports will have to cover; not a bad start to agenda setting.

Friday, September 3, 2010

Haddon Matrix and Hazardous Events

Looking at hazards in different ways, through different conceptual frameworks, is always useful as it tends to make you think about things, however slightly, in a different way. A framework often used in injury prevention, road accident research and public health is the Haddon Matrix, devised by William Haddon in the 1970s for use in road traffic accidents. The basic matrix is divided into 12 cells. The rows are defined by the temporal phases of the event: pre-event, event and post-event. The columns are defined as 'host' (you could rethink this as 'the individual'), 'equipment' and two for environment: one 'physical', one 'social'. The idea is to fill in each of the cells with the key aspects that will influence, or did influence, the hazardous event. Effectively you are playing out different scenarios and filling in the cells depending on what factors you see as significant in each scenario. The framework forces you to deal systematically with the nature of the hazard and how it might play out in reality.
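The 12-cell structure described above can be sketched as a simple data structure. This is a minimal illustration only; the road-accident entries below are my own invented examples, not Haddon's.

```python
# A minimal sketch of the Haddon Matrix as a nested dictionary:
# three temporal phases (rows) by four factor columns, each cell
# holding free-text entries for a given scenario.
PHASES = ("pre-event", "event", "post-event")
FACTORS = ("host", "equipment", "physical environment", "social environment")

haddon_matrix = {phase: {factor: [] for factor in FACTORS} for phase in PHASES}

# Fill in a few cells for an illustrative road traffic accident scenario.
haddon_matrix["pre-event"]["host"].append("driver impaired by alcohol")
haddon_matrix["pre-event"]["social environment"].append("norms tolerating drink-driving")
haddon_matrix["event"]["equipment"].append("no seatbelt worn, airbag absent")
haddon_matrix["post-event"]["physical environment"].append("distance to emergency care")

# Print the populated cells, row by row.
for phase in PHASES:
    for factor in FACTORS:
        for entry in haddon_matrix[phase][factor]:
            print(f"{phase:10s} | {factor:20s} | {entry}")
```

Playing out a different scenario simply means clearing the cells and filling them with the factors judged significant for that event.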

The example usually provided is for road traffic accidents, but the basic idea translates to other types of hazard. In a crash, the condition of the individual before the crash may be important: each individual will have different characteristics that could matter, and each can be included as appropriate. Similarly, different aspects of the equipment will be important depending on the nature of the crash, and so some factors may not become clear until after the event. The environmental factors seem more diffuse; they provide a context that, for certain types of individual behaviour and certain equipment failings, produces an environment conducive to a hazardous event. Importantly, despite the description and division of the event into these separate cells, the content of each cell depends upon the relationships between host, equipment and environment. For example, the social norms that permit drink-driving would not matter had the host not been drinking and had they worn a seatbelt. The poorly designed fuel tank only becomes significant when the drunk driver crashes, and so on.

This framework does have its limitations. The recognition of important factors can be so wide-ranging as to be useless for planning if extreme scenarios with infinitesimal probabilities of occurring are considered. On the other hand, it may not be until the event happens that it becomes clear which factors are important. The matrix will probably be of most use when similar hazardous events are being considered, as similar events would be expected to have roughly similar important factors. The matrix can also be used to identify where particular factors are not relevant. In a pile-up on a foggy motorway, for example, the detailed life history of the individual in the second car may have no bearing on their survival; it is the general physical conditions that are of over-riding significance. Equipment factors, such as airbag installation and age of car, may have an impact, however. In other words, the matrix might be useful for exploring the topographies of different hazards or disasters: the nature or shape of the hazard, which factors dominate that landscape and which are incidental 'bumps' on the terrain (please excuse the landscape metaphor, but I am a physical geographer!)

Something useful might be gained by overlaying the matrix with Reason's Swiss cheese model, outlined in an earlier blog. The matrix framework helps to identify the factors that might be important at each stage; the Swiss cheese model identifies whether a particular trajectory of factors lines up to produce a disaster. The matrix identifies the possibles; the Swiss cheese model, whether these possibles are important in combination. In the case of the BP oil spill, for example, the Haddon matrix could be used to identify key pre-, during- and post-disaster factors, such as the alleged failure in safety procedures and lack of disaster planning. The trajectory arrow of the Swiss cheese model can then be used to assess whether one failure affects the next layer, whether one failure or factor lines up with another to produce the cascade of errors that results in a disaster.
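The 'lining up' logic of the overlay can be sketched in a few lines. The layer names below are illustrative, loosely echoing failures alleged in the BP spill; this is a toy model of the idea, not an actual accident analysis.

```python
# Each defensive layer identified via the Haddon matrix is checked for
# an open "hole" (True = that defence failed). The Swiss cheese
# trajectory only completes when every layer has failed.
layers = {
    "well design": True,
    "cement integrity test": True,
    "kick detection": True,
    "blowout preventer": True,
}

def trajectory_completes(layers):
    """The hazard passes through only if all holes line up."""
    return all(layers.values())

print(trajectory_completes(layers))  # every hole open: the disaster occurs

# Close a single hole and the trajectory is blocked.
layers["blowout preventer"] = False
print(trajectory_completes(layers))  # one intact defence stops the cascade
```

The point of the combination is that the matrix populates the candidate failures, while the trajectory test asks whether they actually coincide.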

Some potentially useful books for assessment of hazards of injury are:

Injury Prevention in Children by David Stone (2011)

Injury Control: A Guide to Research and Program Evaluation by Rivara et al. (editors) (2009)

Injury Epidemiology: Research and Control Strategies by Leon Robertson (2007)


Risk is a tricky thing to pin down and, as with most things in hazards analysis, open to a wide range of interpretations. A useful resource for discovering just how open to debate this term is is John Adams's website. We all encounter risk every day. The financial markets have just collapsed under the weight of risks. We drive along the motorway aware of the risk posed by other drivers (at least I hope everyone else does as I do!). We weigh up the risk to our health, and to the length of our life, of another drink, another cigarette, another burger or pie. Or do we?

Risk can seem such an easy thing to define. You can work out the probability of something occurring: the probability of dying from smoking a specific number of cigarettes per day, the probability of a specific amount of alcohol per day giving you cancer, the probability of contracting cancer given a specific level of exposure to radiation. The trouble is, people often act as if they don't know these probabilities exist.

Risk can be defined accurately, mathematically and scientifically using statistical analysis. Risk can be defined as the chance of a particular, defined hazard or event occurring. If you know the frequency of occurrence of particular levels of flow in a river, then you can work out the probability in any one year of a flow of a given magnitude. Leaving aside the problems of how long the record of flows needs to be to be representative, how well extreme events are represented in that record, and many other factors, the key point is that, in theory, risk can be calculated from such records. Risk can be given a number: a fixed value that informs people what they should do. But why should risk bother you? Risk only becomes important because you feel you might have something to lose. Risk can only be defined in relation to loss, and so only within a context of fear of loss. It can be rendered as a simple equation:

RISK = [Hazard (probability) x Loss (expected)]/ Preparedness (loss mitigation)

You can see how each part of the equation could be given a number. The hazard comes from scientific analysis of the geophysical nature of the hazard, or rather of the probability of the hazardous event. Loss can be calculated as the amount of money you would need to replace what you could lose if the event happened. Preparedness is trickier, but might be approximated by how much you can pay to insure against your loss from that hazardous event. But is this all risk really is?
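Putting numbers to each part of the equation can be illustrated with an invented case: a homeowner on a floodplain facing a 1-in-100-year flood. All the figures below are made up for the sake of the arithmetic.

```python
# RISK = (hazard probability x expected loss) / preparedness,
# with invented figures for a floodplain homeowner.
return_period_years = 100                       # a "1-in-100-year" flood
hazard_probability = 1 / return_period_years    # annual exceedance probability = 0.01
expected_loss = 200_000                         # replacement cost if the flood strikes
preparedness = 2.0                              # crude index: insured, with a flood plan

risk = (hazard_probability * expected_loss) / preparedness
print(risk)  # 1000.0 -> an expected annual loss, discounted by preparedness

# The flow-record point above: a 1-in-100-year event is far from rare
# over a long exposure. Chance of at least one such flood in 30 years:
p_over_30_years = 1 - (1 - hazard_probability) ** 30
print(round(p_over_30_years, 3))  # roughly 0.26, about a 1-in-4 chance
```

The arithmetic is trivial; the difficulty, as the post goes on to argue, lies in whether any of these numbers capture what risk means to the person holding them.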

There are other ways of looking at risk.
  • Risk can be defined as a real thing, out there, and so subject to scientific and mathematical analysis and calculations that are common across experts.
  • Risk can be seen as a cultural and social phenomenon, created by the society we live in and so subject to change as that society changes.
  • Risk can be defined legally, as a responsibility or a failure of expected conduct.
  • Risk can be defined psychologically, as a set of behaviours and understandings about the world.
  • Risk can be defined within the humanities, as an emotional phenomenon and as a story or narrative.

    Each of these definitions illuminates different aspects of risk and may ring true for individuals in different circumstances. When watching news reports about the floods in Pakistan, for example, I am seeing risk as a story or narrative dictated by the media and its beliefs about how I expect the disaster to unfold. Never underestimate the tight constraints of such storylines in affecting how we see things. Risk of injury on a building site could be viewed through the lens of legal definitions of risk and responsibility. The reactions of individuals to flooding and flood risk could be viewed through the lens of psychology. Some people believe in the risk and insure; others don't and save their money. Are the first group risk-averse and the second risk-takers, or is it more complicated than that?

    There is another set of ways of defining risk, which can be used singly or in combination.

    • Real: the calculation approach as above, plus objective below.
    • Objective: risk is real, a thing, and it is out there for us to study and quantify.
    • Observed: risk we can measure given our particular view of the world (and given it is real and objective).
    • Subjective: risk is about the mental states of individuals, who are only human and so plagued by fear, worry, uncertainty and doubt.
    • Perceived: a subjective estimate of risk by an individual or group.

    I would argue that all risk is perceived: risk tends to be defined by the judgements of people, singly and in groups, based on their application of some knowledge or information about the uncertainty involved, where this knowledge or information is objective, observed or subjective. When we believe or perceive the risk to be generated by some real, physical phenomenon, then we can measure that phenomenon and calculate risk. This does not mean others will share our view of the world as objective, nor our view of risk as something objective.

    What this means is that the perception and belief of risk varies from individual to individual, from group to group, from place to place and even from event to event. Trying to model or generalize about the actions of individuals in the face of risk is difficult but in future blogs I hope to present some models and general ideas about how people have tackled this complicated problem of understanding how people perceive and react to risk.

Hazards: The Complexity Approach

I was at a conference on water and risk at the start of the year. A very distinguished professor had delivered an extremely interesting talk on a key concept he had taken decades to get across to policy makers involved in development. As the presentations went on he became increasingly concerned and agitated that researchers should realise that they had to get their ideas across to policy makers who were not well versed in either the details of academic debate or the intricate nature of conceptual frameworks. He questioned a final year postgraduate about what the major new concept in hazard and risk was. Without hesitation the postgraduate replied ‘complexity’. The distinguished professor paused, audibly drew in a long breath and said ‘God help us!’

Complexity is, as the name implies, neither the easiest concept to get across nor the easiest one to illustrate. Part of this fogginess is because the concept is still evolving within hazard analysis. Fixed and clear definitions of what it is and how to use it are still in their infancy and still subject to intense academic debate (although a useful discussion of the concept is given in Smith and Petley, Environmental Hazards, 2009, Routledge). Different researchers from different fields converge on a particular disaster, and each applies their own view and meaning of complexity to the analysis of that disaster. So what follows is a partial interpretation of complexity thinking and hazards, but one that I hope will nonetheless provide a flavour of how a new concept is starting to mesh with and enhance hazards analysis.

Complexity theory is borrowed, as most geographical concepts are, this time from physics and mathematics, where it evolved from a detailed, equation-based theory into something that even geographers could begin to understand. The central idea is that a system of components operating together produces some output. This may not sound dramatic, but it is the type of output that is a little unexpected. Traditionally, it has tended to be assumed that you can understand something better if you pull it apart and study each component one by one, individually. Once you have a detailed knowledge of the components, then you have a detailed knowledge of the system: simplify the system to understand it. This is a highly reductionist view of reality and of how you go about studying it. To understand a car you dismantle it, study each component in great detail, then put the parts back together, and you understand the car. Even with my limited mechanical knowledge, I can see this will not work! Complexity is a brake on this view of simplifying reality in order to study it.

Complexity recognises that real systems are complicated and intricate networks of components acting together in a variety of ways. Simply studying one component, or even a small group, does little to help us understand how the system really works. It is the interactions, the relations, driving the system that produce the emergent behaviour we observe and try to study. In complexity theory the bits of the system, the actual components, are still vital: without them the system would not exist. But to understand the system, to grasp how it works, it is the interactions, the relations and their changes, that must be understood. From these interactions there does tend to emerge some predictable overall system behaviour. Sometimes, however, change the relations and the output can alter in unexpected and unpredictable ways. In this view, hazards and disasters occur not necessarily because of one factor but through the combination and complex interaction of a number of factors.

My earlier blog on the BP oil spill and the Swiss cheese model of hazards could be seen as an illustration of complexity in action. In this model it is the interaction between specific 'holes' that results in the incident occurring; without this interaction there would be no incident, no explosion, no oil spill. This model is only one means by which hazards can be understood, however. My earlier blog on the ash cloud again focuses on interactions, this time amongst a group of actants, to begin to build an understanding of how the system evolved and how the hazard itself became defined.

The complexity approach is outlined, albeit briefly, in Smith and Petley's Environmental Hazards (2009).

Hazards and Vulnerability

Media reports and images are full of vulnerable people being struck by disasters. Film of families being rescued by inflatable boat in Pakistan has been a common staple of recent news reports. When Hurricane Katrina smashed into New Orleans, it appeared that the most vulnerable people in a developed country were being targeted by the disaster. It seems so clear, but what do we actually mean by vulnerable? Leading on from this question is another important one: if we can define vulnerability, does this help us take steps to ensure these people are not affected by such hazards and disasters?
In a previous blog (Floods in Pakistan: Vulnerability) I began to discuss the complex nature of any definition of vulnerability and illustrated some of the issues using that ongoing disaster. If you want a simple definition, then vulnerability can be defined as the potential for loss of life or property in the face of environmental hazards or environmental disasters (or indeed any hazard or disaster). Loss susceptibility is another term often used in relation to vulnerability. Other definitions include vulnerability as a threat to which people are exposed; vulnerability as the degree to which a system acts adversely to a hazard (whatever adverse might mean?!); differential risk for different social classes; the interaction between risk and preparedness; the inability to take effective measures; and the capacity of a group to anticipate, cope with, resist and recover from the impact of a natural disaster. There are others, and anyone interested in the range of definitions used should have a look at Susan Cutter's book, Hazards, Vulnerability and Environmental Justice (2006, Earthscan). A key point to bear in mind is that both the physical and human environments can be vulnerable. Physical systems can be fragile and susceptible to impacts as much as human systems; outlining how these can be studied together will be the subject of a future blog. This blog will focus on social vulnerability, the vulnerability of the human part of the equation, rather than physical vulnerability.
Some other terms borrowed from ecology also tend to be used when researching vulnerability. Adaptation refers to the ability of the actants in the socio-ecological system to find strategies to adapt to the hazard or disaster. Resistance is the ability of the actants to resist the impact of the hazard or disaster. Resilience is the ability of the system to absorb, self-organise, learn and adapt to the hazard or disaster. Useful resources on vulnerability can be found at the web pages of Neil Adger and at the Resilience Alliance website, a site looking at research into the resilience of socio-ecological systems and sustainability. As with most things borrowed, once you change the context the meaning changes as well, so the application and use of these terms does not necessarily match their original, potentially more limited, definitions in ecological research.
An important aspect of vulnerability is that it evolves; it changes as the nature of the disaster or hazard unfolds and as the people who are vulnerable respond and react to their situations. This also highlights the importance of scale in defining vulnerability. What scale is appropriate? The individual can be viewed as an important unit, but the individual usually operates within the context of a family or household, so is this a more appropriate unit for analysing vulnerability and resilience? What about larger entities such as communities and governments? As you change the unit of analysis, would you expect the different units to have the same type of vulnerability, the same ability to resist, or the same characteristics of resilience? Once these different spatial entities interact, such as in the provision of aid by a government to individuals, does this cross-scalar interaction affect vulnerability and resilience? In other words, what seems like a simple thing is very complex to unravel in detail.
At heart, vulnerability is about the differential ability or power to access resources by individuals and groups in society. To escape a flood you need the power or ability to get out of the area: you need a car, you need early warning, you need a friendly policeman to wave you through and protect you from the other people trying to escape on foot. These material things require resources, and access to them at the appropriate time. There are static and dynamic aspects to this access to resources. The static aspects of vulnerability might be capable of identification before a disaster strikes. At the simplest level, mapping socioeconomic groups gives an indication of the availability of funds to gain access to resources. Likewise, mapping similar census data, such as the number of lone parents, or age (the elderly and the young are less able to escape floods, for example), could also indicate the vulnerability of a place. A useful site that discusses such mapping, and has developed a specific means of measuring it, the Social Vulnerability Index, is the Hazards and Vulnerability Research Institute at the University of South Carolina, of which Susan Cutter is the Director.
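The mapping idea can be illustrated with a toy calculation: combine a few already-scaled census indicators into a single score per area. The areas, variables, weights and figures below are all invented; real indices such as Cutter's use many more variables and statistical, rather than assumed, weightings.

```python
# A toy static-vulnerability score: a weighted sum of census-style
# indicators (each pre-scaled to the 0-1 range) for invented wards.
areas = {
    # area: (% low income, % lone-parent households, % over 65)
    "Ward A": (0.40, 0.15, 0.25),
    "Ward B": (0.10, 0.05, 0.10),
    "Ward C": (0.30, 0.20, 0.30),
}
weights = (0.5, 0.2, 0.3)  # assumed relative importance of each indicator

def vulnerability_score(values, weights):
    """Weighted sum of the 0-1 scaled indicator values."""
    return sum(v * w for v, w in zip(values, weights))

scores = {area: round(vulnerability_score(v, weights), 3)
          for area, v in areas.items()}

# Rank wards from most to least vulnerable.
for area, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(area, score)
```

Mapping such scores highlights where access to resources is likely to be weakest before any disaster strikes; it says nothing, of course, about the dynamic aspects discussed next.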
There is also a dynamic aspect to vulnerability: the manner in which relationships are organised and the manner in which they change, through normal times and then during and after a disaster. Such relationships include flows through the transport infrastructure, a key aspect that appears to have failed during this disaster and which has dramatically affected the ability of government institutions to maintain an effective relationship with vulnerable groups. At a local level, however, is the transport infrastructure that remains intact sufficient for the local population to move to safety and then initiate the community-based activities that represent resilience at that level? Importantly, this dynamic aspect is concerned with pathways and relations, both physical, between locations and places, and social and emotional, between people and between individuals and organisations. From the above it is clear that trying to understand vulnerability also means trying to understand its geography: how it varies in space and time, and how people succumb to, adapt to or try to overcome this geography.

Two useful books on vulnerability are:

Measuring Vulnerability to Natural Hazards: Towards Disaster Resilient Societies by the United Nations University (2007)

Hazards, Vulnerability and Environmental Justice by Susan Cutter (2006)