Problems of the kind encountered by the Deepwater Horizon and the Sedco 711 are not unique, and being judgemental with the benefit of hindsight is futile, argues drilling consultant Dr John Thorogood. Pointing out that there is much still to be learned about human error, he calls on the industry to raise its game around operations management.
Over recent months several books and articles have been written, or commentaries made, about the Deepwater Horizon disaster. They have either trivialised the nature of the well by saying it should have been ‘relatively simple'[1]; attempted to ‘prove' that the accident was an entirely predictable consequence of the commentator's interpretation of organisational culture[2]; criticised apparently obvious failings of risk assessment with the wisdom of hindsight[3, 4, 5]; reduced the complexities of the event to a sound bite, ‘a breakdown of management oversight'[6]; or simply asserted that the industry has a ‘strong' safety culture[7]. Together, they create an impression that what happened was an exceptional event, confined to one particular operator and not representative of the industry as a whole.
There is, however, another side to this story, namely that what happened may be a predictable consequence of the complexity of the systems that the industry itself has created. With one recent exception[8], this is a question that the industry has not generally acknowledged and, until it does so, there is no assurance that a similar event will not happen again.
All of these critical comments are made with the benefit of knowledge of the outcome. As Woods et al explain in Behind Human Error[9], this hindsight bias is an inevitable consequence of outcome knowledge: it produces much more severe judgements of an event than would be accorded to a similar event with a less severe outcome. The lack of public reaction, and the apparent regulatory inaction, that followed an earlier similar incident on board the Sedco 711 in the North Sea is evidence of this phenomenon. In this instance, disaster was averted because the BOPs operated successfully.
It was fortunate that this blowout had not evolved to the same point as on the Deepwater Horizon because, according to the DNV forensic examination of the Horizon BOP[10], all the rams had functioned and were found closed upon disassembly. The authors concluded that the dynamic effects of the fluids in the well caused the pipe to buckle beyond the limits of the ram blades, thereby preventing complete closure and sealing of the blind shear rams. A failure to shear and seal, combined with an attempt to close rams on fluids flowing at high rates, a situation also not envisaged by the designers of the BOPs, created conditions under which a successful shut-in was probably impossible. To compound the good luck, the difference between the Sedco 711's 500ft of water and the Horizon's 5000ft ensured that there was little or no hydrocarbon inventory in the Sedco 711 riser to create the risk of an explosion.
Quick fixes
The commentaries referenced above play into a climate that is ripe for risk denial and yet demands quick fixes. They fail to recognise the complexity of the socio-technological systems that exist in our deepwater drilling operations. The Chief Counsel's Report[11] is inevitably biased with hindsight and qualifies its conclusions with regard to the BOP. However, it goes much further than the earlier National Commission Report to the President[12] in describing in detail the confusion and conflicts that those involved, both onshore and offshore, faced in the hours before the blowout.
The Chief Counsel's Report does not analyse or explain why the crew on the rig made sense of what they saw in a way that convinced them it was safe to continue displacing the well to seawater. The explanations and accompanying diagrams are couched in simple technical and managerial terms that make it hard to understand why the rig team didn't see what was apparently staring them in the face; they belie the complexity of the layout of the drill floor and the interplay of personalities, egos and experience. Yet this is precisely what happened, and nobody has thus far ventured an explanation as to why. In the closing sentences of Chapter 6, the report agrees that more regulation or more inspectors will not solve the problem.
However, the report is unwilling to ‘open the curtain' and look behind the label of human error, concluding instead that it was simply the absence of a culture of leadership responsibility.
Again, four months earlier on the Sedco 711, a successful inflow test probably created amongst the crew a sense of absolute, bomb-proof security about the well. Under these conditions, it is understandable that they might have overlooked some otherwise unremarkable operational inconsistencies prior to commencing the displacement to seawater. Had these been noted, they might have suggested that the formation isolation valve had been damaged and the barrier compromised. Naturally, when faced with unexpected volumetric behaviour of the well during the displacement, such a mindset would inevitably direct the search for explanations in precisely the opposite direction to the one that is now so obvious with hindsight. It is only with the benefit of hindsight, reflecting on the narrowness of the escape, that we wonder: ‘How could they have missed it?'
Simply stating that the successful pressure test ‘blinkered' the crew, without any deeper examination of the complexity of the situation, over-simplifies and trivialises the incident. Read in conjunction with the Chief Counsel's Report, it is clear that this failure to respond to weak and contradictory signals is not a rare event in our industry. Confirmation bias is one of the stronger influences on decision-making: operators see what they expect to see and disregard or overlook disconfirming information. It has been implicated in many other major accidents – Challenger, Columbia, Texas City and Gretley, to name but a few.
The sharp end
The natural reaction of engineers to such events is to regard the operator as the weak link. To avoid future repetition, perceived risk is mitigated with more rules, regulations, procedures and training. Contemporary research shows that the problem is much more complicated and that these remedies are generally ineffective. It is not simply the errors or omissions of the operators at the sharp end that matter. As Woods et al[9] explain, the contribution of those at the blunt end of the system has far greater influence. The climate created by regulators, politicians, the media and the public; the mixed messages from senior management that create goal conflicts between efficiency and safety; and the contradictions that stem from attempts to comply with over-prescriptive or impractical procedures together create the conditions under which human error is almost inevitable. The reason there are so few catastrophes is the ability of the practitioners at the sharp end to reconcile these conflicting demands and avoid the traps set for them. As Reason[13] observes, the operators are often the heroes because their adaptations, improvisations and compensations retrieve troubled systems from the brink of disaster. Unfortunately, the operators do sometimes fail. But the failure is not individual; it is collective, and evidence of weakness in the system.
In the face of such complexity, and in the absence of a detailed enquiry to develop a deeper understanding of the error-producing conditions that exist on our rigs and in our operations, it is naïve to pretend that a different leadership culture, more risk assessment blessed with second sight, restrictive rules, vigorous regulation, rigorous training and strict certification will solve the problem. Such a position is at variance with our understanding of the nature of human error, and the evidence of the events on the Deepwater Horizon and the Sedco 711 reinforces this conclusion.
In the short term, doing nothing is not an option, and there are some things that the industry could beneficially do. Responding to the analysis presented in the Chief Counsel's Report, and drawing on parallels with established practice, for example the training of offshore installation managers (OIMs) and their deputies in the UK sector in controlling emergencies, a useful first step might be to codify and enforce through regulation:
1. Selection, training, assessment and certification for all those in positions of authority and control of wellsite operations both onshore and offshore. Periodic refresher training and recertification should run in parallel with well control certification.
2. Standardised organisation structure, procedures and protocols for controlling operations.
3. Compliance with a practice of detailed planning and the associated discipline of adherence to the plan, with deviations tightly controlled.
4. Standardised operational management of change procedure with clearly defined responsibilities for decision-making.
5. Training in the non-technical skills[14, 15] such as leadership, teamwork, communication, decision-making, situation awareness and stress management combined with extensive scenario-based exercising.
While these actions are a necessary first step to consolidate what is, essentially, current best practice, they do not begin to get at the underlying problem of human error and complex systems. To make progress, it might be helpful to concede that we don't fully understand how human error works in our environment. Having acknowledged the limits of our knowledge, we should do the research needed to gain insights specific to human error in the drilling domain.
Avoiding the pitfalls
Drawing on the experience of other high-hazard industries, such as nuclear, aviation and chemicals, we must develop a similar understanding of the pitfalls inherent in our own operations by identifying, training and assessing the relevant non-technical skills.
With the resulting knowledge, we can also train our people to recognise the domain-specific traps that we set for ourselves.
We can then ensure that our management systems, risk assessment methods, planning procedures and protocols for operational control are designed with the resilience needed to avoid the pitfalls. This work will take time and there is no time to lose.
It is to be hoped that the new US Ocean Energy Safety Advisory Committee, recently announced by US Secretary of the Interior Salazar to guide oil and gas regulatory reform, will pick up on this agenda and drive it forward.
Inaction will ensure that at some time in the future, history will repeat itself. Hubris leads to nemesis. We must move quickly to get beyond the label of ‘human error'. OE
About the Author
John Thorogood is an independent consultant following a 34-year career with BP in drilling operations, technology and exploration project management. In 2002-2003 he undertook research with the University of Aberdeen Department of Psychology and published work on drilling teams, decision-making and human factors. He is the 2011 recipient of the Society of Petroleum Engineers International Drilling Engineering Award, a former technical director of the SPE and the author of more than 40 technical papers and articles on drilling engineering. He has BA and PhD degrees in engineering from the University of Cambridge.
Acknowledgement
Margaret Crichton, People Factor Consultants Ltd, provided valuable comments on the draft of this article.
References
1. S Reed & A Fitzgerald. In Too Deep, Bloomberg, 2011.
2. L Steffy. Drowning in Oil, McGraw Hill, 2011.
3. I Fitzsimmons. ‘Macondo and other Titanic struggles', Offshore Engineer, July 2010.
4. I Fitzsimmons. ‘Macondo – the unfolding aftermath', Offshore Engineer, November 2010.
5. I Fitzsimmons. ‘Macondo and the Presidential Commission', Offshore Engineer, March 2011.
6. R Tillerson. ‘Tillerson blames BP for Gulf spill', Upstream Online, 9 March 2011.
7. M Ralls. ‘Industry safety culture is not complacent', Drilling Contractor, March 2011.
8. R Saltiel. ‘A new safety language is on the horizon', Drilling Contractor, March 2011.
9. Woods, Dekker, Cook, Johannesen & Sarter. Behind Human Error. 2nd ed, Ashgate, 2010.
10. Det Norske Veritas. Forensic Examination of Deepwater Horizon Blowout Preventer, Report EP030842 for BOEMRE, March 2011.
11. Chief Counsel's Report: Macondo, the Gulf Oil Disaster. National Commission on the BP Deepwater Horizon Oil Spill & Offshore Drilling, 2011.
12. Report to the President: Deep Water, the Gulf Oil Disaster. National Commission on the BP Deepwater Horizon Oil Spill & Offshore Drilling, January 2011.
13. J Reason. The Human Contribution: Unsafe acts, accidents and heroic recoveries. Ashgate, 2008.
14. Flin, O'Connor, Crichton. Safety at the Sharp End: An introduction to non-technical skills. Ashgate, 2007.
15. Thorogood, Crichton & Henderson. Command Skills for Drilling and Completion Teams. SPE Paper 89901, 2004.