Sean Brady sets out seven strategies for identifying and reacting to structural failure near-misses.

In Part 1, we explored the crucial role played by ‘near-misses’ in anticipating significant failures. Near-misses are defined as situations where the potential for failure exists, but good luck intervenes to prevent it.

In other words, they are incidents where latent issues are present, but the enabling conditions necessary for failure to actually occur are absent, although they may well be present in the future.

This concept was explored by examining Toyota’s withdrawal of more than six million vehicles due to acceleration issues in 2009 and NASA’s Columbia disaster in 2003.

Indeed, evidence from a diverse range of industries, including medicine and business, indicates that near-misses provide the early warning signs of impending failures, which, if heeded, have the potential to prevent serious catastrophe(1).

While it intuitively makes sense to investigate such near-misses, research suggests that two cognitive biases, ‘normalisation of deviance’ and the ‘outcome bias’, can result in these warnings being ignored, typically preventing rational investigation from taking place.

So what options are available to overcome these very real barriers? Tinsley et al(1) present seven strategies, developed in consultation with NASA, to effectively learn from near-misses.

1. Heed high pressure
Intuitively, it makes sense that at times of high pressure the significance of near-misses is more likely to be missed. Structural design, and certainly construction, can be, by their very nature, high-pressure environments.

The fundamental issue is that when decisions are made under pressure, there is a tendency to rely on heuristics and rules of thumb, thus increasing susceptibility to cognitive biases. Rationality can simply take a back seat in the decision-making process.

Tinsley et al suggest that one of the fundamental questions to ask when responding to high-pressure situations is: ‘If I had more time and resources, would I make the same decision?’. If the answer is no, then a more objective assessment of risk is required.

2. Learn from deviations
A deviation is defined as a difference between expected and actual performance. A key advantage of a deviation is that it is a measurable fact, rather than an opinion; once expected performance has been defined, and actual performance measured, the deviation is simply the difference.

The concept of deviation is powerful, because its factual nature removes emotional arguments over what constitutes a failure, and focuses discussion on the potential causes and consequences of such a deviation. Ultimately, Tinsley et al suggest that individuals should seek out deviations from the norm and ask ‘Why am I willing to tolerate this risk?’
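To make the idea concrete, the short sketch below is purely illustrative and not from the article: the Reading class, the flag_deviations helper, the deflection figures and the 2mm tolerance are all hypothetical. It simply flags measured values that differ from expected performance by more than a stated tolerance, so that each flagged deviation prompts the question above.

```python
# Illustrative sketch: flag deviations between expected and measured performance.
# All names and figures are hypothetical, e.g. deflections in millimetres.
from dataclasses import dataclass

@dataclass
class Reading:
    location: str
    expected: float  # predicted (design) value
    actual: float    # measured value

def flag_deviations(readings, tolerance):
    """Return (location, deviation) pairs where |actual - expected| exceeds tolerance."""
    return [
        (r.location, r.actual - r.expected)
        for r in readings
        if abs(r.actual - r.expected) > tolerance
    ]

readings = [
    Reading("midspan", expected=12.0, actual=18.5),
    Reading("quarter point", expected=8.0, actual=8.4),
]

for location, deviation in flag_deviations(readings, tolerance=2.0):
    # Each flagged deviation prompts the question: why am I willing to tolerate this risk?
    print(f"{location}: deviation of {deviation:+.1f} mm exceeds tolerance")
```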

3. Uncover root causes
Building on the previous point, unless the root cause of a deviation is understood, it is difficult to evaluate potential consequences. Unfortunately, the literature suggests that there is often a greater focus on addressing symptoms, rather than identifying cause.

This is a normal human reflex, but a failure to identify causation is a missed opportunity to uncover potential latent errors.

4. Demand accountability
Research indicates that a useful way of assessing near-misses is to do so in a formal manner. When assessments of near-misses have to be justified, our perception of them changes.

In other words, by having to defend our assessments, we become more objective in our approach, and we are more likely to view near-misses for what they are: small failures.

5. Consider worst-case scenarios
It is human nature not to consider worst-case scenarios unless specifically required. By purposefully thinking about worst-case scenarios, however, we articulate consequences and can adjust our decision-making process.

6. Evaluate projects at every stage
The benefit of investigating why a project stage fails is self-evident, but there is also benefit in evaluating why project stages are successful.

Such an exercise forces a rational assessment of why success was achieved, challenging the outcome bias, identifying where good luck has played a central role and unmasking potential latent errors.

7. Reward owning up
The research indicates that actually getting individuals to report near-miss information is highly problematic. For many individuals, reporting near-misses is akin to owning up to failures, and they are concerned about potential repercussions.

Putting in place an environment where individuals not only feel safe reporting near-miss information, but are actively encouraged to do so, is fundamental to developing a culture of learning from near-misses.

Near-misses and structural engineering


So are such near-miss concepts applicable to structural engineering failure?

While it is, of course, a matter of opinion, there is plenty of evidence to support their applicability. When made aware of near-miss concepts, most engineers can point to situations in their careers where near-miss information was ignored, despite its potential importance being recognised.

It is in dealing with these situations that organisations like Structural Safety (formerly SCOSS and CROSS) play a critical role, assembling, analysing and interpreting this type of near-miss information to keep the profession aware of potential latent errors in how we approach the design and construction process.

Indeed, the broader near-miss research would support the position that structural safety should not be considered a useful ‘add-on’ to the profession, but rather an integral part of being genuinely committed to minimising the risk of structural failure.

Failure to utilise key information


When significant structural collapses are examined, a similar trend emerges: a failure to utilise near-miss information.

For example, returning to the 30-year failure cycle evident in catastrophic bridge failures, as discussed in the May 2013 issue, authors Sibly and Walker highlighted that almost all of the key bridge failures they examined were preceded by near-misses that went uninvestigated(2).

Prior to the collapse of the Tacoma Narrows Bridge, for instance, there were numerous incidents of unexplained vibration in suspension bridges, including the Golden Gate Bridge.

Further, Petroski, in ‘To Forgive Design: Understanding Failure’(3), stresses that a similar trend of near-miss information is evident in cable-stayed bridge design.

The Anzac Bridge in Sydney and the Pont de Normandie in France exhibited undesirable vibration issues that required retrofitting. One can, of course, argue that the profession is interpreting these events as near-misses, thus heeding the warnings, but history, unfortunately, cautions against such an assumption.

Petroski suggests that there may be a growing perception that such vibration issues can simply be managed, and this is classic normalisation of deviance.

Further, the outcome bias could mean that the success of such bridge designs is naturally attributed to sound engineering, with the role of good luck remaining unknown.

Space Shuttle Columbia

There are clear echoes of NASA’s Columbia and Challenger disasters in such discussions, with arguments to the contrary being inconsistent with structural engineering history.

Closure


As we have seen, near-miss information can certainly play a key role in averting the larger failures in structural engineering, as well as in other professions and industries.

There are significant challenges in collecting and analysing such information, and organisations such as Structural Safety have a key role to play in ensuring such information, which is typically sensitive, is disseminated appropriately.

However, there is also a dark side to possessing near-miss information. Research indicates that it can lead to riskier decision-making than in its absence.

While this may appear counter-intuitive, the phenomenon is rooted in human nature, particularly in how we perceive risk. Dillon et al(4) use the following example to illustrate the issue:

Imagine you join a social club that meets in an unsafe part of the city, where there is a statistically higher than average probability of being mugged.

If you were to attend a number of meetings and not be mugged, nor witness anyone else being mugged, then you are likely to feel safer, and perceive less risk.

This is the key point; each visit to this part of the city is essentially a near-miss because the statistical probability of being mugged has not decreased, but your ‘perception’ of the statistical probability has decreased.

In other words, you are likely to be less vigilant (and less risk averse) than someone who is not in possession of similar near-miss information. As with all information relating to failures, caution is required.
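The fixed nature of the underlying risk is easy to demonstrate. The sketch below is illustrative only, and the 5% per-visit probability is an assumed figure, not one from the research: it simulates a run of uneventful visits and shows that, however long the safe streak, the probability of an incident on the next visit is unchanged.

```python
# Illustrative only: a run of 'near-misses' does not reduce the underlying risk.
import random

random.seed(1)
p_incident = 0.05   # assumed per-visit probability of an incident (hypothetical figure)
safe_streak = 0

for visit in range(20):
    if random.random() < p_incident:
        safe_streak = 0   # an incident occurred; the streak resets
    else:
        safe_streak += 1  # another uneventful visit: a near-miss, not evidence of safety

print(f"Uneventful visits in a row: {safe_streak}")
print(f"Probability of an incident on the next visit: {p_incident}")  # unchanged
```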

References


1) Tinsley CH, Dillon RL and Madsen PM (2011) ‘How to Avoid Catastrophe’, Harvard Business Review, 89 (4), pp. 90-97.
2) Sibly PG and Walker AC (1977) ‘Structural accidents and their causes’, Proceedings of the Institution of Civil Engineers, 62, Part 1, pp. 191-208.
3) Petroski H (2012) ‘To Forgive Design: Understanding Failure’, Cambridge, MA: The Belknap Press of Harvard University Press.
4) Dillon RL and Tinsley CH (2008) ‘How Near-Misses Influence Decision Making under Risk: A Missed Opportunity for Learning’, Management Science, 54 (8), pp. 1425-1440.

This article was first published in September 2013 in ‘The Structural Engineer’, pp 22–23. www.thestructuralengineer.org

Author: Sean Brady is the managing director of Brady Heywood (www.bradyheywood.com.au), based in Brisbane, Australia. The firm provides forensic and investigative structural engineering services and specialises in determining the cause of engineering failure and non-performance.
