Sean Brady concludes this two-part article with a warning to engineers not to become over-reliant on their ‘tools’, but to consider how and when to apply them. 

Introduction

The rescue would go on all night, and when the sun rose over Mann Gulch at 4am, Dodge, Rumsey and Sallee would look down over the barren, burnt slope where they’d raced with fire.

The grim task of identifying and recovering the bodies of the 11 firefighters who had perished was in progress. The two other survivors, Hellman and Sylvia, were taken away, but they would die from their burns before noon that day (Figure 1).

Dodge had survived by lying down in an escape fire but, despite his orders, his men had ignored him and continued to clamber up the steep side of the gulch – many still clutching heavy tools – attempting to get to a ridge that was out of reach.

Only Rumsey and Sallee would narrowly beat the flames and make it to safety. Why did so many of these men cling to their heavy tools as the flames bore down?

And why did they ignore Dodge’s escape fire and continue running, even though it should have been obvious to them that they would never make safe ground?

The easy answer to these questions, of course, is that the crew simply didn’t think at all. But to stop our analysis of the tragedy at this point is to miss the underlying reasons why they stopped thinking. Was it fear alone or was something deeper at play?

While few of us will have to outrun a wildfire in our professional engineering careers, what happened in Mann Gulch was much more than a fire: it was a lesson in how we, as humans, make decisions under pressure. Understanding the reasons why we can abandon rationality is one of the keys to preventing engineering failures.

Figure 1 Memorial to firefighters who died at Mann Gulch

Priming and fixation

We will first fast-forward to the 1990s, to the University of Pittsburgh, Pennsylvania, where Jennifer Wiley would undertake a number of fascinating psychological experiments(1).

Wiley was interested in how priming affects our ability to think clearly. Priming, in psychological terms, occurs when an individual is exposed to a background factor that puts them into a specific psychological state, one that affects their subsequent actions, in some cases without them being aware of it.

One of the most important and comical illustrations of priming was carried out by psychologist John Bargh at New York University, and became known as the ‘Florida Effect’(2).

Students, aged 18-22, were divided into two groups, with each group required to make four-word sentences out of scrambled sets of five words, eg, ‘finds he it yellow instantly’ could become ‘he finds it instantly’.

One group, the control group, were given sentences composed of random words, while the other group were given sentences containing words directly related to being elderly: words like ‘bald’, ‘wrinkle’ and ‘Florida’.

The experiment commenced with each group unscrambling their sentences, after which they were directed to leave the room and walk down the corridor to another room. The real outcome of the experiment occurred in the corridor, not in either room.

Incredibly, the experiments showed that the group that had unscrambled the sentences containing the elderly-themed words walked more slowly down the corridor than the control group. In effect, the elderly-themed words primed the students to behave in a more ‘elderly’ fashion(2).

Psychologists have conducted many experiments to illustrate how powerful priming can be, including how negative priming can result in poorer performance on cognitive tasks(1).

In the Remote Associates Test (RAT), an individual is provided with three words, then asked to identify a fourth word that can be combined with each of the three to make a common word or phrase.

For example, the words ‘blue’, ‘knife’ and ‘cottage’ are given to the individual. The individual then comes up with the fourth word, in this case ‘cheese’, giving ‘blue cheese’, ‘cheese knife’ and ‘cottage cheese’.

However, in some cases individuals were first primed with random words prior to sitting the RAT tests, and they subsequently performed worse than unprimed individuals.

These priming words essentially caused individuals to suffer from fixation, a fixation that was hard to overcome and that set individuals on unsuccessful solution paths.

(The experiments showed that incubation, taking time away from the problem and then returning to it, was the most effective way of overcoming the fixation. Time away allowed individuals to ‘forget’ the priming words, freeing up their thinking to reach the correct answer. This, of course, is one of the reasons why we so often solve tricky problems in the shower or while driving home from work: we are incubating the problem, allowing our minds to forget the negative priming, removing the fixation and freeing up our thinking to reach an appropriate solution.)

Domain knowledge priming

Wiley, however, was interested in an intriguing twist to the concept of priming. Rather than being primed by an experimenter, what if participants primed themselves through their existing knowledge? What if the domain knowledge or expertise an individual possessed prior to the RAT tests was enough to negatively prime them? Wiley set out to answer these questions(1).

She selected knowledge of baseball as the priming ‘expertise’ or ‘domain knowledge’. In the experiments, individuals with low and with high levels of baseball knowledge were subjected to RAT tests.

The words in the RAT tests were carefully selected to contain baseball terms so that individuals with a high level of domain knowledge in baseball would activate their knowledge and become fixated.

This fixation would set them on incorrect solution paths. Wiley theorised that those with a low level of baseball knowledge would not be primed and therefore would perform better on the tests. 

'In pursuit of knowledge, every day something is acquired; in pursuit of wisdom, every day something is dropped.' Lao Tzu

She was right. The high-knowledge participants did considerably worse in the tests than the low-knowledge participants. The low-knowledge participants had little or no baseball knowledge to recall, did not get primed, and did not get fixated on futile directions when looking for a solution.

Wiley had demonstrated that the possession of knowledge or expertise, when it is not directly beneficial to your current task, can actually be a disadvantage. And here is where it gets really interesting. Wiley examined whether it was possible to ‘switch off’ this expertise. Can you ‘decide’ not to use your expertise?

In the next set of tests the participants were told that the RAT tests would contain many references to baseball. They were then warned that they should not use any knowledge of baseball they possessed as it would not be helpful in completing the tests. What happened?

Despite the warning, the high-knowledge individuals did just as badly as they had when they received no warning. The warning was useless, with the experiments illustrating that it is simply not possible to ‘switch off’ your knowledge and expertise.

Its use is automatic and it appears to occur subconsciously.

Mann Gulch

We see precisely these cognitive concepts at work in Mann Gulch on the afternoon of August 5, 1949.

Many of the men still clung to their heavy tools, despite being able to run faster without them, and despite Dodge’s order to drop them. It turns out that this form of behaviour is not an isolated event. At least 23 wildland firefighters died in fires from 1990 to 2007(3).

Many died within a few hundred yards of their safety zones and a number were found still wearing heavy backpacks with their chainsaws beside them. They too were in a race with the flames, and they too didn’t drop their tools.

Fundamentally, these men didn’t drop their tools any more readily than the baseball experts dropped their knowledge. They simply couldn’t. Indeed, placing total faith in our expertise is fundamental to human nature, especially in stressful situations.

Herbert Simon, the Nobel laureate, identifies the issue as bounded rationality: the human mind has limited information-processing and storage capabilities, so humans must use simple rules of thumb and heuristics to help make decisions and solve problems(4).

These rules of thumb and heuristics are our very tools, but psychologists Daniel Kahneman and Amos Tversky point out that many of the heuristics people use to make judgments and decisions lead to systematic and predictable errors(5).

Are we as engineers in danger of making systematic and predictable errors because of our simple rules and heuristics? The answer, of course, is yes. We carry tools and rely upon them, and Mann Gulch teaches us that when we come under pressure we will rely on these tools even when we should not.

Engineering tools

So, are there times we should drop our tools? And if we do, what are we left with? Well, that depends on the tools we actually carry, as individual engineers. Are our tools engineering first principles? Or are they the systems and processes we use to deliver engineering as a service?

If it is the latter, we should give our tools some serious thought. Yet many of us don’t; we simply get on with the business of applying them. And we carry an increasing number of ‘non-first-principles’ tools.

We have become dominated by ever more prescriptive design codes and ever more complex in-office procedures, and we are using ever more elaborate software packages.

While many engineers make the valid argument that these tools prevent errors, others make the equally valid argument that they actively contribute to creating errors – software analysis tools being a prime example.

Are these tools aiding us to become better engineers, or are they replacing us, at least in a cognitive sense, as engineers? Many were intended to act as aids, but in the ever more commoditised world of delivering engineering services, the focus on such tools grows ever greater, to the detriment of fundamental principles.

Mann Gulch teaches us that when engineers find themselves in unusual situations and under pressure, they will apply these tools regardless of applicability. Indeed, if we become dependent on their use, we may find ourselves in situations where these tools have exceeded their limits without us knowing it.

The history of engineering is littered with failures caused by precisely this issue. Developing an awareness of the tools we carry, an awareness of the limitations they come with, and understanding when it is appropriate and inappropriate to drop them should be central for every engineer.

And what happens if we do learn to drop them? Well, we are left with the fundamental principles of engineering. Karl Weick, an expert in organisational behaviour, neatly sums up the advantage of dropping tools from a general perspective: learning to drop one’s tools is to gain lightness, agility, and wisdom(3).

This is precisely what Dodge did when he broke through the tree line and realised the top of the ridge was out of reach. He had already dropped his physical tools; now he would drop his mental tool – his fixation on reaching the ridge.

Running for a ridge is one of the tools used by the US Forest Service to escape harm – the ridge has less vegetation and changing wind conditions, both of which serve to slow down a fire.

Usually, this is a good tool, but Dodge figured out that in this particular circumstance the tool was useless. So he dropped it. He was then left with his basic principles: fire requires heat, oxygen and fuel. So he decided to deprive it of fuel.

He lit an escape fire, the first time it had ever been attempted, inventing a new tool in the process. He was only able to do this because he dropped the other tools.

He showed extraordinary agility in his thinking, exactly the kind Weick describes. The rest of the crew’s response to Dodge’s escape fire shows just how hard our tools are to drop.

Not only had they failed to drop their tools – while some had shed their physical tools, none appear to have dropped their mental fixation on reaching the ridge – but they were also unable to accept Dodge’s new tool, the escape fire.

It was unfamiliar and didn’t fit into their existing expertise and training. So they ignored it and relied on getting to the ridge – a tool still central to their expertise.

For most of us, as with the crew, a new tool needs to be introduced not at a time of stress, when we will fail to process its significance, but before. The importance of examining, evaluating and knowing if and when to drop your tools prior to a stressful period is illustrated in fire service training today(3).

Firefighters are trained to run both with and without their tools, to demonstrate that they can run faster without them. While this sounds obvious, the training actually embeds this tool in their expertise, and at times of stress they are then equipped to decide whether to hold onto their tools or drop them and run.

This is part of the concept of comparison, awareness and refinement. Comparison comes from examining how you perform both with and without your tools (running slowly versus running fast); awareness comes from realising that you can actually run faster without your tools; and refinement comes from learning when it is correct to shed them.

This concept is illustrated by Rumsey, who said in the review that followed the tragedy that he thought Dodge had simply gone mad lighting another fire. He pointed out that if it had been explained to him on a blackboard in Missoula prior to the event, he might have been able to process it(6).

However, the difficulties in examining your tools cannot be overstressed. For many of us, using engineering tools is part of who we are, and dropping them is akin to giving up a little of that identity.

As Norman Maclean puts it so beautifully in his book on the tragedy, Young Men and Fire: ‘When a firefighter is told to drop his firefighting tools, he is told to forget he is a firefighter and run for his life’(6).

Many engineers, no doubt, would feel a similar dilemma.

Examining our tools

This is not to suggest that we drop our tools across the board and revert to first principles.

To suggest so is as ridiculous as suggesting a firefighter should throw away his Pulaski and fight fire barehanded. But there will always be situations when over-reliance on these tools will let us down; when we get to that point, we will need to know their limitations and recognise when to drop them.

If not, Mann Gulch tells us we will revert automatically and rely on them regardless of whether it is appropriate to do so. When we find ourselves in such a situation, will we act like 15 firefighters running uphill, clutching our tools, and heading for a ridge out of reach?

Or will we be more like Dodge? Will we know our tools well enough, as individuals, to identify when they are no longer useful and drop them, instead lighting an escape fire? Will we think like an engineer, the way we’re meant to?

Author: Sean Brady is the managing director of Brady Heywood, based in Brisbane, Australia. The firm provides forensic and investigative structural engineering services and specialises in determining the cause of engineering failure and non-performance. Web: www.bradyheywood.com.au Twitter: @BradyHeywood

References

1) Wiley J (1998) ‘Expertise as mental set: The effects of domain knowledge in creative problem solving’, Mem. Cognit., 26 (4), pp. 716–730

2) Kahneman D (2013) Thinking, Fast and Slow, New York, USA: Farrar, Straus and Giroux

3) Weick KE (2007) ‘Drop your Tools: On Reconfiguring Management Education’, J. Manag. Educ., 31 (1), pp. 5–16

4) Simon HA (1957) Models of Man, Social and Rational, New York, USA: John Wiley and Sons

5) Tversky A and Kahneman D (1974) ‘Judgment under Uncertainty: Heuristics and Biases’, Science, 185 (4157), pp. 1124–1131

6) Maclean N (1992) Young Men and Fire, Chicago, USA: University of Chicago Press