Show Notes
Today I am continuing our series on the six principles of human performance. This time we are covering principle #2, Error-likely situations are predictable.
After we gain awareness of error and system-induced violations, how our brains are wired, and why inattention and complacency are natural, we become stronger at predicting error. We start to see this concept on both a macro and a micro scale. This is a beautiful thing, because when we can predict error, we are better equipped to defend against it. Sometimes we can even change the system to eliminate the error.
Last time, we talked about different performance modes. Skill-based mode is less prone to error. Rule-based and knowledge-based modes are more prone to error. If people have to follow a bunch of rules within a procedure, there is a chance the brain will forget a step. If that procedure is unavailable, error is highly likely.
In a study of this by James Reason, people were 20 times more likely to make an error when a procedure was unavailable. If a worker is unfamiliar with a task, they are 17 times more likely to make an error. If they are in a hurry, they are 10 times more likely to make an error.
When we look at our systems through this lens, it becomes much easier to predict where the next incident will occur. We can't predict everything, but we can get better at predicting.
Fatigue
In construction, fatigue is a common, predictable, error-likely situation.
Fatigue affects your brain much like alcohol does. Although it is a hard concept to accept, we are often managing a bunch of drunk people. If they were drunk on alcohol, we would most likely kick them off the job (and hopefully get them some help). But drunkenness from fatigue is a risk our industry commonly tolerates.
I'll start with an extreme example. If a paving contractor has to work all day and into the night to meet the demands of the client within the limited resources of the company, they could feasibly be awake for 21 hours straight between work demands, the commute, and the stuff everyone has to deal with at home.
According to WorkSafeBC, that is the equivalent of a 0.08% blood alcohol content, the same number the State of Georgia uses to determine whether you are too drunk to drive.
It's an extreme example, and not every contractor is working that many hours, but some in our industry do. There are people out there doing road construction whose brains are operating the same way as a legally drunk person's. That is a predictable, error-likely situation.
A less extreme example, but one even more common in our industry, is going 17 hours without sleep. If a worker has to pull a 12-hour shift and drive an hour each way to and from work, we are up to 14 hours with the job alone. But what about their home life? Who doesn't have crap to do at home? Marriage, parenting, house chores: we all have stuff we are responsible for outside of work too. So, if we give the worker 30 minutes in the morning to get out of bed and hit the road, and 2.5 hours after work to deal with life before they get back in bed, we are up to 17 hours without sleep.
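Quick math on that example, using the same round numbers: 12 hours on shift, plus 1 hour each way on the commute, plus 0.5 hours in the morning and 2.5 hours in the evening, comes out to 12 + 1 + 1 + 0.5 + 2.5 = 17 hours awake.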
In this example, the blood alcohol equivalent is 0.05%. They could pass a breathalyzer, but they are one beer away from being legally drunk. In other words, they may not be drunk yet, but from a brain-based standpoint the fatigue is still the equivalent of drinking on the job. Error is predictable.
Everyone has a different relationship with alcohol, but I'll throw myself under the bus for a minute. Intellectually, I know how much I can drink before I do something stupid. But I also understand my brain can't make great rational decisions once alcohol is introduced to it.
Let's say a person plans to have two drinks. Then they are more relaxed: "oh heck, I'll have one more." At that point, moderation and good decision making go out the window. Why? Because the brain has stopped exercising good judgement. Next thing you know, they are drunk without ever intending to get that way. It happens because the alcohol impairs our ability to make good judgements.
I’m sure not everyone listening has done that, but I’m also sure some of you know exactly what I’m talking about.
From a brain perspective, that's what is happening on the jobsite. The more fatigued someone is, the less likely they are to make good decisions. If you know people are working a 12, then error-likely situations are predictable, especially when they are operating in a rule-based or knowledge-based mode.
Some companies are very concerned with work-rest schedules. Is a fatigue management plan part of your safety program? If long shifts are predictable in your organization, then fatigue management should be an official system. It would be good to review whether fatigue is in the table of contents and how it is actually being managed in the real world.
Scope of Work
Scope of work is another error-likely situation. The further the work flows away from our typical scope, the more likely error becomes. Being unfamiliar with the task means the worker is 17 times more likely to make a mistake. Combine that with the fatigue issue we just covered and you can easily predict where we are headed.
If we typically build poured-in-place concrete jobs and now we have a wood-frame job, error is predictable. From a general contractor perspective, we are now managing a completely different set of contractors. We may have mastered formwork, shoring, and concrete systems, but now we are dealing with a bunch of carpenters. As the scope of work changes, error becomes predictable.
On a smaller scale, say the client has some safety rules that are different from most jobs we work on. Our usual rules have become normalized on our other projects; they have become more subconscious. On this current project, we have to stop and think more often and are expected to make good decisions. Because of this, the prefrontal cortex is doing the work, and we are more prone to error.
Let's say we normally work on a scaffold without personal fall arrest, as long as all the guardrails are in place. Now we are working for a new client, and they require personal fall arrest and guardrails at all times. Maybe they have the best intent ever. Maybe they are viewing safety through the lens of layers of defenses. At the same time, we are requiring our workforce to work differently. Someone is going to forget, no matter how long the safety orientation was. Instead of getting mad that someone forgot, we should expect them to forget.
Any change in the scope of work is an error-likely situation, both on a macro and a micro scale.
Another one is operating different equipment. Normally a worker operates a CAT. Something goes wrong and we send it back to the shop for maintenance. In the meantime, we are provided with a Komatsu. The equipment operates differently, which means error is predictable.
Most of the time we use Genie, but this time the rental company sent out a JLG. Same thing. The equipment changes, the controls operate differently, capacity numbers change, approved attachments change, operational rules change: error is predictable.
New equipment, new harnesses, new fall anchorage, hydraulic shoring vs. a trench box, a new rigging manufacturer, a new type of scaffold: all of these things create error-likely situations.
When we are aware of these things, we can predict them. Then we can implement defenses if we are forced to use new and different stuff. We might even be able to change the system to reduce the variety of equipment people have to use. Either way, it's that style of thinking we need to evolve. Do we get this concept, talk about it in meetings, and do something about it?
Another one is systems that influence error-likely situations, especially within the context of violations.
A common one that comes to mind is not having the right tools for the job. The reason it falls under the system category is that we commonly drift away from providing the right stuff to do the job.
We will cover drift as its own topic in the future, but as a simple explanation, humans are always looking for an easier way to accomplish a task. It's another brain concept. The difference between the perception of efficiency and taking a safety shortcut is usually based on the outcome. If the shortcut got the work done faster and no one got hurt, it was a good, efficient decision. If someone got hurt, it was a safety shortcut.
So, back to the right tools for the job. Sometimes organizations start with the great intent of determining the needs of a project, providing for those needs, and then going to work. Over time, people can drift away from doing this. An equipment manager may feel they have done this so many times in the past that they know the work, they know what the people need, and they can provide equipment based on past, similar projects.
Now the workers are faced with some new challenges. They've used scissor lifts on previous projects, but this job has some unique conditions that require a boom lift or a one-man lift, and now they perceive that management wants them to get the work done with what they have. So, we end up seeing workers standing on the rails of the scissor lift to get the job done.
Some may think the workers should tell us when these situations arise. But from the workers' view, maybe they are thinking management never does a needs assessment anymore, and that what management really wants is for them to figure it out with what they have. The system has drifted and evolved.
There are overlaps in these principles. People and organizations drift. The official system on paper can drift into an unspoken system that is not on paper. It's normal, and it needs a recheck from time to time. Our industry is constantly evolving. Our systems evolve too, sometimes in directions we would prefer they didn't.
We may have a system that requires us to sub out a portion of our work. A common one I see is subbing out the role of traffic control in the road-building industry. Maybe due to budget concerns, the organization has determined that subbing out flaggers is more cost effective. But at the worker level, our team may be forced to work with subpar traffic control. Now the overall job is normalizing risky behavior because of the workplace system.
Complexity of a system is a predictable error-likely situation. When workers are overloaded with information, they are 6 times more likely to make an error. Think about LOTO: filling out the form, ensuring stored energy is dissipated, confirming all locks and keys are where they should be, communicating everything to all parties involved, and timing everything at the appropriate moment. That's a lot to put on the brain. You should expect errors with LOTO. That's why a peer check (a buddy system, a second set of eyes) can be valuable when you predict that error will occur.
You can train, measure, and hold people accountable for doing LOTO procedures perfectly. But when you view this as a complex system in which error is likely, you will change the system based on the expectation that error will occur. If your system is modified so a second set of eyes does a double check, you minimize the potential for a human to forget a step.
That same concept can be applied to personal fall arrest systems, confined space work or anything that is a complex system. Complexity breeds error-likely situations.
As you can see, when we look through the lens of human error, there are many things going on in our work that make error more predictable. When we look at our organizations this way, we have a better success rate at reducing those errors.
It's not about a one-and-done view. It's about thinking this way: thinking about error-likely scenarios and accepting that it's an ongoing process. Things will always be changing, including the scope of work, the client's requirements, the tools, the equipment, the way we do the work, and the systems we work within.
When we meditate on these concepts, we can come up with methods to defend against the errors that are predictable. Especially in construction, we can't eliminate everything. There are some things we just don't have control over. What we can do is accept those realities and put in better defenses for these predictable scenarios.
I want to leave you with one last thought on all of this. Here’s a quote from Jay Shetty:
“It’s a mistake to think that when we read a book, attend a class, and implement changes that we’ll fix everything.”
This quote can be applied to so many things in life. There is no universal plan for perfection. We are not perfect, and neither are the systems we work in. The way we get better is by training our minds to think differently.
Life, work, systems: they are always going to take a detour. We should expect the organizational ship to swerve, to change, to evolve, to drift. If we think differently about error, we can improve how we react and how we respond when those never-ending changes occur.
We are not searching for the magic one-and-done fix-all. There is no magic safety dust. Our focus should be on evolving the way we think about risk, violations, and the potential for human error. At the end of the day, understanding human error and defending against it is mostly about thinking differently.
Next time, we'll tackle my favorite principle: individual behaviors are influenced by culture and leadership. Till then, I hope you have a beautiful day, my friends.