The question that vexed me
Over the past few years as a traveling PSM auditor I was exposed to hundreds of different safety programs. There were small, flat organizations with dozens of workers and large international conglomerates with thousands of workers and layer upon layer of bureaucracy. When I started doing audits, I had (erroneously) expected that the large organizations would have better programs than the smaller ones because they had so many resources at their disposal. I had thought that these resources (Time, Money, Experience, etc.) would result in a superior program. What I found was something quite different.
Resources are important, and many a program suffers from a lack of them. It turned out though – at least from my observations – that once an organization passed a certain threshold of resources, they were no longer a determining factor in the success of the program. Put another way: you needed *enough* Time, Money and Experience to have a successful Safety Program, but having extra didn’t seem to make it better. More often than not, the places I went with large, complex Safety Programs had worse on-the-ground performance than their smaller, relatively resource-poor, counterparts.
Things like this really bug me. When I find something that seems to contradict conventional wisdom it causes a little voice in my head to continuously pester me with “what are we missing?” These kinds of questions lead me to stare into the darkness in many a hotel room pondering “big” questions on Safety Programs.
Summed up, it went something like this: if the conventional wisdom of “MORE Resources = MORE Safety” doesn’t hold the way I’d assumed, then what else was at play? It’s a question I asked myself (and any other safety professional who would listen) for years until I stumbled across what I think is the biggest safety issue facing us today.
The stumble
“Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing had happened.” –Winston Churchill
Two experiences set me up for a realization that would totally alter the way I approached safety programs. One was with a young engineer and the other was with a somewhat confused cashier at a rural Kansas Wendy’s.
Wendy’s first: I am a simple man when it comes to my interactions with clerks, fast-food attendants, etc. I figure out what I want from the person, then I turn that thing I want into as few simple words as possible and tell that person those words. In this particular interaction I wanted a Combo #4 with no onions and a Large Diet Coke. The conversation started something like this:
Attendant: Can I help you?
Me: Combo #4 – no onions – with a Large Diet Coke
Attendant: Did you want the sandwich or the meal?
Me: The combo
Attendant: Did you want fries?
Me: Doesn’t the combo come with fries and a drink?
Attendant: You can get other things too. It comes with fries unless you want something else.
Me: Other than the no-onions bit, just give it to me as it comes then.
Attendant: So, yes. You want fries.
Me: …and a Large Diet Coke
Attendant: You want a drink too?
Me: …a Large Diet Coke…
This went on for quite a while. At no point did the attendant look up from her screen, and it was obvious that she was struggling to turn what I wanted into the button presses that would tell the ordering system to get it for me. If you have traveled, you know this type of experience from countless rental car, airline and hotel clerks: the system isn’t taking the information the way the people involved are trying to give it. The clerk is left trying to reinterpret real-world information in a way the system understands. If I had a dollar for every time I heard “The system isn’t letting me…” I’d be hiring Bill Gates as a clown for my grandkids’ birthday parties.
You see, the system was designed around the needs of the programmer. The programmer had to deal with the needs of the person entering the information, the people making the food, the needs of the accountants, the needs of the inventory system, etc. The easy design choice for the programmer is to come up with a single interface that gives everyone that uses it what they need. Unfortunately, when you design like this you end up with a system that tries to be everything to everyone in the same way at the same time.
This is a broken system – the system for taking information needs to be designed around the needs of the person entering it. The system should then translate that information to fit the needs of the people using it, whether they are cooks, accountants or managers.
Which brings me to my young engineer. He had produced a large spreadsheet to validate relief calculations. It was very impressive and just chock-full of numbers. It was also unintelligible to anyone who wasn’t an engineer. I couldn’t make much sense of it and showed it to another engineer, who scratched his head, looked at it for a few minutes, and started: “I think what he did here was….”
It turns out the calculations were quite right and very defensible – assuming you had an engineer to translate it. For a compliance/safety document, that’s a bad design! The engineers found the document quite acceptable – it met their needs and provided the engineering basis for their design. What it didn’t do was meet the needs of the people using the document to prove compliance. Working with the engineers, we were able to “pretty up” the spreadsheet so it explained itself. We put another sheet in front of all the calculations that pulled the information from the engineering sheets and put it in a manner that any reasonably competent person could understand.
The false realization
To my mind, these were similar situations. In fact, I actually thought of the Wendy’s conversation while working with the engineers. I wondered, if this was occurring in engineering departments and fast-food places, would it explain the implementation problems I had been seeing nation-wide in Safety Programs?
I came up with a little motto that I used for a year or two, thinking that it explained the problem: The systems we are creating are too complex for the people that are implementing them.
Unfortunately, many people took this concept and blamed the PEOPLE. This was not my intent and the results of blaming the people actually made it worse. What management did with this concept was move implementation tasks even further away from the people on the ground. They assigned these programs to ever-higher levels of the company where extremely competent people designed extremely elaborate programs that ultimately further insulated the end user from the Safety Program. Now safety messages, policies and procedures were dictated from on-high. This actually made things far worse.
The realization – Where I went wrong and how it stands now
Management was addressing a causal factor – people not understanding the system – rather than the root of the problem, which was that the system itself was poorly designed or poorly explained.
In a moment of brutal honesty, an operator once answered my question about an Emergency Response policy with something along the lines of “You know, they have a lot of these policies. It’s page after page of stuff. I need to make a split-second decision and that binder (referencing a 100+ page Emergency Action / Response program) doesn’t help me. They give a lot of this stuff but I just have to ignore it in an emergency and rely on what I know.”
We looked through that binder and after about 15 minutes we eventually did find the answer we were looking for. It was buried somewhere in the middle of all these pages that all looked the same. The operator had a point, didn’t he? Do we actually expect an operator responding to an emergency to pull up a chair, sit down, and thumb through a binder while the emergency is occurring? An Emergency Response policy that isn’t useful in an Emergency is a system failure.
That’s not the fault of the operator. The problem IS the system. A system that doesn’t meet the needs of the people implementing it is a broken system. Here’s the situation as I explain it now:
The systems we are creating are not providing guidance in a manner that meets the needs of the end-user.
Addressing this problem statement is much more fruitful. Sometimes we get caught up in “compliance” and “policy” and forget that there are real people at the end of that policy.
A functional Safety Program is a system.
Design that system so that the people implementing it understand the guidance it provides.
A functional Safety Program has to be relevant to the needs of the implementer.