Like most people, I believe in collecting data, analysing it in some way, and expecting trends or patterns to emerge and tell us where to focus our safety efforts. What's the data telling us about how we can prevent injuries?
In my workplace, we are very proud of our leading-indicator data set. Over the past 4 years we've collected an impressive set of hazard, near-miss, behavioural observation and audit data from our contractor groups. Our groups have consistently over-delivered relative to the rest of the site, indicating strong participation. A testament to how strong our culture is. But does it tell us the real story?
I have always suspected the data set, though large enough to be statistically reliable, is riddled with human biases, rendering it useless for its primary objective as a predictive tool. Let me explain.
My observation is that our group (the owner's representative) has built a very strong safety culture. We've actively involved ourselves in our contractors' safety business. We have been very effective safety leaders on site, with reporting as a foundation element. Hence, lots of hazard identification.
See it, fix it, share it (SFS) is our hazard and near-miss branding, if you will. It's an action-led initiative where the individual is expected to take ownership of a hazard and do something about it. We've been very successful, building quite a machine around this initiative. Maybe too successful when it comes to using the data as a predictive tool.
The problem is this: when we review the data, especially when we plot it over time (by week) and overlay the initiative we discussed that week, we see the group reacting to whichever initiative we are running. For example, when we ask everyone to be aware of vehicles and mobile plant in the work area, we receive lots of SFS reports about driving. When we discuss hazards in the work area, we get lots of SFS reports about slip and trip hazards. It is almost as if we are creating the data with a week's lag!
What is going on here, I suspect, stems from human biases and the "priming effect". When the data set fills with the hazards we discussed the week prior, clearly we are influencing the outcomes. And if we are influencing the data set, it will tell us what we already know.
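That one-week lag can actually be checked with a quick script. The sketch below uses made-up weekly data and hypothetical category names (the real SFS export will look different); it compares how often reports match the same week's briefing topic versus the previous week's. If the lag-1 match rate is clearly higher, the reports are echoing last week's briefing, which is the priming signature.

```python
from collections import Counter

# Hypothetical weekly data: the topic we briefed each week,
# and the categories of SFS reports received that same week.
briefing_topic = ["vehicles", "slips_trips", "manual_handling", "vehicles"]
sfs_reports = [
    ["slips_trips", "vehicles"],                       # week 1
    ["vehicles", "vehicles", "slips_trips"],           # week 2
    ["slips_trips", "slips_trips", "manual_handling"], # week 3
    ["manual_handling", "manual_handling"],            # week 4
]

def lagged_match_rate(topics, reports, lag=1):
    """Fraction of reports in week t that match the briefing topic of week t - lag."""
    matches = total = 0
    for t in range(lag, len(reports)):
        counts = Counter(reports[t])
        matches += counts[topics[t - lag]]
        total += sum(counts.values())
    return matches / total if total else 0.0

# Same-week agreement vs. one-week-lag agreement: in this toy data the
# lag-1 rate (0.75) dwarfs the lag-0 rate (0.3), i.e. reports track what
# was briefed the week before.
print("lag 0:", lagged_match_rate(briefing_topic, sfs_reports, lag=0))
print("lag 1:", lagged_match_rate(briefing_topic, sfs_reports, lag=1))
```

On real data you would run this across all categories and many more weeks, but even this crude ratio makes the priming question concrete rather than anecdotal.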
Extending this thinking one step further, there appears to be a dichotomy at work. The closer and stronger the relationship you have with the workforce, the more priming effect you have. Subsequently, any data you collect from that work group will reflect that priming. Not good if we want to use the data set as a predictive tool.
What's the answer? Well, the first step is to be aware of the biases that run through your data set. If you are collecting data based on human behaviour, it will have biases. Period. If you have a strong safety culture, the data set will reflect it. You will see in the data what you are driving. But this may not be a bad thing. If we can see priming effects, then use them to your advantage. Prime away. Some may say this is manipulative, but the reality is this is how humans work, and our overall intent is to keep people safe, so our cause is a noble one. But what do you prime with?
Well, most of us who have been around the block a couple of times, and actually learned from those trips, know that at the most basic level, injuries occur when a hazard and a person interact. Understanding the different interactions is important and is the basis of much of our rules, procedures and documentation. But at the lowest level, it is about staying out of the line of fire and using our brain to drive our actions to achieve this. Priming your work groups to have eyes and mind on the hazard will set them up for success. To go home to their loved ones in one piece.
We seem to be in a good place. The data is telling us we have influence over our work teams. Happy about that. But what data sources, other than lagging indicators, do we use? That's another topic, but it involves getting out into the field and being observant. Safety is not a desk job. The best safety leaders I know are in the field, listening, learning and leading.
So be careful with your data sets. Some data is better than none, but deeply understand what it is telling you.