
ACCIDENT REDUCTION IN THE COVID ERA ‘The biggest error would be to believe that we don’t make errors’


The Covid-19 problem that confronts us all in this time of uncertainty is a danger we can’t see. It’s a danger that seems distant and, apart from the prevention measures imposed on us, it appears unlikely to have a direct effect on us. It doesn’t even really seem to harm the people around us who aren’t following the protection advice. Despite knowing that Covid-19 has caused hundreds of thousands of deaths in just a few months, we still find it easy not to take the threat seriously – unless perhaps we’re in a ‘vulnerable’ category or a loved one has died, but that’s a very small percentage of us. For most of us, it’s easy to downplay the threat even though we know the results could be awful if we or someone we know caught it, or if there were a ‘second wave’. This sort of conscious denial comes from the fact that we don’t get direct feedback from our actions – and when there’s no visible proof of danger, it’s hard to take it seriously.

Avalanche danger falls into a similar sort of risk context. Most of the time you can behave recklessly and see no cause for concern. Even though many of us know that it’s usually the victims themselves who trigger the avalanche that kills them, it’s easy not to take the threat seriously. This is because, most of the time, nothing bad happens even if we don’t follow safety advice.

So, the similarities in risk context between Covid-19 and avalanche terrain are: 1. the ‘low validity’ (or low feedback) of the environment and 2. the ‘high consequence’ (real chance of death) if something eventually does happen. ‘Low validity’ means a lack of feedback from what you do – you can do dangerous things and nothing happens…most of the time.

Both avalanche accidents and contracting Covid-19 are also low probability events and thus infrequent enough to seem so remote as to be practically impossible. BUT their high consequence outcomes are devastating enough, and in reality frequent enough (especially to those directly affected), to warrant applying preventative action.

Skier-triggered avalanche at Ubaye. Photos by Emmanuelle Borg, Data Avalanche website.

In the Covid era, as in avalanche terrain, we know what basic risk reduction measures to take, but whether or not we apply them is a whole different story. Tragedy, or a high consequence result, is usually due to ‘human factors’: exposing ourselves and others to danger mainly comes down to human error – something that could have been avoided if we had been paying attention. ‘The basics were known, but not followed,’ says Atul Gawande, Lead Author of the WHO Surgical Safety Checklist (a mandatory assessment for surgical teams in the UK and in most hospitals in other countries) and expert on avoidable accident prevention.

The good news is that steps can clearly be taken to reduce the risk of accidental exposure. In avalanche terrain, for example, if the basic measures are applied we can reduce the danger in this high consequence/low validity risk context to a level most of us consider acceptable, e.g. the risk of driving a car for an hour (Bruce Tremper et al.).

The problem is that we often don’t apply the known key risk reduction points (and thus reduce the danger) because we get distracted. Or it can simply be that we stop thinking about the danger because it seems so remote: ‘Nobody is going to die here in the store if I don’t wear a mask!’ or ‘It’s not going to hurt me if I don’t keep my distance from other people.’

Because of these similarities, I’ve been asked by colleagues, journalists and others to come up with some thoughts on accident reduction solutions for Covid-19, based on the high consequence/low validity uncertainty that surrounds both risk contexts. Indeed, at Henry’s Avalanche Talk (HAT) we’re committed to solving the ‘risk problem’ by giving off-piste skiers and tourers a better understanding of the risks and how to reduce them – thus improving the experience.

Applying avalanche safety solutions to Covid-19 may sound like a big leap. However, drawing connections with other endeavours and professions is something people have long been asking me to do, starting thirty years ago with a gentleman who’d fought in the Rhodesian Bush War. He felt that at least some of his friends could have been saved if they’d applied some of the risk reduction techniques he’d heard me mention in the human factors section of my avalanche talk: ‘…the problem is not that they were incompetent: they just let their guard down – didn’t apply their experience and knowledge at the time…and they were ambushed’.

More recently, our ‘Safety is Freedom’ approach for avalanche terrain has spilled over into, for example, the surgical and financial sectors, where a similar interest in ‘Patient Safety is Freedom’ has emerged, along with a financial planners’ version: ‘Decision making risk and crisis management from an avalanche point of view’.

To be fair, the avalanche safety world was initially more of an adopter of risk management solutions from behavioural science and other sectors that operate in a high consequence/low validity risk context. And since a good sense of ‘the problem’ (the acceptance and management of human error) is essential before arriving at a solution, I’ve translated an easy-to-read article/interview with a risk management expert who worked in the financial, aviation (Concorde) and health sectors before becoming Director General of the École Centrale de Lyon, France. Frank Debouck’s animated multidisciplinary view, below, gets to the heart of the ‘errors problem’ that needs to be clearly understood if we want to arrive at an accident reduction solution.

So, in summary, before moving on to Covid risk management solutions that can be taken from the avalanche world (my next article), we need to look at ‘error’ as an everyday human function, as routine as eating and sleeping. According to Debouck, we all make around thirty errors a day!

The only addition I’ll make to Debouck’s interview is to re-emphasize the fact that making errors is a normal everyday occurrence and not synonymous with lack of experience or competence. Indeed, it looks as if the opposite may be true: ‘research demonstrates that experience increases the confidence with which people hold their ideas, but not necessarily the accuracy of those ideas’ (Daniel Kahneman, acclaimed Behavioral Psychologist and holder of the Nobel Prize in Economics).

There are many cases of top professionals making big errors in all sectors. Why is this so? Debouck answers that there are a dozen big reasons for top-level experts making errors in their field of endeavour. He draws particular attention to communication problems and the negative effects of stress. In addition to this, I’ll add that, as many experts like Kahneman say: intuition doesn’t work in high consequence/low validity risk contexts. Worse still, “we trust our intuitions even when they’re wrong”. Experienced, competent and respected people are particularly exposed – because they are most likely to trust their intuitions, even when they’re wrong. For me, this is the source of errors that lead to the most destructive accidents.

…………………………………….

Interview with Frank Debouck, Director General of the École Centrale de Lyon and Expert on Human and Organisational Factors in Aviation, Health and Financial Sectors

Interview by Martin Mazza, French Avalanche Association (ANENA)

Translated from the original by Henry Schniewind. (Link to original PDF article in French)

Frank Debouck. Photo from the École Centrale de Lyon website

ANENA: During the last General Assembly of the French Mountain Guide Federation held in Nice, your piece about human factors really made an impression on the audience. Before getting into the heart of the matter, could you briefly introduce yourself?

Frank Debouck: I’m an engineer by training and a private pilot. I started off working in the world of nuclear submarines, and then moved into banking, working in a subsidiary branch of the Caisse des Dépôts et Consignations. I joined Air France in 1984 and stayed there for 27 years in various positions, including Operations Manager for the Concorde. After that I mainly worked with the medical sector, before becoming Director of the École Centrale de Lyon in 2011.

ANENA: What led you to pay particular attention to human factors?

FD: During my time at Air France, I took part in the introduction of human factors. Actually, I was interested in this fascinating subject as more than just a work objective. I continued pursuing this interest when I created Air France Consulting in 1999, an internal advisory body linked to air transport.

Surprisingly, another stand-out event reinforced my interest in human factors and their integration: the 2003 heatwave (an event that led to thousands of avoidable deaths, especially amongst the elderly, in hospitals and care homes throughout France). At the time I was shocked by the punitive logic in the aftermath of the heatwave, with the dismissal of the Minister of Health and many organisational directors. The logic was to make heads roll, as opposed to the logic used in the aeronautics industry, which, even back then, would respond to an accident by focusing on its contributory factors and implementing corrective measures to prevent it from happening again.

ANENA: We often say that to err is human. However, is this really acceptable, and is it accepted?

FD: To want professionals who never make mistakes, or to believe they will never make mistakes, is an impossible dream. On average, every one of us makes thirty mistakes a day. We cannot prevent a doctor, a mountain guide or a pilot from making mistakes.

However, we should emphasize that no professional person does their job badly on purpose. But is it possible for an airline pilot to type an ‘8’ instead of a ‘3’ on the aircraft’s onboard computer? The answer is ‘yes’. Can this mistake happen more than once? The answer is ‘yes’ again. We see some staggering mistakes made by highly-regarded professionals.

ANENA: How do you explain how highly-regarded professionals can make big mistakes? Isn’t that a contradiction in itself?

FD: There are a dozen reasons for human error, one of which is communication. Let’s take an example to illustrate the importance of good communication. I give you a message: ‘Quentin goes to school’. If that’s all I tell you, how will you interpret it? Your brain tells you that a young boy is going to school. In 99% of cases it is, indeed, a young boy who is going to school. If I continue my sentence, ‘Quentin goes to school, he’s having difficulties with his quantum physics course’, then it is no longer a young child of 7 or 8 going to school; rather, we’re talking about a student. And if I continue, ‘Quentin goes to school, he’s having difficulties with his quantum physics course, he has problems enforcing discipline’, then we tell ourselves that Quentin is a teacher. Depending on where I stop my sentence, no one will have understood the same thing. This shows the weakness of a system due to communication failure.

Stress can also be an important factor. We behave in strange ways under stress. As a student, I was stopped by the police while driving and asked to put my indicators on. For some inexplicable reason, I put on my windscreen wipers, even though I never use them to turn left or right. Under stress, my brain reacted in a bizarre way.

Today we know about the mechanisms of error-making. We therefore need to make our system robust in dealing with error, so that when an error is going to happen it can be detected and corrected.

ANENA: How do we make the system robust in dealing with error?

FD: This was the focal issue at Air France in the 70s and 80s. To make a system robust in dealing with error, we first need to accept the fact that there are going to be errors. Then we have to implement good safety practices that allow us to detect and correct errors before they hit their critical target and cause an accident. That’s what’s important. Mistakes are going to happen, that’s for sure, but what we don’t want is accidents or deaths.

There are 4 main types of error: technical; organisational; those linked to the environment; and finally, those linked to human factors. The more advanced we become in our systems and organisations, the fewer the technical and organisational problems. The environment, however, is difficult to predict. It leaves all kinds of room for interpretation and decision-making on an individual level. And then the human factors and weaknesses need to be taken into account.

At Air France, if we hadn’t made our system robust in dealing with error in the 1980s, we’d now be having one air crash per week due to the increase in airline traffic.

ANENA: Does ‘return of experience’ (from the French ‘RETEX’: a specific way of sharing lessons learned by breaking down what happened, what went right, what went wrong, and what emergency services anywhere can learn from a particular incident) provide an adequate solution, or are there other complementary methodologies?

FD: Feedback is one solution among many others. We can’t forget or underestimate the importance of basic training, which is one of the most essential safety systems. You can’t improvise being a pilot, a nurse or a mountain guide. Other elements of the safety system include written procedures and references, audits, controls, simulations and feedback. ‘Human factors’ training has turned our mistakes into a subject of practical interest and work, instead of just a source of ‘regrets’. In doing this, it has helped to ‘free up’ feedback, allowing us to build on each analysis and add more and more relevant detail. This is an important part of the explanation, particularly in the aeronautics sector, for the improved safety record over the last 2 decades. Looking to the future, it is also an asset which must be exploited, especially with continued growth in airline traffic.

I was Operations Manager for the Concorde at the time of the accident in July 2000, which left 113 people dead. On the day of the accident, the plane ran over a piece of metal on the runway. A chain reaction ensued, resulting in loss of control of the plane and a crash. At Air France and British Airways, we’d had 63 previous cases of tyre blowouts, which had caused small holes in the wing. The first problem we’d known about occurred during a take-off from Washington. We’d then rectified some of the problem, but not all of it. After that we’d become used to the small holes and other small repairs. And getting used to something is the worst thing you can do. We’d noticed small holes appearing from time to time. But then one day there was a massive one, resulting in 113 people dead.

After any accident, we must avoid saying to ourselves ‘Yes, I knew something was going to happen’. We need to learn from things that have gone wrong and from accidents, but also from incidents and ‘near-misses’. We don’t have to wait for an accident to happen to understand what went wrong. Say we narrowly miss being caught in an avalanche. Reflecting on the incident afterwards, we can learn as much as if we had actually been caught and buried. Sweeping the incident under the carpet is a bad thing, even if it’s deeply distressing to talk about it at first.

ANENA: Why is it so distressing?

FD: Because of ego, the way other people look at us, pride and many other reasons. It’s difficult for a professional to admit to having made an error. At first, it’s easier to point the finger at other people’s mistakes before agreeing to talk about our own. The important thing is that the person in question agrees to talk about their mistakes. It’s equally essential that any reports or statements are given in total anonymity. This allows people to feel they can speak freely. Again, what we need are tangible, concrete first results.

ANENA: How do we get these concrete results?

FD: By developing best practices and safety procedures. If we take the example of aeronautics, communication is very important. There is a procedure to repeat back everything we’ve heard. When the control tower says ‘authorisation to land on runway 26, reduce your speed to 30 knots and flap 20’, the captain repeats the message back to ensure that it hasn’t been altered. These procedures are set in stone. If you don’t repeat the message back, the control tower will ask you to do so. If you repeat the message back incorrectly, the control tower will give you the message again and ask you to repeat it again, until this is done correctly.

For the results to be concrete, it’s important not to overcomplicate things by introducing too many procedures. We only check the essential things. For example, during take-off we only check 3 items: the positioning of the flaps, the positioning of the air-conditioning packs and the trim. We used to have a long checklist, which included things such as the correct closing of the doors. But if we were to take off with the aircraft doors open, while not very smart, this wouldn’t actually kill anyone. We’d just need to go back, land and shut them. On the other hand, a mistake with the positioning of the flaps during take-off can kill everyone on board. So, at the point of take-off there is now a new check: a verification of those 3 parameters which have already been checked when the plane started up. The checklist is something extremely important: it’s the (re)verification of actions we’ve already done.

University College Hospital London anaesthetists reviewing the WHO surgical safety checklist. Source: UCLH website

In the medical world, similar procedures have also been implemented at 3 specific moments: just before anaesthesia, before the first incision and just before bringing the patient back into the recovery room (note: this is known as the WHO Surgical Safety Checklist, now mandatory for every operation in France and in the UK). In France in 2006, surgical teams got the patient’s identity or the side of the body to be operated on wrong 2,000 times. Now we check the identity of the patient and which operation is to be carried out (i.e. on which limb and/or which side of the body) so that everyone involved knows what needs to be done and who needs to do what. This makes the system more reliable and robust in dealing with this type of error.

ANENA: Are the procedures easily accepted by professionals?

FD: We often hear that ‘We’re different, it’s not the same for us, we don’t do the same job as the others’. In aviation we’ve heard that it’s an art to be a pilot, and likewise in the medical world: ‘You cannot turn the profession of a doctor into a procedure’. In the 1970s and 80s, pilots found it very difficult to accept the wish to standardise procedures. Today, this has been done and is accepted by all.

View of a typical airline cockpit. iStock photo.

Here’s an example: in a plane cockpit, the captain (the older, grey-haired man) sits on the left, and the young co-pilot sits on the right. During any ‘critical’ part of flying the plane, each does a control check on the other. The co-pilot thus ‘checks’ the captain. This took a long time to implement, but today it no longer shocks anyone and has been accepted. Without compromising the authority of the captain, we have made the system much more reliable. Having someone else watch you is very important for avoiding errors and accidents.

In our collaboration with the medical profession, we worked with them to re-write their own version; it wasn’t just a copy-and-paste job of what we’d done in aeronautics. Today, the treatment of breast cancer is a fairly standardised procedure wherever you are, be it in Marseille, Paris or Grenoble. For other organs, however, there are still plenty of different professional interpretations, so there are still things that could be standardised. Not everything, because each patient is different, just as every flight or mountain slope is different. But there are good practices to be assimilated and taken on board by all.

ANENA: Is the acceptance of procedures an outcome of having made an error? Is someone who’s made an error more receptive to the implementation of safety procedures or corrective action?

FD: Yes. That’s why reports and papers about errors, and shared feedback, are important. In private aviation, this is the sort of thing we read about the most: for example, 3 cases of running out of fuel and the accounts of how they came about. When you read the stories, you can easily see how you could make the same mistake yourself. It’s rather like taking on board other people’s mistakes.

Another example concerns childhood cancer. Radiologists were all convinced that a scan would be read identically for the same type of tumour, regardless of the practitioner. This is clearly false, because the interpretation element is very real. Once the problem was recognised, corrective action was needed. From now on, with every critical scan, particularly in children’s cases, 2 radiologists examine the scan and compare their interpretations. If both come to the same conclusion, fine. If not, they do it again. It’s the same principle as that of the pilot/co-pilot in aeronautics.

Encouraging people to accept good practice is also easier if they aren’t given too many examples of good practice to follow. Overloading them is a dangerous trap, and there’s no need to make thirty different checklists. We just need to provide a small amount of information with a few key points, but in the right place and at the right time. It’s better to have less information, provided that information is effective. In a hospital setting, where time is scarce, I proposed something quite surprising: I asked them to do an in-depth analysis of just one ‘near-miss’ type of event each month, and then come up with three or four corrective solutions. I then asked them to hold on to just one of these solutions, because I wanted them to direct their energy into actually implementing corrective action. If we’re capable of doing this, just imagine how many corrective actions could be implemented, taken on board and followed after a year or two! The initial reaction from the Ministry of Health was to say that if there were 15 incidents, these would all need to be analysed, and that if 20 corrective actions were identified, these would all need to be implemented. But that’s not realistic.

ANENA: Talking about analysing and preventing the risk from human factors, do you believe there are commonalities in terms of analysis and prevention between winter sports and your specialist areas of aviation and health?

FD: I’m convinced of it. Provided that any methods and procedures implemented are specific to the profession or activity concerned and written in conjunction with the relevant experts. You can’t impose anything on a professional body without them playing an active role in it. Talking with the guides I’ve found the same sort of questions and attitudes we observed in the airline sector in the 1970s. They are professionals who want to progress and understand, and who have already put specific actions into place.

Some principles are clearly the same – men and women have the same weaknesses and flaws, whatever their line of work or activity. The feedback is generic; what’s left is the creative part of imagining how to implement it in each sector. Procedures are necessary – not too many of them, but necessary. They must be legible, feasible, acceptable, understood and, most importantly, written by professionals in the relevant sector. These are key, unchanging underlying principles that hold across the specific disciplines.

ANENA: What are the key steps for transferring these principles and implementing controls of good practice in these two sectors?

FD: That’s where all the difficulty lies. Carrying out a thorough analysis of accidents, incidents or precursor events enables us to identify and understand the causes. This crucial work leads us to identify one or more corrective measures, which then need to be implemented.

Climbing partner check. ORTOVOX photo

For example, after analysis of a mountaineering accident, say we’re told that from now on we all need to check the knots of every member of the rope team. Everyone ties their own knot, but it then needs to be checked by a third party. That’s very easy to say on paper, but how can you follow through and ‘control’ it when you’re out on the mountain?

It’s a lot more difficult to carry out corrective measures than it is to define them. It takes a lot of energy to put them in place. Integrating them into basic training programmes provides an additional guarantee that they will be taken up later. It’s a long process.

When we set up the operating room checklist, I remember the reluctance and rejection from certain surgeons who didn’t want it. Getting corrective action measures accepted is a very long process. People have to understand what they are doing and what you’re asking them to do. You can’t create a checklist simply to please a Mountaineering Federation; likewise, we didn’t devise procedures to please the Managing Director of Air France. In aviation, there are 3 things we absolutely do not want: a crash; a mid-flight collision; or losing a passenger in mid-flight. There are procedures in place to prevent these outcomes. You can’t make up successful preventative procedures just to please the powers that be – people will only follow them because they make sense. When someone understands the reason for doing something, it’s easier for them to take it on board and do it. If you need to implement an action but people don’t understand the reason why, it won’t work.

In terms of assessment and controls, it’s tricky because we’re controlling professionals who do their job well. Aviation is a highly controlled environment with simulators and on-board controls. There may also be observers during the flight, watching us. We have to accept having people observe us, but this needs to be done with great tact.

ANENA: How do you explain why aviation became a forerunner in integrating human and organisational factors?

FD: An air accident very often causes hundreds of deaths and attracts huge media coverage. In some ways this can be seen as an opportunity, because it forces us to do something, to move things on, to realise that something’s not working. Paradoxically, 4,000 deaths occur on French roads every year – the equivalent of 40 Concorde crashes a year – and somehow we accept this figure. Even so, it’s strange how it takes this notion of mass fatality to bring about change or raise awareness…

…………………………………….

In the next article we’ll take a closer look at how the experts explain why intelligent, competent people make the error of not applying basic risk reduction points in these high consequence/low validity risk contexts. Then we’ll take a look at a solution aimed at preventing this type of lapse of judgement and other simple human factors errors that lead to avoidable accidents. Until then, I look forward to seeing you here again on henrysavalanchetalk.com, where Safety is Freedom!
