Continual Improvement
This essay was originally published on 51CM on 04 September 2022, and was edited and re-published on 51CM in 2023.
Continual improvement is a core tenet of management systems and it’s something all organisations should strive to achieve across their security and crisis management programmes.
But what does continual improvement mean for crisis management?
What should companies be trying to improve? What options are available to achieve these improvements? How can they be measured?
In the context of crisis management, any focus on continual improvement should consider the different aspects of an organisation's crisis management capability. At a high level, this capability can be broken down into two core components: the procedures the organisation will follow during a crisis, and the people who make up the Crisis Team.
Let’s start by looking at how organisations can improve their procedures.
In my view, organisations should actually minimise the continual improvement of their procedures. While this approach may appear counterintuitive at first glance, there’s method to my madness.
You don’t want a scenario where procedures are continually changing. Every change will require additional training to inform the team of the change and then an exercise to practice the new procedures. At a more operational level, continual changes will be confusing for the Crisis Team and may actually increase dysfunction among the team should they have to respond to a crisis.
The ideal approach is for organisations to get it right the first time. Organisations should invest up front in a proper crisis plan that’s tailored to the needs of their organisation. Of course, no plan is perfect. But the organisation should aim to get their crisis plan to at least an 80% solution first time round.
Once the crisis plan is in place, it's okay to make minor updates. In fact, it's necessary. However, major reviews and substantive changes should only occur once every five years or so. Any more frequently, and the team will be in a constant state of flux and won't be effective.
Policies can and probably should change over time. However, I would be very wary about changing crisis response procedures, particularly where a mature team has been relying on those procedures for many years. Any roll-out of new procedures must be accompanied by comprehensive training and exercises.
What does all this mean in practice? It means that, unless your plan is terrible, it’s probably not a great idea to bring in a consulting company and pay them a bunch of money to re-write your crisis plan. The time to bring a consultant into your organisation to write the plan is early on. Instead of paying someone to re-write your plan, spend time with industry leaders and pick their brains about the types of details they include in their own plans. You can learn a lot by listening to what others have already learned.
To sum up, change for the sake of change when it comes to crisis response procedures isn’t necessarily good and can actually undermine team effectiveness.
As you’d expect, you could have the best crisis plan in the known universe, but if you don’t have a competent Crisis Team, you’ll fail badly during a crisis. Let’s look at teams next.
I’m passionate about building highly competent Crisis Teams. Probably unsurprisingly given the niche nature of this field, this isn’t a passion shared by many others. This is a shame, because where you’ll get the most bang for your buck when it comes to continual improvement is up-skilling and ‘up-experiencing’ the Crisis Team.
There are a few different aspects to consider when thinking about the continual improvement of a Crisis Team. A good starting point is to look at the structure of your team and determine whether you have the right people on the team. The reality is that the ‘right’ people will depend on the nature of the crisis event. An effective approach is to have a core team and then an extended team that can augment the core team with specific capabilities. Taking this approach ensures you’ll be able to maintain a robust yet flexible team structure.
Once you have the right people in the room, the next aspect to consider is building the team’s competence to manage crisis events. Building competence requires both training and exercises. However, training alone won’t be enough. Training doesn’t build the necessary levels of experience and judgement that are essential to competently manage a crisis. Crisis simulation exercises, where the Crisis Team is put through its paces in a realistic setting, provide an ideal medium for building experience and competence.
In my view, crisis simulation exercises are essential. Aside from a Crisis Team continually responding to actual crisis events, which is unlikely, crisis simulation exercises provide the only option to build the experience of a Crisis Team to the level necessary to competently manage a crisis.
Given the value of crisis simulation exercises, how can you build the Crisis Team’s experience in a way that’s guaranteed to improve the team’s capability and provide real results?
First, you’ll need to consider the number of crisis simulation exercises you conduct each year. If you want to simply maintain the capability without any desire to improve that capability, then one crisis simulation exercise a year will probably be enough (noting that you’ll have some turnover of team members each year). However, if you want to improve the capability of the Crisis Team, you’ll need to conduct more exercises. Of course, more doesn’t always equal better.
So in addition to conducting more exercises, you’ll also need to engineer the crisis simulation exercises to expose the Crisis Team to different types of crisis events and to the different types of situations that may occur within those events. Most crisis exercises aren’t deeply designed. Rather, they’re superficially designed simply to tick a compliance box. Rarely do the people designing the exercise do any serious engineering work behind the scenes to make sure the exercise actually improves the competence of the Crisis Team. Not their fault—most haven’t been exposed to a well-designed simulation exercise and won’t know any better.
So, while quality and quantity are both key factors, I would say that the design of the exercise is what drives effective learning and is what enables continual improvement.
Remember that one of the key benefits of crisis simulation exercises is that they provide a form of inoculation. By exposing the team to the stressors inherent in managing a crisis in a controlled environment, the simulation exercise helps to build real-world competence. Sitting in a room staring at a PowerPoint presentation while someone talks about company policies simply can’t achieve the same outcomes. The next question is this: how do you know you’re actually improving?
Measuring improvement in many specialist fields is hard. In crisis management, it’s harder still. So much so that most organisations don’t even bother trying to measure the improvement of their crisis management programme. Perhaps they assume that improvement is being achieved simply because they conduct an annual review of their crisis plan, or because they run training and exercises for their Crisis Teams. These steps are essential, but they still don’t provide a definitive answer to the question “are we better at crisis management today than we were this time last year?”
One of the key challenges in measuring competence is being sufficiently experienced in the field to be able to identify strengths and weaknesses, and to recommend improvements. While an elite athlete will have a coach, organisations don’t have any equivalent when it comes to crisis management. Each member of the Crisis Team has a day job, and no one on the team is actually a specialist in crisis management. It’s no different really from having a company social events committee who almost certainly aren’t experts in event management.
The consequence of not having experts in crisis management is that feedback after exercises and actual crisis events is typically limited to how well the team worked together. There may have been aspects of the response that the team completely overlooked, but the facilitator would not be able to identify these because they lack expertise in the craft. Accordingly, there’s a huge advantage to be gained by bringing in a crisis management specialist to design and deliver crisis simulation exercises.
Continual improvement for a crisis management programme is difficult. So much so that I doubt most organisations do more than just maintain their crisis management capability.
Organisations can and should do more. Bringing in external specialists to design and deliver crisis simulation exercises is an excellent way to improve the experience of the Crisis Team, by exposing them to realistic situations. Responding effectively to a crisis can really only be learned by responding to crisis events. You have to learn by doing. The more times a Crisis Team is exposed to realistic situations within the construct of a crisis simulation exercise, the better.