Common pitfalls in preparing for the future and how to avoid them.
The COVID-19 pandemic has put a spotlight on futures and scenario planning, creating a wealth of predictions and future scenarios. Working within futures, I am excited about the exposure foresight is receiving. However, this deluge of foresight fails to recognise people’s difficulty in working with foresight — the common misconceptions and cognitive biases that present themselves when thinking about the future. As a strategist with a background in cognitive psychology, I aim to remove or at least minimise biases in the strategic process. Therefore, I want to share the most common pitfalls I have encountered in working with futures and how to deal with them.
Humans are bad at preparing for uncertainty — events with a big impact but a low and uncertain probability. This is worrying given that, in the last decade, the world has seen its fair share of crises. We have witnessed severe health crises, such as H1N1, MERS and Ebola, to name a few. Different parts of the world have also faced environmental crises, from the mega fires in Australia to droughts across the world. Only just over a decade ago we had the global financial crisis of 2008, which also initiated a global recession. These circumstances are referred to as VUCA conditions: Volatile, Uncertain, Complex and Ambiguous. Unsurprisingly, not preparing for the shocks these VUCA conditions create is a costly affair. For example, the National Centers for Environmental Information calculated that the total cost of billion-dollar weather and climate disasters in the US over the last five years was $537 billion, while the Roosevelt Institute calculated that by 2016 the global financial crisis had cost the USA $4.6 trillion. The reality is that VUCA conditions are the new normal, as our world becomes more interconnected and pressures on existing systems become more intense. As a consequence, more businesses have started to use foresight and scenario planning to build resilience into their strategies.
In the last couple of weeks, scenario planning has been catapulted into the mainstream. Management consultancies (such as McKinsey, Bain and BCG), governments and think tanks have been flooding our inboxes with various scenarios to help us navigate these VUCA conditions. While it is great to see an abundance of helpful insight into plausible future scenarios, one thing has been overlooked: humans are really bad at working with foresight. We have been given the IKEA flat-pack cabinet, but without the manual. One unfortunate example of this human weakness is the foresight work the UK government did in relation to pandemics in 2016, called Exercise Cygnus. While the scenario work was very thorough, there was a lack of action on the back of the scenarios. Similarly, when Hurricane Maria hit Puerto Rico in 2017, the US government response was severely inadequate despite an earlier foresight project that had highlighted the critical weaknesses.
So why is it that even when people have a plethora of robust future scenarios they fail to act on them? While one could write a book about this topic, I want to focus on two pitfalls I have encountered most frequently in working with clients on futures and scenarios.
Firstly, futures are sometimes misused in an attempt to predict the future. The value of working with future scenarios lies not in predicting the future, but in broadening people’s views and challenging beliefs and assumptions about the future to enhance their preparedness and build resilience. The scenario planning process is designed to be iterative, multi-stakeholder and multidisciplinary in order to surface assumptions, present different perspectives and develop different possible actions. There is certainly value in modelling aspects of the future; however, in VUCA conditions, the value of these quantitative models is low because the past is an increasingly poor predictor of the future. Therefore it is important to be clear on the use and value of scenarios and to manage expectations around probability. Failing to be clear on how somebody will be able to use foresight work will lead to disappointment and yet another report gathering dust on the server.
Secondly, there are certain cognitive biases all of us have that prevent us from engaging with and acting on futures. As somebody with a background in cognitive psychology, I look out for biases in decision making and have developed some methods to counter them. If you are interested, there is a very long list of cognitive biases and you can find a handy cheat sheet here. However, in this article I focus on those I have encountered most frequently and how you can deal with them. Anticipating these biases and spotting them early on will help you design better future scenarios and get people to take action based on them.
Confirmation bias: “I haven’t seen that before, I’m not so sure about that.”
Confirmation bias is when people look for information that confirms their existing views while ignoring evidence that conflicts with their beliefs. To avoid confirmation bias, the process needs to emphasise the development of several scenarios with a diverse group of people. A great exercise in this case is to interrogate the scenario, asking what the opposite case would look like and what would be needed for it to happen. Arguing different perspectives helps people go beyond their initial views and stimulates research into areas that weren’t initially thought of.
Overconfidence bias and optimism bias: “We are the best in our category, this won’t affect us.”
Overconfidence bias happens when people place too much faith in their own knowledge and views. Closely related to this is optimism bias which is when people believe they are less likely to experience a negative event. This can lead to negative scenarios being dismissed or minimised, leaving the organisation unnecessarily vulnerable.
“Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.” Daniel Kahneman
One way to deal with this bias is to assess the organisation’s capabilities as a group and encourage objective comparison with peers. For example, comparing the performance of the business to the industry average, or asking how much effort it would take for competitors to match the business’ capabilities, can help to reframe a business’ competitive position.
Loss aversion: “This would cannibalise our existing business.”
In the case of loss aversion, potential losses are given disproportionately more weight in decisions than potential benefits. Put simply, the fear of losing is stronger than the pleasure of winning. A lot has been written on this topic, and because loss aversion is influenced by both intrinsic human biases and company culture, it is difficult to deal with. However, I have found that when discussing future scenarios it helps to assess the risk of the investment and contrast it with the risk of doing nothing. More approaches can be found in this Harvard Business Review article or this McKinsey article.
Availability bias: “That is just an anomaly, these are outliers.”
People use their own recent experience to frame the future. This bias is particularly common, and other examples include “This is only a tiny segment of people.” or “My children don’t show this particular behaviour.” It leads to over-generalisations, tunnel vision and, as a result, missed opportunities and threats. This bias is why it is crucial for scenarios to include a diverse set of protagonists and actors. Describing how a future would unfold for different people helps to create a more holistic picture that better outlines the mechanics of the future system. Furthermore, when thinking about the future, examples of change will naturally be small and contained. If all the examples of shifts were well known and accepted, they would be the present, not the future. As William Gibson said, “The future is already here, it’s just not very evenly distributed.”
Hindsight bias: “I always knew that was going to happen. I’ve often been right in the past with my predictions.”
In the case of hindsight bias, people see past events as more predictable than they actually were before the event took place. This leads people to overestimate their ability to anticipate the future. Here it is important to remind people of the purpose of futures work, which is not to predict the future with a degree of certainty, but to think about the different futures that might unfold and what they would mean for us as a business.
Status quo bias: “I’m not convinced this means we would need to adapt. Our current system works just fine.”
When people prefer things to stay the same and continue as usual, even though this is suboptimal, they are displaying status quo bias. There are a couple of potential underlying reasons that make this particularly tricky to deal with: it can emerge because people want to avoid regret, don’t want to invest resources into changing, or are psychologically committed to the current situation. When noticing status quo bias, it helps to break the change down into progressive steps rather than presenting the future scenario as a complete shift. Another exercise that works well is to break the current situation down into what people like and believe works well, as well as what can be improved. This removes some of the commitment to the current status and helps people engage positively with future alternatives.
Cognitive biases are incredibly hard to avoid; however, being aware of them and planning for them helps to minimise their impact. Furthermore, framing futures as an approach to build resilience into strategies through preparation, rather than prediction, increases the utility of the work. Tackling these two common pitfalls goes a long way in allowing us to better prepare for the future by creating more robust scenarios and removing barriers to action. While the process of preparing for the future is labour-intensive and challenging, in a world characterised by increasing VUCA conditions it has become essential. Not preparing for the future, therefore, is not an option. As Richard Rumelt put it, “Strategy is always a balance of on-the-spot adaptation and anticipation. By definition, winging it is not a strategy.”