ETHOS Issue 04, Oct 2008
History is not without its disruptive surprises. At the end of the 19th century, it was thought that everything that could be known had been discovered. Yet only a few years later came the X-ray, subatomic particles, nuclear fission and other discoveries that completely changed the world.
Clearly, however, the pace and nature of disruptive change is evolving more rapidly than ever before. It is now commonplace to assert that the world is more complex and uncertain.
Three major factors have helped accelerate the pace of change. First, the increasing sophistication of markets and distribution channels has allowed new products to be diffused very rapidly, not only within national borders but also globally.
Second, transport and communications technologies have brought the world closer together. They have enabled global production, and have spread both the economic gains and the knowledge of production across the globe in a far shorter time.
Third, the revolution in information technology has greatly speeded up the diffusion of information, both in terms of technology and also of social behaviour. Societies that used to exist in isolation without outside influence can now see and learn through the media or Internet about what is going on elsewhere almost instantaneously. Along with easier travel, information technology spreads and speeds up the learning process of societies in both good and bad ways.
The positive aspects of these developments are well-known. On the downside are “globalised” threats such as contagious diseases, financial crises, terrorism and conflict.
The increase in complexity and uncertainty brought about by these factors has implications on the way the public and private sectors manage their affairs.
RISKS OF COGNITIVE FRAGMENTATION
Studies into human cognition have shown that the human mind cannot handle too many items at once. It tries to cope with complexity by breaking it down into compartments or sub-components which are left to be handled by other people or at other times. This has given rise to the ever finer modes of specialisation that characterise modern societies. However, this also brings with it the risk of “fragmentation”. On occasion, this fragmentation can lead to a serious failure as shown in the following example by Gary Klein: 1
“During an operation, the surgeon decides to lower the patient’s blood pressure. He directs the anaesthesiologist to give the patient a drug that will have this effect, but does not explain what he is trying to accomplish. The anaesthesiologist gives the drug, notes that the patient’s blood pressure goes down, and boosts the level of another drug that will increase the blood pressure. To the anaesthesiologist, this is standard operating procedure to keep the patient’s vital signs stable. The surgeon notes that the blood pressure is higher than he wants and directs the anaesthesiologist to increase the dosage of the first drug. The anaesthesiologist follows the request, watches for the blood pressure to reduce, and then boosts the drug that will return the blood pressure to its normal level. This cycle continues until the patient ends the game by dying.”
This is a simple example that could be put right once the lesson has been learnt, but there are more complex situations where it is not as easy to avoid the negative consequences of fragmentation.
For example, people put their money into banks or insurance companies and assume that these institutions—and their regulators—know what they are doing. That did not prevent banks lending to sub-prime borrowers. The banks thought they had diversified their risks. They assumed that the financial products had been structured to minimise risks. They looked to credit rating agencies and even loan insurance agencies to take care of some of the work of risk assessment and management. So the whole chain of fragmentation continued.
Had nothing untoward happened, most of them would have been richly rewarded. As it turned out, the risks were correlated, and the whole chain fell apart with very serious consequences on a system-wide basis, harming even those who had little to do with it. The same could be said of the food production chain and its potential impact on human health.
We act like a hive of bees, with each bee tending its individual cell and depending on the actions of thousands of its companions for the colony to succeed. Through some self-organising principle, this arrangement succeeds remarkably well. Indeed, so widespread is this fragmentation process that it is a tribute to human organisation that failures are relatively rare. However, as complexity increases, we must expect the risks arising from fragmentation to increase as well. We must, therefore, adopt an approach that caters to such risks rather than assume that they cannot occur. This calls for a different mental model in the management of complexity.
Most of our current mental models assume that, given the starting conditions, we can reasonably predict the outcome of management actions and therefore choose a set of actions or a strategy that brings us to a desired outcome. Studies in complexity clearly indicate that such precision of prediction is impossible for any reasonable period into the future. We need to replace the current mental model that says, “If we do such and such, then an outcome of such and such will result,” with a new model: “If we do such and such, then a certain range of outcomes is likely to result.” In fact, we have to be prepared for outcomes that are totally unexpected and would perhaps have been dismissed as “crazy” before the fact. Such mental models do not come easily to us, because many formal education systems teach predictability and a deductive form of knowledge, i.e., that a given input invariably leads to a given outcome.
RISK MANAGEMENT AS POLICY
It has been said that good execution of a mediocre strategy is better than a brilliant strategy poorly executed. Yet many managers tend to assume that execution is something that can safely be left in the hands of other, usually more junior, staff. This is also an outcome of a linear mode of thinking, i.e., that once the key input parameters have been set in the form of the strategy chosen, the outcome must inevitably follow as night follows day. But if the outcome is uncertain and the unexpected has a fair chance to happen, we need to pay more attention to execution.
In particular, management has to accept the need to be prepared for surprises at all times. It should:
- cultivate a mindset that anticipates or at least prepares for “wild card” scenarios;
- accept the need to build and manage an effective risk anticipation and management system; and
- accept a certain cost to “insure” against low probability but high impact outcomes, e.g., choosing a strategy that yields slightly less value than the “optimal” strategy if doing so takes into account a low probability but high impact (or loss) outcome.
Adopting such a way of thinking is not without challenges. Practical leaders seldom want to waste time thinking about low-probability future events. They have their plates full with current problems. Incurring current costs to insure against future events may seem fruitless, particularly if fast and mobile managers would have moved on to new pastures by the time any adverse impact happens, if at all. But for those who have long-term and total responsibilities for the whole organisation, risk management and the associated costs cannot be avoided.
BEYOND SCENARIO PLANNING
Amongst the many tools of risk management, scenario planning has been advocated as a major tool for combating future uncertainty and addressing the problem that the future is inherently unpredictable. Scenario planning is extremely useful in creating awareness of risks and the possibility of unexpected outcomes. However, realising its full benefits requires a better understanding of the process that links the painting of scenarios to the formulation of strategies.
Both analysis and practice show that the gap between scenarios and strategies cannot be bridged directly. One can see this with a simple artificial example of a stock market. The scenarios are simple. For a certain time horizon, the market is either up, down or flat. Knowing these scenarios does not at all help the investor in coming to a strategy. It is impossible to create a strategy that will be robust under all scenarios. The strategy of staying out of the market or having a fully diversified portfolio will earn close to zero returns over time.
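The point can be made concrete with a small sketch. The payoff figures below are invented purely for illustration, not drawn from the article: three simple strategies are scored against the three scenarios, and each strategy that can profit in one scenario loses in another, while the only strategy safe under all three earns nothing.

```python
# Illustrative (assumed) returns for three strategies under the three
# scenarios named in the text: market up, down, or flat over the horizon.
returns = {
    "fully invested": {"up": 0.20, "down": -0.20, "flat": 0.00},
    "short the market": {"up": -0.20, "down": 0.20, "flat": 0.00},
    "stay out": {"up": 0.00, "down": 0.00, "flat": 0.00},
}

# For each strategy, report its best- and worst-case scenario return.
for strategy, payoff in returns.items():
    worst = min(payoff.values())
    best = max(payoff.values())
    print(f"{strategy:>16}: worst case {worst:+.0%}, best case {best:+.0%}")

# No strategy is robust: any strategy with upside in one scenario has
# downside in another, and the scenario-proof choice earns zero.
```

Knowing the list of scenarios therefore narrows nothing by itself; only indicators that shift the odds between scenarios make one strategy preferable to another.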
What scenarios can do is start one on the process of thinking about the possible range of outcomes, and of developing leading indicators or “trip wires” which signal which outcomes are becoming more likely. Developing such indicators calls for great domain knowledge, extreme skill and judgment, and the ability to read so-called “weak signals”.
This is not a trivial task. Take the stock market for example: if the indicators point towards certain outcomes rather than others, then a strategy can be chosen. The sub-prime loan crisis is a good example of this process. The various scenarios of what could happen were surely not unknown to the banks, but in its actual selection of strategy, each bank acted differently on the basis of its understanding of the market and its view of the indicators it had developed. As it turned out, only Goldman Sachs read the appropriate signals and was duly rewarded.
THE NEED FOR CONTINGENT EXECUTION CAPABILITY
Another vital step in the scenarios-to-strategy process is building contingent execution abilities. Singapore has been extremely successful in identifying needs and building the execution abilities and institutions to meet them. This is a key ingredient of our “brand name” overseas, and one that many other countries have sought to learn from and emulate. These execution abilities will have to be maintained. Going forward, however, the future uncertainties identified through scenarios will have to be addressed through what may be called contingent execution abilities: abilities that can be called upon not for ongoing projects but for unexpected outcomes.
One good example of this approach is the military. It is clearly impossible to foretell the development or nature of any specific conflict and how it will unfold. No military is built to execute a specific strategy, however likely it is. Instead, armed forces must be trained and equipped to be capable of dealing with the full range of possible outcomes envisaged under the scenarios.
The fact is that, outside of the military, contingent execution abilities are rare because of the cost of this extra “insurance”. In stable times, when conditions are unlikely to change much, or change slowly enough for new execution abilities to be acquired in time, there is little point in paying this “insurance” cost. But the outcome of armed conflict is so clearly damaging that countries do pay this insurance cost for such contingent execution potential. Civil defence measures to manage potential natural disasters and preparations for health threats such as a flu epidemic are other examples of contingent execution ability.
CONCLUSION: EXPECT THE UNEXPECTED
If uncertainties and their downside risks are increasing in pace and impact, it is necessary to pay more attention to risk identification and anticipation. More creativity and diversity are needed in this process; the scanning and interpretation of future outcomes are made more difficult by the human tendency to be trapped in past mental models. We must accept that linear extrapolation from past experience is not a sufficient guide.
Scenario planning, amongst other techniques, offers good help in extending the ability to “foresee” unexpected outcomes. However, enormous domain knowledge, skill and often luck are needed to set up a system of indicators or trip wires that will help guide strategy formulation. In the execution of strategies, closer monitoring of the impact of decisions and, in particular, a system of risk assessment and management have to be deliberately set in place and institutionalised. It is no longer possible to trust that a well-reasoned and thought-out strategy will be executed flawlessly or encounter no unexpected outcomes.
Finally, where the costs and benefits justify it, thought should be given to the development of contingent execution abilities beyond those needed for current operations.
1. Klein, Gary, Sources of Power: How People Make Decisions (Cambridge, MA: MIT Press, 1999)