The traditional way of organizing companies was in functional departments, where people with the same skill set and education could focus on specific challenges, solve them and then hand over the result to the department responsible for integrating all the parts from all the functions into one working system. The perceived advantage was that each expert could focus on his or her own domain and not worry about anything else.
In practice, that model didn’t work very well: inter-team coordination costs easily skyrocketed because each function made all kinds of assumptions about the other parts of the system, and these assumptions, typically based on historical data, generally proved inaccurate. As the saying goes, assumption is the mother of all screw-ups.
As individuals, we tend to take the same approach. Whenever we need to solve a problem, we look for ways to reduce it to the bare essentials and ignore everything we don’t think is relevant. The challenge is that most of us tend to ignore more than what’s actually relevant.
This is problematic for at least two reasons. First, in a digital world, the pace of change is much faster than before and continues to accelerate. As a consequence, things you could earlier assume to be stable are now likely to shift while you’re focusing on the challenge at hand. The risk is that by the time you’ve solved the problem with a beautiful, elegant solution, the problem itself has shifted and the solution no longer fits the context because the interfaces it assumed are no longer present.
A beautiful example in automotive is the CD player. For years, automotive companies were looking for reliable solutions to store several CDs in the infotainment system to give drivers access to as much music as possible. Just when good solutions were becoming available, the industry shifted to Bluetooth access and mobile-phone-based streaming solutions. We now all listen to Spotify while driving instead of juggling CDs.
Second, we tend to underestimate second-order effects. Whenever we develop solutions, we tend to focus on the particular outcome we’re looking to accomplish, i.e. the first-order effect. However, especially in complex systems, the effect of an action isn’t easy to predict and the outcome may easily be the opposite of what was intended.
The world is awash with examples of unintended consequences. Most historians consider the Treaty of Versailles at the end of World War I the root cause of World War II, even though it was intended to curtail Germany and ensure it couldn’t wage war again. Another illustrative example is Mao’s Four Pests campaign in China, where the population was encouraged to eradicate sparrows because they ate newly sown seeds. The consequence was plagues of locusts that destroyed entire harvests. Both examples led to millions of deaths.
A controversial current example is the approach governments take to combat Covid-19. With entire societies shut down and all medical staff focused on treating Covid patients, many wonder what the secondary effects are in terms of avoidable deaths from diseases left untreated, including cancer, as well as poor mental health and increased suicide rates due to the lack of human contact. Could it be that the deaths and deteriorated quality of life due to the secondary effects exceed the lives saved by shutting down society and focusing the entire medical establishment on Covid?
The best approach to deal with these issues is, I believe, threefold: scope, humility and experimentation. First, whenever taking on a challenge, the first step has to be to ensure that you’re not unintentionally scoping the problem down to too narrow a focus. As Einstein reportedly said: you should make everything as simple as possible, but not simpler. In most cases, because everything changes so quickly, this means including more in your scope than you traditionally would have done.
Second, we need to exercise humility. Rather than assuming we know, it’s wise to accept that most of the time, we actually don’t know. This is because there’s far more to know than our poor brains can encompass and because many things simply are unknowable until they happen. Look no further than the stock market: even people who have worked in the industry for decades and spend all their time in the market are unable to beat it in the long run. Of course, there are exceptions, like Warren Buffett, but these are the exceptions that prove the rule. The Dunning-Kruger effect not only shows that people new to a field tend to overestimate their skills, but also that true experts know how little they know about their field.
Third, rather than taking decisive action in the face of uncertainty, the better approach is to figure out how to run experiments to test the effect of certain actions. One illustration is the Peltzman effect: regulation intended to increase traffic safety tends to cause road users to take more risks as their perceived safety increases. Many laws and regulations don’t have the effect lawmakers intended, but turn out to be neutral or even negative due to secondary effects. In many cases, the only way to understand the consequences of an action is to run the experiment, learn from it and then decide on the next steps.
In a digital world, the pace of change is fast and increasing. As a consequence, factors that earlier could be considered sufficiently stable to be ignored when addressing a challenge need to be incorporated in the scope. Therefore, it’s important to think holistically, to accept that there’s much that we don’t know or even can’t know and to experiment where possible to understand whether the intended effects are indeed achieved and negative consequences avoided. The road to hell is paved with good intentions, but I don’t think any of us wants to end up there.