There’s little doubt in anyone’s mind, I hope, that automation lies at the heart of all the progress in human well-being and economic development. It started with outsourcing physical labor to machines, initially through watermills and windmills and later through internal combustion engines and electric motors. Subsequently, “white collar” work was increasingly automated through the use of computers, and even activities requiring intellectual labor started to be automated. This process is still ongoing, affecting industry after industry.
There’s an interesting pattern where an activity to be automated is first positioned and presented as supporting, with a human still running and in charge of it. The next step is to automate more of the work and to position the human increasingly as the supervisor of the automated process. Finally, with the increasing performance and reliability of the automated process, the human is removed from the loop entirely and the cost of the activity has dropped by orders of magnitude. This is why automation and the use of computers have such a strong deflationary effect, even if it may take a long time to capitalize on them.
One good example is the evolution toward autonomous cars. Currently, the majority of vehicles on the road have some sort of automated driver support with the driver still in control. The next step that’s increasingly available in modern cars is a mode, especially on highways, where the vehicle controls speed, lane keeping and distance to other traffic participants, but the driver is the supervisor to ensure that the automated functions don’t make mistakes. The final stage, which will be available in the coming years, is where the vehicle, at least in some operating domains, is fully autonomous and doesn’t require human supervision.
Of course, many are concerned about this in terms of employment as, in the US alone, more than 3 million people work in transportation, and many of them will lose their jobs when autonomous vehicles can handle the majority of use cases. My prediction is that the first step will be for trucks to travel fully autonomously from a logistics hub outside one city via a highway network to a logistics hub close to another city. Human drivers can then manage the last mile into the city. Similar scenarios are being developed for sea shipping. Even if this will affect many professions, I view it as the umpteenth disruption of work caused by automation and I don’t worry at all about our ability to create new professions for the people no longer required in logistics. During the 20th century, the percentage of people working in agriculture went from somewhere around half of the entire workforce to around 1.5 percent, and nobody worried too much about the poor farmers being put out of a job.
For all the enormous benefits provided by automation, as discussed in the previous post, there’s a “dirty little secret” concerning automation that few people talk about or reflect on: automation tends to stifle change and innovation. The investment required to automate a process is often significant. Also, the knowledge of how the process was actually automated tends to be quickly lost, as the automation is often done in a project whose members move on afterward. Consequently, the cost of making changes is quite high and companies go out of their way to avoid change or to implement changes outside of the process. This may easily result in suboptimal outcomes as processes aren’t aligned with current needs, but rather based on tradition and outdated beliefs.
The best way to address large, complex problems very often is to break them down into smaller chunks and solve all the pieces individually before integrating them into a comprehensive solution. It’s a basic, generic pattern applied to virtually all forms of problem-solving and design, and automation is no exception. The focus of modularization is to break processes into relatively independent blocks that can be freely composed, decreasing the cost of changing processes. The question is, of course, what the right blocks are and how to compose them, i.e. process orchestration.
While it’s difficult to provide generic guidance, we can identify a few patterns. Most processes are a sequence of activities. Each activity typically has a trigger, collects input data, performs an operation and often generates some output. In addition, there are often data conversions, application launches and coordination across different systems. Tools for robotic process automation often provide a set of building blocks that can be viewed as generic components.
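To make the activity anatomy concrete, here’s a minimal sketch of activities as composable blocks, assuming a simple in-process pipeline. The names (`Activity`, `compose`, the example steps) are illustrative only and not drawn from any specific RPA tool:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Activity:
    """One process step: takes input data, performs an operation, emits output."""
    name: str
    operation: Callable[[Any], Any]

    def run(self, payload: Any) -> Any:
        return self.operation(payload)


def compose(*activities: Activity) -> Callable[[Any], Any]:
    """Chain independent activities into a process: each step's output feeds the next."""
    def process(payload: Any) -> Any:
        for activity in activities:
            payload = activity.run(payload)
        return payload
    return process


# Illustrative blocks: a data conversion followed by a calculation.
parse = Activity("parse", lambda s: int(s))
double = Activity("double", lambda n: n * 2)

pipeline = compose(parse, double)
print(pipeline("21"))  # prints 42
```

Because each block only depends on its own input and output, swapping or reordering steps doesn’t require touching the others, which is exactly the property that keeps the cost of process change low.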
One aspect of process orchestration is the notion of interfaces as modules or blocks need to be able to interact with each other. An important development over the last decades has been to minimize dependencies between the sender and the receiver. Currently, the best way we have to accomplish this is to use a message bus to which all modules, blocks or components are connected. Messages can be sent and received but the sender is unaware of who receives the message and the receiver doesn’t know where the message comes from. This allows for strong decoupling between different parts, which is exactly what we want.
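The decoupling a message bus provides can be shown in a few lines. This is a toy in-process publish/subscribe sketch, not a real messaging product: publishers only name a topic, and neither side knows who is on the other end:

```python
from collections import defaultdict
from typing import Any, Callable


class MessageBus:
    """Minimal publish/subscribe bus: senders and receivers share only topic names."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # The sender is unaware of who receives the message, and the
        # receivers never learn where it came from.
        for handler in self._subscribers[topic]:
            handler(message)


bus = MessageBus()
received = []
bus.subscribe("order.created", lambda msg: received.append(msg))
bus.publish("order.created", {"id": 1})
print(received)  # prints [{'id': 1}]
```

Adding a second subscriber to `order.created` requires no change to the publisher, which is the strong decoupling the paragraph above describes.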
However, there’s still a need to orchestrate all these components, modules or blocks in such a way that the desired system behavior is accomplished. The challenge often is less concerned with describing the main, standard path in an orchestrated process, but rather to capture all the relevant process deviations, the interactions between different processes and the edge cases that can easily undo the benefits of process automation if human intervention is required frequently.
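One way to keep deviations from requiring human intervention at every turn is to build the exception path into the orchestrator itself. The sketch below is a simplified illustration of that idea, with hypothetical step names; real orchestration engines offer far richer compensation and retry logic:

```python
from typing import Any, Callable

Step = tuple[str, Callable[[dict], dict]]


def orchestrate(steps: list[Step], payload: dict,
                on_deviation: Callable[[str, dict, Exception], dict]) -> dict:
    """Run steps in order; route failures to a deviation handler
    instead of aborting the whole process."""
    for name, step in steps:
        try:
            payload = step(payload)
        except Exception as exc:
            # A deviation: compensate, retry or escalate rather than crash.
            payload = on_deviation(name, payload, exc)
    return payload


def flaky_charge(data: dict) -> dict:
    raise RuntimeError("payment declined")


def handle_deviation(step: str, payload: dict, exc: Exception) -> dict:
    # Fall back to a manual-review queue instead of failing the process.
    return {**payload, "status": f"manual-review:{step}"}


result = orchestrate(
    [("validate", lambda d: {**d, "valid": True}), ("charge", flaky_charge)],
    {"order": 7},
    handle_deviation,
)
print(result["status"])  # prints manual-review:charge
```

The point of the sketch is that the happy path is the easy part; most of the orchestration code ends up in `handle_deviation` and its equivalents, which is where the automation benefits are won or lost.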
Even though automation has provided amazing benefits to humankind, there’s a flip side few talk about: once automated, changing a process can easily become prohibitively expensive. The best way to avoid painting yourself into a corner is to modularize the processes you automate and facilitate the easy combination and recombination of the modules or blocks for process orchestration. Finding the right modularization and orchestration is often case specific, but failing to get this right may easily cause significant process debt that will be expensive to resolve. As W. Edwards Deming said, “If you can’t describe what you’re doing as a process, you don’t know what you’re doing.”