Business process optimization is key to digital transformation. Don't get caught up in the latest trends without considering your specific needs. Instead, take the time to understand your processes first, then choose the best approach (automate, outsource, etc.) for each issue. Consider scaling challenges and avoid unintended consequences. Embrace new data-driven methods for process improvement.
Humble business processes are having a renaissance. Today, business processes are at the epicenter of digital transformation, with concepts like process mining, RPA (robotic process automation), intelligent automation, process intelligence and low-code/no-code dominating the technology landscape. But for enterprise decision-makers, this cacophony of new concepts and the drumbeat of vendor pitches can be deafening.
The desire to optimize business processes is more than a century old. Here is a brief walk down memory lane:
The early decades of the 20th century saw a wave of interest in dissecting business processes as part of the scientific management movement, particularly within manufacturing. Frederick Taylor's time studies and Frank and Lillian Gilbreth's motion studies (later combined into the time-and-motion study) were among the first known endeavors that aimed to understand, monitor and measure the efficiency of a work process. Decades later, in the 1980s, the U.S. Naval Air Systems Command popularized the term TQM (total quality management), a Japanese-style approach to quality improvement in which the process is an integral element.
Of course, not all of these concepts were theoretical. One concrete example of process optimization is the assembly line, which revolutionized car making and led Henry Ford to the pinnacle of success.
The wave of interest in business processes continued into the 1980s and '90s, producing quantitative approaches like Lean, Six Sigma and value-stream mapping. And who can forget Hammer and Champy's re-engineering revolution? While the polarizing nature of Hammer and Champy's work garnered a lot of attention, at around the same time the Japanese concept of Kaizen, a continuous-improvement method, was flourishing in Japan and elsewhere.
A more systemic, technology-based approach to business process design, execution and orchestration gained momentum in the 2000s, when the strain of manual processes gave rise to RPA (robotic process automation). While the concept of digital workers (bots) doing all the grunt work seemed rather appealing, the fragility of UI-based automation has led to a shift toward intelligent automation, or hyperautomation, where natural language processing and machine learning make the endeavors a bit more robust.
Around the same time, a Dutch professor, Wil van der Aalst, introduced the concept of process mining at the Eindhoven University of Technology: a method for analyzing event logs to derive a data-driven model of a process as it is actually executed. The concepts of process mining and, later on, task mining (or process discovery) have become the foundation for understanding work as it happens.
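For readers curious what "analyzing event logs" looks like in practice, here is a minimal sketch of the core idea, not any vendor's implementation: counting which activities directly follow which in an event log begins to reveal the real flow of work, including its variants. The log, case IDs and activity names below are invented for illustration; real process mining algorithms (such as van der Aalst's alpha miner) go considerably further.

```python
from collections import Counter, defaultdict

# Invented event log for illustration: one (case ID, activity) row per
# recorded step, in the order each step occurred.
event_log = [
    ("order-1", "Receive order"), ("order-1", "Check credit"), ("order-1", "Ship"),
    ("order-2", "Receive order"), ("order-2", "Check credit"),
    ("order-2", "Request documents"), ("order-2", "Check credit"), ("order-2", "Ship"),
]

# Group the log into one ordered trace of activities per case.
traces = defaultdict(list)
for case_id, activity in event_log:
    traces[case_id].append(activity)

# Count "directly-follows" pairs: activity A immediately followed by B.
follows = Counter()
for trace in traces.values():
    follows.update(zip(trace, trace[1:]))

# The counts sketch a data-driven process model, variants included.
for (a, b), n in follows.most_common():
    print(f"{a} -> {b}: {n}")
```

Even this toy version surfaces the rework loop in order-2 (credit checked twice), which is exactly the kind of hidden variant that makes mined models more honest than the process diagram on the wall.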
A more recent entry into the process space is the advent of low-code and no-code platforms, which let citizen developers drag and drop their way to automating and optimizing the last mile and bridging the gaps between monolithic applications.
What does all this mean for executives of large firms?
Processes Are Forever, But Methods Come And Go
Every decade or so, a methodology or technology is heralded as the next best thing since sliced bread. Instead of chasing every shiny object, enterprise leaders must consider the shiny new object in context and embrace or ignore it based on their organization's reality. It is OK to wait until a concept or technology crosses the chasm and becomes more robust; equally, it can make sense to be an early adopter or pioneer when the potential payoff justifies the risk.
Insights Before Intervention
Large corporations tend to pick a tool, technology or methodology to solve a problem before understanding the core of the problem and its associated complexity.
A surgeon does not operate without some imaging — X-ray, CT scan, etc. — that provides context and clarity surrounding the target area, and even a car mechanic doesn't rip open an engine without running diagnostics. Similarly, corporate leaders should develop a multi-dimensional, preferably data-driven, understanding of an underlying process before pushing ahead with an intervention.
The maxim of "understand before you act" should drive your decision-making process.
Different Strokes For Different Folks
Not all companies are alike, and no two processes are identical. Processes vary across sectors and even within a single company, and process variations bust the myth of a seemingly proven — or "golden" — process.
You have a range of choices when it comes to approaching a problematic process: from leaving it alone to eliminating it, from outsourcing to automation, from capacity calibration to re-engineering. The right approach depends on a variety of factors. One size does not fit all, and you need to approach different problems with different solutions.
Beware Of The Whack-A-Mole Effect
Considering the numerous silos prevalent in legacy corporations, it is essential to understand the dependencies and unintended consequences of any given process intervention. By prioritizing quality over speed in a process, are you creating a bottleneck and downstream delays? By outsourcing a process, are you spawning an entirely new set of functions — quality monitoring, coordination and constant triage — that leave stakeholders dissatisfied?
Before you jump in, map out any relationships or dependencies at play, as well as upstream and downstream effects.
Many Experiments Do Not Scale
What works in a small proof of concept (POC) or a pilot program with a carefully constrained environment often does not scale in large enterprises. Many executives have learned that bots that worked fabulously on a single desktop or within a small department turned brittle and broke down under the volume, variability and velocity of enterprise processes.
Hence, volumetrics and non-functional requirements should be a paramount consideration when evaluating potential solutions to vexing business process problems.
We are in an exciting era of evidence-based, data-driven methods and technologies that will help large corporations transform and compete in the digital and cognitive age.
Read the original article on Forbes.