In my experience, I’ve noticed that when we develop new projects and processes, the primary aim is to get the task completed rather than to complete it ‘efficiently’. Quite often, efficiency is a secondary aim, to be looked at only once the task is complete. In a significant number of cases, this secondary aim is never taken seriously: as more and more projects come in, employees as well as managers find more value in tackling new processes than in tweaking existing ones that work, albeit not so efficiently.
Over time, these inefficiencies build up across the board in a firm, and this is where my work in the Process Improvement team (also known as the Risk Innovation team) comes in. Our team’s main agenda is to understand these processes and find solutions, conventional or otherwise, to weed out their inefficiencies. On one hand, we need significant business insight; on the other, we need to be aware of the various technologies that can be leveraged to help the business.
I’ve been working for Morgan Stanley, one of America's largest investment banks, for over 2 years now: as an intern for about 15 months in Market Risk and subsequently as an Analyst in the Risk Innovation team. During this period, I’ve been involved in improving 3 major processes and a few other minor projects, mainly for teams involved in managing Market Risk. Looking back at the projects I have delivered, I can recall certain themes or patterns which helped me weed out inefficiencies and make those processes a little more efficient than they were before.
In this post, I’m going to mention a few of those major themes. This will be helpful to anyone who is in charge of running or managing any sort of process or project. Looking at your daily tasks through this lens will help you to identify areas of inefficiency.
I. “Slowest step is the rate determining step”
This was a phrase my high school chemistry teacher would repeat as he taught us how to calculate the rates of chemical reactions. Little did I know that it would be applicable in so many areas of life. Consider a process made up of several sub-processes, where one particular component takes a few hours (or even days) whereas all the other components can be done in relatively little time.
Figure: Optimizing sub-process B can add significantly to the overall process.
Finding these components of a process is critical to weeding out inefficiencies, as these components are the rate-determining step. Even a small improvement on the slowest step can make a larger impact on the overall process than a significant improvement on another step that doesn’t take up much time.
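This bottleneck arithmetic is easy to check with a quick sketch. The sub-process names and timings below are entirely made up for illustration; the point is only that a modest speed-up on the slowest step beats a large speed-up on a fast one:

```python
# Hypothetical timings (in hours) for the sub-processes of one run.
durations = {"A": 0.5, "B": 6.0, "C": 0.75, "D": 0.25}  # B is the bottleneck

total = sum(durations.values())  # 7.5 hours per run

# A 25% speed-up on the slowest step (B)...
improved_b = dict(durations, B=durations["B"] * 0.75)
# ...versus a 50% speed-up on an already-fast step (C).
improved_c = dict(durations, C=durations["C"] * 0.5)

saving_b = total - sum(improved_b.values())  # 1.5 hours saved per run
saving_c = total - sum(improved_c.values())  # 0.375 hours saved per run
```

With these numbers, the smaller relative improvement on B saves four times as much wall-clock time as the larger improvement on C.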
II. Finding loops of repetition
Certain projects involve intermediate components that are repeated in some way or the other before the final results are achieved. Consider a project in which you may have to repeat an earlier step depending on the results that come out at a later stage. It is also possible that the rules/conditions for repetition are not very concrete or well defined.
Figure: A process involving significant repetition of tasks based on certain rules (that may or may not be well defined).
It is important to look for these parts and figure out ways to improve them. Improving loops usually involves choosing between two strategies:
- The elimination strategy – Can we perform the intermediate steps in a way that removes the repetition altogether? In other words, is it possible to perform the steps so efficiently that the need for repetition disappears?
- The automation strategy – If elimination is not possible, can we automate the execution of the steps within a loop? This involves coming up with a set of conditions for repetition that can subsequently be converted into code and automated. As a result, the loop can run in the back-end in an automated fashion, and the process owner only has to perform the steps that come after the repetitive part.
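The automation strategy boils down to writing the informal "repeat if…" rule as an explicit check. A minimal sketch, with all function names, values, and the stopping rule invented purely for illustration:

```python
# Sketch of the automation strategy: the once-informal repetition rule
# is codified so the loop can run unattended. All names are illustrative.

def run_step(state):
    # Placeholder for the intermediate step that used to be rerun manually;
    # here it just halves a value and counts the attempt.
    return {"value": state["value"] * 0.5, "attempts": state["attempts"] + 1}

def needs_rerun(state, tolerance=1.0):
    # The repetition condition, written down as an explicit check.
    return state["value"] > tolerance

def automated_loop(state, max_attempts=10):
    # Run the step, then keep rerunning while the codified rule says so.
    state = run_step(state)
    while needs_rerun(state) and state["attempts"] < max_attempts:
        state = run_step(state)
    return state

final = automated_loop({"value": 8.0, "attempts": 0})
# value halves each pass: 8 -> 4 -> 2 -> 1, stopping once within tolerance
```

The hard part in practice is rarely the loop itself but pinning down `needs_rerun`: turning a process owner's intuition into a concrete, testable condition.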
Weeding out these loops can save significant time and effort. It also increases the quality of the work of an employee. Rather than indulging in a manual and possibly mind-numbing task, the employee spends more time on tasks that require their financial skills and expertise.
III. Look for the Pareto principle (also known as the 80/20 rule)
There are several processes that are simply too complex to automate in their entirety. These are processes where the possible paths to the result are exponentially many, and progress relies on trial and error guided by human expertise and intuition rather than a straightforward approach, with several intermediate steps where human intervention is needed.
However, even in these processes, significant effects can be achieved based on the Pareto principle:
“For many events, roughly 80% of the effects come from 20% of the causes”
In principle, this means that if a process can be mapped out as a complex tree with multiple branches, only a few branches (20%) will be explored roughly 80% of the time. If there are 10 possible tasks that a process owner may be required to do based on the results of a previous task, chances are that the process owner will have to execute only 2 of them on roughly 8 of the 10 occasions that the process is run.
Whenever we’re presented with more complex projects, finding the few paths that dominate the process is critical to improving it. Targeting these areas and speeding up these parts can have a large impact on the whole process. By the Pareto principle, halving the execution time of the commonly executed paths can still lead to an overall improvement of nearly 40% across the whole process in the long term.
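The "nearly 40%" figure follows from simple arithmetic, under the assumption that the common paths account for about 80% of total runtime:

```python
# Back-of-the-envelope check of the ~40% claim. The 80/20 split is the
# Pareto assumption; the numbers are illustrative, not measured.
common_share, rare_share = 0.8, 0.2  # fraction of total runtime
speedup_on_common = 0.5              # common paths now run in half the time

new_runtime = common_share * speedup_on_common + rare_share  # 0.6 of original
overall_improvement = 1 - new_runtime                        # ~0.4, i.e. ~40%
```

If the common paths make up less of the total runtime than 80%, the overall gain shrinks proportionally, which is why measuring where the time actually goes should precede any optimization.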