On October 19, I gave an AIIM webinar presentation, “5 Ways Process Mining Makes Your Company Run Better.” Below you’ll find the questions and answers from the webinar, both the questions I had time to answer during the session and those I didn’t.
You can watch the replay of the webinar below. I covered the basics of process mining and shared information Doculabs has learned in our client engagements. Here’s what I covered (from the webinar description):
“Execution gaps in business processes increase costs and can contribute to employee burnout and poor customer service. Process mining tools extract data from your existing systems’ event logs. This allows you to see what employees and connected systems are actually doing and to see the real process, not the one created on a whiteboard in a conference room a decade ago.
With process mining, you can quickly identify the execution gaps, bottlenecks, and rework that continually inhibit efficiency, service levels, and profits.
In this webinar, Doculabs’ Executive Vice President, Marty Pavlik, will quickly explain process mining and where it fits in your business’ IT stack before explaining how it can be used to make operations more efficient. He’ll illustrate the power of process mining by discussing procurement processes.”
Here’s what the attendees wanted to know.
Q1: Tools don’t always get used the right way, or things don't get set up right. What are some mistakes or things for people to look out for – the less-than-positive things people could try to avoid as they go to use a process mining tool?
Pavlik: I think the first mistake we hear a lot is “we don't have the data.” In the last few years, the notion of data lakes, business intelligence tools, and machine learning/artificial intelligence created this – I would almost call it a myth – that you need a lot of data to get data insights.
With process mining you don't need that much data. As I mentioned earlier, if you just want a traditional process map, you really need just those three data elements: a case ID, an activity, and a timestamp. If you want to start doing analytics, you often have a lot of that data in your ERP system or whatever process and workflow systems you have.
Let's start there. Let's not try to boil the ocean. Let's start with a small data set. We've had one client where we couldn't build out the process directly – we even had to interpret what their activity codes were because they didn't have them within the system. We did that just based on some simple information.
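To make the “three data elements” point concrete, here's a minimal sketch – my illustration, not something shown in the webinar – of a tiny event log with just a case ID, an activity, and a timestamp, and the open-source pm4py library discovering a process map from it. The purchase-order cases, column names, and activity labels are hypothetical.

```python
# Minimal, hypothetical event log: just a case ID, an activity, and a timestamp.
import pandas as pd
import pm4py  # open-source process mining library, used here purely for illustration

events = pd.DataFrame([
    {"case_id": "PO-1001", "activity": "Create PO",     "timestamp": "2022-10-01 09:00"},
    {"case_id": "PO-1001", "activity": "Approve PO",    "timestamp": "2022-10-03 14:30"},
    {"case_id": "PO-1001", "activity": "Receive Goods", "timestamp": "2022-10-07 11:15"},
    {"case_id": "PO-1002", "activity": "Create PO",     "timestamp": "2022-10-02 10:00"},
    {"case_id": "PO-1002", "activity": "Approve PO",    "timestamp": "2022-10-02 16:45"},
])
events["timestamp"] = pd.to_datetime(events["timestamp"])

# Tell pm4py which column plays which role, then discover the directly-follows process map
log = pm4py.format_dataframe(events, case_id="case_id",
                             activity_key="activity", timestamp_key="timestamp")
dfg, start_acts, end_acts = pm4py.discover_dfg(log)
print(dfg)  # e.g. {('Create PO', 'Approve PO'): 2, ('Approve PO', 'Receive Goods'): 1}
```

Everything beyond those three columns – amounts, users, spend categories – is optional enrichment for the analytics layer.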
The second mistake I see clients make is they look at it as only a tool to identify ways to automate a process. This tool gets mentioned a lot alongside RPA. Although it is good for pinpointing automation opportunities, it's an even better tool for improving how you execute your business processes. More importantly, you can monitor those improvements to make sure you're making the business impact you want.
Q2: Does process mining work in just one repository, one system? How does it work in connecting the multiple systems for doing this type of analysis?
Pavlik: If you think about it, it's an API. So you can connect as many systems as you want in order for that data to be connected. For the most part it's really just a feed, like any other type of software-as-a-service.
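As a hedged sketch of the “it's just a feed” idea, here is roughly what pushing event records from a source system into a process mining platform's ingestion endpoint could look like. The URL, token, and payload shape below are placeholders, not any specific vendor's API.

```python
# Hypothetical feed: POST a batch of events to a mining platform's ingestion endpoint.
import requests

events = [
    {"case_id": "PO-1001", "activity": "Create PO",  "timestamp": "2022-10-01T09:00:00Z"},
    {"case_id": "PO-1001", "activity": "Approve PO", "timestamp": "2022-10-03T14:30:00Z"},
]

resp = requests.post(
    "https://process-mining.example.com/api/event-log",  # placeholder endpoint
    headers={"Authorization": "Bearer <api-token>"},      # placeholder credential
    json={"source": "erp", "events": events},
)
resp.raise_for_status()  # each connected system can push its own feed the same way
```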
Q3: Do you see challenges in accessing, transforming, and normalizing event log data from multiple systems of record for that discovery analysis and enhancement you do?
Pavlik: I'll give you an example. One client wanted to bring their OCR data logs into the system. They wanted to determine the long-term impact of the OCR engine identifying data incorrectly. When we did the initial process map, we were not only able to identify which transactions were coming through the OCR engine but, more importantly, just from the existing data we determined which transactions were causing the rework as well as which customers were causing it.
Then we were able to go back and identify what changes needed to be made in their OCR system – particularly which data elements needed better identification from the images. We made those recommendations without bringing in that OCR data, because it was just too much data – more effort than the payback was worth. We were able to reach that analytical conclusion without actually bringing in the data. It's really about making sure you're not bringing in data just to bring in data. You need to bring in data because it's going to add value, and in that particular case we didn't need it.
Q4: If it's accessing the event logs, how would a process mining tool pick up people's actions? Someone is asking specifically about things like phone calls or tasks that don't occur in a system. I'm also thinking about cases where it's not a case management system or another type of system designed to capture those transactions. Thoughts about how to capture or connect that?
Pavlik: You’ll see it happening offline – whether it's a reconciliation file or just collaboration with other finance departments. What we always say is: first use traditional process mining to identify which part of the transaction is taking the longest.
So if you know Step A to Step B is taking three days, you have two ways to identify what's happening between those steps. You can do it the old-fashioned way, through interviews, to understand what's going on and the steps people are taking. Or there is another technology a lot of process mining tools have, called task mining.
Task mining records somebody on their desktop to understand what they're doing and what systems they're going into. Are they pasting things from Word into Excel? Are they going into another system? We always recommend starting with the traditional interview. If you need more detail after that, then you can go with the more modern task mining approach.
For example, we did some work in the financial crimes department of a bank. They needed to do investigations across websites, investigative systems, credit reports . . . things of that nature. In that particular instance, interviewing really wouldn't have given us the data we needed to understand the process and determine how we could most effectively impact it.
Q5: How do people talk about this to their bosses to get executive buy-in?
Pavlik: The first thing is to talk about the cost and complexity. People look at new technologies and assume things. Take RPA – robotics is going to be so expensive, right? But that's not always true. Process mining is not a big technical lift. You don't need to bring in a lot of data, so a proof of concept really takes a week or two to show that we can actually build out a process map.
Secondly, when you're conveying to executives why this is so important, you really have to have that business case. One of the great things about process mining is that when we do business cases, we ultimately know the output we're going to get from the data.
So, for example, earlier I mentioned maverick spend. Just from understanding the volume of transactions this client had – their spend categories – we were able to put together an estimate of what that business case was. What's great is that the analytics from process mining then let us prove it out. In fact, we actually found more value than the initial business case projected. It was great. We went back to the original business case and said, “Look, this tool works so well that not only did we prove we could capture this value, we found more.”
Q6: Which vendors have provided process mining tools?
Pavlik: It’s a crowded marketplace, but here are the leaders: Celonis, ABBYY, UiPath, IBM.
Q7: How does process mining pick up the "people" actions like phone calls or tasks that do not occur in a system?
Pavlik: There are two ways to understand/monitor a process:
Process mining – uses time-stamped event data tied to a case identifier to build out a process map
Task mining – records a person’s activity on their desktop
There is also a third way – voice recognition – typically done in a call center environment.
Q8: What challenges do you see in accessing, transforming, and normalizing event log data from multiple systems of record for process discovery, analysis and enhancement? Do you need an ETL to load event log data to the process mining platform?
Pavlik: The biggest challenge is that you need to find a unique identifier to link the systems together in order to get a full end-to-end process view. An ETL tool is not absolutely necessary to load event log data into the process mining platform; you can use a CSV file.
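Here's a minimal sketch of that linking step, assuming two CSV exports that share a purchase-order number as the unique identifier (file and column names are hypothetical). There's no ETL tool involved – the extracts are just stacked into one end-to-end event log the mining platform can ingest.

```python
# Combine CSV extracts from two systems of record into one event log, keyed on a shared ID.
import pandas as pd

erp = pd.read_csv("erp_events.csv")            # columns: po_number, activity, timestamp
workflow = pd.read_csv("workflow_events.csv")  # columns: ticket_id, po_number, activity, timestamp

# Normalize both extracts to the same three columns, using the PO number as the case ID
erp_events = erp[["po_number", "activity", "timestamp"]]
wf_events = workflow[["po_number", "activity", "timestamp"]]

combined = (pd.concat([erp_events, wf_events], ignore_index=True)
              .assign(timestamp=lambda d: pd.to_datetime(d["timestamp"]))
              .sort_values(["po_number", "timestamp"]))

combined.to_csv("combined_event_log.csv", index=False)  # ready to upload to the mining tool
```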
See Q3 above for more about event log data.
Q9: Can you advise as to who is the target buyer for process mining? Center of Excellence? Line of Business? Lean Six Sigma?
Pavlik: All three – though we find the majority come from an automation or process group, or from innovation leaders.
What questions do you have?
Email me at mpavlik@doculabs.com or tell us how we can help you with your process problems by clicking on the "Contact us" button on this page.
All of us at Doculabs look forward to answering any questions you have.
Photo credit: Photo by Camylla Battani on Unsplash