Microsoft Hyperautomation Demo - Use Case 1 of 4: Discovery

MONA MORALES: A large insurance company with a strong center of excellence is using Power Platform and the provided COE toolkit to start a company-wide initiative to improve business processes and reach five million dollars in total cost savings annually.

Employees within the company receive an e-mail, sent through the COE toolkit, with a link to the innovation backlog, where they can add ideas about inefficiencies within their business processes to help the COE team with discovery. Employees can add an idea to the backlog, including the people that are impacted, the tools that are being used, the expected ROI improvement, existing workflows of the process, and the complexity of the process based on various factors. For measuring ROI, the innovation backlog provides preset entry forms for estimating dollar impact based on time spent, the total number of people, the average wage, and an estimate of time spent per week.
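The demo doesn't show the exact formula behind those entry forms, but the inputs they collect suggest a simple annualized estimate. A minimal sketch of that kind of calculation, assuming dollar impact is just headcount times hours spent per week times hourly wage, scaled to a year (the function name and figures below are hypothetical):

```python
def estimated_annual_impact(people: int, avg_hourly_wage: float,
                            hours_spent_per_week: float,
                            weeks_per_year: int = 52) -> float:
    """Rough annualized dollar impact of removing a manual process step.

    Assumption: impact = people * hours spent per week * wage * weeks.
    The real innovation backlog form may weight or combine inputs differently.
    """
    return people * hours_spent_per_week * avg_hourly_wage * weeks_per_year


# Hypothetical example: 40 claim handlers each spending 3 hours/week at $35/hour
print(f"${estimated_annual_impact(40, 35.0, 3.0):,.0f} per year")  # -> $218,400 per year
```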

Once an idea is submitted, it is ranked by ROI and complexity amongst the other submitted ideas, giving the COE team not only a great starting point for which business processes to go after, but also some understanding of the ROI as well as the complexity.

Once we understand the improvements we want to make, we can start ingesting data and analyzing the process. With the innovation backlog, I can navigate to Power Automate Process Mining right in the app. We know that many of the top ideas submitted are tied to the overall claims-to-settlement process, and the internal claims management system is where we can find most of the relevant activities. Activity data from that system is being loaded into the company's Azure Data Lake instance as part of their ETL jobs.

With Power Automate Process Mining, we can easily connect to this data. In the data lake connector, we can see that the activity log files are updated each day and that each file corresponds to the activities of that day. Using the data lake feature, we can connect to all of the existing and future activities by connecting directly to the folder. Here we see that the process mining Copilot automatically analyzes the folder and the data structure within it to provide us with insights into the data, which is great when you have lots of data in your lake and want to ensure that you've selected the right dataset for analysis. Not only can it tell me the process and the relevant activities, but I can also ask additional questions about the data to understand the columns, the data types of key columns, and whether there are any empty values, to assist in checking data quality.
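The connector and Copilot handle this inside the product, but the same folder-level check is easy to picture outside it. A minimal pandas sketch, assuming a hypothetical claims_activities folder with one CSV per day and illustrative columns ClaimID, Activity, and Timestamp:

```python
from pathlib import Path

import pandas as pd

# Hypothetical layout: the ETL job drops one file per day into the folder,
# e.g. claims_activities/2024-05-01.csv with ClaimID, Activity, Timestamp columns.
folder = Path("claims_activities")

frames = [pd.read_csv(path, parse_dates=["Timestamp"])
          for path in sorted(folder.glob("*.csv"))]
events = pd.concat(frames, ignore_index=True)

# The same data-quality questions asked of Copilot in the demo:
print(events.dtypes)                # column names and data types of key columns
print(events.isna().sum())          # empty values per column
print(events["Activity"].unique())  # which activities appear in the data
```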

Once we have checked the dataset, I can proceed to the mapping step, where I can either manually map the data to the required fields for process mining or use the mapping suggestions from Copilot based on the discovered process. Copilot really simplifies the typical data validation and mapping steps of process analysis.
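The mapping step boils down to aligning source columns with the minimal event-log schema process mining typically expects: a case ID, an activity name, and a start timestamp. A small sketch of that mapping plus a required-field check, continuing the hypothetical events frame from the previous sketch:

```python
# Hypothetical source column -> required process mining field.
REQUIRED = {"ClaimID": "Case ID", "Activity": "Activity", "Timestamp": "Start Timestamp"}

# Rename onto the minimal schema and verify none of the required fields are empty.
event_log = (events.rename(columns=REQUIRED)
                   .sort_values(["Case ID", "Start Timestamp"]))
print(event_log[list(REQUIRED.values())].isna().sum())  # should be all zeros
```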

Once the process is analyzed, we can share it to enable not only the analysts and solution architects from the COE team to do the analysis, but also the relevant business users. I can also set up scheduled and incremental refresh so that only newly added data is analyzed, which improves the speed and efficiency of the refresh.

A process owner can now navigate to the dashboard, automatically created using Power BI and embedded in the process web view, which provides an overview of the process as well as an interactive process map. One very useful thing I can do here to get an idea of the overall end-to-end activities of the process is to look at the top variants. This allows me to verify not only the count of activities, but also the flow of activities. Looking at the flow of the most frequent variant, I can see that we're still missing the settlement and payment activities at the end of the process. These represent another linked sub-process for claim settlement, and it happens to be in the SAP system, not the claims management system.

While I could use a number of out-of-the-box templates that connect to SAP for specific financial processes like procure-to-pay or accounts payable, I actually want to connect the data from my SAP system to my claims processing system. For other solutions, this would mean providing updated requirements to data engineers so they can create new ETL pipelines for the activities I want. In Power Automate Process Mining, we can leverage Power Query and dataflows to connect data from multiple sources, including SAP ERP, in an integrated experience. Using SAP ERP as my connector, I can provide the relevant connection details for my SAP instance and filter for the function I want to use, which is the RFC_READ_TABLE function. I can use the built-in function parameters to get the tables I want, including CDHDR and CDPOS, which are activity tables in SAP, and I'll also use the diagram feature in dataflows to visualize and merge all the tables together to build the activity log I want from SAP without ever having to write a single line of code.

Once I've created my SAP dataflow to extract the settlement and payment activities, I can use dataflows to merge the output of that Power Query with my existing claims data from the data lake, and even add department and resource data from my HR system to calculate costs, which will come in handy later. I can do all of this with an intuitive UI and create a process log that represents the complete end-to-end claims-to-settlement process. No other solution offers this level of flexibility.
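Conceptually, what the dataflow diagram builds here is a join of the SAP change-document header and item tables into an event log, which is then appended to the claims events. A rough pandas sketch of that shape, assuming the CDHDR and CDPOS extracts have already landed as files; the join keys follow the standard SAP change-document fields, but the file names, activity mapping, and claim-number assumption are illustrative:

```python
import pandas as pd

# Hypothetical extracts of the SAP change-document tables (e.g. pulled via
# RFC_READ_TABLE): cdhdr holds the change header (who/when), cdpos the changed fields.
cdhdr = pd.read_csv("cdhdr.csv", dtype=str)
cdpos = pd.read_csv("cdpos.csv", dtype=str)

# Join header and items on the change-document key.
changes = cdpos.merge(cdhdr, on=["OBJECTCLAS", "OBJECTID", "CHANGENR"], how="left")

# Shape the result into event-log form; assuming OBJECTID carries the claim
# number, and the mapping of changed fields to activity names is illustrative.
sap_events = pd.DataFrame({
    "Case ID": changes["OBJECTID"],
    "Activity": changes["FNAME"].map({"SETTL": "Settlement", "PAYMT": "Payment"}),
    "Start Timestamp": pd.to_datetime(changes["UDATE"] + changes["UTIME"],
                                      format="%Y%m%d%H%M%S"),
})

# Append to the claims event_log mapped in the earlier sketch to get the full
# end-to-end claims-to-settlement log.
full_log = (pd.concat([event_log, sap_events.dropna(subset=["Activity"])],
                      ignore_index=True)
              .sort_values(["Case ID", "Start Timestamp"]))
```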

Once I've analyzed the combined process log, I can go back to my dashboard to verify that I'm now getting the settlement and payment activities. The process map view is dynamic and changes as I interact with the dashboard. I can select a few more variants to see how the process varies and where deviations occur. The dashboard comes with a lot of handy features to give you a good overview of the process, including case duration and loop and rework, which are great for understanding process efficiency, along with basic information such as how many variants, cases, and activities there are. All of these metrics are dynamic and update based on the selections I've made. I also have access to other views, including a larger process map view that gives me more real estate to navigate the process map visual, as well as a variant view that gives me an efficient way to understand the activities in each variant. Back in the summary view, I can bring up the filter pane to filter for specific activities and also change the process map to show the performance or rework displays. Through the dashboard, I have a good overview of the process.
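The headline numbers on the summary dashboard (cases, activities, variants, case duration, rework) can all be expressed in a few lines over an event log. A minimal sketch against the hypothetical full_log frame built in the earlier SAP sketch:

```python
# Order events per case and derive each case's variant (its sequence of activities).
full_log = full_log.sort_values(["Case ID", "Start Timestamp"])
variants = full_log.groupby("Case ID")["Activity"].apply(tuple)

# Headline counts shown on the summary dashboard.
print("cases:     ", full_log["Case ID"].nunique())
print("activities:", full_log["Activity"].nunique())
print("variants:  ", variants.nunique())
print("top variants:\n", variants.value_counts().head(5))

# Case duration: time from first to last event per case.
span = full_log.groupby("Case ID")["Start Timestamp"].agg(["min", "max"])
print("mean case duration:", (span["max"] - span["min"]).mean())

# Rework: an activity occurring more than once within the same case.
rework = (full_log.groupby(["Case ID", "Activity"]).size()
          .gt(1).groupby(level="Activity").sum().sort_values(ascending=False))
print("cases with rework per activity:\n", rework.head(5))
```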

And while we have more advanced analytical features, let's first see how Copilot can help us find insights and provide recommendations for process improvement through natural language. Even as a process mining novice, I can quickly get the top insights from Copilot in a conversational manner. It tells me a few important things that it noticed from the process data, including the fact that assign handler is the longest-running activity, and that the assign handler and coverage check activities have the most rework. Let's dig deeper into why.

Copilot is not only useful for providing information and insights on the process; it also provides teaching moments where I can understand concepts such as identifying bottlenecks without having to leave the experience and go to documentation or a reference. We can ask Copilot to try and identify the bottleneck, and it tells us about variant 12, which is the longest-running variant in the process. Let's take a look at this variant. Wow, it looks like in some cases, after the handler is assigned, nothing happens for a long time. The process seems to reset and go back to the coverage check step after a day or two. Something is breaking down at the assign handler step that is causing rework as well as prolonging the duration of the overall process. This is a great insight that we were able to uncover. Even for an expert, Copilot can quickly surface these insights without a lot of manual analysis.
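What Copilot surfaced here, a long idle period after the handler is assigned, corresponds to a simple waiting-time calculation between consecutive activities in each case. A rough sketch of that analysis over the same hypothetical full_log; the transition names in the comment are illustrative:

```python
# Waiting time between consecutive activities within each case.
log = full_log.sort_values(["Case ID", "Start Timestamp"]).copy()
log["Next Activity"] = log.groupby("Case ID")["Activity"].shift(-1)
log["Wait"] = (log.groupby("Case ID")["Start Timestamp"].shift(-1)
               - log["Start Timestamp"])

# Average wait per hand-off; a transition such as "Assign Handler -> Coverage
# Check" with a wait of a day or two is the kind of bottleneck the demo describes.
handoffs = (log.dropna(subset=["Next Activity"])
               .groupby(["Activity", "Next Activity"])["Wait"]
               .mean()
               .sort_values(ascending=False))
print(handoffs.head(10))
```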

Not only can Copilot help me uncover insights and identify bottlenecks, it can also provide suggestions that help address them. Asking Copilot to address my bottleneck results in two recommendations. One is a Power Automate flow, which automates activities like the policy and coverage checks. The second is a companion Power App that allows customer service reps to assign handlers more efficiently based on their caseload and relevant schedule information, like upcoming vacations. This management app can act as the primary interface for managing claims assignment. This is a great recommendation and really addresses the bottleneck that was uncovered before, but we can take it a step further. Clicking on the suggestion takes us to the Copilot experience in Power Apps, where we can use Copilot capabilities to generate the scaffolding of an app based on the suggestions from process mining. The app will contain information about the availability and the current caseload of each handler. Not only did Copilot create an app interface for me to use as a starting point, in the background it also created an actual table in Dataverse just based on the natural language we provided it. Now all we need to do is connect the table to the underlying system. Copilot really simplifies the entire chain from process discovery to deriving...

Transcribed from: https://www.youtube.com/watch?v=M1DRGukmiRw