
Monitoring and Alerting for Power Automate Flows

How can Power Automate Flows fit into your Enterprise monitoring and alerting strategy using Azure Monitor?

Power Automate Flows are being touted as the replacement for asynchronous workflows in the Common Data Service / Dynamics 365. As they continue to mature and make more and more sense as our go-to for automation, one question kept coming up: how can they fit into our enterprise monitoring and alerting strategy?

On a number of recent projects we have used Azure Log Analytics workspaces and Azure Monitor to provide monitoring and alerting across all of our applications and services within the enterprise. Visibility into errors and failures of our Flows in Power Automate, however, has so far been limited to the Run history and dashboards in the Power Automate Admin Center, which is rather reactive.

So… I set out on a mission to enable a more proactive approach, with the aim of bringing failed Flow data into our Azure Monitor paradigm.

To make this work I needed two things:

  • A way to programmatically access the Flow Run data (and specifically the failures)
  • A way to programmatically put that data into an Azure Log Analytics workspace

As it turns out, there are Preview features available in both of those areas that make light work of these requirements.

First off is the new PowerShell support for Power Apps (for admins and makers). This gives us a great set of new tools, and in particular the two commands we need: Get-AdminFlow (an admin command that lists all the flows we have access to across all environments in our tenant) and Get-FlowRun (a maker command that returns the run details for a particular flow).
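As a rough sketch of how those two cmdlets combine to surface failed runs (this assumes the Microsoft.PowerApps.Administration.PowerShell and Microsoft.PowerApps.PowerShell modules are installed and you are already signed in via Add-PowerAppsAccount; property names may vary between module versions):

```powershell
# Collect failed runs from the last hour across every flow we can see.
$since = (Get-Date).AddHours(-1).ToUniversalTime()

$failedRuns = foreach ($flow in Get-AdminFlow) {
    Get-FlowRun -FlowName $flow.FlowName |
        Where-Object { $_.Status -eq "Failed" -and [datetime]$_.StartTime -gt $since } |
        Select-Object @{n = "FlowDisplayName"; e = { $flow.DisplayName }},
                      @{n = "Environment";     e = { $flow.EnvironmentName }},
                      FlowRunName, Status, StartTime
}
```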

Second is the preview of Azure Monitor's HTTP Data Collector API, which allows us to push a payload of data into a custom log in an Azure Log Analytics workspace.
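The Data Collector API authenticates each request with an HMAC-SHA256 signature over a canonical string, as described in the preview documentation. A minimal sketch of that call looks something like this (the function name is mine, not part of the API):

```powershell
# Send a JSON payload to a custom log via the HTTP Data Collector API.
function Send-LogAnalyticsData {
    param([string]$CustomerID, [string]$SharedKey, [string]$LogType, [string]$JsonBody)

    $date          = [DateTime]::UtcNow.ToString("r")   # RFC1123 date for x-ms-date
    $contentLength = [Text.Encoding]::UTF8.GetBytes($JsonBody).Length
    $stringToSign  = "POST`n$contentLength`napplication/json`nx-ms-date:$date`n/api/logs"

    $hmac      = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key  = [Convert]::FromBase64String($SharedKey)
    $hash      = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign))
    $signature = [Convert]::ToBase64String($hash)

    Invoke-WebRequest -Method Post `
        -Uri "https://$CustomerID.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" `
        -ContentType "application/json" `
        -Headers @{ "Authorization" = "SharedKey ${CustomerID}:$signature"
                    "Log-Type"      = $LogType
                    "x-ms-date"     = $date } `
        -Body $JsonBody -UseBasicParsing
}
```

The Log-Type header becomes the custom log name in the workspace (with a `_CL` suffix appended automatically).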

All that was required was to stitch these two components together and create a mechanism that could automatically trigger the updates.

I created this PowerShell script, which does all the work. It takes the following input parameters:

  • CustomerID – This is the Workspace ID for your Log Analytics workspace (found under Advanced Settings in the Azure Portal)
  • SharedKey – This is the Primary Key for your Log Analytics workspace
  • LogType – The name you want to use for the Custom Log in the Log Analytics workspace (this will be created for you)
  • Username – The username for your CDS Tenant admin
  • Password – The password for your CDS Tenant admin
  • EnvironmentName – A filter on environment names (leaving it blank iterates all environments the user has access to)
  • HoursSinceLastCheck – Filters the results to the last x hours
Param(
    [string] [Parameter(Mandatory = $false)] $CustomerID = "",  # Replace with your Workspace ID
    [string] [Parameter(Mandatory = $false)] $SharedKey = "", # Replace with your Primary Key
    [string] [Parameter(Mandatory = $false)] $LogType = "FlowFailures", # Specify the name of the record type that you'll be creating
    [string] [Parameter(Mandatory = $false)] $Username = "",
    [string] [Parameter(Mandatory = $false)] $Password = "",
    [string] [Parameter(Mandatory = $false)] $EnvironmentName = "",
    [int32] [Parameter(Mandatory = $false)] $HoursSinceLastCheck = 1
)
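The body of the script then stitches the two pieces together, roughly along these lines (a sketch, not the script verbatim; Send-LogAnalyticsData stands in for whatever helper performs the Data Collector API call):

```powershell
# Sign in with the supplied admin credentials.
$securePassword = ConvertTo-SecureString $Password -AsPlainText -Force
Add-PowerAppsAccount -Username $Username -Password $securePassword

# Gather failed runs within the lookback window, optionally filtered by environment.
$since = (Get-Date).AddHours(-$HoursSinceLastCheck).ToUniversalTime()
$flows = Get-AdminFlow
if ($EnvironmentName) {
    $flows = $flows | Where-Object { $_.EnvironmentName -like "*$EnvironmentName*" }
}

$failures = foreach ($flow in $flows) {
    Get-FlowRun -FlowName $flow.FlowName |
        Where-Object { $_.Status -eq "Failed" -and [datetime]$_.StartTime -gt $since }
}

# Push anything we found into the custom log.
if ($failures) {
    $body = $failures | ConvertTo-Json -Depth 5
    Send-LogAnalyticsData -CustomerID $CustomerID -SharedKey $SharedKey `
                          -LogType $LogType -JsonBody $body
}
```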

I then created an Azure Function App on version 1 of the runtime (which supports Windows PowerShell) to run this script on a 5 minute timer.
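For reference, a 5 minute schedule in an Azure Functions timer trigger uses an NCRONTAB expression in the function's function.json, along these lines:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ],
  "disabled": false
}
```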
