Monitoring the Power Platform: Model Driven Apps – Monitor Tool Part 2: Session Consumption and Analytics
Summary
Monitoring Dynamics 365 or Model Driven Applications is not a new concept. Understanding where services are failing, where users are running into errors, and where forms and business processes could be tuned for performance are key drivers for most if not all businesses, from small companies to enterprises. Luckily, the Dynamics 365 platform provides many tools to help audit and monitor business and operational events.
This article will cover collecting, querying, and analyzing user interface events, specifically from the Monitor tool. We'll look at attaching Monitor output to Azure DevOps work items, and at consuming and storing outputs for visualizations and analytics with Kusto queries. Finally, samples will be provided to parse and load session messages into Azure Application Insights and Azure Log Analytics.
The Monitor Tool
The Monitor Tool allows users and team members to collect messages and work together in debugging sessions. To begin, the Monitor Tool can be launched from the Power Apps Maker Portal. Once launched, the Play Model Driven App button can be pressed to begin a session attached to the tool.
AUTHOR’S NOTE: Click on each image to enlarge for more detail
The Monitor Tool can also be started by adding “&monitor=true” to the URL of your Model Driven Application.
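For example, using a hypothetical organization URL and app ID, the resulting address might look like:

https://yourorg.crm.dynamics.com/main.aspx?appid=00000000-0000-0000-0000-000000000000&monitor=true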
After consenting to start a session, the Monitor Tool will light up rapidly with various messages. Much like a browser's developer tools, each row can be further drilled into for investigation.
An existing deep-dive article on the Monitor Tool provides a thorough walkthrough, including sample scenarios. I highly suggest reviewing it and keeping it close by for reference.
Thoughts on Canvas Apps
The Monitor tool also works with Power Apps Canvas Apps. While this article is focused on Model Driven Apps, remember that these techniques can be utilized to serve Canvas Apps as well.
Consuming Monitor Sessions
Each time the Monitor tool is opened, a new session is created. Within each session are events that describe actions taken within the session, as well as other helpful messages. Storing these sessions allows support teams to better understand errors and issues that arise during testing and production workloads. The previous article in this series covers scenarios that support users can use to better understand the data points delivered in the Monitor Tool.
The Monitor tool can also help analysts who want to learn more about the platform, surfacing user tendencies such as how long users spend on a page and which controls they interact with in Canvas Apps. For testing, the tool can help with non-functional test strategies like A/B testing. Analyzing performance messages can point to potential code coverage gaps or advise on user experience impact. Network calls can be scrutinized to determine if queries can be optimized or web resources minified. The Monitor tool, in my opinion, really can open up a new view on how the platform is consumed and how users interact with it.
Attaching to Azure DevOps Work Items
The Monitor Tool's downloaded artifacts work nicely with Azure DevOps work items. They can be attached to Bugs, Tasks, and even Test Cases when performing exploratory or other types of tests.
When working with test cases within Azure DevOps, analysts craft work items with use cases and expected outcomes to deliver to Makers and Developers. Specifically with Test Cases, Quality Analysts can leverage the Monitor Tool in conjunction with the Azure DevOps Test & Feedback browser extension. This allows for robust test cases complete with steps, screenshots, client information, and the Monitor Tool output attached. Consider the gif below showing an example of using both the browser extension and the Monitor tool output.
In that example, we see an analyst has found a performance issue with a Dynamics 365 form. The analyst logged a new bug that included annotations, screenshots, and the Monitor tool output. A developer can be assigned, pick this up, and begin working on the bug. With the Monitor tool output in hand, the developer can now see each call made and review the Attributions within the respective KPI. For more information, refer to the Attribution section of the Monitor tool documentation.
Storing Monitoring Sessions
The Monitor tool output comes in two flavors: CSV and JSON. Both make for lightweight storage and are fairly easy to parse, as shown later. These files can be attached to emails or stored in a shared location like a network drive.
Power BI and CSV Files
The CSV files downloaded from the Monitor tool can be added to Azure Blob Storage or stored locally and displayed in a Power BI dashboard. This allows analysts and support teams to drill down into sessions to gain further insights. The CSV files work both locally with Power BI Desktop and online. The below image shows a sample taken from a Canvas App Monitor session. Additional information and samples can be found in the Power BI documentation.
Experimenting with Extraction for Analytics
Storing outputs in Azure Blob Storage
During the writing of this article and the previous one, I began to collect outputs from both Canvas and Model Driven Apps, storing them within Azure Blob Storage containers. There are multiple reasons why I chose to utilize Azure Blob Storage, mainly cost, but also interoperability with other services such as Power BI, Azure Data Lake, and Azure Event Grid.
Azure Blob Storage also integrates very well with Azure Functions, Power Automate, and Azure Logic Apps. Each of these includes a triggering mechanism that fires when a new blob is added to a container, working as a sort of drop folder. This made the choice to use Azure Blob Storage easy for me, but I will also point out that Power Automate flows can also be triggered from OneDrive or SharePoint files. This allows Makers to stay within the Microsoft 365 ecosphere and avoid spinning up multiple Azure services if desired.
Extracting to Log Stores
Extracting the messages within the Monitor tool to a log store allows analysts and support teams to parse and query the sessions. How we want to store these messages will determine which services we leverage.
Choosing Azure Application Insights
If we want rich visualization and diagnostic features out of the box, I'd suggest Azure Application Insights. Application Insights allows for pushing messages to specialized tables that feed dashboards and features native to the service, such as End-to-End Transactions and Exception parsing.
Network messages can be stored in the requests or dependencies tables, which are designed, along with page views, to visualize a typical web application's interactions. Fetch network messages, representing calls to an API, fit nicely into the requests table as shown below:

Dependencies, on the other hand, can represent dependent web resources.
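To make that mapping concrete, here is a hypothetical, trimmed-down network message; the field names are taken from the Kusto queries later in this article, but the exact shape and category values of your output may differ:

{
  "messageCategory": "Network",
  "name": "fetch",
  "data": {
    "method": "GET",
    "name": "https://yourorg.api.crm.dynamics.com/api/data/v9.0/accounts",
    "duration": 120.5,
    "sync": false
  }
}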
Using Azure Functions to serve data
Azure Application Insights works well with microservices built using Azure Functions. A benefit of Azure Functions is the ability to trigger on new blobs added to a storage container. Below is the method signature of a sample Azure Function:
[FunctionName("BlobTriggeredMonitorToolToApplicationInsights")]
public void Run(
    [BlobTrigger("powerapps-monitortool-outputs/{name}", Connection = "")] Stream myBlob,
    string name,
    ILogger log)
Reviewing the sample, you'll see that the function takes the JSON payload, parses it, and determines how to add it to Azure Application Insights. Depending on the messageCategory property, it will funnel messages to custom events, requests, or dependencies.
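As a rough illustration of that routing logic (not the full sample), the sketch below assumes the payload deserializes into a list of messages; the MonitorMessage class and the category strings are assumptions inferred from the queries later in this article:

// Minimal sketch: route Monitor messages to Application Insights tables
// based on messageCategory. Types and category values are assumptions,
// not the actual sample code.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using Microsoft.ApplicationInsights;

public class MonitorMessage
{
    public string messageCategory { get; set; }
    public string name { get; set; }
}

public static class MonitorRouter
{
    public static void Route(Stream blob, TelemetryClient telemetry)
    {
        using var reader = new StreamReader(blob);
        var messages = JsonSerializer.Deserialize<List<MonitorMessage>>(reader.ReadToEnd());

        foreach (var message in messages)
        {
            switch (message.messageCategory)
            {
                case "Network":
                    // Fetch calls map naturally to the requests table.
                    telemetry.TrackRequest(message.name, DateTimeOffset.UtcNow,
                        TimeSpan.Zero, "200", success: true);
                    break;
                case "WebResource": // hypothetical category value
                    // Dependent web resources map to the dependencies table.
                    telemetry.TrackDependency("WebResource", message.name,
                        string.Empty, DateTimeOffset.UtcNow, TimeSpan.Zero, true);
                    break;
                default:
                    // Everything else, e.g. "Performance", lands in customEvents.
                    telemetry.TrackEvent(message.name);
                    break;
            }
        }
        telemetry.Flush();
    }
}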
As always, review the Telemetry Client documentation for ideas and techniques to enrich messages sent to Azure Application Insights.
Choosing Azure Log Analytics
Azure Log Analytics allows for custom tables that provide the greatest flexibility. The Data Collector API has a native connector for Power Automate that allows Makers to quickly deliver Monitor messages with a no or low code solution. Both Power Automate and Azure Logic Apps offer triggers on creation of a blob, providing flexibility on which service to choose.
Using Power Automate to serve data
To work with Power Automate, begin by creating a new flow. Set the trigger to the Azure Blob Storage trigger "When a blob is added or modified (properties only)". If a connection hasn't been established, create one. Once created, locate the blob container to monitor. This container will be our drop folder.
The trigger is only designed to tell us a new blob is available, so the next step is to get the blob content. Using the FileLocator property, we can now get the serialized messages and session information and deliver them to Log Analytics.
Within Power Automate, search for the Azure Log Analytics Data Collector "Send Data" action. Set the JSON Request body field to the File Content value from the "Get blob content" action.
The advantage here is that with three actions and no code written, I am able to listen to a drop folder for output files and send them to Azure Log Analytics. The native JSON serialization option from the Monitor tool really serves us well here, allowing a seamless insertion into our custom table.
Ideally we would expand the Power Automate flow to parse the JSON and iterate through messages to allow for individual entries into the table.
Just remember the content may be encoded as "octet-stream" and will need to be converted.
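As a hedged example (the action name Get_blob_content and the $content property are assumptions about how the connector returns binary content in your flow), a Power Automate expression to decode and parse the payload might look like:

json(base64ToString(body('Get_blob_content')?['$content']))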
Sample Kusto Queries
Below are sample Kusto queries designed for Azure Application Insights.
General Messages
//Review Performance Messages
customEvents
| extend cd = parse_json(customDimensions)
| where cd.messageCategory == "Performance"
| project session_Id, name, cd.dataSource
Browser Requests
//Request Method, ResultCode, Duration and Sync
requests
| extend cd = parse_json(customDimensions)
| extend data = parse_json(tostring(cd.data))
| project session_Id, name, data.method, resultCode, data.name, data.duration, data.sync, cd.fileName
//Request Method, ResultCode, Duration and Resource Timings
requests
| extend cd = parse_json(customDimensions)
| extend data = parse_json(tostring(cd.data))
| project session_Id, name, data.method, resultCode, data.name, data.duration, data.startTime, data.fetchStart, data.domainLookupStart, data.connectStart, data.requestStart, data.responseStart, data.responseEnd
Review the Resource Timing API documentation to better understand what these markers are derived from.
Key Performance Indicators
pageViews
| extend cd = parse_json(customDimensions)
| extend cm = parse_json(customMeasurements)
| extend data = parse_json(tostring(cd.data))
| extend attribution = parse_json(tostring(data.Attribution))
| where name == "FullLoad"
| order by tostring(data.FirstInteractionTime), toint(cm.duration)
| project session_Id, name, data.FirstInteractionTime, cm.duration, attribution
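For completeness, if the output landed in an Azure Log Analytics custom table via the Data Collector API instead, a similar query could be run against that table. This is a hypothetical sketch: the API appends _CL to the log type you choose and suffixes string fields with _s, but the exact field names depend on the JSON you posted:

PowerAppsMonitor_CL
| where messageCategory_s == "Performance"
| project TimeGenerated, sessionId_s, name_s, dataSource_s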
Sample Code
Next Steps
In this article we have covered how to work with the Monitor tool output files. Viewing within Power BI dashboards, attaching to DevOps work items, and storing in Azure backed log stores are all possibilities. Sample code and Kusto queries have also been provided to help you get started.
This article showcases use cases and strategies for working with the Monitor tool but really only represents the tip of the iceberg. Continue collecting, examining and churning the output for deep insight into user and platform trends.
If you are interested in learning more, please contact your Technical Account Manager or Microsoft representative for further details.
Your feedback is extremely valuable, so please leave a comment below and I'll be happy to help where I can! Also, if you find any inconsistencies or omissions, or have suggestions, please let me know.