Monitoring the Power Platform: Model Driven Apps – Monitor Tool Part 2: Session Consumption and Analytics

August 15, 2020   Microsoft Dynamics CRM

Summary

 

Monitoring Dynamics 365 or Model Driven Applications is not a new concept. Understanding where services are failing, where users are running into errors, and where forms and business processes could be tuned for performance is a key driver for most, if not all, businesses, from small companies to enterprises. Luckily, the Dynamics 365 platform provides many tools to help audit and monitor business and operational events.

This article will cover collecting, querying and analyzing user interface events, specifically from the recently announced Monitor Tool for Model Driven Apps. The previous article covered message data points and how to interpret them. This time around, we will have a little fun exploring ways to utilize the output sessions. We will discuss how to build robust work items in Azure DevOps with Monitor output. We'll look at consuming and storing outputs for visualizations and analytics with Kusto queries. Finally, samples will be provided to parse and load session messages into Azure Application Insights and Azure Log Analytics.

The Monitor Tool

 

The Monitor Tool allows users and team members to collect messages and work together in debugging sessions. To begin, the Monitor Tool can be launched from the Power Apps Maker Portal. Once launched, the Play Model Driven App button can be pressed to begin a session attached to the tool.


[Image: launching the Monitor tool from the Power Apps Maker Portal and starting a session with the Play Model Driven App button]

The Monitor Tool can also be started by adding “&monitor=true” to the URL of your Model Driven Application.
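For example, assuming a hypothetical environment URL and app id (both placeholders), the full address could look like this:

https://yourorg.crm.dynamics.com/main.aspx?appid=00000000-0000-0000-0000-000000000000&monitor=true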

After consenting to start a session, the Monitor Tool will light up rapidly with various messages. Similar to the "Canvas Driven Apps – The Monitoring Tool" article, each row can be further drilled into for investigation.

[Image: Monitor tool session rows that can be drilled into for further investigation]

Jesse Parsons' article on the Monitor Tool, titled "Monitor now supports model-driven apps", provides a thorough deep dive including sample scenarios. I highly suggest reviewing it and keeping it close by for reference.

Thoughts on Canvas Apps

 

The Monitor tool works with Power Apps Canvas Driven Apps as shown in the article "Canvas Driven Apps – The Monitoring Tool". While this article is focused on Model Driven Apps, remember that these techniques can also be applied to Canvas Apps.

Consuming Monitor Sessions

 

Each time the Monitor tool is opened, a new session is created. Within each session are events that describe actions taken within the session as well as other helpful messages. Storing these sessions allows support teams to better understand errors and issues that arise during testing and production workloads. The previous article, "Monitor Tool Part 1: Messages and Scenarios", covers scenarios that support teams can use to better understand the data points delivered in the Monitor Tool.

The Monitor tool can also help analysts who want to learn more about the platform, for instance by revealing user tendencies such as how long users spent on a page and which controls they interacted with in Canvas Driven Apps. For testing, the tool can help with non-functional test strategies like A/B testing. Analyzing performance messages can point to potential code coverage gaps or advise on user experience impact. Network calls can be scrutinized to determine if queries can be optimized or web resources minified. The Monitor tool, in my opinion, really can open up a new view on how the platform is consumed and how users interact with it.

Attaching to Azure DevOps Work Items

 

The Monitor Tool download artifacts work nicely with Azure DevOps Work Items. They can be attached to Bugs, Tasks and even Test Cases when performing exploratory or other types of tests.

[Image: Monitor tool output attached to an Azure DevOps work item]

When working with test cases within Azure DevOps, analysts craft work items with use cases and expected outcomes to deliver to Makers and Developers. Specifically with Test Cases, Quality Analysts can leverage the Monitor Tool in conjunction with the Test and Feedback Browser Extension. This allows for robust test cases complete with steps, screenshots, client information and the Monitor Tool output attached. Consider the gif below showing an example of using both the browser extension and the Monitor tool output.

[Animation: logging an Azure DevOps bug with the Test and Feedback extension and attaching the Monitor tool output]

In that example, we see an analyst has found a performance issue with a Dynamics 365 form. The analyst logged a new bug and included annotations, screenshots and the Monitor tool output. A developer can be assigned, pick this up and begin working on the bug. With the Monitor tool output, the developer can now see each call made and review the Attributions within the respective KPI. For more information, refer to the Attribution section within Part 1.

[Image: reviewing KPI Attributions from the Monitor tool output attached to the bug]

Storing Monitoring Sessions

 

The Monitor tool output comes in two flavors: CSV and JSON. Both make for lightweight storage and are fairly easy to parse, as shown later. These files can be attached to emails or stored in a shared location like a network drive.

Power BI and CSV Files

 

The CSV files downloaded from the Monitor tool can be added to Azure Blob Storage or stored locally and displayed in a Power BI Dashboard. This allows analysts and support teams to drill down into sessions to gain further insights. The CSV files work both locally with Power BI Desktop and online. The below image shows a sample taken from a Canvas App Monitor Session. Additional information and samples can be found in the "Canvas Driven Apps – The Monitoring Tool" article.

[Image: Power BI dashboard built from a Canvas App Monitor session CSV file]

Experimenting with Extraction for Analytics

 

Storing outputs in Azure Blob Storage

 

During the writing of this article and the Monitor Tool for Canvas Apps article, I began to collect outputs from both and store them within Azure Blob Storage containers. There are multiple reasons why I chose to utilize Azure Blob Storage, mainly cost but also interoperability with other services such as Power BI, Azure Data Lake and Azure Event Grid.

Azure Blob Storage also integrates very well with Azure Logic Apps, Azure Functions and Power Automate Flows. Each of these includes a triggering mechanism on a new blob added to a container, working as a sort of drop folder. This made the choice to use Azure Blob Storage easy for me, but I will also point out that Power Automate Flows can be triggered from OneDrive or SharePoint as well. This allows Makers to stay within the Microsoft 365 ecosphere and avoid spinning up multiple Azure services if desired.

Extracting to Log Stores

 

Extracting the messages within the Monitor tool to a log store allows analysts and support teams to parse and query the sessions. How we want to store these messages will determine which services we leverage.

Choosing Azure Application Insights

 

If we want distributed transaction tracing, I'd suggest Azure Application Insights. Application Insights allows for pushing messages to specialized tables that feed dashboards and features native to the service, such as End to End Transactions and Exception parsing.

Network messages can be stored in the requests or dependencies tables, which are designed, along with page views, to visualize a typical web application's interactions. Fetch network messages, representing calls to an API, fit nicely into the requests table as shown below:

[Image: fetch network messages stored in the Application Insights requests table]

Dependencies, on the other hand, can represent dependent web resources.

[Image: web resource calls stored in the Application Insights dependencies table]

Using Azure Functions to serve data

 

Azure Application Insights works well with microservices built using Azure Functions. A benefit of Azure Functions is the ability to have a function fire when a blob is created within a container. For more information and a helpful quick start to working with blob-triggered Azure Functions, check out this reference. Below is the method signature of a sample Azure Function:

[FunctionName("BlobTriggeredMonitorToolToApplicationInsights")]
public void Run([BlobTrigger("powerapps-monitortool-outputs/{name}", Connection = "")]Stream myBlob, string name, ILogger log)

In the sample provided, you'll see that the function takes the JSON payload, parses it and determines how to add it to Azure Application Insights. Depending on the messageCategory property, it will funnel messages to custom events, requests or dependencies.
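The linked sample in the Sample Code section contains the full implementation. As an illustration only, here is a minimal sketch of that routing idea; the message shape (a messageCategory property, a name and a duration) and the "Network" category value are assumptions made for readability, not the official Monitor schema.

// Minimal sketch (assumed message shape, not the official Monitor schema):
// route each message to Application Insights based on its messageCategory.
using System;
using System.IO;
using System.Text.Json;
using Microsoft.ApplicationInsights;

public class MonitorMessageRouter
{
    private readonly TelemetryClient _telemetry;

    public MonitorMessageRouter(TelemetryClient telemetry) => _telemetry = telemetry;

    public void Route(Stream monitorJson)
    {
        using var doc = JsonDocument.Parse(monitorJson);

        // Assumption: the download contains an array of message objects.
        foreach (var message in doc.RootElement.EnumerateArray())
        {
            var category = message.TryGetProperty("messageCategory", out var c)
                ? c.GetString()
                : "Unknown";

            switch (category)
            {
                case "Network": // assumed category value for fetch/network messages
                    // Hypothetical mapping: network calls land in the requests table.
                    _telemetry.TrackRequest(
                        message.GetProperty("name").GetString(),
                        DateTimeOffset.UtcNow,
                        TimeSpan.FromMilliseconds(message.GetProperty("duration").GetDouble()),
                        "200",   // result code/success hard-coded here for brevity
                        true);
                    break;
                default:
                    // Everything else lands in customEvents for the Kusto queries below.
                    _telemetry.TrackEvent(category);
                    break;
            }
        }
    }
}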

As always, review the Telemetry Client for ideas and techniques to enrich messages sent to Azure Application Insights. Also, if desired, review how to Sample messages to reduce noise and keep cost down.

Choosing Azure Log Analytics

 

Azure Log Analytics allows for custom tables that provide the greatest flexibility. The Data Collector API has a native connector to Power Automate that allows Makers to quickly deliver Monitor messages with a no or low code solution. Power Automate and Azure Logic Apps both offer triggers on the creation of a blob, providing flexibility on which service to choose.
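The next section uses the no-code Power Automate connector, but the same Data Collector API can also be called directly from code. Below is a minimal sketch based on the documented SharedKey signing scheme; the workspace id, key and table name are placeholders, and this snippet is not part of the linked samples.

// Minimal sketch of the Log Analytics HTTP Data Collector API (SharedKey signing).
// WorkspaceId, SharedKey and LogType are placeholders; records land in MonitorSession_CL.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

public static class LogAnalyticsSender
{
    private const string WorkspaceId = "<workspace-id>";   // placeholder
    private const string SharedKey   = "<primary-key>";    // placeholder
    private const string LogType     = "MonitorSession";   // custom table name (a _CL suffix is added)

    public static async Task SendAsync(string jsonPayload)
    {
        // Build the SharedKey signature documented for the Data Collector API.
        var date = DateTime.UtcNow.ToString("r");
        var stringToSign =
            $"POST\n{Encoding.UTF8.GetByteCount(jsonPayload)}\napplication/json\nx-ms-date:{date}\n/api/logs";
        using var hmac = new HMACSHA256(Convert.FromBase64String(SharedKey));
        var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        var content = new StringContent(jsonPayload, Encoding.UTF8);
        content.Headers.ContentType = new MediaTypeHeaderValue("application/json");

        var request = new HttpRequestMessage(HttpMethod.Post,
            $"https://{WorkspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01")
        { Content = content };
        request.Headers.TryAddWithoutValidation("Authorization", $"SharedKey {WorkspaceId}:{signature}");
        request.Headers.Add("Log-Type", LogType);
        request.Headers.Add("x-ms-date", date);

        // A shared HttpClient would be preferable outside of a short sketch.
        using var client = new HttpClient();
        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}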

Using Power Automate to serve data

 

To work with Power Automate, begin by creating a new Power Automate flow. Set the trigger type to use the Azure Blob Storage trigger action "When a blob is added or modified". If a connection hasn't been established, create a connection. Once created, locate the blob container to monitor. This container will be our drop folder.

The trigger is only designed to tell us a new blob is available, so the next step is to get the blob content. Using the FileLocator property, we can now get the serialized messages and session information and deliver them to Log Analytics.

Within Power Automate, search for the “Send Data” action. Set the JSON Request body field to be the File Content value from the “Get blob content” action.

[Image: Power Automate flow passing the File Content value to the Send Data action]

The advantage here is that with three actions and no code written, I am able to listen to a drop folder for output files and send them to Azure Log Analytics. The native JSON serialization option from the Monitor tool really serves us well here, allowing a seamless insertion into our custom table.

[Image: Monitor session records landing in a custom Azure Log Analytics table]

Ideally we would expand the Power Automate flow to parse the JSON and iterate through messages to allow for individual entries into the table.

[Image: expanded Power Automate flow parsing the JSON and iterating through individual messages]

Just remember that the content may be encoded as "octet-stream" and will need to be converted.
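As a rough sketch only, the conversion can be done with a workflow expression before the Send Data action; the internal action name Get_blob_content below is an assumption and depends on how the action was named in your flow:

json(base64ToString(body('Get_blob_content')?['$content']))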

Sample Kusto Queries

 

Below are sample Kusto queries designed for Azure Application Insights.

General Messages

//Review Performance Messages
customEvents 
| extend cd=parse_json(customDimensions)
| where cd.messageCategory == "Performance"
| project session_Id, name, cd.dataSource

Browser Requests

//Request Method, ResultCode, Duration and Sync
requests 
| extend cd=parse_json(customDimensions)
| extend data=parse_json(tostring(cd.data))
| project session_Id, name, data.method, resultCode, data.name, data.duration, data.sync, cd.fileName

//Request Method, ResultCode, Duration and Resource Timings
requests 
| extend cd=parse_json(customDimensions)
| extend data=parse_json(tostring(cd.data))
| project session_Id, name, data.method, resultCode, data.name, data.duration,
data.startTime, 
data.fetchStart,
data.domainLookupStart,
data.connectStart,
data.requestStart,
data.responseStart,
data.responseEnd

Review the documentation on Resource Timings located here to better understand what these markers are derived from.

[Image: query results showing the resource timing markers for each request]

Key Performance Indicators

//FullLoad KPI with Attribution details
pageViews 
| extend cd=parse_json(customDimensions)
| extend cm=parse_json(customMeasurements)
| extend data=parse_json(tostring(cd.data))
| extend attribution=parse_json(tostring(data.Attribution))
| where name=="FullLoad"
| order by tostring(data.FirstInteractionTime), toint(cm.duration)
| project session_Id, name, data.FirstInteractionTime,cm.duration, attribution

Sample Code

 

Azure Function and Azure Application Insights – Monitor Tool Extractor

Power Automate and Azure Log Analytics

Optional Azure Application Insights Custom Connector

Next Steps

 

In this article, we have covered how to work with the Monitor tool output files. Viewing within Power BI Dashboards, attaching to DevOps work items and storing in Azure-backed log stores are all possibilities. Sample code and Kusto queries have also been provided to help you get started.

This article showcases use cases and strategies for working with the Monitor tool but really only represents the tip of the iceberg. Continue collecting, examining and churning the output for deep insight into user and platform trends.

If you are interested in learning more about specialized guidance and training for monitoring or other areas of the Power Platform, which includes a monitoring workshop, please contact your Technical Account Manager or Microsoft representative for further details.

Your feedback is extremely valuable so please leave a comment below and I’ll be happy to help where I can! Also, if you find any inconsistencies, omissions or have suggestions, please go here to submit a new issue.

Index

 

Monitoring the Power Platform: Introduction and Index
