Tag Archives: usage
Dynamics 365 Customer Engagement View Usage Logger using Azure Functions and Application Insights
I recently received the same request from two customers, so I thought it would be a good topic to discuss here so others can take advantage of it as well. The request was as follows: the customers wanted a way to track active usage of the Views in their system to find out which ones actually get used. They can use this information to deactivate unused Views and consolidate the list of views for each entity down to only the ones their users need.
In order to help accomplish this goal, I’m going to use an asynchronous Service Bus plugin registered on the Retrieve message for the SavedQuery entity. This will tell us every time we retrieve a view definition, which should only happen when a user clicks a view from a view picker or through advanced find. There will also be times when the view definition has already been retrieved and is cached locally, so we’ll essentially be tracking “cold loads” of Views, or the first time they are retrieved in a browser session per user.
There is a very similar alternative that I created for customers who prefer Log Analytics to Application Insights. The alternative uses a Logic App in Azure to grab the message from the Service Bus Queue and push the data to Log Analytics.
Summary
Goal:
Identify views with the most traffic/requests, so that other unused views can be deleted and highly used ones can be optimized.
Process:
- Register Service Endpoint message on Retrieve of Saved Query entity in CRM. This will asynchronously post the execution context containing the view data to a Service Bus Queue/Topic, where it can be retrieved by a Logic App.
- The Logic App will parse out the relevant data (Entity Name, View Name) from the execution context, and pass to an Azure Function which will insert it into an Application Insights Tenant where it is logged and can be reported on.
Prerequisites:
- Service Bus Queue created in an Azure subscription; you'll need the connection string for step 2b.
Details
Steps
- Create Service Bus Queue or Topic
- Register Service Endpoint in the CRM Plugin Registration Tool
- Register->New Service Endpoint
- Paste in a Connection string retrieved from the Azure Portal
- On the next screen, change the Message type from .NET Binary to JSON, and enter the Queue or Topic name
- Click OK
- Attach a message processing step to the new service endpoint in the Plugin Registration Tool
- Register->New Step
- In Message, enter Retrieve
- In Primary Entity, enter savedquery
- Change Execution Mode to Asynchronous
- Click Register
- Create an Azure Function App to help translate the JSON from the plugin
- In the Azure Portal, click New->Serverless Function App
- Give the App a unique name, Resource Group, Storage Account
- Click Create
- Click the +/Add button, add a new HTTPTrigger function
- Use this code for your function:
#r "Newtonsoft.Json"
using System.Net;
using System;
using System.Linq;
using Newtonsoft.Json;
using System.Collections.Generic;
using Microsoft.ApplicationInsights;
private static TelemetryClient telemetry = new TelemetryClient();
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // the request body is the execution context posted by the Service Bus plugin step
    dynamic data = await req.Content.ReadAsAsync<object>();
    log.Info(data.CorrelationId.ToString());
    ////////////////////////////////////////////////////////////////////
    //log as much additional information from CRM as we can for auditing
    //we can get CorrelationId, which ties directly back to the plugin
    //execution and is also useful for Microsoft support to have
    //UserId could also be helpful so you can tie a view retrieve directly
    //back to a user in case you want to find out why they use that particular view
    //giving a static Operation Name string will allow you to quickly filter
    //down results to this type of operation if your Application Insights instance is heavily used
    ////////////////////////////////////////////////////////////////////
    telemetry.Context.Operation.Id = data.CorrelationId.ToString();
    telemetry.Context.User.Id = data.UserId.ToString();
    telemetry.Context.Operation.Name = "View Accessed";
    // Target holds the retrieved savedquery as an array of key/value attribute pairs
    string target = data.Target.ToString();
    KeyValuePair<string,object>[] entity = JsonConvert.DeserializeObject<KeyValuePair<string,object>[]>(target);
    List<KeyValuePair<string,object>> entList = entity.ToList<KeyValuePair<string,object>>();
    Dictionary<string,object> entDict = entList.ToDictionary(k=>k.Key, v=>v.Value);
    string newJson = JsonConvert.SerializeObject(entDict);
    // log an event named "<entity type code> - <view name>" so each retrieve is counted per view
    telemetry.TrackEvent(entDict["returnedtypecode"].ToString() + " - " + entDict["name"].ToString());
    return req.CreateResponse(HttpStatusCode.OK, newJson);
}
- Create a new file in your project by expanding View files on the right, clicking Add, and naming the file project.json
- Open project.json and add this code:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.2.0"
      }
    }
  }
}
- The above code tells the Azure Function to download the NuGet package for Application Insights.
- Now we can test the functionality: to start, log in to CRM, navigate to an entity, and change the view
- You can monitor the console in your Function App to see if any errors occur
- Start reviewing results in Application Insights
- In the Azure portal, find Azure Functions and choose the Function App you created for this exercise.
- Click Application Insights
- From here you can click Analytics (small button in the ribbon), then click the + new tab button
- Intellisense is very good so as you keep typing you can tab to complete your entries
- Here is a sample query to display the top views in order in a bar graph format:
customEvents
| where timestamp >= ago(30d)
| project name
| summarize count() by name
| order by count_ desc nulls last
| where count_ > 2
| render barchart
- The first line is the “table” name if you were comparing this query to a SQL query
- The next lines all begin with the pipe (|) operator, which chains operators together; after it, more querying keywords are specified. “where” is just like SQL, specifying a filter clause
- |project col1,col2,col3 specifies the columns to retrieve, like a “select” in sql. Omitting the project line is fine to retrieve all columns
- Comment lines out with // to try omitting various lines
- Functions help with dynamic time operations, like the ago(30d) function to only look back over the last 30 days of logs; you can also use “m” for minutes, “h” for hours, and “d” for days
- |where count_ > 2 filters out the views that were only retrieved once or twice
- |summarize is the group by operator equivalent. In summarize you can use aggregates like count() max() avg(), followed by the keyword “by” which specifies columns to group on.
- |render barchart makes the output a graphical format, omitting makes it a table.
- Here is a sample output: a bar chart of the top views ordered by retrieval count.
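- Because the function also stamps each event with the CRM user via telemetry.Context.User.Id, you can break the same data down per user. Here is a quick variation on the query above (user_Id is the built-in Application Insights column that User.Id maps to):
customEvents
| where timestamp >= ago(30d)
| summarize count() by name, user_Id
| order by count_ desc nulls last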
Visual Studio Console Application Development and Usage
In this blog, we will describe how a native client application can be created to test custom services in Dynamics 365 for Finance and Operations. When we talk about services, there are two types: standard services and custom services.
For any service we create in Dynamics 365 for Finance and Operations, we need authentication. Here are the two types of authentication that are supported:
- Authorization Code Grant Flow: In this authentication type, Azure Active Directory (Azure AD) uses OAuth 2.0 to enable users to authorize access to web applications and web APIs in a user’s Azure AD tenant. The flow is language independent and only requires sending and receiving HTTP messages, without any open-source libraries. For more details, see the Azure AD documentation.
- Service-to-service calls using client credentials (shared secret or certificate): In this authentication type, the OAuth 2.0 Client Credentials Grant Flow permits a web service (a confidential client) to use its own credentials, instead of impersonating a user, to authenticate when calling another web service. In this scenario, the client is typically a middle-tier web service, a daemon service, or a web site. For a higher level of assurance, Azure AD also allows the calling service to use a certificate (instead of a shared secret) as a credential. For more details, see the Azure AD documentation.
When testing Custom services using client applications, Microsoft Azure Active Directory (AAD) supports these two kinds of applications:
- Native client application: This application uses a user name and password for authentication and authorization.
- Web application (Confidential client): A confidential client is an application that can keep a client password confidential to the world. The authorization server assigned this client password to the client application.
The diagram below describes how authorization must be configured.
Console Application
A console application (app) is a program developed in Visual Studio that accepts input parameters, calls the required service, runs business logic, and sends output to the console, which is known as the command prompt.
Console Application Development Steps
The console app development can be divided into three main parts: register the native application in Azure, register the application in Dynamics 365 for Finance and Operations, and Visual Studio app development. Here are the steps you need to take:
Register the Native application in Azure
1. Sign in to the Azure Portal with your account that has access to Dynamics 365 for Finance and Operations.
2. Select Azure Active Directory on the left-hand side, select App registrations, and then click New application registration.
3. Enter app name, select the application type Web app/API, enter your organization’s URL in Sign-on URL (example: https://demoXXXaos.cloudax.dynamics.com) and then click Create.
4. Create a new key for the registered app. Enter a Key description, Duration, and click Save. It will automatically generate a key value that you will need to save so that we can use it later in the Visual Studio project. Also, save the Application ID for later.
5. Now add Required permissions. Select an API: “Microsoft Dynamics ERP.”
6. Now select the permissions, check all 3 delegated permissions then click Select. Click Done.
7. You can verify the delegated permissions on the main dashboard.
Register the application in Dynamics 365 for Finance and Operations
1. Open your Dynamics 365 for Finance and Operations instance, go to System administration > Setup > Azure Active Directory applications, and click the New button to create a new record.
2. Enter your Azure Application ID into the Client Id field. Enter your Azure application name into the Name field. Select Admin in the User ID field.
3. Click Save and it will be created.
Visual Studio app development
1. Run Visual Studio and go to File > New Project. Go to Templates > Visual C#. Navigate to Windows Classic Desktop on the left side of the screen and select Console App on the right side. Enter the name of the project and click OK.
2. Right click on Visual Studio project and select Add > Service reference.
3. In the new window, enter the WSDL URL of any Dynamics 365 for Finance and Operations service (standard or customized).
The URL will look like this: https://{AOSNAME}.cloudax.dynamics.com/soap/services/
4. Enter the service group name in the Namespace field. Click OK and Visual Studio will generate proxy classes for the specified service endpoint.
5. Right click on Visual Studio project and select Manage NuGet Packages.
6. Search for the Microsoft library Microsoft.IdentityModel.Clients.ActiveDirectory, select it, and click the Install button on the right side of the window. Click OK to continue.
7. Expand the project and click on Program.cs.
8. The first method formats the SOAP service URI string.
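The original post showed this code as screenshots; as a minimal sketch (the method and parameter names here are illustrative, not from the post), the helper can simply combine the AOS URI, the /soap/services/ path, and the service group name registered in step 4:
private static string GetSoapServiceUriString(string aosUri, string serviceGroupName)
{
    // e.g. https://demoXXXaos.cloudax.dynamics.com/soap/services/TestServiceGroup
    return aosUri.TrimEnd('/') + "/soap/services/" + serviceGroupName;
}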
9. The next requirement is to get the service binding.
This code first creates an instance of BasicHttpBinding, then reads the configuration parameters from the config file, and creates an instance of RealtimeServiceClient, which is then used to make the actual calls.
BasicHttpBinding is a binding that a Windows Communication Foundation (WCF) service can use to configure and expose endpoints that are able to communicate with ASMX-based web services and clients, as well as other services.
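As a rough sketch of such a binding helper, assuming HTTPS transport security and raised message-size limits (the exact settings in the original post may differ):
private static BasicHttpBinding GetBinding()
{
    // requires: using System.ServiceModel;
    // Transport security, because the Dynamics 365 AOS endpoint is HTTPS-only
    var binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
    // raise the default limits so larger service responses are not rejected
    binding.MaxReceivedMessageSize = int.MaxValue;
    binding.MaxBufferSize = int.MaxValue;
    return binding;
}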
Best practice when dealing with the key (password): the password is stored in unencrypted form in the configuration file, then encrypted in code, and eventually used in the constructor of RealtimeServiceClient (that object requires an encrypted password). In your real applications you should have a dedicated application/process that first encrypts the password acquired from Azure Active Directory and only then stores it in the configuration file. That way, even if the key is stolen, an attacker will not be able to use or decrypt it on a different machine. So, do not save your secrets unprotected.
10. Now, in the Main method of the class, the web authentication code needs to be written first; a sketch follows the parameter list below.
- aosUri – The Dynamics 365 for Finance and Operations AOS URL. Example: https://{AOSNAME}.cloudax.dynamics.com/
- activeDirectoryTenant – The Azure AD directory tenant used with the Dynamics 365 for Finance and Operations server. Example: https://login.windows.net/{COMPANY_DOMAIN}
- activeDirectoryClientAppId – The client ID obtained when the Azure app registration was done.
- activeDirectoryClientAppSecret – The secret key that was generated during the Azure Active Directory registration.
- activeDirectoryResource – The Dynamics 365 for Finance and Operations AOS URL. Note: Do not end this field with a forward slash character (/).
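With those values in hand, the token acquisition might look roughly like the sketch below, using the ADAL library installed in step 6 (the variable names match the placeholders above; this is an illustration, not the post's exact code):
// requires: using Microsoft.IdentityModel.Clients.ActiveDirectory;
var authenticationContext = new AuthenticationContext(activeDirectoryTenant);
var credential = new ClientCredential(activeDirectoryClientAppId, activeDirectoryClientAppSecret);
AuthenticationResult authenticationResult =
    authenticationContext.AcquireTokenAsync(activeDirectoryResource, credential).Result;
// produces a "Bearer <token>" value to attach to every service call
string oauthHeader = authenticationResult.CreateAuthorizationHeader();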
11. Now you must specify the service you want to consume, and then the binding needs to be applied.
12. Once the binding is done, call the contract class for the service to pass the parameters.
13. Finally, write the code and read the result in the console. The code below calls the main entry point method of the service where the business logic is written and runs the method with the parameters passed above. The result is displayed on the command prompt screen.
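Putting the pieces together, the call follows the standard WCF channel pattern. In this sketch, ITestService and RunBusinessLogic are hypothetical stand-ins for the proxy contract and operation Visual Studio generated in step 4, and GetSoapServiceUriString, GetBinding, and oauthHeader come from the earlier sketches:
// requires: using System; using System.ServiceModel; using System.ServiceModel.Channels;
var endpointAddress = new EndpointAddress(GetSoapServiceUriString(aosUri, "TestServiceGroup"));
var factory = new ChannelFactory<ITestService>(GetBinding(), endpointAddress);
IClientChannel channel = (IClientChannel)factory.CreateChannel();
using (var scope = new OperationContextScope(channel))
{
    // attach the OAuth token to the outgoing HTTP request
    var httpRequestProperty = new HttpRequestMessageProperty();
    httpRequestProperty.Headers["Authorization"] = oauthHeader;
    OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = httpRequestProperty;
    // invoke the service operation (hypothetical name) and print the result on the command prompt
    var result = ((ITestService)channel).RunBusinessLogic("input parameter");
    Console.WriteLine(result);
}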
Console Application Usage
A developer can create a console app to do the testing of customized services or standard services created in Dynamics 365 for Finance and Operations.
The console app can also be used as a proof-of-concept demonstration of functionality that will be later incorporated into a Windows desktop application or Universal Windows Platform App.
Conclusion
In this blog we learned how to create a console application by creating connections between Azure, Dynamics 365 for Finance and Operations, and Visual Studio. There are a few more ways in which integrations with Dynamics 365 for Finance and Operations can be done. Here are a few of them:
- SOAP: This is an XML-based protocol for accessing web services that exchanges data synchronously. It supports HTTPS and XML-based integrations.
- REST (Representational State Transfer): This serves the same purpose as SOAP but is lightweight and does not require any XML parsing. It supports HTTPS and JSON-based integrations.
- OData (Open Data): This is a standard protocol for creating and consuming data that also follows a synchronous mode of integration. It supports HTTPS, JSON, and XML-based integrations.
- CDS (Common Data Services): This integrator lets you build an end-to-end view of business data by making Dynamics 365 for Finance and Operations data available in CDS from various data sources.
For more information about integrations in Dynamics 365 for Finance and Operations, here are a couple of useful links:
Data Management & Integration Using Data Entities – Part 1
Data Management and Integration Using Data Entities – Part 2
Happy Dynamics 365’ing!
Announcing the Power BI Usage Metrics Solution Template
Today, we want to highlight one of the newest solution templates available on AppSource, the new Power BI Usage Metrics Solution Template, built by our partner Neal Analytics. It provides a comprehensive view of user activity across your entire Power BI tenant. In addition to providing a detailed view of who’s creating, viewing, sharing, or deleting content across all the workspaces in your organization, it also contains more detailed information on items such as which items in your tenant are using Premium capacity, which gateway names and types are currently deployed, and the data refresh schedules currently set up for your reports and datasets. It’s an invaluable new way for Power BI administrators to monitor and identify adoption and usage of Power BI throughout their organization.
To set up this solution template you will need the following:
When deployed, the solution template takes advantage of the Office 365 Management API to access the audit logs for your Power BI tenant. It fully automates the entire data extraction and transformation for you and stores that data in an Azure SQL database, which gets updated every few hours going forward, depending on the update frequency you select.
Try it out & let us know
If you’d like to learn more about the Power BI Usage Metrics solution template, Adam Saxton has put together a new YouTube video that provides more details and a hands-on look at the solution. Once you’re ready to get started, you can follow this link to go directly to the setup page for the solution. We’re excited to have you get started using this solution template from the folks at Neal Analytics. The team is always interested in any thoughts or feedback – you can reach us through our alias (pbisolntemplates@microsoft.com) or by leaving a comment on the Power BI Solution Template Community page.
Introducing per-user usage metrics: know your audience and amplify your impact
This June, we released usage metrics for reports and dashboards in the Power BI service. The response has been tremendous. Usage metrics have already been used hundreds of thousands of times by report authors to measure their impact and prioritize their next investments. Since release, by far the biggest request has been for a way to understand which users were taking advantage of your content.
You asked, and we delivered.
Today, we are excited to announce that we’re supercharging usage metrics by surfacing the names of your end users. And of course, you’re free to copy and customize the pre-built usage metrics reports to drill into the data. The change is currently rolling out worldwide, and should be completed by the end of the week.
This simple change has the potential to magnify your impact like never before. Now, you can understand exactly who your audience is, and reach out to your top users directly to gather asks and feedback.
Excited? Read on for a walkthrough for what’s new in this update of usage metrics.
Feature Overview
Going forward, when you go to dashboard or report usage metrics, you’ll also see a breakdown of the number of views by user. The visual includes the display names and login names of your end users.
Above: new visual in the pre-built usage metrics showing views by user
Note: if you have an existing personalized copy of the usage metrics report, you can continue using it as usual. However, you’ll need to re-personalize a copy to get the new per-user data.
With the “save as” feature, you can copy and customize the pre-built usage metrics report to further drill into how your end users are interacting with your reports. In this release, we’ve augmented the users dimension with display name, login name and the user’s GUID.
Above: the updated users dimension when customizing the usage metrics report
Once you copy the report, you can remove the pre-set dashboard/report filter to see the usage data – including usernames and UPNs – for the entire workspace.
Tip: the UserGuid (aka Object ID) and UserPrincipalName are both unique identifiers for the user in AAD. That means if you export the usage metrics data, you could join the usage metrics data against more data from your directory, like organizational structure, job title, etc.
For a full overview of usage metrics and its capabilities, read through our documentation.
Administering Usage Metrics in Your Organization
As an IT admin, we understand that you may be tasked with ensuring that Power BI remains compliant with a variety of compliance regulations and standards. With this release, we are giving IT admins further control over which users in their organizations can take advantage of usage metrics.
The usage metrics admin control is now granular, allowing you to enable usage metrics for a subset of your organization. In addition, a new option in the admin portal will allow you to disable all existing usage metrics reports in your organization.
Above: granular admin controls governing who has access to usage metrics
Together, these features give you full control over who in your organization can use usage metrics, regardless of how it’s currently being used. For example, if you have users taking advantage of usage metrics, yet would like to control its rollout, you could delete all usage metrics content for users to start with a blank slate. From there, you could enable the feature for an increasing set of security groups as the rollout progresses.
Next steps
- Try out the feature! To get started, head to the Power BI service and go to any pre-built usage metrics report. Note that we’re still rolling out the feature worldwide, so you may have to wait until the end of the week to see it in action
- Learn more about the feature through our documentation
- Have comments? Want to share a cool use case for the feature? We’d love to hear it! Please leave comments below or in the community forums
- Have ideas for where we should take the feature? Create new ideas or vote on existing ones in UserVoice
Build your own usage report with Power BI
Yesterday I found out we have a great feature in Power BI that everyone should know about, but that I stumbled upon by accident. It turns out you can create your own, fully customized usage reports to share. I found it while going over the documentation right here: https://powerbi.microsoft.com/en-us/documentation/powerbi-service-usage-metrics/
So let’s get at it. I created this report:
Now I want to create a report that I can share with the sales team that shows how many users I have on this dashboard. First I click on “Usage metrics” at the top.
That opens the out-of-the-box report that comes with Power BI; the key here is that I also have the ability to do “save as”:
This now adds the report to my workspace, where I can change it however I want:
But it also adds the dataset to Power BI:
This now allows me to connect to this dataset from PBI desktop as well:
And now I can do anything I want: use my own visuals, add new measures, etc.
Interesting to note here is that there is a single model for all the reports and another one for all the dashboards per workspace. This allows you to build some comprehensive reporting that you can share with stakeholders around a particular area.
Pretty cool, right?
Organization Insights allows for a deeper understanding of Dynamics 365 usage
In late December 2016, Microsoft released a new solution for Dynamics 365 called Organization Insights that surfaces a set of KPIs, providing administrators with a deep understanding of Dynamics 365 adoption and a way to troubleshoot potential problems.
The KPIs surfaced include charts that can measure and identify:
- Active and Inactive users
- Entities by number of records and storage usage
- Storage by CRM Instance
- Workflows that consistently fail, resulting in poor data quality and unexpected results
- Plugins that may not be working as expected
The available charts are customizable, so you can build custom dashboards to meet your needs, and an additional feature allows developers to extend the out-of-box charts to create their own KPIs.
Organization Insights can be installed into Dynamics 365 systems from Microsoft AppSource. The installation is painless but can only be applied to cloud-based organizations that have been upgraded to Dynamics 365. By default, the Organization Insights dashboards are only available to System Administrators and Customizers, but security roles can be updated to give other users access.
Organization Insights will be a boon to system administrators looking for data about who is using CRM most effectively as well as which users might need some additional training to have the best experience.
About enCloud9
enCloud9 has one of the most experienced Microsoft Dynamics CRM teams in the US. From pre-sales to project management, and user support, we respond quickly with our expertise to answer your questions. Our history dates back to 2009, but our experience dates back even longer. Our consultants have been advising companies for almost thirty years to give them the tools to achieve their goals. Our experience leads to your success. We use our unique approach to help small and medium-sized businesses lower their costs and boost productivity through Microsoft’s powerful range of cloud-based software.
We can be contacted at our webform or call us today at 402-235-8540.
New SSAS memory usage report using Power BI
I have created several versions of the SSAS memory usage report, here and here. But with the new 1200 compatibility level these memory reports stopped working, because the DMVs no longer work for these databases. So I decided it was time to create a new one, this time of course using Power BI. Here I can also use Power Query to do some additional things I couldn’t do before.
So let’s look at the reports and data we have available; in a follow-up post we’ll look at how I built it.
To start you can download the Power BI Template here on GitHub (more on GitHub later).
Double clicking the file gives you a prompt where you can enter your server name and a database name (pick any, unfortunately PQ mandates us to connect to AS with a database even though it is not needed).
This will start loading the report with the data from your server (you do have to be admin on the server to run this report).
In the report we will have two tables:
- “Databases”. This table contains all the databases on the server that are compatibility level 1200 or higher (to future proof it)
- “DatabaseSize”. This table contains the low-level detail with database name, partition, column, row count, size in memory, and much more interesting stuff.
Using that I created a few reports:
- Databases on server by Record Count, which shows all the databases on the server by recordcount
- And then a similar one that show Databases by size in MB
- Then one that shows Tables and partitions, with the option to select a database so you can filter the results down to a single database.
Here you can see the report filtered by a single database.
The treemap shows the tables and, within them, the partitions, so you can quickly identify either one. When you hover over a partition you get more information, like size, record count, and the size a single row takes.
- In the next report you can look at the top 10 partitions for either the entire server or per database:
- In the last report you will be able to look at the most “expensive” column based on the memory it uses per row of data
That’s it. Lots of interesting information already, but there is more hidden in the model that I haven’t explored yet.
As mentioned before, I put this Power BI desktop template on GitHub. Anyone can download it there or, even better, make updates to it with nifty new things they discover!
Have fun!
What to Think, Ep. 32: Opower’s secret weapon for cutting energy usage
Opower has vaulted itself onto public markets by analyzing energy usage for utility companies and providing charts and other bits from data to ratepayers. Opower has an ambitious master plan, and it involves decreasing energy usage at scale.
For that to happen, Opower needs a seriously sophisticated approach. It goes beyond data science. Deena Rosen, Opower’s vice president of user experience, believes the company is onto something by focusing on what she describes as behavioral design. It’s sort of like how advertisements can motivate us to buy things — but for a more noble, potentially world-saving purpose.
Whatever Opower is doing, it seems to be a good idea. The company has signed up more than 95 utilities, according to a recent statement. VentureBeat’s Dylan Tweney and I learned about the company and its uses of behavioral design in our latest episode of What to Think.
Plus, we also tell you what to think about:
All this and more is in our latest weekly episode. Check it out!
Download the MP3 of Episode 32 here.
Or you can find this latest edition of What to Think on iTunes.
In addition, you can listen to us on Stitcher or get the What to Think RSS feed for the podcast player of your choice.
Enjoy the show!