Tag Archives: using

Conditional Formatting Using Text In Power BI Desktop


Hello P3 Nation! Today’s “post” is actually going to be a video link. Sometimes there are certain subjects, concepts, or post ideas that just don’t translate well to the written word, and especially to screenshots. So this post will be in moving-picture form! We’ve posted it to our YouTube channel, which you can go to here. Besides today’s video, we have TONS of content on that channel, so please take a look at our other awesome videos as well.

Today’s topic covers how to apply Conditional Formatting using text in Power BI Desktop. Conditional Formatting recently got a feature update that allows you to apply color formatting rules based on ANY DAX measure. I recently encountered a business scenario that had me play around with this feature a bit, and I came up with a clever way to use it for a client. Below I’ve included the video link, the download link for the .pbix file, and links to other articles explaining the various features & design techniques I’ve applied in this report. Otherwise, enjoy the video!

My Top 5 Power BI Practices: Transforming Good to GREAT: Article talks about a lot of the Formatting and Design Practices you see above, plus the DAX Formulas table.

Power BI: Transforming Good to GREAT: Video that is an updated discussion & walkthrough of the article above. Discusses Formatting, Design Practices, What If Scenarios, and Forecasting.

You can download the .pbix file here.

We like to think that is no accident.  We’re different.  First of a new breed – the kind who can speak tech, biz, and human all at the same time.

Want this kind of readily-absorbable, human-oriented Power BI instruction for your team? Hire us for a private training at your facility, OR attend one of our public workshops!



Monitoring Dynamics 365 CE service health and messages using the Microsoft Office 365 Service Communications API

Hi Everyone!

Recently I had a conversation with a customer about options for monitoring Microsoft Dynamics 365 CE service health. In their particular case, they needed greater visibility than the Office 365 Admin Center or the Office 365 mobile application provided. I find great value in both of these, and as a Dynamics 365 service administrator they are very useful. However, in some cases users may not have the appropriate role within Office 365 to access these messages and become dependent on someone who does.

Doing a bit of research, I stumbled upon the Office 365 Service Communications API. Reviewing it, I found that there are two separate endpoints, each with its own authorization mechanism, service contracts, etc.: the Office 365 Service Communications API and the preview version of the Office 365 Service Communications API.

The major difference I’ve seen while consuming both APIs is that one requires an Office 365 Service Administrator (or a partner role acting on behalf of, AOBO) for authentication, while the other uses OAuth. This means one uses a username and password, similar to logging into Office 365, while the other uses a client id and client secret. Discuss this with your application security team to determine what options you have, but I’d suggest the preview endpoint with OAuth, per the recommendation given in the original API reference. This article does a good job detailing the steps needed to register your application with Azure Active Directory. In terms of data and events available, I have not seen any big difference between the two versions of the API, so from here on I will cover the preview API; if you are still interested in the original API, I’d suggest reviewing this sample MVC application.

Once you have your authorization token, you can append it to any request to the Office 365 Service Communications API. The API takes a tenant identifier (tenant GUID or tenant name, e.g. contoso.onmicrosoft.com) and the desired operation:


Here’s a sample request for my sandbox tenant:


GET https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Services

Authorization: Bearer <authorization bearer token>

Host: manage.office.com

As you can see, your request can be as simple as a GET with the authorization header provided and the correct URL. The Services response gives you the current services of all of the Office 365 applications in the tenant, which we can query for additional information. Since this post is focused on Dynamics 365, let’s add a filter to the CurrentStatus API call for only the Dynamics 365 status:


GET https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/CurrentStatus?$filter=Workload%20eq%20'DynamicsCRM'

Authorization: Bearer <authorization bearer token>

Host: manage.office.com

Response Body:

{
  "@odata.context":"https://office365servicecomms-prod.cloudapp.net/api/v1.0/contoso.onmicrosoft.com/$metadata#CurrentStatus",
  "value":[
    {
      "FeatureStatus":[
        {"FeatureDisplayName":"Sign In","FeatureName":"signin","FeatureServiceStatus":"ServiceOperational","FeatureServiceStatusDisplayName":"Normal service"},
        {"FeatureDisplayName":"Sign up and administration","FeatureName":"admin","FeatureServiceStatus":"ServiceOperational","FeatureServiceStatusDisplayName":"Normal service"},
        {"FeatureDisplayName":"Organization access","FeatureName":"orgaccess","FeatureServiceStatus":"ServiceOperational","FeatureServiceStatusDisplayName":"Normal service"},
        {"FeatureDisplayName":"Organization performance","FeatureName":"orgperf","FeatureServiceStatus":"ServiceOperational","FeatureServiceStatusDisplayName":"Normal service"},
        {"FeatureDisplayName":"Components/Features","FeatureName":"crmcomponents","FeatureServiceStatus":"ServiceRestored","FeatureServiceStatusDisplayName":"Service restored"}
      ],
      "Status":"ServiceRestored",
      "StatusDisplayName":"Service restored",
      "StatusTime":"2018-04-26T19:09:29.3038421Z",
      "Workload":"DynamicsCRM",
      "WorkloadDisplayName":"Dynamics 365"
    }
  ]
}




Reviewing this response object, I can see the current status of Dynamics 365 and of features such as Sign In, Organization access, Components/Features, etc. by referencing the FeatureServiceStatusDisplayName property. In this response we can see that the current status of Dynamics 365 is ‘Service Restored’ and the specific feature affected is ‘Components/Features’.
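To make that check programmatic, here is a minimal Python sketch that flags any feature whose status isn’t “Normal service”. The payload is abbreviated from the sample response above (and assumes the per-workload feature array is named FeatureStatus, as in the full API response):

```python
import json

# Abbreviated from the sample CurrentStatus response above
payload = json.loads("""
{
  "value": [
    {
      "Workload": "DynamicsCRM",
      "FeatureStatus": [
        {"FeatureDisplayName": "Sign In",
         "FeatureServiceStatusDisplayName": "Normal service"},
        {"FeatureDisplayName": "Components/Features",
         "FeatureServiceStatusDisplayName": "Service restored"}
      ]
    }
  ]
}
""")

def features_not_operational(status):
    """Return (feature, status) pairs for features not reporting 'Normal service'."""
    return [
        (f["FeatureDisplayName"], f["FeatureServiceStatusDisplayName"])
        for workload in status["value"]
        for f in workload["FeatureStatus"]
        if f["FeatureServiceStatusDisplayName"] != "Normal service"
    ]

print(features_not_operational(payload))
# [('Components/Features', 'Service restored')]
```

The same filter works unchanged against the full response, since it only touches the properties shown in the sample.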

If we want to gain some historical perspective on how long this feature may have been affected we can use the HistoricalStatus as shown:


GET https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/HistoricalStatus?$filter=Workload%20eq%20'DynamicsCRM'

Authorization: Bearer <authorization bearer token>

For the sake of the length of this article I’ll exclude the response, but within it you’ll see a contract similar to the CurrentStatus response, showing the status of the Dynamics 365 features over a period of time, including a Message Center identifier which can be used for our final method call today, GetMessages. GetMessages returns the title, description, message text, impact date, affected tenants, message id, etc. This method is highly beneficial in many ways: I can review messages for current outages, review messages for planned maintenance, and filter messages based on a combination of message identifiers, features, areas of interest, time frames, etc.

Here’s a reference of sample requests to help you filter by specific criteria:


Filter by Id:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=Id%20eq%20'CR133521'

Filter by Message Center:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=MessageType%20eq%20Microsoft.Office365ServiceComms.ExposedContracts.MessageType'MessageCenter'


Filter by Incident:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=MessageType%20eq%20Microsoft.Office365ServiceComms.ExposedContracts.MessageType'Incident'

Planned Maintenance:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=MessageType%20eq%20Microsoft.Office365ServiceComms.ExposedContracts.MessageType'PlannedMaintenance'

Filter By Start Time and End Time:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=StartTime%20ge%202018-04-23T00:00:00Z%20and%20EndTime%20le%202018-04-28T00:00:00Z

Filter by Workload:

https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=Workload%20eq%20'DynamicsCRM'

At this point you have an API that gives you the current status of Dynamics 365 and its features, information on how long a potential degradation may have impacted your tenant, and the ability to plan for updates and maintenance performed on the application. Watch for my next blog post detailing how to schedule a workflow that consumes the API and sends an email reporting current status and messages to a single user or a distribution list using Microsoft Flow!
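Putting the sample requests above into reusable form, here is a small Python helper that builds these filtered URLs (the tenant name is a placeholder; only the URL construction is shown, with the actual GET and Authorization header left to your HTTP client of choice):

```python
from urllib.parse import quote

def service_comms_url(tenant, operation, odata_filter=None):
    """Build a Service Communications API URL, percent-encoding the $filter value."""
    url = f"https://manage.office.com/api/v1.0/{tenant}/ServiceComms/{operation}"
    if odata_filter:
        # Keep quotes and parentheses literal, as in the sample requests above
        url += "?$filter=" + quote(odata_filter, safe="'()")
    return url

# Reproduces the "Filter by Workload" sample above
print(service_comms_url("contoso.onmicrosoft.com", "Messages",
                        "Workload eq 'DynamicsCRM'"))
# https://manage.office.com/api/v1.0/contoso.onmicrosoft.com/ServiceComms/Messages?$filter=Workload%20eq%20'DynamicsCRM'
```

Passing a different operation name ("CurrentStatus", "HistoricalStatus") covers the other calls shown earlier.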


Get started with Office 365 Management APIs

About Office 365 Admin Roles

Office 365 Service Communications API Overview

Office 365 Service Communications API Sample Code

Office 365 Service Communications API Overview (preview)

Thanks and happy coding!

Ali Youssefi


Dynamics CRM in the Field

Dynamics 365 Customer Engagement View Usage Logger using Azure Functions and Application Insights

I recently received the same request from two customers, so I felt it might be a good topic to discuss here so others can take advantage of it as well. The request was as follows: the customers wanted a way to track active usage of the Views in their system to find out which ones actually get used. They can use this information to deactivate unused Views and consolidate the list of views for each entity down to only the ones their users need.

In order to help accomplish this goal, I’m going to use an asynchronous Service Bus plugin registered on the Retrieve message for the SavedQuery entity. This will tell us every time we retrieve a view definition, which should only happen when a user clicks a view from a view picker or through advanced find. There will also be times when the view definition has already been retrieved and is cached locally, so we’ll essentially be tracking “cold loads” of Views, or the first time they are retrieved in a browser session per user.

This article has a very similar alternative that I created for customers who prefer Log Analytics to Application Insights. The alternative uses a Logic App in Azure to grab the message from the Service Bus queue and push the data to Log Analytics.



Identify views with the most traffic/requests, so that unused views can be deleted and highly used ones can be optimized.


  • Register Service Endpoint message on Retrieve of Saved Query entity in CRM. This will asynchronously post the execution context containing the view data to a Service Bus Queue/Topic, where it can be retrieved by a Logic App.
  • The Logic App will parse out the relevant data (Entity Name, View Name) from the execution context, and pass to an Azure Function which will insert it into an Application Insights Tenant where it is logged and can be reported on.


  • Service Bus Queue created in an Azure subscription; you will need its connection string for step 2b.



  1. Create Service Bus Queue or Topic
  2. Register Service Endpoint in the CRM Plugin Registration Tool
    1. Register->New Service Endpoint
    2. Paste in a Connection string retrieved from the Azure Portal
    3. On the next screen, Change the Message type from .Net Binary to JSON, Enter the Queue or Topic Name
    4. Click OK
  3. Attach a message processing step to the new service endpoint in the Plugin Registration Tool
    1. Register->New Step
    2. In Message, enter Retrieve
    3. In Primary Entity, enter savedquery
    4. Change Execution Mode to Asynchronous
    5. Click Register
  4. Create an Azure Function App to help translate the JSON from the plugin
    1. In the Azure Portal, click New->Serverless Function App
    2. Give the App a unique name, Resource Group, Storage Account
    3. Click Create
    4. Click the +/Add button, add a new HTTPTrigger function
    5. Use this code for your function:

      #r "Newtonsoft.Json"

      using System.Net;
      using System;
      using System.Linq;
      using Newtonsoft.Json;
      using System.Collections.Generic;
      using Microsoft.ApplicationInsights;

      private static TelemetryClient telemetry = new TelemetryClient();

      public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
      {
          dynamic data = await req.Content.ReadAsAsync<object>();

          //log as much additional information from CRM as we can for auditing
          //we can get CorrelationId, which ties directly back to the plugin
          //execution and is also useful for Microsoft support to have
          //UserId could also be helpful so you can tie a view retrieve directly
          //back to a user in case you want to find out why they use that particular view
          //giving a static Operation Name string will allow you to quickly filter
          //down results to this type of operation if your Application Insights instance is heavily used
          telemetry.Context.Operation.Id = data.CorrelationId.ToString();
          telemetry.Context.User.Id = data.UserId.ToString();
          telemetry.Context.Operation.Name = "View Accessed";

          string target = data.Target.ToString();
          KeyValuePair<string,object>[] entity = JsonConvert.DeserializeObject<KeyValuePair<string,object>[]>(target);
          List<KeyValuePair<string,object>> entList = entity.ToList<KeyValuePair<string,object>>();
          Dictionary<string,object> entDict = entList.ToDictionary(k => k.Key, v => v.Value);
          string newJson = JsonConvert.SerializeObject(entDict);

          telemetry.TrackEvent(entDict["returnedtypecode"].ToString() + " - " + entDict["name"].ToString());

          return req.CreateResponse(HttpStatusCode.OK, newJson);
      }


    6. Create a new file in your project by expanding View files on the right, clicking Add, and naming the file project.json.
    7. Open project.json and add this code:


      {
        "frameworks": {
          "net46": {
            "dependencies": {
              "Microsoft.ApplicationInsights": "2.2.0"
            }
          }
        }
      }





    8. The above code tells the Azure Function runtime to download the NuGet package for Application Insights.
  5. Now we can test the functionality. To start, log in to CRM, navigate to an entity, and change the view.
    1. You can monitor the console in your Function App to see if any errors occur
  6. Start reviewing results in Application Insights
    1. In the Azure portal, find Azure Functions and choose the Function App you created for this exercise.
    2. Click Application Insights
      1. From here you can click Analytics (small button in the ribbon), then click the + new tab button
      2. Intellisense is very good so as you keep typing you can tab to complete your entries
      3. Here is a sample query to display the top views in order in a bar graph format:


      customEvents

      | where timestamp >= ago(30d)

      | project name

      | summarize count() by name

      | order by count_ desc nulls last

      | where count_ > 2

      | render barchart

    3. The first line is the “table” name, if you were comparing this query to a SQL query.
    4. The following lines each begin with the pipe (|) operator, which is just syntax; after it, a query keyword is specified. “where” is just like SQL, specifying a filter clause.
    5. | project col1,col2,col3 specifies the columns to retrieve, like a “select” in SQL. Omitting the project line retrieves all columns.
    6. Comment lines out with // to try omitting various lines.
    7. Functions help with dynamic time operations, like the ago(30d) function to only look back over the last 30 days of logs; you can also use “m” for minutes, “h” for hours, and “d” for days.
    8. | where count_ > 2 tells the query to filter out the views that were only loaded once or twice.
    9. | summarize is the group-by operator equivalent. In summarize you can use aggregates like count(), max(), and avg(), followed by the keyword “by”, which specifies the columns to group on.
    10. | render barchart makes the output graphical; omitting it produces a table.
    11. Rendering the query then gives you a bar chart ranking your most heavily used views.
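If it helps to reason about what the query does, the same aggregation can be expressed in a few lines of Python over a list of event names (the view names here are made up for illustration):

```python
from collections import Counter

# Hypothetical customEvents 'name' values, as logged by the Azure Function above
events = ["4 - Active Accounts", "4 - Active Accounts", "4 - Active Accounts",
          "3 - Open Opportunities", "3 - Open Opportunities", "3 - Open Opportunities",
          "112 - My Cases"]

# summarize count() by name | order by count_ desc | where count_ > 2
counts = Counter(events)
top_views = [(name, n) for name, n in counts.most_common() if n > 2]
print(top_views)
# [('4 - Active Accounts', 3), ('3 - Open Opportunities', 3)]
```

The single-hit view drops out, just as | where count_ > 2 drops it in the Analytics query.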


Dynamics CRM in the Field

Displaying Related Entity Data Using Calculated Fields


There are times when you need to display data from a related entity on a form without the overhead of replicating the information across entities. Quick View forms are a good way to do this if you are pulling multiple fields from the entity and displaying them in one column. To learn more on Quick View Forms, check out our previous post: Exploring the Quick View Forms in CRM 2013. If you need the flexibility to show fields in multiple columns or if you only need one field, calculated fields may be the way to go.

In this blog, we will show you how to use calculated fields to display discrete data from a related entity. To do this we will use the following scenario. Let’s say I need to display a Company’s Credit Limit on an Opportunity record. We can do this by adding a calculated field to the Opportunity Entity and then adding that to the Opportunity form. Let’s see it in action.

Step 1 – Create the Calculated field

1. Navigate to Settings > Customizations > Customize the System.

2. Expand the Opportunity entity and select Fields.

3. Click the New button.


4. Add a Display Name and Description for the new field.

5. Select Calculated as the Field Type.


Note: We are using the Currency Data Type for this field to ensure that the value is formatted correctly when we add it to the form.

6. Select the Edit button to open the Field Definition Editor.


7. Next, we will want to add an Action by clicking the Add Action button.


8. Click into the edit control and select parentaccountid from the list.


9. Type a “.” after parentaccountid and select Credit Limit from the dropdown list.

Note: Adding the “.” after any lookup field will allow you to use related entity fields in your action step.


10. Select the Check button to save your action.


11. Click the Save and Close button on the Field Definition Editor to commit your changes.


12. Select Save and Close on the New Field form to update the system schema.


Step 2 – Add the Calculated field to the Opportunity Form

1. Open the Opportunity Form where you would like to display Credit Limit and navigate to the appropriate section.

2. Select the Credit Limit field from the Field Explorer and add it to the form.


3. Save and Publish your customizations.

4. Once you refresh your browser you will see the Credit Limit displayed on your Opportunity form.


For more info and additional uses of calculated fields, visit the Calculated Fields section of The CRM Book!

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

How to Reduce Dynamics 365 Storage Costs Using Attachment Manager


For Microsoft Dynamics 365 customers concerned about the cost of cloud storage, Azure Blob Storage is an increasingly popular solution.

Default storage capacity for Dynamics 365 is 10GB with extra capacity accruing at 5GB per 20 users. This can often prove insufficient if organizations have large volumes of attachments.

Email and note attachments often consume the most storage space, so it is more cost-effective to store these in Azure Blob storage rather than in the D365 tenant.

Using the Attachment Management add-on app, attachments are automatically stored in Block Blob storage through Dynamics 365.

The immediate benefit is that Azure Storage costs are significantly cheaper compared to purchasing more D365 storage capacity.

Added flexibility through pay-as-you-go pricing means that organizations get a competitive price for their cloud data storage, and only pay for what they need.

Attachment Manager enables users to retrieve files on-demand with a single click. Further options include multiple file uploads and attachment download previews.

Built on Dynamics 365, Attachment Manager works with Dynamics CRM 2016 and above, enabling admins to optimize data storage and save money.

The Microsoft Attachment Management app is freely available on AppSource.

Azure Blob storage costs are split into two components:

  • Recurring storage fees apply each month, with tiered rates priced per GB/month
  • Transaction fees apply every time you add a new file or access existing files
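As a rough sketch of how those two components add up (the rates below are made-up placeholders, not current Azure prices; check the Azure pricing page for real figures):

```python
def monthly_blob_cost(stored_gb, transactions,
                      per_gb_rate=0.02, per_10k_transaction_rate=0.004):
    """Estimate one month's Blob storage bill: recurring storage fee plus
    transaction fees. The rates are illustrative placeholders only."""
    storage_fee = stored_gb * per_gb_rate
    transaction_fee = (transactions / 10_000) * per_10k_transaction_rate
    return round(storage_fee + transaction_fee, 2)

# e.g. 500 GB of attachments and 200,000 file operations in a month
print(monthly_blob_cost(500, 200_000))
# 10.08
```

Even with placeholder rates, the shape of the bill is clear: the recurring per-GB fee dominates, while transaction fees stay small unless file access is very heavy.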

Preact provides technical consultancy to install and configure the Attachment Manager including moving across legacy attachments.

Contact us to find out more about the Attachment Management app and how this can be easily used to minimize D365 cloud costs.

About Preact

Preact is a Microsoft Gold Partner implementing and supporting CRM solutions since 1993.

Check our blog for more information about Microsoft Dynamics 365 and CRM.


CRM Software Blog | Dynamics 365

You implemented a new CRM system. Great! But why is no one using it?


You implemented a new CRM system. Great! The hardest part is done now, right? Well . . .  not exactly. Getting your employees to actually use your new system can often be a major hurdle.

According to Principal Analyst Bill Band of Forrester, the biggest threat to a successful CRM implementation is slow user adoption. Band also states that not only is early and pervasive user acceptance one of the biggest indicators of success, it is a MANDATORY precursor to success. But before we dig into this, it’s important to define exactly what we mean by user adoption.


User adoption is not simply the use of a CRM system. True user adoption occurs only when users want to use the CRM system. Users must be motivated and have a genuine desire to achieve key benefits that the system can bring. You can’t just force a new technology system on your staff. That strategy will never last long term and trying to push it through will be very painful for you and your entire staff.

Our CRM consultants have also found that having a power user, someone who is tech savvy and a team player, can help encourage others to use the new system. This can be helpful as users are often more likely to listen to their colleague than they are to an outside consultant. Having said that, there are five areas of focus that can make or break the user adoption of your CRM system:


1. Executive Sponsorship

Getting buy-in from the executive level can make a world of difference. Leaders can articulate the strategy, goals and importance of the CRM system, making it easier for employees to understand their role in rolling out the project.

2. Business Processes

As you are preparing to roll out your new CRM platform, it is the perfect time to review your business processes. A system like Microsoft Dynamics CRM is highly configurable and has the flexibility to adapt to whatever business processes you identify. This also allows the CRM to grow with your organization, as your CRM partner can help you reconfigure the system if your business processes change over time.

3. Communication

This seems like an obvious one but it can’t be overlooked. We encourage our clients to develop a communications plan to keep the executive team updated on the status of the CRM project, including “wins” along the way. Communication to the end users is also important so your staff is not left guessing as to what is happening and when.


CRM Software Blog | Dynamics 365

Authentication to Dynamics 365 using Azure Apps


There are many scenarios where you might need to make a connection to Microsoft Dynamics 365 from an outside source, whether it be a single-page application, a mobile application, or some other service. In almost all these situations, authenticating to Dynamics 365 can be a bit challenging. That is, until now!

Let’s face it, nobody wants to deal with messy SOAP handshake protocols or developing/hosting/managing middleware just to establish a connection to Dynamics 365 from your latest mobile app or through new code running in some website backend. With the help of Azure Apps, we can drastically simplify the development of applications that need a connection to Dynamics 365 (or any other Microsoft Services).

You can perform all of the following steps with a trial Dynamics 365 account from Microsoft (did you know that you can access Azure AD without putting in your credit card info?). Below are some high-level steps to set up an app in Azure, get a token using that info from C# code, and use the token from simple JS code to access Dynamics 365.

Setting Up an App in Azure

1. Navigate to Azure Active Directory > App registrations > Click + New application registration


2. Now fill in the required fields and hit Create. Note that the sign-on URL only matters for something like a single-page application – otherwise putting a localhost URL is fine.


3. Awesome! Now you have successfully created an Azure app. Double click the app and you will see its details as below.


4. Our next step is to give the app permission to access Dynamics 365. Navigate to Required permissions and click +Add, then select Dynamics 365 Online API.


5. Make sure to check the Delegated Permissions checkboxes as shown below. Also, click the Grant Permissions button for the changes to take effect.


6. We have one more thing to set up! Navigate to Keys, create a new key, and copy the value right away – be sure to save it somewhere, as it will be hidden in the future.


7. Navigate to Endpoints in the App registrations list view and copy the OAUTH 2.0 AUTHORIZATION ENDPOINT URL that you will need in the code later.


Setting Up the Application user in CRM

Now that our Azure app setup is complete, we’ll move on to creating an application user.

Note: Once you create the application user, make sure to give it a custom Security role that has the access you want this user to have.

1. Navigate to your Dynamics 365 org > Settings > Security. Change the view to Application Users and then click +New.

2. This will show you a new user form (make sure to change the form to “APPLICATION USER”) where you will only need to fill in the “Application ID” field. Use the application ID from the Azure app you created in the previous steps.

Sample C# code for testing above configurations

Now let’s look at the C# code that uses this app to retrieve an Authentication token. We are using the sample code that you can get here.

1. You will need the client id, secret key value, resource URL, and OAuth 2.0 authorization endpoint described in the previous steps to successfully get the token from the C# code below.


2. We have shown the token in Visual Studio’s Immediate window, but this token string is what your C# app will return. For the demo we used a console app, but it could be hosted in something like an Azure Function so that it can be called from anywhere, making it easy to retrieve a Dynamics 365 authentication token.
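For reference, the shape of that token request can be sketched outside C# as well. The Python snippet below only builds the form-encoded body of the OAuth 2.0 client-credentials call; the endpoint, application id, key value, and resource URL are placeholders for the values collected in the Azure steps above, and the actual POST is left to an HTTP library:

```python
from urllib.parse import urlencode

def token_request_body(client_id, client_secret, resource):
    """Form-encoded body for an OAuth 2.0 client-credentials token request."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # e.g. your Dynamics 365 org URL
    })

# Placeholder values -- substitute the app registration details from above
body = token_request_body("<application-id>", "<key-value>",
                          "https://contoso.crm.dynamics.com")
print("grant_type=client_credentials" in body)
# True
```

POSTing that body to the OAuth 2.0 authorization endpoint copied earlier returns a JSON payload whose access_token field is the bearer token used in the JS sample that follows.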

Sample JS Code for Connecting to Dynamics 365 using the Token

Now that we have the token, let’s use it in some simple JS code. We generated a REST call snippet using CRM Rest Builder and ran the JS code inside JSFiddle, successfully sending the WhoAmI request to the Dynamics 365 org and retrieving the values in the alert boxes.


We hope this method saves CRM developers some time and effort when creating custom third-party applications that need to talk to Dynamics 365.

For more helpful Dynamics 365 tips and tricks, be sure to subscribe to our blog!

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Using ConvexHullMesh inside NMinimize


I’m currently trying to find a set of points (in two dimensions) such that the arc length of their convex hull will minimize some weird function:

f = Distanz[xarray, ReplacePart[yarray, i -> yarray[[i]] + s]] - Distanz[xarray, yarray] - d

With Distanz being

Distanz[xvalues_, yvalues_] := ArcLength[RegionBoundary[ConvexHullMesh[{xvalues, yvalues} // Transpose]]]

When trying to minimize the function f using NMinimize, I never achieve good results. I read that ConvexHullMesh only handles arguments that are real values, so I wondered whether there are some issues when using it inside NMinimize, or whether there are any subtleties one has to take care of in order for it to work out right.


Recent Questions – Mathematica Stack Exchange

Using Virtual Entities with Dynamics 365


Virtual entities allow Microsoft Dynamics 365 users to utilize record information from external data sources and view the information in fields, search results, and even FetchXML-based reports within Dynamics 365.

A system customizer can configure a data source record to create virtual entities without writing code. However, developers can still implement plug-ins to read data from other sources using the Plug-in Registration tool and the Dynamics 365 SDK.

Defining a virtual entity is much like defining a custom entity. With a virtual entity, however, you must attach it to a data provider to achieve data retrieval. Specifically, the entity and its fields must be mapped to equivalent data in the external data source, and the data provider must supply a GUID associated with the external record that represents each entity instance.


Image from Microsoft

Initial Considerations of Virtual Entities:

  • Data is read-only.
  • Only organization-owned entities are supported.
  • Virtual entities cannot be enabled for queues.
  • Field-level security is not supported.
  • A virtual entity cannot represent an activity and does not support business process flows.

For more information on Virtual Entities within Dynamics check out this blog: Dynamics 365 July 2017 Update: Virtual Entities.

Stay tuned for more Dynamics 365 updates. Subscribe to our blog to stay up to date!


PowerObjects- Bringing Focus to Dynamics CRM

How to segment a textured image using Gabor Filters?

I am following a MATLAB implementation of texture segmentation using Gabor filters. My test image (img) is from Brodatz and is as follows:


Below is what I have done so far.

{nrows, ncols} = ImageDimensions[img];
wavelengthmin = 4/Sqrt[2];
wavelengthmax = Sqrt[nrows^2 + ncols^2];
n = Floor[Log2[wavelengthmax/wavelengthmin]];
wavelength = Table[{2, 2} 2^i, {i, 0, n - 2}];
deltaTheta = Pi/4;
orientation = Table[i, {i, 0, Pi - deltaTheta, deltaTheta}];
gabormag = 
  Table[GaborFilter[img, 1, wavelength[[i]], orientation[[j]]], {i, 1, 
    Length@wavelength}, {j, 1, Length@orientation}];
gaborWavelength = 
  Table[Norm[wavelength[[i]]], {i, 1, Length@wavelength}];
K = 3;
gabormagfiltered = gabormag;
Do[
  sigma = 0.5 gaborWavelength[[i]];
  gabormagfiltered[[i]] = GaussianFilter[gabormag[[i]], K sigma],
  {i, 1, Length@gabormag}];

After this, I am stuck on how to use these filtered images to get the final classification.

How can I proceed?


Recent Questions – Mathematica Stack Exchange