
Tag Archives: SSAS

Figure out which columns and measures are used in your SSAS Tabular model

October 22, 2019   Self-Service BI

I got an interesting question about how to figure out which measures and columns are being used across all tabular models on a server, whether by known reports or by self-service users in Power BI. To solve this I decided to use Power BI :).

To get the required information I decided to capture the queries being sent to AAS over a period of time and parse the results in Power BI Desktop. In this post I describe how to do it.

Start capturing data

First I started capturing the queries in AAS using Azure Log Analytics, as I described here. This can also be done using Profiler or XEvents, but I decided to be modern :).

In Log Analytics I can see all the queries that get sent:

For my analysis I don’t need all the columns and events, so let’s filter that down in Log Analytics to something like this:

AzureDiagnostics
| where OperationName == "QueryBegin"
| project TimeGenerated, TextData_s, DatabaseName_s
| where TimeGenerated > ago(30d)

You can change the time range to anything you want, up to whatever maximum range you have data for. You need to keep the TimeGenerated column, as we want to filter by it.

This will get us the data we can use in Power BI:

To get this data into Power BI, click Export and then “Export to Power BI” in the query screen.

Bringing it into Power BI

This downloads a .txt file with an M expression to your PC. You can then paste the M query into a blank Power BI query. Running it gets me the data I want:

From this I need to parse the columns and measures that are used over the time period I captured. This requires some tricks in Power Query (and it might not work for all scenarios). These are the steps I took (you can download the full M query here):

  • I got rid of all the diagnostic queries by filtering out SELECT queries.
  • The time column is not needed for the analysis, so I remove it; it was only needed in Log Analytics to filter on time.
  • Now here comes the main trick. I need to extract column, measure and table names from the query text. I do this by replacing special characters and keywords like ( , [ ] and EVALUATE with ~ and then splitting the column on that character (a minimal sketch of this step follows the list). Thanks to Imke for the idea :). That starts to get us somewhere:
  • Next I unpivot and clean up the data a bit.
  • I filter down to the rows that start with ‘ or [, which keeps all table, column and measure references.
  • To add more information I add a column that shows the type of each field: measure, table or column.
  • Finally I group the table and add a count.
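
To make this a bit more concrete, here is a minimal sketch of the replace/split/unpivot/filter idea (illustrative only: the sample query text, the ~ marker and the column count are placeholders of mine, and the downloadable M query handles more cases than this):

let
    // Illustrative stand-in for the Log Analytics export; the real query text lives in TextData_s
    Source = #table(
        {"TextData_s", "DatabaseName_s"},
        {{"EVALUATE SUMMARIZECOLUMNS('Date'[Year], ""Total"", [TotalSales])", "SalesModel"}}
    ),
    // Replace the delimiters around field references with a single marker character
    Normalized = Table.TransformColumns(
        Source,
        {{"TextData_s", each Text.Replace(Text.Replace(Text.Replace(Text.Replace(_, "(", "~"), ")", "~"), ",", "~"), " ", "~"), type text}}
    ),
    // Split the query text on the marker so every reference lands in its own column
    Split = Table.SplitColumn(Normalized, "TextData_s", Splitter.SplitTextByDelimiter("~"), 30),
    // Unpivot so each reference becomes its own row
    Unpivoted = Table.UnpivotOtherColumns(Split, {"DatabaseName_s"}, "Part", "Reference"),
    // Keep only table ('Table') and column/measure ([Field]) references
    Filtered = Table.SelectRows(
        Unpivoted,
        each Text.StartsWith([Reference], "'") or Text.StartsWith([Reference], "[")
    ),
    // Count how often each reference shows up
    Counted = Table.Group(Filtered, {"DatabaseName_s", "Reference"}, {{"Count", Table.RowCount, Int64.Type}})
in
    Counted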

This gets me the data I want:

With the data I can start doing some reporting:

Now you have an overview of which fields are used and how often they are referenced. You can of course extend this solution much further: for example, you could add the username to see who accessed what and how often, or you could compare the result with the schema you can get from other Discover requests to see which fields are never used, so you can clean them up.

You can download the full solution here as a PBIT. When you open it for the first time it asks for the URL; you can get it from the M query that you downloaded from Log Analytics.

Kasper On BI

Separate business names from technical names using SSAS and Power BI

October 22, 2018   Self-Service BI

I have heard this question come up now and then: someone wants to separate the table and column names seen by users in reports from the technical names used in SSAS. The main benefit of this is that name changes will not result in broken reports.

Turns out there is a neat trick that allows you to do this: add a translation to the model in the same language as the culture the model is set to. At the time of writing Power BI doesn’t support translations (it is planned though: https://docs.microsoft.com/en-us/business-applications-release-notes/october18/intelligence-platform/power-bi-service/dataset-metadata-translations), but when you add a translation for the default language (again, the same as the model) Power BI will pick it up automatically.

Let’s take a look. First you need SQL Server Analysis Services Tabular 2016 or later, or Azure Analysis Services, to have support for translations. Next I create a new model in SSDT and, to make sure, I check the culture by looking at the BIM file (using code view):

Now we need to add a translation for the same language. Unfortunately the tool doesn’t allow you to add the same language as the model (oops, I was the PM who added this feature...), but it is easy to change later. Close the BIM file, double-click the model to open it in the visual editor again, and select Manage Translations under Model in the top toolbar. Now add any language, select the language and press Export.

Now open the JSON file with your favorite editor:

Then change the name to the same string as the culture of your model and add the translations.
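
After the rename, the culture section ends up looking roughly like this (an illustrative sketch only: the table, column, measure and caption names are placeholders, and the exact layout of the exported file may differ slightly between SSDT versions):

{
  "name": "en-US",
  "translations": {
    "model": {
      "name": "Model",
      "tables": [
        {
          "name": "FactSales",
          "translatedCaption": "Sales",
          "columns": [
            { "name": "SalesAmount", "translatedCaption": "Sales Amount" }
          ],
          "measures": [
            { "name": "TotalSales", "translatedCaption": "Total Sales" }
          ]
        }
      ]
    }
  }
}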

Save the file and load it back in: go to Model, Translations, Load translation and select the JSON file again.

Now you can see the translation has been added:

And now when I connect to that model using Power BI I see the translated names:

If I now use them in a visual it shows the translated name:

If I now use translations again to rename the field to something else, the visual still works and shows the new name (after a refresh):

Nifty trick!

Kasper On BI

Released: SSAS and SSRS 2017+ Management Packs (7.0.10.0)

October 5, 2018   BI News and Info

We are happy to announce that Management Packs for SQL Server 2017 Analysis Services and Reporting Services are released.

Please download at:

Version Agnostic SSAS and SSRS MPs

As you might recall, we introduced version agnostic MPs starting with SQL Server 2017. We are now doing the same for SSAS and SSRS. We understand that with many SQL Server versions in market and with new server releases becoming more frequent, it is becoming harder to manage a separate MP for each server version. We are moving to version agnostic MPs to address this issue. This will be valid going forward. The new MPs will be named SSAS and SSRS 2017+. The ‘+’ in the name indicates that it will be used to monitor SQL Server AS and RS 2017 and the releases that come after that. Current in-market MPs (2008 through 2016) will not be changed and the 2017+ MP cannot be used to monitor older releases.

For more details, please refer to the user guides that can be downloaded along with the corresponding Management Packs.

Release Note

These management packs have been changed so that they now require the Microsoft SQL Server 2017+ Core Library version 7.0.7.0, which is delivered within these management packs as well as within the management pack for Microsoft SQL Server 2017+ version 7.0.7.0. If you have Core Library version 7.0.8.0 or higher installed in your test environment, it will not be possible to completely update the management packs to version 7.0.10.0, because SCOM cannot import a lower version of an already installed .MP or .MPB file. We recommend you entirely delete the CTP version of the management packs for SSRS 2017+ and SSAS 2017+, including the Core Library, and then install the current RTM version.

SSAS 2017+ MP New Features and Fixes

  • Replaced the Core Library in the delivery with the version 7.0.7.0, the version delivered with the most recent RTM version of the management pack for SQL Server 2017+.
  • Improved display of the SSAS instance version (now shows the Patch Level version instead of the Version).
  • Added missing dependency monitors required to roll up the instance health appropriately.
  • Fixed alert for “Partition Storage Free Space” monitor.
  • Updated Summary dashboards.
  • Updated display strings.

SSRS 2017+ MP New Features and Fixes

  • Replaced the Core Library in the delivery with the version 7.0.7.0, the version delivered with the most recent RTM version of the management pack for SQL Server 2017+.
  • Updated the monitoring of Memory Consumption and CPU Usage in order to collect performance data for all subprocesses in addition to the main SSRS service process.
  • Updated display strings.
  • Updated Summary dashboards.
  • Fixed minor issues found in the CTP version.

SQL Server Release Services

Released: Public Preview for SSAS and SSRS 2017+ Management Packs (7.0.8.0)

July 7, 2018   BI News and Info

We are getting ready to release brand new management packs for SQL Server Analysis Services and SQL Server Reporting Services 2017. Please install and use this public preview and send us your feedback (sqlmpsfeedback@microsoft.com)! We appreciate the time and effort you spend on these previews which make the final product so much better.

Please download at:

Microsoft System Center Management Packs (Community Technology Preview) for SQL Server 2017+ Reporting Services and Analysis Services

Version Agnostic SSAS and SSRS MPs

As you might recall, we introduced version agnostic MPs starting with SQL Server 2017. We are now doing the same for SSAS and SSRS. We understand that with many SQL Server versions in market and with new server releases becoming more frequent, it is becoming harder to manage a separate MP for each server version. We are moving to version agnostic MPs to address this issue. This will be valid going forward. The new MPs will be named SSAS and SSRS 2017+. The ‘+’ in the name indicates that it will be used to monitor SQL Server AS and RS 2017 and the releases that come after that. Current in-market MPs (2008 through 2016) will not be changed and the 2017+ MP cannot be used to monitor older releases.

For more details, please refer to the user guides that can be downloaded along with the corresponding Management Packs.
We are looking forward to hearing your feedback at sqlmpsfeedback@microsoft.com.

SQL Server Release Services

How to turn on SSAS profiler traces in Power BI desktop

December 7, 2017   Self-Service BI

Just a quick post today. Power BI Desktop has a feature to turn on SSAS profiler traces, which allows you to see all the queries (DAX, and SQL in DirectQuery mode) generated by your reports. I couldn’t find any documentation on how to turn it on, so let me take you through the steps here.

I start with a very simple PBI desktop file that points to a SQL Azure Database in Direct Query mode:

To see which queries are being run, I turn on tracing by going to File -> Options & Settings -> Options and enabling tracing:

Now I press OK and click Refresh on the report to make sure a query gets sent to the database. A trace file has now been generated. I can go to “C:\Users\\Microsoft\Power BI Desktop Store App\AnalysisServicesWorkspaces\AnalysisServicesWorkspace1459206073\Data” and find the “FlightRecorderCurrent.trc” file there. This contains all the information.

Note: the “AnalysisServicesWorkspace1459206073” part will be different for you, as this is unique to my workbook.

Note 2: I am using the Power BI Desktop from the Windows Store. If you use the downloaded version you can find it at “C:\Users\\AppData\Local\Microsoft\Power BI Desktop\”.

Copy the TRC file to a different folder and open it with SQL Server Profiler (if you don’t have it installed, it comes with SQL Server Management Studio).

This will show you a very detailed list of queries and operations that have been done with the data model:

The most interesting events are Query Begin, DirectQuery Begin and End, which show you the actual queries that are being generated; they also contain other data like the query duration. That allows you, for example, to find the query that took the longest to return.

Power BI gets this profiling functionality for free since it is based on the SSAS engine. You can find more information in the SSAS documentation: https://docs.microsoft.com/en-us/sql/analysis-services/instances/monitor-an-analysis-services-instance

Hope this helps for any potential debugging.

Kasper On BI

Change your Power BI report to point to an external SSAS model

October 19, 2017   Self-Service BI

A common question I get is how to change a report’s connection to point to SSAS after moving a Power BI Desktop model into SSAS. It turns out this is actually pretty simple, as there is an API that allows you to copy a report and then bind it to ANY dataset. Let’s walk through this.

I start with a very simple Power BI desktop file that contains a single table that I imported from SQL:

I then upload it to Power BI and add it to its own app workspace.

Then I create a report that points to my local SSAS and upload it to the same workspace; I do this to make sure I have a dataset in Power BI that points to SSAS. If you already have such a dataset you can skip this step. Of course, you have to set up a data gateway when using a local SSAS.

So now I have 2 datasets in my workspace (I use the same workspace but you can also have them live in different ones):

SalesIt is the dataset that points to my SSAS instance and SalesPBI is the embedded data model. I also have the two reports:

Now here comes the magic. I am using a PowerShell script created by my colleague Sirui that allows you to copy a report and bind the new report to ANY dataset in Power BI. You can find the script here. The key thing is to make sure the two schemas are the same: they need to have the same columns, measures, tables, etc., otherwise the report will throw errors. In my example I didn’t actually use an imported model but a SQL Server 2016 RTM model with the same schema, and that also works.

Ok, now for the PowerShell script. It does two things:

1. It creates a copy of the report.

2. It binds the new report to a dataset provided in the script.

The script uses the Power BI Clone API to clone the report and rebind it.

First we need to configure the script. I created a new “app” on https://dev.powerbi.com/apps as described in the script to get a client ID, and set the name for my new report to “SalesNowIt”. Next I got all the IDs needed for the script to run, like the report and group IDs. The script has step-by-step instructions.
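
For context, here is a minimal sketch of the kind of REST call the script relies on (this is not Sirui’s actual script; $accessToken, $groupId, $reportId and $targetDatasetId are placeholders you supply yourself, for example from the IDs gathered above):

# Clone the report and bind the clone to another dataset in one call
$body = @{
    name          = "SalesNowIt"        # name of the new report
    targetModelId = $targetDatasetId    # dataset the clone should point to
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/$groupId/reports/$reportId/Clone" `
    -Headers @{ Authorization = "Bearer $accessToken" } `
    -ContentType "application/json" `
    -Body $body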

After configuring it I just run the PowerShell script (no additional changes necessary), and a new report shows up:

And now when I run the report that previously pointed to my embedded model it still works:

But when running profiler I see queries going to my local SSAS instead of the embedded model:

So that’s it, pretty straightforward. Of course, you can extend this PowerShell script yourself to do whatever you want; for example, loop through all the reports in a workspace and rebind them all.

Kasper On BI

Currency conversion in DAX for Power BI and SSAS

September 30, 2017   Self-Service BI

Another interesting question that came up recently is how to do currency conversion in Power BI. My colleague Jeffrey Wang had a very neat solution that I want to share more broadly.

Imagine you have sales in 3 different currencies happening on different dates:

Now when you are doing reporting you want to report them all in a single currency. So the first thing we need is a table with conversion rates for each date and each currency:

Next we need to model this so in the report we can choose which currency we want to report on and then automatically apply the right conversion rate. The model looks like this:

Both the Sales and the Exchange rate table are fact tables so we need to create 3 dimension tables to tie them together. I use calculated tables to create them as the data is already there:

ExchangeDate = Distinct(FactExchangeRate[Date])

ReportCurrency = Distinct(FactExchangeRate[ReportCurrency])

TransactionCurrency = Distinct(FactExchangeRate[TransactionCurrency])

The ReportCurrency table will be used to create the slicer in the report that allows us to select the currency. The other two tables will allow us to get the correct exchange rate for each sales transaction using the date and currency. This means that for each row in the FactSales table we filter down to 3 rows in the FactExchangeRate table, based on the date and currency relationships. The ReportCurrency slicer then filters that down to a single row, leaving us with the exchange rate we need. The catch is that we need to do this on the fly, because on any given day we might have transactions in multiple currencies, and we need to take that into account.

We can do that using DAX. I start with getting the factor to calculate the exchange rate:

TrnExchangeRate = Min(FactExchangeRate[Factor])

This will get the lowest factor value from the fact table.

Now to calculate the sales per transaction I will use SUMX to achieve this:

TotalSales = IF(HASONEVALUE(ReportCurrency[ReportCurrency]),
    SUMX(FactSales, [Sales] * [TrnExchangeRate]))

This DAX expression iterates (SUMX) over each row in the FactSales table that is in the current selection. For each row it multiplies the value of the sales column by the exchange rate returned by the TrnExchangeRate measure. SUMX is key to getting the right numbers: it iterates over the rows of the fact table dynamically and sums the results into a single number. This matters because each row might be in a different currency on a different date, and the relationships make sure only a single value is returned from FactExchangeRate.

Putting them all together it allows me to build a report like this:

As you can see, the 100 Euros get translated to 117.44 USD while the USD amounts stay the same. So a clever combination of the model and some DAX will get you currency conversion. Also, the grand total shows a single USD sales number. Unfortunately there is no dynamic formatting in Power BI yet to apply the currency format on the fly as well.

You can download the sample file here.

Kasper On BI

Using CUSTOMDATA and SSAS with Power BI Embedded

September 24, 2017   Self-Service BI

Another Power BI Embedded blog post while I have a playground up and running. Often when you want to use SSAS with Power BI Embedded you have an “authentication issue”. Power BI and SSAS both leverage AD, which means that any user you pass from Power BI to SSAS needs to be known in AD. That is great in enterprise scenarios, where this usually is the case (and even required), but for ISVs and any custom application this is a bigger problem. When embedding Power BI in a custom app you generally want to use the custom app’s authentication, not AD. Traditionally SSAS allows you to solve this using CUSTOMDATA and ROLES on the connection string: https://docs.microsoft.com/en-us/sql/analysis-services/instances/connection-string-properties-analysis-services. With Power BI you cannot access the connection string, so it is a bit harder. But there is a way. Let’s solve this issue with Power BI.

First, of course, I have an SSAS model; for convenience’s sake I added a simple measure CustomData = CUSTOMDATA() that will allow us to see what has been passed in.

I start with a Power BI Desktop file that points to the SSAS model and create a very simple report. You can see the CustomData measure doesn’t return anything, as it hasn’t been set on the connection string:

Next I publish the report to Power BI and set up a data gateway. Here I am using an AD user that is an admin on SSAS as the username and password (usually a service account):

In the Power BI service the report looks identical:

Now by default Power BI sets the EffectiveUserName property on the connection string, as we can see in the SQL Server Profiler output below for the Discover event (pro tip: always have Profiler up and running when testing like this, it is a real treasure trove):

(Trace output trimmed: the Discover property list includes the SalesModel database, the application name PowerBI and EffectiveUserName = DOMAIN\USERNAME.)

Now here comes the interesting part. In the gateway configuration we can actually configure Power BI to also send the CustomData property. This is hidden away in the Map user names pane:

As soon as you check this you will see two things. One, the report now shows my email address as the result of the CUSTOMDATA function:

Two, in SQL Profiler you will also see CustomData showing up:

(Trace output trimmed: the property list now also includes CustomData = username@domain.com alongside EffectiveUserName = DOMAIN\USERNAME.)

Part 1 completed. Now let’s start the Power BI embedding part.

Again I used the PowerBIEmbedded_AppOwnsData sample to get up and running. By default, when you run it, it acts the same way as Power BI: the service-account credentials are the ones passed to SSAS. As explained in this Power BI help document you can change the sample to pass in any username you want. Here is the key: because we configured the gateway to use CustomData instead of EffectiveUserName, it will replace the CustomData value with whatever you specify. So in my sample project (HomeController.cs) I replaced:

var generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view");
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(GroupId, report.Id, generateTokenRequestParameters);

with

var generateTokenRequestParameters = new GenerateTokenRequest("View", null, identities: new List { new EffectiveIdentity(username: "username", roles: new List { "roleA" }, datasets: new List { report.DatasetId.ToString() }) });
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(GroupId, report.Id, generateTokenRequestParameters);

Now things get interesting on the SSAS side. We see something we might not expect as we are connecting to SSAS as administrator:

Either the user, ‘DOMAIN\USERNAME’, does not have access to the 'Sales' database, or the database does not exist.

The reason we are getting this is because we also specified the role. Going back to the connection string document in MSDN we can read the following on Roles:

Specify a comma-delimited list of predefined roles to connect to a server or database using permissions conveyed by that role. If this property is omitted, all roles are used, and the effective permissions are the combination of all roles. Setting the property to an empty value (for example, Roles=’ ‘) means the client connection has no role membership. An administrator using this property connects using the permissions conveyed by the role. Some commands might fail if the role does not provide sufficient permission.

So this means that on this connection the account is no longer an admin and needs a role to connect through. So let’s add one: I go into SSMS and add a role named “roleA” (as defined in the code above) with Read permissions.

Next you have to make sure you add the service account as a member, and lastly you can add an RLS expression.

In my case I added something hard coded:

=DimProduct[ColorName] = IF(CUSTOMDATA() = "username", "Silver", BLANK())

but of course you can follow the complete RLS pattern here and just replace USERNAME() with CUSTOMDATA().
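
As an illustration of that pattern (a sketch only; the UserRegion mapping table and its columns are hypothetical and not part of this post’s model), a dynamic row filter keyed on CUSTOMDATA() could look like this:

=DimProduct[ColorName]
    = LOOKUPVALUE(
        UserRegion[Color],
        UserRegion[AppUser], CUSTOMDATA(),
        UserRegion[Color], DimProduct[ColorName]
    )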

Running the report shows what we wanted to see: it only shows Silver colors and returns “username” from the CUSTOMDATA function:

Looking at the profiler trace we can again see it passes in the right CUSTOMDATA field:

(Trace output trimmed: the property list now includes Roles = roleA, EffectiveUserName = DOMAIN\USERNAME, the application name PowerBI and CustomData = username.)

That’s it, folks! This opens up a lot of interesting scenarios for connecting to AS from Power BI. Unfortunately this doesn’t work for Azure AS, as you cannot use the gateway to connect to it.

Kasper On BI

Use Always Encrypted data with SSAS and Power BI

May 15, 2017   Self-Service BI

I got an interesting question today about using Always Encrypted columns with SSAS and Power BI, and it turns out this just works. Let’s take a look at how to set it up.

The way Always Encrypted works is that a SQL client driver (provider) transparently decrypts the data after reading it from the column. The data stored in the encrypted column never appears in plaintext inside SQL Server or on the wire. To be able to decrypt and read the data, the client connecting to SQL Server needs access to a column master key, such as a certificate installed on the client machine, which is used to decrypt a column encryption key, which in turn allows the provider to decrypt the data locally. For more details read this article: https://aka.ms/AlwaysEncrypted.

To start I need to create some encrypted data. I created a database with a single table and encrypted two columns using the Always Encrypted wizard:

That takes care of generating the keys, including the certificate I will be using as the column master key, and encrypting the data. Of course this is not an in-depth post on how you should set up Always Encrypted; I would recommend reading the best practices and considerations in Books Online. One side note: you can only use certificates here, as the Azure Key Vault option doesn’t work with SSAS.

Next I created an SSAS model in SSDT in DirectQuery mode (as described here) pointing to this table in SQL Server, following the general steps. When I connect Power BI Desktop to my model I get what I expected: encrypted data:

Now here comes the trick. In this case Analysis Services is our SQL client: it receives DAX queries from the client tools, translates them into SQL queries and sends them to SQL Server on behalf of the BI clients (Excel / Power BI Desktop / Power BI service). This means we can use SSAS to do the decryption. To configure this I need to do three things (an illustrative connection string follows the list):

  1. Install the client certificate I am using as the column master key on my AS Server. In my case the AS Server is running on the machine where I generated the certificate by running the Always Encrypted wizard, so I don’t have to do anything. If your AS Server is running on a different machine than the one you generated the certificate on, you will need to install the certificate on the AS Server machine and make sure the AS Server has permission to access it. More on this here.
  2. The second thing I need to do is switch the provider in SSAS from the default, the SQL Server Native Client, to the .NET Framework provider. Note: .NET Framework 4.6.1 or later needs to be present on the AS Server machine.
  3. The last thing I need to do is change the provider setting “Column Encryption Setting” to “Enabled”.
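
For reference, a data source connection string for the .NET Framework provider with that setting applied would look roughly like this (server and database names are placeholders, not from this post):

Data Source=MySqlServer;Initial Catalog=MySecureDb;Integrated Security=SSPI;Column Encryption Setting=Enabled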

That’s it. I now just deploy the changes and refresh my Power BI desktop file:

Now instead of encrypted data I see the plaintext values. And just to show you, this also works when I publish my report to Power BI and use the gateway:

The reason for this, and I want to make sure this is clear, is that the data is decrypted at the AS Server and then transported in plain text to the clients connecting to AS. In this case the data is still encrypted between SQL Server and AS, but it is in plain text between AS and the gateway and then on to Power BI; of course, the data still goes over encrypted network communication (thanks Greg for pointing this out).

Just a note: this works for both DirectQuery and import mode, but importing the data kind of defeats the purpose, as the data is then stored unencrypted in SSAS. It also works for multidimensional models, where you can follow the same steps as above to set the provider settings to support Always Encrypted.

Now if you want to use other SQL Server security features like data masking or RLS, you need to make sure you pass the username of the user connecting to SSAS on to SQL Server; then the correct security features will be in effect. This is possible as I described here: https://www.kasperonbi.com/use-sql-server-rls-with-ssas-and-power-bi/

Kasper On BI