Tag Archives: SSAS
Figure out which columns and measures are used in your SSAS Tabular model
I got an interesting question about how to figure out which measures and columns are being used in all the tabular models on a server, either by known reports or by self-service users using Power BI. To solve this I decided to use Power BI :).
To get the required information I decided to capture the queries being sent to AAS over a period of time and parse the results in Power BI Desktop. In this post I describe how to do it.
Start capturing data
First I started capturing the queries in AAS using Azure Log Analytics, as I described here. This can also be done using Profiler or XEvents, but I decided to be modern :).
In Log Analytics I can see all the queries that get sent:

For my analysis I don't need all the columns and events, so let's filter that down in Log Analytics to something like this:
AzureDiagnostics
| where OperationName == "QueryBegin"
| project TimeGenerated, TextData_s, DatabaseName_s
| where TimeGenerated > ago(30d)
You can change the time range to anything you want, up to whatever maximum range you have data for. You need to keep the TimeGenerated column, as the where clause filters by it.
This will get us the data we can use in Power BI:

To get this data into Power BI, click Export > "Export to Power BI" in the query log screen.

Bringing it into Power BI
This downloads a .txt file with the M expression to your PC. You can then paste the M query into a blank Power BI query. Running it gets me the data I want:

From this I need to parse the columns and measures that are used over the time period I captured. This requires some tricks in Power Query (and it might not work for all scenarios). These are the steps I took; a sketch of the key steps in M follows the list (you can download the full M query here):
- I got rid of all the diagnostic queries by filtering out SELECT.
- The time column is not needed for the analysis, so I remove it; it was only needed in Log Analytics to filter on time.
- Now here comes the main trick. I need to extract column, measure and table names from the query. I do this by replacing many special characters like ( , [ ] and EVALUATE with ~ and then splitting the column. Thanks to Imke for the idea :). That starts to get us somewhere:

- Next I unpivot the columns and clean up the data a bit.
- I filter to only keep rows that start with ' or [, which keeps all column, table and measure references.
- To add more information I add a column that shows the type of each field: measure, table or column.
- Finally I group the table and add a count.
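Putting those steps together, here is a simplified sketch in M of what the full downloadable query does. One hard-coded sample query stands in for the Log Analytics source, and the delimiter list and the 20-column split are trimmed down here, so you may need to tune them:

let
    // One hard-coded sample query standing in for the Log Analytics source
    Source = Table.FromRecords({
        [TextData_s = "EVALUATE SUMMARIZECOLUMNS('Date'[Year], ""Sales"", [TotalSales])"]
    }),
    // Replace the delimiters around identifiers with a single marker (~)
    Delimiters = {"(", ")", ",", "EVALUATE"},
    Marked = Table.TransformColumns(Source, {{"TextData_s",
        each List.Accumulate(Delimiters, _, (state, d) => Text.Replace(state, d, "~")), type text}}),
    // Split on the marker so every reference lands in its own column
    Split = Table.SplitColumn(Marked, "TextData_s",
        Splitter.SplitTextByDelimiter("~"),
        List.Transform({1..20}, each "Ref." & Text.From(_))),
    // Unpivot (null cells are dropped automatically) and trim the values
    Unpivoted = Table.UnpivotOtherColumns(Split, {}, "Attribute", "Value"),
    Trimmed = Table.TransformColumns(Unpivoted, {{"Value", Text.Trim, type text}}),
    // Keep only rows that start with ' (table) or [ (measure/column)
    Refs = Table.SelectRows(Trimmed, each Text.StartsWith([Value], "'") or Text.StartsWith([Value], "[")),
    // Classify the field type and count how often each reference occurs
    Typed = Table.AddColumn(Refs, "FieldType", each
        if Text.StartsWith([Value], "'") then "Table/Column" else "Measure/Column"),
    Grouped = Table.Group(Typed, {"Value", "FieldType"}, {{"Count", Table.RowCount, Int64.Type}})
in
    Grouped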
This gets me the data I want:

With the data I can start doing some reporting:

Now you have an overview of which fields are used and how often they are referenced. You can of course extend this solution much further: for example, you could add a username to see who accessed what and how often (see the sketch below), or you could compare the result against the schema you can get from other DISCOVER commands to see which fields are never used, so you can clean them up.
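For the username idea, the Log Analytics query could be extended like this; this assumes the engine's EffectiveUsername field is being captured (it surfaces as EffectiveUsername_s in AzureDiagnostics):

AzureDiagnostics
| where OperationName == "QueryBegin"
| project TimeGenerated, TextData_s, DatabaseName_s, EffectiveUsername_s
| where TimeGenerated > ago(30d)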
You can download the full solution here as a PBIT. When you open it for the first time it asks for the URL; you can get it from the M query that you downloaded from Log Analytics.
Separate business names from technical names using SSAS and Power BI
I have heard this question come up now and then: someone wants to separate the table and column names seen by users from the technical names used in SSAS and in the reports. The main benefit is that name changes will not result in broken reports.
It turns out there is a neat trick that allows you to do this: add a translation to the model in the same language as the culture the model is set to. At the time of writing Power BI doesn't support translations (it is planned though: https://docs.microsoft.com/en-us/business-applications-release-notes/october18/intelligence-platform/power-bi-service/dataset-metadata-translations), but when you add a translation for the default language (again, the same as the model's culture) Power BI will pick it up automatically.
Let's take a look. First you need SQL Server Analysis Services Tabular 2016 or later, or Azure Analysis Services, as those support translations. Next I create a new model in SSDT and, to make sure, I check the culture by looking at the BIM file (using code view):
Now we need to add a translation for that same language. Unfortunately the tool doesn't allow you to add the same language as the model (oops, I was the PM who added this feature..), but it is easy to change later. Close the BIM file, double-click the model again to open it in the visual editor, and select Manage Translations under Model in the top toolbar. Now add any language, select it and press Export.
Now open the JSON file with your favorite editor:
Change the name to the same string as the culture of your model and add the translations.
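For illustration, the relevant part of the file looks roughly like this after the edit; this is simplified, with hypothetical table and column names, and the real export carries more metadata. Here the culture name has been changed to match a model culture of en-US:

{
  "cultures": [
    {
      "name": "en-US",
      "translations": {
        "model": {
          "tables": [
            {
              "name": "DimCustomer",
              "translatedCaption": "Customer",
              "columns": [
                { "name": "cust_key", "translatedCaption": "Customer Key" }
              ]
            }
          ]
        }
      }
    }
  ]
}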
Save the file and load it back in: go to Model, Translations, Load translation and select the JSON file again.
Now you see the translation has been added:
And now when I connect to that model using Power BI, I see the translated names:
If I then use them in a visual, it shows the translated name:
If I later rename the field (via translations again) to something else, the visual still works and picks up the new name (after a refresh):
Nifty trick
Released: SSAS and SSRS 2017+ Management Packs (7.0.10.0)
We are happy to announce that the Management Packs for SQL Server 2017 Analysis Services and Reporting Services have been released.
Please download at:
Version Agnostic SSAS and SSRS MPs
As you might recall, we introduced version agnostic MPs starting with SQL Server 2017. We are now doing the same for SSAS and SSRS. We understand that with many SQL Server versions in market and with new server releases becoming more frequent, it is becoming harder to manage a separate MP for each server version. We are moving to version agnostic MPs to address this issue. This will be valid going forward. The new MPs will be named SSAS and SSRS 2017+. The ‘+’ in the name indicates that it will be used to monitor SQL Server AS and RS 2017 and the releases that come after that. Current in-market MPs (2008 through 2016) will not be changed and the 2017+ MP cannot be used to monitor older releases.
For more details, please refer to the user guides that can be downloaded along with the corresponding Management Packs.
Release Note
These management packs now require the Microsoft SQL Server 2017+ Core Library version 7.0.7.0, which is delivered with these management packs as well as with the version 7.0.7.0 management pack for Microsoft SQL Server 2017+. If you have Core Library version 7.0.8.0 or higher installed in your test environment, it will not be possible to completely update the management packs to version 7.0.10.0, because SCOM cannot import a lower version of an already installed .MP or .MPB file. We recommend you entirely delete the CTP version of the management packs for SSRS 2017+ and SSAS 2017+, including the Core Library, and then install the current RTM version.
SSAS 2017+ MP New Features and Fixes
- Replaced the Core Library in the delivery with version 7.0.7.0, the version delivered with the most recent RTM management pack for SQL Server 2017+.
- Improved the display of the SSAS instance version (it now shows the Patch Level version instead of the Version).
- Added missing dependency monitors required to roll up the instance health appropriately.
- Fixed alert for “Partition Storage Free Space” monitor.
- Updated Summary dashboards.
- Updated display strings.
SSRS 2017+ MP New Features and Fixes
- Replaced the Core Library in the delivery with version 7.0.7.0, the version delivered with the most recent RTM management pack for SQL Server 2017+.
- Updated the monitoring of Memory Consumption and CPU Usage in order to collect performance data for all subprocesses in addition to the main SSRS service process.
- Updated display strings.
- Updated Summary dashboards.
- Fixed minor issues found in the CTP version.
Released: Public Preview for SSAS and SSRS 2017+ Management Packs (7.0.8.0)
We are getting ready to release brand new management packs for SQL Server Analysis Services and SQL Server Reporting Services 2017. Please install and use this public preview and send us your feedback (sqlmpsfeedback@microsoft.com)! We appreciate the time and effort you spend on these previews, which make the final product so much better.
Please download at:
Version Agnostic SSAS and SSRS MPs
As you might recall, we introduced version agnostic MPs starting with SQL Server 2017. We are now doing the same for SSAS and SSRS. We understand that with many SQL Server versions in market and with new server releases becoming more frequent, it is becoming harder to manage a separate MP for each server version. We are moving to version agnostic MPs to address this issue. This will be valid going forward. The new MPs will be named SSAS and SSRS 2017+. The ‘+’ in the name indicates that it will be used to monitor SQL Server AS and RS 2017 and the releases that come after that. Current in-market MPs (2008 through 2016) will not be changed and the 2017+ MP cannot be used to monitor older releases.
For more details, please refer to the user guides that can be downloaded along with the corresponding Management Packs.
We are looking forward to hearing your feedback at sqlmpsfeedback@microsoft.com.
How to turn on SSAS profiler traces in Power BI desktop
Just a quick post today. Power BI desktop has a feature to turn on SSAS profiler traces, which allows you to see all the queries (DAX, and SQL in DirectQuery mode) generated by the reports. I couldn't find any documentation on how to turn it on, so let me take you through the steps here.
I start with a very simple PBI desktop file that points to an Azure SQL Database in DirectQuery mode:
To see which queries are being run, I turn on tracing by going to File -> Options & Settings -> Options and enabling tracing:
Now I press OK and click refresh on the report to make sure the query gets sent to the database. A trace file has now been generated. I can go to "C:\Users\
Note: the "AnalysisServicesWorkspace1459206073" part will be different for you, as this is unique to my file.
Note 2: I am using Power BI desktop from the Windows Store. If you use the downloaded version you can find it at "C:\Users\
Copy the TRC file to a different folder and open it with SQL Server Profiler (if you don’t have it installed, it comes with SQL Server Management Studio).
This will show you a very detailed list of queries and operations that have been done with the data model:
The most interesting events are Query Begin and DirectQuery Begin/End, which show you the actual queries being generated; they also contain other data, like the query duration. That allows you, for example, to find the query that took the longest to return.
Power BI gets this profiling functionality for free, as it is based on the SSAS engine. You can find more information in the SSAS documentation: https://docs.microsoft.com/en-us/sql/analysis-services/instances/monitor-an-analysis-services-instance
Hope this helps with any debugging you may need to do.
Change your Power BI report to point to an external SSAS model
A common question I get is how to change a report's connection to SSAS after moving the Power BI desktop file's model into SSAS. It turns out this is actually pretty simple, as there is an API that allows you to copy a report and then bind it to ANY dataset. Let's walk through this.
I start with a very simple Power BI desktop file that contains a single table that I imported from SQL:
I then upload it to Power BI, adding it to its own app workspace.
Then I create a report that points to my local SSAS and upload it to the same workspace; I do this to make sure I have a dataset in Power BI that points to SSAS. If you already have such a dataset you can skip this step. Of course, you have to set up a data gateway when using a local SSAS.
So now I have 2 datasets in my workspace (I use the same workspace but you can also have them live in different ones):
SalesIt is the dataset that points to my SSAS instance and SalesPBI is the embedded data model. I also have the two reports:
Now here comes the magic. I am using a PowerShell script created by my colleague Sirui that allows you to copy a report and bind the copy to ANY dataset in Power BI. You can find the script here. The key thing is to make sure the two schemas are the same: the target needs to have the same columns, measures, tables, etc., otherwise the report will throw errors. In my example the target wasn't actually an imported model but a SQL Server 2016 RTM model with the same schema, and that also works.
Ok, now for the PowerShell script; it does two things:
1. It creates a copy of the report.
2. It binds the new report to a dataset provided in the script.
The script uses the Power BI Clone API to clone the report and rebind it.
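For reference, here is a minimal sketch of what that call looks like against the REST endpoint; this is not Sirui's actual script, $token is an already-acquired AAD access token for the Power BI API, and all the IDs are placeholders you collect yourself:

$headers = @{ Authorization = "Bearer $token" }
$body = @{
    name              = "SalesNowIt"           # name for the cloned report
    targetModelId     = "<target-dataset-id>"  # dataset the clone should be bound to
    targetWorkspaceId = "<target-group-id>"    # optional; defaults to the source workspace
} | ConvertTo-Json
# Clone the report; the response contains the new report's id
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/reports/<source-report-id>/Clone" `
    -Headers $headers -ContentType "application/json" -Body $body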
First we need to configure the script. I created a new "app" on https://dev.powerbi.com/apps, as described in the script, to get a client ID, and set the name for my new report to "SalesNowIt". Next I got all the IDs needed for the script to run, like the report and group IDs. The script has step-by-step instructions.
After configuring it I just run the PowerShell script (no additional changes necessary) and see a new report showing up:
And now when I run the report that previously pointed to my embedded model it still works:
But when running profiler I see queries going to my local SSAS instead of the embedded model:
So that's it: pretty quick and very straightforward. Of course you can extend this PowerShell script yourself to do whatever you want; for example, loop through all the reports in a workspace and rebind them all.
Currency conversion in DAX for Power BI and SSAS
Another interesting question that came up recently is how to do currency conversion in Power BI. My colleague Jeffrey Wang had a very neat solution that I want to share more broadly.
Imagine you have sales in 3 different currencies happening on different dates:
Now when you are doing reporting, you want to report them all in a single currency. So the first thing we need is a table with the conversion rate for each date and each currency:
Next we need to model this so in the report we can choose which currency we want to report on and then automatically apply the right conversion rate. The model looks like this:
Both the Sales and the Exchange rate table are fact tables so we need to create 3 dimension tables to tie them together. I use calculated tables to create them as the data is already there:
ExchangeDate = DISTINCT ( FactExchangeRate[Date] )
ReportCurrency = DISTINCT ( FactExchangeRate[ReportCurrency] )
TransactionCurrency = DISTINCT ( FactExchangeRate[TransactionCurrency] )
The ReportCurrency table will be used to create the slicer in the report that allows us to select the currency. The other two tables allow us to get the correct exchange rate for each sales transaction using date and currency. This means that for each row in the FactSales table we filter down to 3 rows in the FactExchangeRate table based on the date and currency relationships. Then, based on the ReportCurrency slicer, we filter it down to a single row, leaving us with the exchange rate we need. The catch is that we need to do this on the fly, because on any given day there might be transactions in multiple currencies, and we need to take that into account.
We can do that using DAX. I start with a measure that gets the factor used to calculate the exchange rate:
TrnExchangeRate = MIN ( FactExchangeRate[Factor] )
This gets the lowest factor value from the fact table; once the relationships and the ReportCurrency slicer have filtered it down to a single row, that is simply the applicable rate.
Now to calculate the sales per transaction I use SUMX:
TotalSales =
IF (
    HASONEVALUE ( ReportCurrency[ReportCurrency] ),
    SUMX ( FactSales, [Sales] * [TrnExchangeRate] )
)
This DAX expression iterates (SUMX) over each row of the FactSales table that is in the current selection. For each row it multiplies the value of the Sales column by the exchange rate returned by the TrnExchangeRate measure. SUMX is key to getting the right numbers: it walks the rows of the fact table dynamically and sums the results into a single number (in the report currency). Again, this is key, as each row might be in a different currency on a different date, and the relationships make sure only a single value is returned from FactExchangeRate.
Putting them all together it allows me to build a report like this:
As you can see, the 100 euros get translated to 117.44 USD, while the USD amounts stay the same. So a clever combination of the model and some DAX gets you currency conversion. Note the grand total also shows a single USD sales number. Unfortunately there is no dynamic formatting in Power BI yet to apply the right currency format on the fly as well.
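As a stopgap you could have a measure return pre-formatted text, at the cost of the result no longer being a number you can chart; a rough sketch (the format strings are made up):

TotalSalesFormatted =
VAR SelectedCurrency = SELECTEDVALUE ( ReportCurrency[ReportCurrency] )
RETURN
    FORMAT (
        [TotalSales],
        SWITCH ( SelectedCurrency, "USD", "$#,0.00", "EUR", "€#,0.00", "#,0.00" )
    )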
You can download the sample file here.
Use Always Encrypted data with SSAS and Power BI
I got an interesting question today about using Always Encrypted columns with SSAS and Power BI, and it turns out this just works. Let's take a look at how we can set this up.
The way Always Encrypted works is that a SQL client driver (provider) transparently decrypts the data after reading it from the column. The data stored in the encrypted column never appears in plaintext inside SQL Server or on the wire. To be able to decrypt and read the data, the client connecting to SQL Server needs access to a column master key, such as a certificate installed on the client machine, which is used to decrypt a column encryption key, which in turn allows the provider to decrypt the data locally. For more, please read this article: https://aka.ms/AlwaysEncrypted.
To start I need some encrypted data, so I created a database with a single table and encrypted two of its columns using the Always Encrypted wizard:
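Under the covers, the wizard leaves the columns defined roughly like this (a sketch with hypothetical table and column names, not the exact script the wizard generates):

CREATE TABLE dbo.Patients (
    PatientID int IDENTITY PRIMARY KEY,
    -- deterministic encryption requires a BIN2 collation on string columns
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = CEK_Auto1,
            ENCRYPTION_TYPE = DETERMINISTIC,
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        ),
    City nvarchar(50) NULL
);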
That takes care of generating the keys, including the certificate I will be using as the column master key, and encrypting the data. Of course this is not an in-depth post on how you should set up Always Encrypted; I recommend reading the best practices and considerations in Books Online. One side note: you can only use certificates; the Azure Key Vault option doesn't work with SSAS.
Next I created an SSAS model in SSDT in DirectQuery mode (as described here) pointing to this table in SQL Server, following the usual steps. When I now connect Power BI desktop to my model I get what I expected: encrypted data:
Now here comes the trick. In this case Analysis Services is our SQL client: it receives DAX queries from the client tools, translates these into SQL queries, and sends them to SQL Server on behalf of the BI clients (Excel / Power BI desktop / Power BI service). This means we can use SSAS to do the decryption. To configure this I need to do three things (see the connection-string note after the list):
- Install the certificate I am using as the column master key on my AS server. In my case the AS server is running on the machine I generated the certificate on (by running the Always Encrypted wizard), so I don't have to do anything. If your AS server runs on a different machine than the one you generated the certificate on, you will need to install the certificate on the AS server machine and make sure the AS server has permission to access it. More on this here.
- Second, I need to switch the provider in SSAS from the default, the SQL Server Native Client, to the .NET Framework provider. Note: .NET Framework 4.6.1 or later needs to be present on the AS server machine.
- The last thing I need to do is change the provider setting "Column Encryption Setting" to "Enabled".
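These last two steps map to the documented .NET SqlClient connection-string keyword for Always Encrypted; a hypothetical connection string with the setting applied would look like this:

Data Source=myserver;Initial Catalog=PatientsDB;Integrated Security=True;Column Encryption Setting=Enabled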
That’s it. I now just deploy the changes and refresh my Power BI desktop file:
Instead of encrypted data I now see the plaintext values. And just to show you, this also works when I publish my report to Power BI and use the gateway:
The reason for this, and I want to make sure this is clear, is that the data is decrypted at the AS server and then transported to the clients connecting to AS in plaintext. The data is still encrypted between AS and SQL Server, but it is plaintext between AS and the gateway and then on to Power BI; of course, the data still travels over encrypted network connections (thanks Greg for pointing this out).
Just a note: this works for both DirectQuery and import mode, but importing the data kind of defeats the purpose, as the data is then stored unencrypted in SSAS. It also works for multidimensional models, where you can follow the same steps as above to set the provider settings to support Always Encrypted.
Now if you want to use other SQL Server security features like data masking or RLS, you need to make sure you pass the username of the user connecting to SSAS on to SQL Server; then the correct security features will be in effect. This is possible, as I described here: https://www.kasperonbi.com/use-sql-server-rls-with-ssas-and-power-bi/