Tag Archives: Power

Webinar 7/17/2018 July 2018 Update for Power BI Embedded with Ted Pattison

Join Charles Sterling and Ted Pattison as they walk through what is new and coming for Power BI Embedded in July 2018.

If you are not familiar with Power BI Embedded: it lets application developers embed visual analytics in their products, so users and customers can gain valuable insights and you can get to market fast. To embed Power BI in your application or portal, you'll need at least one Power BI Pro account, which serves as a master account for your application. This master account allows you to generate embed tokens that enable your application to access Power BI.

Where: https://community.powerbi.com/t5/Webinars-and-Video-Gallery/July-2018-Update-for-Power-BI-Embedding-with-Ted-Pattison/m-p/463628#M116

When: 7/17/2018


Ted Pattison is an author, instructor, and owner of Critical Path Training (www.CriticalPathTraining.com), a company dedicated to education on Microsoft technologies including Power BI, SharePoint, and Office 365. Ted has been speaking at conferences for over 20 years and is a regular speaker at user groups, including the Power BI User Group, which started in April 2016. Ted also maintains several open source projects on GitHub, including the SharePoint 2016 VM Setup Guide, the Online Virtual Classroom Setup Guide, and the Power BI Party Pack with sample data and code for demonstrations.


Microsoft Power BI Blog | Microsoft Power BI

Row-Level Security in Power BI with Dynamics 365


Power BI offers a suite of security features to help restrict data. One of these is row-level security (RLS), which you can configure in Power BI Desktop to restrict data access for specific users by filtering data at the row level, with the filters defined within roles.

In today’s blog, we’ll go over how to set up this feature in Power BI and walk through an example of how you can use it with Dynamics 365.

What you will need:

  • Power BI Desktop
  • Dynamics 365 Organization linked with Power BI Service

In this example, we’re going to be using an Excel file composed of 10,000 fictional Orders, across multiple companies, located on the West and East Coasts. Our goal is to have a single Dashboard viewable within Dynamics 365 that displays the records appropriate per role.

Our starting point is the image below: all orders shown in a simple Orders Dashboard.

[Figure: the full Orders Dashboard]

Defining Roles within Power BI Desktop

1. Select the Modeling tab.

2. Select Manage Roles.

[Figure: Manage Roles on the Modeling tab]

3. Select Create.

4. Provide a name for the role.

5. Select the table to apply a DAX expression.

6. Enter the DAX expression for the role. The expression should return true or false.
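For example, given the West Coast/East Coast scenario described above, a role that limits viewers to West Coast orders might use a filter expression like the following on the Orders table (the Region column name is hypothetical; substitute whatever column holds that value in your data):

```dax
[Region] = "West"
```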

[Figure: Manage roles dialog with the DAX filter expression]

7. Select Save.

Viewing a Role within Power BI Desktop

Once your role has been created, you can view the results of the role by executing the following steps:

1. From the Modeling tab, select View as Roles.

[Figure: View as Roles on the Modeling tab]

2. The View as Roles dialog allows you to change the view of what you are seeing for that specific user or role.

[Figure: View as Roles dialog]

3. Select the role you created and then select OK to apply that role to what you are viewing. The reports will only render the data relevant for that role.

[Figure: report rendered with the selected role applied]

Compared to the image we saw earlier, the difference is clear:

[Figure: filtered Orders Dashboard]

We are now ready to assign the role to a new user.

Assigning Roles in Power BI

1. Navigate to the Power BI service.

2. Go to DATASETS and click the ellipsis (…) to the right of the dataset name.

3. Click on SECURITY.

[Figure: Security option on the dataset menu]

4. Enter the name of the user or group you want to apply Row-level security to.

[Figure: Row-Level Security page for adding members to a role]

5. Click Add.

6. Click Save.

Now that we have a working Power BI Dashboard with Row-level Security applied, let’s look at how it renders in Dynamics 365.

1. From the Dashboard, add a new Power BI Dashboard and select the Dashboard you published to the Power BI service.

Note: if you are not presented with the option to create a new Power BI dashboard within Dynamics, you may need to enable Power BI on the Reporting tab in System Settings.

[Figure: adding a Power BI Dashboard in Dynamics 365]

2. Share the Dashboard with any user or team who will need access.

[Figure: sharing the Dashboard with users or teams]

That’s it! As Power BI and CRM administrators, when you maximize the report in CRM, you’ll be able to view the Dashboard with the entire data set.

[Figure: Dashboard in CRM showing the entire data set]

If a Standard User signs in and views the same Dashboard, Row-level security in Power BI produces different results. From this screen, users can also leverage Power BI to dig into the data appropriate to their role.

[Figure: Dashboard in CRM filtered for a Standard User]

Row-level security in Power BI gives you the ability to restrict data at the row level, based on filter expressions that evaluate to true or false, for specific users or groups. Leveraging the compatibility of Dynamics 365 with Power BI, we can use Row-level security to show users within CRM only the rows appropriate to their role.

You can achieve the same results by putting your data into CRM and using CRM Security Roles. However, here are a few reasons you might want to use this method instead:

  • You don’t want to store data in Dynamics 365 due to storage space or business decisions.
  • You need to manage one Dashboard instead of multiple ones per user/team within Power BI.
  • The business doesn’t want to run an integration between the data warehouse and Dynamics 365.
  • You don’t want to modify existing security roles or business unit security in Dynamics 365.

In the situations above, Row-level security with Power BI offers an option to present the right information with reduced administrative overhead for Dynamics 365 administrators. To learn more about Power BI, check out our Power BI Showcase.

Happy D365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Credit Knowledge is Power


This is a guest post from Rourke O’Brien, Assistant Professor of Public Affairs at the University of Wisconsin-Madison

Does regular consumer access to their FICO® Score influence financial knowledge and behavior?

Professors Abigail Sussman, Tatiana Homonoff and I recently completed a multi-year research study with 400,000 Sallie Mae customers.  Using a randomized control trial research design, we studied whether student loan borrowers who check their credit scores regularly make better financial decisions and manage finances more responsibly. The answer? An emphatic YES.

Kelly Christiano, SVP at Sallie Mae, and I had the opportunity to present the findings at this year’s FICO World. Sallie Mae was the first national private education lender to offer free access to quarterly FICO® Scores through the FICO® Score Open Access program.

Our research found that after one year, student loan borrowers who logged on to view their FICO Score had fewer past due accounts and took steps to establish a credit history. These positive behaviors ultimately translated into higher FICO Scores for borrowers.

For students who want to improve financial health, it is important to understand the terms before opening new accounts (only open new lines of credit when needed) and to make sure that payments are made on time. Moreover, this research suggests that all consumers, not only student loan borrowers, should regularly check their FICO Score to be thoughtful and empowered about how their credit behavior impacts their long-term financial health.

You can also review the full research details in “Does Knowing Your FICO Score Change Financial Behavior? Evidence from a Field Experiment with Student Loan Borrowers.”



Power BI Introduction: Working with Parameters in Power BI Desktop —Part 4

The series so far:

  1. Power BI Introduction: Tour of Power BI — Part 1
  2. Power BI Introduction: Working with Power BI Desktop — Part 2
  3. Power BI Introduction: Working with R Scripts in Power BI Desktop — Part 3
  4. Power BI Introduction: Working with Parameters in Power BI Desktop — Part 4

Power BI Desktop, the downloadable application that supports the Power BI service, lets you define parameters that you can use in various ways when working with datasets. The parameters are easy to create and can be incorporated into the import process or later to refine a dataset or add dynamic elements. For example, you can create parameters that supply connection information to a data source or that provide predefined values for filtering data.

This article demonstrates how to create a set of parameters to use with data imported from a local SQL Server instance. The examples are based on data from the AdventureWorks2017 database, but you can use an earlier version of the database or whatever database you choose, as long as the returned data follows a structure similar to what is used here. Just be sure to update the parameter values accordingly as you work through the article.

Adding Connection-Specific Parameters

Some data source connections in Power BI Desktop—such as SharePoint, Salesforce Objects, and SQL Server—let you use parameters when defining the connection properties. For example, if retrieving data from SQL Server, you can use a parameter for the SQL Server instance and a parameter for the target database.

Because parameters are independent of any datasets, you can create them before adding a dataset or at any time after creating your datasets. However, you must define them and set their initial values in Query Editor. The parameters you create are listed in the Queries pane, where you can view and update their values, as well as reconfigure their settings.

In this article, you will create two connection parameters for retrieving data from a SQL Server database. The first parameter will include a list of SQL Server instances that could host the source data. (Only one instance needs to work in this case. The rest can be for demonstration purposes only.)

To create the parameter, open Query Editor, click the Manage Parameters down arrow on the Home ribbon, and then click New Parameter. In the Parameters dialog box, type SqlSrvInstance in the Name text box, and then type a parameter description in the Description text box.

From the Type drop-down list, select Text, and from the Suggested Values drop-down list, select List of values. When you select the List of values option, a grid appears, where you type in the individual values you want to assign to the parameter, as shown in the following figure. Be sure that at least one of those values is the name of an actual SQL Server instance.

[Figure: Parameters dialog box with the SqlSrvInstance list of values]

After you type the list of values, select a default value from the Default Value drop-down list and then select the parameter’s current value from the Current Value drop-down list. In the figure above, a local SQL Server instance named SqlSrv17a is used for both the default and current values.

Click OK to close the Parameters dialog box. Query Editor adds the parameter to the Queries pane, with the current value shown in parentheses. When you select the parameter, Query Editor displays the Current Value drop-down list and the Manage Parameter button in the main pane, as shown in the following figure.

[Figure: SqlSrvInstance parameter selected in the Queries pane]

You can change the current value at any time by selecting a different one from the Current Value drop-down list, or you can change the parameter’s settings by clicking the Manage Parameter button, which returns you to the Parameters dialog box.

To create the parameter for the target database, repeat the process, but name the parameter Database, as shown in the following figure. Be sure to provide at least one valid database in your list of databases. When you click OK, Query Editor adds the parameter to the Queries pane, just like it did for the SqlSrvInstance parameter.

[Figure: Parameters dialog box for the Database parameter]

That’s all there is to creating the connection parameters. You can also configure parameters with different data types, such as decimals or dates, or with different value formats. For example, you can configure the parameter to accept any value or to use values from a list query, a type of query that contains only one column. You can create a list query manually by using an M statement, or you can create a list query based on a regular dataset, although this approach still seems somewhat buggy. For this article, you’ll stick with lists, but know you have other options.

Using Parameters to Connect to a Data Source

With your parameters defined, you’re ready to connect to the SQL Server instance and retrieve data from the target database. If you haven’t already done so, apply your changes in Query Editor and close the window.

For this exercise, you will be running a T-SQL query, but before you try to do this, verify that the Require user approval for new native database queries property is disabled. If it is enabled, you will receive an error when trying to run the query. To access the property, click the File menu, point to Options and settings, and then click Options. In the Options dialog box, go to the Security category, clear the property’s checkbox if selected, and click OK.

Then, in the main Power BI Desktop window, go to Data view and click Get Data on the Home ribbon. In the Get Data dialog box, navigate to the Database connections and select SQL Server database, as shown in the following figure.

[Figure: Get Data dialog box with SQL Server database selected]

When you click the Connect button, the SQL Server database dialog box appears. In the Server section at the top of the dialog box, click the down arrow associated with the first option (the one to the left) and then click Parameter. The second option will change from a text box to a drop-down list that contains the two parameters you just created. Select SqlSrvInstance, and then repeat the process in the Database section, only this time, select Database.

Next, click the Advanced options arrow and enter the following T-SQL statement in the SQL statement box (or a comparable statement appropriate for your data):
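The statement itself was not preserved in this copy of the article. Based on the columns and clauses referenced later (FullName, SalesAmounts, SUM(h.SubTotal), a hard-coded 2013 sales year, and ORDER BY FullName ASC), it was likely similar to this query against AdventureWorks2017 (the table joins and aliases here are an approximation, not the author’s original code):

```sql
SELECT CONCAT(p.FirstName, ' ', p.LastName) AS FullName,
       SUM(h.SubTotal) AS SalesAmounts
FROM Sales.SalesOrderHeader h
INNER JOIN Person.Person p
    ON h.SalesPersonID = p.BusinessEntityID
WHERE YEAR(h.OrderDate) = 2013
GROUP BY CONCAT(p.FirstName, ' ', p.LastName)
ORDER BY FullName ASC;
```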

The SQL Server database dialog box should now look similar to the one shown in the following figure, with the SqlSrvInstance and Database parameters selected.

[Figure: SQL Server database dialog box with the SqlSrvInstance and Database parameters selected]

When you click OK, Power BI Desktop displays a preview window (assuming everything is working as it should), with the name of the two parameters at the top, as shown in the following figure.

[Figure: preview window with the two parameter names at the top]

When you click the Load button, Power BI Desktop adds the dataset to Data view. Before taking any other steps, rename the dataset to RepSales or something to your liking. You should end up with a dataset that looks similar to the one shown in the following figure.

[Figure: RepSales dataset in Data view]

By defining parameters for the connection properties, you can easily change their values at any time, should you need to retrieve the data from a different SQL Server instance or database. You can also use the same parameters for multiple datasets, saving you the trouble of having to repeat connection information each time you create a dataset based on the same data source.

Later in the article, you’ll learn more about working with existing parameters after they’ve been incorporated into your dataset. But for now, open Query Editor and view the M statement associated with the dataset’s Source step in the Applied Steps section, as shown in the following figure.

[Figure: the Source step’s M statement in Query Editor]

If you examine the M statement closely, you’ll see that the SQL Server connection data references the SqlSrvInstance and Database parameters. They’re included as the first two arguments to the Sql.Database function. I’ve copied the statement here and highlighted the two parameters for easier viewing:
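The statement was not preserved here, but based on the description, the Source step would have looked something like the following, with the two parameters passed as the first two arguments to Sql.Database (query text abbreviated, not the author’s exact statement):

```
Source = Sql.Database(SqlSrvInstance, Database,
    [Query = "SELECT ... FROM Sales.SalesOrderHeader h ... WHERE YEAR(h.OrderDate) = 2013 ... ORDER BY FullName ASC"])
```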

An interesting implication of all this is how easily you can reference parameters within your M statements, providing you with a very powerful and flexible tool for customizing the applied steps that make up your dataset.

Adding Parameters to Filter Data

In some cases, you might want to use parameters to filter a dataset, rather than applying a Filtered Rows step, an approach that can be inflexible and difficult to update. For example, you might want to use a parameter in place of the hard-coded 2013 specified in the WHERE clause of the original T-SQL statement:
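The statement is missing from this copy; the relevant fragment was a WHERE clause along these lines (the h.OrderDate column name is an assumption based on the AdventureWorks schema):

```sql
WHERE YEAR(h.OrderDate) = 2013
```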

You can replace the hard-coded value with a parameter that supports a range of values. To do so, create a parameter just like you saw earlier, only this time, name the parameter SalesYear, define a list that contains the years 2011 through 2014, and set the default and current values to 2011, as shown in the following figure.

[Figure: Parameters dialog box for the SalesYear parameter]

After you’ve created the parameter, update the M statement associated with the dataset’s Source step by replacing the 2013 value with the following code snippet (including the quotation marks):
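The snippet itself did not survive extraction. Because the query is a text value in the M statement, replacing the year means closing the string, concatenating the parameter, and reopening the string, roughly:

```
" & SalesYear & "
```

This assumes SalesYear was created as a Text parameter; if you gave it a numeric type, wrap it as Number.ToText(SalesYear) instead.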

Once you’ve updated the code, click the checkmark to the left of the statement to verify that your changes are formatted correctly and that you didn’t somehow break the code. The statement and dataset should now look similar to those shown in the following figure.

[Figure: updated M statement and the filtered dataset]

To test the new parameter, select SalesYear in the Queries pane and choose a year other than the default 2011. Then re-select the RepSales dataset and verify that the data has been updated.

Adding Parameters to Control Statement Logic

In some cases, you might want to use parameters to control a query’s logic, rather than just filtering data. For example, the SELECT clause in the original T-SQL statement uses the SUM aggregate function to come up with the total sales for each sales rep:
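That clause is not shown in this copy; it likely resembled the following fragment (column names are an approximation):

```sql
SELECT CONCAT(p.FirstName, ' ', p.LastName) AS FullName,
       SUM(h.SubTotal) AS SalesAmounts
```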

You can instead insert a parameter in the M statement that allows you to apply a different aggregate function. First, create a parameter named AggType and then define a list that includes one item for each function, using the SUM function for the default and current values, as shown in the following figure.

[Figure: Parameters dialog box for the AggType parameter]

I’ve included the SubTotal column within the options in part to keep the logic clearer, but also to demonstrate that you can summarize the data based on other columns as well, as long as your dataset supports them. For example, your dataset might also include the DiscountAmounts column, which provides the total sales amounts, less any discounts. In such cases, each parameter option can define the specific function and column, including such values as SUM(h.DiscountAmounts) and AVG(h.SubTotal).

After you create the parameter, update the M statement associated with the dataset’s Source step by replacing the hard-coded SUM(h.SubTotal) fragment with the following snippet (including the quotes):
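The snippet is missing here; following the same close-the-string pattern used for the SalesYear parameter, it would be:

```
" & AggType & "
```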

You can also do something similar with the original ORDER BY clause, using a parameter to provide options for how to order the data. First, create a parameter named ResultsOrder. Then, for each option in the list, specify the column on which to base the sorting (FullName or SalesAmounts) and whether the sorting should be done in ascending or descending order, as shown in the following figure. In this case, the option FullName ASC is used for the default and current values.

[Figure: Parameters dialog box for the ResultsOrder parameter]

Once you’ve created the parameter, update the M statement by replacing the FullName ASC code fragment with the following snippet (including the quotes):
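That snippet is also missing from this copy; by the same string-concatenation pattern, it would be:

```
" & ResultsOrder & "
```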

Your M statement and dataset should now look similar to the ones shown in the following figure.

[Figure: M statement and dataset with the aggregation and sort parameters applied]

Once you understand how to incorporate parameters into your M statements, you have a wide range of options for manipulating data and making your datasets more flexible and accommodating, without having to add a lot of steps to your query. Just be sure that whenever you make any changes, you apply and save them so you know they work and won’t get lost.

Using Parameters to Name Dataset Objects

Another fun thing you can do with parameters is to use them for applying dynamic names to objects. For example, you can change the name of the SalesAmounts column to one that reflects the sales year and type of aggregation being applied. A simple way to do this is to first add a Renamed Columns step to your dataset in Query Editor and then update the associated M statement to include the parameter values.

To add the Renamed Columns step, right-click the SalesAmounts column header, click Rename, type Sales as a temporary column name, and then press Enter. Query Editor adds the Renamed Columns step to the Applied Steps section, along with the following M statement:
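The generated statement did not survive in this copy; Query Editor produces a Table.RenameColumns call along these lines (the name of the preceding step may differ depending on your query):

```
= Table.RenameColumns(Source, {{"SalesAmounts", "Sales"}})
```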

To incorporate the SalesYear and AggType parameters into the statement, replace Sales with the following code:
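The code itself is missing here, but the description that follows pins down its shape: the year and the first three characters of the aggregation type, joined by a dash and wrapped in parentheses. A matching expression would be (assuming SalesYear is a Text parameter; use Number.ToText(SalesYear) if it is numeric):

```
"Sales (" & SalesYear & "-" & Text.Range(AggType, 0, 3) & ")"
```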

The concatenation operator (&) joins the name Sales with the two parameter values, which are separated by a dash and enclosed in parentheses. The Text.Range function retrieves only the first three characters from the AggType parameter.

After you update the statement, be sure to verify your changes. The dataset and M statement should now look similar to the ones shown in the following figure.

[Figure: dataset with the renamed, parameter-driven column]

Notice that the name of the dataset’s third column now includes the year and function in parentheses, making it easy to see what parameter values have been applied to the data set.

Working with Parameters in Data View

As you saw earlier, to change a parameter value in Query Editor, you need to select the parameter in the Queries pane and update its value accordingly. However, this can be a cumbersome process, especially when you want to update multiple parameter values concurrently.

Fortunately, you can set parameter values directly in Data view. To set the values, click the Edit Queries down arrow on the Home ribbon and then click Edit Parameters. When the Enter Parameters dialog box appears, select the values you want to apply to your datasets and then click OK. The following figure shows the Enter Parameters dialog box and the current parameter settings.

[Figure: Enter Parameters dialog box with the current parameter settings]

Although the Edit Parameters option name and the Enter Parameters dialog box name are somewhat misleading, the features they represent provide an effective way to update the parameter values and in turn update the data. Be aware, however, that the changes you make here apply to all datasets using the parameters. If you want to include similar types of parameters in multiple datasets, but you do not want them all to share the same values, you should create parameters specific to each dataset, naming them in such a way that they are easily distinguishable from each other.

Once you know how to set the parameter values, you can try different variations, viewing the dataset each time you apply the new settings. For example, the following figure shows the dataset after applying the parameter settings shown in the previous figure.

[Figure: dataset after applying the new parameter values]

Notice that the Sales column includes the year and aggregation type in the column name and that the data is sorted based on the Sales values, in descending order. You can also change the parameter values while in Report view, allowing you to see your changes immediately within your visualizations.

Clearly, the parameter capabilities help to make Power BI Desktop an even more robust and flexible tool, and you can use parameters in a variety of ways, regardless of where the data originates. The more comfortable you become working with parameters, the more effectively you can use them, and the more control you’ll have over your datasets.


SQL – Simple Talk

Why Haven’t YOU Applied for a Power BI Job With Us?

[Image: quote from a member of the consulting team]

You’re Good at Power BI, but Underutilized, Bored, and/or Feeling Alone?
Maybe We Can Fix All That.

This is one of my favorite recent quotes, and it came from a member of our consulting team during one of our virtual team meetings.  He was saying that his colleagues at PowerPivotPro make it the only workplace he’s ever truly enjoyed.  For me, that’s some seriously heartwarming validation – that we can gather together “my people” and provide that positive of an experience…  that’s some top-shelf, right-in-the-feels stuff.

But it gets better…  notice his use of the words “would” and “back?”  That’s because we don’t have an office per se.  We all work primarily from a combination of two places:  our homes, and at clients’ facilities.  Most days, there’s zero commute and zero need for business attire – we settle in with our morning coffee and get right to work solving client problems with our good friends DAX and M – and the best team in the business just a Slack call away.

So Ryan was saying he’d even give up all of that flexible, telecommuting convenience if we someday were all in one place.  THAT, my friends, is perhaps the strongest endorsement I could EVER want.

  1. We have IMMEDIATE opportunities if you live in the greater Seattle area.  We could have you started as soon as a proper notice period is respected with your current employer.
  2. If you’re interested in supplementing your existing income, we also have an ongoing and immediate need for contracted talent (as long as you have 9-5 availability).
  3. If you’re interested only in FTE work, PLEASE still apply!  There’s no telling how hot things are gonna get in the second half of this year, and we like to have good candidates queued up.

(Note that this is all still USA-only, folks. Sorry!)

Apply Here for Principal Consultant

As we grow here at P3, we’re increasingly in need of “HQ” support.  I’ve said this numerous times but will say it again:  there’s never been a BI/Data consulting firm anything like ours in terms of business and operational models, and we’re discovering/inventing the template as we go.  Blazing trails.  That’s challenging and interesting work, and not the sort of thing we can bog down the consulting team with.  We need more brainpower working with us in this effort, full-time focused.

Would you (or someone you know) like to help us on that front?  This isn’t a client-facing role, at least in the beginning, so Power BI expertise is not a requirement (but even a little bit of exposure to it, and/or a willingness to pick some of it up, is a plus).

Here’s a fragment of the job description (click the Apply link below to see it all):

We’re looking for a Sales and Operations Support Specialist, preferably in the Portland, OR area, who will provide administrative and operations support to our Vice President of Client Services to ensure processes are running smoothly and efficiently in a highly dynamic, entrepreneurial environment. You will perform, organize, and streamline operational tasks while monitoring for problems, taking immediate action where possible and escalating when required.

You will be expected to gain an understanding of our processes, systems, people, and executive strategy to anticipate needs and offer solutions. This individual will be detail oriented, solutions/process focused, and a team player with the ability to handle multiple, often shifting priorities. This is an exciting opportunity to grow and learn in a fast-paced analytics consulting and training company.

Apply Here for Operations Specialist

I hope to see some new faces around the virtual table at our next team meeting!



7/12 Webinar: Text Analysis in Power BI with Cognitive services with Leila Etaati


This week we will be delivering a sneak peek of some of the Business Applications Summit content:

Text Analysis in Power BI with Cognitive services with Leila Etaati

The data we collect is not always numbers and structured data. In any organization, there is a need to analyze text data: customer comments, extracting the primary purpose of a call from its script, detecting the language of customer feedback and translating it, and so forth. To address this need, Microsoft Cognitive Services provides a set of APIs, SDKs, and services that let developers do text analysis without writing R or Python code. In this session, I will explain what text analysis is, covering sentiment analysis, key phrase extraction, language detection, and so forth. Next, the process of text analysis in Power BI using Cognitive Services will be demonstrated.
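As a flavor of what the session covers, the Text Analytics sentiment endpoint can be called from Power Query with Web.Contents. This is a rough sketch only, not code from the session; the region, API version, and subscription key are placeholders you must replace with your own:

```
let
    // Hypothetical endpoint and key; substitute your Cognitive Services region and subscription key
    url = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment",
    body = "{""documents"": [{""id"": ""1"", ""language"": ""en"", ""text"": ""I love this product!""}]}",
    response = Web.Contents(url, [
        Headers = [#"Ocp-Apim-Subscription-Key" = "YOUR_KEY", #"Content-Type" = "application/json"],
        Content = Text.ToBinary(body)
    ]),
    json = Json.Document(response)
in
    json
```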

When: July 12, 2018 10:00 AM PST

Where: https://www.youtube.com/watch?v=WWod8ETS7J8

About Leila Etaati


Leila is a Data Scientist, PhD, MVP, BI consultant, and speaker. She has over 10 years’ experience working with databases and software systems and has been involved in many large-scale projects for large companies. Leila has a PhD in Information Systems from the University of Auckland, and an MS and BS in computer science.

She has worked in industries including banking and finance, power and utilities, and manufacturing. She is a lecturer and trainer in business intelligence and database design courses at the University of Auckland.

Leila speaks at international SQL Server and BI conferences such as Microsoft Ignite, PASS Summit, PASS Business Analytics, PASS Rally, and many SQL Saturdays in the USA, Europe, Australia, and New Zealand on machine learning and analytics topics.

Ask Leila about these technologies and areas:

  • Machine Learning
  • Advanced Analytics
  • Predictive, Prescriptive, and Descriptive Analytics
  • Azure Machine Learning
  • R
  • Python
  • On-Prem Machine Learning with SQL Server, R and Python
  • Power BI and R
  • Power BI
  • Stunning Visualization with R
  • Microsoft BI Suite in SQL Server
  • Azure Stream Analytics
  • IoT
  • Real-time Streaming

When Leila is not doing data-related work, her hobbies include:

  • Playing board games!
  • Watching all types of movies!
  • Swimming if it is not too cold!


Microsoft Power BI Blog | Microsoft Power BI

Power BI expands self-service prep for big data, unifies modern and enterprise BI

More organizations are embracing a data culture that unifies information from many sources to drive business decisions. For a data-centric culture to thrive, it requires everyone to work from the same data platform, and intuitive tools that let them leverage vast quantities of data quickly to reach insights.

The Power BI updates we’re announcing today help organizations unify their enterprise BI needs on one platform and empower business analysts to leverage data more easily. Starting in July, these capabilities will begin to appear in Power BI in preview.

Accelerating data prep and unifying access to data across roles

I recently discussed the challenges associated with analyzing large volumes of data – specifically ingesting, integrating and analyzing disparate data. Data preparation and modeling are key components of the data journey, and our customers tell us that these activities often account for the majority of the time users spend working with data.

We’re investing in capabilities that reduce the time it takes for business analysts working with data to get insights.

  • Self-service prep for big data – We’re expanding self-service data prep in Power BI by introducing new capabilities to help business analysts extract insights from big data. Using the Power Query experience familiar to millions of Power BI Desktop and Excel users, business analysts can ingest, transform, integrate and enrich big data directly in the Power BI web service – including data from a large and growing set of supported on-premises and cloud-based data sources, such as Dynamics 365, Salesforce, Azure SQL Data Warehouse, Excel and SharePoint. The ingested data can now be shared across multiple Power BI models, reports and dashboards enabling easy data reuse.

  • Advanced analytics and AI with Azure – Additionally, it’s important that all of the users throughout an organization operate on the same data. We’re fueling collaboration across roles by unifying access to data between Power BI and Azure Data Lake Storage Gen2. Business analysts can seamlessly operate on data stored in Azure Data Lake Storage taking advantage of its scale, performance, security and analytics with the self-service capabilities in Power BI, while data engineers, data scientists and other users can extend access to insights with advanced analytics and AI from complementary Azure Data Services like Azure Data Factory, Azure Databricks, and Azure Machine Learning.

For example, data engineers can add, enrich and orchestrate data; data scientists can build machine learning models; and business analysts can benefit from the work of others and the data available in the Azure Data Lake Storage while continuing to use the self-service tools in Power BI to build and share insights broadly.

  • Support for the common data model – Power BI will also support the common data model, which gives organizations the ability to leverage a standardized and extensible collection of data schemas (entities, attributes and relationships). Users can take advantage of a standard schema – or customize based on their unique needs – to simplify how they enrich their data with other sources from Microsoft and third parties to accelerate analysis across a broad, unified dataset.

Unifying self-service and enterprise BI

Just as managing data from multiple sources can create challenges, managing multiple BI platforms within an organization can limit centralized access to insights.

Moving to one modern, compliant platform as the single destination for business analytics is easier than ever with key enterprise-facing updates on Power BI.

  • Enterprise-scale BI models & application lifecycle management (ALM) – We’re taking another step to bring advanced capabilities from SQL Server Analysis Services into Power BI, enabling larger data volumes, lifecycle management, and third-party BI tool connectivity. Incremental refresh, higher dataset size limits, and aggregates will allow customers to reach large dataset sizes while maintaining the fast and fluid reporting end users expect. With new support for the XMLA protocol, existing Analysis Services tools for managing the lifecycle, from deployment through operations, can now work with Power BI datasets. Additionally, since many third-party BI tools support XMLA, Power BI can now fuel analytics for all users across your enterprise, regardless of which reporting tool they choose to use.

  • Enterprise reporting – Popular SQL Server Reporting Services technology is now part of Power BI, creating a unified, secure, enterprise-wide reporting platform accessible to any user across devices. Pixel-perfect paginated reports can now be included alongside Power BI’s existing interactive reports.


Three years ago we launched Power BI to create new opportunities for businesses to drive a data culture where all employees, regardless of role and skillset, can access the insights and intelligence they need to make smart decisions based on facts.

Today, Power BI supports 43 languages and is used by customers in more than 18,000 cities. More than 19 million data models are hosted in Power BI and over 8 million dashboard queries are processed each hour. Thank you to our users for being part of this journey, and for the vibrant community that helps shape the product.

Learn more about these latest updates during Microsoft Inspire at the Power BI session, and for a deeper dive, join us at the Microsoft Business Applications Summit in Seattle on July 23 where you will learn more about these capabilities and see them in action. Register now for the Microsoft Business Applications Summit if you haven’t already!


Microsoft Power BI Blog | Microsoft Power BI

Sorting by a measure not part of the visual in Power BI

Every now and then this request comes up: I want to sort by a measure that is not part of the visual. Even though it is not “visualization” best practice, sometimes the job requires it anyway.

So let’s start with the simple one: sorting a chart on a measure not part of the visual. Let’s take this visual:
[screenshot: the example chart, sorted by OrderQuantity]

Now instead of sorting by OrderQuantity I want to sort by the ListPrice. The trick here is to make the measure part of the query, and one way you can do that is by adding it to the tooltip. As such:

[screenshot: ListPrice added to the tooltip field well]

Now in the sorting option I can also pick ListPrice:

[screenshot: ListPrice available in the sorting options]

And now the result sorted by ListPrice:

[screenshot: the chart sorted by ListPrice]

Pretty cool, right? Apparently this trick has been around for some time, but I didn’t know until Will and Amanda told me.

OK, next up is sorting in a matrix or table. The same trick doesn’t work there, although again we need to make the measure part of the query.

Let’s take the same table of data but now in a matrix, and I also want to sort on ListPrice here.

[screenshot: the same data in a matrix]

The trick here is to add ListPrice to the matrix and sort by it:

[screenshot: ListPrice added to the matrix and sorted]

And finally hide the column by just dragging the column width to be very small:

[screenshot: the ListPrice column hidden by shrinking its width]

Et voila. Pretty neat tricks!


Kasper On BI

Creating the Right Measures: A Power Pivot Journey



How did I come to PowerPivot and the related tools? Desperation in trying to fix a process problem. How did I get to the point where I can return a reasonable answer to a data query using the Power tools? Much frustration, a total Google search time measured in weeks if not months, Rob Collie’s book and the P3 website, and a dedicated IT department to help me determine if my reporting discrepancies were a problem in my DAX formulas or an underlying gap in the data (there were plenty of both). I found it tantalizingly challenging to learn – tantalizing because I always felt like I was one DAX command away from getting what I needed but unable to figure out what that one command was. If there is one thing I would urge upon anyone starting their PowerPivot journey it is perseverance.

The Objective

Like any large organization’s supply chain mission, our department’s primary objective is to ensure a product is always available when needed at the point of use with the right price point and quality level. Our project was to implement a 2-bin Kanban solution at four locations within our system. This effort had been hugely successful at 3 of the 4 hospitals in our system, but the largest location was encountering difficulties in implementation. Stockouts (lack of product at the needed time) were at an unacceptable level – the Kanban implementation had worked at our other locations, so why was it failing to deliver the expected results at our flagship location? Sheer volume didn’t seem to be the culprit as we could get the quantities of product from the dock to the points of use. The issue was that the product just wasn’t getting to the dock in the right quantity or wasn’t getting there at all.

This led to our first realization about our MMIS (Material Management Information System) – while it was very good at telling us what had happened, it wasn’t set up to tell us what should have happened but didn’t. Like many OLTP systems, it is optimized for transactional throughput with a secondary focus on analysis. If the data isn’t in the system in the first place, or it’s stored elsewhere requiring a great deal of processing power to retrieve, then finding the gaps in the necessary events is difficult.

I had used Excel for 20 years and didn’t know about PowerPivot until I was faced with trying to analyze these failures in 130 storerooms stocking 40,000+ SKUs. Learning what measures were, and creating the right measures using DAX was a giant step, and a lengthy one. SUM() and DIVIDE() could be managed pretty quickly, but when we started moving to histograms things got complicated. The P3 blog (and its willingness to link to other published articles by the community) was a tremendous help in understanding how to build such measures.

[screenshot: histogram of order counts per item]

Why start with histograms? They’re helpful to show variations in large datasets for multiple criteria. They revealed the gaps in what our MMIS system could provide. What significant value is missing from the above histogram? (cue theme music from Jeopardy…) There is no zero value on the x-axis! We can see how many times something on the shelf was ordered if it was ordered at least once during the period – but how many items weren’t ordered at all? Those are the items we want to know if we should have ordered but didn’t – and if they’re legitimately not wanted then get them off the shelf and make more room for the fast-moving items that we were running out of.
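The missing-zero problem can be sketched in a few lines: the fix is to start from everything stocked on the shelf, not from the order history alone. The SKUs and counts below are made-up sample data, not the hospital's actual items.

```python
from collections import Counter

# Hypothetical sample data: what is stocked vs. what was actually ordered.
stocked_skus = ["gauze", "saline", "catheter-36fr", "gloves", "syringe"]
order_lines = ["gauze", "gauze", "saline", "gloves", "gauze", "saline"]

orders_per_sku = Counter(order_lines)

# The crucial step: iterate over the shelf, not the order history,
# so items with zero orders get their own bin in the histogram.
histogram = Counter(orders_per_sku.get(sku, 0) for sku in stocked_skus)

# catheter-36fr and syringe now show up in the 0 bin instead of vanishing.
print(dict(histogram))
```

Building the histogram only from order lines would silently drop the zero bin, which is exactly the gap described above.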

DAX and PowerPivot started highlighting the data gaps, and this is where the partnership with our IT department took off. Our storeroom (aka cart) data is not historical – the MMIS system can tell you what 378 items are in storeroom 5A today and what their reorder (par) quantities are, but it couldn’t tell you what that item and value was a week ago or a month ago. So, on a daily basis, we had to take a snapshot of the storeroom inventory and bump it against the order history. With 40,000+ items managed within the location that file ballooned quickly, so we limited ourselves to a 90-day history.
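A simplified sketch of that snapshot-versus-order-history comparison, with hypothetical sample data: an item sitting below its par level with no matching order surfaces as a candidate "should have been ordered but wasn't."

```python
# Hypothetical daily snapshot of one storeroom, and that day's order history.
snapshot = {
    "gauze":  {"on_hand": 2, "par": 10},
    "saline": {"on_hand": 8, "par": 8},
    "gloves": {"on_hand": 1, "par": 12},
}
ordered_today = {"gauze"}

# An item below par with no order on file is a candidate non-order.
non_orders = sorted(
    sku for sku, level in snapshot.items()
    if level["on_hand"] < level["par"] and sku not in ordered_today
)
print(non_orders)  # ['gloves']
```

In the real system this anti-join ran against 90 days of snapshots rather than a single day, which is what made the file balloon.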

Once we had the core transactional data including the non-orders we wanted to slice ‘n’ dice the data in a variety of ways, so we built the lookup tables and linked them together. There’s no need to go into detail on the next diagram – your business probably has a similar complexity of relationships and acronyms. We used P3 hourly consulting at various points to help us with the measures along with continual refinement of the queries generating the fact tables from our PeopleSoft MMIS. We also had to work out the timing of the data sources – for example, our Advance Ship Notice (ASN) data from the previous day’s orders were processed at 6 am in the MMIS, so we had to ensure our corresponding Power Query jobs ran after the ASN information was received.

[diagram: the data model’s lookup tables and relationships]

Without going into mind-numbing detail about the various measures we created and studied we can say that several features of PowerPivot were vital in that root-cause analysis.  Among them are:

  • Immediate update of the tables and graphs for each data refresh
  • Immediate answers when we sliced and filtered the data for various factors
  • Graphical representation of events for rapid comparison and evaluation

Rob’s posts on allowing multiple measure selections with disconnected tables let us build this next pivot table combined with sparklines. The measure shown gives us the amount ordered on a particular day subtracted from the amount received within two subsequent business days, leaving anything either not ordered or completely filled BLANK(). This report highlights what I consider one of the greatest strengths of PowerPivot – each mark represents the result of a sophisticated measure but they are embedded in a common Excel trope for easy comparison by lay users.
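The logic of that measure can be sketched outside DAX as well. This hypothetical Python version returns `None` where the DAX measure returns BLANK(); quantities and dates are sample data, and the two-business-day window skips weekends only, with no holiday calendar.

```python
from datetime import date, timedelta

def add_business_days(start, days):
    """Advance a date by a number of business days, skipping weekends."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday
            days -= 1
    return current

def shortfall(order_qty, order_date, receipts):
    """Ordered quantity minus what arrived within two business days.
    Returns None (the BLANK() analogue) when nothing was ordered
    or the order was completely filled."""
    if order_qty == 0:
        return None
    cutoff = add_business_days(order_date, 2)
    received = sum(qty for d, qty in receipts if order_date <= d <= cutoff)
    gap = order_qty - received
    return gap if gap > 0 else None

# Friday order; partial receipt the following Tuesday (2nd business day).
print(shortfall(10, date(2018, 7, 6), [(date(2018, 7, 10), 4)]))  # 6
```

Each mark in the pivot table is one such result, so an unfilled order stands out while filled orders and non-orders stay blank.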

The vertical red marks highlighted by the blue box indicate a day when the EDI transmissions from our vendor were not sent. (The actual goods were shipped and received, but electronically we were blind.) We called the vendor as soon as the report was run to determine the status and verified their confirmation against our dock receipts.

The horizontal marks highlighted by the orange box represent a problem with shipments for the 36 French catheter – despite repeated orders the item was not coming in. This was a problem requiring the immediate substitution of the item until the vendor was able to resume shipments.

[screenshot: pivot table with sparklines showing unfilled orders]

The report is an immediate visual check for a stock person to know if an order has not been filled. We considered using a higher-level dashboard with a traffic-light metaphor so problems with any of 400+ items in larger storerooms would be highlighted on a single section of the screen, but the importance of a storeroom manager knowing which items were not received and the need to substitute for them at once led us to keep this level of detail.

Having this aggregated data meant that we could match stockout issues to external (product not shipping) vs. internal (correct orders not being made) causes. Once the transactional data was trustworthy as well as complete, we could continue developing more complex measures. We were now able to correctly differentiate a stockout (where something should have been there but wasn’t) to a lack of an order because an item had a sufficient par level, or it was the wrong day of the week for an order or other valid reasons. We could now automatically identify which events were “bad” ordering decisions and thus move to target the root cause. Power Pivot helped us with that too, but that’s another story.

Power Pivot (or any similar tool) didn’t/couldn’t give us an answer to our issues out of the box because at the start we didn’t have the proper data. What it could do was to lead us to the data organization we needed and then demonstrate where our data black holes were. Once those holes were fixed, we could then continually improve our reporting until we understood what was happening and adjust the people/process/technology issues. Without Power Pivot and the help provided by P3 and the books that everyone should have on their shelf (Rob Collie, M is for Data Monkey, Definitive Guide to DAX, …) we would have foundered on the rocks.

Despite that help (and maybe it’s just me), here’s something very hard for me to understand: why is learning the tool so difficult? (Picture Will Smith in Men In Black yelling, “hey old guys!”) I have used spreadsheets for longer than most P3 readers have been alive (VisiCalc on a VAX 1170, anyone?), and I have picked up what I needed as I went along, but always in service of a business objective rather than to become an Excel guru per se. Excel and the Power tools are not my day job, though I rarely do not have at least one spreadsheet open on my monitor. So why did it take me so much longer to get where I needed to go? To be sure, I was tackling a more sophisticated problem… but it’s Excel!

The Challenges

P3 has had several posts about how approaching the Power tools from an Excel perspective can lead to problems, but I think they only cover part of the mystery. Some of the difficulty may be in changing the way you think about what the purpose of a cell should be. For a traditional spreadsheet, a cell (say $B$3) has a particular meaning based on the formulas within. Specific information is found in a particular cell, and that cell should always have that same information (or the same calculation when the source materials change). With the Power tools, the connection between any given cell and a specific piece of information is no longer relevant. The focus switches from a single number to a set of numbers. Your desired result might be a set comprised of a unique number, but the formula to achieve that result is no longer visible or traceable within the cells of the spreadsheet.

The in-house help functions for DAX and PowerPivot are terrible as well. For a regular Excel function, the help system gives you an example which can be copied to your spreadsheet and subsequently built piece by piece to duplicate the result in the help function. For DAX the examples are given in relation to the AdventureWorks database (not included), and proper importation and manipulation of the data model is needed before you could hope to start duplicating the help examples. The database wasn’t built for Power Pivot anyway and is more suited to creating flexible SQL joins than hardcoded one-to-many relationships. And just sayin’, could we have datasets with dates from the current decade at least?

Perhaps most disruptive is the mental switch to filtering – analysis is subtractive rather than additive. Instead of carefully building a report one data point at a time, you are now chipping away at large chunks of data, like Michelangelo on a block of granite, until you have retained only the part that you need. (No, I’m not Michelangelo with my data, more like the guy on the road crew with a jackhammer.) The various filters, and the sequence of application needed to ensure you don’t blast away the part you intended to be the sculpted hand on your overall statue, are very confusing to a newcomer. The Italians (yet another Renaissance motif) are so far advanced in their understanding of filter applications that even their basic postings on ALL() are intimidating. Microsoft does have some rudimentary datasets for HR, procurement, and some other common areas, but the examples are pretty basic and quickly unable to reflect common issues that a user will encounter.

I also didn’t talk much about Power BI. First, it wasn’t available when we were resolving our needs. Nowadays it’s very much the flavor of the moment, and it’s truly a great tool, especially if you don’t already have a Tableau or Qlik license. But I’m not ready to stop and build a dashboard. What I need to know changes as fast as my understanding of the real-world issues I’m trying to answer. There’s always another data source to link into my model from both internal and external sources, another slice of the data to take looking for a correlation between factors, another test against the real world. The measures and graphs I started with are not the ones I use today. I keep records of them, footprints in the sand for others to follow as they come up to speed on the questions I’ve already answered, but I need those followers to branch out and ask their questions in different directions from the path I’ve followed. There’s too much to know.

As Power Pivot allows us to become more accurate and specific in reporting – not coincidentally expanding our source data to millions of rows – so too we have to be sure that accurate granularity is supported at the associated table level. When executive reporting was highly aggregated the inputs to those reports could also be aggregated. Now I need to break apart the data I already have into more atomic units to give the finer level of reporting we can now provide. I can’t hide my summary values as averages of averages because my customers can drill to the detail.


Cole Nussbaumer Knaflic notes that data analysis has both exploration and explanation – but as tools like Power Pivot allow us to explore at finer levels of detail, we must remember that data validation remains a vital precursor to exploration, especially for aggregations that we have uncritically used in the past. We have to look with fresh eyes at the data and queries we’ve used for years and be ready to rebuild from the ground up… Power Pivot is the red pill of data. (Does that make Rob Morpheus?)

No one knows the Power BI ecosystem better than the folks who started the whole thing – us.

Let us guide your organization through the change required to become a data-oriented culture while squeezing every last drop* of value out of your Microsoft platform investment.

* – we reserve the right to substitute Olympic-sized swimming pools of value in place of “drops” at our discretion.



Power BI Developer community June update

This blog post covers the latest updates for the Power BI Developer community. Don’t forget to check out the May blog post, if you haven’t done so already.



Here is the list of June updates for Embedded Analytics…

Embed capabilities

Set Slicer values through JavaScript API

Use Themes in embedded dashboards

‘Interactive feature showcase’ section added to the Playground tool

Automation & life-cycle management

New REST API documentation

Power BI Embedded in Azure

Power BI Workspace Collections is being deprecated

Embed capabilities

Set Slicer values through JavaScript API

When users interact with reports and want to see specific views of the data, they usually use the filter pane or slicers. A slicer is an alternate way of filtering that narrows the portion of the dataset shown in the other visualizations in a report. Many authors add slicers to their reports because they are more compelling and discoverable for end users.

While the Filters API has been available in the JavaScript SDK, we now offer a Slicers API that gives developers full control over interactivity and drilling capabilities for end users. The API allows you to get a slicer’s current settings or apply a new filter to it, and it can be used at any time during the user’s session, including on report load. The API supports both native and custom slicers*, and slicer syncing also behaves consistently when you use the new API.

* Custom slicers will be available within 2 weeks.


Use Themes in embedded dashboards

We are happy to announce that you can now embed dashboards that include themes, similar to report themes. With dashboard themes you can apply a color theme to the dashboard your users consume. The Power BI portal is used to set and apply the dashboard theme.

[screenshot: an embedded dashboard with a theme applied]

‘Interactive feature showcase’ section added to the Playground tool

Our newly designed Playground tool includes a showcase section for interactive features. This showcase walks you through some of our capabilities and shows how you can unlock their potential in your application.

As we keep growing our product, more showcases will be added to this section. Currently we are releasing two showcases:

1. ‘Dynamic report layout’ Showcase

Learn how to build a personal selection of visuals for each user by dynamically embedding visuals from a report page.

[screenshot: ‘Dynamic report layout’ showcase]

2. ‘Capture & share bookmarks’ showcase

Learn how your users can create and share their own views of the report page.

[screenshot: ‘Capture & share bookmarks’ showcase]

Automation & life-cycle management

New REST API documentation

We are excited to announce that we have new documentation for all of the Power BI REST APIs. We have moved the docs to a new location, so you can find them here.

The new documentation is auto-generated from the Swagger files of our REST APIs. It might take some time to get used to the new structure, but because the docs are auto-generated, you can be sure that you always see the most up-to-date version, in sync with our latest SDK release.

In the future we will add the ability to explore and try the REST APIs interactively.

Power BI Embedded in Azure

Power BI Workspace Collections is being deprecated

As announced in May 2017, Power BI Workspace Collections is being retired. The retirement will be effective in the next few days. If you are a Workspace Collections customer without an EA signed prior to June 2017, visuals embedded using Power BI Workspace Collections will no longer be viewable within your apps.

We recommend migrating to Power BI Embedded to keep using the embedded analytics capabilities you’re familiar with, and to take advantage of additional Power BI Embedded features.


If you are puzzled or run into issues, be sure to check out the resources that can help:

Report Themes

Custom visuals now support report themes. Users can apply a color theme to their entire report, such as corporate colors, seasonal coloring, or any other color theme they might want. To apply a color theme, users need a report theme JSON file. A whole bunch of ready-made JSON files can be found here, which users can download and import into their reports.

Find more information on Report theme JSON file format to create your own theme here.
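For illustration, here is what a minimal report theme JSON file looks like in the basic format: a theme name plus a `dataColors` palette, with optional report-level colors. The theme name and hex values below are made up.

```python
import json

# A minimal report theme in the basic format. The name and colors are
# illustrative stand-ins for a real corporate palette.
theme = {
    "name": "Contoso Corporate",
    "dataColors": ["#31B6FD", "#4584D3", "#5BD078",
                   "#A5D028", "#F5C040", "#05E0DB"],
    "background": "#FFFFFF",
    "foreground": "#252423",
    "tableAccent": "#31B6FD",
}

# Save the theme so it can be imported into a report from the portal.
with open("contoso-theme.json", "w") as f:
    json.dump(theme, f, indent=2)
```

Importing a file like this recolors report visuals, subject to the custom-visual caveats noted below.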


· Custom visuals that have hard-coded colors are not affected by a theme change.

· The extended JSON file format is not supported for custom visuals.



As always, feel free to use all the communication channels at your disposal to connect with our team, share your thoughts and ask questions:

· Community

· Contact us for support: pbicvsupport@microsoft.com


Come and hear about Power BI Embedded in Microsoft’s Partner event of the year.


Microsoft Business Applications Summit

Tons of content for Power BI and Power BI Embedded, and a great chance to interact with the Power BI Product team.


That’s all for this post. We hope you found it useful. Please continue sending us your feedback; it’s very important to us. Have an amazing feature in mind? Please share it and vote in our Power BI Embedded Ideas forum or our Custom Visuals Ideas forum.


Microsoft Power BI Blog | Microsoft Power BI