
Individual Excellence vs. Organizational Impact: Know the Difference!


Guess Which One Grows Your Career (or Company) More? (Hint: It’s the One on the Right)
(But Individual Excellence is a Prerequisite for Org-Wide Impact)

  -Marshall Mathers

Last week’s post on learning DAX before learning M mostly met with positive reviews, but also drew some fire.  A few staunch M supporters showed up and voiced their disagreement – including the one and only Ken Puls.  Now I know from experience not to mess with Ken…  OK OK, I confess…  messing with Ken is a barrel of monkeys actually.  Put it on your bucket list.  That said, I have immense respect for his skills and perspective, and only enjoy messing with him out of friendship.  He’s an amazing human being and learns more things in a year than I could ever squeeze into my leaky head over a lifetime.

But I still firmly believe in what I said.  I’m not here to offer apologies – only clarification and justification.  There very much IS an olive branch in all of this, but again, that comes merely from clarification.

First clarification:  I love love LOVE Power Query and M!  They are a godsend!  I never said what some people thought I was saying, which was “meh, you can ignore/neglect that part of the platform.”  Nope, you absolutely benefit tremendously from both.

The minor tension from last week raised a MASSIVELY important point, one that transcends any technical debate and puts things in their proper perspective.  So I’m grateful for the opportunity that the misunderstanding provides us.  Let’s begin with…


At first, “what’s the dollar value of your personal efficiency gains?” sounds like an incredibly abstract question.  I mean, how can you put a dollar figure on massive gains in personal efficiency?  Sounds impossible, right?

But I’ve got a card up my sleeve:  how much is your salary?  It’s not terribly outlandish to say that the best you could EVER do, in terms of speeding up the tasks in your workday, is to completely make yourself redundant.

And the market has already put a dollar value on THAT, right?  It’s called your salary.  No, it’s not a perfect number, not at all.  Many of you reading this are criminally underpaid in some very real sense, for instance.  But this is what the market is saying today, and that’s a very real quantity.  Furthermore it’s not like it’s really possible to automate 100% of your duties using ANY of the tools available today (thankfully!), so it’s somewhat “gracious” to set the max at 100% of salary.

The folks applying for our Principal Consultant role lately have ranged in current salary from a definitely-criminal $40k on the low end to a damn-near-executive $160k on the high end.

So let’s continue with the “gracious” theme and go with the high end:  $160k per year is a serviceable maximum ROI for “making my current job run faster.”

You might be thinking, at this point, that I’m still not being Gracious Enough.  It’s possible, after all, for a single hyper-efficient individual to suddenly replace MULTIPLE other individuals, right?  Setting aside the distasteful notion of those lost jobs for a moment, I still think my $160k figure isn’t that bad, given that it’s 100% of an entire individual on the high end of the range.  But fine, if you want to multiply it by 3 and make it $480k, I don’t think that necessarily undermines any of the points I want to make.  I’m in the business of adding more zeroes to the productivity multiplier, not linear multiplications.

At first, there’s no difference:  both DAX* and M, in the early going, are very much “speed up my current workflow” kinds of things.  And that’s perfectly natural – what you’re currently doing is ALWAYS the best place to start, the best place to learn.

(* Remember, when I say “DAX,” I use that as shorthand for “DAX and Modeling,” where “Modeling” is best described as “figuring out how many tables you should have, what they should look like, and how they are related.”)

And that “improve what I’m currently doing” lens is why M/Power Query steals the show in the earliest demos – it’s easier to see how it’s going to change your life, because it automates/accelerates a larger percentage of what Excel Pros have traditionally done.

Hold this thought for a moment while I introduce something else…

This is a real thing, it belongs to one of our clients, and we helped them build it.  To call it a “workbook” is a bit of an insult of course, because it’s a modern marvel – an industrial-strength DAX model with a suite of complementary scorecards as a frontend.  But all built in Excel.  (Power Pivot, specifically).

And this model has provably returned about $25M a year to the bottom line for this client.  As in, profit.  Not revenue.  Pure sweet earnings.  This workbook is visible on their quarterly earnings reports to Wall Street.

And this wonder of the modern world is well into its fourth year of service now, bringing its lifetime “winnings” into the $100 Million range.  No lie.  This happened, and continues to happen.  This “workbook” is a tireless machine that makes it rain money.

Let’s do some math:  $25M per year vs. $160k per year is… roughly 150x.

In other words, the ROI of this project went FAR beyond any amount of “accelerating what we already did.”  It was, instead, a MASSIVE dose of “doing something we’ve NEVER done before.”

This may sound like a cheap verbal trick, but I sincerely think it is a weighty truth that everyone should internalize.  The workbook above, which now in some sense runs the show at this large client, had no predecessor whatsoever (its “forerunners” were a scattered collection of hundreds of distinct reports, each of which was just burying readers in borderline-raw data).  For an even bigger example, consider that Instagram started as a hybrid of Foursquare and Mafia Wars before deciding to go “all-in” on their most popular feature, photo sharing.  The blank canvas has no ceiling, if you permit me to mash up some metaphors, and both of these success stories are rooted in a combination of analytics and courage.

What we’ve been doing traditionally, in both the traditional Excel and traditional BI worlds, is nothing to brag about.  Most of our reporting and analysis output has been, traditionally speaking, designed by the path of least resistance – as opposed to defined by careful and creative thinking about what truly matters.  The president of one of our clients/partners told us last week, “people tend to measure what they can easily count, as opposed to what they SHOULD measure,” and I just about leapt out of my chair screaming “YES!  PRECISELY!”

If you want to learn more about this topic, start with Ten Things Data Can Do For You and We Have a “Crush” on Verblike Reports.  For now, it’s time to move on to Leverage.

Leverage

You wanna know why Data is so “Hot” these days?  It’s because of Leverage.  Data is hot precisely because proper application of data can impact the behavior and productivity of MANY people simultaneously.  You can’t typically save or make millions of incremental dollars as an individual, but it’s “easy” to do if you can magnify benefits across dozens of other people – or hundreds, or even thousands (as is the case with the $ 100M workbook).

In fact, it’s worth considering that the $ 100M Workbook actually offers only modest benefit!  On a per-person basis, on a single day, you wouldn’t even notice the difference.  But multiply that modest, say, 3% benefit across tens of thousands of people and 365 days…  and you get $ 25M per year.  When you have the power of Leverage, you don’t even have to find something “big,” like the Instagram “pivot” from one mission to another, to get something BIG.
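Just to make that arithmetic concrete, here’s a quick back-of-the-envelope sketch in Python.  (The headcount below is a made-up stand-in for “tens of thousands” – the post doesn’t give the exact figure.)

```python
# Back-of-the-envelope: how little per-person benefit it takes to
# reach $25M/year when leverage spans a huge headcount.
# NOTE: the headcount is assumed, not from the post.
annual_benefit = 25_000_000   # $25M per year to the bottom line
people = 20_000               # assumed "tens of thousands"
days = 365

per_person_per_day = annual_benefit / (people * days)
print(f"${per_person_per_day:.2f} per person per day")  # about $3.42
```

Even at twice that headcount, it works out to under $2 a day per person – modest individually, enormous in aggregate.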

We are all, everyone reading this, INCREDIBLY FORTUNATE to be working in data, because of its somewhat-unique capacity for leverage.  So many jobs, whether white- or blue-collar, are essentially cogs in the machine, and the top-end benefits they provide are limited to the “just you” size.  But WE have hit the jackpot.  WE have a job that is “unfairly” capable of leverage.  It just fell into our laps.  But then, the mind-numbing dosages of VLOOKUP (in the traditional Excel world) and endless documentation and miscommunication of requirements (in the traditional BI world) deflected us off into a relatively un-ambitious mindset.

Data has ALWAYS had the advantage of Leverage, but the traditional methodologies and tools brought tremendous friction and inertia to the table.  They wore us down – in terms of time, money, and psychic energy.  They “chokepointed” our potential.  They enforced a terribly-linear culture of thinking.  In short, the traditional tools took the potential 100x or even 1,000x leverage possibilities of Data and tamped them down to about 10x – still good!  But so much less than what we COULD do.

Well guess what?  No more chokepoint.  Whatever you want to call it – Power BI, Power Pivot, Modern Excel – the next-generation toolset from Microsoft gives us those extra zeroes of potential.

The next section is more conciliatory, but there IS something very important to bring home here first.

If you MADE me choose one or the other, I’d definitely choose DAX, because I think it offers us the virtually-unlimited twin powers of WWNDB (“What We’ve Never Done Before”) and Leverage.  In fact, I don’t think that, I know that – I (and my companies) have been blowing people’s doors off with this new toolset since 2010.  We didn’t even get Power Query until what, 2014?  Fully half the lifetime of this revolution pre-dates M.  Even the $100M Workbook predates M!  Heck, until Power Update came along, you couldn’t even schedule refreshes of models that relied on M, which almost by definition “funneled” M usage down the “just for me” path – and to this day, Microsoft still hasn’t released a server that natively runs M.

I just don’t think it’s nearly as easy to explore/exploit WWNDB or Leverage via the M path.  Not impossible, because there are plenty of exceptions that prove the rule.  And to be clear, I think most of the exceptions will be in the WWNDB category, not the Leverage category.

And that was kinda my whole point in last week’s article – Power Query dramatically captures the attention of new converts to Modern Excel precisely because of how well it fits and improves What We’ve Already Been Doing, as Individuals.  This is a Good Thing!  No caveats needed.  I just don’t want anyone to become so distracted with it that we miss the Big Wins of WWNDB and Leverage.


This is What We Can Do With “Just” DAX and Modeling

The picture above illustrates how a single individual (you, or a member of your team) can achieve wins MUCH bigger than just them.  And it’s my experience-powered belief that you cannot get a Win of that size without leveraging DAX and Modeling.

But what if you THEN take that single individual’s newfound powers of WWNDB and Leverage, provided by DAX and Modeling, and now make THAT person more efficient?  “Holy Additional Multiplier, Batman!”


If Our DAX Modeler Superhero “Levels Up” with the Efficiency Gains of Power Query / M…  Look Out!

Yeah, if you take THAT person, and make THEM more efficient, WOW, you can do EVEN MORE of the amazing, transformational, WWNDB-and-Leverage style work.

Which we can all agree…  is a Very Good Thing.  One of my favorite personal sayings is “the length of a rectangle is not more ‘responsible’ for the area of the rectangle than the width.”  Double either one, and you double the area.  But that’s essentially my point in a nutshell – you can 100x with DAX and modeling, AND you can double with M.  If you had to choose one, choose 100x.  But we don’t have to choose.  Adding Power Query and M to your org-wide-impact powers, even if it’s “just” 2x or 3x, delivers JUST AS MUCH, or more, incremental Big Win as the original 100x.

We can have our flagons full of mead and drink them too, as Lothar once said.



DAX “Reanimator” Series, Episode 1: Dynamic TopN Reports via Slicers


Guess how many articles are here on this blog.  Go ahead and think of a number, I’ll wait.

The answer, at time of writing, is 923.  Rob alone has published 715 articles!  And these date all the way back to 2009.

A lot of these articles are “old,” but folks, the DAX engine is still 99% the same today in Power BI (and Excel 2016) as it was when it first “hit the shelves” in Spring 2010.

The motivation behind this “Reanimator” series, then, is twofold:

  1. Help newer converts/readers rediscover some of the most-awesome techniques previously covered here (without being so lazy as to simply re-post them in their original form)
  2. “Refresh” those techniques for the brave new world of Power BI (since the vast majority of old articles were written when we only had Power Pivot)

What better way to do that than to re-create those workbooks in Power BI Desktop and embed the report directly… Within. This. Post! :)

A New Age of Self-Service BI Users

I’ve been fortunate enough to be given the honor of sharing with you, our community, all these wonderful posts written by many of our in-house industry experts – updated in all their glory into the wonderful world of Power BI.  Now you can click, slice, interact, touch (…dirty), and drill (dirtier!) with these reports to your heart’s desire.  Just as the BI gods intended them to be!  My hope is that these updates will introduce these tools to the growing number of self-service BI users just getting into the field who want to do AWESOME things with their reports.

Highlights From The Original Post(s)

So this update is actually a continuation of not just one… but TWO posts written by Rob in the distant past of 2012 (in technology years, that’s basically forever).  The two original posts were:

Dynamic TopN Reports Using PowerPivot V2!

Dynamic TopN Reports via Slicers, Part 2


Rob demos some pretty ingenious techniques using his (now-famous) disconnected slicers technique to control not only the Top N number you’d like to see on charts or graphs, but also the value that you want that Top N ranked on.  I’ve used it in MANY reports I’ve made over the years, always impressing the customers who used them.

Now I don’t want to give too much away in this post, instead directing you back to the walkthrough via the links above. I’m just here to whet your appetite enough with some fancy Power BI Reports, and if you want to learn the DAX code, hop into Rob’s posts.
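To give a feel for the pattern without spoiling Rob’s walkthroughs: the report harvests two user choices – how many items to show, and which value to rank by – and applies both to the same underlying data.  Here’s a loose sketch of that logic in plain Python (the data and names are made up; the real thing is DAX driven by disconnected slicer tables):

```python
# Rough sketch of the disconnected-slicer TopN idea, in plain Python.
# In the real reports, top_n and rank_by come from slicers; here they
# are just variables, and the data is invented for illustration.
sales = [
    {"customer": "A", "revenue": 500, "units": 50},
    {"customer": "B", "revenue": 900, "units": 10},
    {"customer": "C", "revenue": 300, "units": 80},
    {"customer": "D", "revenue": 700, "units": 20},
    {"customer": "E", "revenue": 100, "units": 60},
]

top_n = 3            # the "how many" slicer
rank_by = "revenue"  # the "rank on which value" slicer

top_rows = sorted(sales, key=lambda r: r[rank_by], reverse=True)[:top_n]
print([r["customer"] for r in top_rows])  # ['B', 'D', 'A']
```

Flip `rank_by` to `"units"` and the same “report” re-ranks itself – which is the whole trick.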


Isn’t Something Missing?

Some of our more avid blog readers may be thinking, “Wasn’t there a THIRD post about TopN filtering?”  Yes, in fact, there was.  It was written by guest contributor Colin Banfield and is called Dynamic TopN Reports via Slicers, Part 3.  It’s a fantastic post which covers ways to add BottomN metrics, Month/Year slicers, and more.  I chose not to use that workbook since I wanted to capture the core story from the original posts written by Rob.  If you’re so inclined, however, I recommend reading all three, as they will add real value to your DAX tool belt.  Until next time, P3 Nation!

Download the Files!

Download the PBIX files





Tech Tip Thursday: Better use of colors in Power BI

Microsoft’s Guy in a Cube has been providing tips and tricks for Power BI and Business Intelligence on his YouTube channel since 2014. Occasionally on Thursdays we highlight a different helpful video from his collection.

In this video, Guy in a Cube has a special guest presenter: regular Power BI webinar host Chuck Sterling! Chuck talks about color themes, one of his favorite new features. He looks at how you can quickly create your own themes to use within Power BI, or browse the Themes Gallery in the community to get pre-created themes from others.


Microsoft Power BI Blog | Microsoft Power BI

Explore your Office 365 Adoption Data in Power BI

Today we are pleased to announce the release of the Public Preview of the Office 365 Adoption Content Pack in Power BI.

The content pack combines the intelligence of the Office 365 usage reports that are available for you in the admin center with the interactive analysis capabilities of Power BI, providing rich usage and adoption insights. With these insights, admins can drive more targeted user training and communication that helps them transform how their organizations communicate and collaborate, enabling a truly modern workplace.

For more details on what the content pack has to offer, check this blog post by the Office 365 team.


Get started with the content pack

To connect to the content pack, you first need to enable it in the Office 365 admin center. On the Usage Reports page, you will see a new card at the bottom of the page where you can opt in to the content pack. This step kicks off a workflow that generates your historical usage trends. This data processing task takes between 2 and 48 hours, depending on the size of your organization and how long you’ve been using Office 365. After data preparation is complete, it’s ready to show in the content pack. Please follow the steps described to connect to the content pack for your organization.


Please note that you must be a global admin or a product admin (Exchange, Skype for Business, SharePoint) to connect to the content pack.

Learn more

You can find additional information about the content pack, including FAQs, in the Office 365 support articles.

If you have questions, please post them in the Adoption Content Pack group in the Microsoft Tech Community. Also, join us for an Ask Microsoft Anything (AMA) session, hosted by the Microsoft Tech Community on June 7, 2017 at 9 a.m. PDT. This live online event will give you the opportunity to connect with members of the product and engineering teams who will be on hand to answer your questions and listen to feedback. Add the event to your calendar and join us in the Adoption Content Pack in Power BI AMA group.

Let us know what you think!

Try the public preview of the Office 365 Adoption Content Pack in Power BI and provide feedback using the feedback link in the lower-right corner of the Usage Reports page in the admin center.


Webinar Wednesday: Dashboard in an Hour

If you’re new to Power BI, or want to introduce the rest of your team to ways they can go quickly from data to insight to action, you won’t want to miss Wednesday’s webinar!

Dashboard in an Hour with Reid Havens

Dashboard In An Hour is a high-level introduction to Power BI in just 60 minutes. This webinar will show you how to build a Power BI dashboard, including:

  • How Power BI can turn your unrelated sources of data into coherent, visually immersive, and interactive insights.
  • Ways to easily connect to all of your data sources, visualize (or discover) what’s important, and share that with anyone or everyone you want.
  • How to use Q&A to find the answers you and your organization need to make smart business decisions.

This is an excellent session for anyone who is new to Power BI!

Subscribe and join live!
When: 5/24/2017 10:00 a.m. PT

About Reid Havens

Reid Havens is the Founder & CEO of Havens Consulting Inc. He has a formal background in technology and organizational management, having obtained separate degrees in Organizational Development & Business Analytics. Reid has experience working with Fortune 500 companies such as Microsoft as well as with Non-Profit Organizations. He is also a contributing author and Principal Consultant at PowerPivotPro. Additionally he has moonlighted as an adjunct professor at Bellevue College and guest lectures at the University of Washington, teaching Business Intelligence, Reporting, & Design courses.


Reusing Datasets Imported to the Power BI Service

I’m a big fan of reusing Power BI datasets whenever possible to minimize redundant datasets floating around the organization. It’s less maintenance, less chance of calculation differences, and less data refreshes to schedule.

Info verified as of: May 20, 2017

In this post I’m referring to datasets which are imported into Power BI (thus requiring a data refresh to update the imported data). We’re already “reusing” a dataset which is in DirectQuery or SSAS Live Connection mode, so those are useful techniques too — just not applicable to this particular post.

To reuse an imported dataset, there are three options I’m aware of:

  1. Report in the Power BI Service. This refers to using the web interface for creating a new report separate from the original PBIX file.
  2. Analyze In Excel. This refers to creating an Excel report and can currently be used by anyone with read or edit access to the dataset. Hence, very useful for self-service BI purposes.
  3. Power BI Service Live Connection. This refers to creating a Power BI Desktop report. This option can currently only be used by people with edit permissions on the dataset (so not appropriate for broad self-service BI reporting at this time).

Report in the Power BI Service

In the Power BI Service, if you have edit permissions, the option to create a report is on the Actions menu. It will open up a blank report connected to the dataset:


When you save the new report, it will appear as another report although it shares the same exact underlying dataset. This is a great way to divide up reporting needs for different people, yet use the same exact dataset.

That was the simple one. Next we have…

Analyze In Excel

In the Power BI Service, if you have edit permissions, the option to use Analyze In Excel is on the Actions menu:


If you’re a read-only user, it’s not as prominently displayed.  However, it works the same.  You can find it through the “Related Content” pane:


The first time you’ll be prompted to download and install an Analysis Services OLEDB driver which handles connectivity back to the Power BI dataset in the Service:


The next thing to know is that the connection will be stored in a separate .ODC file (short for Office Data Connection). You’ll want to keep all of your .ODC files in a single location, and only have one .ODC file per connection (this makes it easy to change the data connection info later if you need to).

From there, you can create pivot tables, charts, etc like normal in Excel. The data connection properties inside of Excel will look like this:


If things don’t work, you might want to check that this option in the Power BI Admin portal hasn’t been turned off for Analyze In Excel (though this Admin portal setting is applicable only to datasets where the underlying data is SSAS in Live Connection mode):


More info about Analyze In Excel is covered in a separate post, with more details about requirements for Excel version, the need for a measure in the dataset, etc.

Note that you can’t publish an Analyze In Excel workbook back to the Power BI Service (because workbooks in Power BI are only supported if they have imported data in the workbook).  Maybe we’ll get this feature in the future, because having one place to publish would be very nice.

One more option which is similar but not quite…

Power BI Service Live Connection

For this one, we start inside of Power BI Desktop. As of the time I’m writing this, it’s still a preview feature which needs to be enabled first in the Options menu:


To get started, visit the Get Data menu. You can locate the “Power BI Service” option under Online Services:


Here’s where things differ a LOT from Analyze In Excel. 

If you are a read-only user, you’ll see your list of workspaces. However, you won’t see any datasets to choose from. That is because currently you are required to have edit privileges on the dataset in order to use this feature.

However, if you do have edit permissions, you can select the dataset. Then it will open a blank report with a connection back to the dataset:



Blog – SQL Chick

M/Power Query “Sets Up” DAX, so Learn DAX (and Modeling) First

Let’s say I have ten minutes with an “uninitiated” Excel pro – someone who slings VLOOKUP and Pivots all the time, but has no idea that Power Pivot and Power Query exist.  And in those ten minutes, I’ve got to quickly demonstrate the capabilities of “Modern Excel” or Power BI (where the former means “Excel with Power Pivot and Power Query”, and the latter contains both of those same technologies).


M “Sets Up” DAX.  And then DAX… Spikes Success All Over the Competition.
(If Will Ferrell ever made a volleyball movie, I could imagine him saying something like that)

I’m going to inevitably spend eight minutes on DAX, and then two minutes on Power Query at the end.  I favor that 80/20 split because I think that’s a realistic picture of your future as an Agile BI Pro:  80% DAX, 20% Power Query/M.

But for the uninitiated, Power Query is always the sexier demo.  Always.

Why is that, and how do I “square” that with my 80/20 ratio, and my stubborn insistence to not “lead” with the sexier demo?

It makes total sense to me, actually.  Traditional (pre-DAX, pre-M) Excel workflows have always involved a heavy dose of manual data munging.  Squashing a folder full of CSVs together into a single table, for instance, was just as much “a thing” in 2002 as it is now, fifteen years later.  If you can reduce that to a few clicks, AND then automate all subsequent “runs” of that work down to a single click, wow, that’s mind-blowing to longtime Excel users.

So the magic of Power Query is instantly apparent and tangible to basically any Excel Pro.  They can immediately see how PQ will save them oodles of time and anguish.

The benefits of DAX and relationships, by contrast, are less readily apparent at first glance.  Portable/re-usable formulas that enable rapid iteration, the answering of “emergent” questions in near real-time, as well as a “subdivide and segment” capability?  Or how about multi-data-table capabilities that provide an integrated and convenient view across many different sources of formerly-siloed data?  These concepts are simply alien to the longtime Excel user, even though they are MONSTERS in terms of their biz value (as well as time-savers and anguish-reducers).  None of the impact “lands” up front, because it can’t adequately be contemplated until you’ve started DOING it.

“Help me do what I already do, faster” is FAR easier to absorb than “blow the doors off my existing world, unlocking entirely-new workflows.”


Yeah, fair question – PQ does get used FIRST, prior to DAX and Relationships, in any XLSX or PBIX file you create.  So, um, why do I say learn it second?

Here’s a slide we use fairly often, in which we illustrate the relationship between Power Query (aka the M engine) and Power Pivot (aka the DAX/Relationships engine):


Power Query/M is a “Preprocessor/Cleaner” for Data Before it’s Fed to DAX

So let’s say that Power Query “feeds” the DAX / Power Pivot engine.

And Power Query can “make” just about anything (in terms of output data shapes) – let’s start out using cake, burgers, croissants, and pizza as a demonstration of its culinary range…


Well then…  what does the DAX engine like to eat?  Shouldn’t we get to know it better first?  What if it has dietary preferences, allergies, or other particular needs?


Sure, fine, but let’s ride it to its conclusion, if for no other reason than we employed a lot of clipart in the making of this post and clipart needs jobs, too.

Here’s the gist:  if DAX is where the transformative power truly lies (which is true), and it has strong preferences about the shapes of data you feed it (also true), we need to understand that before we can use Power Query / M properly.

And some of the things it likes/prefers might NOT be things we’d even think to “cook.”  So, we need to get a feel for what DAX likes to eat.

Am I telling you to avoid M, then?  No, nothing so extreme.  What I’m really cautioning about are the dangers of an insidious disease, Queryus Infatuationus, that tends to strike adopters of Modern Excel and Power BI in their early days of usage.

In short, the malady is this:  “look, I’ve got this cool new M hammer!  It’s the answer to my dreams and clearly I should use it for everything!”

I’ve seen M used extensively as an analysis tool, for instance, and that is Not What You Want to Do.  Using M to pre-aggregate and analyze data (instead of DAX), is akin to using SQL for that same purpose (also instead of DAX), and I’ve written about the latter before.  Short version:  you sacrifice all those lovely benefits of portable formulas I mentioned at the beginning of this article (iteration, answering emerging questions, subdivide and segment, etc.)

Bottom line:  in the early going you are going to be learning both (unless all your data lives in databases, and the admins of said databases are willing to make changes for you – in which case you may not need Power Query / M very much).

I’m suggesting, specifically, that you should advance to Intermediate skill in DAX before advancing to Intermediate skill in Power Query / M, and then to Mastery of DAX before Mastery of M.

Another take on this:  if you’re learning to write M scripts from scratch (as opposed to getting by with the toolbar buttons and the occasional google search for a snippet of M), and you haven’t yet conquered CALCULATE, ALL, and FILTER, you’re probably getting too deep into M at the expense of your DAX skill.
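For readers not there yet, a rough sense of what those three functions buy you: they let a measure manipulate the filters it computes under – for example, ignoring the current category to get a grand total for a share-of-total calculation, along the lines of DIVIDE([Sales], CALCULATE([Sales], ALL(Products))).  A loose plain-Python analogy (the data is made up; in DAX this is one portable measure):

```python
# Loose analogy for what CALCULATE + ALL buy you in DAX: compute a
# grand total that ignores the current filter (here, the category),
# so each group can be expressed as a share of everything.
# The data below is invented for illustration.
sales = [
    ("Bikes", 400), ("Bikes", 200), ("Helmets", 300), ("Helmets", 100),
]

grand_total = sum(amount for _, amount in sales)  # "ALL": filters removed

pct_of_total = {}
for category, amount in sales:                    # the "current filter context"
    pct_of_total[category] = pct_of_total.get(category, 0) + amount
for category in pct_of_total:
    pct_of_total[category] /= grand_total

print(pct_of_total)  # {'Bikes': 0.6, 'Helmets': 0.4}
```

In DAX, that one measure then works unchanged in every pivot, chart, and slicer combination – which is the portability the post keeps hammering on.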

I think it’s only honest at this point to say that “getting by with the toolbar buttons and the occasional google search for a snippet of M” is exactly where *I* am at personally with Power Query / M.

On one hand, you may read that and say “oh great, he’s just advising us to be like him so he doesn’t feel bad.”  Even I wondered about that a bit, while writing this article – I was on the lookout for my own confirmation bias because my inner critic doesn’t stay inner – he sits on my shoulder while I write.  And I’m very much aware that at my own company, I rank in the bottom 10 percentile when it comes to M.

But on the other hand, I’m very confident in my assertion that M is not the ideal producer of final results – you don’t want to feed M output directly into visualization layers.  The bar charts at the top of this article paint a proper picture – you don’t get any of the Ten Things Data Can Do For You until you learn how to wield DAX proficiently.  So you absolutely are sacrificing TREMENDOUS capabilities if you leave DAX out of your equation – even if “leaving it out” simply means “not fully leveraging it.”  And if you ARE fully leveraging it, it has a “say” in what you should be doing (and learning!) with M.

Go get ‘em.



What’s new in SQL Server 2017 CTP 2.1 for Analysis Services

The public CTP 2.1 of SQL Server 2017 is available here! This public preview includes the following enhancements for Analysis Services tabular models.

  • Shared M expressions are shown in the SSDT Tabular Model Explorer, and can be maintained using the Query Editor.
  • Dynamic Management View (DMV) improvements.
  • Opening a file with the .MSDAX extension in SSDT enables non-model-related DAX IntelliSense.
  • Encoding hints can be set in the SSDT properties window.

Shared M expressions

Shared M expressions are shown in the Tabular Model Explorer in SSDT! By right clicking the Expressions node, you can edit the expressions in the Query Editor. This should seem familiar to Power BI Desktop users.

[Screenshot: shared M expressions in the Query Editor]

DMV improvements


M dependencies are included in DISCOVER_CALC_DEPENDENCY for CTP 2.1. As communicated in the CTP 2.0 blog post, DISCOVER_CALC_DEPENDENCY now works with 1200 models.

The following query returns the output shown below. M expressions and structured data sources are included for 1400 models.
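The query from the original screenshot isn’t reproduced here, but a typical query against this DMV – run from an MDX or DAX query window in SSMS connected to the tabular instance – looks like this:

```sql
SELECT * FROM $SYSTEM.DISCOVER_CALC_DEPENDENCY
```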


[Screenshot: DISCOVER_CALC_DEPENDENCY output]

The output is a superset of the information shown by the Query Dependencies visual, which is available in SSDT from the Query Editor.

[Screenshot: Query Dependencies visual]

This information is useful for numerous scenarios including the following.

  • Documentation of tabular models.
  • Community tools such as BISM Normalizer that perform incremental metadata deployment and merging, as well as other 3rd party tools, can use it for impact analysis.


CTP 2.1 provides a fix for MDSCHEMA_MEASUREGROUP_DIMENSIONS. This DMV is used by various client tools to show measure dimensionality. For example, the Explore feature in Excel Pivot Tables allows the user to cross-drill to dimensions related to the selected measures.

[Screenshot: the Explore feature in Excel]

Prior to CTP 2.1, some rows were missing in the output for 1200 models, which meant the Explore feature did not work correctly. This is fixed in CTP 2.1.

We intend to make further DMV improvements in forthcoming releases, so stay tuned.

DAX file editing

Opening a file with the .MSDAX extension in SSDT allows DAX editing with non-model-related IntelliSense such as highlighting, statement completion, and parameter info. As you can imagine, we intend to use this for interesting features to be released in the future!

[Screenshot: DAX file editing in SSDT]

Encoding hints

As documented in this blog post, encoding hints are an advanced feature introduced in CTP 1.3. They can help optimize processing (data refresh) of large in-memory tabular models. In CTP 2.1, encoding hints can be set in the SSDT Visual Studio properties window.

[Screenshot: encoding hints in the properties window]
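Behind the scenes, the hint is stored on the column in the model’s JSON metadata. A sketch of what that fragment might look like in a 1400 .bim file (the column name here is a hypothetical example; the hint values are Value and Hash):

```json
{
  "name": "SalesAmount",
  "dataType": "decimal",
  "sourceColumn": "SalesAmount",
  "encodingHint": "Value"
}
```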

Download now!

To get started, download SQL Server 2017 CTP 2.1. The May 2017 release of the Analysis Services VSIX for SSDT is available here. VSIX deployment for Visual Studio 2017 is discussed in this blog post.

Be sure to keep an eye on this blog to stay up to date on Analysis Services!

Let’s block ads! (Why?)

Analysis Services Team Blog

Power Query (M)agic Part 1: Always Have Good References

Data IS Life…and “Life, Uh, Finds A Way.”

P3 friends…for those of you in the know (and for those who aren’t), Power Query (or PQ for short) is one of the cornerstones of the Microsoft BI Suite. It’s a tool built into Excel 2016 and Power BI Desktop that allows us to extract our data (78 data connectors and GROWING), transform our data (it’s more than meets the eye), and finally load our data into a Data Model. Wait…so PQ extracts, transforms, and loads data? Those words spell out ETL as an initialism!


Anyone familiar with SQL Server Analysis Services (SSAS) or SQL Server Integration Services (SSIS) might have caught that. I carefully chose to describe PQ that way because in many ways it reproduces what was historically done with those tools. While PQ is not a complete replacement for them, it does allow you to transform or clean your data in 99% of the scenarios you’ll encounter.

For a more in-depth post about what Power Query is and when to use it I’d recommend reading this week’s post by Rob. It’s called M/Power Query “Sets Up” DAX, so Learn DAX (and Modeling) First and it does a fantastic job of breaking down when and where transformations should be applied, and when to use DAX. Matt Allington also wrote up a great technical post titled Power Query as an SSIS Alternative that gives a great (and DEEP!) dive into some great ways to leverage Power Query in conjunction with Power Update. Between these two posts you’ll get a pretty good grasp of how much of a heavy hitter Power Query can be!

Knowledge is a Terrible Thing to Waste

Now that we have a basic grasp of what Power Query is, I’d like to impart some knowledge to you. Specifically I’d love to introduce you to a new post series called Power Query (M)agic. The goal of this series is to share some of the wonderful best practices that we at P3 have acquired over the years. These practices have either been self-taught or graciously passed down from other BI Ninjas. They can be implemented in nearly any Excel or Power BI report. The best part is MOST of these practices won’t require you to be a Power Query expert, or even an advanced user!

As long as you have a BASIC UNDERSTANDING of Power Query you’ll be able to implement these yourself. With that being said, let’s dive into today’s technique, shall we?

Technique #1 – One Data Source To Rule Them All

So I LOVED discovering this technique! I’ve seen the Reference button in Power Query for a while…but honestly I never took the time to figure out how to fully utilize it. Avi Singh is owed credit for showing me how to fully use this feature. So the idea behind this Reference technique is to create a BASE QUERY that acts as the singular data source for that connection to all other queries.

Hold on Reid, don’t ALL PQ queries already have a source?? Well YES, technically whenever you create a new PQ query, the first step shown IS called Source, and it does point to your data. However, let’s imagine a scenario where you’ve built out a report that has 20+ PQ queries in it, EACH with their own INDIVIDUAL data source. Now if multiple PQ queries point to a data source that changed location or got updated…you’d have to update every single query that references it. Not ideal in my book!

This is where the Reference feature comes in. Reference lets you create a new query that uses the original query as its data source – and you can do this in any number of additional queries. So if the data connection ever needs updating, you only have to update ONE QUERY! Have I lost any of you yet? Don’t worry, I’ll paint you a picture with words (well…screenshots) to help explain this better.

Traditional list of PQ queries, each with its own separate data source:

[Screenshot: queries, each with its own data source]

New PQ source query, referenced by SEVEN other queries:

[Screenshot: one source query referenced by seven other queries]

Does that first image of queries look familiar? This is a typical setup in most report models…just a long list of queries. The problem (as stated earlier) is that they all have individual sources, many of them pointing to the same connection or location! The solution will look something like the second image: a single source query that is referenced by any other query that requires that shared connection. It’s relatively straightforward to implement, so “come with me if you want to learn” and I’ll explain how it’s done.

Step 1 – Create a new PQ query to ANY table in the database you’re wanting to reference:

[Screenshot: Step 1]

Step 2 – DELETE the navigation step, and rename your query. Now you have a base connection to any table in the database:

[Screenshot: Step 2]

Step 3 – Right click the query and select Reference to create a new query with this as the data source:

[Screenshot: Step 3]

Step 4 – Navigate to the table you want within the DB connection, and rename the query (DONE!):

[Screenshot: Step 4]
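For the curious, the M behind the two resulting queries looks roughly like this (the server, database, and table names are hypothetical placeholders):

```m
// Base query, named "SQL Connection" – the navigation step has been
// deleted, so it holds nothing but the connection itself
let
    Source = Sql.Database("MyServer", "MyDatabase")
in
    Source

// Referencing query, named "DimCustomer" – starts from the base query
// and navigates to the table it needs
let
    Source = #"SQL Connection",
    DimCustomer = Source{[Schema = "dbo", Item = "DimCustomer"]}[Data]
in
    DimCustomer
```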

Simple right? And NOT a single line of code required! These steps can be repeated for as many tables or views as you have in that database. But wait…what if there’s a report that’s already been built, and we want to update existing PQ queries? Well there’s a solution for this too! For this we will technically have to write some M code, but it’ll be painless, I promise. All it requires is a single line of code that overrides the Source step of your existing query.

Go to the source step of the existing PQ query you wish to update, and replace the SQL Connection with the connection query name:

[Screenshot: replacing the Source step]

When typing in the name, just make sure to include a hash symbol and quote (#") before the PQ query name, and a closing quote (") following it. If you’re ever having trouble figuring out how to write this Source step, you can create a new Reference query and copy the code from there, then delete that reference query. Honestly, one of the great things about this technique is how similar the steps are for any data source. The majority of the steps I’m outlining here are transferrable! Well, that’s it for today P3 Nation. Stay tuned for the next part in this series – until then!
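In other words, the edit to an existing query’s Source step looks something like this (connection details are hypothetical):

```m
// Before: the query carries its own hard-coded connection
Source = Sql.Database("MyServer", "MyDatabase")

// After: the query points at the shared base query instead
Source = #"SQL Connection"
```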



Power BI Report Server preview now available

On May 3, Microsoft announced Power BI Premium, a capacity-based licensing model that increases flexibility for how users access, share and distribute content. The new offering also introduces the ability to manage Power BI reports on-premises with the included Power BI Report Server.

Today, we’re excited to make available a preview of Power BI Report Server.

Download Power BI Report Server (Preview)

With Power BI Desktop and Power BI Report Server, you can:

1. Create beautiful, interactive reports using Power BI Desktop

2. Publish reports to Power BI Report Server

3. View and interact with reports in your web browser or in Power BI Mobile on your phone or tablet


At launch, Power BI Report Server will support most Power BI report features, including but not limited to:

  • Create reports in Power BI Desktop
  • Connect to Analysis Services data models (Tabular or Multidimensional)
  • Visualize data using built-in or custom visuals
  • View and interact with reports in your web browser
  • Export report data to CSV
  • Print a report page
  • View and interact with reports in Power BI Mobile

With this first release, you connect directly to an Analysis Services data model, which in turn can connect to a variety of other data sources, including SQL Server, Oracle, Teradata, and more. We aim to extend direct connectivity to the other Power BI Desktop data sources later this year.

In addition to its self-service BI capabilities, Power BI Report Server includes the enterprise reporting capabilities of SQL Server Reporting Services (SSRS), so you can generate the precisely-formatted reports your business needs.

Power BI Report Server will be generally available along with Power BI Premium late in the second quarter of 2017.

Try it now


Microsoft Power BI Blog | Microsoft Power BI