Category Archives: Self-Service BI

Deploying Solutions from the Cortana Intelligence Gallery

The Gallery is a community site. Many of the contributions are from Microsoft directly. Individual community members can make contributions to the Gallery as well.

The “Solutions” are particularly interesting. Let’s say you’ve searched and found Data Warehousing and Modern BI on Azure:

Deploying a Solution from the Gallery

What makes these solutions pretty appealing is the “Deploy” button. They’re packaged up to deploy all (or most) of the components into your Azure environment. I admit I’d like to see some fine-tuning of this deployment process as it progresses through the public preview. Here’s a quick rundown of what to expect.

1|Create new deployment:

The most important thing in step 1 above is that your deployment name ends up being your resource group. The resource group is created as soon as you click the Create button (so if you change your mind on naming, you’ll have to go manually delete the RG). Also note that you’re only allowed 9 characters, which makes it hard to implement a good naming convention. (Have I ever mentioned how fond I am of naming conventions?!?)

Resource groups are an incredibly important concept in Azure. They are a way to logically organize related resources which (usually) have the same lifecycle and are managed together. All items within a single resource group are included in an ARM template. Resource groups can serve as a boundary for security/permissions at the RG level, and can be used to track the cost of a solution. So, it’s extremely important to plan out resource group structure in your real environment. In our situation here, having all of these related resources for testing/learning purposes is perfect.
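If you change your mind on a deployment and want to abandon it, deleting the resource group removes everything at once. A quick sketch using the AzureRM PowerShell module (the resource group name here is hypothetical; sign in with Login-AzureRmAccount first):

```powershell
# Inspect the resource group the deployment created (name is hypothetical)
Get-AzureRmResourceGroup -Name "mydeploy1"

# Abandon the deployment: deleting the resource group permanently
# deletes every resource inside it
Remove-AzureRmResourceGroup -Name "mydeploy1" -Force
```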

2|Provide configuration parameters:

In step 2 above, the only thing we need to specify is a user and password. This will be the server admin for both Azure SQL Database and Azure SQL Data Warehouse which are provisioned. It will use SQL authentication.

As soon as you hit the Next button, the solution begins provisioning.

3|Resource provisioning (automated):

In step 3 above we see the progress. Depending on the type of resource, it may take a little while.

4|Done:

When provisioning is complete, as shown in step 4 above (partial screenshot), you get a list of what was created and instructions for follow-up steps. For instance, in this solution our next steps are to go and create an Azure Service Principal and then create the Azure Analysis Services model (via PowerShell script saved in an Azure runbook provided by the solution).

They also send an e-mail to confirm the deployment:

If we pop over to the Azure portal and review what was provisioned so far, we see the following:

We had no options along the way for selecting names for resources, so we have a lot of auto-generated suffixes for our resource names. This is ok for purely learning scenarios, but not my preference if we’re starting a true project with a pre-configured solution. Following an existing naming convention is impossible with solutions (at this point anyway). A wish list item I have is for the solution deployment UI to display the proposed names for each resource and let us alter if desired before the provisioning begins.

The deployment also doesn’t prompt for which subscription to deploy to (if you have multiple subscriptions like I do). The deployment did go to the subscription I wanted; however, it would be really nice to have that as a selection to make sure it’s not just luck.

We aren’t prompted to select scale levels during deployment. From what I can tell, it chooses the lowest possible scale (I noted that the SQL Data Warehouse was provisioned with 100 DWUs, and the SQLDB had 5 DTUs).

To minimize cost, don’t forget to pause what you can (such as the SQL Data Warehouse) when you’re not using it. The HDInsight piece of this will be the most expensive, and it cannot be paused, so you might want to learn and experiment with that first, then de-provision HDInsight in order to save on cost. If you’re done with the whole solution, you can just delete the resource group (in which case all resources within it are deleted permanently).
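For example, pausing and resuming the SQL Data Warehouse from the AzureRM PowerShell module looks roughly like this (the resource group, server, and database names are hypothetical):

```powershell
# Pause the SQL Data Warehouse to stop compute charges (storage is still billed)
Suspend-AzureRmSqlDatabase -ResourceGroupName "mydeploy1" `
    -ServerName "myserver-abc123" -DatabaseName "mydwdb"

# Resume it when you're ready to work with it again
Resume-AzureRmSqlDatabase -ResourceGroupName "mydeploy1" `
    -ServerName "myserver-abc123" -DatabaseName "mydwdb"
```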

Referring to Documentation for Deployed Solutions

You can find each of your deployed solutions here: https://start.cortanaintelligence.com/Deployments

From this view, you can refer back to the documentation for a solution deployment (which is the same info presented in Step 4 when it was finished provisioning).

You can also ‘Clean up Deployments’ which is a nice feature. The clean up operation first deletes each individual resource, then it deletes the resource group:

Blog – SQL Chick

Introducing the Quick Measures Gallery

In the April update of Power BI Desktop, we released a powerful feature called ‘quick measures’ that helps create DAX measures based on templates for common patterns and expressions. Thanks to everyone who tried the feature out, gave us feedback, and even created ideas on our forum with new suggestions! We’re continuing to build more of these templates to address different scenarios, but we’re also looking for help creating more measures to share with everyone.

You can submit your common calculations that would be helpful for the rest of the community on our new Quick Measures Gallery. We’ll evaluate the DAX statement and may use it as the basis of a future quick measure – you can even get your name immortalized in Power BI Desktop!

You can see instructions for how to write the best submissions, including a template to use, on the Community forum.

Microsoft Power BI Blog | Microsoft Power BI

Set size for multiple visualizations in #PowerBI at the same time

April 23, 2017 / Erik Svensen

When designing your reports in Power BI Desktop, you probably spend a lot of time making sure your visualizations are aligned and, at least for some of them, that they have the same size.

So far, we only have the align feature in Power BI Desktop:

To change the size of the visualizations, we must use the General properties under Format to resize the elements:

But what if you want to resize more than one element at a time? If you select more than one, you get the size of the first selection in the General tab:

Now here is the trick – modify the Width and the Height by 1 each:

And then back again

And your visualizations have the same size.

Note – this only works when you select visualizations of the same type. If you select different types, you won’t be able to see General under Format.

Hope this can help you too.

New Get Data Capabilities in the GA Release of SSDT Tabular 17.0 (April 2017)

With the General Availability (GA) release of SSDT 17.0, the modern Get Data experience in Analysis Service Tabular projects comes with several exciting improvements, including DirectQuery support (see the blog article “Introducing DirectQuery Support for Tabular 1400”), additional data sources (particularly file-based), and support for data access options that control how the mashup engine handles privacy levels, redirects, and null values. Moreover, the GA release coincides with the CTP 2.0 release of SQL Server 2017, so the modern Get Data experience benefits from significant performance improvements when importing data. Thanks to the tireless effort of the Mashup engine team, data import performance over structured data sources is now on par with legacy provider data sources. Internal testing shows that importing data from a SQL Server database through the Mashup engine is in fact faster than importing the same data by using SQL Server Native Client directly!

Last month, the blog article “What makes a Data Source a Data Source?” previewed context expressions for structured data sources—and the file-based data sources that SSDT Tabular 17.0 GA adds to the portfolio of available data sources make use of context expressions to define a generic file-based source as an Access Database, an Excel workbook, or as a CSV, XML, or JSON file. The following screenshot shows a structured data source with a context expression that SSDT Tabular created for importing an XML file.

[Screenshot: a structured data source with a context expression for importing an XML file]

Note that file-based data sources are still a work in progress. Specifically, the Navigator window that Power BI Desktop shows for importing multiple tables from a source is not yet enabled so you end up immediately in the Query Editor in SSDT. This is not ideal because it makes it hard to import multiple tables. A forthcoming SSDT release is going to address this issue. Also, when trying to import from an Access database, note that SSDT Tabular in Integrated Workspace mode would require both the 32-bit and 64-bit ACE provider, but both cannot be installed on the same computer. This issue requires you to use a remote workspace server running SQL Server 2017 CTP 2.0, so that you can install the 32-bit driver on the SSDT workstation and the 64-bit driver on the server running Analysis Services CTP 2.0.


Keep in mind that SSDT Tabular 17.0 GA uses the Analysis Services CTP 2.0 database schema for Tabular 1400 models. This schema is incompatible with CTPs of SQL vNext Analysis Services. You cannot open Tabular 1400 models with previous schemas and you cannot deploy Tabular 1400 models with a CTP 2.0 database schema to a server running a previous CTP version.


Another great data source that you can find for the first time in SSDT Tabular is Azure Blob Storage, which will be particularly interesting when Azure Analysis Services provides support for the 1400 compatibility level. When connecting to Azure Blob Storage, make sure you provide the account name or URL without any containers in the data source definition, such as https://myblobdata.blob.core.windows.net. If you appended a container name to the URL, SSDT Tabular would fail to generate the full set of data source settings. Instead, select the desired container in the Navigator window, as illustrated in the following screenshot.

[Screenshot: selecting a container from an Azure Blob Storage source in the Navigator window]

As mentioned above, SSDT Tabular 17.0 GA uses the Analysis Services CTP 2.0 database schema for Tabular 1400 models. This database schema is more complete than any previous schema version. Specifically, you can find additional Data Access Options in the Properties window when selecting the Model.bim file in Solution Explorer (see the following screenshot). These data access options correspond to those options in Power BI Desktop that are applicable to Tabular 1400 models hosted on an Analysis Services server, including:

  • Enable Fast Combine (default is false): when set to true, the mashup engine will ignore data source privacy levels when combining data.
  • Enable Legacy Redirects (default is false): when set to true, the mashup engine will follow HTTP redirects that are potentially insecure (for example, a redirect from an HTTPS to an HTTP URI).
  • Return Error Values as Null (default is false): when set to true, cell-level errors will be returned as null. When false, an exception will be raised if a cell contains an error.
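In the Model.bim file itself, these options appear as a dataAccessOptions object on the model. A sketch of what that might look like (the exact property names here are from memory, so treat this as an approximation rather than the definitive schema):

```json
"model": {
  "dataAccessOptions": {
    "fastCombine": true,
    "legacyRedirects": false,
    "returnErrorValuesAsNull": false
  }
}
```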

[Screenshot: Data Access Options in the Properties window for the Model.bim file]

With the Enable Fast Combine setting in particular, you can now begin to refer to multiple data sources in a single source query.
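With Fast Combine enabled, a single source query can, for instance, join a SQL Server table to a CSV file. The server, database, table, and file names in this M sketch are hypothetical:

```m
let
    // Structured SQL Server source (hypothetical server/database)
    SalesDb = Sql.Database("myserver", "SalesDB"),
    Sales = SalesDb{[Schema = "dbo", Item = "Sales"]}[Data],
    // File-based source (hypothetical path)
    Rates = Table.PromoteHeaders(
        Csv.Document(File.Contents("C:\Data\rates.csv"), [Delimiter = ","])),
    // Combining the two sources requires Fast Combine
    // (or compatible privacy levels)
    Merged = Table.NestedJoin(Sales, {"CurrencyKey"}, Rates, {"CurrencyKey"},
        "Rate", JoinKind.LeftOuter)
in
    Merged
```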

Yet another great feature that is now available to you in SSDT Tabular is the Add Column from Example capability introduced with the April 2017 Update of Power BI Desktop. For details, refer to the article “Add a column from an example in Power BI Desktop.” The steps are practically identical. Add Column from Example is a great illustration of how the close collaboration and teamwork between the AS engine, Mashup engine, Power BI Desktop, and SSDT Tabular teams is compounding the value delivered to our customers.

Looking ahead, apart from tying up loose ends, such as the Navigator dialog for file-based sources, there is still a sizeable list of data sources we are going to add in further SSDT releases. Named expressions discussed in this blog article a while ago also still need to find their way into SSDT Tabular, and there are other things such as support for the full set of impersonation options that Analysis Services provides for data sources that can use Windows authentication. Currently, only service account and explicit Windows credentials can be used. Forthcoming impersonation options include current user and unattended accounts.

In short, the work to enable the modern Get Data experience in SSDT Tabular is not yet finished. Even though SSDT Tabular 17.0 GA is fully supported in production environments, Tabular 1400 is still evolving. The database schema is considered complete with CTP 2.0, but minor changes might still be coming. So please go ahead and deploy SSDT Tabular 17.0 GA, use it to work with your Tabular 1200 models, and take Tabular 1400 for a thorough test drive. And as always, please send us your feedback and suggestions by using ProBIToolsFeedback or SSASPrev at Microsoft.com. Or use any other available communication channels such as UserVoice or MSDN forums. Influence the evolution of the Analysis Services connectivity stack to the benefit of all our customers!

Analysis Services Team Blog

Introducing a DAX Editor Tool Window for SSDT Tabular

The April 2017 release of SSDT Tabular for Visual Studio 2015 and 2017 comes with a DAX editor tool window that can be considered a complement to or replacement for the formula bar. You can find it on the View menu under Other Windows, and then select DAX Editor, as the following screenshot illustrates. You can dock this tool window anywhere in Visual Studio. If you select a measure in the Measure Grid, DAX Editor lets you edit the formula conveniently. You can also right-click on a measure in Tabular Model Explorer and select Edit Formula. Authoring new measures is as easy as typing a new formula in DAX Editor and clicking Apply. Of course, DAX Editor also lets you edit the expressions for calculated columns.
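For example, typing a measure such as the following into the DAX Editor and clicking Apply adds it to the model. The table and column names here are borrowed from the Adventure Works sample purely for illustration:

```dax
Average of Base Rate := AVERAGE ( Employee[Base Rate] )
```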

[Screenshot: the DAX Editor tool window docked in Visual Studio]

SSDT Tabular also displays the DAX Editor when defining Detail Rows expressions, which is an improvement over previous releases of SSDT Tabular that merely let you paste an expression into the corresponding textbox in the Properties windows, as the following screenshot illustrates. When working with measures, calculated columns, and the detail rows expression properties, note that there is only one DAX Editor tool window instance, so the DAX Editor switches to the expression you currently want to edit.

[Screenshot: the DAX Editor used for a Detail Rows expression]

The DAX Editor tool window is a continuous improvement project. We have plans to include features such as code formatting and additional IntelliSense capabilities. Of course, we are also looking forward to hearing from you. So please send us your feedback and suggestions via ProBIToolsFeedback or SSASPrev at Microsoft.com, and report any issues you encounter. Or use any other available communication channels such as UserVoice or MSDN forums. You can influence the evolution of SSDT Tabular to the benefit of all our customers.

Analysis Services Team Blog

What’s new in SQL Server 2017 CTP 2.0 for Analysis Services

The public CTP 2.0 of SQL Server 2017 on Windows is available here! This public preview includes the following enhancements for Analysis Services tabular.

  • Object-level security to secure model metadata in addition to data.
  • Transaction-performance improvements for a more responsive developer experience.
  • Dynamic Management View improvements for 1200 and 1400 models enabling dependency analysis and reporting.
  • Improvements to the authoring experience of detail rows expressions.
  • Hierarchy and column reuse to be surfaced in more helpful locations in the Power BI field list.
  • Date relationships to easily create relationships to date dimensions based on date columns.
  • Default installation option for Analysis Services is tabular, not multidimensional.

Other enhancements not covered by this post include the following.

  • New Power Query data sources. See this post for more info.
  • DAX Editor for SSDT. See this post for more info.
  • Existing Direct Query data sources support for M expressions. See this post for more info.
  • SSMS improvements, such as viewing, editing, and scripting support for structured data sources.

Incompatibility with previous CTP versions

Tabular models with 1400 compatibility level that were created with previous versions are incompatible with CTP 2.0. They do not work correctly with the latest tools. Please download and install the April 2017 (17.0 GA) release of SSDT and SSMS.

Object-level security

Roles in tabular models already support a granular list of permissions, and row-level filters to help protect sensitive data. Further information is available here. CTP 1.1 introduced table-level security.

CTP 2.0 builds on this by introducing column-level security, which allows sensitive columns to be protected. This helps prevent a malicious user from discovering that such a column exists.

Column-level and table-level security are collectively referred to as object-level security (OLS).

The current version requires that column-level security is set using the JSON-based metadata, Tabular Model Scripting Language (TMSL), or Tabular Object Model (TOM). We plan to deliver SSDT support soon. The following snippet of JSON-based metadata from the Model.bim file secures the Base Rate column in the Employee table of the Adventure Works sample tabular model by setting the MetadataPermission property of the ColumnPermission class to None.

"roles": [
  {
    "name": "Users",
    "description": "All allowed users to query the model",
    "modelPermission": "read",
    "tablePermissions": [
      {
        "name": "Employee",
        "columnPermissions": [
          {
            "name": "Base Rate",
            "metadataPermission": "none"
          }
        ]
      }
    ]
  }
]
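Until SSDT support arrives, one way to apply such a role change to a deployed model is a TMSL createOrReplace command executed from an XMLA query window in SSMS. A sketch against a hypothetical database name:

```json
{
  "createOrReplace": {
    "object": { "database": "AdventureWorks", "role": "Users" },
    "role": {
      "name": "Users",
      "modelPermission": "read",
      "tablePermissions": [
        {
          "name": "Employee",
          "columnPermissions": [
            { "name": "Base Rate", "metadataPermission": "none" }
          ]
        }
      ]
    }
  }
}
```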

DAX query references to secured objects

If the current user is a member only of the Users role, the following query that explicitly refers to the [Base Rate] column fails with an error message saying the column cannot be found or may not be used.

EVALUATE
SELECTCOLUMNS(
    Employee,
    "Id", Employee[Employee Id],
    "Name", Employee[Full Name],
    "Base Rate", Employee[Base Rate] --Secured column
)

The following query refers to a measure that is defined in the model. The measure formula refers to the Base Rate column. It also fails with an equivalent error message. Model measures that refer to secured tables or columns are indirectly secured from queries.

EVALUATE
{ [Average of Base Rate] } --Indirectly secured measure

As you would expect, IntelliSense for DAX queries in SSMS also honors column-level security and does not disclose secured column names to unauthorized users.

Detail-rows expression references to secured objects

It is anticipated that the SELECTCOLUMNS() function will be commonly used for detail-rows expressions. Due to this, SELECTCOLUMNS() is subject to special behavior when used by DAX expressions in the model. The following detail-rows expression defined on the [Reseller Total Sales] measure does not return an error when invoked by a user without access to the [Base Rate] column. Instead it returns a table with the [Base Rate] column excluded.

--Detail rows expression for [Reseller Total Sales] measure
SELECTCOLUMNS(
    Employee,
    "Id", Employee[Employee Id],
    "Name", Employee[Full Name],
    "Base Rate", Employee[Base Rate] --Secured column
)

The following query returns the output shown below – with the [Base Rate] column excluded from the output – instead of returning an error.

EVALUATE
DETAILROWS([Reseller Total Sales])

[Screenshot: query output with the Base Rate column excluded]

However, derivation of a scalar value using a secured column fails on invocation of the detail-rows expression.

--Detail rows expression for [Reseller Total Sales] measure
SELECTCOLUMNS(
    Employee,
    "Id", Employee[Employee Id],
    "Name", Employee[Full Name],
    "Base Rate", Employee[Base Rate] * 1.1 --Secured column
)

Limitations of RLS and OLS combined from different roles

OLS and RLS are additive; conceptually they grant access rather than deny access. This means that combined membership from different roles that specify RLS and OLS could inadvertently cause security leaks. Hence combined RLS and OLS from different roles is not permitted.

RLS additive membership

Consider the following roles and row filters.

  • RoleA: model permission Read; RLS filter on Geography: Geography[Country Region Name] = “United Kingdom”
  • RoleB: model permission Read; RLS filter on Geography: Geography[Country Region Name] = “United States”

Users who are members of both RoleA and RoleB can see data for the UK and the US.

OLS additive membership

A similar concept applies to OLS. Consider the following roles.

  • RoleA: model permission Read; OLS column permission on Employee: [Base Rate], MetadataPermission=None
  • RoleB: model permission Read; no OLS restrictions

RoleB allows access to all tables and columns in the model. Therefore, users who are members of both RoleA and RoleB can query the [Base Rate] column.
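In JSON-based metadata, the two roles might look like the following; RoleB simply omits tablePermissions, so it restricts nothing:

```json
"roles": [
  {
    "name": "RoleA",
    "modelPermission": "read",
    "tablePermissions": [
      {
        "name": "Employee",
        "columnPermissions": [
          { "name": "Base Rate", "metadataPermission": "none" }
        ]
      }
    ]
  },
  { "name": "RoleB", "modelPermission": "read" }
]
```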

RLS and OLS combined from different roles

Consider the following roles that combine RLS and OLS.

  • RoleA: purpose – provide access to sales in the UK by customer (not product); model permission Read; RLS filter on Geography: Geography[Country Region Name] = “United Kingdom”; OLS table permission on Product: MetadataPermission=None
  • RoleB: purpose – provide access to sales in the US by product (not customer); model permission Read; RLS filter on Geography: Geography[Country Region Name] = “United States”; OLS table permission on Customer: MetadataPermission=None

The following diagram shows the intersection of the tables and rows relevant to this discussion.

[Diagram: Geography rows (UK and US) intersected with the Product and Customer tables, forming four quadrants]

RoleA is intended to expose data only for the top right quadrant.

RoleB is intended to expose data only for the bottom left quadrant.

Given the additive nature of OLS and RLS, Analysis Services would be allowing access to all 4 quadrants by combining these permissions for users who are members of both roles. Data would be exposed that neither role had the intention of exposing. For this reason, queries for users who are granted RLS and OLS permissions combined from different roles fail with an error message stating that the combination of active roles results in dynamic security configuration that is not supported.

Transaction-performance improvements

SSDT updates the workspace database during the development process. Optimized transaction management in CTP 2.0 is expected to result in a more responsive developer experience due to faster metadata updates to the workspace database.

DMV improvements

DISCOVER_CALC_DEPENDENCY is back! This Dynamic Management View (DMV) is useful for tracking and documenting dependencies between calculations and other objects in a tabular model. In previous versions, it worked for tabular models with compatibility level of 1100 and 1103, but it did not work for 1200 models. In CTP 2.0, it works for all tabular compatibility levels including 1200 and 1400.

The following query shows how to use the DISCOVER_CALC_DEPENDENCY DMV.

SELECT * FROM $System.DISCOVER_CALC_DEPENDENCY;

There are differences in the output for 1200 and 1400 models. The easiest way to understand them is to compare the output for models with different compatibility levels. Notable differences are listed here for reference.

  • Relationships in 1200 and higher are identified by name (normally a GUID) in the OBJECT column. Active relationships have OBJECT_TYPE of “ACTIVE_RELATIONSHIP”; inactive relationships have OBJECT_TYPE of “RELATIONSHIP”. 1103 and lower models differ because they include all relationships with OBJECT_TYPE of “RELATIONSHIP” and an additional “ACTIVE_RELATIONSHIP” row to flag each active relationship.
  • 1103 and lower models include a row with OBJECT_TYPE “HIERARCHY” for each attribute hierarchy dependency on its column. 1200 and higher do not.
  • 1200 and higher models include rows for calculated tables with OBJECT_TYPE “CALC_TABLE”. Calculated tables are not supported in 1103 or lower models.
  • 1200 and higher models currently do not include rows for measure data dependencies on tables and columns. Data dependencies between DAX measures are included.
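DMV queries also accept simple equality restrictions, so you can narrow the output, for example to measure dependencies only (assuming OBJECT_TYPE filtering behaves here as it does in other Analysis Services DMVs):

```sql
SELECT * FROM $System.DISCOVER_CALC_DEPENDENCY
WHERE OBJECT_TYPE = 'MEASURE';
```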

We intend to make further improvements to DISCOVER_CALC_DEPENDENCY in forthcoming CTPs, so stay tuned.

Improved authoring experience for Detail Rows

The April 2017 release (17.0 GA) of SSDT provides an improved authoring experience with IntelliSense and syntax highlighting for detail rows expressions using the new DAX Editor for SSDT. Click on the ellipsis in the Detail Rows Expression property to activate the DAX editor.

Hierarchy & column reuse

Hierarchy reuse is a Power BI feature, although it is surfaced differently in Analysis Services. Power BI uses it to provide easy access to implicit date hierarchies for date fields. Introducing such features for Analysis Services furthers the strategic objective of enabling a consistent modeling experience with Power BI.

Tabular models created with CTP 2.0 can leverage hierarchy reuse to surface user hierarchies and columns – not limited to those from a date dimension table – in more helpful locations in the Power BI field list. This can provide a more guided analytics experience for business users.

For example, the Calendar hierarchy from the Date table can be surfaced as a field in Internet Sales, and the Fiscal hierarchy as a field in the Sales Quota table. This assumes that, for some business reason, sales quotas are frequently reported by fiscal date.

The current version requires that hierarchy and column reuse is set using the JSON-based metadata, Tabular Model Scripting Language (TMSL), or Tabular Object Model (TOM). The following snippet of JSON-based metadata from the Model.bim file associates the Calendar hierarchy from the Date table with the Order Date column from the Internet Sales table. As shown by the type name, the feature is also known as variations.

{
  "name": "Order Date",
  "dataType": "dateTime",
  "sourceColumn": "OrderDate",
  "variations": [
    {
      "name": "Calendar Reuse",
      "description": "Show Calendar hierarchy as field in Internet Sales",
      "relationship": "3db0e485-88a9-44d9-9a12-657c8ef0f881",
      "defaultHierarchy": {
          "table": "Date",
          "hierarchy": "Calendar"
      },
      "isDefault": true
    }
  ]
}

The current version also requires the ShowAsVariationsOnly property on the dimension table to be set to true, which hides the dimension table. We intend to remove this restriction in a forthcoming CTP.

{
  "name": "DimDate",
  "showAsVariationsOnly": true
}

The Order Date field in Internet Sales now defaults to the Calendar hierarchy, and allows access to the other columns and hierarchies in the Date table.

Date relationships

Continuing the theme of bringing Power BI features to Analysis Services, CTP 2.0 allows the creation of date relationships using only the date part of a DateTime value. Power BI uses this internally for relationships to hidden date tables.

Date relationships that ignore the time component currently only work for imported models, not Direct Query.

The current version requires that date relationship behavior is set using the JSON-based metadata, Tabular Model Scripting Language (TMSL), or Tabular Object Model (TOM). The following snippet of JSON-based metadata from the Model.bim file defines a relationship from Reseller Sales to Order based on the date part only of the Order Date column. Valid values for JoinOnDateBehavior are DateAndTime and DatePartOnly.

{
  "name": "100ca454-655f-4e46-a040-cfa2ca981f88",
  "fromTable": "Reseller Sales",
  "fromColumn": "Order Date",
  "toTable": "Date",
  "toColumn": "Date",
  "joinOnDateBehavior": "datePartOnly"
}

Default installation option is tabular

Tabular mode is now the default installation option for SQL Server Analysis Services in CTP 2.0.

Note: this also applies to installations from the command line. Please see this document for further information on how to set up automated installations of Analysis Services from the command line. In CTP 2.0, if the ASSERVERMODE parameter is not provided, the installation will be in tabular mode. Previously it was multidimensional.

Extended events

Extended events were not working in CTP 1.3. They do work again in CTP 2.0 (actually since CTP 1.4).

Download now!

To get started, download SQL Server 2017 on Windows CTP 2.0 from here. Be sure to keep an eye on this blog to stay up to date on Analysis Services.

Analysis Services Team Blog

May and June Webinars: Power BI Security, Best Practices from the Microsoft Operations Team, Power BI Embedded, Marco Russo on Design and more!

We’ve got a great lineup of webinars planned for May and June! Attend these free sessions to gain expertise in Power BI, and learn more about how PowerApps and Microsoft Flow can help improve your organization’s processes without a lot of overhead. Most of our webinars are delivered through YouTube, and you can subscribe to my channel to get notifications about all of these events. These webinars can also be viewed on-demand on YouTube after the event, if you’re not able to be a part of the live presentation. Please note that all webinars are listed in Pacific Time. Check out what we’ve got planned:

What’s New, Exciting, and Coming Next for Power BI Embedded by Aviv Ezrachi

Power BI Embedded is an Azure service that enables you to integrate impactful and interactive data visualizations built in Power BI Desktop into your web or mobile applications. In this webinar, Aviv Ezrachi will start with a level set as to what Power BI Embedded offers developers, walk through some announcements of recent and forthcoming releases, and finish with the team’s future priorities.

About Aviv Ezrachi

Aviv is a seasoned Product Manager (MSC) with over 12 years’ experience in various roles in telecom and enterprise software development.

Register today!
When: 5/18/2017 10:00 AM

Power BI security deep dive by Kasper de Jonge

One hot topic with Power BI is security, and in this deep dive session we will look at all aspects of security in Power BI. Kasper will discuss logging, where and how your data is stored, and how to leverage additional Azure services to make your information even more secure.

About Kasper de Jonge


Kasper de Jonge is a senior program manager on the Business Applications team at Microsoft, where he has worked on features for Power Pivot and other Analysis Services products such as the Tabular model and Multidimensional cubes. He is a frequent speaker at conferences such as TechEd, SQLPASS, and SQLSaturday, and is the creator of http://www.PowerPivotBlog.com, one of the leading Power Pivot websites.

Subscribe and join live!
When: 5/23/2017 10:00AM

Power BI visualization best practices by Marco Russo

Designing great-looking Power BI reports and dashboards isn’t just about beauty; it also makes the information easier to understand. In this webinar, Marco Russo will give us a sneak peek at his soon-to-be-released training series on this exciting topic.

Subscribe and join live!
When: 5/31/2017 10:00AM

Power BI Best Practices from the Power BI Operations Team

Learn tips and tricks for getting the most out of Power BI, direct from the Operations Team!

Subscribe and join live!

When: 6/8/2017 10:00AM

Unleash the Power of Power BI – tips and tricks by Philip Seamark

Join Philip Seamark as he walks through some of the details around the largest deployment of Power BI in the Southern Hemisphere, how it’s used, and how it meets their BI needs. This session will cover a variety of tips and tricks to help you enhance your Power BI reports. Learn how to:

  • Create self-annotating Dashboards
  • Use Power BI to optimise Power BI
  • Use the API
  • Take advantage of bi-directional relationships
  • Use Measures more creatively
  • And much more…..

Subscribe and join live!
When: 6/22/2017 10:00AM

Webinars for related products:

Introducing Flow Approvals: better enable People-based automation

Subscribe and join live!
When: 4/20/2017 10:00 AM

Rebuilding an InfoPath Designer form in PowerApps

Subscribe and join live!

When: 4/25/2017 10:00 AM

Microsoft Flow best practices and examples for Business Analysts by Jon Levesque

Subscribe and join live!
When: 5/2/2017 10:00AM

Connecting to On-Premises data from PowerApps by Archana Nair And Dimah Zaidalkilani

Subscribe and join live!
When: 5/16/2017 10:00AM

Deploying your PowerApps applications by James Oleinik

Subscribe and join live!
When: 5/25/2017 10:00AM

Look behind the curtain with one of the PowerApp Developers: Marie Hoeger

Subscribe and join live!

When: 6/1/2017 10:00AM

Deep dive on PowerApps formulas by Greg Lindhorst

Subscribe and join live!
When: 6/20/2017 10:00AM


Microsoft Power BI Blog | Microsoft Power BI

Rediscovering (fire)matting: a truly amazing BI “tool”


Hello P3 Nation (can this be a thing?)…I’m excited to show you a piece of software that I’m confident nearly 100% of you will utilize! I’m here to talk to you about an amazing piece of software provided by the industry professionals over at SQLBI. Among the many great services, trainings, software, and utilities they provide is a powerful tool called DAX Formatter. Honestly, the first time I came across this online I felt like a caveman discovering a new tool, and now I…

Use It. Every. Day.

Like many of my fellow PowerPivot, DAX, and M language enthusiasts, I do my best to keep my workbooks and my code neat and orderly, and I even use inline comments if I’m feeling dedicated to my craft. Sounds great on paper, doesn’t it? However, many of us don’t apply as much effort as we’d like to give ourselves credit for, and forgo the formatting or notes.

Forgoing code formatting has definitely come back to bite me in the, err…posterior, months later when I’ve come back to review a formula, only to find a giant blob of code with no discernible breakouts.

If any of you have ever had code that looks like the monstrosity to my right (don’t judge me…), then prepare to hold on to your socks for a nearly automated solution (I promise I’m not being dramatic).

Positively painful, nearly impossible to read, an abomination, not even something a mother could love, should be tossed out with the bathwater…ok, ok, enough with the clichéd analogies, I promise.

The wonderful software wizards over at SQLBI know how to turn the ugly into the beautiful. It’s honestly as easy as 1, 2, 3.

Step 1: Go to the DAXFormatter website:


Step 2: Grab a chunk of code (that you’re honestly considering giving up for adoption at this point) and paste it into the editor window:


Step 3: Press the MAGICAL format button and watch the beast turn into the beauty:


You’re then given a list of options to export this newly transformed formula anywhere you need it. It’s really that simple! If I were to attempt to format my formula to this standard manually, it would easily have taken me MUCH LONGER, and even then I doubt it would have looked this clean. This tool has saved me innumerable hours each month, and I find it useful for at least 80% of my DAX formulas.
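To make the before/after concrete, here is a small hypothetical measure (not one from the screenshots) in both states; DAX Formatter applies the SQLBI conventions of one argument per line and spaces inside parentheses:

```dax
-- Before: one dense, unreadable line
Total Sales YTD:=CALCULATE(SUM(Sales[Amount]),DATESYTD('Date'[Date]),ALL(Sales[Region]))

-- After running it through DAX Formatter
Total Sales YTD :=
CALCULATE (
    SUM ( Sales[Amount] ),
    DATESYTD ( 'Date'[Date] ),
    ALL ( Sales[Region] )
)
```

Even on a toy example like this, the filter arguments and the nested function calls become obvious at a glance.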

Not only do we have this fabulous website; the engineering wizards at SQLBI have also gifted us with this tool (and so MANY others) built right into an Excel add-in. Let’s all take a moment to let the mic finish dropping… This wonderful tool set is called Power Pivot Utilities, and I cannot emphasize enough how useful its features are. For the purposes of this post I’m only mentioning the DAX Formatter feature, but don’t worry, I’ll cover the rest of this tool’s best features in a future post. Oh, and did I mention that this add-in is…


Don’t you love it when you can have your cake and eat it too? Once you’ve installed the add-in for Excel, you’ll see a new ribbon called PP Utilities. Within that tab you’ll find a checkbox for “Auto DAX formatting”. This will auto-format all your DAX measures and calculated columns whenever you refresh the workbook…it can’t get any more convenient than that! With that, ladies and gentlemen, until next time. Hopefully this tool will be as useful to you as it has been to me.


PowerPivotPro

Connecting to datasets in the Power BI service from Desktop

In the April update of Power BI Desktop, we released the ability to connect to datasets in the Power BI service. This feature allows you to create new reports off existing datasets you’ve already published to the cloud. You can leverage this feature to better collaborate and reduce duplicate efforts across your team.

Model once, create many reports from that model.

Many Power BI users tell us that data cleaning and data prep tasks take up 70% of their time when creating a new report. Now, you can do the data modeling once and then create new reports on top of that same model, while still remaining in Power BI Desktop.

Easier data refresh management.

Every time you publish a new dataset to the service, you need to make sure data refresh is scheduled and the credentials are set up. If there are issues with refresh such as a password change, you’ll need to go and update that on all your datasets using that source. In addition, the more datasets you have with scheduled refresh on that source, the more load that source will incur when refreshes take place. If your reports are all leveraging the same dataset, this only needs to be managed once.

Divide up work between your team.

Instead of everyone in your team needing to do data modeling and report authoring, you can divide up the tasks. One person can be responsible for doing the data prep and managing the dataset. Others on the team can focus on report authoring and visualizations.

Leverage content packs as a starting point for your reports.

Not only can you connect to datasets you or your teammates have created, you can also connect to datasets created through content packs. For example, you can connect to a service content pack that you’ve set up such as Dynamics or Quickbooks and leverage that dataset in Power BI Desktop. In addition, you can make a copy of any Organization Apps you have installed to create new reports from those datasets in Desktop.

Getting Started

To get started with this feature, you’ll first need to enable the preview option in Power BI Desktop. Navigate to File > Options and settings > Options > Preview features and enable Power BI Service Live Connection. You’ll need to restart Desktop in order for the option to take effect.


When you create a new Power BI report, you’ll see the Power BI service option under Online Services.


From there, you’ll get a list of your workspaces. Expand each workspace to see the list of datasets you can use to create reports. You may not see every dataset that is available to you in the Power BI service; the datasets you’ll see listed are:

  • Datasets you have edit access to. These include datasets you’ve created or datasets you have edit access to in a shared workspace. Datasets created from service content packs should show up in this list as well.
  • Datasets coming from data sources via import, DirectQuery or Push. We do not yet support SSAS data sources via this method, but it is on our roadmap to support this as well.
  • Datasets from Organization Apps once you’ve made a copy. In order to get edit access to a dataset in an org app, you first need to make a copy.


Once you’ve connected to a dataset, you’ll get a read-only view of the model. The data view, relationship view and all modeling actions will be disabled. Once you’ve created the report, you can publish it up to the Power BI service to pin to a dashboard and share with others.

For more details about how to use the feature, step by step instructions and any current limitations, check out the documentation.

Tips & Tricks

When you download a report that is live connected to a dataset on the service, it will remain live connected when you open it in Desktop. This means you have read-only access to the model.

What if you need to make changes to the original dataset? In your list of datasets for any workspace, you’ll see the Download .pbix option in the context menu under the “…”.


What if you don’t remember which dataset you were using for this report? Using the new Related Content Pane in the Power BI service, you can find the dataset and quickly download the PBIX that contains the model.


Please give the public preview a try and share your feedback. As always, if you have ideas for new capabilities you’d like to see, post them on our User Voice forum.


Microsoft Power BI Blog | Microsoft Power BI

Meet the Azure Analysis Services team at upcoming user group meetings

Come meet the Analysis Services team in person as they answer your questions on Analysis Services in Azure. Learn about the new service and features available now.

The success of any modern data-driven organization requires that information be available at the fingertips of every business user, not just IT professionals and data scientists, to guide their day-to-day decisions. Self-service BI tools have made huge strides in making data accessible to business users. However, most business users don’t have the expertise or desire to do the heavy lifting that is typically required before they can explore the data to derive insights: finding the right sources of data, importing the raw data, transforming it into the right shape, and adding business logic and metrics. With Azure Analysis Services, a BI professional can create a semantic model over the raw data and share it with business users, so that all they need to do is connect to the model from any BI tool and immediately explore the data and gain insights. Azure Analysis Services uses a highly optimized in-memory engine to provide responses to user queries at the speed of thought.
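As a sketch of what “connect to the model from any BI tool” looks like in practice, clients address an Azure Analysis Services server by its server URI; the region and server name below are placeholders, not a real server:

```
asazure://westus.asazure.windows.net/contosoasserver
```

In a client such as Power BI Desktop or Excel, a business user pastes that server name into the Azure Analysis Services connection dialog, signs in with their organizational account, and picks the model the BI professional published; no data import or transformation step is needed on their side.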

SQL Saturday Silicon Valley – April 22nd
Microsoft Technology Center, 1065 La Avenida, Mountain View, CA
Group Site
Register Now

Boston Power BI User Group – April 25th 6:30pm – 8:30pm
MS Office 5 Wayside Road, Burlington, MA
Group Site
Register Now

New York Power BI User Group – April 27th 6pm-8:30pm
MS Office Times Square, NY
Group Site
Register Now

Philadelphia Power BI User Group – May 1st 3pm-6pm
MS Office Malvern, PA
Group Site
Register Now

Philadelphia SQL User Group – May 2nd
Group Site
Registration: Coming Soon!

Portland Power BI User Group Meeting – late May
CSG Pro Office 734 NW 14th Ave Portland OR 97209
Group Site
Registration: Coming Soon!

New to Azure Analysis Services? Find out how you can try Azure Analysis Services or learn how to create your first data model.


Analysis Services Team Blog