Tag Archives: model

The Telco Cloud: A New Model for a Changed Market—Part 2


As first appeared on Vanilla Plus.

In the previous part of this blog we saw how network operators need fresh approaches in order to generate new revenue and hold up their profitability in a challenging and fast-evolving market. In this part, Joerg Koenig, TIBCO’s director of Vertical Solutions, examines the challenges and practicalities that lie behind the new model.

The issues that lie behind the model

So far, so good. But what are the challenges of this new model? How does it work in practical terms? On the path to digital transformation, where are the stumbling blocks?

To begin with, the telco must realize that this change in business model will require billing flexibility, moving away from minutes, bytes, and message volumes and towards value-based pricing. In other words, the telco can base pricing on the value of a transaction or the service being consumed, as opposed to the number of bytes consumed. This does create challenges around how to value a transaction or service, so the telco needs the right tools to manage and monitor what is really going on with its traffic.

Transforming a telco business to take advantage of this new operating model will be all about integration and automation. The telco operates a broad web of applications from its core to edge and will need to support much higher levels of unpredictable scaling that will be inevitable with an on-demand model. Some of these applications will have to be migrated, replaced by SaaS applications, or deployed into different private or public clouds and platforms.

Some will have to be kept on premise for data legality or latency reasons, or perhaps re-written to support a microservice architecture. Furthermore, the rise of IoT will mean integration of new devices that will require data aggregation and filtering. This wide range of architecture, deployment, scaling, and performance choices will change over time and will require a matching integration capability that will support future innovation.

Essential to the telco cloud model will be the success with which APIs are made available by the telco to third parties. Providing easily consumable, well-packaged API products will attract partners and so drive revenues and business growth. In a platform model, APIs will be the ‘face’ of the telco and the ease with which they can be deployed, tried and consumed will provide crucial differentiation. The speed with which they can be created, tailored, managed and secured to protect backend systems will be critical.

Unpredictability is the new normal. Once APIs are published, the take-up from customers and partners can lead both to great success and to unforeseen levels of usage. There will inevitably be a feeling that the telco is losing control of the API. This is a very different model from the classical subscriber growth approach, and will need a very different mindset.

All of this on-demand network configuration from customers and partners, added to the effect of billions of connected IoT devices, brings the need to monitor and manage decisions in real time at a scale that no amount of human intervention can cope with. This means that all aspects of OSS/BSS orchestration, configuration and operation will trend towards automation. This necessitates deep and wide advanced analytics and the use of lightweight, real-time algorithms to detect and trigger action, deployed from core systems right out to devices on the edge of the network.

Finally, as the telco network becomes more programmable and network functions become virtualized, the promise of a reduced cost of ownership will be realized. Added to this are the value benefits of the platform approach once an ecosystem of partners can, via a self-service portal and APIs, configure new services on demand. Maximizing the value that the telco can derive from this requires that the entire OSS/BSS stack be in alignment.

Everything from onboarding customers and partners and executing on-demand order and service orchestration, to network configuration, tailored pricing, and end-to-end assurance, across physical, virtual, and third-party networks and services, will need to support the telco cloud platform. The wide variety of possible users and services means that such processes must be automatically generated, not simply scripted or coded.

Time to lead, not follow

We have seen that it is no longer sustainable to base a telco business around old ways of delivering and charging for services. The old-school linear pipeline where customers buy the capacity they need from operators via a conventional supply chain is on the way out. It no longer meets requirements. This creates an opportunity for telcos to reinvent the basis on which capacity is supplied. The successful telco will be a leader and innovator in this dynamic ecosystem, not a follower. They will need to show this leadership in numerous ways, for example by embracing a software-driven automated approach to network provisioning and management, and by enabling partners to work with them on taking their business in new directions. There is a need for a bold and decisive approach, and the time for that is now.

Learn more about what TIBCO is doing in the telecom industry.


The TIBCO Blog

The Telco Cloud: A New Model for a Changed Market—Part 1


As first appeared on Vanilla Plus.

Network operators need fresh approaches in order to generate new revenue and hold up their profitability in a challenging and fast-evolving market. By basing their business around a cloud-based platform model, they can ensure their survival in a world of unpredictable traffic patterns and massive data growth. Joerg Koenig, director of Vertical Solutions at TIBCO, examines the dividends, challenges, and practicalities of such a move.

Time to take back control

The historic way in which telco business has been conducted must change. The model of locking the customer into your network and billing them for it in ways that do not necessarily match their consumption is redundant.

Today’s consumers see connectivity as just another commodity, a utility they depend on, but which they largely take for granted. What excites people is not the connection itself, but the data and content that travels over it. This is why Over-The-Top (OTT) service providers have been able to achieve so much success—success which has largely come at the expense of the network operator. Not only have OTTs eroded telco revenues, they have damaged the perceived relevance and centrality of the telco by relegating them to a mere pipe for their services, an undistinguished highway between locations.

There are other issues with traditional telco business practices. Network availability has traditionally been dimensioned to support maximum peaks of traffic. This guarantees that a network will be both expensive to build and massively underemployed for most of the time.

The arrival of the software-driven, virtualized network has helped to address the problem of network build costs. Now it’s time for operators to complete the job of grabbing back control by cutting the cost of driving revenue. They can do this by adopting the model of the Telco Cloud—telecoms delivered as a service over a cloud-based platform.

The Telco Cloud

The old ‘consumption model’—where telco customers are charged on the basis of a predetermined level of usage—does not fit well with current needs. Consumption patterns, driven by the effect of social media and other forms of rich content, are far less predictable than they were when this model was conceived.

There is an opportunity for bold and far-sighted telcos to reinvent the basis on which capacity is supplied. They must position themselves so that customers see them as suppliers of ‘telco as a service’ instead. This platform-based model is perfectly in tune with what their customers, both on the consumer and enterprise side, really want. A platform-based model enables on-demand scaling and agile, rapid delivery of new services, and is geared to meet the most unpredictable of peaks and troughs.

The model works best when it leaves the creation of these services in the hands of an ecosystem of newly empowered customers and partners, reliant on the telco’s core strength of providing ubiquitous mobile and wireless data connectivity. This has positive implications for revenue, as effectively the telco is providing an army of vertical market specialists with an advanced platform for their products, and letting them do the selling. This means the old one-size-fits-all linear value chain is broken down into what is effectively a network of smaller value chains. No longer confined to only being able to grow their business by growing the number of subscribers—and without having to resort to price cutting as their best means of competing with rivals—the telco is free to participate in an infinite number of revenue-spinning vertical use cases and new initiatives.

Learn more about what TIBCO is doing in the telecom industry.


The TIBCO Blog

Checklist for Finalizing a Data Model in Power BI Desktop

My colleague and friend Meagan Longoria has been giving me a few “gentle nudges” to update an old post about this topic.


This post is a checklist of suggested items to be completed in your Power BI data model, within Power BI Desktop, before you consider it “done done.” This is not a how-to post; it’s meant to truly be a checklist with just a bit of context in each ‘why’ column. (Though I concede I got more than a little wordy at 13 pages.)

The checklist focuses on the data model itself, rather than report design. For sound practices and suggestions about report design, check out Meagan’s series:  Design Concepts for Better Power BI Reports

A downloadable version of the checklist:  Checklist

Checklist last updated: 12/23/2017


  • Data model: the data imported to Power BI; also known as the dataset
  • PBIX: the Power BI Desktop file
  • Field: columns and derived measures which are displayed in the field list

Why Is the Power BI Data Model So Important?

There are many things you can do on the report side – change field names, adjust formats, and lots more. However, adjustments to things like names and formats within the report should be the exception rather than the norm. If all fine-tuning is done within the data model, it’s done *once* and every downstream report takes advantage of it. You will save yourself much time and effort, and improve consistency from report to report tremendously. Other users who consume your data model through reports and/or Q&A will appreciate the extra effort you’ve made to improve the user experience.

PBIX File:



Use a Consistent, Unchanging PBIX File Name
with Business Relevance

The PBIX name you assign is what the user will
see after it’s been published. The name for the PBIX file should describe the
set of reports, and it should be a consistent name which does not change over
time. While a version number (like “SalesAnalysis-V1-December”) is fine for keeping
interim copies of the PBIX straight, the file name needs to remain consistent once it
is published. This is to ensure favorites, links, and apps all continue to
work whenever changes are released.

Consider Reusing an Existing Dataset

Before starting a new PBIX, give some
consideration to whether the data already exists in another Power BI data
model. If new reports are the goal, yet the data already exists, a “hub and
spoke” approach may work in which an existing published data model is reused.
This can be done by creating new reports in the Power BI Service that refer
to an existing dataset, by using a Power BI Service Live Connection, or via
Analyze in Excel functionality. A hub and spoke approach (a) decreases the
amount of redundant data stored in various PBIX files, (b) reduces the number
of data refresh operations, (c) reduces future maintenance needs, (d) improves
consistency because there are fewer files with data models, and (e) facilitates
the separation of dataset development from report development. More details:
Reusing Datasets Imported to the Power BI Service.

Use the Correct Release of Power BI Desktop Based
on Your Target Deployment Destination

There are now two ‘flavors’ of Power BI Desktop. The standard one is updated monthly, and it correlates with functionality releases in the Power BI Service. The newer one is Power BI Desktop Optimized for Power BI Report Server – that one is updated every 3-4 months. If you develop Power BI solutions which are deployed to both the Power BI Service and to Power BI Report Server, you will need to carefully manage which release you are using for your data model development (they can run side by side on the same machine). Keep in mind that if collaboration will occur on a Power BI Desktop file, all users need to have the same version of Power BI Desktop installed on their machine (this is easier if you install the standard version of Power BI Desktop via the Windows Store so that it will update itself in the background).

Manage the Dataset Size

File size limits impact the amount of data
which can be imported into the data model. At the time of this writing, the
file size limits (after compression is applied) are as follows:


  • Power BI Desktop – based on individual PC system capability
  • Power BI Service – 1GB, or 10GB if the PBIX is deployed to a Premium workspace
  • Power BI Report Server – 2GB if the PBIX is deployed to PBIRS

Utilize Version History for PBIX File

Where to store the original PBIX file is an
important decision. One common location is OneDrive for Business, which has
built-in versioning available, so you can go back to a prior version if something
goes wrong. A source control repository is also ideal. The original file
should be in a team site, not an individual user’s laptop.

Consider Using a Power BI Template

A Power BI template (PBIT) can contain
anything from a PBIX, except the data itself. (Note this is different from solution
templates available on AppSource.) Templates can contain common queries and
data tables (such as a Date table in every model). The original template file
should reside in the same original location with version history.

Consider Using a Power BI Theme

A Power BI report theme allows for a consistent use of a color palette in a report.
Though theming affects reports and not the data model (the data model is the
focus of this checklist), it is possible a data modeler will handle this task
since custom report themes are based on JSON. The original JSON file should
reside in the same original location with version history.

PBIX File Properties:




Decide if Using Auto Time Intelligence

File > Options and Settings > Options
> Current File: Auto Date Time

Auto time intelligence is enabled by
default, and it applies to each individual PBIX file (there’s not a global
option). For each date or datetime column that exists in the dataset, a
hidden date table is created in the model to support time-oriented DAX
calculations. This is great functionality for newer users, or if you have a
very simple data model. However, if you typically utilize a standard Date table,
then you will want to disable the hidden date tables to reduce the file size.
(Tip: You can view the hidden date tables if you connect to the PBIX via DAX Studio.)

Control Options for Relationships

File > Options and Settings > Options
> Current File: Relationships

Once the data model and relationships are
complete, you may wish to ensure the relationships do not change without your
knowledge. To do this, disable two items in each PBIX file (there’s not a
global option): “Update relationships when refreshing queries” and
“Autodetect new relationships after data is loaded.”

Use Preview Features Carefully

File > Options and Settings > Options
> Global: Preview Features

Public preview features should be used cautiously since they are not typically supported in the
Power BI Service or Power BI Report Server. It’s easy to forget you used a
preview feature which won’t work properly after the report has been published.

Query Editor:




Decide the Data Connectivity Mode

Home > Edit Queries

A very important decision is whether to
utilize import mode or DirectQuery mode. Guidance about this decision can be
found in the
Planning a Power BI Enterprise Deployment whitepaper.

Import the Minimum # of Columns

Home > Edit Queries

Importing only the columns that are needed
simplifies the model and reduces the size. Model size is very important since
the Power BI model resides in-memory (when in import mode). Since hidden
columns still consume size and memory, the rule of thumb is: if a column doesn’t
contribute to a report, calculation, or a relationship, then it shouldn’t be
imported to the model.

Parameterize Data Source Connections or
Other Changing Data

Home > Edit Queries > Manage Parameters

Parameters can be useful to minimize
hard-coding and reduce maintenance when changes need to occur. For data
source connections, by default the connection is stored in the first step of
every query. For dev>test>prod scenarios, changing the connection only
once in a parameter is far less tedious than for every query. It also reduces
the chance of error or omission.

Set Data Source Privacy Level

File > Options and Settings > Data
Source Settings > Edit Permissions > Privacy Level

When importing data from various types of
sources, the types of optimizations available vary per data source (such as
query folding which passes filters down to an underlying data source). If
data is sensitive and you do not want it to be passed in-memory to another
server for consolidation or aggregations, and/or you want to ensure a DBA
monitoring the server cannot see values, you can control isolation by setting
privacy level. This may result in a slower consolidation or comparison of
data, but is more secure for sensitive data.

Set Encryption for Data Source Connections

File > Options and Settings > Data
Source Settings > Edit Permissions > Encryption

Connections should be encrypted whenever
possible as a standard security measure. This does need to be supported by the
data source which is being queried.

Use Consistent File Locations

Home > Edit Queries > Data Source Settings

When accessing sources such as flat files (ex:
CSV or Excel), make sure the files are accessible from a centralized location
which is not based on an individual user path. This will ensure a data
refresh operation won’t fail if another person executes it.

Retrieve Data from a View

Home > Edit Queries

When possible, when retrieving data from a
relational database, select data from a database view rather than a table. Different
sets of views allow for different calculations and customization for
different purposes: DW views can feed a semantic layer for corporate BI delivery
(ex: Analysis Services). Alternatively, USR or RPT (user / report) views can be
customized for power users who are allowed to access relational data (ex: from
Power BI Desktop).

Tables and Relationships:




Use Friendly Business Name for Tables

Data pane > Fields

A table name like “Students” is nicer to
look at within a field list than something like “TblStdt.” The name should
represent the data contained within the table. Other contextual information
about the type of table is very helpful to include in the name, such as:
Financial Monthly Snapshot, Financial Transactions, or Financial Current
Balances. Though often debated, either singular or plural table names are fine
– just be consistent.

Validate Relationships are Accurate

Relationships pane

Depending on the source of data, relationships may or may not be auto-generated. One of the most critical
tasks is to verify all required relationships are in place and accurate with
respect to the 1-to-many direction of each.

Verify Relationships Do Not Change Direction

Relationships pane

The direction of a relationship affects how
filters propagate through a model. If the direction of a relationship changes,
a ‘dead-end’ is created, which means reports do not display correct data. This
is more likely to occur for more normalized data models. Verifying this is a
critical piece of model validation.

Validate and Use Bi-Directional
Relationships Purposefully

Relationships pane

Bidirectional relationships should be used
minimally, and with a specific purpose. They should be used as the exception
– to accomplish something specific – rather than the default, particularly if
you have any complexity to your data model. Sometimes CrossFilter() can be a
substitute for a bidirectional relationship. Verifying that any bidirectional
relationships are set up correctly is critical to model validation.
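
For illustration, here is a minimal sketch (hypothetical Sales and Customer tables and columns) of using CrossFilter() inside a single measure rather than making the relationship bidirectional in the model:

    Customers With Sales =
    CALCULATE (
        DISTINCTCOUNT ( Customer[CustomerKey] ),                         // count customers in the current filter context
        CROSSFILTER ( Sales[CustomerKey], Customer[CustomerKey], BOTH )  // let Sales filter Customer for this measure only
    )

The rest of the model keeps its single-direction relationship, so other reports are unaffected.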

Validate and Use Inactive Relationships

Relationships pane

Inactive relationships in the data model
should exist only when they have purposefully been set up that way, in which
case the relationship is invoked via a DAX calculation. Verifying that any inactive
relationships are set up correctly is critical to model validation.
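
For example, a sketch (hypothetical Sales and Date tables) of a measure that purposefully invokes an inactive Ship Date relationship via UseRelationship():

    Sales Amount by Ship Date =
    CALCULATE (
        SUM ( Sales[Sales Amount] ),                        // base aggregation
        USERELATIONSHIP ( Sales[Ship Date], 'Date'[Date] )  // activate the inactive relationship for this measure only
    )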

Create Synonyms to Improve Q&A

Relationships pane > Modeling > Synonyms

If report consumers also use Q&A
functionality for natural language querying (ex: Sales by Year by Division), the
thoughtful creation of synonyms improves the user experience significantly so
that users do not always have to input field names exactly. For example,
perhaps the terms Division and Business Unit are interchangeable at your
organization. Or, perhaps there are common acronyms users might input into
Q&A that need to be translated to full field names.

Assume Referential Integrity When Possible

Home > Manage Relationships

If you are utilizing data in DirectQuery
mode from a reliable and clean data source (such as a data warehouse), you
can improve performance by assuming
referential integrity.

Fields, Columns, and Measures:




Create Unique Field Names Across the Entire Dataset

Data pane

Although the Power BI software permits columns
to exist which are named the same across tables, that is a poor practice to allow
in a data model. Let’s say that there is a Sales Rep table and a Sales
Manager table, and both have a Sales Region column. Both Sales Regions should
be renamed to be Sales Rep Region, and Sales Manager Region for clarity (or,
even better, renamed in an underlying database view where Power BI retrieves
the data). The general rule to ensure report labels are easy to understand: the
name assigned to each column needs to be self-explanatory on its own rather
than relying on the context of its table.

Expose a Field Only Once in the Entire Dataset

Data pane

Sometimes a column exists in the data model
more than once. The ideal choice is for the redundant column to not be
imported at all to the dataset. If that isn’t a viable choice for whatever
reason, then make sure the “extra” copy of the column is hidden from report
view. For instance, if a Sales Header ID exists in both the Header and the
Lines tables – make sure it’s hidden in the Lines table and only shown once
in the data model. This way, users won’t accidentally use both columns. Using
both could result in needing two separate filters, which would provide a very
poor user experience.

Hide Fields Not Utilized for Reporting

Data pane

IDs and surrogate keys are needed for
relationships, but are not useful for reporting. Hiding them simplifies the
data model because there are fewer fields shown in the field list. The
consumers of your data model will appreciate the lack of what they’d view as
clutter. (Reminder: if a column is not needed for something – a relationship,
basis for a calculation, sorting, something – then don’t import it at all.)

Use Friendly Names for Fields

Data pane

A field such as “Student Name” is nicer to
look at for reporting than something like “Stdt_nm” which may be how it’s
stored in the source. Since field names impact the naming of column titles,
filters, and slicers, assigning friendly names with business relevance is
well worth a bit of time investment. Just be consistent with respect to
spaces, casing, and abbreviations (such as Desc or Nbr). Try to only use
acronyms if you are certain all users know the meaning.

Set Formatting for All Numeric and Date Fields

Modeling > Formatting

It’s no fun to add a field onto a report
that needs to be reformatted every single time. Defining units as whole
numbers, or amounts as currency, is a helpful timesaver on the reporting end.
Don’t forget to also set the thousands separator (comma) to assist with readability.

Specify Sorting for Columns

Modeling > Sort

Creating a default sort order is a common
need for certain types of columns. For example, you may need to sort the
Month Name column by the Month Number column. Oftentimes the column serving
as the sort value can be hidden from report view.

Set Default Summarization for All Numeric Columns

Modeling > Properties > Default Summarization

The summarization default is sum, but some numeric
columns are more suitable as count, min, or max. For example, high
temperature per day would never be summed for a meaningful value; average is
likely a better choice. Setting this properly allows subtotals and totals to
be presented properly. Summarization for columns that aren’t really numeric, such
as Customer Number or Year Number, should be set to “don’t summarize.” More
details: Why the Default Summarization Property in Power BI is So Important.

Create Useful Calculated Columns

Modeling > Calculations > New Column

Creation of calculated columns (aka derived fields) is vital for enrichment of the data
model. A very simple example of this is names – perhaps the underlying data
source keeps First Name and Last Name in separate columns; you may wish to
derive a Full Name field for reporting purposes which concatenates them.
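
For illustration, a minimal sketch of that calculated column, assuming a hypothetical Customer table:

    Full Name =
    Customer[First Name] & " " & Customer[Last Name]    // concatenate the two source columns with a space between them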

Create Useful Calculated Measures

Modeling > Calculations > New Measure

Creation of calculated measures (aka explicit measures) is extremely useful to augment
reporting and analysis. Year-to-date and year-over-year are common examples
of calculated measures which rely upon current context for calculation at runtime.
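
As an illustration, a minimal year-to-date sketch, assuming a hypothetical Sales table and a Date table marked as a date table:

    Sales Amount YTD =
    TOTALYTD (
        SUM ( Sales[Sales Amount] ),    // base aggregation evaluated in the current filter context
        'Date'[Date]                    // the Date column drives the year-to-date window
    )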

Decide on Location for Calculated Measures

Modeling > Properties > Home Table

Some Power BI practitioners prefer to locate
all calculated measures in a specific table just for measures. However, this results
in calculated columns and calculated measures residing in different tables which
might seem confusing to users. If you do choose to use a “home table” for all
measures, or certain critical measures, focus on organization to help users find
them easily.

Consolidate Intermediary Calculated Columns


Frequently it is easier to understand and test
a complicated calculated column in a series of steps that build upon each
other. The tradeoff for this understandability is additional size and memory
usage. Therefore, you may consider consolidating calculated columns once testing
is complete rather than leaving the separate steps permanently in the model.
If they remain, make certain that any intermediary calculated columns are
hidden from report view.
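
A small sketch of what that consolidation might look like, using hypothetical Sales columns:

    // During testing, built as two steps:
    //   Net Amount = Sales[Quantity] * Sales[Unit Price]
    //   Margin     = Sales[Net Amount] - Sales[Total Cost]
    // Consolidated into a single calculated column once validated:
    Margin =
    ( Sales[Quantity] * Sales[Unit Price] ) - Sales[Total Cost]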

Decide on Use of Implicit Measures


An implicit measure utilizes the default
summarization setting such as “Sum of Sales Amount.” Some Power BI
practitioners prefer to use explicit calculated measures only, in which case
base numeric columns are hidden. Like many other things, consistency here is
what matters.

Set the Data Category

Modeling > Properties > Data Category

Address-related columns, such as
city/state/zip and so forth, need to be specified in the data model to ensure
geocoding can occur properly. Typically, Power BI is very good about detecting
data categories from the data, but the
data category for addresses, URLs and barcodes needs to be verified.

Other Dataset Items:




Ensure a ‘Data As Of Date’ Exists in the Dataset

Modeling > Calculations > New Column

It is common to display a ‘Data As Of’ date
on reports. This should exist in the data model, based on the data itself and
not reliant on the current datetime. The objective is for report consumers to
understand the effective date of the data being displayed. As of dates tend
to be most significant for certain types of data such as sales quotas, or during
key periods such as month-end.
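
One common pattern, sketched here with a hypothetical Sales table and Order Date column, is to derive the value from the data itself (a measure works just as well as a column for this):

    Data As Of Date =
    MAX ( Sales[Order Date] )    // latest date present in the data, not the current system date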

Create Hierarchies for Usability

Field List > New Hierarchy

Date columns (such as
Year>Quarter>Month) are a common scenario for use of a hierarchy and
should be present in most models. Geography columns (such as
Country>State>City) are great candidates as well. After a column has
been added to a hierarchy, then it’s up to you whether the individual columns
should still be visible and available for reporting (sometimes users find that
seeing both the individual columns and the hierarchy columns is confusing).

Set Up Row-Level Security

Modeling > Security > Manage Roles

Row-level security is useful to specify
which rows of data certain users are permitted to view. There are several
techniques for how to accomplish RLS. (Tip: if you find you are repeating
work in numerous PBIX files to set RLS, that means it may be time to look at
using Analysis Services instead.)
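
As a simple illustration of one technique, a role can carry a DAX filter expression that compares a column to the signed-in user (hypothetical Sales Territory table and Email column):

    // DAX filter applied to the 'Sales Territory' table within a role
    'Sales Territory'[Email] = USERPRINCIPALNAME ()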

A Few Common DAX Best Practices:

Just a few tidbits – certainly not intended to be a complete reference.




Use a Naming Convention When Referencing a Measure

Modeling > Calculations

When referencing a measure in a DAX calculation, do not include the table name. This commonly used best practice in the Power BI community helps you identify whether a measure or a column is the input to the calculation, especially if you are troubleshooting context and calculation results. If you need convincing, watch this PASS Summit video from Marco Russo.

Use a Naming Convention When Referencing a Column

Modeling > Calculations

When referencing a column in a DAX calculation, always include the table name as a prefix in the format of ‘Table’[Column]. This commonly used best practice in the Power BI community helps you identify whether a column or a measure is the input to the calculation — especially if you are troubleshooting context and calculation results. If you need convincing, watch this PASS Summit video from Marco Russo.
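
To illustrate both conventions together, a sketch with hypothetical names, where [Total Sales] is an existing measure (no table prefix) and the Sales columns carry the table prefix:

    Margin Amount =
    [Total Sales]                                              // measure reference: no table name
        - SUMX ( Sales, Sales[Quantity] * Sales[Unit Cost] )   // column references: always Table[Column]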

Apply the Most Selective Condition on Inner Functions

Modeling > Calculations

If a DAX calculation is nested, the
innermost function is evaluated first. For performance optimization purposes,
the most selective condition should typically be nested to limit data as
early as possible.

Use Error Messaging in DAX

Modeling > Calculations

The Error() function provides a description if a calculation result cannot be rendered, which can significantly reduce user frustration should something go wrong with the report display. Error messaging is particularly important when displaying data which is manually entered.
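
For illustration, a sketch using hypothetical measures, where Error() raises a friendlier message than a blank or broken visual:

    Sales vs Target =
    IF (
        ISBLANK ( [Sales Target] ),                                               // no target exists for the current selection
        ERROR ( "No sales target has been entered for the current selection." ),
        DIVIDE ( [Total Sales], [Sales Target] )
    )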

Always Use the Divide() Function

Modeling > Calculations

If you utilize the Divide() function,
instead of Column A / Column B, you don’t have to worry about nulls or coding
around divide-by-zero errors. This standard practice prevents invalid numbers
being displayed on reports (which in turn minimizes users being tempted to download
to Excel just to ‘clean up’ the report output).
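
A minimal sketch (hypothetical measures) showing the safe division and the optional alternate result:

    Average Selling Price =
    DIVIDE ( [Total Sales], [Total Quantity] )       // returns BLANK instead of an error when the denominator is 0

    Average Selling Price (Zero) =
    DIVIDE ( [Total Sales], [Total Quantity], 0 )    // optional third argument supplies an alternate result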

Use Variables as a Common Practice

Modeling > Calculations

Use of variables in DAX is becoming a best
practice, as it can make the syntax shorter and easier to read. Variables can
improve performance because they are “lazy” meaning they evaluate only once
and won’t be evaluated if not used.
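
A small sketch (hypothetical measures and a Date table) showing the typical VAR/RETURN pattern:

    Sales Growth % =
    VAR CurrentSales = [Total Sales]
    VAR PriorSales =
        CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )    // same period, previous year
    RETURN
        DIVIDE ( CurrentSales - PriorSales, PriorSales )                    // each variable is evaluated at most once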

Documentation Within a PBIX:

Last, but certainly not least!




Solution Documentation

Report > New Page

You may consider using the first or last
report page as a place to include helpful tips, definitions, or an introduction
to the solution. It can also link to additional information, identify who to
contact with questions, and/or who the owner of the solution is.

Descriptions for Fields

Field List > Properties > Description

A description can be immensely helpful for
users to understand the contents of a column or measure, when and how to use
it. When a description has been defined, it is shown in the field list as a
tooltip when the mouse hovers on the field name. (Tip: if you find you are
repeating work in lots of different PBIX files to input descriptions, that
means it may be time to look at using Analysis Services instead.)

Comments in DAX Calculations

Modeling > Calculations

Comments in DAX (using // or /* */) can be
very helpful to describe what’s happening in a calculation. This can be to help
others understand the calculation, or to leave yourself a reminder.
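
For example (hypothetical measure), both comment styles in one definition:

    Total Sales =
    // Base measure reused by the time-intelligence measures
    /* Returns BLANK when no rows qualify, which is the
       behavior the report visuals expect. */
    SUM ( Sales[Sales Amount] )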

Comments Within M Scripts

Home > Edit Queries > View >
Advanced Editor

Comments in the M script (using // or /* */)
can be very helpful to describe the data transformations which are occurring.

Description for Query

Home > Edit Queries > Properties > Description

If a query is doing something unusual, or is
serving as an intermediary query, it is very helpful to include a description
of the purpose for the query.

Naming of Query Steps

Home > Edit Queries > Query Settings
> Applied Steps

Steps in the query editor get auto-named,
which is sufficient for most needs. If you are doing a lot of data
transformations, or something complex, it can be helpful to rename the query
steps to describe what’s happening at a particular stage.


Blog – SQL Chick

Change your Power BI report to point to an external SSAS model

A common question I get is how to change the connection string from my report to SSAS after I move my Power BI Desktop file into SSAS. It turns out this is actually pretty simple, as there is an API that allows you to copy and then bind the report to ANY dataset. Let’s walk through this.

I start with a very simple Power BI desktop file that contains a single table that I imported from SQL:


I then uploaded it to Power BI and added it to its own app workspace.

Then I created a report that points to my local SSAS instance and uploaded it to the same workspace. I do this to make sure I have a dataset in Power BI that points to SSAS. If you already have such a dataset present, you can skip this step. Of course, you have to set up a data gateway when using a local SSAS instance.

So now I have 2 datasets in my workspace (I use the same workspace but you can also have them live in different ones):


SalesIt is the dataset that points to my SSAS instance and the SalesPBI is the embedded data model. I also have the 2 reports:


Now here comes the magic. I am using a PowerShell script created by my colleague Sirui that allows you to copy and bind the new report to ANY dataset in Power BI. You can find the script here. The key thing here is to make sure the 2 schemas are the same: the model needs to have the same columns, measures, tables, etc., otherwise the report will throw errors. In my example I didn’t actually use an imported model but used a SQL Server 2016 RTM model with the same schema, and that also works.

OK, now for the PowerShell script. It does two things:

1. It creates a copy of the report
2. It binds the new report to a dataset provided in the script

The script uses the Power BI Clone API to clone the report and rebind it.

First we need to configure the script. I created a new “app” on https://dev.powerbi.com/apps as described in the script to get a new client ID and set a name for my new report called “SalesNowIt”. Next I got all the IDs needed for the script to run, like the report and group IDs. The script has step by step instructions.

Now after configuring I just run the PowerShell script (no additional changes necessary), and a new report shows up:


And now when I run the report that previously pointed to my embedded model it still works:


But when running profiler I see queries going to my local SSAS instead of the embedded model:


So that’s it: pretty simple and very straightforward. Of course you can extend this PowerShell script yourself to do whatever you want, for example, loop through all the reports in a workspace and rebind them all.


Kasper On BI

Announcing the Modern Servicing Model for SQL Server

Background to SQL Server servicing

Historically, we have released Cumulative Updates (CUs) every 2 months after a major version is released, and roughly yearly Service Packs (SPs), containing fixes from all previous CUs, plus any feature completeness or supportability enhancements that may require localization. You can read more about the SQL Server Incremental Servicing Model (ISM) here.

Up to and including SQL Server 2016, RTM and any subsequent SPs establish a new product baseline. For each new baseline, CUs are provided for roughly 12 months after the next SP releases, or at the end of the mainstream phase of product lifecycle, whichever comes first.

For the entire product lifecycle, we release General Distribution Releases (GDRs) when needed, containing only security related fixes.

The Modern Servicing Model

Starting with SQL Server 2017, we are adopting a simplified, predictable mainstream servicing lifecycle:

  • SPs will no longer be made available. Only CUs, and GDRs when needed.
  • CUs will now accommodate localized content, allowing new feature completeness and supportability enhancements to be delivered faster.
  • CUs will be delivered more often at first and then less frequently: every month for the first 12 months, and every quarter for the remaining 4 years of the full 5-year mainstream lifecycle.
  • CUs will be delivered in the same week of the month: the week of the 3rd Tuesday.

Note: the Modern Servicing Model (MSM) only applies to SQL Server 2017 and future versions.

Servicing lifecycle

The servicing lifecycle is unchanged from SQL Server 2016:

  • Years 0-5 (Mainstream Support): Security and Functional issue resolution through CUs. Security issues through GDRs.
  • Years 6-10 (Extended Support): Security or critical functional issues.
  • Years 11-16 (Premium Assurance): Optional paid extension to Extended Support (no scope change).


Having questions is expected. Please read below in case we have already covered yours in this FAQ.

Q1: SPs were fully localized, and you released one update file for every supported language. How will this be handled with no SPs?
A1: CUs will be localized starting with SQL Server 2017. CUs will handle this requirement while maintaining a single update file.

Q2: When we upgraded from a previous version of SQL Server, we did so at SP1 using slipstream media provided by Microsoft. How will this work with no SPs?
A2: We will provide CU based slipstream media for CU12 allowing for this.

Q3: My company always waited for SP1 to perform an upgrade from a previous version. What are my options now?
A3: Even before GA, the final SQL Server 2016 CTP versions were considered production-ready having gone through exhaustive testing both internally and with many preview customers. So there is no need to wait for an SP to install the latest SQL Server – you can install confidently as soon as a given version goes GA.
With that, you can still target any CU for Upgrade. For example, you could target CU12 for upgrade, and have slipstream media available.

Q4: I service an instance only with GDRs. I do not apply CUs, but apply SPs. Will I need to move to a CU servicing train if I need a non-critical/security fix?
A4: Yes. While this was previously true only leading up to SPs, now you must apply the latest CU, and there will not be an opportunity to reset back to receiving GDR updates only.

Q5: Assume that after Mainstream Support, you release a security fix. Are these going to be GDRs only? If so, how can I install it, if I’m already on a CU servicing train?
A5: During Extended Support, we will release GDRs and GDR-CUs separately. The same is valid for customers that purchase the additional Premium Assurance.

Q6: Previously, once SP2 was released (for example), if I was on the RTM baseline I would have to upgrade to SP1 or SP2 to get a hotfix. How will this work now?
A6: The only baseline will be RTM, and it will receive CUs for 5 years. There are no upgrades to an SP to receive CUs, or worry about which baseline a CU applies to.

Q7: If I am on RTM baseline, and CU20 (for example) was just released, will I receive technical support?
A7: This may be handled on a case by case basis. If the issue/question is in an area that has received a significant number of updates throughout the years, you may be asked to update to a later CU, yes.

Q8: Will SQL Server on Linux receive CUs and GDRs as well?
A8: Yes, every CU and GDR will have corresponding updates to all current Linux platforms.

Q9: Will CU and GDR KB articles then cover both SQL Server on Windows and Linux?
A9: Yes. Issues addressed in each release will be categorized by impacted platform(s).

Q10: Will SQL Server for Linux CUs and GDRs be updates to an existing installation like SQL Server on Windows?
A10: No, SQL Server on Linux updates will completely replace all binaries in the existing installation.

Q11: On SQL Server on Linux, can I remove an update?
A11: Yes, however this operation is performed by re-installing any desired previous servicing level package.

Q12: Will the testing and resulting quality levels of CUs be the same as SPs?
A12: Yes. CUs for all versions of SQL Server are tested to the same levels of Service Packs. As announced in January 2016, you should plan to install a CU with the same level of confidence you plan to install SPs as they are released. You can read more about that here.

Q13: Monthly CU releases are fast, I do not believe my business can keep pace with this, yet you have been proactively encouraging customers to stay current.
A13: Yes, the cadence is fast for the first 12 months. However, payload will be roughly 50% in theory, so these should be easier to consume. Of course, you still have the option to install every other CU for example, for the first 12 months. As the name suggests, all CUs are cumulative.

Q14: Why release CUs every month only for the first year, then move to quarterly updates for the remaining 4 years?
A14: Data shows that the vast majority of all hotfixes issued for a major release occurs in the first 12 months. The monthly cadence brings these fixes to customers much faster when it has the most impact. Reducing to quarterly updates reduces customer and operational overhead over the course of the remaining 4 years.

Q15: Will the availability of CUs remain unchanged?
A15: For SQL Server on Windows CUs, no changes are planned. The most recent CU will be available on the Download Center, Windows Catalog, and WSUS. Previous CUs will be available in the Windows Catalog.

Q16: Where will I look for SQL Server on Linux CUs and GDRs?
A16: All updates, current and previous, will be maintained and available in repositories.

Q17: I see that Reporting Services (SSRS) is no longer installed by Setup. Where is it and how will it be serviced?
A17: RS is available for download via a link in Setup. Servicing will be independent moving forward.


SQL Server Release Services

New season, New software, New servicing model

Greetings. It feels like a lot of change is in the air! Last Friday marked the autumnal equinox. Depending upon which hemisphere you live in, the changes you see with the new season might differ: people living in the northern hemisphere welcome fall, while folks in the southern hemisphere welcome spring.

If you are working in the database field, you could not have missed the announcements of astronomical proportions coming from the Ignite conference this week. Specifically, for SQL Server, Scott Guthrie and Rohan Kumar announced the general availability of SQL Server 2017. You can read the complete announcement from Rohan Kumar @ Microsoft for the Modern Data Estate. Lots of customers, users, and fans of SQL Server will be excited to deploy SQL Server across different platforms as well as experience the amazing new features introduced.

While the engineering team was busy getting ready to release the product, lots of engineers from the support team participated in reviewing the product behavior, providing feedback, filing bugs, tracking changes, and getting trained on new technologies. The support team is all geared up and ready to work with customers who will start deploying the new release.

A new product release provides an opportunity to innovate in how we service the product as well. You might have seen the announcements from SQL Server Release Services about the modern servicing model for SQL Server. My friend Pedro Lopes has blogged about this in great detail along with an FAQ @ Announcing the Modern Servicing Model for SQL Server. Please take the time to read through all the details and information provided. This will help you prepare to keep your SQL Server install base healthy and up to date on patches, fixes, and improvements. When you are working with members of our support team you will hear about these changes – especially if you need a fix/change for the product.

Let us all welcome the new season, the new software and the new ways in which we will get updates! Looking forward to working with all of you on the new scenarios and possibilities the new software opens for all of us.

Suresh Kandoth / Pradeep M.M. / Arun Kumar K

[On behalf of all SQL Server support engineers, managers and escalation engineers]


CSS SQL Server Engineers

Courier Service Thrives Under Radical Business Model

When outspoken venture capitalist and Netscape co-founder Marc Andreessen wrote in The Wall Street Journal in 2011 that software is eating the world, he was only partly correct. In fact, business services based on software platforms are what’s eating the world.

Companies like Apple, which remade the mobile phone industry by offering app developers easy access to millions of iPhone owners through its iTunes App Store platform, are changing the economy. However, these world-eating companies are not just in the tech world. They are also emerging in industries that you might not expect: retailers, finance companies, transportation firms, and others outside of Silicon Valley are all at the forefront of the platform revolution.

These outsiders are taking platforms to the next level by building them around business services and data, not just apps. Companies are making business services such as logistics, 3D printing, and even roadside assistance for drivers available through a software connection that other companies can plug in to and consume or offer to their own customers.

There are two kinds of players in this business platform revolution: providers and participants. Providers create the platform and create incentives for developers to write apps for it. Developers, meanwhile, are participants; they can extend the reach of their apps by offering them through the platform’s virtual shelves.

Business platforms let companies outside of the technology world become powerful tech players, unleashing a torrent of innovation that they could never produce on their own. Good business platforms create millions in extra revenue for companies by enlisting external developers to innovate for them. It’s as if strangers are handing you entirely new revenue streams and business models on the street.

Powering this movement are application programming interfaces (APIs) and software development kits (SDKs), which enable developers to easily plug their apps into a platform without having to know much about the complex software code that drives it. Developers get more time to focus on what they do best: writing great apps. Platform providers benefit because they can offer many innovative business services to end customers without having to create them themselves.

Any company can leverage APIs and SDKs to create new business models and products that might not, in fact, be its primary method of monetization. However, these platforms give companies new opportunities and let them outflank smaller, more nimble competitors.

Indeed, the platform economy can generate unbelievable revenue streams for companies. According to Platform Revolution authors Geoffrey G. Parker, Marshall W. Van Alstyne, and Sangeet Paul Choudary, travel site Expedia makes approximately 90% of its revenue by making business services available to other travel companies through its API.

In TechCrunch in May 2016, Matt Murphy and Steve Sloane wrote that “the number of SaaS applications has exploded and there is a rising wave of software innovation in APIs that provide critical connective tissue and increasingly important functionality.” ProgrammableWeb.com, an API resource and directory, offers searchable access to more than 15,000 different APIs.

According to Accenture Technology Vision 2016, 82% of executives believe that platforms will be the “glue that brings organizations together in the digital economy.” The top 15 platforms (which include companies built entirely on this software architecture, such as eBay and Priceline.com) have a combined market capitalization of US$ 2.6 trillion.

It’s time for all companies to join the revolution. Whether working in alliance with partners or launching entirely in-house, companies need to think about platforms now, because they will have a disruptive impact on every major industry.


To the Barricades

Several factors converged to make monetizing a company’s business services easier. Many of the factors come from the rise of smartphones, specifically the rise of Bluetooth and 3G (and then 4G and LTE) connections. These connections turned smartphones into consumption hubs that weren’t feasible when high-speed mobile access was spottier.

One good example of this is PayPal’s rise. In the early 2000s, it functioned primarily as a standalone web site, but as mobile purchasing became more widespread, third-party merchants clamored to integrate PayPal’s payment processing service into their own sites and apps.

In Platform Revolution, Parker, Van Alstyne, and Choudary claim that “platforms are eating pipelines,” with pipelines being the old, direct-to-consumer business methods of the past. The first stage of this takeover involved much more efficient digital pipelines (think of Amazon in the retail space and Grubhub for food delivery) challenging their offline counterparts.

What Makes Great Business Platforms Run?


The quality of the ecosystem that powers your platform is as important as the quality of experience you offer to customers. Here’s how to do it right.

Although the platform economy depends on them, application programming interfaces (APIs) and software development kits (SDKs) aren’t magic buttons. They’re tools that organizations can leverage to attract users and developers.

To succeed, organizations must ensure that APIs include extensive documentation and are easy for developers to add into their own products. Another part of platform success is building a general digital enterprise platform that includes both APIs and SDKs.

A good platform balances ease of use, developer support, security, data architecture (that is, will it play nice with a company’s existing systems?), edge processing (whether analytics are processed locally or in the cloud), and infrastructure (whether a platform provider operates its own data centers and cloud infrastructure or uses public cloud services). The exact formula for which elements to embrace, however, will vary according to the use case, the industry, the organization, and its customers.

In all cases, the platform should offer a value proposition that’s a cut above its competitors. That means a platform should offer a compelling business service that is difficult to duplicate.

By creating open standards and easy-to-work-with tools, organizations can greatly improve the platforms they offer. APIs and SDKs may sound complicated, but they’re just tools for talented people to do their jobs with. Enable these talented people, and your platform will take off.

In the second stage, platforms replace pipelines. Platform Revolution’s authors write: “The Internet no longer acts merely as a distribution channel (a pipeline). It also acts as a creation infrastructure and a coordination mechanism. Platforms are leveraging this new capability to create entirely new business models.” Good examples of second-stage companies include Airbnb, DoubleClick, Spotify, and Uber.

Allstate Takes Advantage of Its Hidden Jewels

Many companies taking advantage of platforms were around long before APIs, or even the internet, existed. Allstate, one of the largest insurers in the United States, has traditionally focused on insurance services. But recently, the company expanded into new markets—including the platform economy.

Allstate companies Allstate Roadside Services (ARS) and Arity, a technology company founded by Allstate in late 2016, have provided their parent company with new sources of revenue, thanks to new offerings. ARS launched Good Hands Rescue APIs, which allow third parties to leverage Allstate’s roadside assistance network in their own apps. Meanwhile, Arity offers a portfolio of APIs that let third parties leverage Allstate’s aggregate data on driver behavior and intellectual property related to risk prediction for uses spanning mobility, consumer, and insurance solutions.

For example, Verizon licenses an Allstate Good Hands Rescue API for its own roadside assistance app. And automakers GM and BMW also offer roadside assistance service through Allstate.

Potential customers for Arity’s API include insurance providers, shared mobility companies, automotive parts makers, telecoms, and others.

“Arity is an acknowledgement that we have to be digital first and think about the services we provide to customers and businesses,” says Chetan Phadnis, Arity’s head of product development. “Thinking about our intellectual property system and software products is a key part of our transformation. We think it will create new ways to make money in the vertical transportation ecosystem.”

One of Allstate’s major challenges is a change in auto ownership that threatens the traditional auto insurance model. No-car and one-car households are on the rise, ridesharing services such as Uber and Lyft work on very different insurance models than passenger cars or traditional taxi companies, and autonomous vehicles could disrupt the traditional auto insurance model entirely.

This means that companies like Allstate are smart to look for revenue streams beyond traditional insurance offerings. The intangible assets that Allstate has accumulated over the years—a massive aggregate collection of driver data, an extensive set of risk models and predictive algorithms, and a network of garages and mechanics to help stranded motorists—can also serve as a new revenue stream for the future.

By offering two distinct API services for the platform economy, Allstate is also able to see what customers might want in the future. While the Good Hands Rescue APIs let third-party users integrate a specific service (such as roadside assistance) into their software tools, Arity instead lets third-party developers leverage huge data sets as a piece of other, less narrowly defined projects, such as auto maintenance. As Arity gains insights into how customers use and respond to those offerings, it gets a preview into potential future directions for its own products and services.


Farmers Harvest Cash from a Platform

Another example of innovation fueling the platform economy doesn’t come from a boldfaced tech name. Instead, it comes from a relatively small startup that has nimbly built its business model around data with an interesting twist: it turns its customers into entrepreneurs.

Farmobile is a Kansas City–based agriculture tech company whose smart device, the Passive Uplink Connection (PUC), can be plugged into tractors, combines, sprayers, and other farm equipment.

Farmobile uses the PUC to enable farmers to monetize data from their fields, which is one of the savviest routes to success with platforms—making your platform so irresistible to end consumers that they foment the revolution for you.

Once installed, says CEO Jason Tatge, the PUC streams second-by-second data to farmers’ Farmobile accounts. This gives them finely detailed reports, called Electronic Field Records (EFRs), that they can use to improve their own business, share with trusted advisors, and sell to third parties.
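
To make the idea of an Electronic Field Record more concrete, here is a minimal sketch of how second-by-second telemetry from a device like the PUC might be rolled up into a per-field summary. The field names and structure are assumptions for illustration, not Farmobile’s actual EFR format.

```python
# Illustrative only: rolling second-by-second machine telemetry into a
# simple per-field summary record. Field names are assumptions, not
# Farmobile's actual EFR schema.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TelemetrySample:
    timestamp: float       # Unix seconds
    field_id: str
    speed_mph: float
    acres_covered: float   # incremental acres since the previous sample

def summarize_field(samples: list[TelemetrySample]) -> dict:
    """Aggregate raw samples for one field into a summary record."""
    return {
        "field_id": samples[0].field_id,
        "start": min(s.timestamp for s in samples),
        "end": max(s.timestamp for s in samples),
        "avg_speed_mph": round(mean(s.speed_mph for s in samples), 1),
        "total_acres": round(sum(s.acres_covered for s in samples), 2),
    }
```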

The PUC gives farmers detailed records for tracking analytics on their crops, farms, and equipment and creates a marketplace where farmers can sell their data to third parties. Farmers benefit because they generate extra income; Farmobile benefits because it makes a commission on each purchase and builds a giant store of aggregated farming data.

This last bit is important if Farmobile is to successfully compete with traditional agricultural equipment manufacturers, which also gather data from farmers. Farmobile’s advantage (at least for now) is that the equipment makers limit their data gathering to their existing customer bases and sell it back to them in the form of services designed to improve crop yields and optimize equipment performance.

Farmobile, meanwhile, is trying to appeal to all farmers by sharing the wealth, which could help it leapfrog the giants that already have large customer bases. “The ability to bring data together easily is good for farmers, so we built API integrations to put data in one place,” says Tatge.

Farmers can resell their data on Farmobile’s Data Store to buyers such as reinsurance firm Guy Carpenter. To encourage farmers to opt in, says Tatge, “we told farmers that if they run our device over planting and harvest season, we can guarantee them $2 per acre for their EFRs.”

So far, Farmobile’s customers have sent the Data Store approximately 4,200 completed EFRs for both planting and harvest, which will serve as the backbone of the company’s data monetization efforts. Eventually, Farmobile hopes to expand the offerings on the Data Store to include records from at least 10 times as many different farm fields.


Under Armour Binges on APIs

Another model for the emerging business platform world comes from Under Armour, the sports apparel giant. Alongside its very successful clothing and shoe lines, Under Armour has put its platform at the heart of its business model.

But rather than build a platform itself, Under Armour has used its growing revenues to create an industry-leading ecosystem. Over the past decade, it has purchased companies that already offer APIs, including MapMyFitness, Endomondo, and MyFitnessPal, and then linked them all together into a massive platform that serves 30 million consumers.

This strategy has made Under Armour an indispensable part of the sprawling mobile fitness economy. According to the company’s 2016 annual results, its business platform ecosystem, known as the Connected Fitness division, generated $80 million in revenue that year—a 51% increase over 2015.

By combining existing APIs from its different apps with original tools built in-house, extensive developer support, and a robust SDK, Under Armour gives third-party developers everything they need to build their own fitness apps or websites.

Depending on their needs, third-party developers can sign up for several different payment plans with varying access to Under Armour’s APIs and SDKs. Indeed, the company’s tiered developer pricing plan for Connected Fitness, which is separated into Starter, Pro, and Premium levels, makes Under Armour seem more like a tech company than a sports apparel firm.

As a result, Under Armour’s APIs and SDKs are the underpinnings of a vast platform cooperative. Under Armour’s apps seamlessly integrate with popular services like Fitbit and Garmin (even though Under Armour has a fitness tracker of its own) and are licensed by corporations ranging from Microsoft to Coca-Cola to Purina. They’re even used by fitness app competitors like AthletePath and Lose It.

A large part of Under Armour’s success is the sheer amount of data its fitness apps collect and then make available to developers. MyFitnessPal, for instance, is an industry-leading calorie and food tracker used for weight loss, and Endomondo is an extremely popular running and biking record keeper and route-sharing platform.

One way of looking at the Connected Fitness platform is as a combination of traditional consumer purchasing data with insights gleaned from Under Armour’s suite of apps, as well as from the third-party apps that Under Armour’s products use.

Indeed, Under Armour gets a bonus from the platform economy: it helps the company understand its customers better, creating a virtuous cycle. As end users use different apps fueled by Under Armour’s services and data-sharing capabilities, Under Armour can then use that data to fuel customer engagement and attract additional third-party app developers to add new services to the ecosystem.

What Successful Platforms Have in Common

The most successful business platforms have three things in common: They’re easy to work with, they fulfill a market need, and they offer data that’s useful to customers.

For instance, Farmobile’s marketplace fulfills a valuable need in the market: it lets farmers monetize data and develop a new revenue stream that otherwise would not exist. Similarly, Allstate’s Arity experiment turns large volumes of data collected by Allstate over the years into a revenue stream that drives down costs for Arity’s clients by giving them more accurate data to integrate into their apps and software tools.

Meanwhile, Under Armour’s Connected Fitness platform and API suite encourage users to sign up for more apps in the company’s ecosystem. If you track your meals in MyFitnessPal, you’ll want to track your runs in Endomondo or MapMyRun. Similarly, if you’re an app developer in the health and fitness space, Under Armour has a readily available collection of tools that will make it easy for users to switch over to your app and cheaper for you to develop your app.

As the platform economy grows, all three of these approaches—Allstate’s leveraging of its legacy business data, Farmobile’s marketplace for users to become data entrepreneurs, and Under Armour’s one-stop fitness app ecosystem—are extremely useful examples of what happens next.

In the coming months and years, the platform economy will see other big changes. In 2016, for example, Apple, Microsoft, Facebook, and Google all released APIs for their AI-powered voice assistant platforms, the most famous of which is Apple’s Siri.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly and that a new wave of AI-based platform innovation is nigh. (In fact, Digitalist predicted last year that the emergence of an API for these AIs would open them up beyond conventional uses.) New voice-operated technologies such as Google Home and Amazon Alexa offer exciting opportunities for developers to create full-featured, immersive applications on top of existing platforms.

We will also see AI- and machine learning–based APIs emerge that will allow developers to quickly leverage unstructured data (such as social media posts or texts) for new applications and services. For instance, sentiment analysis APIs can help explore and better understand customers’ interests, emotions, and preferences in social media.
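
As a rough sketch of how such a service might be consumed, the snippet below sends a batch of social posts to a generic, hypothetical sentiment endpoint and tallies the labels it returns. The URL and response shape are assumptions rather than any particular vendor’s API.

```python
# Illustrative only: calling a generic, hypothetical sentiment-analysis
# endpoint and tallying the labels it returns. The URL and response
# format are assumptions, not any specific vendor's API.
import json
import urllib.request
from collections import Counter

def score_sentiment(posts: list[str], api_key: str) -> list[dict]:
    payload = json.dumps({"documents": posts}).encode("utf-8")
    req = urllib.request.Request(
        "https://api.example.com/v1/sentiment",   # placeholder URL
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response: [{"text": ..., "label": "positive", "score": 0.93}, ...]
        return json.load(resp)

# Example: gauge the overall mood of a batch of social posts.
# results = score_sentiment(["Love the new app!", "Checkout keeps failing."], "KEY")
# print(Counter(r["label"] for r in results))   # e.g. Counter({'positive': 1, 'negative': 1})
```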

As large providers offer APIs and associated services for smaller organizations to leverage AI and machine learning, these companies can in turn create their own platforms for clients to use unstructured data—everything from insights from uploaded photographs to recognizing a user’s emotion based on facial expression or tone of voice—in their own apps and products. Meanwhile, the ever-increasing power of cloud platforms like Amazon Web Services and Microsoft Azure will give these computing-intensive app platforms the juice they need to become deeper and richer.

These business services will depend on easy ways to exchange and implement data for success. The good news is that sharing data is becoming easier, and the API and SDK offerings that fuel the platform economy will only grow more robust. Thanks to the opportunities these new platforms generate for end users, developers, and platform businesses themselves, everyone stands to win—if they act soon. D!

About the Authors

Bernd Leukert is a member of the Executive Board, Products and Innovation, for SAP.

Björn Goerke is Chief Technology Officer and President, SAP Cloud Platform, for SAP.

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Sethu M is President, Mobile Services, for SAP.

Neal Ungerleider is a Los Angeles-based technology journalist and consultant.

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

Online Analysis Services Course: Developing a Multidimensional Model

Check out the excellent, new online course by Peter Myers and Chris Randall for Microsoft Learning Experiences (LeX). Learn how to develop multidimensional data models with SQL Server 2016 Analysis Services. The complete course is available on edX at no cost to audit, or you can highlight your new knowledge and skills with a Verified Certificate for a small charge. Enrollment is available at edX.


Analysis Services Team Blog

Model Comparison and Merging for Analysis Services

Relational-database schema comparison and merging is a well-established market. Leading products include SSDT Schema Compare and Redgate SQL Compare, which is partially integrated into Visual Studio. These tools are used by organizations seeking to adopt a DevOps culture to automate build-and-deployment processes and increase the reliability and repeatability of mission critical systems.

Comparison and merging of BI models also introduces opportunities to bridge the gap between self-service and IT-owned “corporate BI”. This helps organizations seeking to adopt a “bi-modal BI” strategy to mitigate the risk of competing IT-owned and business-owned models offering redundant solutions with conflicting definitions.

Such functionality is available for Analysis Services tabular models. Please see the Model Comparison and Merging for Analysis Services whitepaper for detailed usage scenarios, instructions and workflows.

This is made possible with BISM Normalizer, which we are pleased to announce now resides on the Analysis Services Git repo. BISM Normalizer is a popular open-source tool that works with Azure Analysis Services and SQL Server Analysis Services. All tabular model objects and compatibility levels, including the new 1400 compatibility level, are supported. As a Visual Studio extension, it is tightly integrated with source control systems, build and deployment processes, and model management workflows.


Thanks to Javier Guillen (Blue Granite), Chris Webb (Crossjoin Consulting), Marco Russo (SQLBI), Chris Woolderink (Tabular) and Bill Anton (Opifex Solutions) for their contributions to the whitepaper.


Analysis Services Team Blog

Will Blockchain Technology Finally Lead Us To A Fair Payment Model For Content?


During my journalism internship, I could already see there was going to be a problem with my chosen vocation. The small New England television station where I was interning was struck by a wave of media consolidation and competition from new cable networks. Writers were happy just to be called in as weekend temps. Geez, even my mentor was telling me to go to law school.

That was 1992, and traditional journalism was in deep crisis. Then along came the Internet, that happy digital highway of instant-publishing gratification. This had the effect of turning nearly everyone into a writer and publisher, for better or for worse. But the question ingrained upon me during my internship remained: How does anyone seriously get paid for this?

Different models – subscriptions, sponsorships, embedded ads, SEO, and pay-per-download – have been tried for promoting equitable distribution of digital content, but the question of secure, reliable payment still hovers over the media industry – as it does for most anyone who produces creative content, like books, movies, music, or pictures.

The traditional route for an author has been to present their work to a publisher and hope that their particular flavor of creative genius will be chosen for readers’ mass consumption. Few make the cut, and those who do face exorbitant publishing fees that detract from hard-won earnings, sometimes leaving the author with a slim royalty of 25% for their e-book (source: Business of Publishing).

One of my favorite authors, Elizabeth Gilbert, wrote in her book Big Magic: Creative Living Beyond Fear about the many rejection letters – some helpful, some seemingly capricious – that she received from publishers. Undeterred, she chalked it up as part of the creative process, preferring perseverance to despair. She also kept her day job until her third book, the New York Times bestseller Eat, Pray, Love, won critical acclaim. Oprah promoted it in her book club, and Julia Roberts played the lead role in the movie version.

But what if there’s another path to publishing that’s open, free from media manipulation, and inherently just in matters of financial compensation? The answer might be found in blockchain technology. Underpinning the new “Internet of trust,” blockchains are open, distributed ledgers for recording transactions so they are verifiable and permanent. Inherently secure and resistant to tampering, this technology is ideal for recording events, records management, identity management, and transaction processing. But blockchain technology isn’t just for the likes of banks and insurance companies. It can also help creatives showcase their work, cultivate an audience – and get paid securely and directly.
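
The core idea can be illustrated with a toy example: records are chained together by hashes, so altering any past entry invalidates everything that follows. The sketch below is a teaching aid under those simplifying assumptions, not DECENT’s implementation.

```python
# Illustrative only: a toy hash-chained ledger showing why recorded
# entries are verifiable and tamper-evident. Not DECENT's implementation.
import hashlib
import json

def block_hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_block(chain: list[dict], payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "payload": payload}
    block["hash"] = block_hash(block)          # hash covers index, prev_hash, payload
    chain.append(block)

def verify(chain: list[dict]) -> bool:
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False                        # entry was altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                        # link to the previous entry is broken
    return True

chain: list[dict] = []
append_block(chain, {"event": "content_published", "author": "alice"})
append_block(chain, {"event": "purchase", "buyer": "bob", "price": 3})
assert verify(chain)
chain[0]["payload"]["author"] = "mallory"       # tampering with history...
assert not verify(chain)                        # ...is immediately detectable
```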

DECENT, a Swiss tech startup, is applying blockchain technology to build a secured and trusted content distribution platform for authors, artists, and creatives of all calibers. Already in its second round of testing, with a launch date set for June 30, DECENT is attracting interest from technology enthusiasts and authors alike. Content posted on the platform includes e-books, blogs, videos, music, photos, and white papers, as well as independent software packages. Additionally, DECENT is encouraging third-party developers to build their own apps on top of its open-source protocol, opening the way for new publishing opportunities such as blogging spaces, photo galleries, and independent newspapers.

Freedom of speech on the blockchain

What makes DECENT different from other content distribution platforms is both the blockchain technology that supports it and its commitment to freedom of speech. Founded in 2015 by Matej Michalko and Matej Boda, both from Slovakia, DECENT is an independent, nonprofit peer-to-peer network that is wholly owned by its users. As such, it is not affiliated with any economic, media, or political party. In this decentralized network, content is hosted on multiple sites, meaning that it cannot be blocked, manipulated, or tampered with once it is published. This level of security provided by blockchain technology, in effect, also presents an interesting opportunity for independent media and organizations, like political opposition press and dissidents. “We have created a fully integrated and trustworthy worldwide platform system of digital content distribution, where blockchain plays a central role,” says Michalko. “Communication and all payments are done through the blockchain.”

That means no middlemen stand between the content author and the consumer. Whether in the form of publishing fees or content hosting fees, the cut taken by middlemen can be a serious threat to trust and profitability. “Most middlemen are unnecessary,” says Michalko. Citing a key advantage of blockchain compared to other technologies, he adds, “There is a significant reduction of costs of hosting because blockchain is peer-to-peer technology.”

DECENT community: Authors, consumers, publishers

The DECENT community is divided into three groups based on their relationships to the content: authors, consumers, and publishers. Authors comprise all types of content creators, whether they are individuals, organizations, businesses, artists, writers, photographers, software developers, music producers, or videographers. These people can upload their content, regardless of format, to the DECENT platform and set their price for which they will allow someone to access it. “There are no steps needed from the author. Just basically, upload the content to the platform and push the Publish button,” says Michalko. And what about tracking the content downstream? DECENT will tag the content with a digital fingerprint, enabling the author to identify copies of the content, so no unauthorized person can access or distribute it.
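
The “digital fingerprint” concept can be illustrated with a simple content hash. This is a hedged sketch in which a plain SHA-256 digest stands in for whatever fingerprinting scheme DECENT actually uses.

```python
# Illustrative only: fingerprinting an uploaded file with SHA-256 so that
# byte-identical copies can be recognized later. A stand-in for whatever
# fingerprinting scheme DECENT actually uses.
import hashlib

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# A registry mapping fingerprints to the publishing author lets any copy
# of the file be traced back to its original upload, for example:
# registry[fingerprint("my_ebook.epub")] = {"author": "alice", "price": 3}
```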

Consumers are the people who purchase the content and download it for their use. They can either access DECENT via its app or on the Web. “We built our products aimed at the general public, so no special knowledge is needed,” says Michalko. Purchasing content is like entering into a contract: the consumer makes a promise-to-pay; the author then delivers the content; and the payment is made. No payment, no content.

Payment is made with DCT tokens, a cryptocurrency, similar to Bitcoin, that serves as the unit of value in the DECENT community. DCT tokens are exchangeable for Bitcoin (BTC) and for fiat currencies like euros, dollars, or pounds.
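
Conceptually, the promise-to-pay flow behaves like a small escrow-style state machine. The sketch below is a deliberately simplified illustration of that idea, not DECENT’s actual protocol.

```python
# Illustrative only: a toy escrow-style purchase flow ("no payment, no
# content"). A simplification, not DECENT's actual protocol.
class Purchase:
    def __init__(self, buyer: str, author: str, price_dct: int):
        self.buyer, self.author, self.price_dct = buyer, author, price_dct
        self.state = "promised"            # buyer commits tokens up front

    def deliver(self, content_key: str) -> None:
        assert self.state == "promised"
        self.content_key = content_key     # author hands over access to the content
        self.state = "delivered"

    def settle(self) -> None:
        assert self.state == "delivered"
        self.state = "paid"                # committed DCT released to the author

order = Purchase(buyer="bob", author="alice", price_dct=3)
order.deliver(content_key="k-123")
order.settle()
assert order.state == "paid"
```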

Which brings us to the role of publishers. Publishers are the members of the community who sustain the network by applying their independent computing resources for activities like processing transactions between authors and consumers and mining of Bitcoins. They typically provide their services in exchange for a fee. To be a publisher requires the necessary hardware and computing resources.

Startup credo: Build fresh, hire smart

DECENT stands out among blockchain startups in that it has its own blockchain built on open-source protocols. This gives the DECENT team confidence in both the integrity of its technology and the mission of its platform. From September to November 2016, DECENT held an initial coin offering (ICO) for its DCT tokens, raising more than 5,881 BTC, worth roughly USD $4.2 million at the close of the ICO and roughly $15 million as of June 2017. More than 4,000 people backed the ICO, often buying in at two bitcoins or less. The funds will be used by the startup to further the development of its technology. DECENT’s team consists of more than 35 people spread across four global locations: Switzerland, Slovakia, China, and Armenia. Recently, DECENT hosted an AMA (Ask-Me-Anything) session to answer questions about its service offering and progress toward launching its main net. You can watch the recording of that AMA session here.

Michalko’s vision for DECENT is to be the content distribution platform of choice. He says, “I have confidence because we have great technology and a great team, and we are able to eliminate all the unnecessary middlemen.”

For more on blockchain, see How Can You Turn Blockchain Into Business Value?



Digitalist Magazine