
NCAP Medical Collection Removals are Rare and Have No Material Impact to FICO® Scores


The National Consumer Assistance Plan (NCAP) is a comprehensive series of initiatives intended to improve the accuracy of credit reports, the processes for handling credit information, and transparency for consumers. In a previous post, we showed that the July 2017 NCAP public record removals (civil judgments and some tax liens) had no material impact on FICO® Scores. In mid-September 2017, the three consumer reporting agencies (CRAs) are also scheduled to remove the following from credit reports:

  • Medical collections less than 180 days old
  • Medical collections that are ‘paid by insurance’

FICO recently conducted research on a representative sample of millions of US consumers to assess the impact of the NCAP-driven removal of these 3rd party medical collection agency accounts on the FICO® Score. Our results showed that, at the aggregate population level, NCAP-related medical collection removals have no material impact on the FICO® Score's predictive performance, odds-to-score relationship, or score distribution.

The removal of medical collections less than 180 days old is based on the date of first delinquency. Since most medical collections aren’t reported to the CRAs until more than 180 days after the first delinquency, we found that only 0.1% of the total FICO scorable population (roughly 200,000 consumers out of ~200 million) has a medical collection less than 180 days old. Medical collections that are identified in the credit file as being ‘paid by insurance’ are even less common.

Of the impacted population, roughly 3 in 4 saw score changes of less than 20 points. Most consumers with these medical collections have other derogatory information on their credit files, resulting in minimal impact to their FICO® Score once these collections are removed.

In 2014, FICO® Score 9 introduced a more sophisticated way of assessing collections, by ignoring all paid collections and differentiating unpaid medical collections from unpaid non-medical collections. Therefore, paid medical collections removed because of NCAP would already have been bypassed from FICO® Score 9.

FICO® Score 9’s enhancements led to improved predictive performance, while ensuring that those with different types of collections would receive a score commensurate with their credit risk. So these enhancements benefited lenders with a more predictive FICO® Score, while benefiting consumers who took a positive step toward credit responsibility and paid off a collection.

In conclusion, since NCAP medical collection removals are so rare, we observed virtually no perceptible impact to the ability of FICO® Scores to rank-order risk, volumes above or below score cut-offs, or bad rates at any given FICO® Score. Lenders can continue to rely on the stability and predictive performance of the industry standard FICO® Score.



Introducing per-user usage metrics: know your audience and amplify your impact

This June, we released usage metrics for reports and dashboards in the Power BI service. The response has been tremendous. Usage metrics have already been used hundreds of thousands of times by report authors to measure their impact and prioritize their next investments. Since release, by far the biggest request has been for a way to understand which users are taking advantage of your content.

You asked, and we delivered.

Today, we are excited to announce that we're supercharging usage metrics by surfacing the names of your end users. And of course, you're free to copy and customize the pre-built usage metrics reports to drill into the data. The change is currently rolling out worldwide and should be completed by the end of the week.

This simple change has the potential to magnify your impact like never before. Now, you can understand exactly who your audience is, and reach out to your top users directly to gather asks and feedback.

Excited? Read on for a walkthrough of what's new in this update of usage metrics.

Feature Overview

Going forward, when you go to dashboard or report usage metrics, you'll also see a breakdown of the number of views by user. The visual includes the display names and login names of your end users.

Above: new visual in the pre-built usage metrics showing views by user

Note: if you have an existing personalized copy of the usage metrics report, you can continue using it as usual. However, you’ll need to re-personalize a copy to get the new per-user data.

With the “save as” feature, you can copy and customize the pre-built usage metrics report to further drill into how your end users are interacting with your reports. In this release, we’ve augmented the users dimension with display name, login name and the user’s GUID.

Above: the updated users dimension when customizing the usage metrics report

Once you copy the report, you can remove the pre-set dashboard/report filter to see the usage data – including usernames and UPNs – for the entire workspace.

Tip: the UserGuid (aka Object ID) and UserPrincipalName are both unique identifiers for the user in AAD. That means if you export the usage metrics data, you could join the usage metrics data against more data from your directory, like organizational structure, job title, etc.
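To make the tip concrete, here is a minimal sketch of that join in Python. The column names (`UserPrincipalName`, `Views`) and the directory fields are illustrative placeholders, not the exact schema of the export:

```python
# Sketch: joining exported usage metrics with directory data on UserPrincipalName.
# Field names and sample rows are invented for illustration.

usage_rows = [
    {"UserPrincipalName": "ben@contoso.com", "Views": 42},
    {"UserPrincipalName": "beth@contoso.com", "Views": 17},
]

# A directory export keyed by UPN (e.g., pulled from Azure AD)
directory = {
    "ben@contoso.com": {"Department": "Sales", "JobTitle": "Account Executive"},
    "beth@contoso.com": {"Department": "Finance", "JobTitle": "Analyst"},
}

# Left-join each usage row against the directory on the unique identifier
enriched = [
    {**row, **directory.get(row["UserPrincipalName"], {})}
    for row in usage_rows
]

# Roll views up by organizational attribute, e.g. department
views_by_department = {}
for row in enriched:
    dept = row.get("Department", "Unknown")
    views_by_department[dept] = views_by_department.get(dept, 0) + row["Views"]

print(views_by_department)  # {'Sales': 42, 'Finance': 17}
```

The same join works in Excel, Power BI itself, or pandas; the only requirement is that both data sets share the UPN or Object ID column.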

For a full overview of usage metrics and its capabilities, read through our documentation.

Administering Usage Metrics in Your Organization

We understand that, as an IT admin, you may be tasked with ensuring that Power BI remains compliant with a variety of compliance regulations and standards. With this release, we are giving IT admins further control over which users in their organizations can take advantage of usage metrics.

The usage metrics admin control is now granular, allowing you to enable usage metrics for a subset of your organization. In addition, a new option in the admin portal will allow you to disable all existing usage metrics reports in your organization.

Above: granular admin controls governing who has access to usage metrics

Together, these features give you full control over who in your organization can use usage metrics, regardless of how it’s currently being used. For example, if you have users taking advantage of usage metrics, yet would like to control its rollout, you could delete all usage metrics content for users to start with a blank slate. From there, you could enable the feature for an increasing set of security groups as the rollout progresses.

Next steps

  • Try out the feature! To get started, head to the Power BI service and go to any pre-built usage metrics report. Note that we’re still rolling out the feature worldwide, so you may have to wait until the end of the week to see it in action
  • Learn more about the feature through our documentation
  • Have comments? Want to share a cool use case for the feature? We’d love to hear it! Please leave comments below or in the community forums
  • Have ideas for where we should take the feature? Create new ideas or vote on existing ones in UserVoice


Microsoft Power BI Blog | Microsoft Power BI

How We Have Seen Digital Transformation Impact Three Clients

Digital transformation refers to the introduction of technology into your business processes to help your team work smarter, not harder. We've seen companies evolve into paperless offices with better organization of documents, centralized security, and the automation of certain processes to reduce costs and improve productivity. We've also seen how better communication technology, for instance Microsoft's Skype software, which provides video and voice calls as well as instant messaging, can help a company move faster and maintain accuracy.

Digital technology can help facilitate employee workforce mobility programs, providing a big boost to employee productivity and morale while improving an organization’s culture.

Here are just a few of the ways we have seen our clients benefit from digital transformation:

1) An insurance agency used to manage its client information in separate spreadsheets. Implementing Microsoft Dynamics 365 (formerly Microsoft Dynamics CRM) has:

  • provided a consolidated system of all client information across every Broker relationship.
  • significantly reduced the administrative time needed for tracking and updating policy renewal details.
  • resulted in implementation of commission tracking functionality that saves 20 hours a month on what used to be a very tedious manual process.

2) A web design agency wanted to efficiently manage project tasks and budget for individual customer projects. We designed a project management process in Microsoft Dynamics 365 that has:

  • provided Project Templates for each type of project, for quick and consistent setup of new projects.
  • streamlined task management through workflow rules based on project milestones.
  • simplified time entry allowing for real time analysis of project budget vs. project costs.

3) A technology company needed to automate their sales processes based on their newly designed sales methodology. We created a sales workflow in Microsoft Dynamics 365 that:

  • aligns defined opportunity sales stages to the order and completion of relevant sales activities.
  • drives adoption of new sales methodology by providing clear definition of the sales process.
  • reduces sales pipeline subjectivity for better visibility and accuracy of the sales forecast.

Could your business benefit from digital transformation? We think it could, and we’d like to show you how. Download the free whitepaper at www.crmsoftwareblog.com/digital for more examples.

Contact our digital transformation experts at Crowe Horwath at 877-600-2253 or [email protected].

By Ryan Plourde, Crowe Horwath, Microsoft Dynamics 365 Gold Partner, www.crowecrm.com

Follow on Twitter: @CroweCRM



CRM Software Blog | Dynamics 365

Timing is Everything: Leveraging Adaptive Sending and Posting to Optimize Impact


Ever pull into a full parking lot and get a front row spot because someone just happened to be pulling out? Great timing brings great advantages! For marketers, maximizing engagement with your prospects and customers can feel a lot like getting that ideal parking spot – sometimes you get lucky and send out a social post or email to a target contact at the perfect time. Score!

What if you were able to rely less on luck and also improve ‒ and even perfect ‒ your timing? Well, although I’m sorry to say that there’s no app to get you the best parking spot every time, there’s now a better way for you to know how to time your interactions with your target audiences.

Act-On's Adaptive Journeys vision is coming to life, and I'm excited to share some of our tech and product teams' great progress on this front. Specifically, I'm referring to the release for all customers of our new Adaptive Social Posting, along with the private beta for Adaptive Sending that begins soon.

For those of you who weren’t able to make it to Act-On’s San Francisco I♥Marketing event (don’t worry, if you’re on the east coast we have one in New York on August 8) to hear about these advancements in person, I thought I’d take a moment to highlight our product’s two very helpful new capabilities.

Adaptive Sending – solving the age-old dilemma of when to send that email

A common outbound marketing dilemma is figuring out when to send that email. You can read one blog post that says the best open- or click-through rates are on a Sunday at 6 pm … and then read another that claims Tuesday at 10 am is best. Most marketers try to take into account these published best practices while also leveraging their own results from past campaigns. They also do a little A/B testing and ‒ let’s face it ‒ guesswork.

The problem, however, is that the buying journey for each individual can easily look like this:

In the above picture, the interaction times are very different for Ben (morning) vs. Beth (afternoon/evening).

Act-On is solving this all-over-the-map buyers’ journey challenge by not only having a platform that tracks, scores, measures, and connects all of these interactions, but also automatically learns from them. What we call “Adaptive Sending” includes the following main tenets:

  1. Assess each individual’s interaction times.
  2. Predict the best time each person should receive an email (within a campaign that may include thousands of contacts).
  3. Do all of this with one-click simplicity.

This sort of capability goes beyond just adjusting send times for time zones ‒ a function we already do today and that some vendors in this space consider “smartsend” functionality. It also goes beyond just looking at past behaviors and engagement in aggregate to provide one generally recommended send time for all recipients; our tailored-to-the-individual approach is something few vendors do. In addition, Adaptive Sending doesn’t just leverage past email interaction data (opens, click-throughs, etc.), it also takes into consideration each prospect and customer’s past engagement times with your company, such as visits to your website and landing pages and social engagement.
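The core idea behind the three tenets above can be illustrated with a toy sketch: for each contact, look at when they engaged in the past and pick their most common engagement hour. Act-On's actual model is proprietary and far richer (it blends email, website, and social signals); the contacts and timestamps below are invented:

```python
from collections import Counter
from datetime import datetime

# Hypothetical per-recipient send-time optimization: choose, for each contact,
# the hour of day at which they have most often engaged in the past.
past_engagements = {
    "ben@example.com": ["2017-07-03T08:12:00", "2017-07-05T08:47:00",
                        "2017-07-10T09:05:00", "2017-07-12T08:30:00"],
    "beth@example.com": ["2017-07-03T17:20:00", "2017-07-06T18:02:00",
                         "2017-07-11T17:45:00"],
}

def best_send_hour(timestamps):
    """Most common engagement hour for one contact (a crude 'adaptive' choice)."""
    hours = [datetime.fromisoformat(ts).hour for ts in timestamps]
    return Counter(hours).most_common(1)[0][0]

# One schedule entry per contact, even within a campaign of thousands
schedule = {email: best_send_hour(ts) for email, ts in past_engagements.items()}
print(schedule)  # {'ben@example.com': 8, 'beth@example.com': 17}
```

Note how Ben (a morning engager) and Beth (an evening engager) get different send times from the same campaign, which is exactly what a single global "best time" cannot do.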

It’s easy to see why we think Adaptive Sending will be a big deal: The impact of a 6% vs. a 3% click-through or conversion rate can be monumental across all campaigns.

Adaptive Social Posting – solving the common inbound marketing dilemma of when to post

As in the outbound marketing activity of emailing a contact, marketers face a similar challenge in determining the best times to post on LinkedIn, Facebook, or Twitter. They can look across their past campaigns for clues, of course. They can also consult research, but it always seems to tell a different story. With the new Adaptive Social Posting capability, which is now available for all of our customers to flip on in Labs, Act-On will constantly learn when your target audience has engaged with you the most on the social web, and when they've shared, liked, commented on, or clicked through your past posts the most. It only takes a simple one-click step to leverage this functionality in our out-of-the-box social capabilities (see example below).

This innovative system takes the guesswork out of social posting and allows our customers to drive more impressions and engagement ‒ and ultimately greater conversion rates.

There’s much more to come as we drive forward with our Adaptive Journeys vision. I’m looking forward to seeing customers start leveraging some of these great capabilities … and then reap the gains!


Act-On Blog

How a new wave of machine learning will impact today’s enterprise


Advances in deep learning and other machine learning algorithms are currently causing a tectonic shift in the technology landscape. Technology behemoths like Google, Microsoft, Amazon, Facebook and Salesforce are engaged in an artificial intelligence (AI) arms race, gobbling up machine learning talent and startups at an alarming pace. They are building AI technology war chests in an effort to develop an insurmountable competitive advantage.

Today, you can watch a 30-minute deep learning tutorial online, spin up a 10-node cluster over the weekend to experiment, and shut it down on Monday when you’re done – all for the cost of a few hundred bucks. Betting big on an AI future, cloud providers are investing resources to simplify and promote machine learning to win new cloud customers. This has led to an unprecedented level of accessibility that is breeding grassroots innovation in AI. A comparable technology democratization occurred with the internet in the 1990s and, if AI innovation follows a similar trajectory, the world will be a very interesting place in five years.

First, advances in computing technology (GPU chips and cloud computing, in particular) are enabling engineers to solve problems in ways that weren’t possible before. For example, chipmaker NVIDIA has been ramping up production of GPU processors designed specifically to accelerate machine learning, and cloud providers like Microsoft and Google have been using them in their machine learning services.

These advances have a broader impact than just the development of faster, cheaper processors, however. The low cost of computation and the ease of accessing cloud-managed clusters have democratized AI in a way we’ve never seen before. In the past, building a computer cluster to train a deep neural network would have required access to capital or a university research facility. You would have also needed someone with a PhD in mathematics to understand the academic research papers on subjects like convolutional neural networks.

Although everybody points to improvements in CPU/GPU performance as the primary driver of AI innovation, this is only half the equation. Advances in AI algorithms in the mid-1980s broke the spell of the AI winter of the 1970s. The work of deep learning pioneers like Geoffrey Hinton and Yann LeCun solved some of the critical shortcomings that plagued earlier algorithms. In many ways, algorithms like Hinton's backpropagation opened the floodgates for future algorithmic innovations, although these improvements happened at a slower, academic pace. DeepMind's AlphaGo program, for example, combined deep learning with reinforcement learning to build a system that beat the world's highest-ranked Go player in 2017 – a full 20 years later.

Historically, AI has been defined by the ability of a computer to pass the Turing test, which meant the public wasn’t going to be happy with AI until they had a walking, talking robot. Anything less was considered a failure. We’re still far away from creating this kind of general AI, but we’re already solving some types of advanced problems with machine learning, a subset of AI proper. Rather than focus on general intelligence, machine learning algorithms work by improving their ability to perform specific tasks using data. Problems that used to be the exclusive domain of humans – computer vision, speech recognition, autonomous movement – are being solved today by machine learning algorithms.

In fact, machine learning has become such a huge area of focus that, for all practical purposes, the term has become synonymous with AI. Ultimately this is a good thing. The more consumers and companies start associating the term AI with real-world applications of machine learning like self-driving cars, the more they realize that AI is a real thing. It's here to stay, and it holds the promise of reshaping the technology landscape over the next several years.

Enterprises should take advantage by aligning their cloud and technology stacks with providers who are leaders in AI. The gap between the AI haves and have-nots will continue to widen, so picking the right technology providers is critical. For example, a non-AI powered CRM system might allow your sales team to find prospective customers based on the last time they were contacted, helping sales reps search for potentially fruitful leads. But an AI-powered CRM system, in contrast, could proactively feed leads to sales reps in real time using algorithms designed to maximize the likelihood of a sale, based on breaking information about the customer, their company, and the sales rep herself. Choosing the right CRM vendor in this case could have a direct and significant impact on revenue.
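The CRM contrast above boils down to ranking leads by a learned likelihood of sale rather than by last-contact date. A toy sketch of that idea, with invented features and hand-set weights standing in for a model a real system would learn from historical win/loss data:

```python
import math

# Illustrative only: a toy version of how an AI-powered CRM might rank leads.
# Features, weights, and lead data are invented; a production system would
# learn the weights from historical outcomes.
WEIGHTS = {"recent_site_visits": 0.8, "company_growth": 1.2, "days_since_contact": -0.05}
BIAS = -1.0

def win_likelihood(lead):
    """Logistic score in (0, 1): higher means more likely to close."""
    z = BIAS + sum(WEIGHTS[f] * lead[f] for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

leads = [
    {"name": "Acme",   "recent_site_visits": 5, "company_growth": 1.5, "days_since_contact": 3},
    {"name": "Globex", "recent_site_visits": 0, "company_growth": 0.2, "days_since_contact": 40},
]

# Feed the rep the most promising lead first, not the least-recently contacted one
for lead in sorted(leads, key=win_likelihood, reverse=True):
    print(lead["name"], round(win_likelihood(lead), 2))
```

Note that the non-AI ranking (last contact date) would surface Globex first here, while the likelihood-based ranking surfaces Acme, the lead with strong recent engagement signals.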

This year will, if it hasn’t already, bring the realization that if we don’t develop strong in-house machine learning capabilities now, we’ll end up on the wrong side of the future of technology. However, rather than hire teams of AI innovators like the first wave of AI tech giants have done, today’s technology companies must build their AI capabilities using out-of-the-box machine learning tools from AI-focused platform providers like Microsoft and Google.

The increasing demand for AI-driven technology, combined with the dearth of machine learning talent in the labor pool, will force the democratization of data science. Indeed, Google and Microsoft are betting the farm on this trend, which is why they’re making such huge investments in machine learning education and easy-to-use AI tools.

Jake Bennett is CTO of Seattle-based agency POP.


Big Data – VentureBeat

Microsoft Teams and its Impact with Microsoft Dynamics 365


Microsoft Teams is a chat-based workspace in the Office 365 suite. Microsoft introduced Teams in November 2016, and it was designed as a competitor to Slack. It is a platform that combines workspace chat, meetings, notes, and attachments. It integrates with the company's Office 365 productivity suite, including Skype, and features extensions for integrating other technologies. Teams' promising feature set raises a lot of questions heading into the future as everyone moves to the cloud with Microsoft Dynamics 365. In this blog, we will explore some of the main features Microsoft Teams has to offer and how they tie in with Microsoft Dynamics 365.

Microsoft Teams has a goal of bringing everything together – people, conversations and content. It promotes easy collaboration such that a working team can achieve more. The chat space named Conversations is grouped into Teams, which can be further modularised into Channels. This is where persistent and threaded chats are used to keep Team members engaged. Team-wide and private discussions are available within these constraints.


Conversations is not limited to messaging. Common resources shared within a Team are posted in Conversations or pinned in Tabs for easy access. This includes shared files, calendars, and other content which encourages collaborative editing. Each Team and Channel can be customized accordingly when adding Tabs. Shared and integrated components are publicly available to Team members. These range from anything in the Office suite to other content such as website URLs, Power BI dashboards, Zendesk, and SharePoint.

On launch, Microsoft Teams shipped with over 70 Connectors and 85 Bots which can participate in conversations. For example, within Conversations, you can create polls using the Polly Bot by tagging Polly and adhering to the simple question-answer format. T-Bot is also available in Chat for assistance with anything related to Microsoft Teams. Establishing the Dynamics 365 Connector with Microsoft Teams will be further explained in a later blog post.

Currently, the Connector's functionality is still quite limited. It works similarly to the Office 365 Connector for Groups, which connects new or existing Office 365 Groups with Dynamics 365 so that the Group is notified when new Activities are posted. It works the same way with Teams: the connection is made on a Channel within a Team, and Team members are notified of updates to activities on specific CRM records. In the image below, I established a Dynamics 365 connection in a Channel within a Team. Then, in my Dynamics 365 environment, I tested the connection by creating an Activity record for the Account record “A. Datum Corporation”. The Meeting details are highlighted below, and Teams handily includes a summary of the monitored CRM record, such as Annual Revenue and Number of Employees.

[Image: Teams channel notification showing the Meeting details and a summary of the monitored CRM record]
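Configuring the built-in Connector is entirely point-and-click, but for readers curious about the plumbing, a similar channel notification can be pushed through a custom incoming webhook, a related Office 365 connector mechanism. This is only a sketch: the webhook URL and the CRM record fields below are placeholders, and no code is needed for the built-in Dynamics 365 Connector itself.

```python
import json
from urllib import request

# Placeholder webhook URL: a real one is generated per-channel in Teams.
WEBHOOK_URL = "https://outlook.office.com/webhook/your-webhook-id"

def build_card(record):
    """Build a simple Office 365 connector card summarizing a CRM record."""
    return {
        "@type": "MessageCard",
        "@context": "https://schema.org/extensions",
        "title": f"New activity on {record['account']}",
        "text": f"Annual revenue: {record['annual_revenue']}, "
                f"employees: {record['employees']}",
    }

def post_card(card, url=WEBHOOK_URL):
    """POST the card to the channel's incoming webhook."""
    req = request.Request(url, data=json.dumps(card).encode("utf-8"),
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

# Build (but don't send) a card for the record used in the example above
card = build_card({"account": "A. Datum Corporation",
                   "annual_revenue": "$10,000,000", "employees": 120})
print(card["title"])  # New activity on A. Datum Corporation
```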

Dynamics 365 can still be used in Microsoft Teams, as it is an Internet-Facing Deployment (IFD), by adding it to the Channel Tabs as a website. By simply entering the Dynamics 365 URL, you can use Dynamics 365 within Microsoft Teams as normal. The user still needs to authenticate by entering their username and password (you'd expect this to happen automatically, especially since Microsoft Teams already uses Office 365 accounts).

[Image: Dynamics 365 running as a website inside a Channel Tab in Microsoft Teams]

One thing to note, though, is that using Dynamics 365 within Teams is neither intuitive nor user friendly. For one, navigating in and out of the Dynamics 365 Channel Tab resets wherever you were previously in Dynamics 365 and takes you back to the landing page. Most Dynamics 365 users also keep multiple Dynamics 365 records open in multiple browser tabs or windows, which Microsoft Teams does not support, as it is not a dedicated browser. Another interesting thing to note is that users have to allow Microsoft Dynamics 365 in their pop-up blocker. Notably, I initially could not use Advanced Find within Microsoft Teams because of the following pop-up error:

[Image: pop-up blocked error shown when opening Advanced Find in Microsoft Teams]

It will be interesting to see how further development of Microsoft Teams will affect its integration with Microsoft Dynamics 365. Some things to consider include how Teams and its integration with Outlook items, such as Meetings, can transfer to Dynamics 365 and their corresponding Dynamics 365 Activity records. Added to these are Skype's integration with video meetings, and the implications for CRM for Outlook. This would of course depend on the demand from users who use Microsoft Teams as their centralized software tool.

Microsoft Teams is a promising tool for tying together multiple integrated technologies, including Microsoft Dynamics 365. The use of Dynamics 365 in Microsoft Teams still needs refining and tweaking. There is a lot of setting up to do within each Team and its Channels, across the technologies involved, and, most importantly, around the actual process of people using the tool to collaborate and share information within a single centralized platform. It's too early at this stage to know how Microsoft Teams will be accepted by users in conjunction with Microsoft Dynamics 365, but it is exciting to see how it goes about accomplishing its goal of bringing everything together.

Magnetism Solutions Dynamics CRM Blog

How To Present With Impact In The Digital Age

Dan McCaffrey has an ambitious goal: solving the world’s looming food shortage.

As vice president of data and analytics at The Climate Corporation (Climate), which is a subsidiary of Monsanto, McCaffrey leads a team of data scientists and engineers who are building an information platform that collects massive amounts of agricultural data and applies machine-learning techniques to discover new patterns. These analyses are then used to help farmers optimize their planting.

“By 2050, the world is going to have too many people at the current rate of growth. And with shrinking amounts of farmland, we must find more efficient ways to feed them. So science is needed to help solve these things,” McCaffrey explains. “That’s what excites me.”

“The deeper we can go into providing recommendations on farming practices, the more value we can offer the farmer,” McCaffrey adds.

But to deliver that insight, Climate needs data—and lots of it. That means using remote sensing and other techniques to map every field in the United States and then combining that information with climate data, soil observations, and weather data. Climate’s analysts can then produce a massive data store that they can query for insights.


Meanwhile, precision tractors stream data into Climate’s digital agriculture platform, which farmers can then access from iPads through easy data flow and visualizations. They gain insights that help them optimize their seeding rates, soil health, and fertility applications. The overall goal is to increase crop yields, which in turn boosts a farmer’s margins.

Climate is at the forefront of a push toward deriving valuable business insight from Big Data that isn’t just big, but vast. Companies of all types—from agriculture through transportation and financial services to retail—are tapping into massive repositories of data known as data lakes. They hope to discover correlations that they can exploit to expand product offerings, enhance efficiency, drive profitability, and discover new business models they never knew existed.

The internet democratized access to data and information for billions of people around the world. Ironically, however, access to data within businesses has traditionally been limited to a chosen few—until now. Today’s advances in memory, storage, and data tools make it possible for companies both large and small to cost effectively gather and retain a huge amount of data, both structured (such as data in fields in a spreadsheet or database) and unstructured (such as e-mails or social media posts). They can then allow anyone in the business to access this massive data lake and rapidly gather insights.

It’s not that companies couldn’t do this before; they just couldn’t do it cost effectively and without a lengthy development effort by the IT department. With today’s massive data stores, line-of-business executives can generate queries themselves and quickly churn out results—and they are increasingly doing so in real time. Data lakes have democratized both the access to data and its role in business strategy.

Indeed, data lakes move data from being a tactical tool for implementing a business strategy to being a foundation for developing that strategy through a scientific-style model of experimental thinking, queries, and correlations. In the past, companies’ curiosity was limited by the expense of storing data for the long term. Now companies can keep data for as long as it’s needed. And that means companies can continue to ask important questions as they arise, enabling them to future-proof their strategies.


Prescriptive Farming

Climate's McCaffrey has many questions to answer on behalf of farmers. Climate provides several types of analytics to farmers, including descriptive services, which are metrics about the farm and its operations, and predictive services related to weather and soil fertility. But eventually the company hopes to provide prescriptive services, helping farmers address the many decisions they make each year to achieve the best outcome at the end of the season. Data lakes will provide the answers that enable Climate to follow through on its strategy.

Behind the scenes at Climate is a deep-science data lake that provides insights, such as predicting the fertility of a plot of land by combining many data sets to create accurate models. These models allow Climate to give farmers customized recommendations based on how their farm is performing.

“Machine learning really starts to work when you have the breadth of data sets from tillage to soil to weather, planting, harvest, and pesticide spray,” McCaffrey says. “The more data sets we can bring in, the better machine learning works.”
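McCaffrey's point about breadth of data sets comes down to joining many per-field sources into one training table before fitting a model. The sketch below uses invented field IDs, measurements, and yields, and a deliberately tiny one-variable least-squares fit in place of Climate's actual models:

```python
# Toy illustration: join several field-level data sets, then fit a simple
# yield model. All data and the single-predictor fit are invented examples.
soil    = {"field1": {"nitrogen": 30},  "field2": {"nitrogen": 45},  "field3": {"nitrogen": 60}}
weather = {"field1": {"rain_mm": 400},  "field2": {"rain_mm": 520},  "field3": {"rain_mm": 610}}
yields  = {"field1": 120.0, "field2": 150.0, "field3": 180.0}

# Join the data sets on field ID into one training table
rows = [{**soil[f], **weather[f], "yield": yields[f]} for f in yields]

def fit_line(xs, ys):
    """Ordinary least squares for a single predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

slope, intercept = fit_line([r["nitrogen"] for r in rows],
                            [r["yield"] for r in rows])
predicted = slope * 50 + intercept   # predicted yield for a new field
print(round(predicted, 1))  # 160.0
```

In practice, each extra joined source (tillage, planting, pesticide spray, sensor streams) becomes another predictor column in `rows`, which is exactly why breadth of data sets matters.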

The deep-science infrastructure already has terabytes of data but is poised for significant growth as it handles a flood of measurements from field-based sensors.

“That’s really scaling up now, and that’s what’s also giving us an advantage in our ability to really personalize our advice to farmers at a deeper level because of the information we’re getting from sensor data,” McCaffrey says. “As we roll that out, our scale is going to increase by several magnitudes.”

Also on the horizon is more real-time data analytics. Currently, Climate receives real-time data from its application that streams data from the tractor’s cab, but most of its analytics applications are run nightly or even seasonally.

In August 2016, Climate expanded its platform to third-party developers so other innovators can also contribute data, such as drone-captured data or imagery, to the deep-science lake.

“That helps us in a lot of ways, in that we can get more data to help the grower,” McCaffrey says. “It’s the machine learning that allows us to find the insights in all of the data. Machine learning allows us to take mathematical shortcuts as long as you’ve got enough data and enough breadth of data.”

Predictive Maintenance

Growth is essential for U.S. railroads, which reinvest a significant portion of their revenues in maintenance and improvements to their track systems, locomotives, rail cars, terminals, and technology. With an eye on growing its business while also keeping its costs down, CSX, a transportation company based in Jacksonville, Florida, is adopting a strategy to make its freight trains more reliable.

In the past, CSX maintained its fleet of locomotives through regularly scheduled maintenance activities, which prevent failures in most locomotives as they transport freight from shipper to receiver. To achieve even higher reliability, CSX is tapping into a data lake to power predictive analytics applications that will improve maintenance activities and prevent more failures from occurring.


Beyond improving customer satisfaction and raising revenue, CSX’s new strategy also has major cost implications. Trains are expensive assets, and it’s critical for railroads to drive up utilization, limit unplanned downtime, and prevent catastrophic failures to keep the costs of those assets down.

That’s why CSX is putting all the data related to the performance and maintenance of its locomotives into a massive data store.

“We are then applying predictive analytics—or, more specifically, machine-learning algorithms—on top of that information that we are collecting to look for failure signatures that can be used to predict failures and prescribe maintenance activities,” says Michael Hendrix, technical director for analytics at CSX. “We’re really looking to better manage our fleet and the maintenance activities that go into that so we can run a more efficient network and utilize our assets more effectively.”

“In the past we would have to buy a special storage device to store large quantities of data, and we’d have to determine cost benefits to see if it was worth it,” says Donna Crutchfield, assistant vice president of information architecture and strategy at CSX. “So we were either letting the data die naturally, or we were only storing the data that was determined to be the most important at the time. But today, with the new technologies like data lakes, we’re able to store and utilize more of this data.”

CSX can now combine many different data types, such as sensor data from across the rail network and other systems that measure movement of its cars, and it can look for correlations across information that wasn’t previously analyzed together.

One of the larger data sets that CSX is capturing comprises the findings of its “wheel health detectors” across the network. These devices capture different signals about the bearings in the wheels, as well as the health of the wheels in terms of impact, sound, and heat.

“That volume of data is pretty significant, and what we would typically do is just look for signals that told us whether the wheel was bad and if we needed to set the car aside for repair. We would only keep the raw data for 10 days because of the volume and then purge everything but the alerts,” Hendrix says.

With its data lake, CSX can keep the wheel data for as long as it likes. “Now we’re starting to capture that data on a daily basis so we can start applying more machine-learning algorithms and predictive models across a larger history,” Hendrix says. “By having the full data set, we can better look for trends and patterns that will tell us if something is going to fail.”
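The approach Hendrix describes, mining a longer sensor history for failure signatures rather than reacting to one-off alerts, can be sketched in miniature. The example below is a hypothetical illustration only (the wheel IDs, readings, and threshold are invented, and CSX’s actual models are certainly far richer): it flags wheels whose recent average impact readings climb well above their own historical baseline.

```python
from statistics import mean, stdev

def flag_failing_wheels(history, recent_days=7, threshold=3.0):
    """Flag wheels whose recent average impact reading sits more than
    `threshold` standard deviations above their historical baseline.

    history: dict mapping wheel_id -> list of daily impact readings,
             oldest first (hypothetical units).
    """
    flagged = []
    for wheel_id, readings in history.items():
        baseline, recent = readings[:-recent_days], readings[-recent_days:]
        if len(baseline) < 2:
            continue  # not enough history to establish a baseline
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # perfectly flat baseline; nothing to compare against
        if (mean(recent) - mu) / sigma > threshold:
            flagged.append(wheel_id)
    return flagged

history = {
    "W-101": [10, 11, 9, 10, 10, 11, 10, 9, 10, 11, 10, 10, 10, 10],   # stable
    "W-202": [10, 11, 9, 10, 10, 11, 10, 24, 26, 25, 27, 26, 25, 28],  # degrading
}
print(flag_failing_wheels(history))  # → ['W-202']
```

The key point is the one Hendrix makes: with only ten days of raw data, the baseline in a comparison like this barely exists; with a full history in the lake, gradual degradation becomes visible as a trend rather than a single alert.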


Another key ingredient in CSX’s data set is locomotive oil. By analyzing oil samples, CSX is developing better predictions of locomotive failure. “We’ve been able to determine when a locomotive would fail and predict it far enough in advance so we could send it down for maintenance and prevent it from failing while in use,” Crutchfield says.

“Between the locomotives, the tracks, and the freight cars, we will be looking at various ways to predict those failures and prevent them so we can improve our asset allocation. Then we won’t need as many assets,” she explains. “It’s like an airport. If a plane has a failure and it’s due to connect at another airport, all the passengers have to be reassigned. A failure affects the system like dominoes. It’s a similar case with a railroad. Any failure along the road affects our operations. Fewer failures mean more asset utilization. The more optimized the network is, the better we can service the customer.”

Detecting Fraud Through Correlations

Traditionally, business strategy has been a very conscious practice, presumed to emanate mainly from the minds of experienced executives, daring entrepreneurs, or high-priced consultants. But data lakes take strategy out of that rarefied realm and put it in the environment where just about everything in business seems to be going these days: math—specifically, the correlations that emerge from applying a mathematical algorithm to huge masses of data.

The Financial Industry Regulatory Authority (FINRA), a nonprofit group that regulates broker behavior in the United States, used to rely on the experience of its employees to come up with strategies for combating fraud and insider trading. It still does that, but now FINRA has added a data lake to find patterns that a human might never see.

Overall, FINRA processes over five petabytes of transaction data from multiple sources every day. By switching from traditional database and storage technology to a data lake, FINRA was able to set up a self-service process that allows analysts to query data themselves without involving the IT department; search times dropped from several hours to 90 seconds.

While traditional databases were good at defining relationships with data, such as tracking all the transactions from a particular customer, the new data lake configurations help users identify relationships that they didn’t know existed.

Leveraging its data lake, FINRA creates an environment for curiosity, empowering its data experts to search for suspicious patterns of fraud, market manipulation, and compliance violations. As a result, FINRA handed out 373 fines totaling US$134.4 million in 2016, a record for the agency, according to Law360.

Data Lakes Don’t End Complexity for IT

Though data lakes make access to data and analysis easier for the business, they don’t necessarily make the CIO’s life a bed of roses. Implementations can be complex, and companies rarely want to walk away from investments they’ve already made in data analysis technologies, such as data warehouses.

“There have been so many millions of dollars going to data warehousing over the last two decades. The idea that you’re just going to move it all into a data lake isn’t going to happen,” says Mike Ferguson, managing director of Intelligent Business Strategies, a UK analyst firm. “It’s just not compelling enough of a business case.” But Ferguson does see data lake efficiencies freeing up the capacity of data warehouses to enable more query, reporting, and analysis.

Data lakes also don’t free companies from the need to clean up and manage data as part of the process required to gain these useful insights. “The data comes in very raw, and it needs to be treated,” says James Curtis, senior analyst for data platforms and analytics at 451 Research. “It has to be prepped and cleaned and ready.”

Companies must have strong data governance processes, as well. Customers are increasingly concerned about privacy, and rules for data usage and compliance have become stricter in some areas of the globe, such as the European Union.

Companies must create data usage policies, then, that clearly define who can access, distribute, change, delete, or otherwise manipulate all that data. Companies must also make sure that the data they collect comes from a legitimate source.

Many companies are responding by hiring chief data officers (CDOs) to ensure that as more employees gain access to data, they use it effectively and responsibly. Indeed, research company Gartner predicts that 90% of large companies will have a CDO by 2019.

Data lakes can be configured in a variety of ways: centralized or distributed, with storage on premise or in the cloud or both. Some companies have more than one data lake implementation.

“A lot of my clients try their best to go centralized for obvious reasons. It’s much simpler to manage and to gather your data in one place,” says Ferguson. “But they’re often plagued somewhere down the line with much more added complexity and realize that in many cases the data lake has to be distributed to manage data across multiple data stores.”

Meanwhile, the massive capacities of data lakes mean that data that once flowed through a manageable spigot is now blasting at companies through a fire hose.

“We’re now dealing with data coming out at extreme velocity or in very large volumes,” Ferguson says. “The idea that people can manually keep pace with the number of data sources that are coming into the enterprise—it’s just not realistic any more. We have to find ways to take complexity away, and that tends to mean that we should automate. The expectation is that the information management software, like an information catalog for example, can help a company accelerate the onboarding of data and automatically classify it, profile it, organize it, and make it easy to find.”

Beyond the technical issues, IT and the business must also make important decisions about how data lakes will be managed and who will own the data, among other things (see How to Avoid Drowning in the Lake).


How to Avoid Drowning in the Lake

The benefits of data lakes can be squandered if you don’t manage the implementation and data ownership carefully.

Deploying and managing a massive data store is a big challenge. Here’s how to address some of the most common issues that companies face:

Determine the ROI. Developing a data lake is not a trivial undertaking. You need a good business case, and you need a measurable ROI. Most importantly, you need initial questions that can be answered by the data, which will prove its value.

Find data owners. As devices with sensors proliferate across the organization, the issue of data ownership becomes more important.

Have a plan for data retention. Companies used to have to cull data because it was too expensive to store. Now companies can become data hoarders. How long do you store it? Do you keep it forever?

Manage descriptive data. Software that allows you to tag all the data in one or multiple data lakes and keep it up-to-date is not mature yet. We still need tools to bring the metadata together to support self-service and to automate metadata to speed up the preparation, integration, and analysis of data.

Develop data curation skills. There is a huge skills gap for data repository development. But many people will jump at the chance to learn these new skills if companies are willing to pay for training and certification.

Be agile enough to take advantage of the findings. It used to be that you put in a request to the IT department for data and had to wait six months for an answer. Now, you get the answer immediately. Companies must be agile to take advantage of the insights.

Secure the data. Besides the perennial issues of hacking and breaches, a lot of data lake software is open source and less secure than typical enterprise-class software.

Measure the quality of data. Different users can work with varying levels of quality in their data. For example, data scientists working with a huge number of data points might not need completely accurate data, because they can use machine learning to cluster data or discard outlying data as needed. However, a financial analyst might need the data to be completely correct.

Avoid creating new silos. Data lakes should work with existing data architectures, such as data warehouses and data marts.
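The outlier handling mentioned under “Measure the quality of data” can be made concrete with a minimal sketch. This is a generic, hypothetical example with invented readings, not a prescription: it applies Tukey’s fences (1.5 × the interquartile range) to discard points far outside the middle of the distribution before modeling.

```python
from statistics import quantiles

def drop_outliers(values, k=1.5):
    """Discard points outside k * IQR of the middle 50% (Tukey's fences),
    the kind of rough cleanup a data scientist might apply before modeling."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo <= v <= hi]

readings = [9.8, 10.1, 9.9, 10.0, 10.2, 9.7, 10.3, 97.0]  # 97.0 is a sensor glitch
print(drop_outliers(readings))  # the glitch reading is dropped
```

As the sidebar notes, how aggressively to filter depends on the user: a data scientist clustering millions of points can tolerate, or deliberately discard, noise like this, while a financial analyst may need every record intact.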

From Data Queries to New Business Models

The ability of data lakes to uncover previously hidden data correlations can massively impact any part of the business. For example, in the past, a large soft drink maker used to stock its vending machines based on local bottlers’ and delivery people’s experience and gut instincts. Today, using vast amounts of data collected from sensors in the vending machines, the company can essentially treat each machine like a retail store, optimizing the drink selection by time of day, location, and other factors. Doing this kind of predictive analysis was possible before data lakes came along, but it wasn’t practical or economical at the individual machine level because the amount of data required for accurate predictions was simply too large.

The next step is for companies to use the insights gathered from their massive data stores not just to become more efficient and profitable in their existing lines of business but also to actually change their business models.

For example, product companies could shield themselves from the harsh light of comparison shopping by offering the use of their products as a service, with sensors on those products sending the company a constant stream of data about when they need to be repaired or replaced. Customers are spared the hassle of dealing with worn-out products, and companies are protected from competition as long as customers receive the features, price, and the level of service they expect. Further, companies can continuously gather and analyze data about customers’ usage patterns and equipment performance to find ways to lower costs and develop new services.

Data for All

Given the tremendous amount of hype that has surrounded Big Data for years now, it’s tempting to dismiss data lakes as a small step forward in an already familiar technology realm. But it’s not the technology that matters as much as what it enables organizations to do. By making data available to anyone who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.

“Companies that do not actively invest in data lakes will truly be left behind,” says Anita Raj, principal growth hacker at DataRPM, which sells predictive maintenance applications to manufacturers that want to take advantage of these massive data stores. “So it’s just the option of disrupt or be disrupted.” D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.

About the Authors:

Timo Elliott is Vice President, Global Innovation Evangelist, at SAP.

John Schitka is Senior Director, Solution Marketing, Big Data Analytics, at SAP.

Michael Eacrett is Vice President, Product Management, Big Data, Enterprise Information Management, and SAP Vora, at SAP.

Carolyn Marsan is a freelance writer who focuses on business and technology topics.



Digitalist Magazine

The Artificial Intelligence Impact: Friend or Foe to Marketing?


The era of Artificial Intelligence (AI) is upon us. More and more, computer systems will become able to perform tasks that previously required human intelligence, such as decision making, visual perception, and speech and language recognition.

Marketing, like most other fields, will feel AI’s impact in several areas, including database marketing techniques, search queries and search engine optimization (SEO), personalization, predictive customer service, sales forecasting, customer segmentation, pricing, and many others.

Act-On recently asked several marketing experts and analysts for their assessments of the pros and cons of using Artificial Intelligence in marketing and what they foresee for the future.

Question: Will Artificial Intelligence (AI) be marketing’s friend or foe? How do you see the two co-existing as AI becomes more mainstream?

Jim Ward (Act-On customer)


As AI learns and develops, I can foresee buying behaviors tied together with automated nurture or real-time programs, as one example. With any data that impacts behavior, AI will be able to quickly make assessments and guide a process. Think about how the weather in a target region might impact a buyer or a decision process; connected with marketing automation, that insight will likely enhance results.

David Raab

Industry Analyst
Raab Associates

If AI takes over our jobs and relentlessly seeks to eliminate all vestiges of human involvement, Terminator-style, I guess that would qualify as “foe”.

But it’s much more likely that AI will augment human capabilities, in particular by making individual- or micro-segment-level decisions with much more specificity than humans can do. In other words, AI can easily monitor individual behaviors and find appropriate responses in millions of different situations, whereas human marketers can only deal with segments and create rules or workflows that deal with dozens or hundreds of situations.

Similarly, AI can closely watch results to find new patterns as they emerge, something that humans can only do sporadically and with less attention to detail. For example, it’s wholly conceivable that AI could write individually tailored catalog content for tens of thousands of items, taking the tone and pitching the features that will most appeal to every person based on their psychological profile. 

What AI won’t be able to do for a long time is to come up with radically new ideas, or to judge which random possibilities are actually promising enough to be worth testing. So we’ve already seen AI take over some more routine marketing tasks, such as certain kinds of reporting and analysis, and even climb a bit up the value chain to making suggestions based on those reports. That will be more widespread. 

And, AI will take over much of the actual message selection, replacing human-built rules with AI-based recommendations: something that’s also already in place. One thing I do expect is that AIs will broaden their scope, so instead of a lot of separate AIs, each handling a particular task (chatbot here, recommendation engine there, etc.), we’ll see a single AI or several tightly connected AIs working together.

Michael Krigsman

Industry Analyst
CXO Talk

AI is a tool; it’s neither the next great hope for marketers nor something to fear and avoid. If we remember that AI is really advanced pattern matching based on lots of compute power crunching through large data sets, the whole conundrum becomes easier to manage.

AI tools can find correlations between customer digital activity and likely subsequent behavior. For example, AI will help us personalize and tailor messages specifically for individual customers. If a woman on your site is looking at shoes, historical data patterns may suggest that sending an offer late in the evening will be more effective than during the workday. Or maybe the patterns will show that men buying black shoes in wide sizes on the weekend are six times more likely to buy blue socks within three days. AI can help us find these patterns in ways never before possible.
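The “six times more likely” pattern Krigsman describes corresponds to what market-basket analysis calls lift: how much more often two purchases co-occur than the baseline rate would predict. Here is a minimal sketch, with invented transaction data chosen so the lift comes out at six, matching his socks example:

```python
def lift(transactions, a, b):
    """Lift of buying item b given item a: how much more likely b is
    purchased alongside a than it is overall. Lift > 1 suggests the
    items are correlated.

    transactions: list of sets of purchased items (a and b must each
    appear at least once).
    """
    n = len(transactions)
    with_a = [t for t in transactions if a in t]
    p_b = sum(b in t for t in transactions) / n             # baseline rate of b
    p_b_given_a = sum(b in t for t in with_a) / len(with_a) # rate of b alongside a
    return p_b_given_a / p_b

# Invented data: 12 purchases; "blue_socks" appear only in the two
# baskets that also contain wide black shoes.
transactions = [{"black_shoes_wide", "blue_socks"}] * 2 + [{"sandals"}] * 10
print(round(lift(transactions, "black_shoes_wide", "blue_socks"), 6))  # → 6.0
```

At production scale this kind of mining runs over millions of baskets using algorithms such as Apriori or FP-Growth, but the per-pair arithmetic is exactly this.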

Despite the obvious benefits, there are also real concerns, especially around privacy. With great power comes the ability to become annoying — and even creepy. Marketers must temper their enthusiasm for personalization with the caution not to be overly intrusive. Remember the story of a teen whose father found out she was pregnant because her shopping patterns at Target triggered advertisements for pregnancy items? That’s beyond creepy and it’s all based on data and analysis.

The lessons for marketers are clear: Use AI to personalize messages your customers will welcome and want. Think about campaigns through the eyes of your customer; if it’s helpful and useful to customers, they’ll respond positively. If the personalization becomes too personal or frequent, you’ll just alienate valuable customers. As with many things, finding the right balance is critical.

Douglas Karr

DK New Media

As with any tool in marketing, AI can be either [friend or foe]. Email, for instance, gets amazing results when utilized effectively. When it’s not, you’re a spammer who is doing more harm than good. AI is no different; it’s a tool that will have tremendous results when applied to situations where the marketing strategy is on a solid foundation. 

Bots, for example, may hold great conversations for a company that has too many site visitors to respond to each independently. They can help directly or route the conversation based on what they learn from the interaction. A/B and multivariate testing with AI will help marketers automatically improve their site copy or even personalize it for maximum impact. AI can streamline and reduce the workload on the marketer, enabling them to build strategies instead of handling mundane tasks or efforts they lack the resources for.

Daniel Newman

Principal Analyst, Futurum Research
CEO, Broadsuite Media Group

Artificial Intelligence will be the best thing to ever happen to marketers. With all of the data being created every second, there is almost no way for marketers to access and process all of the useful data in real time. This is where AI will be able to augment the role of the marketer. By rapidly processing and synthesizing an unprecedented amount of data from across the web and from our structured data systems, marketers will be able to get closer to not just real time, but also to right time marketing that puts the right information in front of the right consumer in as close to their “Zero Moment” as possible.

Christine Crandall


Our love/hate relationship with AI has its roots in science fiction movies like Her, Transcendence, and Star Wars. In these stories, AI thinks for itself, solves problems creatively, and mostly tries to keep humans from getting into more trouble. Fiction is great, as it fuels dreams of innovation. The reality of AI is quite different: software programs, created and managed by humans, perform predefined micro-tasks following pre-set decision trees designed to automate routine, repeatable tasks.

Augmenting marketers’ skills, akin to being their wingman, AI will have more impact on the discipline than any martech product to date. AI can quickly analyze data to spot and evaluate trends, then develop risk/return-based actions for a host of routine activities and decisions: ad networks, keywords, CTAs, personas, journeys, budget and ROI reports, competitive intelligence, and so on. That will free marketing to focus on higher-value activities better suited to humans, such as strategy, cross-functional collaboration, new businesses and markets, and customer alignment.

How each organization, however, leverages AI depends on its own market, business model, and core competencies. It won’t be “one size fits all” but rather an ecosystem of interlocking frameworks that will require marketers and CIOs to closely partner in order to reap the potential benefits. Marketing will need to prepare for AI and carry forward the lessons learned from predictive analytics and today’s patchwork of martech solutions — automation can only offer recommendations that are as good as the underlying data sources. In the end, the human is still the pilot.

Blake Morgan

Customer Experience Futurist, Author, and Keynote Speaker
Columnist at Forbes

Artificial intelligence makes marketers better listeners. Machine learning can help determine what a customer needs, and being a good listener is an important part of being a good communicator. It’s marketing’s job to learn how to communicate with prospects and customers. Marketing will rely increasingly on AI in the future to determine what customers need, what they want, and what kinds of experiences they are looking for.

Morgan’s book More Is More: How the Best Companies Go Farther and Work Harder to Create Knock-Your-Socks-Off Customer Experiences can be found here.

John Koetsier

Journalist, Analyst, Futurist, and Dreamer

AI is most definitely going to be a marketer’s friend. Marketers spend far too much time right now gathering, merging, cleansing, and normalizing data, and AI is going to make that simple. (In fact, it already is.) In addition, the customer journey is far too complex — one company mapped 500 points on their customers’ journeys — to really understand, never mind react to in real-time with appropriate messages, resources, or services. AI will also help here, delivering on the promise of technology by enabling one-to-one communication with prospects and customers at scale.

Ankush Gupta

COO & Editor-in-Chief
MarTech Advisor

Marketing is driven by consumer behavior and needs. And, consumers today expect marketers to be clued into their expectations, actions, and needs in real time. Of course, they want this without marketers being the least bit intrusive or creepy. And, marketers on their part want to seem like they knew all of this somehow magically, making the entire delightful experience as seamless as possible.

So, I think yes, AI can and will be marketing’s friend, because there is no other way for marketers to provide personalized delightful experiences at scale. Whether it’s with customer experience or business decisions, AI is going to propel marketing into the future, which is all about data and intelligence.

Not only do I believe AI and marketing will co-exist, AI will be the driver for marketing to keep pace with customer dynamics and evolution. Whether you’re talking B2C or B2B, buying behavior is changing dramatically and at great speed. There is no way to keep pace with it manually. But, AI can do the heavy lifting at scale. Whether it’s tracking the millions of customer-generated data points and variables, running analytics on seemingly disjointed pieces of data and turning it into intel, answering (millions of) customer queries intelligently in real time, or recommending micro-segmented campaign strategy based on predictive models, AI can be the strongest ally for marketing yet. 

The only catch is getting the balance between machine-generated intelligence and human instinct and discretion right. Building in cultural nuances, for example, is a critical area for technology and marketing to get right before entrusting customer experience to AI. 

Dom Nicastro

Staff Reporter

Poor customer experience is an epidemic. It’s going to bring down the largest companies in the next few years. The only ones left standing will be the ones that provide exceptional experiences.

I know this personally because I went through a terrible experience where I was bounced around several different channels in an ordeal that took weeks to resolve. No matter how great the product and technology is, no matter how many great offers and discounts I get, I associate this brand with that experience. And I tell people about it.

How does this relate to AI? I do think machine learning engines could have been used in my experience to at least learn more about me — which channels do I go to in order to resolve problems, what’s my tone and sentiment toward the brand? “Dom, we understand you’ve interacted with us several times recently. Let us help you now.” 

Still, though, AI will not eliminate the need for the marketer to do the foundational work for knowing its customers. In other words, marketers will mostly always be the ones creating content, delivering content, and managing content.

AI can be a great friend. It won’t matter, though, if brands aren’t committed to exceptional experiences ― in marketing and beyond.

Barb Mosher Zinck

CEO, Content/Product Marketer, Marketing Technology Analyst
BMZ Content Strategies
Digital Tech Diary

The discussions around AI and marketing are heating up, and there’s good reason for this. Marketing is under great pressure to deliver more personalized targeted experiences ― and to do that at scale is just not possible using manual techniques. AI is the answer to personalization at scale, but it needs to be done well, and how it’s working needs to be transparent to the marketer.

The biggest concern is a technology vendor will wrap artificial intelligence in a black box and expect marketing to just “know it works.” That’s a bad idea. Marketing needs to see the data that drives AI decisions; they need to understand how AI is implemented. Maybe they don’t understand the details of the algorithms, but they have to understand the strategy and approach. And, depending on the purpose of the AI, marketing may need to influence how it works by configuring some key aspects.

I don’t think we’ll ever be in the position of looking to technology to do everything for marketing automatically, but there are certainly areas where AI can take the load off the marketing team. Marketing can then focus on aspects that technology can’t do or that we don’t want it to do on its own, like creative, like strategizing messaging, designing campaigns, and key customer interactions.

Ann Handley

Head of Content

The editor in me wants to edit your question: it’s not future tense (“Will AI be…?”); it’s present tense (“Is AI…?”).

Artificial intelligence is already here. And it’s already our friend.

It might seem that AI will render some marketing jobs obsolete. We could argue that machines that do the gathering and interpreting and optimizing better than people will forever alter the jobs of those marketers who are focused on interpreting data, doing testing, and optimizing campaigns.

But we could also argue that those same marketing technologists and data scientists are also perfectly situated to in turn manage and leverage those AI systems and technologies.

I come out of the creative side of marketing (as most marketers do). So, should we artsy-fartsy creative types be freaking out at the rise of AI?

Nope. Because writers, designers, videographers, and storytellers are more valuable than ever. A world steeped in AI means we have the data and processes in place to get to know our customers better. We can use the brilliant gold that AI beautifully spins for us to craft more effective marketing — to create brighter, more creative, and more winning programs. We can also zig while the machines zag. Because sometimes breakthrough marketing is doing exactly what your customers don’t expect.

At least, that’s how I see things.

To quote Noam Chomsky: “Thinking is a human feature. Will AI someday really think? That’s like asking if submarines swim. If you call it swimming then robots will think, yes.”

Kerry Cunningham

Senior Research Director
Sirius Decisions

AI will be a great friend to organizations that market. AI will be a great friend to marketing leaders and the marketers who learn to incorporate AI into their skillset. Without a doubt, artificial intelligence is going to help the organizations and leaders that implement it well become more effective and more competitive. From automating some prospect communications altogether, to automating various aspects of prospect nurture, prioritizing and sourcing prospects, and even improving pipeline and revenue forecasting, there are a variety of ways AI will improve marketing effectiveness. 

As with all processes that automate what would otherwise be human actions and decisions, there may be displacement. At the same time, many of the actions and decisions AI can improve or automate don't even occur today. For example, marketing organizations already run automated, programmatic nurture programs, but AI can enhance these to react in real time to prospect behavior and market conditions. Such an enhancement should improve nurture performance (possibly leading to growth and new opportunity) without displacing existing people and processes.

AI can also be inserted into very human-centric processes, such as teleprospecting, to optimize the intensity, timing, and even content of teleprospector calls, helping revive a challenged B2B function.

Back to you:

What are your thoughts on AI in modern marketing? Is your firm taking advantage of any of these technology advancements? Tell us your thoughts, experiences, and predictions in the comments!


Act-On Blog

Measure and Magnify Your Impact with Usage Metrics for Dashboard and Report Authors

Power BI makes it easy to deliver valuable insights across your organization. We've heard many stories from customers who've made a tremendous impact by publishing content that became critical to everyday decision-making. But it has sometimes been hard to precisely quantify that impact and understand what data is being used, by whom, and for what purpose.

Today, I am excited to announce the availability of usage metrics in the Power BI service for dashboard and report authors. This feature gives you one-click access to key usage metrics for your dashboards and reports, allowing you to pinpoint how your end users are interacting with your content. We hope these metrics can help you justify your investments and prioritize your efforts on the most impactful, widely used content.

Getting started

It's easy to start taking advantage of usage metrics. Simply navigate to any dashboard or report for which you have edit permissions. From there, select the new "Usage metrics" option at the top right.


Note: a Power BI Pro license is required to access the usage metrics reports.

You'll be presented with a pre-built report showing the usage metrics for that dashboard or report over the last 90 days. You'll see, for example, the number of views and viewers of your content, and how those numbers stack up against other dashboards and reports in your organization. You can slice by how your end users received access, whether they accessed via the web or the mobile app, and more. As your dashboards and reports evolve, so too will the usage metrics report, which updates every day with new data.


Usage metrics will be a powerful ally as you work to deploy and maintain Power BI dashboards and reports. Wondering which pages of your report are most useful, and which ones you should phase out? Slice by report page to find out. Wondering whether you should build a mobile layout for your dashboard? The usage metrics report will show you how many users access your content via the mobile apps vs. via web browser.

Tip: usage metrics are presented as a pre-built report, so you can pin any of the report visuals to a dashboard to monitor them more easily or share usage with others.

To get even more out of usage metrics, you can use the "save as" feature to create a copy of the pre-built usage metrics report.


Once you’ve created a copy, you’ll get full access to the underlying dataset, allowing you to fully customize the usage metrics report to your specific needs. You can even use Power BI Desktop to build custom usage metrics reports using the live connection to Power BI service feature.

Better yet, the underlying dataset includes the usage details for all dashboards or reports in the workspace. This opens up yet another world of possibilities. You could, for example, create a report which compares all dashboards in your workspace based on usage. Or, you could create a usage metrics dashboard for your Power BI app by aggregating usage across all the content distributed within that app.
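For instance, once you've saved a copy, measures along these lines become possible. (A sketch only – the table and column names below are placeholders for illustration, not the documented schema of the usage metrics dataset.)

```dax
-- Sketch: 'Views' and its columns are hypothetical placeholder names,
-- not the actual usage metrics schema
Total Views := SUM ( 'Views'[ViewCount] )

Distinct Viewers := DISTINCTCOUNT ( 'Views'[UserKey] )

-- Share of views coming from the mobile apps
Mobile Share :=
DIVIDE (
    CALCULATE ( [Total Views], 'Views'[Client] = "Mobile" ),
    [Total Views]
)
```

Because these end up as ordinary report visuals, they can be pinned to a dashboard just like the pre-built ones.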

Next steps

Content creator usage metrics are now broadly available, so go forth and explore! We’re excited to see how you’ll magnify your impact with this feature.

  • For further information, be sure to check out the usage metrics documentation
  • Have comments? Want to share a cool use case for the feature? We’d love to hear it! Please leave comments below, or in the community forums
  • Have ideas for where we should take the feature? Create new ideas or vote on existing ones in UserVoice


Microsoft Power BI Blog | Microsoft Power BI

Individual Excellence vs. Organizational Impact: Know the Difference!


Guess Which One Grows Your Career (or Company) More? (Hint: It’s the One on the Right)
(But Individual Excellence is a Prerequisite for Org-Wide Impact)


Last week's post on learning DAX before learning M mostly met with positive reviews, but also drew some fire.  A few staunch M supporters showed up and voiced their disagreement – including the one and only Ken Puls.  Now I know from experience not to mess with Ken…  OK OK, I confess…  messing with Ken is a barrel of monkeys, actually.  Put it on your bucket list.  That said, I have immense respect for his skills and perspective, and only enjoy messing with him out of friendship.  He's an amazing human being and learns more things in a year than I could ever squeeze into my leaky head over a lifetime.

But I still firmly believe in what I said.  I’m not here to offer apologies – only clarification and justification.  There very much IS an olive branch in all of this, but again, that comes merely from clarification.

First clarification:  I love love LOVE Power Query and M!  They are a godsend!  I never said what some people thought I was saying, which was “meh, you can ignore/neglect that part of the platform.”  Nope, you absolutely benefit tremendously from both.

The minor tension from last week raised a MASSIVELY important point, one that transcends any technical debate and puts things in their proper perspective.  So I’m grateful for the opportunity that the misunderstanding provides us.  Let’s begin with…

What Is the Dollar Value of Personal Efficiency?

At first this sounds like an incredibly abstract question.  I mean, how can you put a dollar figure on massive gains in personal efficiency?  Sounds impossible, right?

But I’ve got a card up my sleeve:  how much is your salary?  It’s not terribly outlandish to say that the best you could EVER do, in terms of speeding up the tasks in your workday, is to completely make yourself redundant.

And the market has already put a dollar value on THAT, right?  It’s called your salary.  No, it’s not a perfect number, not at all.  Many of you reading this are criminally underpaid in some very real sense, for instance.  But this is what the market is saying today, and that’s a very real quantity.  Furthermore it’s not like it’s really possible to automate 100% of your duties using ANY of the tools available today (thankfully!), so it’s somewhat “gracious” to set the max at 100% of salary.

The folks applying for our Principal Consultant role lately have ranged in current salary from a definitely-criminal $40k on the low end to a damn-near-executive $160k on the high end.

So let's continue with the "gracious" theme and go with the high end:  $160k per year is a serviceable maximum ROI for "making my current job run faster."

You might be thinking, at this point, that I'm still not being Gracious Enough.  It's possible, after all, for a single hyper-efficient individual to suddenly replace MULTIPLE other individuals, right?  Setting aside the distasteful notion of those lost jobs for a moment, I still think my $160k figure isn't that bad, given that it's 100% of an entire individual on the high end of the range.  But fine, if you want to multiply it by 3 and make it $480k, I don't think that necessarily undermines any of the points I want to make.  I'm in the business of adding more zeroes to the productivity multiplier, not linear multiplications.

At first, nothing.  Both DAX* and M, in the early going, are very much "speed up my current workflow" kinds of things.  And that's perfectly natural – what you're currently doing is ALWAYS the best place to start, the best place to learn.

(* Remember, when I say "DAX," I use that as shorthand for "DAX and Modeling," where "Modeling" is best described as "figuring out how many tables you should have, what they should look like, and how they are related.")

And that “improve what I’m currently doing” lens is why M/Power Query steals the show in the earliest demos – it’s easier to see how it’s going to change your life, because it automates/accelerates a larger percentage of what Excel Pros have traditionally done.
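To make that concrete, a typical recorded query looks something like this (a hypothetical sketch – the file path and column names are invented for illustration):

```m
// Hypothetical Power Query (M) script: load a CSV, promote headers,
// type a column, and filter – the manual Excel cleanup that M automates
let
    Source = Csv.Document(File.Contents("C:\Data\Sales.csv"), [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Source),
    Typed = Table.TransformColumnTypes(Promoted, {{"Amount", type number}}),
    Positive = Table.SelectRows(Typed, each [Amount] > 0)
in
    Positive
```

Every step is just a line in the script, which is exactly why it maps so cleanly onto what Excel Pros already do by hand.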

Hold this thought for a moment while I introduce something else…

This is a real thing, it belongs to one of our clients, and we helped them build it.  To call it a "workbook" is a bit of an insult, of course, because it's a modern marvel – an industrial-strength DAX model with a suite of complementary scorecards as a frontend.  But all built in Excel.  (Power Pivot, specifically.)

And this model has provably returned about $25M a year to the bottom line for this client.  As in, profit.  Not revenue.  Pure sweet earnings.  This workbook is visible in their quarterly earnings reports to Wall Street.

And this wonder of the modern world is well into its fourth year of service now, bringing its lifetime "winnings" into the $100 million range.  No lie.  This happened, and continues to happen.  This "workbook" is a tireless machine that makes it rain money.

Let's do some math:  $25M per year vs. $160k per year is… more than 150x.

In other words, the ROI of this project went FAR beyond any amount of “accelerating what we already did.”  It was, instead, a MASSIVE dose of “doing something we’ve NEVER done before.”

This may sound like a cheap verbal trick, but I sincerely think it is a weighty truth that everyone should internalize.  The workbook above, which now in some sense runs the show at this large client, had no predecessor whatsoever (its "forerunners" were a scattered collection of hundreds of distinct reports, each of which was just burying readers in borderline-raw data).  For an even bigger example, consider that Instagram started as a hybrid of Foursquare and Mafia Wars before deciding to go "all-in" on their most popular feature, photo sharing.  The blank canvas has no ceiling, if you permit me to mash up some metaphors, and both of these success stories are rooted in a combination of analytics and courage.

What we've been doing, in both the traditional Excel and traditional BI worlds, is nothing to brag about.  Most of our reporting and analysis output has historically been designed by the path of least resistance – as opposed to defined by careful and creative thinking about what truly matters.  The president of one of our client-partners told us last week, "people tend to measure what they can easily count, as opposed to what they SHOULD measure," and I just about leapt out of my chair screaming "YES!  PRECISELY!"

If you want to learn more about this topic, start with Ten Things Data Can Do For You and We Have a “Crush” on Verblike Reports.  For now, it’s time to move on to Leverage.

Leverage

You wanna know why Data is so "Hot" these days?  It's because of Leverage.  Data is hot precisely because proper application of data can impact the behavior and productivity of MANY people simultaneously.  You can't typically save or make millions of incremental dollars as an individual, but it's "easy" to do if you can magnify benefits across dozens of other people – or hundreds, or even thousands (as is the case with the $100M workbook).

In fact, it's worth considering that the $100M Workbook actually offers only modest benefit!  On a per-person basis, on a single day, you wouldn't even notice the difference.  But multiply that modest, say, 3% benefit across tens of thousands of people and 365 days…  and you get $25M per year.  When you have the power of Leverage, you don't even have to find something "big," like the Instagram "pivot" from one mission to another, to get something BIG.
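The back-of-the-envelope arithmetic can be sketched as a measure (every constant here is an assumed, illustrative number – the client's actual headcount and per-person value aren't public):

```dax
-- All inputs are illustrative assumptions, not the client's real numbers
Annual Leverage Value :=
VAR People = 20000                -- "tens of thousands" of affected employees
VAR DailyValuePerPerson = 115     -- assumed $ of output each person-day touches
VAR BenefitRate = 0.03            -- the modest 3% per-person improvement
RETURN
    People * DailyValuePerPerson * BenefitRate * 365  -- ≈ $25.2M per year
```

Small per-person numbers, multiplied across headcount and days, land squarely in the $25M-per-year range.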

We are all, everyone reading this, INCREDIBLY FORTUNATE to be working in data, because of its somewhat-unique capacity for leverage.  So many jobs, whether white- or blue-collar, are essentially cogs in the machine, and the top-end benefits they provide are limited to the "just you" size.  But WE have hit the jackpot.  WE have a job that is "unfairly" capable of leverage.  It just fell into our laps.  But then, the mind-numbing dosages of VLOOKUP (in the traditional Excel world) and endless documentation and miscommunication of requirements (in the traditional BI world) deflected us off into a relatively un-ambitious mindset.

Data has ALWAYS had the advantage of Leverage, but the traditional methodologies and tools brought tremendous friction and inertia to the table.  They wore us down – in terms of time, money, and psychic energy.  They “chokepointed” our potential.  They enforced a terribly-linear culture of thinking.  In short, the traditional tools took the potential 100x or even 1,000x leverage possibilities of Data and tamped them down to about 10x – still good!  But so much less than what we COULD do.

Well guess what?  No more chokepoint.  Whatever you want to call it – Power BI, Power Pivot, Modern Excel – the next-generation toolset from Microsoft gives us those extra zeroes of potential.

The next section is more conciliatory, but there IS something very important to bring home here first.

If you MADE me choose one or the other, I'd definitely choose DAX, because I think it offers us the virtually-unlimited twin powers of WWNDB ("What We've Never Done Before") and Leverage.  In fact, I don't think that, I know that – I (and my companies) have been blowing people's doors off with this new toolset since 2010.  We didn't even get Power Query until what, 2014?  Fully half the lifetime of this revolution pre-dates M.  Even the $100M Workbook predates M!  Heck, until Power Update came along, you couldn't even schedule refreshes of models that relied on M, which almost by definition "funneled" M usage down the "just for me" path – and to this day, Microsoft still hasn't released a server that natively runs M.

I just don’t think it’s nearly as easy to explore/exploit WWNDB or Leverage via the M path.  Not impossible, because there are plenty of exceptions that prove the rule.  And to be clear, I think most of the exceptions will be in the WWNDB category, not the Leverage category.

And that was kinda my whole point in last week’s article – Power Query dramatically captures the attention of new converts to Modern Excel precisely because of how well it fits and improves What We’ve Already Been Doing, as Individuals.  This is a Good Thing!  No caveats needed.  I just don’t want anyone to become so distracted with it that we miss the Big Wins of WWNDB and Leverage.


This is What We Can Do With “Just” DAX and Modeling

The picture above illustrates how a single individual (you, or a member of your team) can achieve wins MUCH bigger than just them.  And it’s my experience-powered belief that you cannot get a Win of that size without leveraging DAX and Modeling.

But what if you THEN take that single individual’s newfound powers of WWNDB and Leverage, provided by DAX and Modeling, and now make THAT person more efficient?  “Holy Additional Multiplier, Batman!”


If Our DAX Modeler Superhero “Levels Up” with the Efficiency Gains of Power Query / M…  Look Out!

Yeah, if you take THAT person, and make THEM more efficient, WOW, you can do EVEN MORE of the amazing, transformational, WWNDB-and-Leverage style work.

Which we can all agree…  is a Very Good Thing.  One of my favorite personal sayings is “the length of a rectangle is not more ‘responsible’ for the area of the rectangle than the width.”  Double either one, and you double the area.  But that’s essentially my point in a nutshell – you can 100x with DAX and modeling, AND you can double with M.  If you had to choose one, choose 100x.  But we don’t have to choose.  Adding Power Query and M to your org-wide-impact powers, even if it’s “just” 2x or 3x, delivers JUST AS MUCH, or more, incremental Big Win as the original 100x.

We can have our flagons full of mead and drink them too, as Lothar once said.
