Tag Archives: Change

Q&A: Does Your Company Understand Change Management?

Change Management is an often foreign concept, and yet everybody thinks they understand it. Successful Change Management practices can save companies tons of time, money, and resources, and perhaps most importantly, they keep talent from seeking new employment. Realizing how powerful this practice can be, we talked with one of PowerObjects’ training and Change Management consultants, Sara Jo Justice, to provide you with some must-know Change Management information. If you’re seeking these services from PowerObjects, check out more here.

First things first, in three sentences or less, what is Change Management?

Change Management is a process designed to increase the probability of success, acknowledge and address employees’ resistance to change, build communication practices that support the organization’s goals, and support and equip transitions at all levels of a company.

How does implementing Change Management best practices help an organization?

Change Management effectiveness is driven by employees adopting and using the system to drive the organization’s KPIs and goals. Without effective Change Management, organizations will not have a way to plan for and measure the results that they are looking to achieve.

ROI on Change Management is difficult to definitively label, but it’s consistently shown to be effective. What is the number one metric (quantitative or qualitative) that an organization should look to for proof of positive ROI?

True ROI in Change Management is understanding the Ultimate Utilization of the project implementation. Part of the process is having an organization “define” what success looks like to their organization at the beginning of a project so that they can truly measure through adoption, feedback, project performance, and readiness assessments if they have met their project objectives.

How does a Change Management consultant differ from a business analyst? Don’t business analysts already look at the organizational impacts of a project?

While both roles are critical to the success of a project, the Change Management consultant is really focused on the significance of the change from current state to future state. Based upon this change, communication, training, and resistance strategies are developed.

What’s the most common Change Management mistake you see businesses make?

Starting the Change Management and communication discussions too late in the game. By the time they realize there is an adoption problem, a lot of the “damage” may have already been done. People like to hear what is going on early and often.

What’s the most common misconception of Change Management?

“We don’t need Change Management, because everyone is excited for this change.” Even when an organization is looking at an improvement to its current status quo, that doesn’t mean there won’t be bumps in the road. Every individual has a different attitude toward change because resistance doesn’t occur in a vacuum. Employees’ personal lives, their career plans, the degree to which the change affects them, and their history with change … their perspectives can vary greatly!

Last, but not least: why do you love being a Change Management consultant for PowerObjects?

There is a great amount of personal satisfaction that comes from not simply tossing someone a manual and telling them what they need to do, but rather helping an individual see how their day-to-day activities impact their organization’s ability to see success. Helping everyone see that they are an important part of the whole is better than just a “lightbulb” moment; it is more like a “giant spotlight” moment.

Have questions about training or Change Management services through PowerObjects? Our expert trainers, including Sara Jo, are on standby. Contact us!

Happy Dynamics 365’ing!

PowerObjects- Bringing Focus to Dynamics CRM

New from Microsoft Labs: Change Tracking Solution for Dynamics 365 released!

If you’ve ever needed to determine which system administrator made a particular problematic change to a solution, until now the process was time-consuming: restoring a backup from the date the problem occurred and querying the database. In Microsoft Dynamics 365, changes made to records by users or the system can be captured out of the box by enabling “Audit” for an entity, but there is no out-of-the-box feature to track changes made by System Administrators, System Customizers, or anyone else with access to customize the application.

No more! Microsoft Labs has released a “Change Tracking” solution that makes finding this information easier; check it out! The solution for Dynamics 365 (version 8.2) Online or Dynamics CRM 2016 (version 8.1) Online, with documentation plus administrator and user guides, is available for download via:


A downloadable solution for on-premises system administrators of Dynamics 365 (version 8.2) or Dynamics CRM 2016 (version 8.1), with documentation, is available via:


Notes from AppSource:

Change Tracking Solution
Microsoft Labs

A feature that provides the ability to track changes on Dynamics 365 made by System Administrators/Customizers.
The Change Tracking solution helps track down who updated an entity, JavaScript web resource, assembly, or process, along with the time of the update. This solution is built on Dynamics 365 and also works on Dynamics CRM 2016 (Online/On-premises).


Greg Nichols
Senior Premier Field Engineer, Dynamics 365
Microsoft Corporation

Dynamics CRM in the Field

Change your Power BI report to point to an external SSAS model

A common question I get is how to change the connection string from my report to SSAS after I move my Power BI Desktop file’s model into SSAS. It turns out this is actually pretty simple, as there is an API that allows you to copy a report and then bind it to ANY dataset. Let’s walk through this.

I start with a very simple Power BI desktop file that contains a single table that I imported from SQL:

I then uploaded it to Power BI and added it to its own app workspace.

Then I created a report that points to my local SSAS and uploaded it to the same workspace; I did this to make sure I have a dataset in Power BI that points to SSAS. If you already have such a dataset present, you can skip this step. Of course, you have to set up a data gateway if you’re using a local SSAS.

So now I have two datasets in my workspace (I use the same workspace, but you can also have them live in different ones):

SalesIt is the dataset that points to my SSAS instance, and SalesPBI is the embedded data model. I also have the two reports:

Now here comes the magic. I am using a PowerShell script created by my colleague Sirui that allows you to copy a report and bind the new copy to ANY dataset in Power BI. You can find the script here. The key thing is to make sure the two schemas are the same: they need to have the same columns, measures, tables, etc., otherwise the report will throw errors. In my example I didn’t actually use an imported model but a SQL Server 2016 RTM model with the same schema, and that also works.

Ok, now for the PowerShell script. It does two things:

1. It creates a copy of the report.
2. It binds the new report to a dataset provided in the script.

The script uses the Power BI Clone API to clone the report and rebind it.

First we need to configure the script. I created a new “app” on https://dev.powerbi.com/apps as described in the script to get a new client ID, and I set a name for my new report: “SalesNowIt”. Next I got all the IDs needed for the script to run, such as the report and group IDs. The script has step-by-step instructions.
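If you are curious what the script is doing under the hood, the copy-and-rebind boils down to a single call to the Power BI “Clone Report” REST API. Here is a rough Python sketch using only the standard library; the function names are made up for illustration, and you would still need a valid Azure AD access token:

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_clone_request(group_id, report_id, new_name, target_model_id):
    """Build the URL and JSON body for the Power BI Clone Report call."""
    url = f"{API}/groups/{group_id}/reports/{report_id}/Clone"
    # targetModelId is the dataset the cloned report should be bound to
    # (the SSAS-backed dataset in this walkthrough).
    body = {"name": new_name, "targetModelId": target_model_id}
    return url, body

def clone_report(token, group_id, report_id, new_name, target_model_id):
    """Clone a report and bind the copy to another dataset in one call."""
    url, body = build_clone_request(group_id, report_id, new_name,
                                    target_model_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # metadata of the newly created report
```

As in the PowerShell script, everything hinges on the two datasets sharing the same schema; the API happily rebinds the report either way, but the visuals will error at render time if columns or measures are missing.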

Now, after configuring, I just run the PowerShell script (no additional changes necessary) and see a new report show up:

And now when I run the report that previously pointed to my embedded model it still works:

But when running profiler I see queries going to my local SSAS instead of the embedded model:

So that’s it, pretty simple and straightforward. Of course, you can extend this PowerShell script yourself to do whatever you want, for example, loop through all the reports in a workspace and rebind them all.

Kasper On BI

Time for a Change – Groundhog Day Edition

Back in 2013 I announced I’d be joining BlueGranite’s team. Well, it’s like Groundhog Day because I’m joining BlueGranite again. Let me explain…

For 3 years I worked as a solution architect for BlueGranite, a data-oriented consulting firm focused on BI & analytics. In the fall of 2016 I made a change to an in-house BI role at SentryOne. And although this past year has been great in many ways, I missed some things about my prior role, company, and coworkers. So, I’m headed back to BlueGranite. I’m looking forward to working on interesting customer projects with the wicked-smart people at BlueGranite. Consulting is a good fit for me because it pushes me to stay current on technology & industry changes, and I really need to be learning something new all the time to be happy work-wise.

SentryOne is an awesome place – these people care deeply about doing good work. I’m happy I spent a year there. Even though it didn’t end up being a perfect fit, it helped me identify what I value most career-wise. And, I still get to accompany the SentryOne team at PASS Summit (how cool is that?!?) to deliver a session at their bootcamp on Tuesday, Oct. 31st. During the bootcamp I’ll discuss my telemetry project which involved numerous Azure services.

Aspects of the data lake portion of that implementation will be discussed at my pre-conference workshop at SQL Saturday Charlotte coming up on Oct. 13th. (Tickets are still available. Shameless plug, I know, I know.) If you’re near Charlotte and haven’t registered for the SQL Saturday training event on Oct. 14th, you can find more info here: http://www.sqlsaturday.com/683/eventhome.aspx.

My husband says this is a little like a woman who remarries her ex-husband. (Yeah, he’s a little out there sometimes, heh heh.) I’m not sure that’s quite the right analogy, but I certainly am excited to rejoin the BlueGranite team.

Blog – SQL Chick

Don’t Miss our Latest eBook: Strategies for Change Data Capture

Change Data Capture, or CDC, is the process that ensures that changes made in one dataset are automatically transferred to another dataset. CDC is most often used with databases that hold important transactional data, to make sure organizations are working with up-to-date information. It plays an important role in various areas of data management, such as enterprise data warehousing, business intelligence, and data quality.
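As a toy illustration (not taken from the eBook), one common CDC method uses a last-modified timestamp column as a high-water mark. A minimal Python sketch, assuming each source row carries a `modified_at` value:

```python
def capture_changes(source_rows, last_sync):
    """Timestamp-based CDC: return rows modified since the last sync,
    plus the new high-water mark to persist for the next run."""
    changed = [row for row in source_rows if row["modified_at"] > last_sync]
    new_mark = max((row["modified_at"] for row in changed), default=last_sync)
    return changed, new_mark
```

This approach is simple but has known blind spots: deletes and intermediate updates between syncs are invisible to a timestamp scan, which is one reason log-based CDC (reading the database’s transaction log) is often preferred.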

In Syncsort’s latest eBook, Strategies for Change Data Capture, we review the advantages and disadvantages of different CDC methods and show you how to keep mainframe-sourced data current in Hadoop.

Download the eBook to see which change data capture strategies work best for your enterprise.

Syncsort + Trillium Software Blog

How to Change Icons of Custom Entities in Dynamics 365

Once you’ve created your custom entities in Microsoft Dynamics 365, you’ll probably want to give them custom icons to make them more easily recognisable, and to give the entities a more finished feel.

Thankfully, this is very easy to do in Microsoft Dynamics 365. Simply acquire the image that you want to use as the icon for the entity, then create two copies of the icon, of sizes:

  • 32 x 32 px
  • 16 x 16 px

The reason for this is that Microsoft Dynamics 365 uses the two differently sized icons in different places. For example, the 32 x 32 px image is used in the site map, while the 16 x 16 px image is used in lookup fields.

In order to use an image as an icon, it must first be uploaded as a web resource in .png, .gif, or .jpg format. The image must also be smaller than 10 kB.
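If you want to sanity-check your files against these constraints before uploading them, a small Python sketch (the function names and paths here are just for illustration):

```python
import os

ALLOWED_FORMATS = {".png", ".gif", ".jpg"}
MAX_BYTES = 10 * 1024  # web resource images must be smaller than 10 kB

def validate_icon(filename, size_bytes):
    """Return a list of problems with an icon; empty if it looks fine."""
    problems = []
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_FORMATS:
        problems.append(f"unsupported format: {ext or '(none)'}")
    if size_bytes >= MAX_BYTES:
        problems.append(f"too large: {size_bytes} bytes (limit {MAX_BYTES})")
    return problems

def check_icon(path):
    """Validate an icon file on disk before uploading it as a web resource."""
    return validate_icon(path, os.path.getsize(path))
```

For example, `check_icon("icons/account32.png")` would return an empty list for a valid 2 kB PNG, and a list describing what needs fixing otherwise.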

When you’ve uploaded your desired image, navigate to your custom entity in the solution explorer in Dynamics 365, and select the Update Icons button at the top of the window. This will open a wizard which will allow you to select your image.

Some Considerations

When selecting an image, you may want to consider using a vector image, as these scale very nicely and remain sharp at any resolution. Also, in the interests of keeping things consistent, consider either styling the icons to match the out-of-the-box Dynamics 365 icons (white on a transparent background, material design), or making all of the icons for your custom entities consistent in some way.

Magnetism Solutions Dynamics CRM Blog

How to Run Workflow on Business Process Flow Stage Change in Dynamics 365

In Microsoft Dynamics 365 when a stage changes in a Business Process, you may want to run a workflow to automate the fetching of information, an invoicing process, or any other number of things. This is incredibly easy to do with Microsoft Dynamics 365.

In the new business process flow designer, simply select Add > Add Workflow, then select the stage within which you want to run the workflow.

This will add the workflow item to the stage, from which you can select the trigger that you would like the workflow to run on (Stage Entry or Exit), and then the workflow itself that you want to run. Note that there is a caveat for the selected workflow: it must be an active, on-demand workflow for the same entity as the selected stage. The selected workflow will now run on the selected trigger.

Magnetism Solutions Dynamics CRM Blog


The Best Healthcare System In The World Is About To Change

Dan McCaffrey has an ambitious goal: solving the world’s looming food shortage.

As vice president of data and analytics at The Climate Corporation (Climate), which is a subsidiary of Monsanto, McCaffrey leads a team of data scientists and engineers who are building an information platform that collects massive amounts of agricultural data and applies machine-learning techniques to discover new patterns. These analyses are then used to help farmers optimize their planting.

“By 2050, the world is going to have too many people at the current rate of growth. And with shrinking amounts of farmland, we must find more efficient ways to feed them. So science is needed to help solve these things,” McCaffrey explains. “That’s what excites me.”

“The deeper we can go into providing recommendations on farming practices, the more value we can offer the farmer,” McCaffrey adds.

But to deliver that insight, Climate needs data—and lots of it. That means using remote sensing and other techniques to map every field in the United States and then combining that information with climate data, soil observations, and weather data. Climate’s analysts can then produce a massive data store that they can query for insights.

Meanwhile, precision tractors stream data into Climate’s digital agriculture platform, which farmers can then access from iPads through easy data flow and visualizations. They gain insights that help them optimize their seeding rates, soil health, and fertility applications. The overall goal is to increase crop yields, which in turn boosts a farmer’s margins.

Climate is at the forefront of a push toward deriving valuable business insight from Big Data that isn’t just big, but vast. Companies of all types—from agriculture through transportation and financial services to retail—are tapping into massive repositories of data known as data lakes. They hope to discover correlations that they can exploit to expand product offerings, enhance efficiency, drive profitability, and discover new business models they never knew existed.

The internet democratized access to data and information for billions of people around the world. Ironically, however, access to data within businesses has traditionally been limited to a chosen few—until now. Today’s advances in memory, storage, and data tools make it possible for companies both large and small to cost effectively gather and retain a huge amount of data, both structured (such as data in fields in a spreadsheet or database) and unstructured (such as e-mails or social media posts). They can then allow anyone in the business to access this massive data lake and rapidly gather insights.

It’s not that companies couldn’t do this before; they just couldn’t do it cost effectively and without a lengthy development effort by the IT department. With today’s massive data stores, line-of-business executives can generate queries themselves and quickly churn out results—and they are increasingly doing so in real time. Data lakes have democratized both the access to data and its role in business strategy.

Indeed, data lakes move data from being a tactical tool for implementing a business strategy to being a foundation for developing that strategy through a scientific-style model of experimental thinking, queries, and correlations. In the past, companies’ curiosity was limited by the expense of storing data for the long term. Now companies can keep data for as long as it’s needed. And that means companies can continue to ask important questions as they arise, enabling them to future-proof their strategies.

Prescriptive Farming

Climate’s McCaffrey has many questions to answer on behalf of farmers. Climate provides several types of analytics to farmers including descriptive services, which are metrics about the farm and its operations, and predictive services related to weather and soil fertility. But eventually the company hopes to provide prescriptive services, helping farmers address all the many decisions they make each year to achieve the best outcome at the end of the season. Data lakes will provide the answers that enable Climate to follow through on its strategy.

Behind the scenes at Climate is a deep-science data lake that provides insights, such as predicting the fertility of a plot of land by combining many data sets to create accurate models. These models allow Climate to give farmers customized recommendations based on how their farm is performing.

“Machine learning really starts to work when you have the breadth of data sets from tillage to soil to weather, planting, harvest, and pesticide spray,” McCaffrey says. “The more data sets we can bring in, the better machine learning works.”

The deep-science infrastructure already has terabytes of data but is poised for significant growth as it handles a flood of measurements from field-based sensors.

“That’s really scaling up now, and that’s what’s also giving us an advantage in our ability to really personalize our advice to farmers at a deeper level because of the information we’re getting from sensor data,” McCaffrey says. “As we roll that out, our scale is going to increase by several magnitudes.”

Also on the horizon is more real-time data analytics. Currently, Climate receives real-time data from its application that streams data from the tractor’s cab, but most of its analytics applications are run nightly or even seasonally.

In August 2016, Climate expanded its platform to third-party developers so other innovators can also contribute data, such as drone-captured data or imagery, to the deep-science lake.

“That helps us in a lot of ways, in that we can get more data to help the grower,” McCaffrey says. “It’s the machine learning that allows us to find the insights in all of the data. Machine learning allows us to take mathematical shortcuts as long as you’ve got enough data and enough breadth of data.”

Predictive Maintenance

Growth is essential for U.S. railroads, which reinvest a significant portion of their revenues in maintenance and improvements to their track systems, locomotives, rail cars, terminals, and technology. With an eye on growing its business while also keeping its costs down, CSX, a transportation company based in Jacksonville, Florida, is adopting a strategy to make its freight trains more reliable.

In the past, CSX maintained its fleet of locomotives through regularly scheduled maintenance activities, which prevent failures in most locomotives as they transport freight from shipper to receiver. To achieve even higher reliability, CSX is tapping into a data lake to power predictive analytics applications that will improve maintenance activities and prevent more failures from occurring.

Beyond improving customer satisfaction and raising revenue, CSX’s new strategy also has major cost implications. Trains are expensive assets, and it’s critical for railroads to drive up utilization, limit unplanned downtime, and prevent catastrophic failures to keep the costs of those assets down.

That’s why CSX is putting all the data related to the performance and maintenance of its locomotives into a massive data store.

“We are then applying predictive analytics—or, more specifically, machine-learning algorithms—on top of that information that we are collecting to look for failure signatures that can be used to predict failures and prescribe maintenance activities,” says Michael Hendrix, technical director for analytics at CSX. “We’re really looking to better manage our fleet and the maintenance activities that go into that so we can run a more efficient network and utilize our assets more effectively.”

“In the past we would have to buy a special storage device to store large quantities of data, and we’d have to determine cost benefits to see if it was worth it,” says Donna Crutchfield, assistant vice president of information architecture and strategy at CSX. “So we were either letting the data die naturally, or we were only storing the data that was determined to be the most important at the time. But today, with the new technologies like data lakes, we’re able to store and utilize more of this data.”

CSX can now combine many different data types, such as sensor data from across the rail network and other systems that measure movement of its cars, and it can look for correlations across information that wasn’t previously analyzed together.

One of the larger data sets that CSX is capturing comprises the findings of its “wheel health detectors” across the network. These devices capture different signals about the bearings in the wheels, as well as the health of the wheels in terms of impact, sound, and heat.

“That volume of data is pretty significant, and what we would typically do is just look for signals that told us whether the wheel was bad and if we needed to set the car aside for repair. We would only keep the raw data for 10 days because of the volume and then purge everything but the alerts,” Hendrix says.

With its data lake, CSX can keep the wheel data for as long as it likes. “Now we’re starting to capture that data on a daily basis so we can start applying more machine-learning algorithms and predictive models across a larger history,” Hendrix says. “By having the full data set, we can better look for trends and patterns that will tell us if something is going to fail.”

Another key ingredient in CSX’s data set is locomotive oil. By analyzing oil samples, CSX is developing better predictions of locomotive failure. “We’ve been able to determine when a locomotive would fail and predict it far enough in advance so we could send it down for maintenance and prevent it from failing while in use,” Crutchfield says.

“Between the locomotives, the tracks, and the freight cars, we will be looking at various ways to predict those failures and prevent them so we can improve our asset allocation. Then we won’t need as many assets,” she explains. “It’s like an airport. If a plane has a failure and it’s due to connect at another airport, all the passengers have to be reassigned. A failure affects the system like dominoes. It’s a similar case with a railroad. Any failure along the road affects our operations. Fewer failures mean more asset utilization. The more optimized the network is, the better we can service the customer.”

Detecting Fraud Through Correlations

Traditionally, business strategy has been a very conscious practice, presumed to emanate mainly from the minds of experienced executives, daring entrepreneurs, or high-priced consultants. But data lakes take strategy out of that rarefied realm and put it in the environment where just about everything in business seems to be going these days: math—specifically, the correlations that emerge from applying a mathematical algorithm to huge masses of data.

The Financial Industry Regulatory Authority (FINRA), a nonprofit group that regulates broker behavior in the United States, used to rely on the experience of its employees to come up with strategies for combating fraud and insider trading. It still does that, but now FINRA has added a data lake to find patterns that a human might never see.

Overall, FINRA processes over five petabytes of transaction data from multiple sources every day. By switching from traditional database and storage technology to a data lake, FINRA was able to set up a self-service process that allows analysts to query data themselves without involving the IT department; search times dropped from several hours to 90 seconds.

While traditional databases were good at defining relationships with data, such as tracking all the transactions from a particular customer, the new data lake configurations help users identify relationships that they didn’t know existed.

Leveraging its data lake, FINRA creates an environment for curiosity, empowering its data experts to search for suspicious patterns of fraud, market manipulation, and compliance violations. As a result, FINRA was able to hand out 373 fines totaling US$ 134.4 million in 2016, a new record for the agency, according to Law360.

Data Lakes Don’t End Complexity for IT

Though data lakes make access to data and analysis easier for the business, they don’t necessarily make the CIO’s life a bed of roses. Implementations can be complex, and companies rarely want to walk away from investments they’ve already made in data analysis technologies, such as data warehouses.

“There have been so many millions of dollars going to data warehousing over the last two decades. The idea that you’re just going to move it all into a data lake isn’t going to happen,” says Mike Ferguson, managing director of Intelligent Business Strategies, a UK analyst firm. “It’s just not compelling enough of a business case.” But Ferguson does see data lake efficiencies freeing up the capacity of data warehouses to enable more query, reporting, and analysis.

Data lakes also don’t free companies from the need to clean up and manage data as part of the process required to gain these useful insights. “The data comes in very raw, and it needs to be treated,” says James Curtis, senior analyst for data platforms and analytics at 451 Research. “It has to be prepped and cleaned and ready.”

Companies must have strong data governance processes, as well. Customers are increasingly concerned about privacy, and rules for data usage and compliance have become stricter in some areas of the globe, such as the European Union.

Companies must create data usage policies, then, that clearly define who can access, distribute, change, delete, or otherwise manipulate all that data. Companies must also make sure that the data they collect comes from a legitimate source.

Many companies are responding by hiring chief data officers (CDOs) to ensure that as more employees gain access to data, they use it effectively and responsibly. Indeed, research company Gartner predicts that 90% of large companies will have a CDO by 2019.

Data lakes can be configured in a variety of ways: centralized or distributed, with storage on premise or in the cloud or both. Some companies have more than one data lake implementation.

“A lot of my clients try their best to go centralized for obvious reasons. It’s much simpler to manage and to gather your data in one place,” says Ferguson. “But they’re often plagued somewhere down the line with much more added complexity and realize that in many cases the data lake has to be distributed to manage data across multiple data stores.”

Meanwhile, the massive capacities of data lakes mean that data that once flowed through a manageable spigot is now blasting at companies through a fire hose.

“We’re now dealing with data coming out at extreme velocity or in very large volumes,” Ferguson says. “The idea that people can manually keep pace with the number of data sources that are coming into the enterprise—it’s just not realistic any more. We have to find ways to take complexity away, and that tends to mean that we should automate. The expectation is that the information management software, like an information catalog for example, can help a company accelerate the onboarding of data and automatically classify it, profile it, organize it, and make it easy to find.”

Beyond the technical issues, IT and the business must also make important decisions about how data lakes will be managed and who will own the data, among other things (see How to Avoid Drowning in the Lake).

How to Avoid Drowning in the Lake

The benefits of data lakes can be squandered if you don’t manage the implementation and data ownership carefully.

Deploying and managing a massive data store is a big challenge. Here’s how to address some of the most common issues that companies face:

Determine the ROI. Developing a data lake is not a trivial undertaking. You need a good business case, and you need a measurable ROI. Most importantly, you need initial questions that can be answered by the data, which will prove its value.

Find data owners. As devices with sensors proliferate across the organization, the issue of data ownership becomes more important.

Have a plan for data retention. Companies used to have to cull data because it was too expensive to store. Now companies can become data hoarders. How long do you store it? Do you keep it forever?

Manage descriptive data. Software that allows you to tag all the data in one or multiple data lakes and keep it up-to-date is not mature yet. We still need tools to bring the metadata together to support self-service and to automate metadata to speed up the preparation, integration, and analysis of data.

Develop data curation skills. There is a huge skills gap for data repository development. But many people will jump at the chance to learn these new skills if companies are willing to pay for training and certification.

Be agile enough to take advantage of the findings. It used to be that you put in a request to the IT department for data and had to wait six months for an answer. Now, you get the answer immediately. Companies must be agile to take advantage of the insights.

Secure the data. Besides the perennial issues of hacking and breaches, a lot of data lakes software is open source and less secure than typical enterprise-class software.

Measure the quality of data. Different users can work with varying levels of quality in their data. For example, data scientists working with a huge number of data points might not need completely accurate data, because they can use machine learning to cluster data or discard outlying data as needed. However, a financial analyst might need the data to be completely correct.

Avoid creating new silos. Data lakes should work with existing data architectures, such as data warehouses and data marts.
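On the descriptive-data point above, the missing tooling boils down to something conceptually simple: a shared place to tag data sets and find them again. This minimal, hypothetical sketch (the dataset names are invented) shows the self-service pattern such a metadata layer enables.

```python
# Minimal, hypothetical sketch of a metadata catalog that tags data sets
# across one or more data lakes and supports self-service search.
class MetadataCatalog:
    def __init__(self):
        self._tags = {}  # dataset name -> set of tags

    def tag(self, dataset, *tags):
        """Attach one or more descriptive tags to a data set."""
        self._tags.setdefault(dataset, set()).update(tags)

    def find(self, tag):
        """Return every data set carrying the given tag, sorted by name."""
        return sorted(d for d, t in self._tags.items() if tag in t)

catalog = MetadataCatalog()
catalog.tag("lake1/sensor_readings", "iot", "raw")
catalog.tag("lake2/telemetry", "iot")
```

Production-grade catalogs add lineage, versioning, and access control on top, which is exactly the maturity gap the list above describes.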

From Data Queries to New Business Models

The ability of data lakes to uncover previously hidden data correlations can massively impact any part of the business. For example, in the past, a large soft drink maker used to stock its vending machines based on local bottlers’ and delivery people’s experience and gut instincts. Today, using vast amounts of data collected from sensors in the vending machines, the company can essentially treat each machine like a retail store, optimizing the drink selection by time of day, location, and other factors. Doing this kind of predictive analysis was possible before data lakes came along, but it wasn’t practical or economical at the individual machine level because the amount of data required for accurate predictions was simply too large.
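At its core, the per-machine optimization described above is demand ranking on sensor data. The sketch below is a deliberately simplified illustration with invented event data, not the soft drink maker's actual system: it ranks drinks for one machine by observed sales and fills the available slots with the top sellers.

```python
# Simplified, hypothetical per-machine stocking recommendation.
# sales_events are invented (drink, hour-of-day) tuples from one
# machine's sensors; a real system would model time, location, etc.
from collections import defaultdict

def recommend_stock(sales_events, slots):
    """Return the top-selling drinks for one machine, up to `slots`."""
    counts = defaultdict(int)
    for drink, _hour in sales_events:
        counts[drink] += 1
    ranked = sorted(counts, key=counts.get, reverse=True)
    return ranked[:slots]
```

The point of the data lake is scale: running even this trivial calculation per machine, per time window, across hundreds of thousands of machines is what was previously uneconomical.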

The next step is for companies to use the insights gathered from their massive data stores not just to become more efficient and profitable in their existing lines of business but also to actually change their business models.

For example, product companies could shield themselves from the harsh light of comparison shopping by offering the use of their products as a service, with sensors on those products sending the company a constant stream of data about when they need to be repaired or replaced. Customers are spared the hassle of dealing with worn-out products, and companies are protected from competition as long as customers receive the features, price, and the level of service they expect. Further, companies can continuously gather and analyze data about customers’ usage patterns and equipment performance to find ways to lower costs and develop new services.

Data for All

Given the tremendous amount of hype that has surrounded Big Data for years now, it’s tempting to dismiss data lakes as a small step forward in an already familiar technology realm. But it’s not the technology that matters as much as what it enables organizations to do. By making data available to anyone who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.

“Companies that do not actively invest in data lakes will truly be left behind,” says Anita Raj, principal growth hacker at DataRPM, which sells predictive maintenance applications to manufacturers that want to take advantage of these massive data stores. “So it’s just the option of disrupt or be disrupted.”

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.

About the Authors:

Timo Elliott is Vice President, Global Innovation Evangelist, at SAP.

John Schitka is Senior Director, Solution Marketing, Big Data Analytics, at SAP.

Michael Eacrett is Vice President, Product Management, Big Data, Enterprise Information Management, and SAP Vora, at SAP.

Carolyn Marsan is a freelance writer who focuses on business and technology topics.



Digitalist Magazine

Stop Trying to Win Against Robo-Advisors: Change the Game by Providing Value they Can’t with Today’s “Predictive” CRM


Morgan Stanley recently announced it is putting a machine learning-based system in place in an effort to make its financial advisors (the human ones) more effective. What does this mean? That Morgan Stanley is ahead of the pack in understanding that robo-advisors are diluting the entire market, and that simply having a “relationship” is not enough to retain clients and keep them loyal. Human advisors today need the ability to offer something more, something robo-advisors cannot: a crystal ball.

But here’s the problem: Human advisors, as far as we know, don’t have one, either.

What Morgan Stanley is doing, though, is helping its advisors become more proactive by adding predictive technology, which provides more value to their existing clients. The theory it is promoting is that a human advisor with an “algorithmic assistant” is preferable to basic software that lumps clients together using extremely limited information and allocates assets wholesale within each category, based on how they are profiled.

We think Morgan Stanley has got this right, but do the rest of us necessarily need to have highly customized, complex technology (which is not cheap, I might add) to get us closer to that crystal ball? Or, could your firm achieve this goal with—hmmm—say, your CRM system?

Traditional CRM: It did what it needed to do before, but it’s no longer fighting the fight

If you have worked with CRM in a financial services capacity, you already know what it can do. The typical role for CRM has been that of a control tool, primarily taking care of managing relationships, asset aggregation, and reporting. At AKA, we have been implementing Microsoft Dynamics CRM systems for many years, and we could easily argue that, when it comes to integrating data from transfer agents, portfolio management systems, trade settlement systems, and other such programs, we are the go-to experts.

Here’s the problem with traditional CRM systems: they cannot provide enough value for financial advisors when it comes to predictive relationship management. The reason is that traditional CRM systems are housed on premises, which limits them to internal data such as roll-up information and account information. These systems have not been able to tap into external sources, and they have also lacked the capability to provide predictive analytics and machine learning. They simply were not built that way.

Super-charged CRM: Machine learning is the new crystal ball

But hold on: If you’re thinking you’ve lost this war with the robo-advisors, you haven’t. Right now, advisors have the perfect opportunity to show clients that they are not just portfolio managers. They can help their clients reach their goals and realize their dreams. But to do this, they must get in front of the information so they can start providing such an unbelievable client experience that their client begins to think their advisor does indeed possess the ability to see into the future. Today’s CRM can aid them in doing just that. In fact, what CRM can offer now is pretty amazing.

Taking advantage of Cloud capabilities, the CRM functionality in Microsoft Dynamics 365 can instantly integrate with other systems and their information sources. Microsoft also offers Relationship Insights, which super-charges CRM, turning it into a predictive tool that adds value through a proactive approach. Relationship Insights manages external data alongside the data you have typically integrated within your CRM system. That takes CRM, and client relationships, to a new, exciting level by leveraging AI and machine learning along with the Cloud.

With the parameters and triggers you establish in the system, CRM reaches out to advisors well in advance—allowing them to make smarter, more predictive decisions that will benefit their clients. Beyond just monitoring their clients’ portfolios, advisors can now ensure that action is being taken on trends in the earliest stages. In addition, by monitoring social media channels, the advisors can provide a more personalized type of guidance. For example, if an advisor begins noticing that a client is posting lots of photos of sailboats on Instagram, he (the advisor) can then reach out to discuss how the client can work the purchase of a sailboat into their financial plan.
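The "parameters and triggers" pattern can be sketched very simply. The rules below are entirely invented examples of the kind of conditions a firm might configure; they are not Relationship Insights functionality, just an illustration of how parameterized triggers surface clients who need proactive outreach.

```python
# Hypothetical sketch of parameterized CRM triggers. The client fields
# (cash_pct, top_holding_pct, age) and thresholds are invented examples.
def evaluate_triggers(client, rules):
    """Return the names of every rule whose condition fires for a client."""
    return [name for name, condition in rules if condition(client)]

rules = [
    ("cash_drag", lambda c: c["cash_pct"] > 0.25),        # too much idle cash
    ("concentration", lambda c: c["top_holding_pct"] > 0.40),
    ("milestone", lambda c: c["age"] in (59, 64)),        # nearing key ages
]
```

An advisor's morning queue is then just every client with a non-empty trigger list, ranked however the firm chooses.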

With this new layer of technology, a reactive CRM becomes a predictive tool, thereby allowing for greatly improved outreach to your clients and a considerably higher relationship score. The very minute you give a financial advisor the ability to make better decisions in the limited time they have, you have added more value to their client relationships. You’ve allowed them to focus more on those clients who are more valuable, increasing the chances for retention.

I’d like to see a robo-advisor do that.

Change the game!

To summarize, the CRM of today is equipped with predictive technology. Programs like Relationship Insights give your advisors the ability to build upon those precious, human relationships they have with their clients, allowing them to focus on helping them achieve their goals and live the dream. And that, my friends, is how you get a client for life.

So, stop playing the robo-advisor game, and learn how to change it. Check out our recorded webcast, Competing with Robo Advisors: How to Carry a Bigger Book of Business While Providing Clients with a High-touch Experience.


CRM Software Blog | Dynamics 365