Business Intelligence Info

Monthly Archives: October 2018

Announcing General Availability of Custom and Certified Connectors for Power BI

October 31, 2018   Self-Service BI

We are happy to announce that Custom and Certified Connectors in Power BI are now generally available!

Custom and Certified Connectors

Since we delivered the Power BI Custom Connectors SDK, our incredible community has developed hundreds of connectors. Some were developed by vendors, some by consultants, and some by third parties who were passionate enough about Power BI to write connectors for specific data sources. It’s been a great success, and we’re happy to announce it is no longer a preview feature!

Custom Connectors for Power BI are modules that extend Power BI’s capabilities to connect to data sources not covered by what’s shipped with the product. We’ve seen these used to connect to databases, web services, and even specific private data sets (such as census data).

As this feature is no longer in preview, connectors are now fully supported in Power BI Desktop, the On-Premises Data Gateway (Personal and Enterprise), and the Power BI Service (via the Gateway). We support both data refresh through import and DirectQuery for those connectors that support it. We also now have a dedicated Power Query section on Microsoft Docs, with all the information you might need to develop a Custom Connector, alongside our existing M Language Reference.

Alongside this, we are announcing our ‘Certified Connector’ program. A certified connector is:

  • Built by the developer
  • Evaluated by Microsoft
  • Distributed by Microsoft
  • Maintained by the developer
  • Supported by the developer

This program will allow owners of data sources to deliver connectors seamlessly via Get Data in Power BI, making it easy for you to take advantage of the capabilities of Power BI and the breadth of our audience.

Connector support is provided by the connector owner, with assistance to the connector owner from Microsoft. If you’re a vendor interested in certifying your connector, reach out to us via this form. More information about requirements can be found here.

We encourage the community to use custom connectors to bring all their data into Power BI, not just data from sources we already support. Many vendors have reached out to work with us already, and more information about some of the connectors we’ve already certified is below.

What does this mean?

Custom Connectors are now Generally Available and every Power BI user is encouraged to leverage them – whether that means building new connectors or using connectors built by others. This includes Certified Connectors, which are distributed seamlessly, as well as other Custom Connectors built by third parties.

We have new security settings under ‘Security’ in ‘Options’. If you’d like to know more about the security levels, please read the documentation page here. You’ll note that there are two security levels currently—one for ‘custom connectors’ and one for ‘certified connectors’. Because we go through all certified connectors to make sure they’re safe, we’re willing to distribute them at the higher level of security.

However, to ensure your security, modules not certified by Microsoft will not load without changing security settings.

We are aware that there are lots of connectors that don’t fall into the certified path, as mentioned above. Sometimes, however, you may trust a vendor to provide you a connector that (for whatever reason) isn’t easily certifiable. It could be a data back-end they don’t own, it might be a specific data set, or it might be a private database technology not generally available on the market. There are lots of reasons, and we will do our best to support this use case.

Certified Connectors in October Release

As mentioned above, we’ve worked with a number of vendors already on the Certified Connector program. Below is a blurb about each of them. Some of these connectors (Exasol, JethroData, and TeamDesk) were introduced under the feature switch a few months ago—check them out! The others are brand new with the October release of Power BI. All statements below, aside from the learn-more links, are taken from the respective ISVs.


Denodo

The Denodo Platform provides all the benefits of data virtualization including the ability to provide real-time access to integrated data across an organization’s diverse data sources, without replicating any data. The Denodo Platform offers the broadest access to structured and unstructured data residing in enterprise, big data, and cloud sources in both batch and real time, exceeding the performance needs of data-intensive organizations.

To learn more about Denodo, please visit https://www.denodo.com/en

Dremio

Dremio is a data-as-a-service platform that empowers users to discover, curate, accelerate, and share any data at any time, regardless of location, volume, or structure.

To learn more about Dremio, please visit https://dremio.com

Exasol

EXASOL is a high-performance, in-memory, MPP database specifically designed for analytics. From business-critical data applications to advanced analytics, EXASOL helps you analyze large volumes of data in real-time, helping you to accelerate your BI and reporting, and to turn data into value. With EXASOL, you can benefit from real-time analytics in order to improve operational efficiencies, drive business growth, improve customer acquisition rates and deliver excellent customer service.

To learn more about Exasol, please visit https://www.exasol.com

Kyligence

Kyligence provides a leading intelligent OLAP platform powered by Apache Kylin to enable interactive big data analytics with high concurrency, from on-premises to the cloud. Deployed on Azure HDInsight, Kyligence helps you match dynamic computing and analytics requirements, reduce operating costs, and accelerate business analytics in the cloud. Kyligence is now available on the Azure Marketplace.

Kyligence is a Power BI partner and an Azure ISV partner. To learn more about Kyligence, please visit http://kyligence.io/

Jethro

Jethro is designed specifically for the unique needs of interactive enterprise Business Intelligence (BI) on Big Data. It is a transparent middle tier that requires no change to the BI apps or the underlying data. Jethro employs full indexing, auto cubes and a sophisticated optimizer to address the full range of BI queries. As it is self-driving, Jethro requires no maintenance or tuning, which eliminates costly manual administration.

To learn more about Jethro, please visit https://jethro.io

Paxata

Paxata is a visually-dynamic, intuitive solution that enables business analysts to rapidly ingest, profile, and curate multiple raw datasets into consumable information in a self-service manner, greatly accelerating development of actionable business insights.  In addition to empowering business analysts and SMEs, Paxata also provides a rich set of workload automation and embeddable data preparation capabilities to operationalize and deliver data preparation as a service within other applications.

To learn more about Paxata, please visit https://www.paxata.com

TeamDesk

TeamDesk is an online database system designed to make it easier to work with data and to organize and store the information you use in your routine work.

Using TeamDesk, anyone can build their own business management solution for any business process, or choose a predefined database, and their team members will be happy to use it.

To learn more about TeamDesk, please visit https://www.teamdesk.net

Summary

Custom and Certified connectors for Power BI are now generally available! We encourage the community to use them and experiment with them, and we would love for vendors to reach out to us about certifying their connector. Information on the certification process can be found here, and you can nominate your connector for certification here.


Microsoft Power BI Blog | Microsoft Power BI


TIBCO Scribe Conference 2018: Better Together

October 31, 2018   TIBCO Spotfire


The TIBCO Blog


Happy Halloween!

October 31, 2018   Humor

The scariest things this Halloween are Fox News and Donald Trump, even though they are the ones trying to scare everyone by claiming that the US is being invaded by immigrants. You know, immigrants, like the ancestors of almost everyone in the US. Indeed, the immigrants already invaded, long ago.


© Keef Knight



Also published on Medium.



Political Irony


Experience The Cloud With PowerObjects – D365UG Summit 2018 [VIDEO]

October 31, 2018   Microsoft Dynamics CRM

A few weeks ago, PowerObjects enjoyed the distinct honor of being a Premier Sponsor at Dynamics Communities’ Dynamics 365 User Group Summit in Phoenix. We had a great time chatting with existing clients, meeting prospective clients, and sharing knowledge and insights with colleagues, partners, and users from all over the globe. The annual Summit always makes for an exhausting week, but it is also incredibly rewarding, and this year was no exception. In addition to sponsoring the event, running the PowerObjects booth, talking to literally thousands of people, sharing our own journey to the cloud, and handing out tons of swag, the 21 PowerObjects team members hosted several workshops, demonstrated for crowds large and small the power and potential of Microsoft Business Applications, and showcased tons of the creative ingenuity for which we’re known. Please enjoy this video recap of our week in Phoenix!


PowerObjects- Bringing Focus to Dynamics CRM


Best Practices for Increasing Data Availability

October 31, 2018   Big Data

Christopher Tozzi

October 31, 2018

Understanding the importance of data availability is easy enough. Actually achieving high data availability, however, is harder.

If you’re wondering how you can improve data availability, keep reading. This post discusses some data availability best practices.

Embrace redundancy

Perhaps the most basic step you can take to improve data availability is to ensure that your data is redundant—or, in other words, that you have multiple sources of the data available. That way, a failure in one of the disks, servers or databases that hosts your data will not lead to a disruption in availability.

The challenge with redundancy is striking the right balance between redundancy and cost-efficiency. In a world where budgets are a thing, the number of copies of a database or data server that you can have running at the same time is limited. However, by studying data such as how often a given server or database fails and assessing how important different data workloads are, you can make an informed decision about how much redundancy to implement for each data source.
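
To make that trade-off concrete, here is a rough sketch (not from the original post) in Python that estimates how much availability each additional replica buys, assuming replicas fail independently; the failure probability and per-copy cost below are made-up illustrative numbers.

# Hypothetical illustration: availability vs. cost as replicas are added.
# Assumes each replica fails independently with the same probability.

def availability(replica_failure_prob: float, replicas: int) -> float:
    """Probability that at least one replica is up."""
    return 1.0 - replica_failure_prob ** replicas

if __name__ == "__main__":
    p_fail = 0.02            # made-up chance that a single server/database is down
    cost_per_replica = 500   # made-up monthly cost per extra copy
    for n in range(1, 6):
        print(f"{n} replica(s): availability = {availability(p_fail, n):.6f}, "
              f"monthly cost = {n * cost_per_replica}")

Running a loop like this against your own failure statistics and costs gives a numeric basis for deciding where the extra copies stop paying for themselves.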


Automate failover

Data redundancy is great, but data redundancy combined with automated failover is even greater.

That is because automated failover (as the term implies) means that when a component of your infrastructure fails, a backup component automatically replaces it. By eliminating the need to wait for a human engineer to detect a failure and switch to a backup system, automated failover minimizes or completely avoids disruption to data availability.

Many monitoring and management tools for virtual servers and databases make it possible to configure automated failover. And if yours doesn’t, some simple scripting should suffice for ensuring that backup systems come online automatically when a primary system fails.
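
As a hedged illustration of the kind of "simple scripting" mentioned above, the Python sketch below watches a primary database endpoint and promotes a standby after repeated failed health checks. The hostnames, port, and promote command are hypothetical placeholders, not part of any particular monitoring product.

# Minimal failover-watchdog sketch. All hosts, ports, and the promote
# command are hypothetical; substitute your own infrastructure details.
import socket
import subprocess
import time

PRIMARY_HOST = "db-primary.example.internal"   # hypothetical primary database host
PRIMARY_PORT = 5432                            # hypothetical port
PROMOTE_STANDBY_CMD = ["ssh", "db-standby.example.internal", "promote-standby"]  # hypothetical
CHECK_INTERVAL_S = 10
FAILURES_BEFORE_FAILOVER = 3

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def main() -> None:
    consecutive_failures = 0
    while True:
        if is_reachable(PRIMARY_HOST, PRIMARY_PORT):
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            print(f"Primary check failed ({consecutive_failures}/{FAILURES_BEFORE_FAILOVER})")
            if consecutive_failures >= FAILURES_BEFORE_FAILOVER:
                print("Promoting standby...")
                subprocess.run(PROMOTE_STANDBY_CMD, check=True)
                break  # a real watchdog would keep monitoring the new primary
        time.sleep(CHECK_INTERVAL_S)

if __name__ == "__main__":
    main()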

Avoid single points of failure

Another simple step that you can take to improve data availability is to avoid single points of failure—meaning infrastructure components or applications that would cause your data to become unavailable if they stop working correctly.

The concept here is similar to the point about redundancy made above, but there are differences between redundancy and eliminating single points of failure. You can have a redundant storage infrastructure composed of multiple servers and disks but still be at risk of having them become unavailable if, for example, the network router on which they depend crashes. In that case, your router would be your single point of failure. A well-architected infrastructure would avoid that risk by ensuring that not all data passes through a single router.

Embrace software-defined infrastructure (where possible)

Generally speaking, software-defined infrastructure and storage help to maximize data availability. That is because when the infrastructure and file systems that store your data are defined in software and not directly integrated with the hardware that hosts them, they are easy to move around and to scale.

Keep in mind that software-defined environments can be as simple as a virtual server and virtual disks, which give you the benefits of storage that is abstracted from the underlying hardware. There are fancier ways to do software-defined infrastructure, too, such as by using a file system like GlusterFS or Ceph. But you need not get that complicated to take advantage of software-defined infrastructure and storage in order to improve data availability.

Obviously, you can’t migrate every type of data workload to a software-defined environment. And some software-defined environments are more sophisticated than others.

Establish and enforce RTO

RTO, or Recovery Time Objective, is the maximum amount of time within which data availability must be restored after a disruption in order for your business to keep operating.

Depending on your industry, the amount of data you collect and other factors, you might be able to continue functioning for days without your data, or you may not last more than an hour before suffering critical damage to the business. If your business is a chain of coffee shops, recovering data instantaneously may not be as crucial as it would be if you are a bank and depend on digital data for nearly all of your operations.

If you haven’t yet figured out what your business’s RTO is, now is the time to do it. You don’t want to wait for a data availability disruption to occur before discovering how long your company can continue to function (or not) without its data.

Remember, too, that calculating RTO is only half the battle. In order to make RTO useful, you also need to ensure that you can recover from a disaster within the timeframe specified by your RTO. Here again, you do not want to wait until a disaster happens to test whether your disaster recovery plan is actually capable of meeting RTO requirements. Instead, run periodic drills to see how long it takes to recover data from backup databases and to switch to new data server instances to ensure that you are prepared to meet RTO requirements when the time comes.
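
As a minimal sketch of such a drill, the Python snippet below times a restore-and-switchover exercise and compares it against an RTO target; the restore and switchover steps are stand-in placeholder functions you would replace with your own backup and failover commands.

# Illustrative drill-timing sketch: the restore steps are placeholders,
# and the RTO value is an example target, not a recommendation.
import time

RTO_SECONDS = 60 * 60  # example target: one hour

def restore_from_backup() -> None:
    """Placeholder for restoring the latest backup into a standby database."""
    time.sleep(2)  # simulate work

def switch_traffic_to_standby() -> None:
    """Placeholder for repointing applications at the recovered instance."""
    time.sleep(1)  # simulate work

def run_drill() -> float:
    start = time.monotonic()
    restore_from_backup()
    switch_traffic_to_standby()
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = run_drill()
    status = "within" if elapsed <= RTO_SECONDS else "OVER"
    print(f"Drill took {elapsed:.1f}s, which is {status} the {RTO_SECONDS}s RTO.")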

Download our White Paper on Assessing the Financial Impact of Downtime!


Syncsort Blog


Design, Plan, Execute & Support – A Supply Chain Evaluation Guide

October 31, 2018   NetSuite

Posted by Gavin Davidson, Product Marketing Manager for ERP

As hurricane season approaches, it’s a timely reminder that no matter how efficient your supply chain is, we’re all at the mercy of the next natural disaster. Building redundancy into your supply chain is critical, but even then you need to make sure that your supply routes are spread not only across multiple vendors, but also multiple geographies and ideally multiple shipping routes.

When NetSuite went about re-thinking its supply chain software, I went through the exercise of mapping out all of the processes and ended up grouping them into four categories: Design, Plan, Execute & Support. These categories really helped focus the customers I spoke with as we held our process mapping sessions, but they should also be useful when evaluating the effectiveness of your supply chain redundancy.

Design. Everything benefits from having a solid foundation to build on, and your supply chain is no different. True redundancy will inevitably result in multiple variations of your product designs, especially if you’re relying on your contract manufacturers to source local components. In NetSuite, the Advanced BOM feature allows a specific version of a BOM to be applied to any combination of the locations it’s sourced from – but uniquely, it also allows a single BOM to be associated with multiple end SKUs. This really simplifies things when you’re dealing with multiple brands of a single item. Making sure that everyone is working from the same version of the BOM at all times reduces errors, ensures optimal cost of quality and should also minimize returns. Having your Engineering Change Order process embedded in your ERP system – unifying design, engineering and execution – should be a critical part of your company’s strategy.

Plan. Building on a well-designed product and supply chain, it’s time to plan for the inevitable – disaster! Whether natural or otherwise, Murphy’s Law states it’s going to happen, so you must be ready for it when it does. One of the keys to effective supply chain management is a control tower, essentially a single place where you can go to see everything that’s happening, evaluate your response and execute upon it. NetSuite has launched a supply chain snapshot feature, a central part of any control tower, that shows all of your inventory and related transactions with the ability to filter by subsidiary, location etc. Users can now easily look at your global or regional supply situation and quickly source alternates when required, but the key here is to practice your response. Using a sandbox account to plan out some simple and worst-case scenarios and discussing the best ways to resolve them and how those should be communicated internally and externally will make it easier to execute an alternate plan when it’s required.

Execute. Communication is a key aspect in all areas of business, but especially so in a complex supply chain. Keeping open lines of communication and making sure that the message being delivered is concise and specific is critical. Consider establishing a communication strategy for suppliers, customers and employees. In today's social media age, people have more information at their fingertips than they can possibly consume – so it’s your responsibility to guide them to the information that’s most important and to make sure that everyone knows what you expect of them. It is YOUR supply chain after all. Not all your suppliers will be fully digitized, but they can still take advantage of using NetSuite’s portals to make sure they are always aware of your current plan and how you expect them to execute against it.

Support. Support is an interesting category. At the end of the day we ended up putting everything that supported but didn’t directly impact the supply chain into this category. Beyond after-sales support, customer service and financials, one of the biggest areas of concern was fully understanding the cost of sourcing products from other geographies. NetSuite’s Inbound Shipment Management and Landed Cost functionality ensures that you can consider freight, duty and brokerage costs – and the impact on profitability – as part of your decision making.

With NetSuite’s 18.2 release, your supply chain managers have unprecedented access to all the information they need to make the decisions that ultimately affect your business’ ability to execute on your plan. Whether that involves design and engineering changes, full documentation control, real-time tracking of current and future inventory balances, inbound shipment management or accurate accounting of landed costs, they will find all of the necessary information at the tip of their fingers allowing them to execute on their plans and fulfill your company’s goals.

But to really make fundamental change to your business, you’d be wise to consider following our four step methodology to supply chain success: Design, Plan, Execute & Support.

Watch Sneak Peek to catch a preview of what’s new across several different areas of the latest release.

Posted on Tue, October 30, 2018 by NetSuite


The NetSuite Blog


Supplier Risk Management In Financial Services: Overcoming The Dangers

October 31, 2018   BI News and Info

Ruud Willemsen

The rise of third-party outsourcing has helped the financial services industry innovate and boost efficiency and earnings. Yet fallout from risk-related incidents has shot up as well, while research shows that risk strategies remain inadequate. For example, a cross-industry Deloitte survey of executives responsible for third-party risk management (TPRM) in their companies found that:

  • 87% suffered a disruptive third-party incident in the past 2–3 years
  • 28% faced a major operational disruption due to these incidents
  • 94% expressed low confidence in current tools and recognized the need to implement new and leading technologies

And as regulatory scrutiny, criminal prosecutions, and financial penalties targeting the financial services sector continue to grow, effective risk mitigation becomes an increasingly urgent priority.

Challenges to the three lines of defense

The three lines of defense (3LD) framework introduced in the 1990s has helped financial institutions reduce risk through a system of checks and balances that distributes responsibility across multiple internal and external groups. But challenges often prevent companies from achieving 3LD goals, as shown below:

  • First line: procurement and line of business (LoB) units. These teams must work together to segment vendors across their entire supply chain by risk type and domain, then analyze large volumes of internal and external data to identify and assess supplier-related threats – cybersecurity breaches, bribery and corrupt practices, money laundering, insolvency, data mishandling, regulatory noncompliance – and take actions to manage or remove them. Yet because this data resides in disparate systems controlled by different departments, teams lack the visibility they need to catch potential problems. Duplicate accountability structures can also create confusion about who’s managing what, so risk indicators remain unseen.
  • Second line: governance, risk, and compliance (GRC) teams. This line sets global/regional TPRM policies that help the organization meet legal requirements and gain a comprehensive view of risk issues. It also alerts lines of business (LoBs) to emerging risk trends and offers guidance on how to meet regulations. However, questions frequently arise as LoBs try to implement requirements for their specific business environment, resulting in inefficient one-off interactions that don’t scale. Operational silos and the lack of a central repository to capture and coordinate this information can cause communication problems that increase the chance of compliance violations.
  • Third line: auditing and board of directors. Here senior management represents organizational stakeholders relative to risk issues and maintains oversight, while independent auditors review first- and second-line activities/results to assure the board that exposures have been dealt with. But tracing audit trails through disconnected solutions, spreadsheets, and manual processes often delays progress and impedes accuracy, driving up costs and making institutions more vulnerable to mistakes and sanctions.

Shifting the equation: Best practices to drive success

Recognizing that the best defense is a good offense, financial services leaders are implementing best practices and innovative supplier management solutions to resolve their 3LD challenges. These tactics can work for any business, including yours:

  • Strengthen teamwork by aligning people, processes, and technology. Develop an intentional plan for scaling the risk management process that includes the supplier end of the workflow. Seek solutions that automate your risk assessment capabilities through a unified data set and advanced risk modeling, with alerts and controls that let you quickly halt the use of unsafe suppliers and suggest better alternatives. You also benefit from a 360-degree view of risk across all supplier relationships, integrated risk insights to help stakeholders engage suppliers more effectively, and maximized protection for your audit group, executive management, and board.
  • Enhance collaboration and simplify self-service. Focused collaboration can help you nurture long-term relationships with strategic suppliers and build detailed knowledge about how they operate. The best technology solutions make this easier by providing a centralized system for supplier data that all stakeholders can easily tap into, with a single repository to encode GRC best practices into specific LoB workflows. Self-service tools give suppliers one place to update their information with visibility to all, reducing fatigue from repeated assessment requests.
  • Get real-time intelligence intelligently. Constant changes in global regulations and supplier risk factors make it imperative to manage data efficiently across myriad sources. Solutions with artificial intelligence and machine learning can dramatically compress the time needed for data mining and aggregation, while advanced analytics ensure fast, accurate supplier assessments. Automated monitoring of large third-party data ecosystems informs you about new legislation and tells procurement, LoB, and GRC teams when action is required.

By adopting best practices like these – and the technology solutions that make them possible – you can minimize your risk, protect your brand, and boost your business value to customers and the world.

For additional information about effective technology solutions, please attend our webinar in the EMEA region.



About Ruud Willemsen

As a Solution Consultant within the SAP Procurement team, Ruud Willemsen discusses SAP’s purchasing solutions with customers in various ways. Ruud advises companies on how procurement process automation can bring more efficiency and professionalism.


Digitalist Magazine


Digitalist Flash Briefing: Blockchain: Farm To Consumer Is A Journey Of Trust

October 31, 2018   SAP

Bonnie D. Graham

Today’s briefing looks at the impact of blockchain on the global food value chain.


https://www.digitalistmag.com/files/2018/10/sapdigitalist101718.mp3

  • Amazon Echo or Dot: Enable the “Digitalist” flash briefing skill, and ask Alexa to “play my flash briefings” on every business day.
  • Alexa on a mobile device:

    • Download the Amazon Alexa app: Select Skills, and search “Digitalist”. Then, select Digitalist, and click on the Enable button.
    • Download the Amazon app: Click on the microphone icon and say “Play my flash briefing.”

Find and listen to previous Flash Briefings on Digitalistmag.com.

Read more on today’s topic



About Bonnie D. Graham

Bonnie D. Graham is the creator, producer, and host/moderator of the Game-Changers Radio series presented by SAP, bringing technology and business strategy discussions to a global audience. Listen to the series flagship, Coffee Break with Game-Changers.


Digitalist Magazine


Embed Power BI reports in SharePoint online using Apps

October 31, 2018   Self-Service BI

Last week I was trying to embed a Power BI report into a SharePoint online page. I wanted users in my organization with a free license to be able to see the reports. Sounds easy? I opened the help documentation and went ahead. Unfortunately, it was not as easy as it seemed, as the documentation was missing one crucial point (this will be updated soon).

Steps to get it to work

In the end I got it to work. These were the steps I used:

  1. The author of the report has Power BI Pro (user 1).
  2. The report consumer (User 2) has a PBI free license
  3. I created a report in workspace 1 as user 1. The workspace is hosted on Premium so User 2 can view the content.
  4. To give a large set of users access to the report, we use apps. Apps support giving access to groups.
  5. So I created an app containing the report.
  6. Next, I share it with myself and the target audience. I used “Install app automatically” to make sure the app is installed and the user has access.
    Remember: To have access to the report you need to have the app installed.
  7. Now I open the app myself (as user 1) and go to the report
  8. After opening the report I can get the embed report URL
    This is key to making it work! Use the report installed by the app and not the one  from the workspace!
  9. Now that we have the URL we can create a new team site in SharePoint
  10. I now add the report URL I copied in step 8 to the PBI web part.
  11. After publishing the SharePoint page I can see the page successfully, including embedded Power BI report:
  12. Now I add user 2 as member to the SharePoint page
  13. I now open an in-private browser and log in to SharePoint as user 2
  14. I now go to the team site as user 2. Remember: User 2 needs to have a Power BI license. You can bulk assign free licenses to your users if you don’t want them to get a pop-up at this point.
  15. When user 2 opens the page he can see the report embedded in SharePoint.
  16. Done!

Recap

So in short the following items are needed:

  • Create an app for the report that you want to share
  • Share the app with the users/groups who need to see the report; once the app is installed in their Power BI, they have access.
    • You can install the app as part of the app configuration so they don’t have to install it themselves.
  • Copy the embed report URL from the app, not the workspace (this is key!)
  • Give access to the SharePoint page to the same group as where you embed your Power BI report.

After the introduction of WS v2

Now this looks pretty complicated, but with the new workspaces that are in preview today, and described here, this is going to be much easier. After this has gone live you can just replace all the steps above with the following:

  • Create a report in a workspace
  • Add the group you want to see the report as “Viewer” to the workspace
  • Now just embed the page from the workspace in SharePoint
  • Done! No more messing about with apps to control security.


Kasper On BI


Conversational experiences: Building relationships one conversation at a time

October 30, 2018   CRM News and Info

NOTE: While I have my head down writing a white paper, I get the distinct honor of introducing you (again) to a dear friend and thought leader, Mitch Lieberman (who you may have met before on these pages last March). Mitch, now the founder and principal strategist of ConversationalX, has been doing this long enough and exceptionally well enough for you to read this carefully and think about it because it’s likely to involve the future of your business. It’s not just CRM anymore – it’s conversations.

Right Mitch?


Conversations represent the past and future of business relationships; designed and built one conversation at a time. There is often a missed opportunity here. A conversation between two people is as old as humanity itself, and conversations have always been at the core of all relationships. Many often forget this simple point. While a conversation creates a shared experience, its interpretation is unique to each party. Thus, mutually beneficial, positive conversations facilitate the building and strengthening of relationships. This perspective is frequently underappreciated. In order to build better relationships, we need to have better conversations. Conversational Systems will not only support customer engagement, they will take a leading role in the future of building business relationships.

A Conversational System is an intelligent platform combined with brand philosophy and business strategy within a communications framework. The core objective is to facilitate comfortable and familiar discussions between two parties that provide mutually beneficial value, establish trust, and embrace building a co-designed relationship.

Conversational Systems are the result of recombinant innovation focused on progress and outcomes. In other words, this is not something new. What is new is looking at modern communications through a different lens and rebuilding the core ideas and concepts for ‘the now’. Focusing energy on conversational flow, design, and understanding will allow directed outcomes to progress through a more natural path. Not to be ignored, every conversation leaves its participants feeling something – an emotional or psychological reaction. The individual’s perception based on the outcome of each interaction is Conversational Experience.

Conversational Experience Excellence is the job-to-be-done of a Conversational System


Figure 1: Conversational System Logical Diagram

When focused on customer/company relationships, Conversational System design needs to be closely aligned with the discipline of CRM: Sales, Service and Marketing. A strong focus should be placed on outcomes, not on records or process management. There also need to be some guidelines, which I call the “20 Principles That Will Guide the Practice of Conversational Experience”. Here are the first three. (If you are interested in seeing the complete list, feel free to send me a note. I am easy to find: a Twitter (@mjayliebs) DM with your email. It works.)

Embedded within each idea is the simple question: How does this make the Experience better? These 20 principles serve as a guide to vendors, consultants and the enterprise. They will also serve as an evaluation guide for capability and maturity.

(3 of the 20) Principles That Will Guide the Practice of Conversational Experience:

  1. Conversations are multifaceted. Getting to the right conversation is about accuracy; having that conversation is about precision. Thus, determining which conversation to have and how to best have it are distinct exercises. While equally important, both are complex. Therefore, practitioners must be clear about which problem they are trying to solve.
  2. Conversations reduce friction. They are familiar and easy to describe. Active dialog allows for clear communication and the ability to course correct. Conversations can take place in-person or over video, voice, web chat, and SMS/messaging. Conversations are synchronous, asynchronous, and may come in bursts. Technological definitions may conflict with human nature.
  3. Conversations have a mission. They create optimal experiences and prove value to both participants of the conversation. This involves supporting and enhancing communication between two people, a person and a system/brand, or two systems. The objective is to have the best, informed, value-based and outcome-driven conversation possible. This is Conversational Experience (cX).

How We Arrived Here

We all have biases. My bias comes from spending the past 20+ years in the practice of CRM and Customer Experience (CX). I have come to the conclusion that CX is often too broad a concept when examining human-to-human or digital engagement; rather, we need to focus on individual conversations. What I mean is that CX includes all interactions from the purchasing experience to implementation (or unboxing) to use. Each experience can be better understood, analyzed, and enhanced if it is segmented into individual points of engagement. Is it possible that we have lost sight of the trees within the forest? Interactions, engagement, and personalization all impact an overall experience. But, do we really understand how and why? What if we narrowed our view and concentrated on the experience associated within each conversation?

We need to consider that each party has their own interest, desired outcome, and individual perception of every experience. A Conversational Experience is a subset of CX by definition. Each conversation can be measured, analyzed and scored, serving as an input into the larger Customer Experience. More importantly, every conversation can be informed, add value and help each side reach their desired outcome.

The approach to defining Conversational Systems and the output, Conversational Experience, is heavily influenced by the fine-tuning and progression of CRM definitions over time. Building on ideas from the past allows for marked progress towards more meaningful and valuable business relationships. Modern thinking and evolving concepts need to support and accelerate discussions surrounding larger topics such as digital transformation and customer experience management. This is not redefining CRM. What is being described is a framework focused on action, based on data, information, insight and knowledge.

The Build-Up

In 2009 Paul Greenberg defined Social CRM -

“Social CRM is a philosophy and a business strategy, supported by a technology platform, business rules, workflow, processes and social characteristics, designed to engage the customer in a collaborative conversation in order to provide mutually beneficial value in a trusted and transparent business environment. It’s the company’s programmatic response to the customer’s control of the conversation”

Then in 2015 Paul Greenberg stated –

“Customer Relationship Management is a technology and system that sustains sales, marketing and customer service activities. It is designed to capture and interpret customer data, both structured and unstructured, and to sustain the management of the business side of customer related operations. CRM technology automates processes and workflows and helps organize and interpret data to support a company in engaging its customers more effectively.”

While we have not come full circle, it is time to add philosophy and business strategy back into the CRM equation, reducing the heavy focus on technology from the 2015 definition, embracing recombinant innovation, and adopting Conversational Systems. This will allow us to alter the focus of our attention where it needs to be, on the conversation between a company and a customer. This is not visionary. This is about being practical and meeting the needs of the customer wherever they are, whenever they elect to engage, and over any channel in which they choose to connect.

Why This and Why Now?

An informal sampling and reading of vendor websites suggests that CX is part of the marketing messaging for greater than 75 percent of software vendors who design, build and/or deliver technology to their customers in support of company customer communications. In each case, the stated business goal of the technology is to facilitate communications, reduce friction, and provide sales/marketing service excellence. The complication is that experiences can neither be dictated nor given. An experience is the customer’s perception of an interaction. The only way to create a shared experience is through a conversation. This is called Conversational Experience. This is different from CX. A conversation is the most natural and comfortable method of communication for people. While each and every conversation needs to be natural and well informed, communication needs to be precise.

In seeking to understand how customers choose to engage with organizations, many look to map the set of steps and touches from first contact through the entirety of the company/customer relationship; the Customer Journey. Where a customer is located along their journey is often misunderstood by the organization. This is more prevalent during the pre-buy phase, the decision cycle, but can happen at any point along the journey. More often than not, a seller hopes that the buyer is ready to make a purchase, only to learn that they are at the research phase of their journey. Meaningful conversations identify misalignment and narrow the gap between organizational process and the Customer Journey. Conversational Systems facilitate and enhance an organization’s ability to bridge the gap more effectively, leading to positive experiences and valuable outcomes.


Figure 2: Journey versus Process

Conversations Reduce Friction

Conversations are comfortable, familiar and easy to describe. They can take place between two or more parties face-to-face, over voice, or via text-based chat and SMS/messaging. Conversations can be synchronous, asynchronous and may come in bursts. When real people as opposed to Bots are involved, the technological definitions for synchronous and/or asynchronous communication may conflict with human nature. Conversational Intelligence is the practice of designing and supporting well-informed conversations between the company and its customer or an employee and the organization. Intelligence can be active by recommendation, passive by suggestion, or artificial by taking over the conversation. The business objective of a conversation is to increase a customer’s brand loyalty, help solve a problem or provide information. This is the next generation of customer engagement. Conversational Intelligence for employees is about augmentation of knowledge, increased productivity or reducing workload.

Conversational systems need to support optimal business outcomes that lead to positive interactions and experiences for all involved. At a practical level, this is Conversational CRM. Conversational CRM pushes past silos of information and data objects. Conversations ebb and flow, relationships twist and turn, intelligence identifies gaps, and precision narrows and closes that gap – this is Conversational Experience. Within CRM, innovation is happening around the edges, not within its core. Where CRM is traditionally focused on records and data, it is becoming increasingly more about information and insight. While people need information, systems like data. The edge, action-based systems and innovation that use CRM data, is where the action is. The reason is straightforward: while there are many similarities across industries and geographies, as well as between consumer-focused and business-to-business companies, the differences, no matter how subtle, are what allow differentiation.

A Few Closing Thoughts

Many companies have capitalized on the transition from Social to Message based communications, and are moving from broken or choppy communications to Conversational – we are only just at the beginning. Here are a few areas that are ready to breakout, along with a couple that could use some attention and possible rethinking.

Social Customer Service has nearly completed the transition to Conversational Service. Messaging is a core part of the mix and needs to be one part of a larger communications / conversational strategy. Stay tuned, I have my favorites here, the world is going to change!

The future of Digital Assistants or Intelligent Employees is bright. Keep your eye on productivity and AI – where A is “Augmentation” with a strong focus on helping people do their jobs better. I am cautiously optimistic about pushing Assistants beyond what they are ready for… People still buy from people, but Intelligent systems have proven they are here to stay.

The future of CRM is less about managing a relationship and more about focusing on the inputs that define the relationship. Conversations serve as input. Each conversation fosters engagement, and each conversation determines the path of the relationship. CRM is the single best use case for Conversational Systems.

Voice is a valuable type of input and best for an Intelligent Assistant; however, multi-modal is probably better with an Intelligent Advisor. Clarity in marketing messaging is really important here; Voice is not the only mode for conversations, but the importance of Voice as an interface is huge.

CRM is the hub of ALL customer information. There are two market forces trying to alter this course, a risk and an opportunity: unlocking data from organizational silos, and in reverse, making sure data within CRM is accessible to other systems. The idea of system of record, ownership, and management of data is now, and will continue to be, a required area of focus for companies of any size.


I hope that’s food for thought and for, well, conversations with your peers. In the meantime, I’m back to work. A few announcements coming up in the next post. See you then.


ZDNet | crm RSS
