Monthly Archives: July 2018
Expert Interview (Part 2): Dr. Sourav Dey on Augmented Intelligence and Model Explainability

At the recent DataWorks Summit in San Jose, Paige Roberts, Senior Product Marketing Manager at Syncsort, had a moment to speak with Dr. Sourav Dey, Managing Director at Manifold. In the first part of our three-part interview, Roberts spoke to Dr. Dey about his presentation, which focused on applying machine learning to real-world requirements. Dr. Dey gave two examples of matching business needs to what the available data could predict.
Here in part two, Dr. Dey discusses augmented intelligence, the power of machine learning and human experts working together to outperform either one alone. In particular, AI as triage is a powerful application of this principle, and model explainability is the key to making it more useful.
Roberts: One of the big themes I’m seeing here, what the keynote talked about this morning, is that the best chess machine can beat the best human chess player, but both can be beaten by a mediocre chess player with a really good chess program working together. One of the things you talked about was that kind of cooperation between people and machines speeding up triage, and how that works.
Dey: Yeah, so this is what many people call augmented intelligence. I would say almost 50% or more of the projects that we do at Manifold fall into the business pattern that I call “AI as Triage”. The predictions the AI makes help to triage a lot of information that a single human can’t process. Then, the AI presents it in a way that a human can make a decision on it. That’s a theme that I’ve seen over and over again. Both of the examples I gave before fit that, for instance.
In the baby registry example, our client was collecting all of these signals that no single human can understand, all the web clicks, mobile clicks, marketing data, etc. The AI is triaging that and distilling it down so that a marketing person or the product person can make decisions on it.
In the oil and gas company example, it’s the same. The machines are generating fine-tick data from 54 sensors at thousands of locations across the country; no person (or even a team of people) can look at all of that all the time.
Nobody can make sense of that.
Yeah, but the AI can crush it down, and present it to humans in an actionable way. That can really speed up that triage process. So that’s the goal there.
I was impressed by one example you mentioned. You have these decision trees making a decision that something would fail, and that was kind of useful. But the person still had to figure out from scratch why it would fail, and how to repair it. Whereas, if the AI explained … how was that done?
The TreeSHAP algorithm, yeah. It explains how a decision tree came to a particular decision. It’s relatively recent that people are doing good research into this. Essentially, there is the model that’s making the prediction. Then, you can build a second model of that model that explains the original model’s predictions. It tells you why it made that prediction.
That WHY can be key.
There have been a few competing techniques out there. All of them had some issues, but this group at the University of Washington, inspired by game theory from economics, came up with a consistent explanation. It’s based on the Shapley value. What’s nice is that they developed a fast version of it that can be used with tree-based models, called TreeSHAP. It’s fantastic. We use it all the time now to explain why the model is making a particular individual prediction. For instance: today, you predicted a .91 probability of failure. Why? You can also use it at the aggregate level, for something like: on the whole, across thousands of machines over five years, what was the importance of this feature in making the prediction?
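The Shapley idea Dey describes can be sketched in a few lines. The toy model and sensor values below are hypothetical (and this brute-force enumeration over subsets is exactly what TreeSHAP avoids by exploiting tree structure), but it shows what a per-prediction explanation computes: each feature’s weighted average marginal contribution, which by construction sums to the gap between this prediction and a baseline prediction.

```javascript
// Toy "failure probability" model over three sensor features (hypothetical,
// standing in for a real tree ensemble).
function model(temp, vib, press) {
  return 0.05 + 0.4 * temp + 0.3 * vib + 0.2 * temp * press;
}

const BASELINE = { temp: 0.0, vib: 0.0, press: 0.0 }; // "average" machine
const OBSERVED = { temp: 1.0, vib: 0.5, press: 1.0 }; // machine flagged today
const FEATURES = Object.keys(BASELINE);

// Model output with features in `subset` at observed values, rest at baseline.
function value(subset) {
  const x = FEATURES.map(f => (subset.has(f) ? OBSERVED[f] : BASELINE[f]));
  return model(...x);
}

function factorial(n) {
  return n <= 1 ? 1 : n * factorial(n - 1);
}

// All subsets of an array, via bitmask enumeration.
function subsetsOf(arr) {
  const out = [];
  for (let mask = 0; mask < (1 << arr.length); mask++) {
    out.push(new Set(arr.filter((_, i) => mask & (1 << i))));
  }
  return out;
}

// Exact Shapley attribution for one feature: the weighted average of its
// marginal contribution over every subset of the other features.
function shapley(feature) {
  const others = FEATURES.filter(f => f !== feature);
  const n = FEATURES.length;
  let total = 0;
  for (const S of subsetsOf(others)) {
    const w = (factorial(S.size) * factorial(n - S.size - 1)) / factorial(n);
    total += w * (value(new Set([...S, feature])) - value(S));
  }
  return total;
}

const attributions = Object.fromEntries(FEATURES.map(f => [f, shapley(f)]));
// The three attributions sum exactly to (prediction - baseline prediction),
// so together they account for the whole predicted failure probability.
```

In this toy setup the temperature reading comes out as the dominant contributor, which is exactly the kind of “why” a maintenance engineer can act on.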
And then the person going to repair that equipment knows WHY it was predicted to fail, and therefore has a pretty good idea of what they have to fix.
Well, at least they have a much better idea. The maintenance engineers have a web app where they can dig deeper into the historical time series. In addition, they can VPN into the physical machine. All in all, the explainable model allows them to do triage much faster and, in turn, do the repair more quickly.
Model explainability is incredibly useful for a lot of things. I know Syncsort has been doing a lot of work around GDPR, and I talked to a data scientist in Germany, Katharine Jarmul, about this. For example, if a person wants a loan, and you’ve got a machine learning model that says no, you can’t have that loan, you have to be able to explain why.
Totally, yeah. There are laws about that for important civil rights reasons.
For what you’re doing, the reasons are less legal and more practical. If I’m going to use this prediction in order to take an action, such as a repair, it helps a lot if I know how the prediction was reached.
I can give another example of that. We did work for a digital therapeutics company. They make an app, along with wearables, that helps people get their diabetes under control. We were predicting whether, in 24 weeks, a patient’s blood sugar was going to go below a certain level. There’s a human in the loop, a human coach that you get as part of this program. They didn’t know what to do with the raw prediction probability. When we put in an explainable algorithm that let them know why that number was high or low, they could have much better phone calls with the patients.
Because they knew WHY the blood sugar was likely to dip.
They could say things like, hey, I see that you’re not doing this food planning very much, or you haven’t logged into the app in a while. You used to log in seven times a week. What’s going on? They have the knowledge ready to have a high bandwidth interaction with the patient.
So, I think there’s a lot there.
The more I learn about model explainability, the more I see where it’s hugely useful.
There are a lot of folks doing cool things with deep learning. It’s far harder to explain, but there’s work being done on that. Hopefully, in the next few years, there will be better techniques to explain those more complex models as well.
Tune in for the final part of this interview where Roberts and Dey speak about the effect of data quality as well as Entity Resolution in conjunction with machine learning.
Check out our white paper on Why Data Quality Is Essential for AI and Machine Learning Success.
Power BI Developer community July update
This blog post covers the latest updates for the Power BI Developer community. Don’t forget to check out the June blog post, if you haven’t done so already.
Here is the list of July updates for Embedded Analytics…
Embed capabilities
Programmatic control over the visual header
Developer tools
New Power BI developer center
Power BI Embedded in Azure
Power BI Embedded available in China Cloud
Embed capabilities
Programmatic control over the visual header
Visual headers contain the action bar of a visual, showing the user all the actions they can take, such as entering ‘Focus mode’, drilling down, or exporting data. In some use cases in your application, or at least for certain visuals, you might want to limit user access to those actions.
Previously, this could be achieved using Power BI Desktop, which allows you to hide or edit the visual header in reading view. Now we are releasing an API that allows you to hide the visual header programmatically.
For example, when embedding specific visuals, you can decide to hide or show the header according to the visual being displayed to the user. You can also decide that, for the same report, a specific group of users will have access to header actions, while others will only see the visual without a header.
Hiding or showing the visual header can be configured at any moment in the user’s session: either upon loading the report, or during the session through the ‘Update settings’ API. All the definitions made in Power BI Desktop will apply automatically to the visuals shown in your application, whether you choose to show or hide the header. We will add more support for configuring visual header functionality through the API in the future. Learn more about how to hide the visual header using the API.
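As a rough sketch, hiding headers through the JavaScript embed configuration looks something like the following. The report ID, embed URL, token, and visual name are placeholders, and the exact settings schema should be checked against the powerbi-client documentation:

```javascript
// Sketch of an embed configuration that hides visual headers by default,
// but shows the header for one specific visual via a selector.
// All identifiers below are placeholders, not real values.
const embedConfig = {
  type: 'report',
  id: '<report-id>',
  embedUrl: '<embed-url>',
  accessToken: '<access-token>',
  settings: {
    visualSettings: {
      visualHeaders: [
        { settings: { visible: false } },   // default: hide all headers
        {
          settings: { visible: true },      // ...except this one visual
          selector: { visualName: '<visual-name>' },
        },
      ],
    },
  },
};
// With the powerbi-client library you would pass this object to
// powerbi.embed(containerElement, embedConfig) when loading the report,
// or send the same settings later via the update-settings API during
// the session.
```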
Developer tools
New Power BI Developer Center
We’re thrilled to introduce the Power BI Developer Center—a one-stop shop for developers looking to customize or extend their Power BI solutions, create custom visuals, or embed Power BI in applications. You’ll find easy access to our comprehensive set of APIs and SDK libraries, and pointers for various developer scenarios, such as embedding visual analytics, creating custom visuals, and automating BI solutions. You’ll also find links to all the content and resources for developers, so you can get your questions answered.
Power BI Embedded in Azure
Power BI Embedded available in China Cloud
Power BI Embedded is now available in the Azure China cloud. Learn more about how to configure Power BI Embedded in the China cloud.
That’s our update for this month. If you are puzzled or run into any issues, be sure to check out our developer resources that can help you get by.
Accessibility: Adding high-contrast mode support
The Windows high-contrast setting makes text and apps easier to see by using more distinct colors. Read more about high-contrast support in Power BI.
Now you can add high-contrast support to your visual as well.
Check this commit to learn how high contrast was implemented in the sample bar chart.
Find more information on adding high-contrast support here. The new API is released with v1.13.1.
New source code of our visuals now available on GitHub
We have recently released the source code of the following visuals on GitHub:
Check out the source code at: https://github.com/Microsoft/PowerBI-visuals-HeatStreams
Check out the source code at: https://github.com/Microsoft/PowerBI-visuals-CardBrowser
Check out the source code at: https://github.com/Microsoft/PowerBI-visuals-TextFilter
Check out the source code at: https://github.com/Microsoft/PowerBI-visuals-TimelineStoryteller
You can freely access the code on GitHub, modify it, enhance it, and create a new custom visual of your own!
Looking forward to your contribution to the custom visual community!
As always, feel free to use all the communication channels at your disposal to connect with our team, share your thoughts, and ask questions.
That’s all for this post. We hope you found it useful. Please continue sending us your feedback; it’s very important to us. Have an amazing feature in mind? Please share it or vote in our Power BI Embedded Ideas forum or our Custom Visuals Ideas forum.
International Operations: Out of Sight, Out of Mind?

Posted by David Hope, Client Success Specialist, EMEA
When a company first sets up shop in a new country, it will often rely on third-party advisors from a local accountancy firm to get things up and running: filing the documents required for legal set-up, keeping a basic set of accounts for the new office, providing some basic reporting.
There’s nothing intrinsically wrong with that approach – in fact, it’s often the best way to navigate an unfamiliar market, with the guidance of people comfortable with its specific tax and reporting requirements as well as business culture.
But problems can quickly arise when a company sticks with that approach for too long, so that an office that has grown to represent a substantial chunk of its business is still being overseen by external helpers, typically using their own software.
This can lead to an alarming lack of visibility and oversight at a corporate level with the international office both out of sight and out of mind. And regardless of how trustworthy the company’s local advisors may be, that’s at best unhelpful and at worst distinctly dangerous.
From the finance director’s perspective, it’s going to be extremely difficult to roll up data into consolidated financial reports. Routine accounting tasks, such as period close and audits, are likely to take longer and consume more resources. And the reports that the corporate finance team provides to company leaders aren’t likely to offer a truly complete view of operations if it’s a struggle to extract the right data out of overseas offices.
In fact, without accurate information on local budgets, costs, forecasts, supplies, labour and projects, it’s nigh-on impossible to know whether the opening of an international office has been a success or failure. Nor has corporate leadership much chance of making sound decisions about the types of opportunities that employees in that region might be profitably chasing, in order to feed the organisation’s wider high-growth strategy.
A better approach might be to still hire those third-party advisors but get them up and running on the company’s global, cloud-based ERP system from Day One. And if the company doesn’t have a global, cloud-based ERP system, it may need to think carefully about whether it’s truly ready for international expansion. Then, as the company grows, newly recruited internal hires can take over tasks previously performed by the local firm.
In this way, all information is handled from the start in the same consistent, shared format. It can be accessed from anywhere in the world, giving managers the benefit of real-time insight into company-wide performance. Reporting and analysis can be provided on a global basis, with reports and dashboards presenting up-to-the-minute views of how well the company is responding to challenges and opportunities, both at home and abroad.
The global ERP system still needs to handle the specific local requirements of the international office through local customisations: currency, language, tax regimes, legal frameworks. But at the same time, it can bring that international office ‘into the fold’, making it an integrated part of the wider business, rather than a far-flung outpost that’s poorly understood back at headquarters.
International expansion is difficult. Even at companies that handle it well, intentions and instructions can get lost in translation. But when employees in different locations work from the same system, using the same information, the likelihood of misunderstandings is reduced, cultural and language barriers can be lowered and everyone can be working on the same page.
Learn how Deliveroo expanded from 4 to 12 countries.
Does Amazon need an Alexa smartphone to continue dominating the voice-first economy?

Video: Amazon Prime Day 2018: A look at the numbers
Right on the heels of Prime Day last week came Amazon’s second quarter earnings report, which the financial analyst types lauded as a blowout… even if they came up a bit light on expected revenues. And those numbers didn’t even include the estimated $4 billion in Prime Day sales.
Read also: Big Bezos: Amazon Alexa controls my entire freakin’ house
Speaking in terms of billions, the official earnings announcement included about a billion bullet-pointed highlights, with a number of them focused on Alexa, including:
- The Alexa Skills store now offers more than 45,000 skills created by third-party developers.
- Alexa is available on even more products via the Alexa Voice Service, including Windows 10 PCs from Acer, HP, and Lenovo, and select vehicles from BMW, Ford, and Toyota.
- There are new ways to navigate and control video content with Alexa, including integrations with TiVo, Dish, Netflix, and DirecTV.
- The Alexa Fund invested in new companies, including Tact.ai, and kicked off the second round of the Alexa Accelerator powered by Techstars, a program empowering entrepreneurs focused on innovating voice technology.
- Amazon added new Alexa capabilities, including calendaring features, such as the ability to move meetings via voice, and new information on current events like the Royal Wedding, World Cup, and NBA Playoffs.
Believe me, these are just a few of the Alexa/Echo highlights mentioned in the release. And maybe these points were there to back up the quote from CEO Jeff Bezos included in the release:
“We want customers to be able to use Alexa wherever they are”.
In fact, the only quotes in the release attributed to Bezos focus solely on Alexa. And the Alexa “wherever they are” message resonated with many people and led to speculation that maybe Amazon is going back to the drawing board to take another stab at selling their own smartphones. Then, that speculation was heightened after Jen Salke, head of Amazon Studios, recently told a reporter she was given a prototype phone that had an improved user interface for viewing Prime Video.
Later, an Amazon spokesperson walked back that statement from Salke, saying she misspoke and meant to say it was a prototype user interface on an existing cell phone model. But the phone speculation has already left the barn, and it leads to an interesting question: even though Amazon is still the pacesetter in the voice-first device space, does it need its own smartphones to keep its lofty position and hold off the likes of Google, Apple, and others?
Read also: Alexa smartphone: Amazon’s next strike in the mobile IoT war?
It’s been pretty amazing to see the smart speaker market go from “what is it” to “how many should I get” in just over three years. And Alexa has become the poster child (or better yet, poster voice) for the early days of the voice-first movement. But it has done so without having a smartphone that is sold with the Alexa app natively included on it. And while Amazon is selling more Echo devices than ever before, with plenty of green pasture ahead as just a fraction of the addressable market already owns at least one smart speaker, there are literally billions of smartphone owners out there — with Google and Apple having already planted their assistants on them. This sets up a more competitive next phase on the road to sustained dominance, as the voice-first market matures and “Alexa wherever they are” is realized (especially with Google combining its established position on mobile phones with its growing competitiveness in the smart speaker race).
So, getting back to the question at hand, given the goal Bezos has for Alexa, does Amazon need to have its own smartphone to keep Alexa ahead of the competition? My guess, and that’s all it is, is that it doesn’t. Now, that’s not to say that it wouldn’t help Amazon tremendously on its way to having “an Alexa in every pot.” If it could pull it off, it would be game-changing… or would that be more of a game-keeper in the current context. But I think it would be extremely difficult at this juncture to pull that one off. Smartphones are a saturated market, and I think it would take more than Alexa to get people to move away from the phones they already own and use. And it may be just as hard to get people to download the Alexa app onto phones it doesn’t come natively with, but it would be a less expensive pathway to “wherever.”
Maybe Amazon continuing to look at other kinds of devices that aren’t in saturated categories is the way to go. I still think a device that turns your car into an Alexa device may be worth exploring… or acquiring. I’m digging my new Roav Viva device, which Alexa-enables my car and requires you to put the Alexa app on your phone if it wasn’t already there. And most other devices that are Alexa-enabled do the same.
Read also: Does Amazon need a smartphone to win the smart home battle?
I don’t think there has to be one answer of either an “Alexa phone” directly coming from Amazon or more devices that need you to install the Alexa app on your current phone. Either could work, depending on the circumstances and expectations. But I do think that, in order for Alexa to be where Bezos wants her to be, it will become harder to achieve the dominance Amazon had in the early days, as the voice-first market moves into a different phase of the maturity cycle. But, with all that said, would anybody be that surprised if it were able to pull it off?
PREVIOUS AND RELATED COVERAGE:
“Alexa, fart,” plus 15 other useful Echo tricks and tips
What? If you had a multi-billion dollar, state-of-the-art, cloud-based artificial intelligence, wouldn’t you want to see if you could get it to fart? It’s who we are… priorities, people! Priorities.
Alexa is coming to your hotel room
Forgot your toothbrush? Just use Alexa for Hospitality.
7 Alexa commands you’re not using but should – CNET
With Alexa and the Echo, Amazon has built a robust service that can deliver seemingly endless information and control your home.
Amazon Alexa: Cheat sheet – TechRepublic
Amazon Alexa is the leading digital assistant on the market.
Serious News Story

Here’s the most serious news story you’ll read all week. Mia Khalifa (a former porn star and person I’d never heard of until today) is going to require surgery on her left boob after she was struck by a puck in a hockey game.
The American-Lebanese babe told the Daily Star: “I was sitting behind the glass during a game, and it came shooting over the glass and it caught me so off guard and I had no idea it was coming.
“I grabbed my chest and I didn’t want to let go, because I felt like if I did let go blood was going to be everywhere.
“I got to take it home, it was the single greatest souvenir any hockey fan can get: a game-used puck that comes at you and hits you.
“They’re really heavy, it’s pure rubber, they go at about 80 mph. My left breast is slightly deflated now and I will be getting it fixed next year.”
You’re welcome for this truly noteworthy piece of news.
Fitting the CRM and ERP Puzzle Pieces Together in Dynamics 365

Microsoft Dynamics 365 has rapidly grown into a family of applications. So how do we structure these applications to meet specific business needs and work with each other? There are many approaches to integrated solutions, and creating a game plan requires strategic thinking and preparation.
The Customer Relationship Management (CRM) application has been “flattened out” into multiple solutions that can be licensed and purchased individually – each bearing a new name under the Dynamics 365 flag. To be fair, many continue to call the application CRM, although the application by that name no longer exists. Many also call it D365, but that, too, is a misnomer, as D365 refers to all 10 applications under this banner.
Microsoft has built these applications on a common data platform, using a toolset designed to take advantage of Microsoft’s Common Data Service (CDS). Like pieces of a puzzle, this allows them to integrate cleanly with Dynamics 365 for Finance and Operations through pre-defined integration points. Dynamics 365 for Customer Engagement, Sales, Field Service (FS), Project Service Automation (PSA), and Marketing are technically one back-end application with multiple modular “apps” built on top, and all fit seamlessly with Finance and Operations.
So how does CRM fit into an integrated ERP application like Dynamics 365 for Finance and Operations (FO)? To answer this question, we must first consider what the application is natively, and where it belongs when integrated with purpose-built solutions.
CRM is natively:
- The core of CRM is still about relationships with people and companies. If a business needs to manage any information about contacts or organizations, then CRM has a place within integrated ERP. Broadly speaking, this can include relationships to products, contracts, service tickets, sales cycles, social media insights, and human behavior. Creating a centralized record for a contact or an account which includes all information about how that entity has interacted with your business gives profound insight. With the ability to create custom tables/entities in CRM, the application can map to virtually any information a company wants to relate to people.
- CRM is also about managing people in your business and their activities. The configurability of the application is designed so that business processes, business rules, and workflow automation can optimize employees’ correspondence with people and companies. Tasks, appointments, phone calls, service tickets, emails, and custom activities can all be managed within the application by employees or groups of employees.
- CRM is integrated into the Microsoft stack. The application natively integrates with:
- SharePoint and OneDrive for document management.
- Exchange for email and activity tracking.
- Excel online for spreadsheet management.
- Office 365 for group and administrative management.
- Power BI for infographic business intelligence reporting and dashboard management.
- CRM is a solution with a highly productized marketplace. There are hundreds of proprietary solutions that have been built on top of the application to accommodate niche business needs. For example, at PowerObjects we have dozens of PowerPack add-ons that enhance the productivity and marketing automation capabilities of CRM.
CRM should be integrated in these situations:
- CRM is not a transactional database integrated with financials and accounting systems. General ledger and sub-ledger accounting are best utilized within Dynamics 365 for Finance and Operations. CRM has invoicing, contract, and quoting capabilities, but functionality ends where the dollars turn into cash in your bank account. CRM monitors sales cycles and pipeline management, but it is not designed to be an accounts receivable, payable, or collections system.
- CRM is not a warehouse and distribution management system. If you are shipping and receiving supplies, dealing with inventory metrics, or facilitating transportation from warehouse to brick-and-mortar store, then use the D365 for Finance and Operations Trade and Logistics capabilities, Warehouse Management, and Transportation modules, and integrate with CRM for customer insights. CRM is designed to show what products and services customers have purchased or might purchase, not how the product was shipped, manufactured, or inventoried.
- CRM isn’t a payroll or benefits tool, although the structure for resource management is available. D365 for Field Service and Project Service Automation deal with how resources are assigned to services and tasks, and can be used to tally up hours and estimates for employee optimization. Nevertheless, you need to integrate D365 FS or PSA with D365 FO’s Payroll module to push out paychecks and fringe benefits payments from your bank account.
- D365 PSA is not a project forecasting or procurement system. D365 PSA can certainly create project contracts, assign resources, and plan task and resource assignments. However, D365 FO’s Project Management and Accounting module is far more robust. It allows you to:
- Assign project resources with its integrations to the Human Resources module and search employee skillsets and certifications.
- Procure project assets, item requirements, and expenses using its integration with the Procurement and Travel & Expense modules.
- Capture resource time entry and integrate it within the Project Invoicing process and Payroll module.
- Forecast project completion, planning, and budgeting.
In summary, combining the power and capabilities of the Dynamics 365 family of applications can take an organization to the next level of technology, efficiency, and productivity. But you must lay out the proper strategy and roadmap to take full advantage of these applications, assess their functionalities, and leverage the strengths of each application to work in harmony.
You can learn more about Microsoft Dynamics 365 on our website here. Looking for more posts about Dynamics 365? Be sure to subscribe to our blog!
Happy Dynamics 365’ing!
4 Trends Driving Growth in Cloud Data Lakes
Here at Sisense, we provide analytics for a wide variety of use cases and data sources. This gives us a unique vantage point for identifying trends in the database market, based on customer requests and usage patterns. The most notable trend we currently see is the massive growth of cloud-based data lakes like Amazon S3, Snowflake, and Google BigQuery.
Over the past business quarter alone, customer requests for connection to cloud data lakes doubled – some specific sources even tripling in growth. Compare that to only 20% growth of the dominant cloud analytical database, Amazon Redshift.
A data lake is simply a place for accumulating a ton of data, usually in a semi or unstructured form. The important differentiators of modern cloud data lakes include:
- Infrastructure and cost separation for storage and compute
- Storage of any data type
- Infrastructure abstraction
- Limitless scalability
What’s driving this growth in these technologies? It comes down to Big Data, ease of use, flexible pricing, and an increased supply of quality services.
The Big Growth Drivers
1. Big Data is real, and it’s democratized.
The cloud is creating lots of data, and the cloud is needed to analyze it. Looking back at the past decade, only the world’s largest companies had enough data to warrant a lake. These large businesses could also afford the high personnel and infrastructure costs that were required to set them up on technologies like Hadoop. Many service providers have launched to help manage these complex deployments, but the investment and manpower required has always been daunting.
Today, smaller businesses are also generating petabytes of information, often from web traffic or user data generated in the cloud. User data from a cloud product sold by a mid-size business will often surpass tens of millions of records created a day, scoped only by the granularity of tracked events. These companies understand the value of that data and need affordable ways of capturing it, creating demand for easier tools with scalable pricing models.
2. Cloud makes management easy.
“Data lake” has always been synonymous with Hadoop. While Hadoop is open source, a full deployment can become a multi-million-dollar project once you account for the required infrastructure, developers, consultants, and time spent on set-up and maintenance.
In the past few years, new cloud options have come online that offload the maintenance effort to the infrastructure provider, making it much more affordable for smaller companies. Amazon S3 has been leveraged for a large variety of storage use cases but is increasingly being used as an analytical data lake because of its ease of management and new SQL interfaces. You can store anything in S3, and AWS will take care of auto-scaling, encryption and many other utilities.
3. Pricing that makes sense for agile businesses.
Cloud data lakes have pricing models that allow businesses to get started cheaply. Many offer per-query pricing that offsets the need for a large upfront investment. Amazon Athena and Spectrum, which are used for querying S3 data, both cost $5 per TB scanned.
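At that rate, a query’s cost is simple to estimate up front. A minimal sketch, assuming the $5-per-TB-scanned price above (real bills also depend on region and per-query minimums):

```javascript
// Back-of-envelope cost of a single Athena/Spectrum query, billed per TB scanned.
const PRICE_PER_TB_USD = 5.0;

function queryCostUsd(gbScanned) {
  return (gbScanned / 1024) * PRICE_PER_TB_USD; // 1 TB = 1024 GB
}

// A full scan of a 200 GB dataset costs under a dollar; storing data in a
// columnar format like Parquet lets the engine scan only the columns a query
// needs, shrinking gbScanned (and the bill) further.
```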
Also from a pricing perspective, “separating storage and compute” fits more agile methodologies. With data storage very cheap and most of the cost coming from querying, it makes more sense for businesses to first build the pipes for collecting all potentially useful data, then later make smart decisions about what’s actually required based on ad-hoc needs.
Compare that style of operating to being forced to choose whether to collect certain data upfront. An analyst can query on previously collected data in hours or minutes. Adjusting the data collection flow, on the other hand, can be a long process involving multiple solutions and engineering resources.
The downside of this model is that users will need to think about cost control when ad-hoc querying transitions into more persistent tasks, as with all serverless computing.
4. More accessibility and choice.
Maybe most importantly, over the past couple of years, new ways of querying data lakes like S3 have emerged, using SQL tools that interface nicely with analytics applications. These include:
- AWS offerings like Athena and Spectrum
- Open source tools like Apache Drill, Presto, and Hive
- Integrated solutions like Snowflake and BigQuery, with native compute layers
With the entrance of so many new standard SQL solutions, users can optimize for what’s most important to them among ease of management, performance, scalability, and price. Businesses will need to weigh those tradeoffs when they choose their solutions.
While the entire cloud data market will grow, cloud data lakes will take a larger portion of market share from relational databases in analytical use cases.
Beyond that, as needs become more diverse and advanced, we should expect more specialization in organizations’ data pipelines, as teams adopt different products to meet each critical requirement. Certain data flows will demand relational or tuple-based databases for high-speed retrieval of complete records, while others will require columnar engines for bulk analytical queries. This underscores the value of starting in the cloud with a trusted infrastructure provider that makes a broad suite of interoperable tools available.
3 Features to Look for in the Best CRMs for Nonprofits

Managing relationships with members and other constituents is complex. Nuances abound. You require systems and software that account for these distinctions.
THE NONPROFIT DIFFERENCE
When it comes to constituent relationship management (CRM), choose a solution built with nonprofits in mind. Look for these three features:
Relationships are prioritized. The customer is the biggest difference between your mission-driven organization and the profit-obsessed company down the road. Your diverse base of members and influencers fuel your work. Your CRM must create campaigns that are personalized, contextual, and content-rich to ensure you are speaking the language of your people.
Information is maximized. Data tells you about the people you serve and those you’d like to reach. Use the information at your fingertips to create stakeholder personas and promote your mission more effectively. Empower your teams with intelligent analytics to predict member needs—and meet them with customized attention that builds loyalty while growing revenue.
Processes are streamlined. Test the CRM before you purchase it. Try a demo, or ask for a free trial run. Ensure the software integrates seamlessly with your existing financial management systems, allowing you to track all member correspondence, marketing, and events in one place.
Vendor considerations
When you start shopping for a CRM, you will quickly realize that the vendor space is crowded. Use the above list to weed out the B2B suppliers, and vet the remaining pool with these insights:
- Look for a vendor that is committed to the nonprofit space and will still be around in 10 years and beyond. There is no shortage of start-up software suppliers. Small- and medium-sized nonprofits will do well to align with a longstanding vendor that has a proven and reputable track record.
- Many nonprofits don’t care about being in the IT game; rather, they want to spend their time and energy on the things that are near and dear to their mission and core expertise. If this is you, choose a solution built and run completely in the Cloud – the vendor will take care of solution management and support.
- Be clear about the need you wish to solve with a CRM, whether it’s tracking member behavior or increasing retention. You don’t need a CRM platform with extra bells and whistles that you won’t use. You do, however, need a solution capable of integrating with and optimizing the existing accounting and marketing systems within your agency.
As you set off on your nonprofit CRM search, keep these features and vendor considerations in mind.
Digitalist Magazine’s Top Picks [July 30, 2018]