Business Intelligence Info

Tag Archives: Building

AI Weekly: Tech, power, and building the Biden administration

November 14, 2020   Big Data

After the defeat of Donald Trump, there was little time between Joe Biden and Kamala Harris’ celebratory speeches and the start of conversations about transition team members and key administration appointments.

Some of the first names to emerge include people with tech backgrounds, like former Google CEO Eric Schmidt, who may be tapped to lead a tech industry panel in the White House. Since leaving Google, Schmidt has extended his services to the Pentagon, especially around machine learning: he headed the Defense Innovation Board at the Pentagon and the National Security Commission on AI (NSCAI), a group advising Congress that more federal spending is needed to compete with China. NSCAI commissioners have so far recommended measures like the creation of a government-run AI university and increased public-private partnership in the semiconductor industry.

Hearing names like Schmidt’s raised questions about how close the administration will be to Big Tech at a time when tech companies are gaining reputations as the next Big Tobacco. Unlike when Biden first entered the White House in 2009, a number of sources today say Big Tech’s concentration of power is an accelerant of inequality.

A Department of Justice antitrust lawsuit against Google and a congressional committee investigation both found that Big Tech companies enjoy an edge based on compute, machine learning, access to large amounts of personal data, and wealth. The congressional report also concludes that Big Tech poses a threat not only to a competitive free market economy but also to democracy.

A paper covered by VentureBeat this week found that a compute divide is driving inequality in AI research, concentrating power and giving an advantage to universities and Big Tech companies in the age of deep learning. The Biden campaign platform committed to increasing federal research and development spending in areas like AI and 5G to as much as $300 billion, spending that could help address that inequality as well as fund projects identified by groups like the NSCAI.

The Obama-Biden administration developed a reputation for bringing new concepts into the White House, like the appointment of a chief technology officer and chief data scientist, support for open access to data, and championing public service by people with tech skills, but that all seems like a long time ago.

Speaking to changing attitudes since then, Tim Wu, who testified as part of the congressional antitrust investigation into Big Tech, told the Financial Times: “There has been a shift since the Obama administration, even among the people working in that administration, in the way they think about power in the tech world.”

Despite those changes, work to build civic tech that improves lives remains undone, said Nicole Wong, who served as deputy White House CTO. She entered the role shortly after the Edward Snowden leaks went public and was responsible for privacy, internet, and innovation policy. She was also part of legal teams at Google and Twitter. Wong is now serving on a Biden review team for the National Security Council, according to Reuters.

In a speech delivered about a year ago at the Aspen Tech Policy Hub in San Francisco, she said the government has outdated and inefficient tech and that there’s a pipeline problem for people with tech skills who want to apply their talent to public service. Wong said she still believes that the government can make technology that improves human lives and that it’s important that it do so. Modernizing outdated government tech isn’t moonshot technology, she said, but public trust is at its lowest level since the 1970s, a trend that started before Trump came into office. That pipeline issue is important because the decline in public trust is due in part to a failure to deliver for the people.

“That’s why the non-glamorous work of modernizing a 70-year-old system matters just as much or more than perfecting a self-driving car, or putting a person on Mars,” she said. “If we can order a gluten-free chocolate cake on our mobile phone while sitting in our living room and have it delivered in an hour, then we should be able to help a single mother get food stamps without having to take a day off work and fill out paperwork and stand in line at a limited-hours government office. We should be able to get benefits to our veterans who fought for our country and the world that makes this tech possible.”

Some believe Biden plans to take on Big Tech companies like Facebook. Gene Kimmelman, who testified in favor of antitrust reform last year, will be part of the Department of Justice review team, for example. Others have concluded that initial appointments signal the opposite.

If you’re interested in the particulars of some of these tech connections, Protocol made an interactive graph that shows ties between acquaintances, family, and current and former employers. Who the Biden administration chooses may reflect its priorities and the diverse coalition that delivered the Biden ticket to office, and may inspire people to the kind of public service that Wong talked about, in order to solve moonshot problems and improve people’s lives. The choices can also reflect the shift in attitudes about Big Tech and power that Wu mentioned. The involvement of people like chief of staff Ron Klain suggests that the administration will at least believe in science.

In the days and weeks ahead, we will learn more about what the Biden cabinet and heads of federal agencies will look like. Building the Biden administration will have to take a lot of factors into account, from short-term problems like a global pandemic and the urgent need for U.S. economic recovery to longer-term issues like the decline in public trust in government, the concentration of power by Big Tech, the continuing decline of democracy in our time, and the increase of surveillance and autocratic rule at a time of accelerating deployments of AI in business and government.

For AI coverage, send news tips to Khari Johnson, Kyle Wiggers, and Seth Colaner — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI Channel.

Thanks for reading,

Khari Johnson

Senior AI Staff Writer





Big Data – VentureBeat


Building Bridges — Best Practices for BI Teams Working with Data Teams

September 17, 2020   Sisense

Implementing analytics at your company is a multi-team job. In Building Bridges, we focus on helping end-users, app builders, and data experts select and roll out analytics platforms easily and efficiently.

Selecting and implementing a new BI and analytics platform is a big decision and can be a vital part of an organization’s digital transformation. Rolling out a new platform involves everyone who’ll implement, maintain, and most heavily use such a platform.

Advanced analytics and BI democratize access to data, empowering more business users to develop insights, with less reliance on data professionals who have previously been gatekeepers of this information. As business teams become more involved with data in their day-to-day work, it’s natural that they should play a role in choosing the right platform and determining how it will benefit their organization.

Making these decisions — which platform to choose and how to put it into operation — requires buy-in from both the analytics and BI team (the probable end-users/frontline users) and the data team (who will prepare the data, build the models, and connect datasets).

Importantly, to make the best decision for your organization, each team must understand, acknowledge, and address the needs and concerns of the other.


Working together: Understanding priorities

Mutual understanding can only come about via dialogue between teams about the priorities and needs that their BI and analytics platform should meet. It’s important that the analytics and BI team clearly indicate their needs and that the data team understand what the BI platform will be used for and how they can build the right data model(s) to suit the analytics and BI team’s requirements.

To help achieve this, let’s look at some considerations that data teams and analytics & BI teams should discuss in the vital conversation about selecting and implementing a new platform, so that they both can get the most out of the process.

The five big questions

First, when considering a BI and analytics platform for your organization, there are five big questions that everyone should ask, irrespective of their function. A simple yes or no answer in each case will help you determine some fundamental requirements that a platform should fulfil. These five big questions are:

The conversation

Armed with the answers to the five big questions, you’re then in a position to drill down into some details about what a BI and analytics platform should do for you. This is where the really interesting conversation starts, as you establish where your requirements, questions, and concerns converge and diverge. We envisage that the conversation could go something like this:


Usability

BI & Analytics Team: Can we find a BI and analytics platform that’s really user-friendly? Can you involve us in choosing something that offers self-service analytics, so we don’t have to hassle you all the time to help us crunch complex data? We’d really like to be part of the selection process.

Data Team: Sure. Let’s work together to find a platform that can meet your needs and ours. From our perspective, we need to be confident that a platform is robust and that it can handle on-premises and cloud data from any source.

While the data team is concerned with storing, connecting, and preparing data for analysis, the BI and analytics team is concerned with examining the data and creating relationships and comparisons between datasets in order to surface insights and visualize the data. The former are data experts; the latter are more focused on the business needs, and usability is a priority for them.

“When we were using a different BI platform, I wouldn’t let frontline business users touch it,” says Jennah Crotts, data analytics manager at Jukin Media. “They’d click one thing, everything would break, and I’d have to rebuild it from scratch. Now, with Sisense, if somebody is up to speed on their data and has gone through some basic training, I can copy a dashboard and give them ownership.”

Governance

BI & Analytics Team: It’s really important that we make our data as accessible as possible to as many users as we can, without unnecessary impediments. Can we make this happen?

Data Team: Certainly, but let’s not forget governance too. It’s also important for us to be able to control access to our data and ensure that proper policies are in place.

Data management, including security, is a priority for the data team. Accessibility and the democratization of data are key focuses for the BI and analytics team. To maximize the utility of an analytics platform and the value of data to an organization, data for decision-making should be as accessible and as easy to understand for as many end-users as possible.

Trust and accuracy

BI & Analytics Team: For data, and ultimately insights, to be as accessible as possible, we need analysis and visualizations to be as simple as possible to find and use. Can we ensure that this is possible?

Data Team: We want to make this happen for you, while at the same time ensuring that data isn’t inadvertently mishandled and that problems don’t arise when setting up new KPIs or dashboards. We need to protect the data, the methods of accessing it, and the ways it can be analyzed and visualized, while making it as accessible as possible.

Both teams are concerned about the quality of the data they’re handling. The BI and analytics team, and end users, want to be sure that they’re receiving and analyzing the most accurate data possible. However, it’s the data team’s job to manage and prepare the data and optimize its accuracy by ensuring it’s “clean” (free of errors and duplications) and not available to unauthorized users. The ability to set access programmatically, by group, team, department, or individual, is an important feature that data teams will look for in a BI platform.

“The benefit to using a BI system,” says Jennah, “is that even if you have all of your data mapped beautifully: Can you make sense of it? Can you notice trends? Can you identify what’s going on? Having the right BI tool and visualizations can allow you to do a lot of things.”

Scalability, agility, and capacity

BI & Analytics Team: It’s really important that we get the most value we can from a BI and analytics platform. So it shouldn’t just be suitable for now. It must be able to scale with the growth of our business. And scaling up shouldn’t be costly. Positive ROI is essential.

Data Team: Agreed. The platform we choose must be scalable, and agile, to respond to the growth of our business, the increase in the volume of data we’re handling, and changes in our market that require us to pivot. It needs to have the capacity and capability to grow with us.

In the interests of the business, the BI and analytics team seeks to maximize ROI, which is why it wants a BI and analytics platform that is future-proofed and can scale up with the growth of the organization and the data it generates. The data team is squarely focused on the capability of the platform to scale up and its capacity to handle increased volumes of data.

“To test the agility of the platform, I was able to connect the technical person at Sisense with our guy in engineering,” Jennah says. “This was helpful because if you try to reword what your tech guy is saying, it just won’t make sense. So when the right technical people can speak directly to each other, it helps the flow of the setup enormously.”


Cost-effectiveness

BI & Analytics Team: On the subject of ROI, a BI and analytics platform has to be cost-effective. It needs to save users time and resources while improving performance, and decision-making. We need to be sure we can achieve these aims, so we should conduct a proof of concept before making a final decision.

Data Team: Indeed. We appreciate that we can’t implement a complex platform just for complexity’s sake. It has to serve our business, our needs, and our objectives. And it has to address the particular challenges that any business has. A POC is essential.

When it comes to cost-effectiveness, the teams align. Both need to ensure that their choice of BI and analytics platform can deliver what it promises, what each of them requires and the maximum benefits to their organization, with no hidden or unexpected costs.

Versatility

BI & Analytics team: Please keep people — the users — in mind. We really need to put end-users first rather than focusing mainly on IT needs. Any platform that we choose must be able to solve a range of different data-related issues, each of which affects different people in different functions within the business.

Data team: We actually think that our objectives meet here. Our platform of choice must be able to handle the widest variety of data from the widest variety of sources, precisely to maximize accessibility and usability. More and more data is unstructured (text, video, audio, graphical data and the like) and comes from non-technical sources. It requires technology like NLP to deliver usable findings for all. We want to reduce the time it takes to clean, integrate, and maintain data.

So, for instance, the C-Level wants to see high-level performance reports across departments. Mid-level executives want to make better decisions in their line of business.  All data, and all types of data, must be accessible to all, subject to governance and policies. This necessitates that a BI and analytics platform be as versatile as possible.

“We’re starting to get more and more positive feedback from our C-level executives,” Jennah reports. “They’re saying things like, ‘This is a lot cleaner looking.’ ‘You were able to make those changes a lot quicker than I’m used to.’ ‘You’re able to get this to me faster.’ and stuff like that.”

Speed

BI & Analytics team: We’re missing opportunities when we can’t get insights quickly. We need data and visualizations in real time so we can make decisions and pivot fast.

Data team: We get it. Speed and ease of use are critical, but so is thoroughness. We must be sure that we choose a platform that handles and manages all our data properly so you can be confident that you’re getting the best and most accurate insights.

In a world where data is generated by the second, quickly getting your BI and analytics up and running could give you a competitive advantage. More importantly, making data-driven real-time decisions can mean the difference between success and failure, so from a business perspective, speed and ease of use are imperative. They should not be impeded by the need for accurate data management and preparation, which advanced BI and analytics should be capable of achieving equally quickly.

“The head of our content acquisition team, a major data consumer, was unsure about our switch. She was worried about transition costs and maintaining access to her particular data requirements,” Jennah says. “But when I showed her the first Sisense dashboard on a remote screen-share and was able to edit the dashboard to her specifications in real-time, she lost her mind over the speed and ease of use of the new platform.”

Implementation

BI & Analytics Team: How are we going to implement a new platform? We want results quickly so we can make decisions fast and get changes under way.

Data Team: Let’s not rush this. We’ll do our due diligence with a POC and ensure the right platform meets our needs, then consider how best to implement a new platform within the organization. If we get it right the first time, it’ll be easier, quicker, and more cost-effective in the long term. Tell us what you need and that will dictate timing.

Implementation can happen incrementally by department, company-wide in one sweep, or limited to certain levels within an organization. It’s also important to decide whether to include embedded analytics and white-labeling for new revenue opportunities. All of these considerations will influence how quickly a new platform can be implemented.

“There’s been solid social interaction with the right kind of people who understand what they’re doing. Our sales guy did his homework and came back and hit every single request with full beautiful explanations,” says Jennah. “That allowed me to go to leadership and say they’re definitely getting a POC because they hit all the boxes. Because when the VP of engineering gets on the phone and starts asking the questions, they need to be able to answer them.”

Cloud, on-premises, or hybrid?

BI & Analytics Team: We need to have the fastest, most flexible, and most scalable access to data of any kind and in any form, to benefit as much as possible from the information we generate and handle. The Cloud has the capacity and flexibility that legacy on-premises solutions lack. Plus, cloud solutions include support, maintenance, and upgrades, which is cost-effective.

Data Team: Sure. We just need to consider whether we put all of our BI and analytics on the cloud, run it on-premises, or have a hybrid of the two. And how do we handle legacy systems on-premises? Is it more cost-effective and efficient to migrate them to the cloud, or to maintain them on-premises alongside cloud analytics for newer data?

This is a key consideration when seeking to maximize the capabilities of BI and analytics and to future-proof them. The future of data and analytics is in the cloud. Its capacity is almost unlimited, and you only pay for the capacity and processing power you need. There’s no need to invest in your own IT servers or additional data center facilities, or to hire a team to manage and maintain the application; the cloud services vendor will do all of that for you. And implementation is immediate, with no lead time needed for ordering, installation, or deployment.

“I was really interested in a cloud-based interface,” Jennah reports. “I didn’t want a situation I had with another platform where I was constantly downloading and uploading data to communicate with internal business users. Other platforms that are so reliant on Internet speed won’t cut it in our current situation. It’s faster for me to build reports from scratch in Sisense than it would be to edit them in another platform.”

The first step to data-driven success

Choosing and implementing the right BI and analytics platform is a major project that can be hugely valuable for your organization. It will enable you to maximize the value you get from your data and will optimize the insights you can take from it. Plus, done right, it will empower many more of your colleagues to engage with data and make vital decisions. Asking these questions and having this conversation can be the first, essential step in establishing a successful, data-driven future that can supercharge your organization.


Adam Murray began his career in corporate communications and PR in London and New York before moving to Tel Aviv. He’s spent the last ten years working with tech companies like Amdocs, Gilat Satellite Systems, and Allot Communications. He holds a Ph.D. in English Literature. When he’s not spending time with his wife and son, he’s preoccupied with his beloved football team, Tottenham Hotspur.


Blog – Sisense


Can a Microsoft Dynamics 365 User Save Money by Building Their Own Reports?

September 15, 2020   CRM News and Info

Companies evaluating Microsoft Dynamics 365 Sales will often ask us, “Can Microsoft Dynamics 365 users build their own reports?”

The answer comes down to a couple of factors:

  • What kind of reports do you need?
  • What kind of relevant experience does your team have?

By building their own reports, a business could potentially save thousands of dollars. However, there is a lot to consider.

First things first: What kind of reports do I need?

This is often the main source of confusion: what one person considers a report may just be a list for another. When asking your CRM partner for a report, there may be some misunderstanding as to what constitutes a report.

Here are three common substitutions to consider instead of a full-blown report:

  1. CRM Views: Views in Dynamics 365 Sales are essentially filtered lists based on certain criteria put in place by the user. For many clients, this is enough on its own. Moreover, once a user learns how to create their own CRM views, it helps them become autonomous. This helps speed up the analysis process for that user or department.
  2. Dashboards: Dashboards display critical data that you need to review on a regular basis from your CRM (customer relationship management) tool. Potentially, much of the data you would need in a report can be viewed from a dashboard, which you can create yourself within Microsoft Dynamics 365 Sales. Key statistics can be displayed as graphics or lists depending on user preferences and requirements. Like Views, this is something that users can create themselves with some quick training, some understanding of the fields within the forms, and a vision of what they would like to review.
  3. Excel templates: As a final option, we would suggest looking into Excel templates in the CRM. This allows you to build your spreadsheets as you would like them to be presented and refresh the data as you need it. The most common application for this tool is to share information with your clients. What differentiates this from a report is the limitations in filtering and intelligence (e.g., display this information only if “x” exists). For some applications it makes more sense to create a report rather than an Excel template.

By using these tools to their full potential, you can probably limit the number of reports you need and save the extra costs associated with building them. The first step is then to ask yourself if you really need the information to be consumed as a report or if one of these tools can meet your needs.

In some cases, reports are necessary as there are limitations to what you can do in a view. If this is the case for you, you likely want to know if you can build these reports yourself and what the cost will be.

Next step: Can my team create their own reports in-house?

The short answer is: it depends. Creating your own reports depends on the competencies available internally. Ask yourself if anyone on your team has the SQL/FetchXML experience required to build reports.
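For a sense of what that experience involves: FetchXML is the XML-based query language that Dynamics 365 reports use to retrieve data. A query pulling the top open opportunities by estimated value might look like the sketch below (the entity and attribute names are standard Dynamics 365 Sales fields, but the filter and ordering choices are purely illustrative):

```xml
<fetch top="50">
  <entity name="opportunity">
    <attribute name="name" />
    <attribute name="estimatedvalue" />
    <!-- statecode 0 = open opportunities -->
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
    <order attribute="estimatedvalue" descending="true" />
  </entity>
</fetch>
```

If no one on the team is comfortable writing and maintaining queries like this, that is a strong signal to consider the alternatives discussed here.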

For those who want to create their own reports but don’t have the experience, it might be worth considering Microsoft Power BI instead. This business intelligence solution is more dynamic and simpler to use once the data set is created and in place.

You could improve your skills by taking classes and consulting a partner. But realistically, the cost of these could come out to about the same as having your reports developed by a Microsoft Dynamics partner.

Bottom line: Can I build my own reports or not?

The answer is not the same for everyone. Building your own reports may sound appealing, but unless you have the competencies required internally, you may have to look at alternatives, like Power BI, CRM views or dashboards, and Excel templates. It basically boils down to what kind of report you need and how much experience your team has.

Our suggestion would be to have an honest and open discussion with your Microsoft Dynamics partner to ask them what building a specific report entails. Don’t be afraid to ask questions to see if there is a better way of doing things. This way the partner will get a clearer idea of what you need, and the effort required to produce it.

Here at JOVACO Solutions, we have decades of experience working with organizations to help them get the information they need. Our consultants are available to figure out the best way for you to take advantage of the wealth of data stored in your system. Contact us today to tell us more about your challenges and objectives.

By JOVACO Solutions, Microsoft Dynamics 365 specialist in Quebec

About JOVACO Solutions

JOVACO Solutions is a leading ERP and CRM solution provider operating in Quebec for over 35 years. As a specialist in Microsoft Dynamics business management solutions, we offer a wide range of products and services to meet all the needs of professional services firms and project-based organizations. We also offer specialized project management tools and timesheet add-ons fully integrated with Microsoft Dynamics solutions. Visit our website or contact us for more information.


CRM Software Blog | Dynamics 365


Monitoring the Power Platform: Custom Connectors – Building an Application Insights Connector

June 28, 2020   Microsoft Dynamics CRM

Summary

 

Connectors are used throughout Power Platform pillars such as Microsoft Power Automate and Microsoft Power Apps, and in Azure services such as Azure Logic Apps. Connectors are wrappers around first- and third-party APIs that provide a way for services to talk to each other. They represent the glue between services, allowing users to set up connections that link various accounts together. These connectors encompass a wide range of SaaS providers, including Dynamics 365, Office 365, Dropbox, Salesforce, and more.

This article will demonstrate how to build a custom connector for use with Power Automate and Power Apps Canvas Apps. This custom connector will attempt to build a connection to Azure Application Insights to assist Makers with sending messages during run time. We will discuss building and deploying an Azure Function and how to construct a Custom Connector. Finally we will discuss testing and supplying run time data from Power Automate.

Overview of Azure Function

 

Azure Functions provide a great way to build microservices, including ones that help surface run-time data from Power Automate flows or model-driven application plug-in tracing. Azure Functions can be written in .NET Core, which includes native integration with Azure Application Insights. Alternatively, we can import the Azure Application Insights SDK for a streamlined approach to delivering messages. This article will focus on using the HTTP entry point and the Azure Application Insights SDK to deliver messages to Application Insights.
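The article’s own function is written in C# (shown in the screenshots below). As a language-neutral sketch of the same idea — an HTTP entry point that wraps an incoming message in the trace envelope Application Insights ingests and returns an object the caller can use for tracking — here is a minimal Python illustration. The envelope layout follows the publicly documented track-API trace (“MessageData”) shape; the instrumentation key is a placeholder, and a real function would POST the envelope to the ingestion endpoint rather than just returning it.

```python
import json
import uuid
from datetime import datetime, timezone

# Placeholder instrumentation key; a real function reads this from configuration.
INSTRUMENTATION_KEY = "00000000-0000-0000-0000-000000000000"

def build_trace_envelope(message: str, severity: int = 1) -> dict:
    """Wrap a message in the trace ("MessageData") envelope that
    Application Insights ingests via its track API."""
    return {
        "name": "Microsoft.ApplicationInsights.Message",
        "time": datetime.now(timezone.utc).isoformat(),
        "iKey": INSTRUMENTATION_KEY,
        "data": {
            "baseType": "MessageData",
            "baseData": {"message": message, "severityLevel": severity},
        },
    }

def handle_request(body: str) -> dict:
    """HTTP entry point: parse the caller's JSON payload, build the
    envelope, and return an object the caller can use to track the message."""
    payload = json.loads(body)
    envelope = build_trace_envelope(payload["message"], payload.get("severity", 1))
    # A real function would POST `envelope` to the ingestion endpoint here;
    # returning an operation id lets Power Automate correlate messages later.
    return {"operationId": str(uuid.uuid4()), "envelope": envelope}

result = handle_request('{"message": "Flow step completed", "severity": 2}')
print(result["envelope"]["data"]["baseData"]["message"])  # prints "Flow step completed"
```

In the C# version described in this article, the Application Insights SDK’s `TelemetryClient` takes care of building and sending this envelope for you; the sketch simply makes the payload shape visible.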

Overview of Custom Connectors

 

Custom connectors allow developers to supply custom actions and triggers that can be used by Microsoft Power Automate and Microsoft Power Apps. These connectors provide a reusable, no- or low-code approach to integrating with an Application Programming Interface, otherwise known as an API. The complexity of implementing a connection and calling the API is hidden from makers, allowing them to focus on finding a solution to whatever business objective is at hand.

The custom connector can be thought of as a “no cliffs” approach to empowering makers: if a connector doesn’t exist for your particular need, for instance an in-house API, a custom connector can be used to bridge the gap. No longer do we have to build and send HTTP requests or manage connection flows ourselves; the connector fills the gap.

For additional information, including considerations for Solution Aware Custom Connectors allowing for migration between environments, refer to the article Monitoring the Power Platform: Connectors, Connections and Data Loss Prevention Policies.

Overview of the Open API Specification

 

Open API is a specification built on the desire and need to standardize how we describe API endpoints. Built from Swagger, the Open API specification dictates how an API should be used: everything from security to required fields is detailed, allowing integrators to focus on developing rather than chasing down API specs. One great feature is the ability to design a specification first, without relying on code being written. This allows makers to define what they are looking for with custom connectors.

This specification is used by Power Platform Custom Connectors to build out the various triggers and actions that will be made available. This will be covered in more detail in the Building a Custom Connector section.

For more information regarding Open API please refer to this reference from Swagger.
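As an illustration, a minimal OpenAPI 2.0 (Swagger) skeleton for a connector like the one built in this article might start out as follows; the host, title, and description here are placeholders, not the values from the sample:

```json
{
  "swagger": "2.0",
  "info": {
    "title": "Application Insights Connector",
    "description": "Delivers messages to Azure Application Insights",
    "version": "1.0"
  },
  "host": "myfunctionapp.azurewebsites.net",
  "basePath": "/api",
  "schemes": [ "https" ],
  "consumes": [ "application/json" ],
  "produces": [ "application/json" ],
  "paths": {}
}
```

The security definitions and the individual actions under “paths” get filled in as the connector is built out.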

Building the Azure Function

 

The section below documents the steps I took to create the Azure Application Insights Azure Function. There are several ways to build it, frameworks to use, and additional requirements to adhere to that are not represented here. That said, these steps should allow developers to create a proof of concept they can learn from and build on.

Creating the Visual Studio Project and Gathering Dependencies

 

To build the Azure Function, I started with Visual Studio 2019 and created a new Azure Function project.


I chose a name and left everything else as the default values. From there I chose the HTTP trigger and the .NET Core v2 framework for my Azure Function.


Once the project loaded, I added the latest NuGet package for Azure Application Insights.


The goal of this Azure Function project is to avoid any additional dependencies and simply deliver messages to Azure Application Insights, returning an object that can help me track messages. For the full Azure Function code, please refer to the Samples folder within the MonitoringPowerPlatform GitHub repo, which includes all samples from the Monitoring the Power Platform series. For this sample, a direct link can be found here.
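As a rough sketch of the shape such a function can take (this is illustrative, not the exact code from the repo; it assumes the Microsoft.ApplicationInsights NuGet package and the Azure Functions v2 HTTP trigger bindings, and the payload fields mirror the sample shown later in this article):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json;

public static class TrackEventFunction
{
    // TelemetryConfiguration picks up the APPINSIGHTS_INSTRUMENTATIONKEY app setting.
    private static readonly TelemetryClient Telemetry =
        new TelemetryClient(TelemetryConfiguration.Active);

    [FunctionName("TrackEvent")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // Read and parse the incoming message.
        dynamic body = JsonConvert.DeserializeObject(
            await new StreamReader(req.Body).ReadToEndAsync());

        // Forward the event to Application Insights (customEvents table).
        Telemetry.TrackEvent((string)body.name);

        // Return an object the caller can use to correlate messages.
        return new OkObjectResult(new
        {
            correlationid = (string)body.correlationid,
            status = "queued"
        });
    }
}
```

The TelemetryClient sends the event to the customEvents table, while the returned object gives the calling flow something to correlate against.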


In the sample, I’ve embedded the custom connector definition described below, along with its image, within the solution file.


Testing and Deploying to Azure

 

For testing, I typically use Postman to build a collection of test requests. It’s a free application that is an industry standard for testing APIs. As noted, it’s also the tool the documentation uses for crafting a specification for a Power Platform custom connector.


Once you’ve tested and are ready to deploy, right-click the Azure Function project and choose Publish. For my example, I published using the Zip Deploy mechanism.


Building the Custom Connector

 

The custom connector can be built by hand or from an OpenAPI definition. For my connector, I defined and tested my Azure Function and deployed it using Visual Studio. Once it was running, I used Azure API Management to help define the specification for use with custom connectors. The documentation points to Postman as a primary tool for building the specification, but I wanted to mention other techniques that achieve the same goal. For a step-by-step guide using Azure API Management, refer to the article Create an OpenAPI definition for a serverless API using Azure API Management.

Icon, Description, Host and Base URL

 

The first section of the wizard expects the endpoint of your Azure Function app. The specific operations will be defined later; for now, insert the host, plus “/api” if it is part of your Azure Function URL. For my Azure Function the host looked like “<functionapp>.azurewebsites.net”.


Next, the icon background color and image will need to be updated. I’ve noticed that the image can look skewed when used as an action, but as a connection listed in a Power Automate flow or canvas app it looked fine.


Security

 

The custom connector will need security defined, which is how we establish our connection, just as with other connectors. Many options exist, including OAuth 2.0 and Basic authentication, but for Azure Functions an API key works well. Name the parameter “code” and set its location to “Query”.
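In the OpenAPI definition, this API key choice corresponds to a securityDefinitions entry along these lines (the definition name is arbitrary):

```json
"securityDefinitions": {
  "api_key": {
    "type": "apiKey",
    "name": "code",
    "in": "query"
  }
}
```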


The key will need to be standardized across functions, or you may have to create multiple connections. A quick way to do this is to create a key for the host.


NOTE: Whatever this string is, it is what the custom connector defined below will need, so I would suggest generating it yourself.

The Definition

 

The custom connector definition is where we begin to see how our connector will be used within Power Automate flows. Each action needs to be defined within the “paths” section of the specification. It’s important to point out why the choice of Azure Functions helps here: by defining individual functions, we can create actions that align directly with Azure Application Insights tables, or with domain- or workload-specific actions. The example I’m using shows only that direct alignment, but depending on the need, a single action could invoke an Azure Function that sends multiple messages to various tables or even other log stores (e.g. Azure Log Analytics).

Each action requires an operationId and a response. Optional elements include a summary, a description, and parameters. Parameters define the fields presented by our custom connector action; the fields shown on the Track Event action, for example, map directly to its parameter definitions in the specification.
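For reference, here is a sketch of what the Track Event entry under “paths” might look like; the field names follow the sample payload in this article, and the exact definition ships with the linked sample:

```json
"paths": {
  "/TrackEvent": {
    "post": {
      "operationId": "TrackEvent",
      "summary": "Track Event",
      "description": "Sends a custom event to Azure Application Insights",
      "parameters": [
        {
          "name": "body",
          "in": "body",
          "required": true,
          "schema": {
            "type": "object",
            "properties": {
              "correlationid": { "type": "string" },
              "name": { "type": "string" }
            }
          }
        }
      ],
      "responses": {
        "200": { "description": "Message accepted" }
      }
    }
  }
}
```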

Property Considerations

 

One item I ran into fairly early on was how to distinguish and work with JSON objects nested within other objects. Consider the scenario below:

{
    "correlationid":"testcorrelation",
    "name": "test app insight",
    "properties":
        {
            "user":"John",
            "userTwo":"Jane"
        }
}

My plan for this object was to take the properties shown above and add them to the customDimensions field within the customEvents table in Azure Application Insights. The issue I encountered was the data type of the field: I had assumed I could pass a serialized object as a string data type.


However, the custom connector encoded this string, resulting in a mismatch between what my Azure Function expected and what was actually delivered.

The solution was to consult the OpenAPI (Swagger) documentation and use the object or array data type instead.
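To illustrate the server side of this, here is a small hypothetical sketch (not the article’s sample code) of flattening a nested “properties” object into the string dictionary that Application Insights uses for customDimensions, using System.Text.Json:

```csharp
using System.Collections.Generic;
using System.Text.Json;

public static class PropertyFlattener
{
    // Parses the request body and flattens the nested "properties" object into
    // the string dictionary shape used for customDimensions in customEvents.
    public static Dictionary<string, string> Flatten(string json)
    {
        var dimensions = new Dictionary<string, string>();
        using (JsonDocument doc = JsonDocument.Parse(json))
        {
            if (doc.RootElement.TryGetProperty("properties", out JsonElement props))
            {
                foreach (JsonProperty p in props.EnumerateObject())
                {
                    // Each nested value becomes one custom dimension.
                    dimensions[p.Name] = p.Value.ToString();
                }
            }
        }
        return dimensions;
    }
}
```

Declaring “properties” as an object (rather than a string) in the connector definition means the payload arrives as real JSON and can be walked this way, instead of arriving as an encoded string.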

Testing the Custom Connector

 

The custom connector wizard includes a window for testing each operation. This is useful for seeing how the request properties are sent to the Azure Function from the custom connector, and how the response looks. To begin, start by creating and testing a connection.


Next, choose an operation and fill out the properties defined in the OpenAPI specification. When ready, click the Test operation button. A nice additional feature here is that it generates a cURL command that can be used locally.


The request and response from the operation will be shown. Here is where we can iteratively refine the custom connector specification to provide the correct data types to the Azure Function.


The sample for this custom connector is located here.

Using the Custom Connector

 

Once the custom connector has been created, it can be used within Power Automate flows or Power Apps canvas apps (I would assume this also applies to Azure Logic Apps). Below is an example using Track Event. In this example I’m including the correlationId passed from my originating request, as well as the name property. I’m also using the workflow, trigger, and action objects detailed in the article Monitoring the Power Platform: Power Automate – Run Time Part 1: Triggers, Workflows and Actions.


To wrap up, the sample includes an example run of the Application Insights Tester Power Automate flow. The full sample can be downloaded here.


Next Steps

 

In this article we discussed how to build and deploy an Azure Function that delivers messages to Azure Application Insights. We then created a custom connector that allows makers to interact with our Azure Function like any other connector.

Continuing down this path, we can use this approach to integrate with other logging APIs, such as Azure Log Analytics. We can even extend the Common Data Service connector if needed. Be mindful, however, of the nuances between the OpenAPI specification and what the custom connector requires.

In previous articles, we discussed how to evaluate workflows, triggers and run functions to help deliver insights. We have also discussed how to implement exception handling within Power Automate flows. Using the connector above, we can now send specific results from our scoped actions to Azure Application Insights allowing for proactive monitoring and action.

If you are interested in learning more about specialized guidance and training for monitoring or other areas of the Power Platform, which includes a monitoring workshop, please contact your Technical Account Manager or Microsoft representative for further details.

Your feedback is extremely valuable so please leave a comment below and I’ll be happy to help where I can! Also, if you find any inconsistencies, omissions or have suggestions, please go here to submit a new issue.

Index

 

Monitoring the Power Platform: Introduction and Index


Dynamics 365 Customer Engagement in the Field


Best Practices to Get Paid Faster – Building a Successful Recurring/Subscription Business​

June 17, 2020   Microsoft Dynamics CRM

As Simon Sinek says, “Start with Why.”

Volumes play an important role in a subscription business. This revenue is billed more frequently, in fact monthly: for one customer, you are generating 12 invoices a year! The invoices are typically smaller than one-off sales. If you are reselling products (which is true for most Microsoft CSP partners), the margins are tight. If customers don’t pay, or delay payments, there is an impact on cash flow.

How to Collect Cash Faster?

Step 0: Sell to those who will pay. To make sure there is no risk of non-payment, you need to establish credit limits and do background checks to ensure you won’t be left holding the proverbial bag; this is typically part of your sales process.

Step 1: Invoice Accurately. An accurate invoice is correct on all counts.

  • Who has the check or credit card? It could be the finance department, the procurement department, or the IT manager or relevant stakeholder who needs to approve your invoice.
  • What exactly are they getting billed for? Are all the services and subscriptions correct, and are prices, changes in quantities, and refunds clearly shown in the invoice so that there isn’t any confusion?
  • When do they need to pay and, most importantly, receive the invoice? Is the invoice being sent to the customer on the correct date and with the correct due dates? Invoice delivery needs to be predictable. We see so many cases where customers have cash-flow issues but are not able to generate an accurate invoice on the correct date based on the billing frequency.

Step 2: Invoice Delivery: Persistent, Actionable, Convenient. It’s imperative your invoice is delivered to the right stakeholders in a medium that is acceptable to them (email, or mailed physical signed copies). Is the invoice persistent: can it be easily accessed by the intended audience? A self-service portal that shows their orders and invoices would allow them to access your invoice when they need it. Is the invoice actionable: does it clearly state how to pay you, and how convenient is it to pay you?

“The customer experience is the sum total of all customer moments. The invoice is a defining customer moment. “

What Systems are Involved in Generating an Invoice and Collecting Cash?

The problem with this landscape is that, while you’ve got all the pieces together:

  • There are multiple logins/identities
  • Your brand is diluted and inconsistent because of the different looks and feels
  • Data is the new oil – and it’s messy to have oil all over the place
  • Access to these systems is limited to a few people. The lack of access to this data creates problems like selling to customers you don’t want to, and the inability to create an accurate invoice.

What is Required for a Successful Billing Automation Solution to Generate Accurate Invoices and Collect Cash?

  • Customer data in one system
  • Single Identity
  • Single Portal for everything: what they’ve bought from you, when they bought it from you, agreements, invoices.

Cash Collection is a business function:

  • Credit Limit should be associated with Aging Data and affect new Sales
  • The customer Payment information should be in the customer system
  • Automate Collections using Payment Gateways
  • Don’t reward bad behavior – Credit Holds so that provisioning is stopped

Work 365 offers a billing and subscription application that helps cloud companies achieve the necessary requirements and processes to grow their revenue.

I am a Dynamics 365 enthusiast. I enjoy building systems and working with cross-functional teams to solve problems and build processes from lead generation to cash collection. Work 365 is a global developer of the billing automation and subscription application for Dynamics, helping companies streamline business processes and scale their recurring revenue.


CRM Software Blog | Dynamics 365



COVID-19 & Building a Strategy to Go Remote: 3 Tips for CIOs

April 11, 2020   CRM News and Info

Need to Go Remote? Follow These 3 Tips 

With the current COVID-19 pandemic, businesses nationwide are faced with the same challenge: transitioning their workforce from office-based to home-based.

For some organizations, having employees who work remotely is nothing new or unusual. But for many businesses, this is unfamiliar territory. If you’re the CIO of a business that’s suddenly having to transition dozens (or hundreds, or even thousands) of employees to remote work, what should you do?

First off, take a deep breath. Remember that there are countless examples of successful, multinational companies who have had employees telecommuting for many years. In the end, you may decide that having some of your employees working from home is a good thing — and you may even keep it up once this situation passes.

That said, we know that developing and executing a strategy to go remote can be complicated. To help you start the process, here are three essential tips.

Tip 1: Make the Switch to the Cloud

 

Is your business still using an Exchange server? Are you still dependent on physical in-house file servers? If so, it’s time to transition to the cloud.

If you’re using a network share, your employees have to be physically on premises in order to access important files and software systems. The only way around this is with some kind of VPN solution — but getting hundreds of employees set up with a VPN takes a considerable amount of time and money, and can also create security concerns.

Rather than opting for a complicated workaround, the better option is to simply switch to the cloud with Microsoft Office 365. With a cloud setup, employees can access everything they need from home. There’s no need to be on premises and logged into the company network. You can also rest assured that employees will be accessing files and software in a completely secure environment, minimizing security risks.

Tip 2: Use Microsoft Azure for Legacy Client/Server Issues

 

In our experience at AKA, we’ve found that while many companies have upgraded the majority of their software to SaaS (Software-as-a-Service) and other modern solutions, it’s still common for businesses to rely on one or two legacy client/server applications. These apps are based around physical installs, often on desktop computers, which means that employees won’t be able to access them when working remotely.

Fortunately, there’s a relatively simple solution to this problem. With Microsoft Azure, it’s possible to set up a remote desktop that gives employees access to legacy client/server software from their home office. Thanks to the power and efficiency of Azure, developing this sort of solution can often be accomplished in just a few hours and for a minimal investment.

Tip 3: Enable Collaboration with Teams

 

Once you’re up and running in the cloud and have ensured that your employees can access what they need from home, there’s one other major consideration to account for: the ability to collaborate. Employees working from home will sometimes struggle with a lack of face-to-face interaction, an essential element of effective collaboration.

With Microsoft Teams, employees can keep in touch, work together on group projects, and monitor one another’s progress. With Teams running on an employee’s mobile phone and home computer, they’ll receive push notifications for upcoming meetings and ongoing chat conversations. Built-in video capability allows team members to interact more naturally during remote meetings. Sharing documents and files is easy, and Teams keeps all of your communication in one easy-to-access place.

The Time is Now: Modernize Your Organization’s Culture

 

Over the past few years, successful businesses have been transitioning to the cloud and setting their employees up to work remotely. And with tools on special trial offer like Microsoft Azure and Teams, it’s never been easier.

While the current situation brings with it certain challenges, it also presents businesses with an opportunity to modernize and update their company culture. AKA can help your business get started with Dynamics 365 CRM, Microsoft Azure, Microsoft Teams, and other essential solutions for working remotely via the cloud.

Want to learn more about transitioning your workforce to remote work? Read The Sudden Need to Work Remotely: How This Company Transitioned 500 Users in Just One Weekend with a Cost-Effective Solution on Microsoft Azure. Ready to take the next step? Contact the experts at AKA to discuss your needs.


ABOUT AKA ENTERPRISE SOLUTIONS
AKA specializes in making it easier to do business, simplifying processes and reducing risks. With agility, expertise, and original industry solutions, we embrace projects other technology firms avoid—regardless of their complexity. As a true strategic partner, we help organizations slay the dragons that are keeping them from innovating their way to greatness. Call us at 212-502-3900!


Article by: Greg Inks | 212-502-3900

With two decades specializing in Microsoft and Azure platforms, Greg leads AKA’s Cloud practice. He is a Cloud evangelist offering deep expertise in Cloud architectures and adoption strategy. Greg has developed subject-matter expertise and wide-ranging business acumen by working with some of the largest, most successful technology providers and client companies on the planet.


CRM Software Blog | Dynamics 365


Building Better: Ethics and Product Development

March 24, 2020   Sisense

As software’s role in business proliferates, the ethical implications of decisions made by developers only become more profound. In Big Questions, we outline the biggest concerns facing development teams and look at how companies are grappling with them.

In our increasingly digital world, software has a fundamental role to play in almost everything we do. It’s the backbone of how our global workforce collaborates, it underpins much of our industrial operations, and even ensures the smooth-running of our cities.

A failure of these critical software systems, whether intentional or accidental, can result in damaging, even fatal, consequences. Unfortunately, these failures are all too common. It seems that every week we hear of another privacy infringement, security breach, or technology misuse in the news. With all this in mind, it’s never been more important for software developers to be aware of their ethical responsibilities.


Building a more secure future

The information age means that a company’s data and infrastructure are its greatest assets, and also its greatest weaknesses. According to Mandiant in their 2020 “M-Trends Report,” 22% of attacks on companies were for IP theft or corporate espionage, while a whopping 29% of attacks were for direct financial gain. Interestingly, only 4% of attacks covered by the report were intended to weaken the target’s systems for another hit later. This is worth knowing because 31% of customers covered in the report who suffered one attack saw another within the following 12 months!

The Internet of Things only makes the rise of attacks on companies more likely and more challenging to deal with as it continues to grow; more than 20 billion new devices are forecast to connect to the internet this year alone. Malware creators are ready and waiting to infiltrate the software underpinning these devices. This mounting threat landscape is something that developers cannot afford to ignore.

“The greatest ethical challenge facing software developers today is how to secure information to reduce the risk of intellectual property loss to external threats (like hackers) in shared environments, without affecting ease of use for the end user,” said Thomas Holt, a professor in the school of criminal justice at Michigan State University, whose research focuses on computer hacking and malware. 

“If encryption or security becomes too cumbersome, employees will be less likely to use them or create workarounds, so there must be thought given as to how to protect vulnerable information in a distributed and networked environment.”

A Code of Ethics developed by the Association for Computing Machinery (ACM) states that computing professionals should “design and implement systems that are robustly and usably secure.” They should do this by integrating mitigation techniques and policies, such as monitoring, patching, and vulnerability reporting. It’s also important that developers take steps to ensure parties affected by data breaches are notified in a timely and clear manner.

The inspiration to consider the repercussions of a product team’s actions can come from any source:

“As a practicing security professional in tech, I’m grateful to see that many modern engineers were influenced by Isaac Asimov, inventor of the Three Laws of Robotics,” says Ty Sbano, Sisense Chief Security and Trust Officer. “Rule number one is ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ Even though the term ‘injure’ may be severe, if you take a step back, the mandate for product teams remains the same when it comes to acting ethically: In the case of privacy and ethics, or any other impact your creation could have on users and the world, you really need to think these things through.” 

“Where machines learn like humans and from humans, unconscious bias is as much a threat as with humans.”

Lothar Determann

User acceptance testing and other best practices can help developers avoid implementing security precautions that are too confusing, are situationally inappropriate, or otherwise inhibit legitimate use. ACM is clear that there should be no compromise here: in cases where misuse or harm are predictable or unavoidable, it says the best option may be to not implement the system.

Give your customers (and everyone else) privacy

Software developers are often asked to create solutions that enable the collection, monitoring, and exchange of personal information. Our increasingly global workforce collaborates via a host of digital technologies spanning multiple countries and territorial borders. Employers can use many of these technologies to monitor employees, but at what point is it considered an invasion of privacy? How far do software developers integrate this ability to monitor into their solutions?

“Respect privacy” is featured high up in ACM’s Code of Ethics, which says that software developers should only use personal information “for legitimate ends, and without violating the rights of individuals and groups.” This means taking precautions to prevent the re-identification of anonymized data or unauthorized data collection, ensuring the accuracy of data, understanding the provenance of the data, and protecting it from unauthorized access and accidental disclosure. Personal information gathered for a specific purpose should not be used for other purposes without the person’s consent.

Fight algorithmic bias to deliver better products

Algorithmic bias describes systematic and repeatable errors in a computer system that create unfair outcomes. This danger looms greater than ever today thanks to machine learning techniques that are fundamentally changing the way software is made. 

“Instead of coding instructions for machines from A-Z, engineers provide massive data sets to programs with high-level instructions to figure out solutions on their own,” said Lothar Determann, partner at multinational law firm Baker McKenzie, and professor of law at the Free University of Berlin. “Where machines learn like humans and from humans, unconscious bias is as much a threat as with humans.”

ACM’s Code of Ethics states to “be fair and take action not to discriminate.” It says that technologies and practices should be as inclusive and accessible as possible and software developers should take action to avoid creating systems or technologies that disenfranchise or oppress people. Failure to design for inclusiveness and accessibility may constitute unfair discrimination. Determann believes that success requires developers to code prohibitions into algorithms. 

“We must not rely on machines learning on their own what is and isn’t prohibited,” he said. “We need to do our best to avoid replicating unconscious human biases by training machines with insufficient data (e.g. outdated history books), supervising teams (e.g. lacking diversity), and procedures (e.g. failing to flag for machines where data sets are known to be incomplete). We have to develop countermeasures to reduce the risk of replicating human unconscious bias in AI. Diverse teams and constant validation and questioning should be part of the solution.”

Think before you build

Software developers are the first, and last, lines of defense against the misuse of technology. Our current era of software culture may have been dominated by Facebook founder Mark Zuckerberg’s now-famous motto “move fast and break things,” but that’s not what users want from their software anymore. However, inside software companies, there can still be pressure on developers to get software to market quickly, making it tempting to skip rigorous testing.

“Software developers are often infatuated with the technology itself — what it can do and how we can apply it,” said Michael S. Kirkpatrick, associate professor in the department of computer science at James Madison University in Virginia, and education coordinator for ACM’s Committee on Professional Ethics.

Kirkpatrick argues that it’s the responsibility of developers to ask more questions about how their technology might be used. “All too often, developers are given a task to develop a piece of code, without information on the context in which it might be used. This needs to change. Developers need to be more proactive in finding out how the code is going to be used and anticipating how it might be misused. They should also question what policies and procedures can be put in place to prevent misuse.”

Kirkpatrick believes that multidisciplinary development teams are crucial to success here. “Software developers tend to have a very narrow focus on technology, so it’s really important that they work with ethicists, anthropologists, sociologists, and other people who bring a different perspective on the context of how a piece of technology might be used, and how misuse can be prevented.” 

Developing a better world  

It’s clear that a great deal of responsibility rests in the hands of software developers. The ACM’s Code of Ethics is a useful framework for helping software engineers live up to their ethical obligations, but it doesn’t prescribe solutions to ethical problems; rather, it serves as a foundation for ethical decision-making.

Ultimately, it is up to software developers to act responsibly, taking the time to consider the wider impacts of their work, which should consistently support the public good.

“Technology amplifies power,” concludes Kirkpatrick. “Governments, large corporations, and other actors can use it to extend their influence in all aspects of our lives. It’s no exaggeration to say that the unquestioning adoption of technology poses risks to fundamental human freedoms and civic rights. So it’s really important for software developers to listen to different voices, to ask more questions and to think more proactively about how their software can be used and, more importantly, how it might be misused.”


Lindsay James is a journalist and writer with over 20 years’ experience creating compelling copy for some of the world’s biggest brands including Microsoft, Dassault Systemes, Exasol, and BAA. Her work has appeared in The Record (a magazine for the Microsoft partner community), Compass, and IT Pro.


Blog – Sisense


Alphabet’s Verily is building COVID-19 triaging tool as Trump declares national emergency (Updated)

March 14, 2020   Big Data

Alphabet’s Verily is building a triaging tool to help people find COVID-19 testing sites in the U.S. The company says the tool will initially target the San Francisco Bay Area, though Verily hopes to expand coverage to more regions in the future. The move is part of a public-private partnership to make COVID-19 testing available to “millions of Americans” in the weeks ahead at places like Target, Walgreens, CVS, and Walmart parking lots, according to U.S. Vice President Mike Pence. Testing to confirm COVID-19 cases has been a critical part of response plans in other countries around the world.

U.S. President Trump announced the news today (although he erroneously said Google was building the tool) while declaring a national emergency in a White House press conference. Trump, Pence, and the rest of the administration implied the tool would serve people nationwide. Positive COVID-19 cases have now been found in all 50 states, and on Wednesday the World Health Organization declared a global pandemic.

The federal government said it would point people to a website where they could fill out a screening questionnaire, stating their symptoms and risk factors, and that if necessary they would then be told the location of a drive-through testing option. Automated machines will be used to return results in 24 to 36 hours.
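No details of the questionnaire itself were released, but the described flow — score self-reported symptoms and risk factors, then direct higher-risk respondents to a testing site — can be sketched as follows. The symptom names, weights, and cutoff below are invented for illustration; real criteria would come from public-health guidance:

```python
# Hypothetical weights -- not the actual screening criteria.
SYMPTOM_WEIGHTS = {"fever": 2, "cough": 2, "shortness_of_breath": 3}
RISK_WEIGHTS = {"age_over_65": 2, "recent_exposure": 3, "chronic_condition": 2}

def should_direct_to_testing(symptoms, risk_factors, cutoff=4):
    """Return True if the respondent's combined symptom and risk score
    meets the cutoff for being directed to a drive-through testing site."""
    score = sum(SYMPTOM_WEIGHTS.get(s, 0) for s in symptoms)
    score += sum(RISK_WEIGHTS.get(r, 0) for r in risk_factors)
    return score >= cutoff
```

The design question such a tool must answer is where to set the cutoff: too low and scarce tests are wasted on low-risk people, too high and symptomatic cases go untested.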

President Trump claimed that about 1,700 engineers are building the website, but that also appears to be incorrect, according to The Verge. Verily did confirm, however, that some Google engineers have volunteered to be part of this effort. VentureBeat has reached out to Google for more information about the triage tool.

Dr. Deborah Birx said nasal swab samples can be delivered to doctors’ offices and hospitals and then picked up by companies like Quest Diagnostics.


“The important piece in this all is they’ve gone from a machine that may have a lower throughput to the potential to have automated extraction,” she said. “It’s really key for the laboratory people; it’s an automated extraction of the RNA that then runs in an automated way on the machine with no one touching it. And the result comes out of the other end.” She said that going from sample to machine to results removes the manual procedures that were slowing down testing and delaying results.

In other emergency actions announced today, Trump said the Department of Education will waive interest on student loans held by federal government agencies, and he instructed the Secretary of Energy to buy crude oil to fill the strategic reserve.

At midnight tonight, the United States will suspend travel from Europe, and U.S. citizens traveling into the country will be asked to take part in a voluntary 14-day quarantine.

Tech giants like Apple, Amazon, Facebook, Google, and Microsoft took part in a teleconference with White House CTO Michael Kratsios to discuss how artificial intelligence and tech can help combat this disease. A White House statement said the discussion touched on issues like the creation of new tools. Public health officials and authorities from China to Singapore and beyond have used AI as part of solutions to detect and fight COVID-19 since the novel coronavirus emerged in December 2019.

Earlier this year, Google’s DeepMind also released structure predictions of proteins associated with the virus that causes COVID-19 with the latest version of the AlphaFold system.

“These structure predictions have not yet been experimentally verified, but the hope is that by accelerating their release they may contribute to the scientific community’s understanding of how the virus functions and experimental work in developing future treatments,” Alphabet and Google CEO Sundar Pichai said in a blog post last week.

Upon questioning by reporters at the press conference, Trump refused to take responsibility for the heretofore slow U.S. response to the pandemic. He also evaded questions about whether he needs to be tested for COVID-19, despite the fact that days ago he was in close proximity to a person who has since tested positive — Fabio Wajngarten, press secretary to Brazil’s President Jair Bolsonaro. Eventually, after multiple reporters pressed him on the issue, Trump said he would get tested — but not, he said, because of that contact. Miami Mayor Francis Suarez was also in contact with Bolsonaro and Wajngarten at Trump’s Mar-a-Lago resort and today tested positive for coronavirus.

Throughout the press conference, President Trump, Vice President Pence, and a roster of scientists and executives shook hands and touched the same mic.

Updated 4:40 p.m. Pacific. The original version of this story relied on statements by President Donald Trump and White House officials, but it was updated to reflect a Google communications statement that Alphabet’s Verily, not Google, is in the early stages of creating a tool to help triage individuals in the San Francisco Bay Area. 


Big Data – VentureBeat


Building the Salesforce Franchise

March 6, 2020   CRM News and Info

Franchising is a tried-and-true business model with numerous permutations that benefit all participating parties.

For the vendor it’s a great model for expanding without all of the downside risk associated with building a business, like finding all the financing and hiring people. The franchisor generally sells licenses to willing partners who agree to uphold the standards of the brand. The franchisees get a running business model, expertise and financing without the hassle and anguish of being a pure startup.

Generally, the franchisor establishes a new tier in the business serving the franchisees as its customers rather than going direct to primary customers. Customers become the concern of the franchisees with some supervision from the franchisor.

The franchisors make money selling licenses and supplies to their networks of partners. The best example might be McDonald’s. In addition to selling supplies, branding and licenses, it also does a good business in real estate.

The AppExchange Innovation

Recently, as I see it, newer permutations of the franchise model have sprouted. For instance, I’d classify Uber as a newfangled franchisor. Its partners don’t work for the company, at least as far as the company is concerned, and Uber makes most of its money taking a cut of every ride it arranges through its technology platform. At a minimum, Uber is a technology franchisor serving its driver-partners.

The tech industry has had its share of companies trying to emulate the franchise model, and most of them have had good but limited runs, often ending in failure. The tech version was the reseller network, and for a long time the software needed to support that model just wasn’t there.

Add to this the boom-and-bust nature of many tech firms as their innovations commoditize, and their inability to bring new innovations to market fast enough, and you can see why few of these efforts lasted.

Salesforce has bucked the trend and is consolidating a position through its AppExchange to be the leading franchisor of business software. It has taken 20 years to get here though.

The company pioneered Software as a Service, which matured into cloud computing, and it has revealed its inner workings in increments to enable its partners to gain access to its core development technologies.

At first this meant giving developers tools for creating new applications that leveraged their creators’ domain expertise. Quickly, it also came to mean providing the ability to link those third-party apps to the main Salesforce system so that everything ran as a single entity.

The greatest innovation was the AppExchange. It put Salesforce and its partners on an equal footing when selling solutions while Salesforce provided the core platform, enabling it to offload innovation at the application level to its partners.

That was a risky proposition given the history, but the size and robustness of the partner community ensured a steady supply of application-level innovation and customer-centricity.

Protecting the Brand

Salesforce updated its AppExchange Partner Program this week, publishing a new summary document and program inclusions. From this it’s pretty clear, if it wasn’t already, that Salesforce increasingly sees itself as a platform enabling thousands of developers to sell its underlying technology in support of their domain-specific apps.

Of course, it also sees itself as a tools provider for enterprises wanting to develop their own unique apps.

What’s most intriguing to me is that the revised program mimics the best programs you might see in fast food, where a franchisee might be evaluated on raw income and profitability, but also on things like staff knowledge and friendliness, cleanliness of the establishment, and more.

It’s a way for the franchisor to maintain some control over how the brand is presented to the customer, because unhappy customers will bring down a brand and all of its secondary businesses, like licensing and real estate management.

In the Salesforce program, partners are evaluated quarterly in three areas with six metrics:

  • Customer Success, including attrition rate and average app rating (from the AppExchange);
  • Innovation, including technology adoption and Trailhead badges; and
  • Engagement, including ACV growth and total revenue.

The whole rating is worth a maximum of 1,000 points, and partners who participate in Pledge 1%, the philanthropic movement Salesforce helped launch, can earn 25 bonus points.
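The scoring scheme described above can be sketched in code. The equal weighting and the assumption that each metric is pre-normalized to [0, 1] (higher is better, so a raw attrition rate would first be inverted) are mine for illustration; Salesforce’s actual weights aren’t given in the article:

```python
# The six quarterly metrics, grouped by the three evaluation areas.
METRICS = [
    "attrition_rate",       # Customer Success
    "avg_app_rating",
    "technology_adoption",  # Innovation
    "trailhead_badges",
    "acv_growth",           # Engagement
    "total_revenue",
]

def partner_score(metrics, pledge_one_percent=False):
    """Combine normalized metric values (each in [0, 1], higher = better)
    into a score out of 1,000, plus the 25-point Pledge 1% bonus.

    Equal weighting is an assumption; the real weights aren't public.
    """
    base = sum(metrics.get(m, 0.0) for m in METRICS) / len(METRICS) * 1000
    bonus = 25 if pledge_one_percent else 0
    return round(base) + bonus
```

A score like this maps naturally onto discount tiers — for example, thresholds at fixed point totals — which is how the program ties measured behavior to partner profitability.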

Importantly, the ratings help to assign partners to discount tiers that determine their profitability. The better-performing partners and those best able to conform to the parameters should make the most money.

Salesforce is putting a lot of skin in the game by giving partners access to experts and testing technologies, as well as frequent briefings on how the technology and program evolve. All told, it seems that Salesforce is doing everything it can to emulate the best franchise programs, and to do it for the long run.

This is very different from the way we see almost any other enterprise software company run. For many, selling Infrastructure as a Service is about as far as it goes. However, selling infrastructure, while good and important, doesn’t do much to share the burden of becoming successful. It’s more a B2B strategy than a B2C one.

So far — it’s only been 20 years — Salesforce has put a lot into appealing to both the B2B and B2C markets as well as the SMB and enterprise spaces. That’s a lot of balls to keep in the air. If you want to understand why this company is so successful, you just need to look at how it reaches out to so many different constituents.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.


Denis Pombriant is a well-known CRM industry analyst, strategist, writer and speaker. His new book, You Can’t Buy Customer Loyalty, But You Can Earn It, is now available on Amazon. His 2015 book, Solve for the Customer, is also available there.
Email Denis.


CRM Buyer
