Monthly Archives: August 2018
Mighty AI Provides the ‘Ground Truth’ for Self-Driving Cars
Posted by Christopher Lam, Senior Director Customer Success
To fulfill the promise of self-driving cars, the systems that run them must be able to tell the difference between a car and a dumpster, between a dotted yellow line and a solid one, and among thousands of other objects. That's where the rubber meets the road for Seattle-based Mighty AI.
The 4-year-old software and services company has a global community of more than 400,000 citizen annotators who provide the labor-intensive data work automakers need, via a mobile game that has them identifying pedestrians, cars, tractor trailers, construction, signage, bicyclists and many other data points.
That “ground truth” data is gathered, reviewed and validated by the Mighty AI team. It’s then made available to autonomous vehicle developers to help train self-driving vehicles to think and act like a safe human driver. Mighty AI works with more than a dozen companies developing autonomous vehicle vision systems.
Recently, Mighty AI announced a collaboration with Mcity, a public-private consortium at the University of Michigan backed by nearly 60 companies including Ford, GM, Honda, Toyota, Intel, LG and State Farm. Mighty AI’s training datasets will be used in autonomous vehicle machine learning models developed at Mcity, which includes an R&D center and real-world testing facilities.
“We’ve seen a great deal of traction in delivering the huge amounts of annotated data needed to train machine learning models,” said Teresa A. Kotwis, Mighty AI CFO. “The level of data quality, accuracy and precision is enormously high, which works very well with the tooling we have.”
Readiness for Global Growth
With customers in North America as well as Europe and Asia, Mighty AI selected NetSuite OneWorld over Sage Intacct to prepare for international expansion and the launch of global subsidiaries. NetSuite Professional Services led the implementation, with pricing well within the company's budget.
Time was of the essence. After signing the contract in October 2017, the company had roughly 75 days to go live to make its planned January 1 cutover date from QuickBooks to NetSuite OneWorld. That’s a “super fast” timeline, Kotwis said, especially as implementation would proceed through the Thanksgiving and late December holidays.
Led by a three-person team from the NetSuite Global Delivery Center, the implementation was the most efficient she's experienced, said Kotwis, who has past experience with deployments of Sage MAS 90, Sage Intacct, Microsoft Dynamics and other ERP platforms.
“I have converted more accounting systems than I can count in my 30-plus years of experience, and this was by far the easiest and most efficient implementation thanks to the dedication of the NetSuite Professional Services team,” Kotwis said.
Expert-Led Implementation
The implementation was made easier by a structured process with clear milestones, weekly meetings, online data exchange and accountability for all stakeholders. The NetSuite team was “super flexible” and readily accommodated time zone differences between Manila and Seattle, according to Kotwis.
“Our NetSuite Professional Services implementation process was very smooth and super fast,” she said. “Working with the Global Delivery Center team was just excellent — it was seamless from end to end, as if they were side by side with us in our office. Location had no impact on the ability to communicate, get things done and understand where we were in the process.”
With OneWorld in place, Mighty AI has new visibility into key metrics, and can report on budget vs. actuals across the business. Once it’s ready to open global subsidiaries, Mighty AI is equipped with capabilities for multi-currency transactions and global financial consolidation to help fuel its continued growth. And as it builds out its software-as-a-service offerings, NetSuite Revenue Recognition will play a valuable role.
“The reporting and trend analysis in NetSuite is definitely giving us better information and lets us ask better questions,” Kotwis said. “As we move from a tech-enabled services company to more of a software platform, we’ll be able to put billing and rev rec in NetSuite to use.”
Learn more about NetSuite Support Services.
Must Chatbots Be Voice-Enabled To Improve The Customer Experience?

Digital transformation has forged new pathways for customer relations, allowing organizations to reach their target audiences using emerging technologies that provide instant, easy, and relevant customer experiences. These dramatic advancements in technology have changed customer attitudes toward service times, delivery, and accessibility, especially in the realm of e-commerce. To adapt to customer expectations, companies like Amazon offer Echo, a device that connects to Alexa Voice, enabling customers to access information and complete tasks without having to press a button.
Speed and efficiency are valuable to customers, and this creates the potential for organizations to capitalize on conversational artificial intelligence in retail. Voice-activated chatbots are much more than a fading tech trend, and with companies like Facebook offering businesses text-based and voice-activated communication through Messenger, conversational AI will significantly impact the way customers choose to interact with retailers.
Create a personalized service
Chatbots simulate an in-store experience for customers buying online, ensuring that customer service is cohesive and consistent across the brand. This kind of automation creates a tailored purchasing experience on mobile and web devices, allowing customers to speak to chatbots as they would sales associates. Chatbots can be used to locate and sell products, answer product-related questions, direct customers to helpful web pages, and upsell products with the added benefit of immediacy and accuracy.
Chatbots are the new user interface
Mobile apps are less relevant unless they're for messaging. With mobile phone messaging apps estimated to reach 2.5 billion users worldwide in 2021, retailers have a wider reach if they can integrate chatbots into customer service. Conversational AI already exists within our smartphones with Siri, Bixby, and Google Assistant. Improvements to natural language processing (NLP) and understanding make voice the new UI: it can respond to more complex questions and complete simple requests three times faster than texting. That means retailers need to prepare for the shift in demand, especially in providing seamless customer experiences in e-commerce, even as they ask, "Is it good enough yet?" and "Should I focus on other priorities until it's better?"
The Intelligent Enterprise and conversational AI
Conversational AI will make managing operations more efficient and comprehensive by providing real-time data and voice-activated access to information on stock and ordering, volumes, sales figures, and HR management. Retailers can build their own chatbots dedicated to specific customer-directed aims and customize NLP models depending on their performance needs. With growing demand for voice-enabled technologies, conversational AI isn't an option but an increasing necessity in the retail industry.
Of course, there will be discussions about whether voice-enabled commerce is good enough yet, but regardless of its current accuracy, retailers should start planning and preparing for it now.
Learn more about SAP Leonardo Conversational AI Foundation.
User Experiences That Convert – The Do's and Don'ts
In technology parlance, the emotional result of a person's interaction with a website or digital app is called "user experience," or "UX," and the success of a business depends on it. Users who have easy, positive experiences with websites and apps likely will be drawn back to the business. By contrast, websites and apps with poor navigation and slow loading times likely will turn off consumers.
UX is essential in e-commerce because conversion rates often are aligned with a positive or negative experience. As technology continues to improve and as customer preferences evolve, businesses must adapt UX to stay competitive.
To optimize UX and enhance customer perceptions of your brand, following are five UX Do’s and Don’ts to focus on.
1. Do invest in a unique visual experience. Don’t choose cookie-cutter solutions.
Today, most customers can spot stock imagery and templated websites from a mile away. These features not only make a business look lazy, but also restrict brand expression. Be sure to choose custom visual designs whenever possible.
A website or mobile app should evoke the identity of the brand. Whether a business offers home decor products or tax preparation services, users should be able to identify the business brand from its respective platforms. A relevant and consistent brand fosters legitimacy and trust. You want users to take your business seriously. Of course, this is much easier said than done.
Below is a homepage example from Restoration Hardware. The simple, neutral background allows brand imagery to speak for itself, and the uncluttered navigation supports an easy, straightforward shopping experience. Additionally, the page's text copy is short and direct, to give visitors clear options and guide behavior. Lastly, the main image not only displays business products, but also conveys the atmosphere that Restoration Hardware promises its customers. Regardless of stylistic preference, Restoration Hardware offers a distinct visual experience on its website.
Businesses can use custom visual content to stand out from the competition. A simple start is to use real pictures of teams/staff, offices/locations and products/services. Moreover, details like updated banners to reflect appropriate seasons, upcoming holidays or events relevant to the market/industry also help to add unique visuals. Unique visual content is a great way to enhance UX.
2. Do make messages personal. Don’t be generic.
People want to feel that their needs matter and their voices are heard. Individualized experiences/messaging can be tough with a diverse audience/customer base; however, businesses can gain advantages by using conversational language, addressing people by name wherever possible, and helping visitors/customers find content they might like.
It is important to research customer habits and desires. What words are used in the search for products or services relevant to your business? Talk to your customers in the language they use. Read reviews to better understand customer needs. Also, find out what your competitors are doing.
With this research in hand, businesses can install appropriate technology to address specific customer desires, even at scale. For example, a simple upgrade for personal messaging could be automating business newsletters and email outreach to populate each recipient's name in the greeting. Instead of "Hello," the customer would be greeted with "Hi Ted." Little details can make a huge difference.
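As a rough illustration of the idea, a newsletter tool might substitute each recipient's name into a template before sending; the names, fields, and addresses below are purely hypothetical:

```python
from string import Template

# Minimal sketch (hypothetical data): personalize the greeting per recipient
# instead of sending a generic "Hello".
greeting = Template("Hi $first_name, here are this week's picks for you.")

recipients = [
    {"first_name": "Ted", "email": "ted@example.com"},
    {"first_name": "Maya", "email": "maya@example.com"},
]

for person in recipients:
    body = greeting.substitute(first_name=person["first_name"])
    print(f"To: {person['email']}\n{body}\n")
```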
More advanced features include personalized product recommendations, individualized search tools and saved checkout information. If a business offers UX that makes customers feel like VIPs, sales likely will grow.
3. Do get social. Don’t keep to yourself.
Social media is a mandatory extension for a business’ website. It contributes to the UX businesses want to create, but not all social media platforms are right for business. Be careful to choose the right social media platform to leverage your strengths and connect with your target audience.
For example, a clothing brand for teenagers would be wise to publish content on Instagram, which is image-focused and has a young user base, rather than on LinkedIn, which is text-heavy and oriented toward professionals. Pinterest might be a great choice for an event-planning company, while news blogs would favor Facebook and Twitter for sharing articles and stories.
Content should follow the voice of the brand. A children's toy business should sound different than a high-end luxury sports car company. Be sure to talk to your audience in the language and style they understand and respond to. Below is a great example of brand voice from sunglasses retailer Foster Grant, which uses messaging that is personable and that its customers can relate to.
Once a business has identified the right social media platform(s), it can share original content that reflects the brand. Some social media best practices are to engage with your audience (always reply to user comments) and to be consistent in content and posting. If your business is active and friendly on social media, you likely will build customer rapport and gain customer trust, which are essential parts of a compelling UX.
4. Do balance form and function. Don’t put all your eggs in one basket.
Not all beautiful websites work well, nor are all functional websites visually appealing. The UX sweet spot is where form and function complement each other, providing a fun and easy experience.
Businesses should audit appropriate website(s) and app(s) for strengths and weaknesses. For example, if a business has blazing-fast page speed and a robust payment gateway but a low conversion rate, it is possible that branding and content need improvement. No matter how good website infrastructure is, if visitors are not compelled to buy or act, they never will reach the checkout page or respond to your call to action.
Furthermore, if visual content is strong but site performance is lacking, businesses should invest in technology that improves page speed, checkout convenience and customer service. Any difficulty finding or buying products/services can ruin an otherwise engaging experience for current and potential customers.
To boost conversions, a website’s UX should feel and perform like it is from 2018. This requires keeping up with the latest technology and design trends, such as mobile-first optimization and simple, iconic aesthetics.
5. Do offer an omnichannel experience. Don’t restrict usability.
Omnichannel means providing the same excellent UX whether customers shop in-store, online via mobile device, tablet or laptop, or even by live chat or phone. Giving people options makes shopping more convenient and increases a business' chances for conversion.
To start, businesses should integrate software that makes it possible to offer and manage an omnichannel experience. This often requires a website back end and an ERP (enterprise resource planning) system that syncs inventory and processes orders in real time, whether someone buys a product online or in-store.
From the customer perspective, people want flexibility in how they shop, pay, and receive or pick-up purchases/services. Restricting customer options risks a loss in sales, and expanding options can help a business to stay competitive.
At the end of the day, UX is about creating an “environment” on your website or app that visitors/users will enjoy and want to return to. Remember, UX is never final — it is meant to be a work in progress, keeping up with the latest and greatest in innovation and technology.
Businesses should analyze performance and take corrective action to tweak and improve UX every few months. Ask questions about how customers are behaving and identify what is and is not working. Review Google Analytics, check conversion funnels, look at drop rates from email lists, and use heat mapping tools. Then take this information and funnel it through the UX Do's and Don'ts discussed here.
If you improve UX based on data and user feedback, you will see a corresponding rise in conversion rates and customer satisfaction.
Clearing Up Some Confusion about GUIDS in Dynamics 365

Globally Unique Identifiers (GUIDs) are the primary key of all entities in CRM and are sequentially generated. There is often confusion about how GUIDs work, how unique they are, and how they are generated in CRM. This confusion can often lead to unnecessary or bad requirements when planning integrations or migrations for Dynamics 365. Let’s discuss some of the different kinds of GUIDs, and how their nature will affect requirement gathering.
What is a GUID?
A GUID represents a range of values, much like a number. However, instead of using base-10 digits to represent each value, as an integer does, it uses hexadecimal digits (0-9 and a-f) in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx.
Perhaps the most important feature of the GUID is the extent of the range. The number of possible GUIDs is astronomically vast: the total number of possible combinations is 2^128. To show how big a number this is, here it is without scientific notation:
340,282,366,920,938,463,463,374,607,431,768,211,456
(This is roughly the number of every grain of sand on earth, times 4.5 billion, then times again by one billion.)
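For a concrete look at the format and the size of the range, here is a small Python sketch using the standard uuid module (shown purely for illustration; D365 itself generates its GUIDs in SQL Server):

```python
import uuid

# A GUID/UUID is 128 bits, written as 32 hexadecimal digits in five groups.
example = uuid.uuid4()
print(example)                          # e.g. 1b9d6bcd-bbfd-4b2d-9b5d-ab8dfbbd4bed
print(example.int.bit_length() <= 128)  # True: the value fits in 128 bits

# The size of the range: 2^128 possible values.
print(2 ** 128)                         # 340282366920938463463374607431768211456
```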
Types of GUIDs
One main source of confusion regarding GUIDs is the idea that there are different types that derive their values in different ways. While there are formally five versions of GUIDs (and, functionally, many ways of generating those five versions), all GUIDs can be summed up in two categories: sequential and random.
One main point to take away here: GUIDs are designed to be unique, not random.
In Microsoft SQL Server (the database driving D365), there are two ways to generate GUIDs. The first we'll discuss, NEWSEQUENTIALID, is sequentially (not randomly) generated. It is important to note that the range at which the GUIDs start is randomly selected when Windows starts, and each subsequent GUID created will be a value sequentially higher than the previous one. When Windows restarts, the next sequential GUID can start from a lower range, but it is still globally unique. Different machines will also be working with different ranges, which are likewise globally unique.
Using sequential GUIDs as primary keys for DB tables allows for optimal indexing and performance, both for creating and querying records. As such, D365 follows best practice.
The other method of creating a GUID in MSSQL is NEWID. This generates a value that is random, but still unique. However, random GUIDs don't index as well and have more of a performance impact when used as primary keys on DB tables (poor practice, and not how CRM works). When speaking with infrastructure architects and the like, make sure they know that CRM's GUIDs are sequentially generated and will perform similarly to an incremental integer as a primary key.
Another point to take away: CRM generates GUIDs sequentially, not randomly.
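To make the distinction concrete, here is a purely illustrative Python sketch (not how SQL Server implements NEWSEQUENTIALID or NEWID): a "sequential" source picks a random starting range once and then increments, while a random source draws a fresh value every time.

```python
import uuid

class SequentialIdSource:
    """Toy stand-in for sequential GUID generation (illustration only)."""

    def __init__(self):
        # Random starting range, chosen once (the point in the text: Windows
        # picks a new starting range at startup).
        self._next = uuid.uuid4().int

    def new_id(self):
        value = uuid.UUID(int=self._next & ((1 << 128) - 1))
        self._next += 1
        return value

seq = SequentialIdSource()
print([str(seq.new_id()) for _ in range(3)])  # adjacent values: index-friendly
print([str(uuid.uuid4()) for _ in range(3)])  # scattered values: fragments indexes
```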
Chances of Duplicates/Collision
Another concern that comes from a misunderstanding of GUID generation is a fear that a duplicate might occur.
Here is an example of where this fear might show up:
Consider a project in which multiple child Dynamics 365 systems needed to be integrated into one main D365 parent. There were concerns that, even though sequentially generated, these different CRM systems could share the same GUID ranges and cause collisions if we tried to directly map the primary keys from the child systems to the parent. This concern, while understandable, involves a risk that is negligible.
Another point to take away: You’ll never see a GUID duplicated.
The likelihood of getting duplicates is vanishingly small. Here are some stats to show how completely negligible the chances are.
In this project, if we had 1,000 child D365 orgs, each creating one billion records per second for a year (that's right, 1,000,000,000,000 records per second across all orgs, for a total of 31,536,000,000,000,000,000 records), the likelihood of a duplicate collision in the master org integration would be about this:
0.000000000000000009%
So… zero percent.
According to National Geographic, a person worrying about duplicates would be more than 15 trillion times more likely to be struck by lightning than for a duplicate to occur in the above scenario in that same year.
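For the curious, here is roughly how a figure of that size can be reached. This is a deliberately simplified back-of-envelope view: the chance that one additional randomly chosen 128-bit value lands on any of the GUIDs already created in the scenario, not a full collision analysis.

```python
# Simplified model: the chance that one new randomly chosen 128-bit GUID
# matches any of the GUIDs generated by 1,000 orgs creating one billion
# records per second for a year.
records = 1_000 * 1_000_000_000 * 60 * 60 * 24 * 365  # 31,536,000,000,000,000,000
space = 2 ** 128                                       # total possible GUIDs

chance = records / space
print(f"{chance:.20%}")  # roughly 0.000000000000000009%
```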
Another point to take away: Always map the primary key between D365 and other systems.
It is best practice to maintain the GUID from any record when integrating or migrating between any environment. This will help ensure data integrity, and there is no need for concern regarding GUID collision.
Dynamics 365 GUIDs in Summary
- GUIDs will always be unique
- GUIDs are generated sequentially in D365, and are performant as a primary key
- There is no need to worry about duplicates
- Always keep the GUID of a record when moving between orgs or other systems
For more Dynamics 365 tips and tricks – be sure to subscribe to our blog!
Happy Dynamics 365’ing!
A Closer Look at DevOps – Breaking down organizational silos for greater efficiency and reduced time to market

The trend of DevOps has really picked up speed. Today, we'll answer the question "what is DevOps?" and take a look at how businesses are using DevOps methodology to stay ahead of the competition.
What is DevOps?
DevOps is a combination of the terms development and operations. DevOps is an extension of the agile development philosophy to promote faster and more efficient development and deployment of software.
Many businesses are adopting this approach because it helps them stay competitive by bringing new product features (or bug fixes) to market much faster. DevOps methodology advocates for automation and monitoring across steps in the process of software production – from integration, QA testing, deployment to production and infrastructure management. The result is shorter development cycles, more dependable software releases and increased deployment frequency.
What is DevOps? IBM Distinguished Engineer and Solutions Architect Sanjeev Sharma delivers a basic, comprehensive overview of DevOps methodology
Agile, DevOps & Continuous Delivery
Agile software development follows a philosophy combining the ideas of collaboration, adaptive planning and continual improvement to rapidly respond to feedback and evolving requirements. Seeing success with the Agile approach, organizations wanted to release software faster and more frequently, giving birth to continuous delivery and the DevOps culture.
DevOps and continuous delivery are often used interchangeably because of their shared goal to speed up software deployment, but there is a subtle difference between the two. Continuous delivery is focused on automating software delivery processes. DevOps takes this one step further to also break down organizational silos for greater collaboration between the many functional areas that have a hand in this process.
DevOps and the Mainframe
Many business-critical applications rely on mainframes. If you want to efficiently deploy and maintain those applications, you need to make mainframes part of your DevOps workflow.
In our recent interview with Trevor Eddolls, he asserted that DevOps offers the greatest opportunity for organizations to get the most out of their mainframe investment. "For organizations that haven't looked at DevOps and Agile computing, or are unaware of the fact that RESTful protocols work with IMS and CICS, this kind of modernization will bring them the greatest advantages in terms of growth and improved service."
Organizations that do DevOps most effectively understand that, while technologies like Docker are one important component, a DevOps-optimized workflow involves all parts of an organization's infrastructure.
Related: DevOpsify Your Mainframe: Continuous Improvement on Big Iron
Breaking down the mainframe silo
If you have a mainframe, there's a good chance that your mainframe is one of the biggest silos inside your organization. That's because, by default, mainframe data is very disconnected from the rest of your infrastructure. It exists in formats that are difficult to convert and use with modern analytics tools. It takes a long time to offload and is expensive to store for long periods.
Fortunately, things don't have to be this way. You can de-silo your mainframe data by bridging the gap between your mainframe and the rest of your infrastructure.
See how applying DevOps principles can help you unlock the value of mainframe data – read Mainframe Challenge: Unlocking the Value of Legacy Data
How DevOps has changed mainframe careers
The DevOps revolution is meaningfully changing the job descriptions of the mainframe experts organizations hire. The idea that constant collaboration should be a central feature of IT workflows, and that team members should be prepared to coordinate with one another as much as possible, is now deeply embedded in the way most businesses organize their approach to software delivery.
Mainframe programmers are now required to branch out from strictly programming roles to also participate in administrative tasks. It also means that programmers and system administrators who specialize in mainframes need to collaborate more closely with the rest of the IT organization.
This greater collaboration requires a working understanding of other systems – as well as a preparedness to integrate information and resources quickly between mainframes and other types of environments. (In other words, you can’t pitch yourself as solely a mainframe engineer anymore.)
DevOps… and Big Data?
You’ll notice that the description of DevOps and continuous delivery didn’t mention data. And it’s true that, by most conventional definitions, it is not closely linked to Big Data. But new ideas have recently emerged to bring them together.
Integrating Big Data into DevOps
If the goal of DevOps is to make software production and delivery more efficient, then including data specialists within the continuous delivery process can be a big win for organizations working to embrace DevOps. By integrating Big Data, organizations can achieve more effective software update planning, lower error rates, closer alignment between development and production environments and more accurate feedback from production.
Applying these principles to data management
DevOps focuses on software production, and at first glance might not seem to offer much to people working with data. But data specialists have much to learn from the movement.
Upon closer inspection, we notice that the data management process is similar to software production in that both involve multiple teams: a group to set up data storage, another to run the database, a third to work on analytics, plus a security team to keep the data safe and enforce compliance policies.
Traditionally, these different teams have not always collaborated closely. The folks who set up MySQL databases usually don’t know much about using Hadoop, for example. By embracing the core ideas of DevOps, however, organizations can achieve DataOps to make these different teams collaborate more effectively.
Has your organization adopted DevOps methodologies that integrate your mainframe? Have you applied these principles to your data management strategy? See how Syncsort Integrate products can help you break down the silos.
Terminology Check – What are Data Flows?
It's another terminology post! Earlier this week I was having a delightful lunch with Angela Henry, Kevin Feasel, Javier Guillen, and Jason Thomas. We were chatting about various new things. Partway through our conversation, Jason stopped me because he thought I was talking about Power BI Dataflows when I was really talking about Azure Data Factory Data Flows. It was kind of a funny moment, actually, but it did illustrate that we have some overlapping terminology coming into our world.
So, with that inspiration, let’s have a chat about some of the new data flow capabilities in the Microsoft world, shall we?
Azure Data Factory Data Flow
The new Azure Data Factory (ADF) Data Flow capability is analogous to data flows in SSIS: a data flow allows you to build data transformation logic using a graphical interface. A really interesting aspect of ADF Data Flows is that they use Azure Databricks as the runtime engine underneath; however, you don't actually have to know Spark or Databricks in order to use ADF Data Flows. The goal is for it to be a low-code/no-code way to transform data at scale.
Follow Mark Kromer and the ADF team on Twitter to stay up to date on the rollout of the preview.
More info on ADF Data Flow can be found here: https://aka.ms/adfdataflowdocs.
Power BI Dataflows
Power BI Dataflows (yes, this one is branded as one word) are a new type of object in a Power BI Workspace which will allow you to load data into a Common Data Model. Data is loaded via a web-based version of Power Query, which is why this capability is referred to as self-service data prep. The resulting data is stored in Azure Data Lake Storage Gen 2. Once in the Common Data Model in the data lake, it can be reused among various Power BI datasets — allowing the data load, transformations, and cleansing to be done once rather than by numerous PBIX files. This capability was known for a little while during the private preview as Power BI Datapools or as ‘Common Data Service for Analytics’ (CDS-A) — but the final name looks like it’s going to be Power BI Dataflows.
It’s still early so there’s not a lot of info available online yet. James Serra wrote up a nice summary and has a few links on his blog. Also, here’s a diagram that Chris and I included in the recently updated whitepaper Planning a Power BI Enterprise Deployment which shows our initial understanding of the Power BI Dataflows capability:
Note that Pro users can use Power BI Dataflows without requiring Premium. However, my hunch is that this capability will be most appealing for data at scale – i.e., the features that Premium offers with respect to Power BI Dataflows will be pretty compelling, which is why Premium is depicted in the diagram above.
SSIS Data Flow
Data flows have long been a key part of SQL Server Integration Services (SSIS) for data transformations, just like the new capability being added to ADF discussed above. As of Azure Data Factory V2, we can also host and execute SSIS packages in Azure from ADF V2.
Microsoft Flow
Just for completeness, I'll cover one more product that is similarly named. Flow is an Office 365 service for workflow automation between services. It can be used in conjunction with PowerApps and Power BI for different types of workflow automation, letting you do things like route approval requests, send an e-mail alert, or create a task in a project management system.
Now you know there are multiple types of data flows being launched into the world of Microsoft BI (in addition to the good old SSIS data flows we've had forever), so you can cleverly watch out for which one is being bantered about in your techie conversations.
You Might Also Like…
5 techniques to take your data analysis to another level
Implementing a business intelligence suite in your organization is about more than simply collecting additional data—it’s about converting this data into actionable insights. The amount of data an organization can collect today from a variety of sources offers it the ability to see under the hood, understand which processes are working, and help teams prepare for future trends. However, without properly analyzing and comprehending the data you collect, all you have is figures and numbers with no context.
More importantly, there isn’t one right way to analyze data. Depending on your needs and the type of data you collect, the right data analysis methods will shift. This also makes it necessary to understand each type of data, and which methodology can deliver the best results. Even so, there are some common techniques that come included in most data analytics software because they’re effective. These five data analysis methods can help you create more valuable and actionable insights.
Quantitative and Qualitative Data—What’s the Difference?
The first step in choosing the right data analysis technique for your data set begins with understanding what type of data it is—quantitative or qualitative. As the name implies, quantitative data deals with quantities and hard numbers. This data includes sales numbers, marketing data such as click-through rates, payroll data, revenues, and other data that can be counted and measured objectively.
Qualitative data is slightly harder to pin down, as it pertains to aspects of an organization that are more interpretive and subjective. It includes information taken from customer surveys and interviews with employees, and it generally refers to qualities rather than quantities. As such, the analysis methods used are less structured than quantitative techniques.
Measuring Quantitative Data
Quantitative analysis methods rely on the ability to accurately count and interpret data based on hard facts. Our first three methods for upping your analysis game will focus on quantitative data:
1. Regression Analysis
Regression studies are excellent tools when you need to make predictions and forecast future trends. Regressions measure the relationship between a dependent variable (what you want to measure) and an independent variable (the data you use to predict the dependent variable). While you can only have one dependent variable, you can have a nearly limitless number of independent ones. Regressions also help you uncover areas in your operations that can be optimized by highlighting trends and relationships between factors.
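As a quick illustration (with made-up numbers), a simple linear regression in Python might predict monthly sales from ad spend:

```python
import numpy as np

# Hypothetical data: monthly ad spend (independent variable) and sales
# (dependent variable), both in thousands of dollars.
ad_spend = np.array([10, 15, 20, 25, 30, 35], dtype=float)
sales = np.array([110, 135, 160, 178, 205, 230], dtype=float)

# Fit a straight line: sales ~ slope * ad_spend + intercept.
slope, intercept = np.polyfit(ad_spend, sales, deg=1)
print(f"slope={slope:.2f}, intercept={intercept:.2f}")

# Use the fitted line to forecast sales at a new spend level.
print(f"predicted sales at 40k spend: {slope * 40 + intercept:.1f}")
```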
2. Hypothesis Testing
Also known as “T Testing”, this analysis method lets you compare the data you have against hypotheses and assumptions you’ve made about your operations. It also helps you forecast how decisions you could make will affect your organization. T Testing lets you compare two variables to find a correlation and base decisions on the findings. For instance, you may assume that more hours of work are equivalent to higher productivity. Before implementing longer work hours, it’s important to ensure there’s a real connection to avoid an unpopular policy.
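For example (with invented numbers), a two-sample t-test in Python could check whether the longer-hours group really produces more before the policy is rolled out:

```python
from scipy import stats

# Hypothetical daily output for two groups of employees.
standard_hours = [42, 45, 44, 47, 43, 46, 44, 45]
extended_hours = [44, 46, 45, 48, 44, 47, 45, 46]

# Independent two-sample t-test: is the difference in means statistically
# significant, or plausibly just noise?
t_stat, p_value = stats.ttest_ind(standard_hours, extended_hours)
print(f"t={t_stat:.2f}, p={p_value:.3f}")

# A large p-value would suggest the longer hours aren't clearly paying off.
```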
3. Monte Carlo Simulation
As one of the most popular ways to calculate the effect of unpredictable variables on a specific factor, Monte Carlo simulations use probability modeling to help predict risk and uncertainty. To test a hypothesis or scenario, a Monte Carlo simulation uses random numbers and data to stage a variety of possible outcomes for the situation. This is an incredibly useful tool across a variety of fields including project management, finance, engineering, logistics, and more. By testing a variety of possibilities, you can understand how random variables could affect your plans and projects.
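Here is a tiny sketch of the idea, with assumed task-duration ranges: simulate a three-task project many times and look at the spread of total durations.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
simulations = 100_000

# Hypothetical tasks with uncertain durations (days), modeled as triangular
# distributions: (optimistic, most likely, pessimistic).
tasks = [(3, 5, 10), (4, 6, 12), (2, 3, 8)]

# Total project duration for each simulated run.
totals = sum(rng.triangular(lo, mode, hi, size=simulations)
             for lo, mode, hi in tasks)

print(f"mean duration: {totals.mean():.1f} days")
print(f"90th percentile: {np.percentile(totals, 90):.1f} days")
```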
Measuring Qualitative Data
Unlike quantitative data, qualitative information requires moving away from pure statistics and toward more subjective approaches. Yet, you can still extract useful data by employing different data analysis techniques depending on your demands. Our final two techniques focus on qualitative data:
4. Content Analysis
This method helps to understand the overall themes that emerge in qualitative data. Using techniques like color coding specific themes and ideas helps parse textual data to find the most common threads. Content analyses can work well when dealing with data such as user feedback, interview data, open-ended surveys, and more. This can help identify the most important areas to focus on for improvement.
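As a very simple sketch (the feedback text and theme keywords are made up), counting how often coded themes appear in open-ended feedback can surface the dominant threads:

```python
from collections import Counter

# Hypothetical open-ended feedback and a simple keyword-based coding scheme.
feedback = [
    "Checkout was slow and the shipping cost surprised me",
    "Love the product quality, but shipping took too long",
    "Great quality, easy checkout",
]
themes = {
    "checkout": ["checkout"],
    "shipping": ["shipping"],
    "quality": ["quality"],
}

counts = Counter()
for comment in feedback:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            counts[theme] += 1

print(counts.most_common())  # most frequently mentioned themes first
```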
5. Narrative Analysis
This kind of analysis focuses on the way stories and ideas are communicated throughout a company and can help you better understand the organizational culture. This might include interpreting how employees feel about their jobs, how customers perceive an organization, and how operational processes are viewed. It can be useful when contemplating changes to corporate culture or planning new marketing strategies.
There is no gold standard for statistical analysis or right way to do it. The method you choose should always reflect the data you’ve collected, and the type of insights you want to extract. Matching the right data and analysis helps uncover better insights to optimize your organization.
Dreamforce Considerations
The Salesforce people whom I speak with are all heads down and breathing hard in the big push to Dreamforce. In other words, things are normal for this part of the cycle.
Unfortunately, I can’t say things are normal for this time of year because Dreamforce and Oracle OpenWorld are, like Easter, movable. We’ve even taken on the language of Easter with phrases like “Dreamforce is early this year,” or “OpenWorld is pushed back.”
Speaking of OpenWorld, it’s not till October, so we can focus right now just on the insanity of Dreamforce preparations and leave the same story for Oracle till later.
On the Way to Plug-and-Play
The big questions for each show are, “What will the vendor emphasize this time?” and “Will that be a tone set for the industry for the year ahead?” Sometimes the announcements set a tone but not always.
Salesforce’s announcements of its moves into social and analytics were both major stakes in the ground for the whole software community. In contrast, Oracle’s autonomous database announcement was very important but not in the same way, because there are no longer very many database vendors.
So let’s focus on Salesforce. I have a reasonable expectation that it will have a lot to say about integration, industry-specific CRM, security and blockchain. However, a note of caution: Prognostications like this aim to extend paradigms, but vendor announcements, at least the great ones, tend toward discontinuous market disruption.
Salesforce earlier this year bought MuleSoft, an integration provider, and I think integration will be an important theme. We're getting to the point where Salesforce has claimed most of the relevant CRM Magic Quadrants, and before it plateaus it needs to stake a new claim.
Integration is logical because Salesforce needs it to keep its ever-growing bundle of acquisitions working together, but also to continue fulfilling the promise of an evolving software utility.
As markets age, products commoditize, and the ultimate commoditization is integration to the point of utility formation in fact or in practice. I think we’re far down that path in practice, and the coming years will show more and easier integration to the point that some apps will become plug-and-play across vendor lines. This is already well under way within Salesforce’s AppExchange, for example.
Broader Footprints, Deeper Discussions
Many of the same arguments apply to industry-focused CRM. Many organizations in finance and healthcare, to name a couple, still build their own customer-facing apps simply because the available products are so generic that they feel they gain nothing by going with a CRM brand.
However, tuning a CRM suite to support the unique needs of banking and finance or healthcare could open new markets. Certainly, industry CRM is already available from Salesforce, and other vendors also have offerings. So I'm looking for more of a commitment from Salesforce to industry solutions through broadening footprints.
Finally, you can't get enough security these days, and I look for Salesforce to offer something significant here. Understand that Salesforce is already a very secure place for data. Add that it already runs on the Oracle database and uses Oracle Exadata systems, and consider what's going on in the world, and there's a case for a security discussion.
Responsible Social, Chaotic Crypto
Now, I am going out on a limb, but after the year we've had with scandals in social media, I'd also suggest it is important to say that, just because CRM has social media techniques embedded in it, social is not being used in business the way it has been under the worst conditions.
Facebook and others have been taken to task for enabling disturbing amounts of behavior modification on their sites, but so far no one has made similar accusations against CRM vendors. The need for responsible social media techniques is apparent, though, and I think it would be smart to point that out.
Finally, there’s blockchain. Marc Benioff tells a story of wandering into a blockchain presentation at Davos last winter. He didn’t know much about it at the time, but it got the gears turning in his mind. After the dreadful year cryptocurrencies have had — blockchain is an essential part of ensuring their security — perhaps we can separate blockchain, a promising technology for business, from cryptocurrency.
Crypto seems like a boondoggle to me after the year we’ve had so far, and we might be ready to find some real-world applications for blockchain if we separate the two. If there are some applications (and there are), trust Salesforce to come up with some ideas at Dreamforce.
We might as well also add that the Trailhead training system and community will have a prominent place, and there will be many announcements around usability and new training offerings. Trailhead is vital for businesses that roll their own software on the Salesforce platform and for the growing community of developers who base their livelihoods on it.
Last Words
Of course, there will be more than this. Every major product line has a keynote that will afford plenty of opportunity to present new product features, and no doubt there will be customer panels on most of it.
Dreamforce is too big to take in whole, and this has been true for a long time. If this is your first Dreamforce, accept that and focus on attending the presentations that directly impact your business.
Drink water, rest your feet, make new friends. Do things you hadn’t thought of before, like participating in some Trailhead learning programs. All that will ensure you bring some useful ideas back to work.
The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.