
Tag Archives: Insights

Cisco is bringing individual and team insights to Webex video calls

March 31, 2021   Big Data



Starting this summer, Cisco’s Webex will begin serving up insights for video calls to a select group of users, covering individuals, teams, and organizations. Examples include engagement insights, like how often you had your video on or showed up on time, and the people or teams within an organization that you speak with most often.

The goal, Cisco VP Jeetu Patel told VentureBeat in a phone interview, is to make video calls better for people living in the hybrid world between in-person meetings in the office and virtual meetings at home. The tricky part, he said, is considering what information is good for an individual to know while not giving people the impression that Webex is, for example, flagging employees who are routinely late to meetings to managers.

“Let’s say you did 12 meetings today, and in six of those meetings with four people or less, you actually spoke for 90% of the time. That would be a really bad thing to give your boss, but a really good thing for you to have so you can say, ‘Oh, I should probably do a better job listening,’” he said. “The privacy on that front is not at the organizational level. It’s at the individual level. So when we provide insights like that to an individual, the individual owns the data, not the organization, because we don’t believe that without your explicit permission, you’d want to have your boss see that.”
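Patel’s example reduces to a simple ratio: each participant’s share of total talk time in a meeting. Below is a minimal sketch of that calculation in Python, using entirely hypothetical transcript segments; the data layout and the “small meeting” threshold are illustrative assumptions, not Webex’s actual data model.

```python
from collections import defaultdict

# Hypothetical transcript segments: (meeting_id, speaker, seconds_spoken).
# The layout is an assumption for illustration, not Webex's data model.
segments = [
    ("standup", "you", 540), ("standup", "ana", 30), ("standup", "raj", 30),
    ("design-review", "you", 300), ("design-review", "ana", 900),
]

talk_time = defaultdict(lambda: defaultdict(float))
for meeting, speaker, seconds in segments:
    talk_time[meeting][speaker] += seconds

for meeting, speakers in talk_time.items():
    total = sum(speakers.values())
    share = speakers.get("you", 0.0) / total
    # Flag only small meetings where one person dominated, as in Patel's example.
    nudge = "  <- consider listening more" if len(speakers) <= 4 and share >= 0.9 else ""
    print(f"{meeting}: you spoke {share:.0%} of the time{nudge}")
```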

Webex has introduced a series of new features in recent months, some powered by artificial intelligence, to change how people share information in video calls. Toward this end, Patel said, “We’ve probably invested about a billion dollars or so in the past two years in AI.”

[Image: Individual insights]

Gesture recognition means that people in video calls can now raise their hand or give a thumbs up or thumbs down to ask to speak or register feedback. Another AI-powered feature on the way will crop the faces of people who attend in-person meetings for the person who’s working from home or remotely.

“Even though there are three people sitting in a conference room, we’ll actually break the stream into three separate boxes and show it to you, and our hardware will actually do that,” he said.

Patel has overseen the acquisition of three companies since joining Cisco last summer, after serving as chief product officer at Box. Last month, Cisco closed its acquisition of IMImobile for $730 million, in part to beef up its AI capabilities. Last summer, Cisco announced plans to acquire BabbleLabs, an AI startup focused on filtering audio so that the sound of someone doing dishes nearby, a lawnmower, or other loud background noise can be reduced or eliminated. And earlier this year, Cisco acquired Slido, a startup that makes engagement features for video calls like word clouds or upvoting questions. Such features can give a meeting the structure of a town hall, with transparency around employees’ top questions since everyone can see the questions being posted.

“Engagement should not be measured based on having a judgment on someone saying, ‘I’m judging that you look sad, and therefore I’m going to do certain things’ … at that point in time, in my mind, you could cross a boundary where there’s more bad that can come out of that than good,” he said.

In 2019, Cisco acquired Voicea to power speech-to-text transcription of meetings. Closed captioning and live translation are also available in Webex calls.

Deciding where to draw the line on which AI-powered features or insights to introduce in video calls can be a nuanced challenge. Earlier this year, Microsoft Research published a study of AffectiveSpotlight, an AI system for recognizing confusion, engagement, or head nods in meetings. Taken in the aggregate, picking up cues from the audience could be really helpful, particularly for large organizations. But if affective AI for video calls led to critiques of how often a person smiles or shows certain forms of expression, it could be considered invasive, counterproductive, or biased against certain groups of people.

Video analysis of expression today can have major shortcomings. A group of journalists in Germany recently demonstrated that placing a bookshelf in the background or putting on glasses can change affective AI evaluations of a person in a video.

It shouldn’t matter whether a person is an extrovert or prefers not to talk in group settings, as long as they fulfill their job duties. And some people talk a lot but have little to say, while others talk less often but deliver sharp insights or sage advice. It all depends on the team, role, and scenario.

“I’d rather you give explicit permission than something you pick up because one, it’s bad if you misread [certain stats]. And two, there’s a fine line between ‘This is super productive’ and ‘We can’t do this because it violates my privacy or it’s just outright creepy,’” Patel said.

Cisco plans to roll out Webex People Insights globally over the span of the next year starting with select users in the U.S. this summer, announcing the news today as part of Cisco Live. In other Cisco Live news, on Tuesday Cisco announced plans to combine networking, security, and IT infrastructure offerings and work with the Duo authentication platform it acquired in 2018.


Analyzing Data from Multiple Sources: The Key to More Powerful Insights

March 13, 2021   Sisense

Simple datasets just won’t cut it anymore. To get truly powerful insights, you need to pull in data from multiple sources. The more complex and diverse your datasets, the more surprising and potent the insights they’ll produce. Additional data sources increase your chances to inform actions, fueling top-line and bottom-line growth.

How does it work in real life? Read on to find out how Measuremen optimizes workspace utilization, Skullcandy minimizes product returns, and Air Canada improves airline safety.


Measuremen: Optimizing facilities use with data from numerous sources

When Measuremen CEO Vincent le Noble began the company in 2005, he wanted to help his clients make the best use of their workspaces. He leaned into his facilities-management experience, taking careful note of how organizations used desks, chairs, meeting rooms, and amenities. At the start, Measuremen observed and recorded utilization data and began to log the kinds of activities each space enabled (such as collaboration, individual work, and high-concentration work). 

Since those early days, Measuremen has broadened its data sources. A mobile app collects granular information about how each workspace component or meeting space is used. The app not only documents utilization data but also lets users add subjective inputs such as personal preferences. According to Vincent, it allows Measuremen to ask customers what brings users to the office, how they foresee the future of their departments, whether those departments will grow or shrink, and what kinds of activities they will be doing.

In the last two years, Measuremen has added location-based sensors, which record data on a more permanent and real-time basis. The result of the various inputs — self-reporting, app-based logs, static sensors, and other data sources — allows Measuremen to assess much more than unused desk spaces or underutilized meeting rooms. It can give businesses a better understanding of how employees experience the workplace and let companies tailor their resources to better suit employee needs. As a result, companies can proactively address more challenging problems like employee productivity, retention, and work-life quality. And Measuremen is hardly finished.

“We’ve been deepening our analysis for the last four years now with Sisense,” said Vincent. “All the data streams combined … give us the insights and decision-making power to help users to improve workplaces and work life for employees. And that journey is still going on.” 

Skullcandy: Listening to the market

The datasets you collect, the way you combine them, and the insights that can come from them depend on your industry, the realities of your business, and your imagination. Personal audio brand Skullcandy had a huge dataset and a huge challenge: using analytics to explore returns and reviews data to inform future product decisions.

Machine learning and predictive modeling allowed the company to use complex historical warranty claim and cost information, previous and new product attributes, and forecasting data to create a predictive data model for future warranty costs. The information not only helps Skullcandy with resource allocation for future warranty fulfillment, but can also drive design improvements. 
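As a rough illustration of the kind of predictive model described, the sketch below fits a regression on made-up product attributes and historical per-unit warranty costs. The features, figures, and model choice are assumptions for illustration only, not Skullcandy’s actual data or methodology.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Made-up historical products: [unit_price, prior_return_rate, is_wireless].
X_train = np.array([
    [29.99, 0.04, 0],
    [49.99, 0.07, 1],
    [79.99, 0.05, 1],
    [19.99, 0.09, 0],
    [99.99, 0.03, 1],
])
# Observed warranty cost per unit sold for each product (also made up).
y_train = np.array([1.10, 2.40, 1.90, 2.80, 1.20])

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Forecast the warranty cost per unit for a planned product's attributes,
# which can feed both resource allocation and design-review conversations.
new_product = np.array([[59.99, 0.06, 1]])
print(f"Predicted warranty cost per unit: ${model.predict(new_product)[0]:.2f}")
```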

Skullcandy’s methodology included delving into sentiment analysis gleaned from online reviews and other customer feedback, which gave rise to exciting revelations. For example, when customer comments focus on a particular defect, Skullcandy can pinpoint the problem, examine the effect on future warranty claims, and engage its engineers to make design modifications that help head off these returns. Skullcandy is also exploring ways to use disparate data streams to inform decision-making that improves customer relationships, customer education, and e-commerce ecosystems.

Air Canada: Taking data to new heights

The value of large, varied data sources is becoming obvious in air travel, too. Air Canada uses Sisense to collect and translate a wide variety of safety, quality, environmental, and security data. Safety Analytics & Innovation Manager Shaul Shalev said, “We collect hundreds of gigs of data … but unless you have a clear method of slicing and dicing that data and presenting it to users, it’s not really useful. With a tool like Sisense, it changes the game altogether.” 

The ability to collect data and make it useful allows Air Canada to identify important insights and extract actionable intel so frontline employees can respond to it in real time. In a more forward-looking posture, AI can employ the data to predict component failure, so Air Canada can replace parts before they cease functioning.

Seek out varied datasets to transform your business

The power of vast, varied data sources isn’t constrained to any one segment of the economy.

As you can see, leveraging diverse datasets to generate game-changing insights spans industries. The takeaway: cast the widest net you can and leverage whatever data you can get your hands on. The benefits you’ll reap from bringing that data together can drive revolutionary change at your business and help it keep evolving in a tumultuous business world.

Adam Luba is an Analytics Engineer at Sisense who boasts almost five years in the data and analytics space. He’s passionate about empowering data-driven business decisions and loves working with data across its full life cycle.


DNV Illuminates the Utilities Industry with Insights and Data

February 20, 2021   Sisense

At DNV, our mission is to safeguard life, property, and the environment. We do this throughout a variety of industries, and as the head of the electric grid ecosystem and product center at DNV, my focus is on helping optimize risk assessment, remediation, and overall reliability of critical assets at our electric utility customers. In order to do this, my team uses data to identify problem areas and potential issues for our customers (ideally before they happen).

Like many organizations, our utility company clients are sitting on copious amounts of data from a variety of sources, including mobile applications and streaming data captured by grid-edge technologies. However, without an advanced analytics solution that could surface relevant trends and information, this data was an underutilized asset.

That led us to build Cascade Insight, an enterprise solution that infuses data and insights into a comprehensive platform that utility companies use to optimize their operations, reduce risk, and improve the reliability of their electrical equipment. This process necessitated us getting a handle on the data we were already collecting, making sense of the constant stream of new information our clients were pulling in, and presenting it all in an easy-to-understand interface. 

Unique challenges and opportunities in the utilities industry

As opposed to typical for-profit enterprises, utility companies were created to serve the public good. One of their main focus areas is driving operational efficiencies to maintain affordable rates and deliver better service for customers.

Utilities employ skilled professionals as knowledge workers, but the skills needed to build simple, visual ways to analyze their data are hard to find in abundance. DNV’s effort to create an analytics consortium was driven by the opportunity to bring knowledge workers from different utilities together to collaborate on challenges with their peers. In much the same way that researchers are able to collaborate on projects across time and space, we wanted a consortium of utility providers to be able to pool efforts on major problems affecting the industry beyond their individual companies.

Capturing and transforming data to transform the utilities industry

The key to making this collaboration possible would be having an analytic application that could pull in data from across a number of sources. Each utility company generates a tremendous amount of data from a variety of touch points, making the first challenge one of volume and ingestion: The destination system has to be able to handle data from disparate sources and in a range of formats, especially from mobile and field sensors/equipment. Transforming this data into something usable becomes more complicated when you realize that utilities often have different data naming conventions in addition to capturing different pieces of data.

This presented the first challenge for our product team in building Cascade Insight: What data is most important to capture? Being IT professionals and software vendors, my team initially focused a lot on the technology. However, defining the data requirements, that is, understanding what you need to measure in order to provide analytical insights, proved just as important. My team at DNV struggled with that in spite of having all this data and domain knowledge. Fortunately, the team at Sisense was able to come out and help us with this process.


The right data, the right platform, the right partner

The experts at Sisense live and breathe data analytics — and the Sisense platform is designed from the ground up to connect with disparate datasets and infuse insight from those sources anywhere users need them. Working with the Sisense team, we were able to nail down the data requirements that were needed to create the high-level insights, such as the overall health of a transformer, that would improve the lives of our utility company end users. Armed with this understanding, we were able to use Sisense to develop the Cascade Insight platform in just six months, a rapid timeline for building an enterprise product from scratch!

As a seasoned IT professional, I knew that our platform had to consume clean data if it was going to deliver accurate insights. The old adage “garbage in, garbage out” rings very true when it comes to data analysis, and we had to do a number of things to verify that the data entering the Cascade Insight platform was clean. It started with implementing data validation rules, including basics like only accepting numbers for numerical inputs, etc. Data quality starts when you first capture information; keeping bad data from getting in your calculations in the first place is key. 
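As a rough illustration of such validation rules, here is a minimal sketch in Python. The field names, ranges, and record shape are assumptions for illustration, not DNV’s actual schema.

```python
def validate_reading(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []

    # Numeric fields must actually be numbers (the "only accept numbers" rule).
    for field in ("load_kva", "oil_temp_c"):
        if not isinstance(record.get(field), (int, float)):
            errors.append(f"{field} must be numeric, got {record.get(field)!r}")

    # Range checks keep physically implausible values out of downstream calculations.
    temp = record.get("oil_temp_c")
    if isinstance(temp, (int, float)) and not -60 <= temp <= 200:
        errors.append("oil_temp_c outside the plausible range of -60 to 200")

    return errors


print(validate_reading({"load_kva": "n/a", "oil_temp_c": 72.5}))
# ["load_kva must be numeric, got 'n/a'"]
```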

We also had to ensure that the information captured across our utilities was consistent. For example, when considering data captured on or by a transformer, a couple of the obvious data points are make and model. However, a utility may capture another 40-plus items about that transformer, which may be documented under a different naming convention, or not captured at all, by other utilities. So, when building out Cascade Insight, we chose 16 key dashboards that our clients would benefit from and ensured that the data captured from each client was clean, consistent, and ready to be ingested into our analytics application.
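One common way to reconcile differing naming conventions is a per-utility mapping into a single canonical schema before ingestion. The sketch below illustrates the idea; the field names and utilities are invented for illustration and are not DNV’s actual data model.

```python
# Each utility's raw field names mapped to one canonical schema (all names invented).
FIELD_MAPS = {
    "utility_a": {"Mfr": "make", "ModelNo": "model", "kVA": "rated_kva"},
    "utility_b": {"manufacturer": "make", "model_number": "model", "capacity_kva": "rated_kva"},
}

def normalize(utility: str, raw: dict) -> dict:
    """Rename a utility's raw transformer record into the shared canonical schema."""
    mapping = FIELD_MAPS[utility]
    # Keep only the mapped fields so every utility feeds the same dashboards.
    return {canonical: raw[source] for source, canonical in mapping.items() if source in raw}

print(normalize("utility_a", {"Mfr": "ABB", "ModelNo": "TX-100", "kVA": 500}))
print(normalize("utility_b", {"manufacturer": "Siemens", "model_number": "S-9", "capacity_kva": 750}))
```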

Driving immediate impact with analytics

Cascade Insight had an immediate positive impact for our utility company clients. Professionals are now able to quickly identify equipment and processes that are at risk or are performing suboptimally, drilling down to specific components on a map that are having problems. From there, they can proactively provide maintenance to those components and help keep the lights on for millions of customers. But this is just the starting point for Cascade Insight.

By creating this consortium of utility providers, we hope to encourage ongoing communication and collaboration among the utilities. As the coalition collaborates on best practices and discovers new challenges, it can begin defining future dashboards and setting standards for the data that must be gathered to drive meaningful insights for its members and users like them. This collaborative work style, uniting a number of utilities, is a practice that can be replicated across other industries, in particular health care. Bringing together really talented professionals and providing them data in an accessible application, instead of jumbled numbers in a spreadsheet, helps drive positive, high-value results.

Whatever your industry and your company, you’ve got some data you can use to drive revenue. The key is to understand that data, learn to capture and prepare it properly, then present it to your users in a way that is meaningful to them. The right BI platform can go a long way toward making that process possible, and the right partner (like Sisense) can give you the support and expertise you need to make your dream a reality. To hear more about DNV’s analytics journey, check out our conversation here! 

Ron Howard is head of electric grid ecosystem & product center for DNV. He has over 35 years of experience in the utilities industry with a focus on IT and software products. Having worked for both software vendors and utilities, he has devised and delivered many successful software products and has spoken at numerous industry events and conferences.

Tags: embedded analytics | enterprise BI


From Data to Decisions with Actionable Insights

January 30, 2021   Sisense

There’s no industry that couldn’t be improved by actionable insights, and over time, every industry will be. For example, one fintech problem requiring actionable insights today is credit risk assessment. A well-structured fintech algorithm could easily use machine learning (ML) to advise a human agent to approve or reject a business loan application. 

That advice is a quintessential actionable insight: The algorithm models the applicant’s probability of success by training on a mountain of historical data and market factors. The resulting insight is the algorithm’s classification of the applicant as an acceptable or unacceptable risk. The action, in this case, is the human agent’s decision to grant or deny the loan. Businesses of all kinds today are dependent on such actionable insights, ideally infused into workflows, driving better business outcomes without users having to leave their primary tasks to look for answers in the data.
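As a rough sketch of such an advisory classifier, the example below trains a logistic regression on synthetic loan applications and turns its probability into a recommendation for the human agent. The features, threshold, and data are illustrative assumptions, not a production credit risk model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic historical applications: [annual_revenue_k, years_in_business, debt_ratio].
X = np.array([
    [120, 1, 0.9], [800, 6, 0.3], [450, 3, 0.5], [60, 0.5, 1.2],
    [1500, 10, 0.2], [300, 2, 0.8], [950, 7, 0.4], [200, 1, 1.0],
])
y = np.array([0, 1, 1, 0, 1, 0, 1, 0])  # 1 = repaid, 0 = defaulted (made up)

model = LogisticRegression(max_iter=1000).fit(X, y)

applicant = np.array([[500, 4, 0.45]])
p_repay = model.predict_proba(applicant)[0, 1]

# The "actionable insight": a recommendation the human agent can accept or override.
advice = "approve" if p_repay >= 0.7 else "reject"
print(f"Estimated repayment probability: {p_repay:.2f} -> recommend {advice}")
```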


Actionable insights: The core of business intelligence

In every industry, actionable insights arising from analytics and AI are no longer a luxury but a necessity for staying competitive. E-commerce platforms, for example, require sophisticated fraud detection systems to advise their human reps on courses of action. Actionable insights have a special connotation in the field of data analytics: An algorithm evaluates a vast data store and effectively advises a human agent on the best action to take. An implicit assumption about machine learning is that algorithms will identify patterns or outcomes, in time-series data for example, that human beings could not find in practical time. There are countless opportunities for turning data into actionable insights in industries like:

  • Finance: Optimal portfolio trading
  • Healthcare: Diagnosis, intervention, administration
  • Advertising: Market analytics and placement
  • Pharma: Protein folding and drug synthesis
  • Retail: Demographics and style trending
  • Education: Assessment and teaching methods

The race to put actionable insights into the hands of users is on. When one player in a field turns data into actionable insights, all competitors must follow suit to keep pace. Innovative use of AI and ML methods in all fields will only accelerate this arms race, putting more powerful actionable insights in the hands of workers of all kinds.

Actionable insights example: Broadridge

Undoubtedly money is the great motivator, and AI solutions must improve profitability. So let’s see how ML models in asset management and optimal portfolio trading produce actionable insights. Broadridge Asset Management Solutions is a $4.5 billion-revenue asset management company whose clients generate unique data stores in diverse containers. The analytics challenge here is integrating multiple data streams to feed an analytics platform and then generating actionable insights that are valuable across the portfolio spectrum.

At the fundamental level, preparing data for analysis is often called wrangling. To business intelligence execs, the top-level view is the integration of everything from containers to ML technologies and visualizations. Analytics must cope with both structured and unstructured data to achieve optimal results. As we’ll see, very few ML and AI analytics platforms today embody the data science and engineering sophistication to accomplish such a feat with an outcome of actionable insights whose profits justify their cost.

A particularly profitable outcome at Broadridge is its portfolio managers’ use of ML analytics to forecast the best portfolio positions. Broadridge accountants are also leveraging analytics to generate actionable insights, including forecasts of accounts receivable and payable outcomes that lead to a sharper awareness of overall assets.

Infusing actionable insights into workflows — the future of business

Turning data into actionable insights is where BI and data analytics platforms like Sisense shine. Modeling complex and multidimensional data in innovative ways and presenting those insights within workflows will change the future of every industry. For example, these systems will be able to evaluate millions of model configurations and dynamically correct them based on live event streaming data in order to continuously update business-facing users with the best possible insights. The result will be more businesses where all or most decisions are made on underlying data, not gut feel or guesswork, leading to increased conversions, lower churn, and higher revenues for those who embrace actionable insights. Those who don’t won’t have much of a future to speak of.


Eitan Sofer is a seasoned Sisenser, having spent the last 13 years building and shaping our core analytics product, focusing on user experience and platform engineering. Today, he runs the Embedded Analytics product line which powers thousands of customers and businesses, making them insights-driven. Eitan is also an avid music fan and surfer.

Tags: actionable analytics | Machine Learning


Visualize 5 Cool Insights on Holiday Tree Trends Over Time

December 31, 2020   TIBCO Spotfire

Did you know Thomas Edison’s assistants proposed putting electric lights on Christmas trees? There’s a long and rich history surrounding holiday trees, in America and around the world. According to the History Channel, symbolic traditions involving evergreen trees in winter began in ancient Egypt and Rome and continue to take on new meaning today. 

New Holiday Traditions: Annual Analytics

Here at TIBCO, we’ve started our own holiday tradition involving the classic festive trees: using our data visualization software to understand trends in holiday tree data. Last year, we shared our analysis and “treemap” visualization (quite literally a treemap of trees) via TIBCO Spotfire®. This year, we dived even deeper into the data, using the new Spotfire Mods functionality to design custom apps for greater interactivity. Here’s what we found:

  • Top Tree Producing States: All 50 states contribute to the holiday tree industry, but our analysis shows the greatest production occurs in Oregon, North Carolina, Michigan, and Pennsylvania. Also interesting is that while Oregon and North Carolina are top producers overall, states like Ohio and Michigan definitely over-index for total tree producing counties as a percentage of their total land area. 
[Figure: Immersive, interactive exploration of a bubble “tree-map” visualization Mod alongside county-level Spotfire geoanalytics. Source: USDA census data]
  • Artificial vs. Real Tree Sales: As you can see below, artificial tree sales have been on the rise over the last decade, with 162 percent growth between 2004 and 2018. Artificial trees are taking over: 81 percent of the trees on display in 2019, whether in storefronts, businesses, or homes, were of the artificial variety. But what does that mean for the global economy, given that China produces 80 percent of the world’s artificial trees and that artificial trees cannot be recycled like real ones?
[Figure: Tree sales volume over time in this area chart visualization Mod in Spotfire. Source: National Christmas Tree Association]
  • Rising Average Price of Real Trees: According to an article in the Hustle, “During the recession in 2008, ailing farmers planted too few trees. As a result, prices have been much higher since 2016.” The article also cites the National Christmas Tree Association as stating that the average retail price for a real tree in 2019 was $75. Obviously, this is a huge market, but one that continues to shift with economic and social changes—which makes us wonder just how different our analysis next year will look.
  • Consumer Demand Lower in 2019: In the area chart visualization above, we see that sales for natural trees still account for a larger share of the market. However, the artificial tree category set new high marks for sales in each year progressively from 2016 to 2018. Why could this be? One hypothesis might be that as Baby Boomers retire as “empty-nesters” and downsize their homes, they are buying fewer trees, but let us know your thoughts on this surprising find. 
  • The More the Merrier? Multiple Trees: According to a survey by the American Christmas Tree Association, the number of households in the United States that display more than one Christmas tree has grown by 10 percent from 2014 to 2019. In 2019, approximately 16 percent of American households display multiple trees. But will this trend continue or, as with the overall tree sales, will the number of trees per household decrease in the coming years?


A New Tradition: Immerse Yourself in Custom Analytics Applications

But this is just one festive story you could tell around data trends. What about shopping trends this year? Will there be an increase in small business online sales? What will be the top gifted items in 2020?
You tell us! Join our tradition, and read our whitepaper to learn how the immersive qualities of Hyperconverged Analytics will create new value for your business. For a closer look at all of “What’s New in Spotfire®,” including visualization Mods, watch our 20-minute intro webinar on demand.


Shannon Peifer is a Marketing Content Specialist at TIBCO Software in Denver, CO. She graduated from the University of Texas at Austin in 2018 with a double major in marketing and English honors, and loves writing engaging content related to technology. Shannon grew up overseas, and loves to explore new places. When she’s not writing, you can find her swimming laps at the pool, gulping down iced lattes at local coffee shops, or scouring the shelves at the bookstore.


Sisense Q4 2020: Analytics for Every User With AI-Powered Insights

December 19, 2020   Sisense

Sisense News is your home for corporate announcements, new Sisense features, product innovation, and everything we roll out to empower our users to get the most out of their data.

Every company is becoming a data company; there’s no getting around it. Savvy organizations know they don’t need to fear data and analytics — they see better insights as the pathway to a brighter future.

Yet a recent Gartner survey shows that 50% of organizations lack sufficient data literacy skills to achieve business value. With our Q4 release, Sisense is bridging the skillset gap to help organizations unlock business potential faster with AI-powered explanations, an enhanced live data experience, and a robust new reporting service.

Smarter insights with AI-powered data explanations

Looking at a chart, it can be difficult to uncover actionable insights. Data doesn’t yell, “Your sales increased by 10% last month because an influx of men aged 18-24 in Seattle bought more beard-trimming kits without any promotion.”

Often, to find those types of insights, you slice, dice, and filter. However, this requires a level of familiarity with the data itself. Most of us don’t have the time to dig deep into the data to get rich intelligence that we can use in our daily and strategic decisions. 

Now you don’t have to! Sisense fuses your business expertise with deep insights through the power of AI-powered Explanations. Sisense Explanations provides easy, deep data exploration for every user, across the entire data journey. 

To start, Sisense does the heavy lifting by highlighting anomalies and points of interest in your data for further exploration. Or a user can simply click on any point to discover the driving force behind the data. Sisense leverages all of the dimensions in your data and runs combinations to determine the exact impact that each variable has on a data point.

As another example, if your sales went up by 10%, Sisense might explain that the increase was attributable to both a specific product category and a certain age group of customer with a visual display of the breakdown. Within seconds, any user can understand the data without the need for specific technical expertise and get the deep intelligence to uplevel every decision.
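Under the hood, an explanation like that amounts to attributing a metric’s change to the values of each dimension. The sketch below shows one naive period-over-period contribution analysis on made-up sales rows; it is not Sisense’s actual algorithm, and the column names and figures are assumptions for illustration.

```python
import pandas as pd

# Made-up sales rows for two months; the columns are illustrative assumptions.
sales = pd.DataFrame({
    "month":    ["May"] * 4 + ["Jun"] * 4,
    "category": ["grooming", "grooming", "audio", "audio"] * 2,
    "age_band": ["18-24", "25-34", "18-24", "25-34"] * 2,
    "revenue":  [100, 80, 120, 100, 150, 85, 125, 100],
})

prev, curr = (sales[sales.month == m] for m in ("May", "Jun"))
total_change = curr.revenue.sum() - prev.revenue.sum()
print(f"Total revenue change: {total_change:+}")

# For each dimension, find the value that contributed most to the change.
for dim in ("category", "age_band"):
    delta = curr.groupby(dim).revenue.sum() - prev.groupby(dim).revenue.sum()
    top = delta.abs().idxmax()
    print(f"Biggest driver by {dim}: {top} ({delta[top]:+}, "
          f"{delta[top] / total_change:.0%} of the change)")
```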

Fast insights: Enhanced live data experience

In a global economy, your business doesn’t stop at 5 p.m., so why should your data-driven decisions be limited to business hours? These kinds of round-the-clock decisions require the most up-to-date information, which can only be surfaced with the aid of real-time data. Our goal at Sisense is to continually make sure your most critical real-time decisions can be made with ease.

For every live widget, Sisense translates the widget’s request into a SQL query against your data source. This quarter, we released a new translator service to reduce the time and cost of these queries on your Snowflake live connection, with other live connections in beta. Reduce data query time by up to 70% and give your live widgets up to a 15% performance boost!

Code-savvy users can crack open the optimized SQL code that drives live visualizations to easily understand and validate the logic. It’s also simple to track, monitor, and optimize activity on your live data sources to better understand the context of your live queries.

Deeper understanding with next-level reporting

As you bring more robust intelligence to everyone at your company, we continue to make it easy to effectively communicate repeatable and ad hoc reports with a new reporting engine. Now, reporting is better than ever with the ability to generate PDFs and images at scale to share reports with everyone in the organization.

Create robust reports that feature your customizations, such as Sisense BloX, to ensure every insight is visible and can be understood. The technology is packaged in a new microservice with dedicated queues that can be managed and scaled so your reports are always delivered on time, at the volume you need.

Toward a data-powered future

Think of all the data your company is sitting on. Now think of all the hoops you need to jump through to make sense of any of it. There are countless decisions you and your colleagues make every day that could be enhanced with the right insights, drawn from your complex array of datasets. Bridging the skills gaps to allow your organization to successfully leverage its datasets is challenging, but with the enhanced functionality of the latest Sisense release, you’re one step closer to becoming a truly data-driven organization.

Mindi Grissom is Director of Product Marketing at Sisense. She has over 5 years of experience in the technology industry, helping thousands of organizations transform their business with data and analytics.


New LinkedIn Sales Insights Coming in 2021

December 19, 2020   Microsoft Dynamics CRM

With so much going on in the news these days (hello COVID vaccines!), you may have missed LinkedIn’s recent announcement on a pretty major upgrade to its portfolio. Come this February, the online platform is launching LinkedIn Sales Insights (LSI) to help users better identify opportunities and gain a more comprehensive understanding of the market.

In her LinkedIn Sales Blog post earlier this month, LinkedIn Vice President of Product Management Lindsey Edwards said LSI is poised to deliver “clear visibility into the size and fast-growing nature of specific departments, functions, and accounts” and give users enhanced capabilities in accurately planning their sales strategies.

Data is the foundation of every great sales process. And LinkedIn, quite frankly, is loaded with a vast number of mostly up-to-date data points on hundreds of millions of global members that, in my opinion, has been underleveraged up until now.

DATA IS THE FOUNDATION OF EVERY SUCCESSFUL SALES PROCESS

By leveraging LinkedIn Sales Insights, Sales Ops leaders, high-performance sales teams, and Lead Sales Dogs benefit from clean, reliable data and will be better able to:

  • Segment accounts more effectively
  • Leverage the data for smarter sales planning
  • Identify whitespace within a target market
  • Gain a better understanding of where to focus relationship-building efforts
  • Feed CRM with real-time data from LinkedIn

The LinkedIn Sales Insights platform offers truly enhanced insight into the size and growth of targeted departments, functions, and accounts, giving users a fast track to a more streamlined sales strategy and, ultimately, a boost in revenue growth.

With the changing roles in this new business and employment paradigm, it is more important than ever to have access to the real-time, accurate data that supports your sales initiatives. I applaud LinkedIn for stepping up to the plate and taking a good hard swing at better sales books and a stronger data game.

While its true functionality is yet to be seen, I look forward to LinkedIn’s LSI launch in just a few weeks and the profound impact it may have on good, clean data and enhanced sales processes.

Click here to watch a short LSI video on LinkedIn.

By Christopher Smith, Founder & CEO of Empellor CRM.


Harnessing Streaming Data: Insights at the Speed of Life

October 28, 2020   Sisense

We live in a world of data: There’s more of it than ever before, in a ceaselessly expanding array of forms and locations. Dealing with Data is your window into the ways data teams are tackling the challenges of this new world to help their companies and their customers thrive.

Streaming data analytics is expected to grow into a $38.6 billion market by 2025. The world is moving faster than ever, and companies processing large amounts of rapidly changing or growing data need to evolve to keep up — especially with the growth of Internet of Things (IoT) devices all around us. Real-time insights provide businesses with unique advantages that are rapidly becoming must-haves for staying competitive in a variety of markets. Let’s look at a few ways that different industries take advantage of streaming data.

How industries can benefit from streaming data

  • Sales, marketing, ad tech: Making faster marketing decisions, optimizing ad spend
  • Security, fraud detection: Reducing the time needed to detect and respond to malicious threats
  • Manufacturing, supply chain management: Increasing inventory accuracy, avoiding production line downtime, monitoring current IoT and machine-to-machine data
  • Energy, utilities: Analyzing IoT data to alert and address equipment issues, get ahead of potential issues, help reduce maintenance costs
  • Finance, fintech: Tracking customer behavior, analyzing account activities, responding to fraud and customer needs immediately and proactively while they’re engaged. 
  • Automotive: Monitoring connected, autonomous cars in real time to optimize routes, avoid traffic, and diagnose mechanical issues

As real-time analytics and machine learning stream processing are growing rapidly, they introduce a new set of technological and conceptual challenges. In this piece, we’ll dig into those challenges and how Upsolver and Sisense are helping tackle them.

Technological challenges to handling streaming data

Stateful transformations

One of the main challenges when dealing with streaming data comes from performing stateful transformations for individual events. Unlike a batch processing job that runs within an isolated batch with clear start and end times, a stream processing job runs continuously on each event separately. Operations like API calls, joins, and aggregations that used to run every few minutes/hours now need to run many times per second. Dealing with this challenge requires caching the relevant context on the processing instances (state management) using techniques like sliding time windows.
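Here is a minimal sketch of that state-management idea: a per-key sliding time window cached in memory while events are processed one at a time. The event fields and the 60-second window are illustrative assumptions.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60  # illustrative sliding-window length

# Cached state per key: a deque of (timestamp, amount) pairs inside the window.
state = defaultdict(deque)

def process_event(user: str, ts: float, amount: float) -> float:
    """Handle one event and return the user's rolling total for the last 60 seconds."""
    window = state[user]
    window.append((ts, amount))
    # Evict events that have slid out of the window (the state-management step).
    while window and window[0][0] <= ts - WINDOW_SECONDS:
        window.popleft()
    return sum(value for _, value in window)

events = [("alice", 0, 10.0), ("alice", 20, 5.0), ("bob", 30, 7.0), ("alice", 70, 2.0)]
for user, ts, amount in events:
    print(user, ts, process_event(user, ts, amount))
# At t=70 alice's event from t=0 has expired, so her rolling total is 5.0 + 2.0 = 7.0.
```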

Optimizing object storage

Another goal for teams dealing with streaming data is managing and optimizing a file system on object storage. Streaming data tends to be very high-volume and therefore expensive to store. However, cloud object storage like Amazon S3 is a very cost-effective solution (starting at $23 per month per terabyte for hot storage at the time of writing) compared to traditional databases and Kafka (which creates three replicas by default on local storage — that’s a lot of data being stored!).

The challenge with object storage is the complexity of optimizing its file system by combining the right file format, compression, and size. For example, small files are a performance anti-pattern for object storage (50X impact), but using streaming data forces us to create such files. 
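One common mitigation is compaction: buffer many small event batches and flush them as one larger, compressed columnar file. The sketch below illustrates the idea with pandas and Parquet; the batch threshold, file path, and schema are assumptions, and a real pipeline would write rolling files to S3 rather than overwrite one local file.

```python
import pandas as pd  # writing Parquet also requires pyarrow or fastparquet

TARGET_ROWS = 100_000  # flush threshold; tune so output files land well above "small"
buffer: list[dict] = []

def ingest(batch: list[dict]) -> None:
    """Accumulate small batches instead of writing one tiny object per batch."""
    buffer.extend(batch)
    if len(buffer) >= TARGET_ROWS:
        flush()

def flush() -> None:
    if not buffer:
        return
    # One larger, columnar, compressed file instead of thousands of small ones.
    # A real job would roll file names and upload to S3 instead of local disk.
    pd.DataFrame(buffer).to_parquet("events_compacted.parquet", compression="snappy", index=False)
    buffer.clear()

ingest([{"user": "alice", "amount": 10.0}, {"user": "bob", "amount": 7.0}])
flush()  # force the final write for this toy example
```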

Cleaning up dirty data

Every data professional knows that ensuring data quality is vital to producing usable query results. Streaming data can be extra challenging in this regard, as it tends to be “dirty,” with new fields that are added without warning and frequent mistakes in the data collection process. In order to bridge the gap to analytics-ready data, developers have to be able to address data quality issues quickly.

The best architecture for that is called “event sourcing.” Implementing this requires a repository of all raw events, a schema on read, and an execution engine that transforms raw events into tables. Every time analytics data needs to be adjusted, the developer will run a processing job from the raw data repository (time travel/replay/reprocessing).
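Here is a minimal sketch of the replay idea: keep every raw event immutably, and rebuild the analytics table by re-running the current transformation over the full history whenever the logic or data-quality rules change. The event shape and cleanup rule below are illustrative assumptions.

```python
# Immutable repository of raw events, kept exactly as they arrived (schema on read).
raw_events = [
    {"order_id": 1, "net_total": "100", "sales_tax": "8"},
    {"order_id": 2, "net_total": "n/a", "sales_tax": "4"},   # dirty record
    {"order_id": 3, "net_total": "50",  "sales_tax": "4.5"},
]

def transform(event: dict):
    """Current version of the processing logic; returns None for unusable rows."""
    try:
        net, tax = float(event["net_total"]), float(event["sales_tax"])
    except ValueError:
        return None  # a later revision of the job learned to skip malformed totals
    return {"order_id": event["order_id"], "order_total": net + tax}

def replay(events: list) -> list:
    """Rebuild the whole analytics table from raw history after a logic or quality fix."""
    return [row for row in (transform(e) for e in events) if row is not None]

print(replay(raw_events))
# [{'order_id': 1, 'order_total': 108.0}, {'order_id': 3, 'order_total': 54.5}]
```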

Broader considerations

These are just a few of the technical considerations that teams need to grapple with when attempting to use real-time data. They also have to orchestrate huge volumes of jobs to handle rapidly changing data, work to ensure data consistency with exactly-once processing, and deal with concurrent requests from a variety of users all trying to get insights from the same data at the same time.


How Upsolver solves stream processing 

Sisense has partnered with Upsolver to help solve these technical challenges. Upsolver has spent countless hours eliminating the engineering complexity so that companies can solve these types of data challenges in real time.


Job orchestration, exactly-once data consistency, and file system management are heavy engineering problems. Solving them is complex work that often creates a data engineering bottleneck and slows the analytics process.

Upsolver encapsulates the streaming engineering complexity by empowering every technical user (data engineers, DBAs, analysts, scientists, developers) to ingest, discover, and prepare streaming data for analytics. These experts can define transformations from streams to tables and govern the processing progress using a visual, SQL-based interface. Engineering complexity is abstracted from the user via an execution engine that turns multiple data sources (stream and batch) into tables in various databases. 

Let’s dig into how it works!

Step 1: Connect to data sources — cloud storage, data streams, databases

Once inside the Upsolver user interface (UI), users simply click “Add a new data source” and choose one of the built-in connectors for cloud storage, databases, or streams. The UI can parse source data in formats including JSON, CSV, Avro, Parquet, and Protobuf.


A sample from the parsed data is displayed before ingestion starts:

[Screenshot: sample of the parsed data]

Schema on read and statistics per field are automatically detected and presented to the user: 

[Screenshot: detected schema and per-field statistics]

Step 2: Define stateful transformations 

Now that each data source has been set up, it’s time to define an output, Upsolver’s entity for processing jobs. Each output creates one table in a target system and populates it continuously with data based on the transformations the user has defined.


Transformations can be defined via the UI, SQL, or both (bidirectional sync between UI and SQL).

(Note: The SQL statement isn’t for querying data like in databases. In Upsolver, SQL is used to define continuous processing jobs so the SQL statement is executed once for every source event.)

Upsolver provides over 200 built-in functions and ANSI SQL support out of the box to simplify stream processing. These native features hide implementation complexities from users so their time isn’t wasted on customized coding. 

Upsolver also provides stateful transformations that allow the user to join the currently processed event with other data sources, run aggregation (with and without time windows), and deduplicate similar events. 
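To make those three operations concrete, here is a plain-Python sketch of deduplicating repeated events, joining each event against a reference dataset, and keeping a running aggregate. It mimics the idea rather than Upsolver’s engine; the event fields and lookup table are assumptions for illustration.

```python
reference_customers = {"c-1": "US", "c-2": "DE"}  # illustrative lookup data source

seen_ids = set()        # deduplication state
totals_by_country = {}  # aggregation state

def process(event: dict) -> None:
    # Deduplicate repeated deliveries (compare with Upsolver's IF_DUPLICATE idea).
    if event["event_id"] in seen_ids:
        return
    seen_ids.add(event["event_id"])

    # Join the streamed event against another data source.
    country = reference_customers.get(event["customer_id"], "unknown")

    # Aggregate continuously, one event at a time.
    totals_by_country[country] = totals_by_country.get(country, 0.0) + event["amount"]

stream = [
    {"event_id": "e1", "customer_id": "c-1", "amount": 20.0},
    {"event_id": "e1", "customer_id": "c-1", "amount": 20.0},  # duplicate delivery
    {"event_id": "e2", "customer_id": "c-2", "amount": 15.0},
]
for event in stream:
    process(event)

print(totals_by_country)  # {'US': 20.0, 'DE': 15.0}
```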

Adding transformations using the UI:

[Screenshot: adding transformations using the UI]

Editing transformations with SQL:

The output below calculates the order total by aggregating net total and sales tax.

[Screenshot: SQL output calculating the order total]

IF_DUPLICATE function for filtering similar events:

[Screenshot: the IF_DUPLICATE function]

Defining keys for tables in Amazon Redshift: 

This feature is necessary to support use cases like change data capture log streaming and enforcing GDPR/CCPA compliance in Amazon Redshift.

[Screenshot: defining table keys for Amazon Redshift]

Step 3: Execute jobs and optimize queries

Now that your output is ready, click on RUN to execute your job.

First, fill in the login information for Redshift:

[Screenshot: Redshift login information]

Second, choose the Upsolver cluster for running the output and the time frame to run at.

Upsolver clusters run on Amazon EC2 spot instances and scale out automatically based on compute utilization. 

[Screenshot: choosing the Upsolver cluster and time frame]

Note that during this entire process, the user didn’t need to define anything except data transformations: The processing job is automatically orchestrated, and exactly-once data consistency is guaranteed by the engine. The impact of implementing these best practices is faster queries that will power Redshift and dashboards in Sisense.

Step 4: Query

Log in to your Sisense environment with at least data designer privileges. We’ll be creating a live connection to the Redshift cluster that was set up in Step 3 and a simple dashboard. The first step is to navigate to the “Data” tab and create a new live model. You can give it whatever name you like.


Now, let’s connect to Redshift. Click the “+ Data” button in the upper right hand corner of the screen. Select Redshift from the list of available connections. Next, you’ll need to enter your Redshift location and credentials. The location is the URL/endpoint for the cluster. The credentials used should have read access to the data you are using. Make sure to check “Use SSL” and “Trust Server Certificate” (most clusters require these options by default).
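The same connection details the dialog asks for (endpoint, port, credentials, SSL) can be sanity-checked from a script before configuring the live model. Below is a minimal sketch using psycopg2 with placeholder host, database, and credentials (all assumptions for illustration):

```python
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

# Placeholder connection details; substitute your cluster's endpoint and credentials.
conn = psycopg2.connect(
    host="my-cluster.abc123xyz.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="readonly_user",
    password="********",
    sslmode="require",  # mirrors the "Use SSL" checkbox in the connection dialog
)

with conn, conn.cursor() as cur:
    # A read-only probe confirming the credentials can see the expected tables.
    cur.execute("SELECT table_schema, table_name FROM information_schema.tables LIMIT 5")
    for row in cur.fetchall():
        print(row)
conn.close()
```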

When ready, click “Next” and select the database you’ll be connecting to. Then click “Next” again.

Here you can see all of the tables you have access to, separated by schema. Navigate to the schema/tables you want to import, and select a few to start with. You can always add more later. When ready, click “Done” to add these tables to your model.


Once you have tables to work with, it’s time to connect them. In this case, the tables link on “orderId.” Simply drag one table onto the other and select the linking field (the key) on both sides. When you see the line appear between the tables, you can click “Done.” Finally, click “Publish” in the upper right hand corner, and you’re ready to create a dashboard!


Now navigate over to the “Analytics” tab. This is where we’ll create a new dashboard to view and explore the data. You’ll see a button to create a new dashboard. Click it and select the live model you just published. You can give the dashboard a different name, or it will default to the name of the model.


Now, it’s time to build the dashboard and explore your data. In the following animation, we create a few different visualizations based on fields from both tables. Notice that we are selecting fields to use as both dimensions and measures from both tables in the model. Sisense is automatically joining the tables for us, and Upsolver keeps the data in both tables synced with the stream.

[Animation: building dashboard visualizations from both tables]

Feel free to explore your data now. You can left-click on elements of the dashboard to place a filter and right-click to drill down. You can also interact with filters on the right hand filter panel. Again, you can filter on fields in both tables, and Sisense will determine the right queries for you.


Getting your streaming data to work for you

Streaming data analytics is important for businesses to make critical decisions in real time. To get there, it’s necessary to solve several engineering complexities that are introduced with streaming and aren’t addressed with the batch technology stack. In this article, we showed how Upsolver, AWS, and Sisense can be used together to deliver an end-to-end solution for streaming data analytics that is quick to set up, easy to operate without coding, and scales elastically using cloud computing/storage. 

Click here to find out more about Upsolver. Visit this page to become a Sisense Platform Partner, and here to learn more about our partnership with AWS.

Ori Rafael has a passion for taking technology and making it useful for people and organizations. Before founding Upsolver, Ori performed a variety of technology management roles at IDF’s elite technology intelligence unit, followed by corporate roles. Ori has a B.A. in Computer Science and an MBA.

Mei Long, PM at Upsolver, has held senior positions at many high-profile technology startups. Before Upsolver, she played an instrumental role on teams that contributed to the Apache Hadoop, Spark, Zeppelin, Kafka, and Kubernetes projects. Mei has a B.S. in Computer Engineering.

Tags: streaming analytics


Artificial Intelligence, Real Enhancements, Better Insights

August 6, 2020   Sisense

Sisense News is your home for corporate announcements, new Sisense features, product innovation, and everything we roll out to empower our users to get the most out of their data.

Ashley Kramer is our new chief product and marketing officer. She began her career as a software engineering manager at NASA and worked at Oracle, Amazon, and elsewhere before moving to Alteryx, where she served as senior vice president of product. She recently spoke with SearchBusinessAnalytics about her vision for the Sisense platform and what it’s like being one of the few women helping shape software development at a major business intelligence and analytics vendor.

SearchBusinessAnalytics: What led you to the decision to leave Alteryx and move to Sisense to head the development of its analytics platform?

Ashley Kramer: What was really interesting to me about Sisense was the approach they took from the beginning, which was to give agility to customers. That means being able to do analytics the way that they need — the Periscope acquisition means it can be code first or drag-and-drop — and being able to allow people to leverage analytics wherever they need, whether that’s embedded deeply within a product, whether that’s in an internal portal, or whether that’s a dashboard that you deploy.

In addition, I have a lot of cloud experience, and I understand what works and what doesn’t, and Sisense took a really unique approach here a year ago to say, “Let’s rewrite our products to be more of a Linux microservices-based architecture,” so that as companies scale in the cloud and as they leverage different complex data scenarios, we can grow with them. We can meet them where they are today, and we can grow with them as they go into the future, and that was truly unique to me and super exciting.

SBA: What is the dynamic between you as chief product officer and chief technology officer Guy Boyangu?

AK: He deeply understands the architecture, both where it was and where it’s come, and what we collaborate on is around the AI space. We have something called the Knowledge Graph that gathers all kinds of intelligence so we can give our customers smartness out of the box when we deliver our analytics, so what Guy and I collaborate on is on the under-the-hood side of where Sisense goes next.

I’m the product strategist and visionary working with the entire team, and Guy is able to take that and say, “OK, technology-wise, this can be our strategy.” It’s a really good partnership that we have together.

SBA: As you come to Sisense, what is your vision for where the analytics platform will be in two or three years?

AK: There are a lot of places we can go, and we need to focus those efforts so we can make sure we’re delivering what customers need. But I can really break this down into further investments in three different areas.

First, how will we continue to scale across clouds? If you think about [many] of the other vendors in this space, they’re mostly part of a cloud ecosystem — Looker and Google, Tableau and Salesforce, Power BI always being part of Microsoft — and we’re a cross-cloud platform, we can work with data analytics cross-cloud, so we want to continue innovating there and scaling to make sure we work seamlessly with all of the different services that all of the different clouds provide. The second is more intelligence. How do we continue to do deeper smartness within the platform, so automated recommendations and automated decisions, alerting, those types of things. And of course we have natural language querying so you type a sentence and get your results. There are many more things we can do to make analytics easier for everybody involved and get more people involved in analytics.

And that brings me to the last one, which is if you look at statistics on the different usage and adoption of analytics it tends to be pretty low in general within organizations, and I truly believe that’s for one reason — most BI platforms assume people will unnaturally leave their everyday workflows and processes and look at a dashboard and then come back. With our API-driven platform and approach, we can bring analytics to the salesperson that spends their entire day in whatever sales platform or CRM (customer relationship management) platform they use, and for someone like me that’s always on the go, send it to my cellphone. By continuing to innovate in that area, I think we can start to see adoption go up and really start seeing organizations make data-driven decisions, and then of course as an end result have better business outcomes.


SBA: What are some trends you see in BI and analytics that Sisense is responding to and are helping drive Sisense’s roadmap?

AK: The first trend is that we’re finally starting to see people migrate their data to the cloud. It’s been something people have talked about for years, but we’re really seeing it come to life. The second is adoption: different statistics say that only about a third of an organization’s people actually use analytics to make decisions. And the next one is a huge trend around AI and augmented analytics. How do we not just let people see what’s happening right now, build a dashboard and see that sales are down in the west, but actually start to help them predict what to do next and prescribe how to take action? A lot of the things we’re doing with our Knowledge Graph, and a lot of the features we’ve released and have planned on our roadmap, will start getting people beyond descriptive analytics and taking them to the next two steps: predictive and prescriptive analytics.
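As a rough illustration of the descriptive-to-predictive-to-prescriptive progression she describes (not Sisense’s implementation), the sketch below summarizes recent sales, fits a simple trend to forecast the next period, and turns the forecast into a suggested action. The numbers and thresholds are made up for the example.

```python
from statistics import mean

# Made-up weekly sales for one region; real data would come from the warehouse.
weekly_sales = [120, 115, 108, 101, 95, 90]

# Descriptive: what happened?
print(f"Average weekly sales: {mean(weekly_sales):.1f}")

# Predictive: a naive linear trend projected one week ahead.
n = len(weekly_sales)
xs = list(range(n))
x_bar, y_bar = mean(xs), mean(weekly_sales)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, weekly_sales)) / sum(
    (x - x_bar) ** 2 for x in xs
)
forecast = y_bar + slope * (n - x_bar)
print(f"Forecast for next week: {forecast:.1f}")

# Prescriptive: turn the forecast into a suggested next action (illustrative rule).
if slope < 0 and forecast < 0.9 * weekly_sales[0]:
    print("Suggested action: alert the regional sales lead and review pricing.")
else:
    print("Suggested action: no intervention needed this week.")
```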

SBA: How did your time at Oracle, Tableau, Alteryx, and other organizations prepare you for your new role at Sisense?

AK: My early career at Oracle and NASA was valuable because I was a programmer, so I understand technology pretty deeply, and I understand how a product team can sometimes dream up something that turns out to be a hard engineering challenge. That helped me understand how to be a better product leader. My time at Amazon taught me the crazy importance of the cloud; they were the early innovators in that area. Tableau was really interesting because it started bringing cloud analytics to market right at the cusp of when it was happening, so I got to learn a lot about the early market and then see how it evolved. And of course, when you’re one of the earliest to arrive you don’t get everything right, and that’s okay; you learn lessons and figure out how to do it better next time. And at Alteryx, it was about understanding what the more data-science side of the world needs.

As a product leader, it’s super important to understand what is possible for engineering and what is not. Understanding the cloud matters because we’re cross-cloud here at Sisense, and then there’s understanding where people want to go next with analytics: actually scaling it across the organization and taking that next step into predictive. So this really was the perfect role.

SBA: As chief product officer at Sisense you are now one of the few women helping shape the product development of a leading BI vendor. Did you face barriers on your path to Sisense?

AK: I was always interested in technology growing up — even in the early days, the Oregon Trail days. Fast-forward to high school, and I was in the very first computer science class, taught by my physical education teacher, which is crazy when you think about it. It doesn’t make any sense, but at least my school was thinking about offering computer science. I was a senior and the only woman in a class of 10 people. I was very fortunate to have people encourage me to continue on this path. Move on to college, and there were only three to five women in my graduating class. Since then, I think the world has done a better job with things like Girls Who Code and other programs that push the importance of STEM and bring girls and women into it. The world has changed and evolved, but it wasn’t really there when I was entering the professional stage of my career.

SBA: What have you found in your professional life since school?

AK: As far as the glass ceiling, I feel like I’ve had a lot of great mentors along the way. Some of them were women and some were men, and I think sometimes we undervalue what we can do for each other. Women can get together, and I can go to every women-in-leadership CPO dinner that exists, but we need men there too, supporting us. We need them to be part of the conversation. Something that’s super attractive to me about Sisense is that not only are women 50% of the core operational team reporting to the CEO, but [CEO] Amir Orad’s wife is also a CEO [of LimitScreen Inc.]. He definitely understands the value of women in these leadership roles. Everyone has an equal voice, and I think that’s really powerful.

It’s been interesting. I’ve seen a trend with my generation, and there are other women CPOs — there aren’t many, but even at my last company there was a woman chief strategy officer, and we were really big supporters of each other. Historically, there was a notion that there could only be one woman in the room, but that didn’t exist at any of the companies I’ve been at. I feel fortunate not to have lived in that world. The more women in the room at leadership levels, the better, so I don’t feel like I have to be the one to speak up or be the alpha woman, because that dynamic doesn’t exist anymore.

Editor’s note: This Q&A has been edited for clarity and conciseness.


Blog – Sisense

D365 Customer Insights for Asset Mgmt: More Opportunities

July 31, 2020   Microsoft Dynamics CRM

Microsoft Dynamics 365 Customer Insights for Asset Management: Better Data Insights Lead to More (and Better) Opportunities

For asset management firms, succeeding with only a basic understanding of clients, advisors, and consultants is a thing of the past. To be a successful asset manager, you need to know more than basic demographics so you can make knowledgeable, proactive decisions about opportunities. But how exactly do you do that? Thanks to Microsoft Dynamics 365 Customer Insights, you can combine the copious amounts of available data and put it to work so you don’t miss out on opportunities or clients.

Gather, Analyze and Gain Important Data Insights with Dynamics 365 Customer Insights

If you aren’t familiar with Dynamics 365 Customer Insights, we recommend reading our blog post from January 2020 that introduces Customer Insights and six reasons companies should leverage the most important features of this powerful tool.

Microsoft Dynamics 365 Customer Insights helps you gather and import all of your data, from any source, into one location, analyze that data with artificial intelligence (AI), and deliver insights far beyond what humans are capable of (even experienced humans with tribal knowledge). Companies can then take these valuable insights and apply them to sales, marketing, activities, and service initiatives to personalize engagement.

But, how exactly do asset management firms leverage Customer Insights?

Asset Management Firms Identify Big Tickets with Dynamics 365 Customer Insights

Thanks to Dynamics 365 Customer Insights, gone are the days of limited information about big tickets and of wholesalers relying solely on colleagues with extensive tribal knowledge.

To ensure all opportunities and vulnerabilities are being addressed, intermediaries need to target certain wholesalers. Big tickets are an important focus of asset management firms, and Customer Insights provides the tools needed to determine:

  1. The big ticket’s source, even down to the advisor
  2. Whether the big ticket is a SELL or a BUY
  3. Whether that specific trade has been influenced by the wholesaler

These factors are determined in various ways, including relationship evaluations, examining recent activities with advisors in a given office, and more. Once asset managers have this information, they can choose the next best action. In the case of a BUY, you would show gratitude for the business and go on to investigate potentially complementary strategies with other clients. A SELL, however, could indicate a change in strategy or a risk of losing wallet share.
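As a rough sketch of this kind of next-best-action logic (an illustration only, not how Customer Insights implements it), the snippet below classifies a big-ticket trade and suggests a follow-up. The field names and rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class BigTicket:
    advisor: str              # the advisor the trade traces back to (hypothetical field)
    side: str                 # "BUY" or "SELL"
    wholesaler_influenced: bool

def next_best_action(ticket: BigTicket) -> str:
    """Map a big-ticket trade to a suggested follow-up (illustrative rules only)."""
    if ticket.side == "BUY":
        action = f"Thank {ticket.advisor} and explore complementary strategies."
    else:  # a SELL may signal a strategy change or a risk of losing wallet share
        action = f"Reach out to {ticket.advisor} to understand the change in strategy."
    if not ticket.wholesaler_influenced:
        action += " Flag for wholesaler follow-up, since we did not influence this trade."
    return action

print(next_best_action(BigTicket("Jane Doe", "SELL", wholesaler_influenced=False)))
```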

This is where Dynamics 365 Customer Insights really shines: it merges all of your firm’s data from every source, even your current CRM system, links the data sources, and applies public information to enrich your data. Customer Insights does the work so you can drill down into the data, gain the insights you need, and make an informed decision on your next course of action. For advisors and offices, you can view the full span of rich data, including transactions over time, wallet share, and product types. This powerful tool can even pull in new data, such as life or public events, that could affect a given financial account or holding.
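For a sense of what linking and enriching data sources can look like in practice, here is a small pandas sketch that joins CRM contacts with trade records and attaches a public-event note by office. The column names and data are made up for illustration and do not reflect the Customer Insights data model.

```python
import pandas as pd

# Made-up data standing in for a CRM export, a trade feed, and public information.
crm = pd.DataFrame({
    "advisor_id": [1, 2, 3],
    "advisor_name": ["A. Rivera", "B. Chen", "C. Patel"],
    "office": ["NYC", "Boston", "NYC"],
})
trades = pd.DataFrame({
    "advisor_id": [1, 1, 3],
    "side": ["BUY", "SELL", "BUY"],
    "amount": [5_000_000, 2_000_000, 7_500_000],
})
public_events = pd.DataFrame({
    "office": ["NYC"],
    "event": ["Regional fund conference next month"],
})

# Link the sources on shared keys, then enrich with public information.
unified = trades.merge(crm, on="advisor_id", how="left")
unified = unified.merge(public_events, on="office", how="left")

# A simple drill-down: total traded amount per advisor, with any attached event.
summary = unified.groupby(["advisor_name", "event"], dropna=False)["amount"].sum()
print(summary)
```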

You can even keep your current CRM system and information reporting/display tools with Dynamics 365 Customer Insights. Talk about a win!

Don’t Miss Opportunities with Customer Insights

Thanks to Dynamics 365 Customer Insights, you don’t need to guess what to do with your data; you can easily identify who you need to speak with to prevent missed opportunities or lost clients. And losing someone with tribal knowledge is no longer something to fear.

Contact AKA today to learn more about Customer Insights for asset management firms. Ready to see this powerful tool in action? Get started with a Proof of Concept engagement.


ABOUT AKA ENTERPRISE SOLUTIONS
AKA specializes in making it easier to do business, simplifying processes and reducing risks. With agility, expertise, and original industry solutions, we embrace projects other technology firms avoid—regardless of their complexity. As a true strategic partner, we help organizations slay the dragons that are keeping them from innovating their way to greatness. Call us at 212-502-3900!


Article by: Tom Berger | 212-502-3900

With 20+ years of field experience, Tom Berger is Vice President of Financial Services for AKA Enterprise Solutions.

CRM Software Blog | Dynamics 365