Tag Archives: Intelligence

New eBook: Mainframe & Machine Learning for IT Service Intelligence (ITSI)

Our modern computing environments rely on many hardware components and several software layers to work together in unison. The failure of one element in this complex system could impact hundreds, thousands, or even millions of users.

Syncsort’s latest eBook reviews how an ITSI solution with machine learning capabilities can provide a comprehensive view of your organization’s service delivery, allowing you to effectively set SLAs, identify potential problems, and plan for changes in the IT environment.


Download the eBook now: Mainframe and Machine Learning for IT Service Intelligence


Syncsort + Trillium Software Blog

Expert Interview (Part 1): Splunk’s Andi Mann on IT Service Intelligence, ITOA and AIOps

For over 30 years across five continents, Andi Mann (@AndiMann) has built success with Fortune 500 corporations, vendors, governments, and as a leading research analyst and consultant. He currently serves as Splunk’s Chief Technology Advocate. He is an accomplished digital business executive with extensive global expertise as a strategist, technologist, innovator, and communicator. In the first of this two-part interview, he shares his thoughts on IT Service Intelligence (ITSI) and its role in IT Operational Analytics (ITOA) and Artificial Intelligence Operations (AIOps).

What is ITSI and how does it fit with ITOA and/or AIOps?

According to Gartner, IT Operational Analytics (ITOA) is a market for solutions that bring advanced analytical techniques to IT operations management use cases and data. ITOA solutions collect, store, analyze, and visualize IT operations data from other applications and IT operations management (ITOM) tools, enabling IT Ops teams to perform faster root cause analysis, triage, and problem resolution.

As it has become more sophisticated, Gartner has redefined ITOA as “AIOps,” initially calling it Algorithmic IT Ops, now morphing into “Artificial Intelligence Ops,” reflecting the increasing use of machine learning, predictive analytics, and artificial intelligence in these solutions.

Splunk IT Service Intelligence (ITSI) is a next-generation monitoring and analytics solution in the ITOA/AIOps space, built on top of Splunk Enterprise or Splunk Cloud. ITSI uses machine learning and event analytics to simplify operations, prioritize problem resolution, and align IT with the business.


Using metrics and performance indicators that are aligned with strategic goals and objectives, ITSI goes beyond reactive and ad hoc troubleshooting to proactively organize and correlate relevant metrics and events according to the business service they support. With ITSI, IT Ops can better understand and even predict KPI trends, identify and triage systemic issues, and speed up investigations and diagnosis.

This allows maturing IT organizations to quickly yet deeply understand the impact that service degradation has not only on the components in their service stack, but also on service levels and business capabilities – think more “web store” than “web server.”
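As a toy illustration of the kind of KPI trend detection a tool like ITSI automates at scale, consider comparing a service KPI’s recent moving average against its longer-run baseline. The sample latencies, window sizes, and threshold below are illustrative assumptions, not Splunk’s actual algorithm:

```python
# Toy KPI trend check: flag a service as degrading when the short-window
# average of a KPI (here, latency) runs well above its longer baseline.
# Data and thresholds are invented for illustration.

def moving_average(values, window):
    """Average of the last `window` samples."""
    return sum(values[-window:]) / window

def kpi_degrading(latency_ms, short=3, long=10, tolerance=1.2):
    """True when the recent average exceeds the baseline by more than 20%."""
    return moving_average(latency_ms, short) > tolerance * moving_average(latency_ms, long)

# A service whose latency suddenly climbs:
latency = [100, 102, 99, 101, 100, 98, 103, 140, 150, 160]
print(kpi_degrading(latency))  # True
```

A real implementation would learn the baseline and tolerance from historical data per service rather than hard-coding them.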

We are seeing a lot of investment by organizations in leveraging the value of Big Data. What do you see as the major drivers for this?

I see three main drivers for this new focus on Big Data.

Firstly, the increasing volume of data is creating a maintenance nightmare, but also an analytics dream. This new data – from online applications, mobile devices, cloud systems, social services, partner integrations, connected devices, and more – is full of insights, but cannot be managed with traditional tools. Big Data is often the only way to understand a modern business service at scale.

Secondly, speed and agility are emerging as market differentiators. Slow, old-school techniques like data warehousing, Extract-Transform-Load (ETL) operations, batch data processing, and scheduled reporting are not fast enough. New-style Big Data tools, by contrast, ingest data in real time, use machine learning and predictive analytics to generate meaning, instantly display sophisticated and customizable visualizations, and produce actionable insights from Big Data as it is produced.

Thirdly, there is an increasing focus on data-driven decisions to drive innovation. From junior IT admins to senior business execs, innovation requires all stakeholders make accurate decisions in real time. Big Data allows everyone to try new ideas, determine what works and what doesn’t, and then iterate quickly to course-correct from failures or double-down on successes, quickly adjusting to new information and meeting the changing demands of the market.

In Part 2, Andi Mann discusses the reasons mainframe and distributed IT are sharing data, and the use cases where organizations are building more effective digital capabilities with mainframe back ends.

Download Syncsort’s latest eBook, Ironstream in the Real-World for ITOA, ITSI, SIEM, to explore real world use cases where new technologies can provide answers to the questions challenging organizations.



Syncsort + Trillium Software Blog

How to Become an Intelligence Influencer


Throughout our three-part series, we’ve covered how data analytics is an ally to your marketing cause and how demystifying customer insights begins with a humanized approach. In a nutshell, organizations struggling with new technology is not a new problem, and not all hope is lost. There are quick ways to get your foot in the door using data intelligence and influencing process-driven innovation.

Context, context, context

When was the last time you saw a design fail because it was taken out of context? (See here and here for examples.) It’s easy to laugh off a contextual error that is purely aesthetic, but if a misinterpretation of data results in your sales teams engaging prospects with an ineffective sales pitch and the wrong product, coupled with an irrelevant marketing program, the impact can be far worse.

In a 2016 white paper, Forrester listed three prerequisites that drive predictive marketing success. The first is to refresh capabilities to boost insights-driven engagement; the second is to ensure sales and marketing alignment through lead processes; the last is to gather useful data to inform predictive algorithms.

“…apply the same lenses as you would an external customer to really understand how data will be accessed by them and more importantly, in what context.”

What these points summarize is that getting the customer view right requires many different pieces of a bigger picture. With shiny new toys like data automation and predictive analytics in place, we first need to ensure that the people using this data know what to do with it and how it is used; in short, we need to improve business intelligence, starting from within the organization. It is akin to applying an internal marketing funnel to bring perspective to a process.

There are many ways for businesses to improve their intelligence, and most already have some areas of the organization that have developed such skills and processes. The challenge is to combine a very broad view of context within a specific time frame—let’s say a market segment and the products and campaigns targeted toward it―with a multitude of sharply focused views of context, often in real time (such as the interactions and experience provided to a customer).

How can your business reconcile those broad and focused views to define the best decisions and actions? Your stakeholders, be they sales, marketing, or customer experience, need to apply the same lens as they would to an external customer, to comprehend how data is accessed and, more importantly, in what context. Then you need to combine all of their related capabilities so you can connect their intelligence.

People first, always

We have already identified sales and marketing as the most active users of data within an organization, but this is based on the functional needs of their roles. Similar to how marketers use personas to find look-alikes of our best customers, a lens can be applied to understand what a salesperson or a marketer needs, then apply that to a defined lead-generation workflow and map it to success indicators to identify what works and what doesn’t.

As Aberdeen Group pointed out, “strongly aligned marketing and sales teams are 53% more likely to ensure relevant value propositions aligned to buyers’ business challenges”. The idea of a strongly aligned team refers to having a set of shared objectives, priorities, strategic and tactical operational processes, goals, and resources.

Understanding what a salesperson versus a marketer needs, then putting them together into a shared program, not only accelerates the alignment process between these two functions but also helps the analytics team build dashboards efficiently. We can also apply a particular set of goals tied to these different roles, and further customize the data output according to the actual user needs at any given time.

For example, TIBCO Spotfire provides interactive visualizations derived from previously disparate data for marketers to review the effectiveness of digital engagement pathways via web content consumption trends, while sales may use another visualization with the same data sources to plot predictive trends of peaks and troughs of customer interests during a given period. Our customer is never single-dimensional, so our data outputs shouldn’t be either.

Linear or cyclical?

Once you have initiated an alignment between internal stakeholders and begin to design shared objectives within functions and cross-functionally, a process of continuous realignment and refinement needs to be set in place. How do you decide then if the process should be linear or cyclical?

One needs to only look at the shared priority of your stakeholders—the customer. While not all organizations have successfully applied measurements of Customer Lifetime Value (CLV), it is essential to a business focusing on becoming customer-centric. Why is CLV important? It tells us many things that can help maximize sales and marketing efforts, such as:

  • The potential worth or contribution your customer can bring
  • How to move your most desired (i.e. more valuable) customers through each stage of the buyer’s journey
  • Which prospects to focus on with your limited resources
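For readers who want to make CLV concrete, here is a minimal sketch using one common simplification (average purchase value × purchase frequency × expected lifespan). The formula variant and the sample figures are illustrative assumptions; real CLV models typically add margins, churn rates, and discounting:

```python
# A deliberately simple CLV estimate: average purchase value times purchase
# frequency times expected customer lifespan. Figures are invented.

def customer_lifetime_value(avg_purchase_value, purchases_per_year, retention_years):
    return avg_purchase_value * purchases_per_year * retention_years

# A customer spending $120 per order, 4 orders a year, retained for 5 years:
print(customer_lifetime_value(120, 4, 5))  # 2400
```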

This is where your data-driven intelligence comes into play. Using dashboards set up with combined data sources, mapped to your operational perspectives acquired with your “People First” approach, you are now ready to layer this information against a broader framework of calculating your customer lifetime value.

The business can then learn to identify threats or opportunities throughout a customer’s mapped lifetime while refining its predictions of outcomes of the context, and finally, make recommendations on the best action for the business to take. It is also important to note that when you have billions of people, systems, and devices interacting simultaneously, data is arriving in real time from hundreds and thousands of resources—it has a shelf life; its value diminishing over time, so you need to be able to decide and act instantly too.

TIBCO understands this and provides event-driven solutions that help businesses augment traditional business intelligence processes to capture, aggregate, and analyze data at-rest and in-motion, and help businesses spark insight for contextual awareness that leads to preemptive, actionable insights. This real-time intelligence becomes part of your competitive strategy, providing a continuous feedback loop that continually enriches your knowledge, capabilities, and responses.

And because a customer’s behavior isn’t always linear, your business intelligence shouldn’t be either.

This article is the final piece of a three-part series looking at how visual, predictive, and streaming analytics technologies can apply to modern marketing.


The TIBCO Blog

How to Calculate Total Cost of Ownership for Business Intelligence

Imagine you’re comparing gym memberships to figure out which one offers the best value. Sure, you could simply look at the monthly fee and go for the cheapest, but that wouldn’t tell you everything you need to know about the total cost of ownership.

For starters, you’d want to know what the cost includes. Does it offer all the machines and classes? Do you have to rent/buy extra equipment? Then there are the less obvious considerations. Do you need to pay a trainer to get true value? What’s the price of travel? Is there enough capacity to cope with the crowds, even during peak hours?

Loosely speaking, savvy businesses should take the same approach to price comparisons when weighing up different tech solutions as they would to buying a new gym membership – especially with a solution as powerful and intricate as Business Intelligence.

Business Intelligence Pricing – There’s a Catch

There are many things to consider when pricing out the total cost of ownership of BI. To really get a feel for the cost of implementing a BI solution, start by making sure that the platform in question does everything you need and has enough capacity for all of your data – or if not, how much you’ll need to spend on additional technical infrastructure, tools, or the necessary consulting / IT expertise manpower to tailor a solution version that does work for you.

Try to estimate how much you’ll need to commit in terms of internal budget and resources, whether you’ll need to pay to take on new staff, and the opportunity costs of taking existing personnel off revenue-generating projects to ensure smooth deployment and daily use.

Then, once you’ve tallied up all the hidden costs of rolling out and operating a workable solution, choose the option that offers the best value for the price tag.

Sounds sensible, right? Well, yes – in 99% of cases, this formula works just fine.

But BI is different. To work out the real cost of using your BI platform, you have to take a final, vital step: calculate the value that a BI solution gives you – its cost of new analytics.


Considering the Cost of New Analytics

Let’s look at the gym membership example again. Imagine that you spot in the small print that one of the gyms is only open on weekends, whereas the other one is open every day.

Until this point, you’d thought Gym A offered the better deal. You’d calculated the total cost of ownership at $820 per year, while Gym B worked out at $1200 per year.

But if you can only visit Gym A a maximum of twice a week, even if you take every available opportunity to go, you’re still paying a significant amount of money per session. The gym is only open 104 days of the year, so the absolute minimum you pay per workout will be:

$820 / 104 ≈ $7.88

Gym B, on the other hand, might be more expensive, but it’s open seven days a week. In fact, it’s only closed on two days out of the whole year. If you took advantage of this and went there on every possible day, the minimum you’d pay per workout would be:

$1200 / 363 ≈ $3.31

Suddenly, Gym B looks like a much better option, right?
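The per-workout comparison above can be expressed as a one-line calculation; the prices and open days come straight from the example in the text (Gym A is open 104 days a year, Gym B 363):

```python
# The gym comparison, computed directly: annual cost divided by the
# number of times you could actually use the membership.

def cost_per_use(annual_cost, available_uses):
    return annual_cost / available_uses

print(round(cost_per_use(820, 104), 2))   # 7.88 -> Gym A, per workout
print(round(cost_per_use(1200, 363), 2))  # 3.31 -> Gym B, per workout
```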

This is precisely how you need to approach your value assessment of a BI platform, too.

That’s because BI platforms vary wildly in the time it takes you to submit a new data query, generate results and present them in a format that makes sense – for example, an easy-to-process dashboard showing progress on your KPIs.

On first look, it might seem that the annual total cost of ownership of one product is much higher than another. Once you factor in the turnaround time for a data analysis project, though, and divide your number by the maximum amount of data projects you can process in a year, this could quickly start to look very different indeed.

That’s because BI tools aren’t best measured by total cost of ownership per annum, but by the cost of running each individual analysis.

How to Calculate the Cost of New Analytics

In short, it’s putting a concrete number on the actual value you and your team are going to be getting from a BI solution.

Since we have already established that upfront cost is just one aspect of a bigger equation, businesses are now measuring the total cost of ownership of a BI solution in a newer, more accurate way that incorporates the full value potential of BI (how much you and your team will benefit from it): by calculating the cost of new analytics.

Ask yourself: what is the cost of a new analytics report for my team? The cost of new analytics captures how quickly your team can churn out (and benefit from) new analytics and reports – in other words, how much value you are getting from your BI tool for how much investment.
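One way to sketch the idea: divide annual total cost of ownership by the maximum number of analyses a team can turn around in a year. The turnaround times, working days, and dollar figures below are illustrative assumptions, not a vendor’s formula:

```python
# Cost-of-new-analytics sketch: annual TCO divided by the maximum number
# of analyses a team can complete in a year, given per-analysis turnaround.
# All inputs here are invented for illustration.

def cost_per_analysis(annual_tco, turnaround_days, working_days=250):
    max_analyses = working_days / turnaround_days
    return annual_tco / max_analyses

# Tool A: cheaper license, but each new report takes 5 working days.
# Tool B: pricier, but a report turns around in half a day.
print(cost_per_analysis(50_000, 5))     # 1000.0 per analysis
print(cost_per_analysis(120_000, 0.5))  # 240.0 per analysis
```

As with the gym example, the nominally more expensive tool comes out far cheaper per unit of value delivered.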

A Formula for Calculating BI’s Total Cost of Ownership

By incorporating the notion of speed, you can quantify how agile a BI tool is – that is, how quickly it lets you move from question to answer.

Get our guide on calculating the total cost of ownership of a BI tool to see an exact formula for quantifying the cost of new analytics, taking all costs – from technical infrastructure to manpower – into account before you buy a business intelligence solution.



Blog – Sisense

Expert Interview (Part 2) with Databricks’ Damji: Spark + Hadoop = Artificial Intelligence = Science Fiction Becoming Reality

At this year’s Strata + Hadoop World, Syncsort’s Paige Roberts caught up with Jules Damji (@2twitme), the Spark Community Evangelist for Databricks and had a long conversation. In this second post of our four-part interview, they discuss the trend of the Spark and Hadoop technologies and communities merging over time, and how that’s creating a science fiction novel kind of world, where artificial intelligence is becoming commonplace.


Paige Roberts: One thing I’ve noticed over the last few years is that, to a certain extent, the Spark and Hadoop communities seem to be merging. We just had a Hadoop-focused conference, and yet half the sessions were about Spark. Why do you think that is?

Jules Damji: Apache Spark is such an integral part of Big Data because it allows people to deal with and process large scale data in a very quick manner. It allows people to run different workloads on a single, unified engine. That’s one of the main attractions.

If you look at the history of Big Data, you had all these different systems and you had to stitch them together to do your end-to-end job pipeline. It was difficult. You had to learn five different systems.

Another reason people are rallying around Apache Spark is that it works very well with the Hadoop ecosystem. You can store your data in HDFS or S3 or whatever. The API works well with the storage level. It works well with the applications. Apache Spark talks to BI tools, to Sqoop, to all these third-party data ingestion tools. And it can be deployed in different environments as well. You can have it running on YARN, on its own cluster, or on Mesos.

These dimensions of Apache Spark’s flexibility make it an integral part of Hadoop or Big Data in general. Today there’s not a single conversation that’s happening in the world where Big Data and Apache Spark are not mentioned in the same sentence.


Roberts: Right. I see that, too.

Damji: We are in the Big Data era. We see data coming in fast, and we need real-time, end-to-end solutions. If I get data, I should be able to make a decision fast. I should be able to consult my machine learning model in split-second time, or to interact with my stored data. One of the things that Apache Spark provides through Structured Streaming is the ability to write a continuous application.

Today, you heard Reynold Xin speak about the ability to write fault-tolerant applications that give you the ability to interact with streaming data and query it as if you were querying your old, stationary data. It gives you the ability to do ad hoc analysis on the fly. Before, it took you a long time to do this after you finished getting the data. Now, you can do it instantly. That’s one thing.

The other thing I see is that Artificial Intelligence (AI) has come to the fore, and Spark is going to play a big role in democratizing Big Data and AI.


Yeah, you’re seeing artificial intelligence now. On things like self-driving cars and such.

Yes, exactly: self-driving cars, image and voice recognition, recommendation engines, and so much more. At the center of that is the ability to do advanced analytics quickly. The ability to employ popular frameworks like TensorFlow with Apache Spark, to do machine learning at scale using Apache Spark’s library, to build deep neural networks, and to do computational analysis quickly. That enters us into this new era of Artificial Intelligence. We now have some of these AI systems, which used to be science fiction. Now, they are taking realistic form.


The science fiction novels that I read as a kid are now old hat. Yeah, we did that last year.

You will see Apache Spark playing an ever more integral role in this Big Data and Artificial Intelligence era, what I call the Zeitgeist of Big Data. At the core is the ability to process a lot of data fast, to manage large clusters seamlessly, to transform data at immense speeds, and to process myriad kinds of data, such as text, video, unstructured, and structured data. It can all be done through a single processing engine such as Spark.

Streaming, batch, …

We’re streaming, we’re doing batch. Before, all these different systems had different formats of data, and different engines.

Yeah, different engines, and different APIs…

Right. But now you have a unified API. You have workloads that run on the same engine, which makes things a little easier. It’s the stepping stone to this powerful digital revolution. No previous industrial revolution had so many fast-moving technology trends and innovations as this digital revolution. Just look at what we have gone through with Apache Spark in only a few years.


Yeah, it’s amazing!

And 10 years from now you might have something else which might be different from Spark, but for the next five, we will see Apache Spark growing. We’ll see more and more intelligent applications built on top of the machine learning techniques that Apache Spark facilitates and catalyzes. And we’ll see huge performance improvements.

Like Project Tungsten?

Tungsten is the second generation: it’s how you can get 10X to 40X the performance. The need is there. The need is not new. What’s new is that you have data coming in at enormous velocity. So, you need the capacity to process it instantly. In order to do that, you need very performant distributed systems. And I think you and I are both living in the heart of this data Zeitgeist.

This revolution has been a lot like being in the center of a tornado. Everything around you changes so quickly. So, how do you like the Strata conference?

Oh, this has been wonderful. Like you said, this is a Big Data Hadoop conference and to see how many Apache Spark talks were there was amazing. A testament that Spark is an integral part of Big Data.

It certainly is, yes.

It is. Spark Summit is growing fast, too. Big Data and Apache Spark: that’s become a very symbiotic relationship. It’s very complementary. You can’t really talk about Big Data and not talk about Apache Spark.

Be sure to read part one of this conversation, on the importance of the Apache Spark community, and don’t miss the next part! We’ll talk about security, and the big move of Big Data processing to the cloud.

For more talk about the future of Big Data, including more on Spark and Hadoop, read our eBook, Bringing Big Data to Life: What the Experts Say.


Syncsort + Trillium Software Blog

Excellent Customer Service Requires Emotional Intelligence

Staff engagement is a key component in a telco’s ability to positively impact customers, based on new data from InMoment.


The company recently conducted a survey of 11,000 North American customers of Internet, mobile and TV services and found that telecommunications companies’ customers just plain hate them.

Telcos are turning to self-service, automation and artificial intelligence applications like chatbots, but these tools are appropriate only in some scenarios, InMoment suggested.

In many parts of the telco customer journey, positive human interactions are essential and contribute to a good user experience, the company maintained.

Telcos should emphasize emotional IQ, or EQ, in hiring and training personnel with high emotional intelligence for all client-facing roles, the study recommends.

This holds true especially for physical locations where elevated support issues likely will be handled.

Ideal Support Staff Traits

“A huge indicator of whether support staff have what it takes for successful customer interactions is whether they really listen to and understand the issue,” said Andrew Park, director of CX strategy at InMoment.

Support staff “must be creative problem solvers, looking at each opportunity as an interesting puzzle to solve rather than a problem to get through,” he told CRM Buyer. They “should also be comfortable making smart, timely, yet empathetic decisions.”

Telcos should pay greater attention to emotional trends and related areas of weakness, the study suggests.

“We recommend [telcos] hire for connection, train for skills,” Park said. “It’s much easier to train personnel on facts than [instill] emotional and personality traits that may not come naturally to everyone.”

Telco Customers’ Disgust

Customer satisfaction fell at the one-year mark, no matter which service was used, as did the likelihood of a customer recommending the service provider to others, according to the survey.

Satisfaction over support staff knowledge, ability, efficiency, friendliness and helpfulness fell significantly at the one-year anniversary mark, and fluctuated throughout the term of a customer’s contract with the provider.

For many lines of service, the metrics showed no recovery of satisfaction.

Bill payment frustrations also peaked at the one-year anniversary mark. Both the ease of understanding a bill and the ease of paying it posed problems.

Customers who had switched providers in the previous year rated key areas of the user experience lower than those who had not switched, but they still were more likely to recommend their second provider.

Most major telcos have proactive outreach programs in the first year of their relationship with customers, but the outreach diminishes over time, the study found.

Many promotional offers expire at the one-year mark and a customer’s bill after that may differ from expectations. Customers were less willing to tolerate frustrations after the promotional period.

How Telcos Can Resolve the Problem

Telcos can invest in new technologies, such as remote diagnostic tools with video capabilities, the study suggests. That said, customer resolution strategies should include the option of human intervention with automated self-service solutions.

“Automation works well for simple and known issues,” InMoment’s Park pointed out. “Humans always perform better in more complex and emotionally charged situations.”

Sophisticated automation technology, such as AI chatbots, can understand what customers are saying in real time and trigger relevant follow-up questions to better understand what the next best action might be, Park observed.

“For instance, if an AI chatbot detects a customer’s experiencing escalating frustration based on their word choice and sentiment, it can ask whether the customer would like to be immediately routed to a human for more personalized care,” Park said.
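The escalation behavior Park describes can be sketched as a simple rule: score the customer’s wording for frustration and offer a human handoff when it crosses a threshold. The keyword list and threshold below are invented for illustration; a production chatbot would use trained sentiment models rather than keyword matching:

```python
# Rule-based sketch of frustration-triggered escalation. Count frustration
# keywords in a message and route to a human past a threshold.
# Keywords and threshold are illustrative assumptions.

FRUSTRATION_WORDS = {"ridiculous", "useless", "cancel", "angry", "worst"}

def frustration_score(message):
    return sum(1 for w in message.lower().split()
               if w.strip(".,!?") in FRUSTRATION_WORDS)

def route(message, threshold=2):
    """Return 'human' when frustration crosses the threshold, else 'bot'."""
    return "human" if frustration_score(message) >= threshold else "bot"

print(route("This is ridiculous, I want to cancel my worst plan ever"))  # human
print(route("How do I upgrade my plan?"))                                # bot
```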

Service reps who work well with people “are hard to find and command a higher wage than traditional service reps, who are often hired because they’re cheap,” observed Michael Jude, a research manager at Stratecast/Frost & Sullivan.

“Rather than treating customer service as an unavoidable overhead, operators should view it as the point of it all,” he told CRM Buyer. “A good customer service rep can upsell and reduce churn simply by being facilitative.”


Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology.


CRM Buyer

How Collective Intelligence Can Empower Workers And Develop Leaders

“Innovation distinguishes between a leader and a follower.” – Steve Jobs

As a part of the last wave of Millennials joining the workforce, I have been inspired by Jobs’ definition of innovation. For years, Millennials like me have been told that we need to be faster, better, and smarter than our peers. With this thought in mind and the endless possibilities of the Internet, it’s easy to see that the digital economy is here, and it is defining my generation.

Lately we’ve all read articles proclaiming that “the digital economy and the economy are becoming one and the same. The lines are being blurred.” While this may be true, Millennials do not see this distinction. To us, it’s just the economy. Everything we do happens in the abstract digital economy – we shop digitally, get our news digitally, communicate digitally, and we take pictures digitally. In fact, the things that we don’t do digitally are few and far between.

Millennial disruption: How to get our attention in the digital economy

In this fast-moving, highly technical era, innovation and technology are ubiquitous, forcing companies to deliver immediate value to consumers. This principle is ingrained in us – it’s stark reality. One day, a brand is a world leader, promising incredible change. Then just a few weeks later, it disappears. Millennials view leaders of the emerging (digital) economy as scrappy, agile, and comfortable making decisions that disrupt the norm, and that may or may not pan out.

What does it take to earn the attention of Millennials? Here are three things you should consider:

1. Millennials appreciate innovations that reinvent product delivery and service to make life better and simpler.

Uber, Vimeo, ASOS, and Apple are some of the most successful disruptors in the current digital economy. Why? They took an already mature market and used technology to make valuable connections with their Millennial customers. These companies did not invent a new product – they reinvented the way business is done within the economy. They knew what their consumers wanted before the consumers realized it themselves.

Millennials gravitate toward these companies. In fact, we seek them out and expect them to create rapid, digital changes to our daily lives. We want to use the products they developed. We adapt quickly to the changes powered by their new ideas or technologies. Given all that, it’s not astonishing that Millennials feel the need to connect regularly and digitally.

2. It’s not technology that captures us – it’s the simplicity that technology enables.

Recently, McKinsey & Company revealed that “CEOs expect 15%–50% of their companies’ future earnings to come from disruptive technology.” Considering this statistic, it may come as a surprise to these executives that buzzwords – including cloud, diversity, innovation, the Internet of Things, and future of work – do not resonate with us. Sure, we were raised on these terms, but they are such a part of our culture that we do not think about them. We expect companies to deeply embed this technology now.

What we really crave is technology-enabled simplicity in every aspect of our lives. If something is too complicated to navigate, most of us stop using the product. And why not? It does not add value if we cannot use it immediately.

Many experts claim that this is unique to Millennials, but it truly isn’t. It might just be more obvious and prevalent with us. Some might translate our never-ending desire for simplicity into laziness. Yet striving to make daily activities simpler with the use of technology has been seen throughout history. Millennials just happen to be the first generation to be completely reliant on technology, simplicity, and digitally powered “personal” connections.

3. Millennials keep an eye on where and how the next technology revolution will begin.

Within the next few years Millennials will be the largest generation in the workforce. As a result, the onslaught of coverage on the evolution of technology will most likely be phased out. While the history of technology is significant for our predecessors, this is not an overly important story for Millennials because we have not seen the technology evolution ourselves. For us, the digital revolution is a fact of life.

Companies like SAP, Amazon, and Apple did not invent the wheel. Rather, they were able to create a new digital future. For a company to be successful, senior leaders must demonstrate a talent for R&D genius as well as fortune-telling. They need to develop easy-to-use, brilliantly designed products, market them effectively to the masses, and maintain their products’ elite status. It’s not easy, but the companies that upend an entire industry are successfully balancing these tasks.

Disruption can happen anywhere and at any time. Get ready!

Across every industry, big players are threatened — not only by well-known competitors, but by small teams sitting in a garage drafting new ideas that could turn the market upside down. In reality, anyone, anywhere, at any time can cause disruption and bring an idea to life.

Take my employer SAP, for example. With the creation of SAP S/4HANA, we are disrupting the tech market as we help our customers engage in digital transformation. By removing data warehousing and enabling real-time operations, companies are reimagining their future. Organizations such as La Trobe University, the NFL, and Adidas have made it easy to understand and conceptualize the effects of using data in real time. But only time will tell whether Millennials will ever realize how much disruption was needed to get where we are today.

Find out how SAP Services & Support can help you minimize the impact of disruption and maximize the success of your business. Read SAP S/4HANA customer success stories, visit the SAP Services HUB, or visit the customer testimonial page on SAP.com.



Digitalist Magazine

TIBCO Energy Forum to Showcase Augmented Intelligence for Energy Sector Optimization


Have you heard?! Our 15th annual TIBCO Energy Forum will take place September 6-7 at The Westin Houston, Memorial City. The conference gathers over 650 leading energy industry experts to discuss why digital transformation using TIBCO’s Connected Intelligence solutions drives value across the energy sector and how to navigate market changes. TIBCO customers and partners will also share best practices and use cases for visual, predictive, and streaming analytics in oil and gas, smart grids, and utilities.

“High crude supply and lower margins are driving innovative energy companies to improve profitability by employing TIBCO’s advanced Connected Intelligence analytics solutions. From upstream exploration and production surveillance, to midstream supply, downstream refining, smart power grids, and utilities, our solutions interconnect data and augment intelligence to surface smart analytics and optimize business operations,” said Thomas Been, chief marketing officer, TIBCO. “We’re pleased to announce this year’s TIBCO Energy Forum and look forward to dynamic conversations about the role of Connected Intelligence in achieving stronger ROI and operational efficiency.”

TIBCO Energy Forum attendees will gain unique access to energy sector use cases for analytics solutions during in-depth educational sessions and hear first-hand from leading industry experts in targeted breakout sessions. Conference attendees can expect to:

  • Discover how oil and gas companies are using analytics to optimize drilling, completions, and production;
  • Find out how peers are using TIBCO Spotfire, TIBCO StreamBase, and TIBCO OpenSpirit to drive business value in the energy sector;
  • Learn about applications in midstream supply, trading, downstream operations, smart grids, and utilities, where analysis of historical and real-time data feeds is used to optimize business operations; and
  • See the latest TIBCO product updates, configurations, and community activities.

Speakers include TIBCO executives—Chief Executive Officer Murray Rode, Chief Technology Officer & Executive Vice President Matt Quinn, and Chief Analytics Officer Michael O’Connell. TIBCO customers, including Anadarko, NRG Energy, LINN Energy, and ConocoPhillips, will also present at the conference. Members of TIBCO’s extensive partner network will participate as well, including Bahwan Cybertek, Ruths.ai, Quintus, Blue River Analytics, Mathworks, and many more.

To register for this event and learn more about the agenda, visit: https://energyforum.tibco.com/. Registration closes September 1, 2017 at 11:59 PM PT.

Follow #EnergyForum17 on Twitter for live updates from the show.


The TIBCO Blog

What Artificial Intelligence and Machine Learning can do – and what not

I have written on Artificial Intelligence (AI) before.  Back then I focused on the technology side of it: what is part of an AI system and what isn’t.  But there is another question which might be even more important.  What are we DOING with AI?

Part of my job is to help investors with their due diligence.  I discuss companies with them in which they might want to invest. Here is a quick observation:  By now, every company pitch is full of claims about how they are using AI to solve a given business problem.

Part of me loves this since some of those companies are onto something and should get the chance.  But I also have a built-in “bullshit-meter”.  So, another part of me wants to cringe every time I listen to a founder making stuff up about how AI will help them.  I listened to many founders who do not know a lot about AI, but they sense that they can get millions of dollars of funding.  Just by adding those fluffy keywords to their pitch.  The bad news is that sooner or later this actually works.  Who am I to blame them?

I have seen situations where AI or at least machine learning (ML) has an incredible impact.  But I also have seen situations where this is not the case.  What was the difference?

In most of the cases where organizations fail with AI or ML, they used those techniques in the wrong context.  ML models are not very helpful if you have only one big decision you need to make.  Analytics still can help you in such cases by giving you easier access to the data you need to make this decision.  Or by presenting this data in a consumable fashion.  But at the end of the day, those single big decisions are often very strategic.  Building a machine learning model or an AI to help you make this decision is rarely worth the effort.  And often it also does not yield better results than just making the decision on your own.

Here is where ML and AI can help. Machine Learning and Artificial Intelligence deliver the most value whenever you need to make lots of similar decisions quickly. Good examples of this are:

  • Defining the price of a product in markets with rapidly changing demands,
  • Making offers for cross-selling in an E-Commerce platform,
  • Approving a credit or not,
  • Detecting customers with a high risk for churn,
  • Stopping fraudulent transactions,
  • …among others.

You can see that a human being who would have access to all relevant data could make those decisions in a matter of seconds or minutes.  Only that they can’t without AI or ML, since they would need to make this type of decision millions of times, every day.  Like sifting through your customer base of 50 million clients every day to identify those with a high churn risk.  Impossible for any human being.  But no problem at all for an ML model.

So, the biggest value of artificial intelligence and machine learning is not to support us with those big strategic decisions.  Machine learning delivers most value when we operationalize models and automate millions of decisions.
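The churn example above can be sketched in a few lines of code. This is a minimal illustration, not from the article: the data is synthetic, the feature names are made up, and scikit-learn with a simple logistic regression stands in for whatever model a real team would use. The point it shows is the operationalization pattern the article describes: train once, then score the entire customer base in one vectorized call instead of asking a human to make the same judgment millions of times.

```python
# Illustrative sketch: one trained model automating millions of similar
# decisions (churn flagging). All data and thresholds are fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic historical customers: e.g. tenure, support tickets, monthly spend
X_train = rng.normal(size=(1_000, 3))
# Fabricated rule standing in for observed churn labels
y_train = (X_train[:, 1] - X_train[:, 0]
           + rng.normal(scale=0.5, size=1_000) > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)

# "Sift through the whole customer base": one batch call scores every row,
# whether it is 100 thousand customers or 50 million
X_all = rng.normal(size=(100_000, 3))
churn_risk = model.predict_proba(X_all)[:, 1]

# Automate the decision itself: flag the high-risk segment for outreach
high_risk = churn_risk > 0.8
print(f"Flagged {high_risk.sum()} of {len(X_all)} customers for retention offers")
```

Each individual decision here is trivial for a person with the data in front of them; the value comes entirely from running it at a scale and cadence no person could sustain.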

The image below shows this spectrum of decisions and the times humans need to make those.  The blue boxes are situations where analytics can help, but it is not providing its full value. The orange boxes are situations where AI and ML show real value. And the interesting observation is: the more decisions you can automate, the higher this value will be (upper right end of this spectrum).

[Image: the spectrum of decisions, from one-off strategic calls where analytics merely assists, to millions of automated decisions where AI and ML deliver their full value]

One of the shortest descriptions of this phenomenon comes from Andrew Ng, who is a well-known researcher in the field of AI.  Andrew described what AI can do as follows:

“If a typical person can do a mental task with less than one second of thought, we can probably automate it using AI either now or in the near future.”

I agree with him on this characterization. And I like that he puts the emphasis on automation and operationalization of those models – because this is where the biggest value is. The only thing I disagree with is the time unit he chose. It is already safe to go with a minute instead of a second.


RapidMiner

SugarCRM Launches Hint, 1st in Relationship Intelligence Line

SugarCRM on Tuesday launched Hint, the first offering in its new line of relationship intelligence products.


Hint users can enter a few contact details about a person with whom they want to connect, and Hint automatically will search social sources on the Web for personal and company information.

That data, and other personal and corporate details from a company’s internal files, will be collated and served up to the user in a side-panel view.

“Hint is the building block for future relationship intelligence product offerings, which will go deeper into AI,” said SugarCRM spokesperson Andrew Staples.

Artificial intelligence requires large data sets, and “as Hint evolves, it will allow us to get the data needed and then layer predictive analytics and machine learning on top of that data,” Staples told CRM Buyer.

What Hint Users Get

For sales teams, Hint’s advantage over other CRM vendors’ products is that it lets users select specific data and quickly pull it into the CRM account profile in the context of a particular opportunity they’re working on, Nucleus Research noted.

Further, users can quickly create a new customer profile, and information from the Web can be refreshed automatically, keeping records up to date.

Hint is priced at US$15 a month per user. It is compatible with Sugar 7.8 or later.

Hint is SugarCRM’s first purely cloud-based Software as a Service offering, and it could be marketed as a standalone cloud service in the future, Nucleus suggested.

Taking On the Competition

SugarCRM is listed at $70 a user a month, so the price with Hint goes up to $85 a user a month. Microsoft Dynamics 365 with LinkedIn Sales Navigator costs $135 per user per month, Nucleus pointed out.

That said, Microsoft “does provide some built-in coaching, relationship health graphs, and in-mail capabilities that SugarCRM Hint doesn’t,” Rebecca Wettemann, VP of research at Nucleus Research, told CRM Buyer.

Competition for Hint comes from companies like Hoovers, InsideView, DiscoverOrg and Salesforce, remarked Cindy Zhou, a principal analyst at Constellation Research.

“The advantage for SugarCRM is that Hint’s a native product for their CRM solution, which facilitates a faster and automatic data append process,” Zhou told CRM Buyer. “The other solutions require integration with CRM to append the data, and require costly subscription fees.”

Possible Issues

SugarCRM customers will need to understand the depth and breadth of the companies and contacts that Hint can append, said Zhou.

They may need to turn to third-party data companies for specialized contacts, she noted. “InsideView and DiscoverOrg are strong in IT contacts, for example.”

Where Relationship Intelligence Is Going

“Relationship intelligence” is SugarCRM’s new name for the Sugar Intelligence Service, which the company unveiled at SugarCON 2016 last June.

At that conference, SugarCRM demonstrated Candace, an AI-powered intelligent agent. Candace is still in development and will be unveiled later, Staples said.

SugarCRM plans to partner with AI offerings from Amazon, Google and other companies rather than build its own AI products. It already integrates with IBM Watson.

“Our customers, and the industry at large, aren’t asking us to build AI,” Staples explained — “they’re asking us to give them the tools to build better business relationships.”

Sugar plans “to innovate on Hint and relationship intelligence quickly,” he added, indicating “there’s a good chance” the company will talk much more about Hint at SugarCON 2017, to be held in late September.

SugarCRM’s vision for relationship intelligence is that it will “guide and assist users in interactions with customers, helping them to plan meetings, build deeper connections, recommend best actions, and respond to late-breaking developments as relationships evolve,” Staples remarked, “at any time on any device.”




CRM Buyer