Category Archives: Predictive Analytics

Top Ten Digitalist Magazine Posts Of The Week [July 24, 2017]


Like the classic parable of the blind men and the elephant, it seems everyone has a unique take on digital transformation. Some equate digital transformation with emerging technologies, placing their bets on the Internet of Things, machine learning, and artificial intelligence. Others see it as a way to increase efficiency and change business processes to bring products to market faster. Still others think of it as a means of strategic differentiation, innovating new business models for serving and engaging their customers. Despite the range of viewpoints, many businesses are still struggling to evolve digitally in ways that are pragmatic, meaningful, industry-disruptive, and market-leading.

According to a recent study of more than 3,000 senior executives across 17 countries and regions, only a paltry three percent of businesses worldwide have successfully completed enterprise-wide digital transformation initiatives, even though 84% of C-level executives rank such efforts as “critically important” to the survival of their business.

The most comprehensive global study of its kind, the SAP Center for Business Insight report “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart,” in collaboration with Oxford Economics, identified the challenges, opportunities, value, and key technologies driving digital transformation. The findings specifically analyzed the performance of “digital leaders” – those who are connecting people, things, and businesses more intelligently, more effectively, and creating punctuated change faster than their less advanced rivals.

After analyzing the data, it was eye-opening to see that only three percent of companies surveyed (the top 100) are successfully realizing their full potential through digital transformation. Even more remarkable, these leaders share four fundamental traits, regardless of their region of operation, their size, their organizational structure, or their industry.

We distilled these traits in the hope that companies in the early stages of transformation, or still struggling to find their bearings, can embrace these principles and succeed. Ultimately, I see these leaders as truly ambidextrous organizations, managing evolutionary and revolutionary change simultaneously and embracing innovation – not just at the edges of their business, but firmly at its core.

Here are the four traits that set these leaders apart from the rest:

Trait #1: They see digital transformation as truly transformational

An overwhelming majority (96%) of digital leaders view digital transformation as a core business goal that requires a unified digital mindset across the entire enterprise. But instead of allowing individual functions to change at their own pace, digital leaders prefer to evolve the organization to help ensure the success of their digital strategies.

The study found that 56% of these businesses regularly shift their organizational structure, which includes processes, partners, suppliers, and customers, compared to 10% of remaining companies. Plus, 70% actively bring lines of business together through cross-functional processes and technologies.

By creating a firm foundation for transformation, digital leaders are further widening the gap between themselves and their less advanced competitors as they innovate business models that can mitigate emerging risks and seize new opportunities quickly.

Trait #2: They focus on transforming customer-facing functions first

Although most companies believe technology, the pace of change, and growing global competition are the key global trends that will affect everything for years to come, digital leaders are expanding their frame of mind to consider the influence of customer empowerment. Executives who build a momentum of breakthrough innovation and industry transformation are the ones that are moving beyond the high stakes of the market to the activation of complete, end-to-end customer experiences.

In fact, 92% of digital leaders have established sophisticated digital transformation strategies and processes to drive transformational change in customer satisfaction and engagement, compared to 22% of their less mature counterparts. As a result, 70% have realized significant or transformational value from these efforts.

Trait #3: They create a virtuous cycle of digital talent

There’s little doubt that the competition for qualified talent is fierce. But for nearly three-quarters of companies that demonstrate digital-transformation leadership, it is easier to attract and retain talent because they are five times more likely to leverage digitization to change their talent management efforts.

The impact of their efforts goes beyond empowering recruiters to identify best-fit candidates, highlight risk factors and hiring errors, and predict long-term talent needs. Nearly half (48%) of digital leaders understand that they must invest heavily in the development of digital skills and technology to drive revenue, retain productive employees, and create new roles to keep up with their digital maturity over the next two years, compared to 30% of all surveyed executives.

Trait #4: They invest in next-generation technology using a bimodal architecture

A couple years ago, Peter Sondergaard, senior vice president at Gartner and global head of research, observed that “CIOs can’t transform their old IT organization into a digital startup, but they can turn it into a bi-modal IT organization. Forty-five percent of CIOs state they currently have a fast mode of operation, and we predict that 75% of IT organizations will be bimodal in some way by 2017.”

Based on the results of the SAP Center for Business Insight study, Sondergaard’s prediction was spot on. As digital leaders dive into advanced technologies, 72% are using a digital twin of the conventional IT organization to operate efficiently without disruption while refining innovative scenarios to resolve business challenges, integrating them to stay ahead of the competition. Unfortunately, only 30% of less advanced businesses embrace this view.

Working within this bimodal architecture is emboldening digital leaders to take on incredibly progressive technology. For example, the study found that 50% of these firms are using artificial intelligence and machine learning, compared to seven percent of all respondents. They are also leading the adoption curve of Big Data solutions and analytics (94% vs. 60%) and the Internet of Things (76% vs. 52%).

Digital leadership is a practice of balance, not pure digitization

Most executives understand that digital transformation is a critical driver of revenue growth, profitability, and business expansion. However, as digital leaders are proving, digital strategies must deliver a balance of organizational flexibility, forward-looking technology adoption, and bold change. And clearly, this approach is paying dividends for them. They are growing market share, increasing customer satisfaction, improving employee engagement, and, perhaps more important, achieving more profitability than ever before.

For any company looking to catch up to digital leaders, the conversation around digital transformation needs to change immediately to combat three deadly sins: Stop investing in one-off, isolated projects hidden in a single organization. Stop viewing IT as an enabler instead of a strategic partner. Stop walling off the rest of the business from siloed digital successes.

As our study shows, companies that treat their digital transformation as an all-encompassing, all-sharing, and all-knowing business imperative will be the ones that disrupt the competitive landscape and stay ahead of a constantly evolving economy.

Follow me on Twitter @vivek_bapat

For more insight on digital leaders, check out the SAP Center for Business Insight report, conducted in collaboration with Oxford Economics, “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart.”



Digitalist Magazine

Do Consumers Seek More Credit After Their Score Recovers?

In a previous post, we noted that the majority of consumers who had a 7-year-old delinquency purged from their credit file saw improvements in their FICO® Scores. Now let’s look at whether these consumers’ credit-seeking behavior changed after the delinquency was purged and their score recovered. Were they more likely to apply for credit? Get approved and open new accounts?

To assess this, we looked at the proportion of the “delinquency purge” population (those that had a delinquency removed from their credit report between May 2016 and July 2016) that had a new inquiry or opened a new account in the three months following the purge window (August through October 2016).

In Figure 1, we compared those values to the same period a year earlier, to avoid capturing seasonal changes in credit habits.

[Figure 1: Year-over-year change in new inquiries and new account openings for the “delinquency purge” population]
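
To make the comparison concrete, here is a minimal sketch in Python – not FICO’s actual methodology – of the year-over-year cohort comparison behind Figure 1: flag, for each consumer in the cohort, whether a new inquiry or a new account appeared in August–October 2016 and in the same window a year earlier, then compare the rates. The consumer records below are hypothetical.

```python
# A minimal sketch of the year-over-year cohort comparison described above.
# All consumer-level flags are hypothetical; in practice these would come
# from credit bureau records for the "delinquency purge" population.
import pandas as pd

cohort = pd.DataFrame({
    "consumer_id":      [1, 2, 3, 4, 5, 6],
    "inquiry_2015":     [0, 1, 0, 0, 1, 0],  # new inquiry, Aug-Oct 2015
    "inquiry_2016":     [1, 1, 0, 1, 1, 0],  # new inquiry, Aug-Oct 2016
    "new_account_2015": [0, 0, 0, 0, 1, 0],  # account opened, Aug-Oct 2015
    "new_account_2016": [1, 0, 1, 0, 1, 0],  # account opened, Aug-Oct 2016
})

# Rate = share of the cohort with the event in the three-month window.
for metric in ("inquiry", "new_account"):
    before = cohort[f"{metric}_2015"].mean()
    after = cohort[f"{metric}_2016"].mean()
    print(f"{metric:12s} {before:.1%} -> {after:.1%} ({after - before:+.1%})")
```

Using the same three calendar months in both years is what avoids capturing the seasonal changes in credit habits noted above.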

The data showed that there was a minor increase in the percentage of consumers that had a new inquiry compared to a year prior, and a material increase in those that opened a new account. This is particularly noticeable for new bankcards.

For the “full recovery” population (those in the “delinquency purge” population who had no remaining serious delinquencies), the changes were slightly more pronounced, as seen in Figure 2.

[Figure 2: Year-over-year change in new inquiries and new account openings for the “full recovery” population]

We see a greater increase in both new inquiries and account openings for this segment than is observed in the broader “delinquency purge” population. Furthermore, we see higher rates of new account openings despite lower rates of inquiries than the “delinquency purge” population, suggesting a higher approval rate.

This supports the idea that these consumers, who have rebuilt their credit profile and FICO® Score through years of responsible behavior, are now very well-positioned to access new credit at more favorable terms than they would have been offered before (which in turn may induce them to take on new credit).

Is It the Score or the Lending Environment?

But perhaps the increase was due to other factors (such as less restrictive lending standards) and had nothing to do with the removed delinquencies. So we looked at the inquiry and new-account activity of the “delinquency baseline” population (those who had a serious delinquency in 2009–2010 but did not have a delinquency removed between May 2016 and July 2016) as a benchmark for comparison.

[Figure 3: Year-over-year change in new inquiries and new account openings for the “delinquency baseline” population]

In the “delinquency baseline” population, we found that there was actually a small decline in inquiries between 2015 and 2016, and relatively minor changes in new account openings. The figures look quite different from the increases in new credit activity seen in the “delinquency purge” population.

The difference in the year-over-year change in new inquiry rate between “delinquency purge” consumers and “delinquency baseline” consumers suggests an increase in the willingness of the “delinquency purge” population to seek credit, and increased confidence in their ability to get approved. And the data supports that confidence – the increase in new accounts opened is disproportionately large when compared to the increase in inquiries.
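
The benchmark logic amounts to a simple difference-in-differences comparison, sketched below with made-up rates: if the baseline population’s activity is flat or declining year over year, a looser lending environment alone cannot explain the purge population’s increase.

```python
# A minimal sketch, with hypothetical aggregate rates, of the benchmark
# comparison described above (percent of each population with a new inquiry).
rates = {
    "purge":    {"2015": 30.0, "2016": 33.0},
    "baseline": {"2015": 31.0, "2016": 29.5},
}

# Year-over-year change within each population, in percentage points.
yoy = {group: v["2016"] - v["2015"] for group, v in rates.items()}
print(f"purge YoY change:    {yoy['purge']:+.1f} pts")
print(f"baseline YoY change: {yoy['baseline']:+.1f} pts")

# The difference-in-differences estimate attributes the gap between the two
# changes to the purge event rather than to market-wide lending conditions.
print(f"difference-in-differences: {yoy['purge'] - yoy['baseline']:+.1f} pts")
```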

As we move further away from the Great Recession, consumers whose credit situation was adversely impacted but who have since righted the ship via responsible use of credit should continue to see marked improvements in their FICO® Score. Based on what we’ve observed from the “delinquency purge” population, their increased creditworthiness comes with expanded interest in and access to credit: a potential win-win for both lenders and consumers.


FICO


Women And Financial Services: Work In Progress

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control and Prevention (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman was infected with a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, according to the same 2014 estimate, antibiotic resistance could cause 10 million deaths a year and a cumulative US$100 trillion in global costs. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotic was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.


Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in the medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.


Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.
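
As an illustration of the kind of measurement such a system automates, here is a minimal sketch of one standard stewardship metric: days of therapy (DOT) per 1,000 patient-days. The order records and census figure are hypothetical, not drawn from Cincinnati Children’s system.

```python
# A minimal sketch of computing days of therapy (DOT) per 1,000 patient-days,
# a common antimicrobial stewardship metric. All inputs are hypothetical.
from collections import defaultdict

# (patient_id, antibiotic, days of therapy) for one month, made up.
orders = [
    ("p1", "vancomycin", 5),
    ("p2", "meropenem", 7),
    ("p3", "amoxicillin", 3),
    ("p4", "vancomycin", 4),
]
patient_days = 2_500  # total inpatient days that month (hypothetical)

# Aggregate days of therapy per drug.
dot_by_drug = defaultdict(int)
for _, drug, days in orders:
    dot_by_drug[drug] += days

# Normalize by the hospital census so months and facilities are comparable.
for drug, dot in sorted(dot_by_drug.items()):
    rate = 1000 * dot / patient_days
    print(f"{drug:12s} {dot:3d} DOT -> {rate:.1f} per 1,000 patient-days")
```

Tracking rates like these over time, with broad-spectrum “big gun” agents flagged separately, is how a stewardship program can tell whether targeted prescribing is actually displacing them.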

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. The first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.


As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
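
The underlying technique can be sketched compactly, even though the production classifiers operate on far larger feature spaces. The toy example below uses hypothetical sequences and labels, with a random forest standing in for whatever learner the PATRIC team actually uses; the core idea is to convert each genome into k-mer counts and train a supervised classifier to separate resistant from susceptible isolates.

```python
# A minimal sketch of k-mer-based AMR prediction. Sequences, labels, and the
# choice of model are illustrative stand-ins, not PATRIC's actual pipeline.
from itertools import product
from sklearn.ensemble import RandomForestClassifier

K = 4  # real systems use much longer k-mers and millions of features
ALL_KMERS = [''.join(p) for p in product("ACGT", repeat=K)]

def kmer_features(seq, k=K):
    """Count occurrences of every length-k DNA word in the sequence."""
    counts = dict.fromkeys(ALL_KMERS, 0)
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    return [counts[km] for km in ALL_KMERS]

# Toy "genomes" (real inputs are millions of bases) with AMR labels:
# 1 = resistant, 0 = susceptible.
genomes = ["ACGTACGTGGCCAATT" * 4, "TTGGCCAACGTACGTA" * 4,
           "ACGTACGTACGTACGT" * 4, "GGCCGGCCGGCCGGCC" * 4]
labels = [1, 1, 0, 0]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit([kmer_features(g) for g in genomes], labels)

# In a clinical pipeline this prediction would run on a freshly sequenced
# isolate, skipping the 16-to-20-hour culture step entirely.
print(model.predict([kmer_features("ACGTACGTGGCCAATT" * 4)]))
```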

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.


The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines even in young and healthy patients in hospitals after the use of antibiotics.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”
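
A minimal sketch of that “pulse check” idea: scan sequencing reads from a sample against a catalog of known resistance-gene markers and count the hits. The marker sequences and reads below are made up; real pipelines match billions of bases against curated resistance-gene databases.

```python
# A minimal sketch of screening microbiome sequencing reads for known
# resistance-gene markers. All sequences here are fabricated fragments.
from collections import Counter

resistance_markers = {  # hypothetical marker sequence -> gene family
    "ATGGCACTG": "mecA (methicillin resistance)",
    "GGTACCTTA": "blaKPC (carbapenemase)",
    "CCATTGACG": "vanA (vancomycin resistance)",
}

reads = [  # hypothetical reads from a stool-sample sequencing run
    "TTATGGCACTGAA", "GGGTACCTTACCA", "ATGGCACTGTTTT", "ACGTACGTACGTA",
]

# Count reads containing each marker -- a crude proxy for gene abundance.
hits = Counter()
for read in reads:
    for marker, gene in resistance_markers.items():
        if marker in read:
            hits[gene] += 1

for gene, n in hits.most_common():
    print(f"{gene}: {n} read(s)")
```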

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.


Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there. D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.



Digitalist Magazine

How Are Telecom Providers Managing Cybersecurity Risk?

Data breaches are in the news on an almost daily basis, and telecom companies are not immune to attack – indeed, cyber-attacks have led to the loss of almost 50 million customer records in the past 10 years.

For telecom businesses, it’s not only the frequency of attempted breaches and the number of records lost that are of concern; it’s also the type of data that can be lost. Telecom companies are guardians of a rich set of customer data, including financial information. They also hold a valuable set of behavioural data about their customers, based on how they use their services. This rich data set is a honey pot for cybercriminals – particularly those using data to fuel identity theft.

We included telecommunications providers in the cybersecurity survey carried out on our behalf by independent research company Ovum. What did we find?

A full assessment of our findings for the Telco industry can be found in our e-book “Cybersecurity for Telecoms – Views from the C Suite.” Here are some headlines:

Telecom companies have seen an increase in attacks

53% said they’d experienced an increase in attempted data breaches. The larger telecom businesses had experienced this more frequently – 71% of those with over 10,000 employees said they’d seen attempted breaches increase in the past year. They also expect the risk to carry on increasing; 81% said they expect attacks to increase in the coming year.

Telecom companies think they are doing very well at fighting cybercrime – but are they?

The majority of telecom industry respondents thought that their organization’s cybersecurity was better than average when compared to their competitors. In fact, 39% rated themselves as top performers, recognized for their cybersecurity efforts. None thought they were below average.

This is statistically unlikely, and a lack of objective measurement of cybersecurity might be to blame: 35% self-assess based on their own benchmarks and criteria, and 6% don’t carry out a measurable assessment at all. Objective measurement could well be a starting point for telecom companies that really want to understand and improve their cybersecurity position — taking a free trial of the FICO Enterprise Security Score will help!

There are gaps in the cyber-readiness of telecom companies

When we looked at specific measures that telecom companies could take to combat cybercrime, it became clear that not every telecom company had taken all the steps they could to enhance their cybersecurity posture. Perhaps most shocking was that only 54% of telecom companies had a tested data breach response plan — in Sweden it was only 38%. Telecom companies were doing better in regards to the other specific measures we asked about, but as the graph below shows there are still gaps.

[Figure: Specific cybersecurity measures taken by telecom companies, from the FICO/Ovum cyber survey]

Next year the General Data Protection Regulation (GDPR) comes into force in Europe. This legislation means that the repercussions of a data breach will become more severe and include a substantial increase in the fines that can be imposed. Telecom companies cannot afford to relax their cyber-readiness efforts.

Download the e-book: “Cybersecurity for Telecoms – Views from the C Suite”


FICO

Culture: More Than Just An HR Thing

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years ago. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria runs rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

sap Q217 digital double feature2 images2 Culture: More Than Just An HR ThingBut that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed nightmare bacteria. And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, the global cost of antibiotic resistance could grow to 10 million deaths and US$ 100 trillion a year, according to a 2014 estimate. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotic was introduced 30 year ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.

sap Q217 digital double feature2 images3 1024x572 Culture: More Than Just An HR Thing

Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.

sap Q217 digital double feature2 images4 Culture: More Than Just An HR Thing

Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial finds that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip  technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. The first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.

As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its AMR mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
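A simplified sketch of that pipeline, shown below, encodes each genome as counts of its short substrings (k-mers) and trains an off-the-shelf classifier on them. The toy sequences, labels, and choice of random forest are illustrative assumptions; PATRIC's published classifiers are trained on thousands of genomes and differ in their details.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import CountVectorizer

# Invented toy genomes labeled resistant (1) or susceptible (0). Real
# training sets are thousands of assembled genomes with lab-confirmed
# phenotypes.
genomes = [
    "ATGGCGTTTAAACGGGCCCTTTACG",
    "ATGGCGTTTAAACGGGCCCTTAACG",
    "TTGACCGGTAAACCCGGGTTTAGGC",
    "TTGACCGGTAAACCCGGGTTAAGGC",
]
labels = [1, 1, 0, 0]

# Encode each genome as counts of its length-8 substrings (k-mers), a
# common alignment-free representation for AMR prediction.
kmerize = CountVectorizer(analyzer="char", ngram_range=(8, 8))
X = kmerize.fit_transform(genomes)

clf = RandomForestClassifier(random_state=0).fit(X, labels)

# Predict resistance for a new isolate's assembled sequence.
new_isolate = kmerize.transform(["ATGGCGTTTAAACGGGCCCTTTACG"])
print(clf.predict(new_isolate))  # -> [1], i.e., predicted resistant
```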

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines even in young and healthy patients in hospitals after the use of antibiotics.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”
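To make the “Big Data challenge” concrete, here is a minimal, hypothetical sketch of the bioinformatics step McDonald describes: scanning a sample's sequencing reads for markers of known resistance genes. The marker sequences, the read data, and the substring-matching shortcut are invented for illustration; a real pipeline would align millions of reads against a curated resistance-gene database.

```python
from collections import Counter

# Hypothetical catalog mapping marker k-mers to known resistance genes.
# These sequences are invented; only the gene names are real examples.
RESISTANCE_MARKERS = {
    "ACGGTTTACCGGATT": "blaKPC (carbapenem resistance)",
    "TTGGCCAATCGGTAA": "mecA (methicillin resistance)",
}

def profile_sample(reads):
    """Tally resistance-gene markers seen in one sample's sequencing reads."""
    hits = Counter()
    for read in reads:
        for marker, gene in RESISTANCE_MARKERS.items():
            if marker in read:
                hits[gene] += 1
    return hits

# A stool swab might yield millions of reads; three stand in here.
reads = [
    "GGGACGGTTTACCGGATTTCCA",
    "AAATTTCCCGGGAAATTTCCCG",
    "CCTTGGCCAATCGGTAACCTTG",
]
print(profile_sample(reads))
```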

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that, if we can get there, could save millions of dollars, millions of lives, and perhaps civilization itself. D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.

Digitalist Magazine


The Future of AI: Meet the Multi-Tasking Machines

To commemorate the silver jubilee of FICO’s use of artificial intelligence and machine learning, we asked FICO employees a question: What does the future of AI look like? The post below is one of the thought-provoking responses, from Chahm An, a lead analytic scientist at FICO, working in San Diego.

Artificial intelligence, like human intelligence, uses observations from the past to learn a general model that can be used to make predictions about future similar occurrences. The future I see for AI is based on current work being done in this field, and grounded in what I saw 20 years ago, when I first began to study AI.

What’s Really Accelerating AI?

In 1997, IBM’s Deep Blue had just defeated reigning world champion Garry Kasparov at chess, in what was seen as a landmark event: artificial intelligence surpassing a human champion at an intellectual challenge. Automated speech recognition systems were beginning to replace touchtone menus in commercial applications. The human genome project was nearing completion, and dot-coms were popping up like weeds.

Conventional wisdom on the state of AI at the time held that while some tasks that appeared easy for humans, such as driving a car, were extremely difficult for computers to learn, other tasks that were difficult for humans, such as playing chess at a grandmaster level, could be achieved by brute-force branching computation.

These brute-force optimizations were not feasible for problems with many more possible outcomes, as found in more complex games such as Go, or in other tasks such as computer vision and natural language processing. Speech applications were thus limited to menu interfaces with constrained choices, and optical character recognition was similarly limited in scope.

The development most would credit with overcoming this challenge is neural network technology, which has allowed us to train models that help machines make complex decisions at a higher, more abstract level, similar to the way the brain functions. However, this isn’t the complete story – after all, FICO has used neural networks for 25 years, so they can hardly be considered a new development.

More accurately, what has changed is that the cost to train neural networks has decreased dramatically, making it feasible to develop complex networks and further accelerating neural network technologies, making them more efficient and accurate. Geoffrey Hinton, a leader in the field of machine learning, has humorously noted that training deep belief networks became feasible thanks to a roughly 100,000-fold speedup: partly the result of new, more efficient algorithms, but mostly of the fact that computers had become about 1,000 times faster in the 15 years of relative stagnation since the building blocks were developed.

Although Moore’s Law appears to be slowing down, I predict that as more focus is put on hardware specialized for machine learning, and if research into more efficient training algorithms continues on its trend, the cost to perform machine learning tasks in 2027 will be roughly 0.1% of what it is today. This means that models that would take years to train with today’s technology would take less than a day, and learning tasks that currently require resources available only to supercomputers will be feasible on everyday mobile consumer devices.
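As a back-of-envelope check on that prediction (purely the arithmetic, not the forecast itself): a drop to 0.1% of today’s cost is a 1,000-fold reduction, which over ten years implies roughly a doubling of price-performance every year.

```python
# Compound-rate check: 1,000x cheaper over 10 years implies roughly a
# doubling of price-performance each year.
cost_ratio = 1000          # today's cost divided by the predicted 2027 cost
years = 10
annual_gain = cost_ratio ** (1 / years)
print(f"Implied improvement: {annual_gain:.2f}x per year")  # ~2.00x
```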

How will this ability manifest in our lives? Here are three predictions.

Prediction 1: Dynamic Learners – Everywhere, All the Time

Despite the hype and fanfare surrounding deep learning, most of these advanced neural network architectures are stuck in a box. The latest deep convolutional nets can correctly identify obscure breeds of dogs better than the average human, but they’re highly optimized towards strictly defined snapshots as inputs and are limited to a pre-defined set of classifications.

With greater availability of computational resources and data, I believe that the trend will move from deep architectures with a single snapshot input and single classification output to much more complex, deep recurrent networks that take in multiple streams of varying input and offer a multitude of varying output types. Instead of static image classification, a computer vision system may work with two continuous streams of binocular video similar to what we process as humans. Not only will it identify that a dog is a beagle, but also that the dog is taking a walk, and that the lady who’s taking it on a walk looks kind of like Meryl Streep.
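To illustrate what “multiple streams of varying input and a multitude of varying output types” could look like in code, here is a minimal, hypothetical PyTorch sketch: two synchronized feature streams (standing in for binocular video) feed a recurrent core that serves several prediction heads. All dimensions, head names, and class counts are invented for illustration, not a description of any deployed system.

```python
import torch
import torch.nn as nn

class MultiStreamNet(nn.Module):
    """Toy multi-input, multi-output recurrent network."""
    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        # Model the fused streams over time with a recurrent core.
        self.rnn = nn.GRU(input_size=2 * feat_dim, hidden_size=hidden,
                          batch_first=True)
        # Separate heads read the shared recurrent state.
        self.breed_head = nn.Linear(hidden, 120)      # e.g., dog breed
        self.action_head = nn.Linear(hidden, 20)      # e.g., "taking a walk"
        self.lookalike_head = nn.Linear(hidden, 500)  # e.g., celebrity match

    def forward(self, left_stream, right_stream):
        # Fuse the two streams frame by frame, then run them through the GRU.
        fused = torch.cat([left_stream, right_stream], dim=-1)
        states, _ = self.rnn(fused)
        last = states[:, -1]  # summary of the sequence so far
        return (self.breed_head(last),
                self.action_head(last),
                self.lookalike_head(last))

# One batch of 8 clips, 30 frames each, 64-dim features per stream.
net = MultiStreamNet()
left = torch.randn(8, 30, 64)
right = torch.randn(8, 30, 64)
breed, action, lookalike = net(left, right)
print(breed.shape, action.shape, lookalike.shape)
```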

Since we’ve got supercomputers full of detectors in our pockets, this continuously streaming process will be learning on the job as well. While learning tends to be a large, computationally expensive batch job performed on a server in today’s world, it is likely to be shifted to end devices to some extent, both to distribute costs and to make each device adapted to its unique environment. My phone might ask me if I was interested in the price of dog leashes, or whether I think the lady who looks like Meryl Streep is cute, and continuously build a profile of my preferences.

Prediction 2: Generalized Learners – Not Just One-Trick Ponies

I believe that there will be a significant trend towards intelligent systems that do well at more than a single task. The same neural network that is designed to filter out spam emails may also reuse its knowledge to detect phishing attempts, prioritize your inbox by importance, and help you draft responses to common requests. Again, this follows the trend of multiple inputs and multiple outputs, but the result will be more robust systems that are capable of understanding abstract concepts the way that we do.

To achieve this robustness, we will probably see multiple types of learners interacting with each other – perhaps a self-contained Bayesian network may process text in an unsupervised fashion to provide feedback to both a recurrent neural network and a random forest classifier to form a consensus opinion, all within a reinforcement learning system. With a glut of unlabeled data to work with, we are also likely to see more unsupervised learning of generative models that understand the underlying distribution of variables of interest, rather than the discriminative models currently popular in supervised learning.
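As a rough illustration of such a consensus of heterogeneous learners, the sketch below votes three different scikit-learn models on synthetic data. It is a deliberate simplification of the mix described above: a naive Bayes model stands in for the Bayesian network, a small feed-forward network for the recurrent one, and the reinforcement learning wrapper is omitted entirely.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data; any labeled dataset would do here.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Three different model families vote to form a consensus opinion.
consensus = VotingClassifier(
    estimators=[
        ("bayes", GaussianNB()),
        ("neural_net", MLPClassifier(max_iter=1000, random_state=0)),
        ("forest", RandomForestClassifier(random_state=0)),
    ],
    voting="soft",  # average predicted probabilities across the learners
)
consensus.fit(X, y)
print(consensus.predict(X[:5]))
```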

The result of these generalized learners will be intelligent systems that are not just optimized toward some pre-defined objective but actually have a degree of expertise in an area of knowledge. Instead of a binary decision maker, we may get systems that can explain their judgment in terms that are easy to understand and adapt to more customized objectives, systems we can have greater confidence in.

Prediction 3: Mixed Reception, But Gradual Trust

As seen with the issues surrounding self-driving cars, there is likely to be a great deal of resistance to certain advances in AI systems. A computer that can defeat Lee Sedol at a game of Go seems innocuous enough, but are we ready to trust an artificial intelligence with deadly force? Do we trust smart devices and their manufacturers to listen in to every occurrence in our daily lives? Are we afraid that computers will become better at our jobs than we are and render us unemployed?

I believe that this apprehension will be the biggest challenge to the advancement of AI in the next decade. There will likely be legislation put in place to further increase the privacy, job security and safety of the consumers who would benefit most from future advances in AI.

However, progress appears to be inevitable, as we already count on technology for so much nowadays. Who uses a physical map to navigate anymore, for example?

Perhaps artificial intelligence systems will need to learn to become experts at PR and marketing before moving on to the next stage of adoption.

Making Progress Today

As I was writing this blog post, I realized that FICO already does most of the things that I have predicted will become mainstream for artificial intelligence. We are already great at learning continuous customer profiles over time, auto-adaptation of models, and providing reasons along with scores. This speaks to the vision that put FICO ahead of the game 25 years ago and will continue to keep us ahead in the future of AI.

See other FICO posts on artificial intelligence.

FICO

Flash Briefing: The Machines Talk Back – How Alexa And Her Friends Are Mastering Casual Conversation

When it comes to buying things—even big-ticket items—the way we make decisions makes no sense. One person makes an impulsive offer on a house because of the way the light comes in through the kitchen windows. Another gleefully drives a high-end sports car off the lot even though it will probably never approach the limits it was designed to push.

We can (and usually do) rationalize these decisions after the fact by talking about needing more closet space or wanting to out-accelerate an 18-wheeler as we merge onto the highway, but years of study have arrived at a clear conclusion:

When it comes to the customer experience, human beings are fundamentally irrational.

In the brick-and-mortar past, companies could leverage that irrationality in time-tested ways. They relied heavily on physical context, such as an inviting retail space, to make products and services as psychologically appealing as possible. They used well-trained salespeople and employees to maximize positive interactions and rescue negative ones. They carefully sequenced customer experiences, such as having a captain’s dinner on the final night of a cruise, to play on our hard-wired craving to end experiences on a high note.

Today, though, customer interactions are increasingly moving online. Fortune reports that on 2016’s Black Friday, the day after Thanksgiving that is so crucial to holiday retail results, 108.5 million Americans shopped online, while only 99.1 million visited brick-and-mortar stores. That gap of 9.4 million shoppers was a dramatic change from just one year prior, when on- and offline Black Friday shopping were more or less equal.

When people browse in a store for a few minutes, an astute salesperson can read the telltale signs that they’re losing interest and heading for the exit. The salesperson can then intervene, answering questions and closing the sale.

Replicating that in a digital environment isn’t as easy, however. Despite all the investments companies have made to counteract e-shopping cart abandonment, they lack the data that would let them anticipate when a shopper is on the verge of opting out of a transaction, and the actions they take to lure someone back afterwards can easily come across as more intrusive than helpful.

In a digital environment, companies need to figure out how to use Big Data analysis and digital design to compensate for the absence of persuasive human communication and physical sights, sounds, and sensations. What’s more, a 2014 Gartner survey found that 89% of marketers expected customer experience to be their primary differentiator by 2016, and we’re already well into 2017.

As transactions continue to shift toward the digital and omnichannel, companies need to figure out new ways to gently push customers along the customer journey—and to do so without frustrating, offending, or otherwise alienating them.

The quest to understand online customers better in order to influence them more effectively is built on a decades-old foundation: behavioral psychology, the study of the connections between what people believe and what they actually do. All of marketing and advertising is based on changing people’s thoughts in order to influence their actions. However, it wasn’t until 2001 that a now-famous article in the Harvard Business Review formally introduced the idea of applying behavioral psychology to customer service in particular.

The article’s authors, Richard B. Chase and Sriram Dasu, respectively a professor and assistant professor at the University of Southern California’s Marshall School of Business, describe how companies could apply fundamental tenets of behavioral psychology research to “optimize those extraordinarily important moments when the company touches its customers—for better and for worse.” Their five main points were simple but have proven effective across multiple industries:

  1. Finish strong. People evaluate experiences after the fact based on their high points and their endings, so the way a transaction ends is more important than how it begins.
  2. Front-load the negatives. To ensure a strong positive finish, get bad experiences out of the way early.
  3. Spread out the positives. Break up the pleasurable experiences into segments so they seem to last longer.
  4. Provide choices. People don’t like to be shoved toward an outcome; they prefer to feel in control. Giving them options within the boundaries of your ability to deliver builds their commitment.
  5. Be consistent. People like routine and predictability.

For example, McKinsey cites a major health insurance company that experimented with this framework in 2009 as part of its health management program. A test group of patients received regular coaching phone calls from nurses to help them meet health goals.

The front-loaded negative was inherent: the patients knew they had health problems that needed ongoing intervention, such as weight control or consistent use of medication. Nurses called each patient on a frequent, regular schedule to check their progress (consistency and spread-out positives), suggested next steps to keep them on track (choices), and cheered on their improvements (a strong finish).

McKinsey reports the patients in the test group were more satisfied with the health management program by seven percentage points, more satisfied with the insurance company by eight percentage points, and more likely to say the program motivated them to change their behavior by five percentage points.

The nurses who worked with the test group also reported increased job satisfaction. And these improvements all appeared in the first two weeks of the pilot program, without significantly affecting the company’s costs or tweaking key metrics, like the number and length of the calls.

Indeed, an ongoing body of research shows that positive reinforcements and indirect suggestions influence our decisions better and more subtly than blatant demands. This concept hit popular culture in 2008 with the bestselling book Nudge.

Written by University of Chicago economics professor Richard H. Thaler and Harvard Law School professor Cass R. Sunstein, Nudge first explains this principle, then explores it as a way to help people make decisions in their best interests, such as encouraging people to eat healthier by displaying fruits and vegetables at eye level or combatting credit card debt by placing a prominent notice on every credit card statement informing cardholders how much more they’ll spend over a year if they make only the minimum payment.

Whether they’re altruistic or commercial, nudges work because our decision-making is irrational in a predictable way. The question is how to apply that awareness to the digital economy.

In its early days, digital marketing assumed that online shopping would be purely rational, a tool that customers would use to help them zero in on the best product at the best price. The assumption was logical, but customer behavior remained irrational.

Our society is overloaded with information and short on time, says Brad Berens, Senior Fellow at the Center for the Digital Future at the University of Southern California, Annenberg, so it’s no surprise that the speed of the digital economy exacerbates our desire to make a fast decision rather than a perfect one, as well as increasing our tendency to make choices based on impulse rather than logic.

Buyers want what they want, but they don’t necessarily understand or care why they want it. They just want to get it and move on, with minimal friction, to the next thing. “Most of our decisions aren’t very important, and we only have so much time to interrogate and analyze them,” Berens points out.

But limited time and mental capacity for decision-making is only half the issue. The other half is that while our brains are both logical and emotional, the emotional side—also known as the limbic system or, more casually, the primitive lizard brain—is far older and more developed. It’s strong enough to override logic and drive our decisions, leaving rational thought to, well, rationalize our choices after the fact.

This is as true in the B2B realm as it is for consumers. The business purchasing process, governed as it is by requests for proposals, structured procurement processes, and permission gating, is designed to ensure that the people with spending authority make the most sensible deals possible. However, research shows that even in this supposedly rational process, the relationship with the seller is still more influential than product quality in driving customer commitment and loyalty.

Baba Shiv, a professor of marketing at Stanford University’s Graduate School of Business, studies how the emotional brain shapes decisions and experiences. In a popular TED Talk, he says that people in the process of making decisions fall into one of two mindsets: Type 1, which is stressed and wants to feel comforted and safe, and Type 2, which is bored or eager and wants to explore and take action.

People can move between these two mindsets, he says, but in both cases, the emotional brain is in control. Influencing it means first delivering a message that soothes or motivates, depending on the mindset the person happens to be in at the moment, and only then presenting the logical argument to help rationalize the action.

In the digital economy, working with those tendencies means designing digital experiences with the full awareness that people will not evaluate them objectively, says Ravi Dhar, director of the Center for Customer Insights at the Yale School of Management. Since any experience’s greatest subjective impact in retrospect depends on what happens at the beginning, the end, and the peaks in between, companies need to design digital experiences to optimize those moments—to rationally design experiences for limited rationality.

This often involves making multiple small changes in the way options are presented well before the final nudge into making a purchase. A paper that Dhar co-authored for McKinsey offers the example of a media company that puts most of its content behind a paywall but offers free access to a limited number of articles a month as an incentive to drive subscriptions.

Many nonsubscribers reached their limit of free articles in the morning, but they were least likely to respond to a subscription offer generated by the paywall at that hour, because they were reading just before rushing out the door for the day. When the company delayed offers until later in the day, when readers were less distracted, successful subscription conversions increased.
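A minimal sketch of that timing rule might look like the following; the cutoff hour and function are invented assumptions, since the example doesn’t spell out the company’s actual logic.

```python
from datetime import datetime

# Hold the subscription offer until a less-distracted hour instead of
# firing it the moment the free-article limit is hit. The threshold is
# an invented assumption for illustration.
QUIET_HOUR = 17  # assume evening readers are less rushed than morning ones

def should_show_subscription_offer(free_limit_reached: bool,
                                   now: datetime) -> bool:
    return free_limit_reached and now.hour >= QUIET_HOUR

print(should_show_subscription_offer(True, datetime(2017, 7, 24, 8)))   # False
print(should_show_subscription_offer(True, datetime(2017, 7, 24, 19)))  # True
```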

Pre-selecting default options for necessary choices is another way companies can design digital experiences to follow customers’ preference for the path of least resistance. “We know from a decade of research that…defaults are a de facto nudge,” Dhar says.

For example, many online retailers set a default shipping option because customers have to choose a way to receive their packages and are more likely to passively allow the default option than actively choose another one. Similarly, he says, customers are more likely to enroll in a program when the default choice is set to accept it rather than to opt out.
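A sketch of the default-option nudge is even simpler; the option names below are illustrative, not any retailer’s API. Customers who make no active choice flow down the pre-selected path.

```python
# Default-option nudge at checkout: the pre-selected choice wins unless
# the customer actively picks something else.
SHIPPING_OPTIONS = ("standard", "expedited", "overnight")
DEFAULT_SHIPPING = "standard"

def resolve_shipping(customer_choice=None):
    """Honor an explicit choice; otherwise fall back to the default."""
    if customer_choice in SHIPPING_OPTIONS:
        return customer_choice
    return DEFAULT_SHIPPING

print(resolve_shipping())             # -> "standard": the default wins
print(resolve_shipping("overnight"))  # -> "overnight": explicit choice wins
```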

Another intriguing possibility lies in the way customers react differently to on-screen information based on how that information is presented. Even minor tweaks can have a disproportionate impact on the choices people make, as explained in depth by University of California, Los Angeles, behavioral economist Shlomo Benartzi in his 2015 book, The Smarter Screen.

A few of the conclusions Benartzi reached: items at the center of a laptop screen draw more attention than those at the edges. Those on the upper left of a screen split into quadrants attract more attention than those on the lower left. And intriguingly, demographics are important variables.

Benartzi cites research showing that people over 40 prefer more visually complicated, text-heavy screens than younger people, who are drawn to saturated colors and large images. Women like screens that use a lot of different colors, including pastels, while men prefer primary colors on a grey or white background. People in Malaysia like lots of color; people in Germany don’t.

This suggests companies need to design their online experiences very differently for middle-aged women than they do for teenage boys. And, as Benartzi writes, “it’s easy to imagine a future in which each Internet user has his or her own ‘aesthetic algorithm,’ customizing the appearance of every site they see.”

Applying behavioral psychology to the digital experience in more sophisticated ways will require additional formal research into recommendation algorithms, predictions, and other applications of customer data science, says Jim Guszcza, PhD, chief U.S. data scientist for Deloitte Consulting.

In fact, given customers’ tendency to make the fastest decisions, Guszcza believes that in some cases, companies may want to consider making choice environments more difficult to navigate—a process he calls “disfluencing”—in high-stakes situations, like making an important medical decision or an irreversible big-ticket purchase. Choosing a harder-to-read font and a layout that requires more time to navigate forces customers to work harder to process the information, sending a subtle signal that it deserves their close attention.

That said, a company can’t apply behavioral psychology to deliver a digital experience if customers don’t engage with its site or mobile app in the first place. Addressing this often means making the process as convenient as possible, itself a behavioral nudge.

A digital solution that’s easy to use and search, offers a variety of choices pre-screened for relevance, and provides a friction-free transaction process is the equivalent of putting a product at eye level—and that applies far beyond retail. Consider the Global Entry program, which streamlines border crossings into the U.S. for pre-approved international travelers. Members can skip long passport control lines in favor of scanning their passports and answering a few questions at a touchscreen kiosk. To date, 1.8 million people have decided this convenience far outweighs the slow pace of approvals.

The basics of influencing irrational customers are essentially the same whether they’re taking place in a store or on a screen. A business still needs to know who its customers are, understand their needs and motivations, and give them a reason to buy.

And despite the accelerating shift to digital commerce, we still live in a physical world. “There’s no divide between old-style analog retail and new-style digital retail,” Berens says. “Increasingly, the two are overlapping. One of the things we’ve seen for years is that people go into a store with their phones, shop for a better price, and buy online. Or vice versa: they shop online and then go to a store to negotiate for a better deal.”

Still, digital increases the number of touchpoints from which the business can gather, cluster, and filter more types of data to make great suggestions that delight and surprise customers. That’s why the hottest word in marketing today is omnichannel. Bringing behavioral psychology to bear on the right person in the right place in the right way at the right time requires companies to design customer experiences that bridge multiple channels, on- and offline.

Amazon, for example, is known for its friction-free online purchasing. The company’s pilot store in Seattle has no lines or checkout counters, extending the brand experience into the physical world in a way that aligns with what customers already expect of it, Dhar says.

Omnichannel helps counter some people’s tendency to believe their purchasing decision isn’t truly well informed unless they can see, touch, hear, and in some cases taste and smell a product. Until we have ubiquitous access to virtual reality systems with full haptic feedback, the best way to address these concerns is by providing personalized, timely, relevant information and feedback in the moment through whatever channel is appropriate. That could be an automated call center that answers frequently asked questions, a video that shows a product from every angle, or a demonstration wizard built into the product. Any of these channels could also suggest the customer visit the nearest store to receive help from a human.

sap Q217 digital double feature1 images4 Flash Briefing: The Machines Talk Back – How Alexa And Her Friends Are Mastering Casual Conversation

The omnichannel approach gives businesses plenty of opportunities to apply subtle nudges across physical and digital channels. For example, a supermarket chain could use store-club card data to push personalized offers to customers’ smartphones while they shop. “If the data tells them that your goal is to feed a family while balancing nutrition and cost, they could send you an e-coupon offering a discount on a brand of breakfast cereal that tastes like what you usually buy but contains half the sugar,” Guszcza says.

Similarly, a car insurance company could provide periodic feedback to policyholders through an app or even the digital screens in their cars, he suggests. “Getting a warning that you’re more aggressive than 90% of comparable drivers and three tips to avoid risk and lower your rates would not only incentivize the driver to be more careful for financial reasons but reduce claims and make the road safer for everyone.”

Digital channels can also show shoppers what similar people or organizations are buying, let them solicit feedback from colleagues or friends, and read reviews from other people who have made the same purchases. This leverages one of the most familiar forms of behavioral psychology—reinforcement from peers—and reassures buyers with Shiv’s Type 1 mindset that they’re making a choice that meets their needs or encourages those with the Type 2 mindset to move forward with the purchase. The rational mind only has to ask at the end of the process “Am I getting the best deal?” And as Guszcza points out, “If you can create solutions that use behavioral design and digital technology to turn my personal data into insight to reach my goals, you’ve increased the value of your engagement with me so much that I might even be willing to pay you more.”

sap Q217 digital double feature1 images10 1024x572 Flash Briefing: The Machines Talk Back – How Alexa And Her Friends Are Mastering Casual Conversation

Many transactions take place through corporate procurement systems that allow a company to leverage not just its own purchasing patterns but all the data in a marketplace specifically designed to facilitate enterprise purchasing. Machine learning can leverage this vast database of information to provide the necessary nudge to optimize purchasing patterns, when to buy, how best to negotiate, and more. To some extent, this is an attempt to eliminate psychology and make choices more rational.

B2B spending is tied into financial systems and processes, logistics systems, transportation systems, and other operational requirements in a way no consumer spending can be. A B2B decision is less about making a purchase that satisfies a desire than it is about making a purchase that keeps the company functioning.

That said, the decision still isn’t entirely rational, Berens says. When organizations have to choose among vendors offering relatively similar products and services, they generally opt for the vendor whose salespeople they like the best.

This means B2B companies have to make sure they meet or exceed parity with competitors on product quality, pricing, and time to delivery to satisfy all the rational requirements of the decision process. Only then can they bring behavioral psychology to bear by delivering consistently superior customer service, starting as soon as the customer hits their app or website and spreading out positive interactions all the way through post-purchase support. Finishing strong with a satisfied customer reinforces the relationship with a business customer just as much as it does with a consumer.

sap Q217 digital double feature1 images11 1024x572 Flash Briefing: The Machines Talk Back – How Alexa And Her Friends Are Mastering Casual Conversation

The best nudges make the customer relationship easy and enjoyable by providing experiences that are effortless and fun to choose, on- or offline, Dhar says. What sets the digital nudge apart in accommodating irrational customers is its ability to turn data about them and their journey into more effective, personalized persuasion even in the absence of the human touch.

Yet the subtle art of influencing customers isn’t just about making a sale, and it certainly shouldn’t be about persuading people to act against their own best interests, as Nudge co-author Thaler reminds audiences by exhorting them to “nudge for good.”

Guszcza, who talks about influencing people to make the choices they would make if only they had unlimited rationality, says companies that leverage behavioral psychology in their digital experiences should do so with an eye to creating positive impact for the customer, the company, and, where appropriate, the society.

In keeping with that ethos, any customer experience designed along behavioral lines has to include the option of letting the customer make a different choice, such as presenting a confirmation screen at the end of the purchase process with the cold, hard numbers and letting them opt out of the transaction altogether.

“A nudge is directing people in a certain direction,” Dhar says. “But for an ethical vendor, the only right direction to nudge is the right direction as judged by the customers themselves.” D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Sam Yen is Chief Design Officer and Managing Director at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.

Comments

Let’s block ads! (Why?)

Digitalist Magazine

Top Ten Digitalist Magazine Posts Of The Week [July 10, 2017]

When it comes to buying things—even big-ticket items—the way we make decisions makes no sense. One person makes an impulsive offer on a house because of the way the light comes in through the kitchen windows. Another gleefully drives a high-end sports car off the lot even though it will probably never approach the limits it was designed to push.

We can (and usually do) rationalize these decisions after the fact by talking about needing more closet space or wanting to out-accelerate an 18-wheeler as we merge onto the highway, but years of research point to a clear conclusion:

When it comes to the customer experience, human beings are fundamentally irrational.

In the brick-and-mortar past, companies could leverage that irrationality in time-tested ways. They relied heavily on physical context, such as an inviting retail space, to make products and services as psychologically appealing as possible. They used well-trained salespeople and employees to maximize positive interactions and rescue negative ones. They carefully sequenced customer experiences, such as having a captain’s dinner on the final night of a cruise, to play on our hard-wired craving to end experiences on a high note.

Today, though, customer interactions are increasingly moving online. Fortune reports that on 2016’s Black Friday, the day after Thanksgiving that is so crucial to holiday retail results, 108.5 million Americans shopped online, while only 99.1 million visited brick-and-mortar stores. That 9.4 million-shopper gap was a dramatic change from just one year prior, when on- and offline Black Friday shopping were more or less equal.

When people browse in a store for a few minutes, an astute salesperson can read the telltale signs that they’re losing interest and heading for the exit. The salesperson can then intervene, answering questions and closing the sale.

Replicating that in a digital environment isn’t as easy, however. Despite all the investments companies have made to counteract e-shopping cart abandonment, they lack the data that would let them anticipate when a shopper is on the verge of opting out of a transaction, and the actions they take to lure someone back afterward can easily come across as more intrusive than helpful.

In a digital environment, companies need to figure out how to use Big Data analysis and digital design to compensate for the absence of persuasive human communication and physical sights, sounds, and sensations. What’s more, a 2014 Gartner survey found that 89% of marketers expected customer experience to be their primary differentiator by 2016, and we’re already well into 2017.

As transactions continue to shift toward the digital and omnichannel, companies need to figure out new ways to gently push customers along the customer journey—and to do so without frustrating, offending, or otherwise alienating them.

The quest to understand online customers better in order to influence them more effectively is built on a decades-old foundation: behavioral psychology, the study of the connections between what people believe and what they actually do. All of marketing and advertising is based on changing people’s thoughts in order to influence their actions. However, it wasn’t until 2001 that a now-famous article in the Harvard Business Review formally introduced the idea of applying behavioral psychology to customer service in particular.

The article’s authors, Richard B. Chase and Sriram Dasu, respectively a professor and assistant professor at the University of Southern California’s Marshall School of Business, describe how companies could apply fundamental tenets of behavioral psychology research to “optimize those extraordinarily important moments when the company touches its customers—for better and for worse.” Their five main points were simple but have proven effective across multiple industries:

  1. Finish strong. People evaluate experiences after the fact based on their high points and their endings, so the way a transaction ends is more important than how it begins.
  2. Front-load the negatives. To ensure a strong positive finish, get bad experiences out of the way early.
  3. Spread out the positives. Break up the pleasurable experiences into segments so they seem to last longer.
  4. Provide choices. People don’t like to be shoved toward an outcome; they prefer to feel in control. Giving them options within the boundaries of your ability to deliver builds their commitment.
  5. Be consistent. People like routine and predictability.

For example, McKinsey cites a major health insurance company that experimented with this framework in 2009 as part of its health management program. A test group of patients received regular coaching phone calls from nurses to help them meet health goals.

The front-loaded negative was inherent: the patients knew they had health problems that needed ongoing intervention, such as weight control or consistent use of medication. Nurses called each patient on a frequent, regular schedule to check their progress (consistency and spread-out positives), suggested next steps to keep them on track (choices), and cheered on their improvements (a strong finish).

McKinsey reports the patients in the test group were more satisfied with the health management program by seven percentage points, more satisfied with the insurance company by eight percentage points, and more likely to say the program motivated them to change their behavior by five percentage points.

The nurses who worked with the test group also reported increased job satisfaction. And these improvements all appeared within the first two weeks of the pilot program, without significantly affecting the company’s costs or key operational metrics, such as the number and length of the calls.
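
To make the sequencing idea concrete, here is a toy sketch in Python. The journey segments and their “pleasantness” scores are invented for illustration; the point is only the ordering rule, not the scores themselves.

# Toy sketch of Chase and Dasu's sequencing advice: front-load the
# negatives and order the positives so the experience ends on its peak.
# Segment names and scores below are invented for illustration.

def sequence_journey(segments):
    """segments: list of (name, pleasantness) pairs; negative = unpleasant."""
    negatives = sorted((s for s in segments if s[1] < 0), key=lambda s: s[1])
    positives = sorted((s for s in segments if s[1] >= 0), key=lambda s: s[1])
    return negatives + positives  # worst moments first, best moment last

journey = [("captain's dinner", 5), ("safety briefing", -1),
           ("embarkation paperwork", -3), ("welcome reception", 2)]
print(sequence_journey(journey))
# [('embarkation paperwork', -3), ('safety briefing', -1),
#  ('welcome reception', 2), ("captain's dinner", 5)]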

Indeed, an ongoing body of research shows that positive reinforcements and indirect suggestions influence our decisions better and more subtly than blatant demands. This concept hit popular culture in 2008 with the bestselling book Nudge.

Written by University of Chicago economics professor Richard H. Thaler and Harvard Law School professor Cass R. Sunstein, Nudge first explains this principle and then explores it as a way to help people make decisions in their own best interests. Examples include encouraging healthier eating by displaying fruits and vegetables at eye level, and combatting credit card debt by placing a prominent notice on every credit card statement telling cardholders how much more they’ll spend over a year if they make only the minimum payment.

Whether they’re altruistic or commercial, nudges work because our decision-making is irrational in a predictable way. The question is how to apply that awareness to the digital economy.

In its early days, digital marketing assumed that online shopping would be purely rational, a tool that customers would use to help them zero in on the best product at the best price. The assumption was logical, but customer behavior remained irrational.

Our society is overloaded with information and short on time, says Brad Berens, Senior Fellow at the Center for the Digital Future at the University of Southern California, Annenberg, so it’s no surprise that the speed of the digital economy exacerbates our desire to make a fast decision rather than a perfect one and increases our tendency to make choices based on impulse rather than logic.

Buyers want what they want, but they don’t necessarily understand or care why they want it. They just want to get it and move on, with minimal friction, to the next thing. “Most of our decisions aren’t very important, and we only have so much time to interrogate and analyze them,” Berens points out.

But limited time and mental capacity for decision-making is only half the issue. The other half is that while our brains are both logical and emotional, the emotional side—also known as the limbic system or, more casually, the primitive lizard brain—is far older and more developed. It’s strong enough to override logic and drive our decisions, leaving rational thought to, well, rationalize our choices after the fact.

This is as true in the B2B realm as it is for consumers. The business purchasing process, governed as it is by requests for proposals, structured procurement processes, and permission gating, is designed to ensure that the people with spending authority make the most sensible deals possible. However, research shows that even in this supposedly rational process, the relationship with the seller is still more influential than product quality in driving customer commitment and loyalty.

Baba Shiv, a professor of marketing at Stanford University’s Graduate School of Business, studies how the emotional brain shapes decisions and experiences. In a popular TED Talk, he says that people in the process of making decisions fall into one of two mindsets: Type 1, which is stressed and wants to feel comforted and safe, and Type 2, which is bored or eager and wants to explore and take action.

People can move between these two mindsets, he says, but in both cases, the emotional brain is in control. Influencing it means first delivering a message that soothes or motivates, depending on the mindset the person happens to be in at the moment, and only then presenting the logical argument that helps rationalize the action.

In the digital economy, working with those tendencies means designing digital experiences with the full awareness that people will not evaluate them objectively, says Ravi Dhar, director of the Center for Customer Insights at the Yale School of Management. Since any experience’s greatest subjective impact in retrospect depends on what happens at the beginning, the end, and the peaks in between, companies need to design digital experiences to optimize those moments—to rationally design experiences for limited rationality.

This often involves making multiple small changes in the way options are presented well before the final nudge into making a purchase. A paper that Dhar co-authored for McKinsey offers the example of a media company that puts most of its content behind a paywall but offers free access to a limited number of articles a month as an incentive to drive subscriptions.

Many nonsubscribers reached their limit of free articles in the morning, but they were least likely to respond to a subscription offer generated by the paywall at that hour, because they were reading just before rushing out the door for the day. When the company delayed offers until later in the day, when readers were less distracted, successful subscription conversions increased.
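
A rough sketch of that timing logic in Python, assuming an invented 6-to-9 a.m. rush window and a five-article free limit (the actual thresholds in the example aren’t specified):

from datetime import datetime

# Hypothetical gate for the subscription offer: readers who hit the free
# limit see it only outside the invented morning-rush window.
def should_show_offer(now, articles_read_this_month, free_limit=5):
    if articles_read_this_month < free_limit:
        return False   # paywall not reached yet
    if 6 <= now.hour < 9:
        return False   # morning rush: defer the offer to a calmer moment
    return True

print(should_show_offer(datetime(2017, 7, 10, 7, 30), 6))   # False: deferred
print(should_show_offer(datetime(2017, 7, 10, 20, 15), 6))  # True: offer shown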

Pre-selecting default options for necessary choices is another way companies can design digital experiences to follow customers’ preference for the path of least resistance. “We know from a decade of research that…defaults are a de facto nudge,” Dhar says.

For example, many online retailers set a default shipping option because customers have to choose a way to receive their packages and are more likely to passively allow the default option than actively choose another one. Similarly, he says, customers are more likely to enroll in a program when the default choice is set to accept it rather than to opt out.
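
A minimal sketch of a default acting as a nudge, with invented shipping options and prices: doing nothing completes checkout with the default, while deviating requires an explicit choice.

# Invented shipping menu; "standard" is the pre-selected default.
SHIPPING_OPTIONS = {"standard": 0.00, "two_day": 5.99, "overnight": 19.99}
DEFAULT_SHIPPING = "standard"

def checkout_total(cart_total, shipping=None):
    choice = shipping or DEFAULT_SHIPPING  # passivity accepts the default
    return round(cart_total + SHIPPING_OPTIONS[choice], 2)

print(checkout_total(42.00))             # 42.0: default silently applied
print(checkout_total(42.00, "two_day"))  # 47.99: active choice to deviate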

Another intriguing possibility lies in the way customers react differently to on-screen information based on how that information is presented. Even minor tweaks can have a disproportionate impact on the choices people make, as explained in depth by University of California, Los Angeles, behavioral economist Shlomo Benartzi in his 2015 book, The Smarter Screen.

A few of the conclusions Benartzi reached: items at the center of a laptop screen draw more attention than those at the edges. Those on the upper left of a screen split into quadrants attract more attention than those on the lower left. And intriguingly, demographics are important variables.

Benartzi cites research showing that people over 40 prefer more visually complicated, text-heavy screens than younger people, who are drawn to saturated colors and large images. Women like screens that use a lot of different colors, including pastels, while men prefer primary colors on a grey or white background. People in Malaysia like lots of color; people in Germany don’t.

This suggests companies need to design their online experiences very differently for middle-aged women than they do for teenage boys. And, as Benartzi writes, “it’s easy to imagine a future in which each Internet user has his or her own ‘aesthetic algorithm,’ customizing the appearance of every site they see.”
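
Benartzi’s “aesthetic algorithm” remains hypothetical, but a first approximation could be a simple rule table keyed to the findings above. Every theme value here is invented, and later rules deliberately override earlier ones; a real system would test these choices rather than hard-code them.

# Speculative sketch of an "aesthetic algorithm" mapping coarse audience
# traits to a screen theme, loosely following the research Benartzi cites.
def pick_theme(age, gender, country):
    theme = {"layout": "image-led", "palette": "saturated"}  # default skews young
    if age > 40:
        theme.update(layout="text-heavy", palette="muted")
    if gender == "female":
        theme["palette"] = "varied-with-pastels"
    elif gender == "male":
        theme["palette"] = "primary-on-neutral"
    if country == "MY":    # Malaysia: reportedly prefers lots of color
        theme["palette"] = "vivid"
    elif country == "DE":  # Germany: reportedly prefers less
        theme["palette"] = "restrained"
    return theme

print(pick_theme(age=52, gender="female", country="DE"))
# {'layout': 'text-heavy', 'palette': 'restrained'}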

Applying behavioral psychology to the digital experience in more sophisticated ways will require additional formal research into recommendation algorithms, predictions, and other applications of customer data science, says Jim Guszcza, PhD, chief U.S. data scientist for Deloitte Consulting.

In fact, given customers’ tendency to make the fastest decisions, Guszcza believes that in some cases, companies may want to consider making choice environments more difficult to navigate—a process he calls “disfluencing”—in high-stakes situations, like making an important medical decision or an irreversible big-ticket purchase. Choosing a harder-to-read font and a layout that requires more time to navigate forces customers to work harder to process the information, sending a subtle signal that it deserves their close attention.
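
As a sketch of what disfluencing could mean in an interface, with invented style values and an arbitrary US$10,000 threshold standing in for “high stakes”:

# Hypothetical: deliberately reduce fluency for consequential choices so
# customers slow down and read. All values below are invented.
FLUENT = {"font": "clean-sans", "size_px": 16, "confirm_steps": 1}
DISFLUENT = {"font": "ornate-serif", "size_px": 13, "confirm_steps": 3}

def presentation_for(decision):
    high_stakes = decision.get("irreversible", False) or decision.get("amount", 0) >= 10_000
    return DISFLUENT if high_stakes else FLUENT

print(presentation_for({"amount": 250}))                           # fluent: let them glide
print(presentation_for({"amount": 38_000, "irreversible": True}))  # disfluent: slow them down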

That said, a company can’t apply behavioral psychology to deliver a digital experience if customers don’t engage with its site or mobile app in the first place. Addressing this often means making the process as convenient as possible, itself a behavioral nudge.

A digital solution that’s easy to use and search, offers a variety of choices pre-screened for relevance, and provides a friction-free transaction process is the equivalent of putting a product at eye level—and that applies far beyond retail. Consider the Global Entry program, which streamlines border crossings into the U.S. for pre-approved international travelers. Members can skip long passport control lines in favor of scanning their passports and answering a few questions at a touchscreen kiosk. To date, 1.8 million people have decided this convenience far outweighs the slow pace of approvals.

The basics of influencing irrational customers are essentially the same whether they’re taking place in a store or on a screen. A business still needs to know who its customers are, understand their needs and motivations, and give them a reason to buy.

And despite the accelerating shift to digital commerce, we still live in a physical world. “There’s no divide between old-style analog retail and new-style digital retail,” Berens says. “Increasingly, the two are overlapping. One of the things we’ve seen for years is that people go into a store with their phones, shop for a better price, and buy online. Or vice versa: they shop online and then go to a store to negotiate for a better deal.”

Still, digital increases the number of touchpoints from which the business can gather, cluster, and filter more types of data to make great suggestions that delight and surprise customers. That’s why the hottest word in marketing today is omnichannel. Bringing behavioral psychology to bear on the right person in the right place in the right way at the right time requires companies to design customer experiences that bridge multiple channels, on- and offline.

Amazon, for example, is known for its friction-free online purchasing. The company’s pilot store in Seattle has no lines or checkout counters, extending the brand experience into the physical world in a way that aligns with what customers already expect of it, Dhar says.

Omnichannel helps counter some people’s tendency to believe their purchasing decision isn’t truly well informed unless they can see, touch, hear, and in some cases taste and smell a product. Until we have ubiquitous access to virtual reality systems with full haptic feedback, the best way to address these concerns is by providing personalized, timely, relevant information and feedback in the moment through whatever channel is appropriate. That could be an automated call center that answers frequently asked questions, a video that shows a product from every angle, or a demonstration wizard built into the product. Any of these channels could also suggest the customer visit the nearest store to receive help from a human.

The omnichannel approach gives businesses plenty of opportunities to apply subtle nudges across physical and digital channels. For example, a supermarket chain could use store-club card data to push personalized offers to customers’ smartphones while they shop. “If the data tells them that your goal is to feed a family while balancing nutrition and cost, they could send you an e-coupon offering a discount on a brand of breakfast cereal that tastes like what you usually buy but contains half the sugar,” Guszcza says.

Similarly, a car insurance company could provide periodic feedback to policyholders through an app or even the digital screens in their cars, he suggests. “Getting a warning that you’re more aggressive than 90% of comparable drivers and three tips to avoid risk and lower your rates would not only incentivize the driver to be more careful for financial reasons but reduce claims and make the road safer for everyone.”
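
The “90% of comparable drivers” warning is, underneath, a percentile rank over peer telematics scores. A sketch with invented numbers, where higher means more aggressive:

# Percentile-rank a driver's aggressiveness score against comparable drivers.
# The peer sample and scores are invented for illustration.
def aggressiveness_percentile(driver_score, peer_scores):
    below = sum(1 for s in peer_scores if s < driver_score)
    return 100.0 * below / len(peer_scores)

peers = [12, 18, 22, 25, 31, 35, 41, 47, 52, 60]
pct = aggressiveness_percentile(55, peers)
print(f"More aggressive than {pct:.0f}% of comparable drivers")  # 90%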

Digital channels can also show shoppers what similar people or organizations are buying, let them solicit feedback from colleagues or friends, and surface reviews from other people who have made the same purchases. This leverages one of the most familiar forms of behavioral psychology—reinforcement from peers—and reassures buyers with Shiv’s Type 1 mindset that they’re making a choice that meets their needs, or encourages those with the Type 2 mindset to move forward with the purchase. The rational mind only has to ask at the end of the process, “Am I getting the best deal?” And as Guszcza points out, “If you can create solutions that use behavioral design and digital technology to turn my personal data into insight to reach my goals, you’ve increased the value of your engagement with me so much that I might even be willing to pay you more.”

Many transactions take place through corporate procurement systems that allow a company to leverage not just its own purchasing patterns but all the data in a marketplace specifically designed to facilitate enterprise purchasing. Machine learning can mine this vast database to provide the necessary nudges: what to buy, when to buy it, how best to negotiate, and more. To some extent, this is an attempt to eliminate psychology and make choices more rational.
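
A real procurement model would weigh many signals, but even a one-rule sketch shows the shape of such a nudge: flag a purchase when the current marketplace price falls into the bottom quartile of recent history. The price series here is invented.

import statistics

# Flag a "buy" when today's price is in the bottom quartile of recent prices.
def buy_signal(price_history, current_price):
    q1 = statistics.quantiles(price_history, n=4)[0]  # 25th percentile
    return current_price <= q1

history = [102, 98, 105, 110, 97, 101, 99, 108, 96, 104]
print(buy_signal(history, 95.50))   # True: unusually low, nudge to buy now
print(buy_signal(history, 103.00))  # False: wait or negotiate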

B2B spending is tied into financial systems and processes, logistics systems, transportation systems, and other operational requirements in a way no consumer spending can be. A B2B decision is less about making a purchase that satisfies a desire than it is about making a purchase that keeps the company functioning.

That said, the decision still isn’t entirely rational, Berens says. When organizations have to choose among vendors offering relatively similar products and services, they generally opt for the vendor whose salespeople they like the best.

This means B2B companies have to make sure they meet or exceed parity with competitors on product quality, pricing, and time to delivery to satisfy all the rational requirements of the decision process. Only then can they bring behavioral psychology to bear by delivering consistently superior customer service, starting as soon as the customer hits their app or website and spreading out positive interactions all the way through post-purchase support. Finishing strong with a satisfied customer reinforces the relationship with a business customer just as much as it does with a consumer.

The best nudges make the customer relationship easy and enjoyable by providing experiences that are effortless and fun to choose, on- or offline, Dhar says. What sets the digital nudge apart in accommodating irrational customers is its ability to turn data about them and their journey into more effective, personalized persuasion even in the absence of the human touch.

Yet the subtle art of influencing customers isn’t just about making a sale, and it certainly shouldn’t be about persuading people to act against their own best interests, as Nudge co-author Thaler reminds audiences by exhorting them to “nudge for good.”

Guszcza, who talks about influencing people to make the choices they would make if only they had unlimited rationality, says companies that leverage behavioral psychology in their digital experiences should do so with an eye to creating positive impact for the customer, the company, and, where appropriate, society.

In keeping with that ethos, any customer experience designed along behavioral lines has to include the option of letting the customer make a different choice, such as presenting a confirmation screen at the end of the purchase process with the cold, hard numbers and letting them opt out of the transaction altogether.

“A nudge is directing people in a certain direction,” Dhar says. “But for an ethical vendor, the only right direction to nudge is the right direction as judged by the customers themselves.” D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Sam Yen is Chief Design Officer and Managing Director at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.

Digitalist Magazine