Tag Archives: Theory

like conservative humor, is there conservative theory other than the usual fascism

January 7, 2021   Humor


There’s an article that tries to identify contemporary conservative thought that isn’t the usual amalgam of artistic or literary formalism, Whig history, neoclassical economics, scientism, or logical positivism. It doesn’t succeed, but it remains an interesting project to find something that could rationalize Trumpism. You may find it interesting if only to have something to talk about when trying to convince Trumpists with college degrees that they shouldn’t support Josh Hawley.

It may be that there is no such thing as conservative humor other than the kind that is sexist, misogynist, homophobic, transphobic, lookist, ableist, and racist. That would leave us with observational, self-deprecating, ironic humor of an infantile sort. Think Dennis Miller without the intellectual references, ceteris paribus. Or maybe Bill Cosby without the crimes.

That’s clearly not what one would define as humor of a liberal sort, but it is similar to the relatively circular and often disinformational logic of what might be conservative theory. Such a theory would rest on a strange idealist libertarianism built on racial privilege, one that scapegoats classical liberalism with what seems a self-centered, reactionary mode of thought and behavior in a world imagined without imperialism and colonialism, where capitalism is ubiquitous and canonical. Think William F. Buckley’s National Review without the closeted gayness.

This article by Geoff Shullenberger tries to deconstruct whatever conservative theory might be shared by Trumpists who graduated from college and have some position in its political structure. YMMV. It even soft-pedals Heidegger’s Nazism, because math. This is like Trump’s speech to his special, “very fine people” yesterday, except with smaller words.

The career of our next subject points to a similar conclusion. Just prior to becoming a Trump speechwriter and policy adviser, Darren J. Beattie completed a Ph.D. in Political Theory at Duke, with a dissertation on “Martin Heidegger’s Mathematical Dialectic.” Beattie’s dissertation, the abstract tells us, “attempts to elucidate Martin Heidegger’s diagnosis of modernity, and, by extension, his thought as a whole, from the neglected standpoint of his understanding of mathematics, which he explicitly identifies as the essence of modernity.” Heidegger was not a postmodernist or a critical theorist, but he was a key influence on French post-structuralists like Foucault and Derrida, and on the Frankfurt School, especially Marcuse.

Beattie’s dissertation overlaps thematically with the bodies of theory brought up in relation to Breitbart and Hahn. The theorists discussed by both of them, as we have seen, shared a preoccupation with the way that power is exercised in the modern era by means of ostensibly neutral institutions and ostensibly objective scientific techniques. Heidegger’s famous account of technology as an intrinsically violent “enframing” of nature influenced the way thinkers like Marcuse and Foucault challenged the nominal objectivity and neutrality of scientific expertise and technological control. Beattie’s investigation of Heidegger’s account of mathematics – broadly seen as the epistemological basis of this objectivity and neutrality – as the “essence of modernity” fits in well with this set of concerns.

Beattie’s trajectory has certain parallels with Hahn’s. After a successful career as a student of theory at an elite institution, he went on to become part of the Trump White House communications team. Whereas Hahn preceded this with her stint at Breitbart, Beattie is now involved in revolver.news, a pro-Trump news aggregation site. He is also currently writing a book “in defense of Trumpist nationalism.”

Along with Hahn, Beattie has been heavily criticized by organizations like the SPLC for his associations with extreme anti-immigration groups. An exposé of his participation in a 2016 gathering that also included white nationalists led to his firing from the White House. Recently, he attracted additional criticism when Trump appointed him to a commission that “helps preserve sites related to the Holocaust.” Beattie, like Hahn and Breitbart, is Jewish, but all have been faulted for excessive proximity to white nationalism.

In Beattie’s case, these controversies are something of an echo of the one that once surrounded his dissertation subject, Heidegger, who notoriously became a member of the Nazi Party in the early 1930s. Beattie addresses this fact in a prefatory remark to his dissertation, in which he states that there are “important cautionary lessons to be learned from a careful study of Heidegger’s monumental political blunder,” but also explains that he will not be addressing these lessons at length in his own work. He writes that “out of respect for the magnitude of scholarly literature devoted to this question . . . one must conclude that sheer limitations of space and scope simply prohibit an adequate treatment of Heidegger’s political involvement with National Socialism within the context of a study that intends to explore seriously and comprehensively any separate feature of Heidegger’s thought.” However, he also states the following:

“[T]o study Heidegger’s disastrous political involvement to the exclusion of other aspects of his thought would not only be damaging philosophically, such a stance would also run the risk of unwittingly inviting the repetition of new political misjudgments in the future. To put the matter in more general yet more concrete terms, just as it is important to understand the extent and nature of the philosophical errors behind the 20th century’s most brutal and illiberal totalitarian regimes, it is equally important—indeed, perhaps more so—not to allow an exclusive or inordinate attention to such blunders detract from a deep and critical attention to the dangers that might be lurking within the seemingly more benign political expressions of modernity that have survived the downfall of fascism and communism.”

In other words, Beattie seems to say, an emphasis on Heidegger’s complicity with Nazism risks sidelining the insights his work offers into the “seemingly more benign political expressions of modernity” that shape our reality. It seems safe to infer that Beattie is referring here to the technocratic liberal consensus ascendant today.

In a more overtly polemical piece of writing, in which he attacks neoconservative interventionism and defends an “America First” foreign policy, Beattie states that “the chief threat to America, and indeed the West, is not an overseas regime like the Soviet Union or a foreign-born movement like radical Islam. To the contrary, it is a home-grown threat: the corruption and de-legitimization of our domestic institutions and the elite entrusted with the custody of the American way of life.” Again, these institutions, as has been evident during the pandemic, rest their assertions of authority on claims of their own expertise, objectivity, and neutrality – claims that are increasingly in disrepute. Heidegger, as well as a number of the later theorists he influenced, has provided followers with the means to critique this form of power. It should be no surprise that such an approach is of interest to some adherents of a political movement that aims to exploit and accelerate the crisis of these institutions.

outsidertheory.com/…


Almost exactly four years ago, 217 people were arrested and prosecuted for protesting when Trump was inaugurated. They faced 60-year prison terms.

Today, after a mob seized the Capitol building for hours in an insurrection that killed someone, the DC Police have made 13 arrests. https://t.co/7CMpIK1hzH

— Andrew Crespo (@AndrewMCrespo) January 7, 2021


moranbetterDemocrats


Want help with proving a calculus theory

September 6, 2020   BI News and Info



Recent Questions – Mathematica Stack Exchange


Chaos Theory And Cost Breakdown Analysis In Sourcing

December 12, 2019   BI News and Info

Can a butterfly flapping its wings in Timbuktu disrupt your supply chain in the Mall of America, Minnesota? Chaos theory aficionados will answer with a resounding yes: the butterfly effect, applied here, would dictate that the tiny gust from the butterfly’s wings can, through magnification over time, yield a snowstorm.
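
As a quick numerical aside (a minimal Python sketch, not part of the original article), the logistic map below illustrates exactly that sensitivity to initial conditions: two trajectories that start one part in a billion apart become completely different within a few dozen iterations.

```python
# Minimal sketch: sensitive dependence on initial conditions in the logistic map.
# The two starting points differ by one part in a billion, yet the trajectories
# diverge completely after a few dozen iterations -- the "butterfly effect".

def logistic_map(x0, r=3.9, steps=50):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.400000000)
b = logistic_map(0.400000001)  # a one-part-in-a-billion "flap of the wings"

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.9f}")
```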

Strategic sourcing professionals have numerous capabilities to leverage the butterfly effect: develop small changes that result in a dramatic positive impact on the business.

Let’s study one: cost breakdown analysis.

Cost breakdown analysis

A cost breakdown analysis (CBA) can help you understand the different components that make up the cost of a product. Its usual constituents are factors such as material, labor, overhead, etc.

You can get this information from multiple sources such as professional service providers, expert consultation, internal benchmarks, and so on. A key source can be your own suppliers. An in-depth understanding of which cost drivers influence your product thus helps establish the cause-and-effect flow of various events across the product’s supply chain.
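
To make that concrete, here is a minimal sketch of what a cost breakdown looks like as data; the component names and figures are hypothetical, not taken from this article. Each cost driver is expressed as a share of the quoted unit price, which is what makes quotes comparable across suppliers.

```python
# Minimal sketch of a cost breakdown analysis (CBA): express each cost driver
# as a share of the quoted unit price. All figures below are hypothetical.

quote = {              # per-unit cost drivers reported with an RFQ response
    "material": 4.20,
    "labor": 1.10,
    "overhead": 0.80,
    "logistics": 0.45,
    "margin": 0.65,
}

total = sum(quote.values())
print(f"Quoted unit price: {total:.2f}")
for driver, cost in sorted(quote.items(), key=lambda kv: kv[1], reverse=True):
    print(f"  {driver:<10} {cost:5.2f}  ({cost / total:5.1%})")
```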

Getting cost data from suppliers

The days when suppliers were reluctant to share their cost information with the buying customer are over. The concern that the buyer would use strong-arm tactics for pricing negotiations is very much going the way of the dinosaurs.

Progressive, forward-looking enterprises are moving towards effective supplier collaboration in a multi-enterprise network environment. Your suppliers can be the best source of information to provide strategic insights for not just new product launches, but for high-volume, mature products as well.

Cost breakdown insights as part of sourcing

At the time you raise an RFQ (request for quotation), getting cost information pertaining to various cost drivers in different cost groups from suppliers can provide valuable insights. This can definitely alert you to take proactive steps in commodities markets to protect yourself from future disruptions.

More important, it can highlight differences in cost-driver patterns from the average norm across suppliers and thereby reveal untapped opportunities for cost savings and other KPI (key performance indicator) improvements. The ability to cross-pollinate best practices and ideas across the supplier ecosystem results in a win-win situation for all stakeholders. Over the long term, the relationships with stakeholders can evolve into different business models such as buy-sell, cost-plus, and other multi-tier cost/risk management strategies.

Cost traceability across the product lifecycle

As the product progresses throughout its lifecycle (in some cases, a multi-year lifecycle, or in the case of an Apple iPhone, as short as one year), detailed cost-breakdown analysis for all components is necessary to facilitate better margin understanding. Many organizations experience a “cost-gap” – missing information about product cost.

Developing a cost framework across this product lifecycle (from prototype to launch to high volume to retirement or next-generation) will eliminate cost gaps. This approach enables professionals to make the best decisions as the product traverses through various product milestones and decision gates while translating into improved product and supply chain sustainability.

In summary, cost breakdown analysis leads to a stronger, nimbler organization better equipped to handle the challenges brought by the flapping of the wings of a butterfly in Timbuktu!

For more insight, download the IDC whitepaper Achieving Competitive Advantage with Trading Partner Collaboration in a Global, Dynamic Supply Chain.

Follow SAP Finance online: @SAPFinance (Twitter) | LinkedIn | Facebook | YouTube


Digitalist Magazine


AI Weekly: $102 million and machine learning’s theory of general relativity

June 23, 2017   Big Data

What’s $102 million among friends? That was the question last week as Element.ai raised this hefty sum in a Series A funding round. Investors included Microsoft, Nvidia, and Intel Capital, all of which have their own AI ambitions.

Element aims to make AI easy for businesses to use by connecting them with machine learning experts. And while Element may have brought together competitors Microsoft and Intel, the rivals have been fiercely staking their own claims with a flurry of investments and acquisitions.

While these moves point to AI one day becoming ubiquitous for business tasks, Google’s release of an academic paper called “One Model to Learn Them All” shows another route for machine learning to become commonplace. The paper describes a single machine learning template — dubbed the MultiModel — that can perform multiple tasks exceptionally well. As Blair Hanley Frank writes, “Google doesn’t claim to have a master algorithm that can learn everything at once,” but implicit in this statement is the ambition of eventually discovering one or more such algorithms, the AI equivalent of a theory of general relativity.

For AI coverage, send news tips to Blair Hanley Frank and Khari Johnson, and guest post submissions to John Brandon — and be sure to bookmark our AI Channel.

Thanks for reading,
Blaise Zerega
Editor in Chief

P.S. Please enjoy this video of Recode’s Walt Mossberg interviewing Amazon CEO Jeff Bezos about the “gigantic” potential of AI.


Textio gets $20 million to expand its AI-powered writing platform

Textio announced today that it has raised a $20 million Series B round of funding to expand its AI-powered augmented writing platform into fields beyond improving job descriptions. The company’s product offers automated guidance so that people writing job descriptions are able to maximize their response from interested and qualified applicants. As people write in […]

Read the full story


Vinod Khosla predicts AI will replace human oncologists

While much of the conversation around AI and jobs is focused on widespread job losses in sectors like trucking, venture capitalist and Sun Microsystems cofounder Vinod Khosla thinks that there’s a high-paying job on the chopping block: oncology. “I can’t imagine why a human oncologist would add value, given the amount of data in oncology,” […]

Read the full story


Why voicebots aren’t ready for normal people

The tech elite love voicebots. Google Home is a game-changer, a way to search by voice for recipes when you’re in the kitchen. Amazon Echo and the Alexa voicebot help you order products and check the weather. I’m eager to see how Apple will make the HomePod a must-own product. And yet there is a […]

Read the full story


Google advances AI with ‘one model to learn them all’

Google quietly released an academic paper that could provide a blueprint for the future of machine learning. Called “One Model to Learn Them All,” it lays out a template for how to create a single machine learning model that can address multiple tasks well. The MultiModel, as the Google researchers call it, was trained on […]

Read the full story


Intel and Microsoft’s latest investment binge shows AI land grab is intensifying

ANALYSIS: Intel and Microsoft have been on something of an artificial intelligence (AI) investment binge of late, with the chip and software giants announcing a slew of deals this week via their respective VC arms — Intel Capital and Microsoft Ventures. Perhaps the most notable of these was Element AI, which raised a gargantuan $102 million […]

Read the full story


Facebook’s AI for targeting terrorists will go beyond Muslim extremists

Following sharp criticism from the leaders of European nations, as well as concerns from its own community, Facebook is training artificial intelligence to target terrorist messaging and propaganda on its platform. This AI will target Muslim extremists like ISIS but will also be aimed at any group with a violent mission or that has engaged […]

Read the full story

Beyond VB

An Artificial Intelligence Developed Its Own Non-Human Language

A buried line in a new Facebook report about chatbots’ conversations with one another offers a remarkable glimpse at the future of language. In the report, researchers at the Facebook Artificial Intelligence Research lab describe using machine learning to train their “dialog agents” to negotiate. (And it turns out bots are actually quite good at dealmaking.) At one point, the researchers write, they had to tweak one of their models because otherwise the bot-to-bot conversation “led to divergence from human language as the agents developed their own language for negotiating.” They had to use what’s called a fixed supervised model instead. (via The Atlantic)

Read the full story

In the AI Age, “Being Smart” Will Mean Something Completely Different

Andrew Ng has likened artificial intelligence (AI) to electricity in that it will be as transformative for us as electricity was for our ancestors. I can only guess that electricity was mystifying, scary, and even shocking to them — just as AI will be to many of us. Credible scientists and research firms have predicted that the likely automation of service sectors and professional jobs in the United States will be more than 10 times as large as the number of manufacturing jobs automated to date. That possibility is mind-boggling. (via Harvard Business Review)

Read the full story

How artificial intelligence can deliver real value to companies

After decades of extravagant promises and frustrating disappointments, artificial intelligence (AI) is finally starting to deliver real-life benefits to early-adopting companies. Retailers on the digital frontier rely on AI-powered robots to run their warehouses—and even to automatically order stock when inventory runs low. Utilities use AI to forecast electricity demand. Automakers harness the technology in self-driving cars. (via McKinsey & Company)

Read the full story

AI will create 800,000 jobs and $1.1 trillion revenue by 2021: Salesforce

Contrary to the bleak picture painted by critics, a new IDC study of more than 1,000 organisations worldwide shows that AI will be in the workplace “sooner than we think”, and will have a positive impact on productivity, revenues, and job creation. From 2017 to 2021, the Salesforce-sponsored study predicts that AI-powered CRM activities will boost business revenue by $1.1 trillion, and create more than 800,000 direct jobs and 2 million indirect jobs globally, surpassing those lost to AI-driven automation. (via ZDNet)

Read the full story



Big Data – VentureBeat


How Game Theory Is Taking Marketing To The Next Level

May 22, 2017   SAP

When it comes to buying things—even big-ticket items—the way we make decisions makes no sense. One person makes an impulsive offer on a house because of the way the light comes in through the kitchen windows. Another gleefully drives a high-end sports car off the lot even though it will probably never approach the limits it was designed to push.

We can (and usually do) rationalize these decisions after the fact by talking about needing more closet space or wanting to out-accelerate an 18-wheeler as we merge onto the highway, but years of study have arrived at a clear conclusion:

When it comes to the customer experience, human beings are fundamentally irrational.

In the brick-and-mortar past, companies could leverage that irrationality in time-tested ways. They relied heavily on physical context, such as an inviting retail space, to make products and services as psychologically appealing as possible. They used well-trained salespeople and employees to maximize positive interactions and rescue negative ones. They carefully sequenced customer experiences, such as having a captain’s dinner on the final night of a cruise, to play on our hard-wired craving to end experiences on a high note.


Today, though, customer interactions are increasingly moving online. Fortune reports that on 2016’s Black Friday, the day after Thanksgiving that is so crucial to holiday retail results, 108.5 million Americans shopped online, while only 99.1 million visited brick-and-mortar stores. The 9.4% gap between the two was a dramatic change from just one year prior, when on- and offline Black Friday shopping were more or less equal.

When people browse in a store for a few minutes, an astute salesperson can read the telltale signs that they’re losing interest and heading for the exit. The salesperson can then intervene, answering questions and closing the sale.

Replicating that in a digital environment isn’t as easy, however. Despite all the investments companies have made to counteract e-shopping cart abandonment, they lack the data that would let them anticipate when a shopper is on the verge of opting out of a transaction, and the actions they take to lure someone back afterwards can easily come across as intrusive rather than helpful.

In a digital environment, companies need to figure out how to use Big Data analysis and digital design to compensate for the absence of persuasive human communication and physical sights, sounds, and sensations. What’s more, a 2014 Gartner survey found that 89% of marketers expected customer experience to be their primary differentiator by 2016, and we’re already well into 2017.

As transactions continue to shift toward the digital and omnichannel, companies need to figure out new ways to gently push customers along the customer journey—and to do so without frustrating, offending, or otherwise alienating them.


The quest to understand online customers better in order to influence them more effectively is built on a decades-old foundation: behavioral psychology, the study of the connections between what people believe and what they actually do. All of marketing and advertising is based on changing people’s thoughts in order to influence their actions. However, it wasn’t until 2001 that a now-famous article in the Harvard Business Review formally introduced the idea of applying behavioral psychology to customer service in particular.

The article’s authors, Richard B. Chase and Sriram Dasu, respectively a professor and assistant professor at the University of Southern California’s Marshall School of Business, describe how companies could apply fundamental tenets of behavioral psychology research to “optimize those extraordinarily important moments when the company touches its customers—for better and for worse.” Their five main points were simple but have proven effective across multiple industries:

  1. Finish strong. People evaluate experiences after the fact based on their high points and their endings, so the way a transaction ends is more important than how it begins.
  2. Front-load the negatives. To ensure a strong positive finish, get bad experiences out of the way early.
  3. Spread out the positives. Break up the pleasurable experiences into segments so they seem to last longer.
  4. Provide choices. People don’t like to be shoved toward an outcome; they prefer to feel in control. Giving them options within the boundaries of your ability to deliver builds their commitment.
  5. Be consistent. People like routine and predictability.

For example, McKinsey cites a major health insurance company that experimented with this framework in 2009 as part of its health management program. A test group of patients received regular coaching phone calls from nurses to help them meet health goals.

The front-loaded negative was inherent: the patients knew they had health problems that needed ongoing intervention, such as weight control or consistent use of medication. Nurses called each patient on a frequent, regular schedule to check their progress (consistency and spread-out positives), suggested next steps to keep them on track (choices), and cheered on their improvements (a strong finish).

McKinsey reports the patients in the test group were more satisfied with the health management program by seven percentage points, more satisfied with the insurance company by eight percentage points, and more likely to say the program motivated them to change their behavior by five percentage points.


The nurses who worked with the test group also reported increased job satisfaction. And these improvements all appeared in the first two weeks of the pilot program, without significantly affecting the company’s costs or tweaking key metrics, like the number and length of the calls.

Indeed, an ongoing body of research shows that positive reinforcements and indirect suggestions influence our decisions better and more subtly than blatant demands. This concept hit popular culture in 2008 with the bestselling book Nudge.

Written by University of Chicago economics professor Richard H. Thaler and Harvard Law School professor Cass R. Sunstein, Nudge first explains this principle, then explores it as a way to help people make decisions in their best interests, such as encouraging people to eat healthier by displaying fruits and vegetables at eye level or combatting credit card debt by placing a prominent notice on every credit card statement informing cardholders how much more they’ll spend over a year if they make only the minimum payment.

Whether they’re altruistic or commercial, nudges work because our decision-making is irrational in a predictable way. The question is how to apply that awareness to the digital economy.


In its early days, digital marketing assumed that online shopping would be purely rational, a tool that customers would use to help them zero in on the best product at the best price. The assumption was logical, but customer behavior remained irrational.

Our society is overloaded with information and short on time, says Brad Berens, Senior Fellow at the Center for the Digital Future at the University of Southern California, Annenberg, so it’s no surprise that the speed of the digital economy exacerbates our desire to make a fast decision rather than a perfect one, as well as increasing our tendency to make choices based on impulse rather than logic.


Buyers want what they want, but they don’t necessarily understand or care why they want it. They just want to get it and move on, with minimal friction, to the next thing. “Most of our decisions aren’t very important, and we only have so much time to interrogate and analyze them,” Berens points out.

But limited time and mental capacity for decision-making is only half the issue. The other half is that while our brains are both logical and emotional, the emotional side—also known as the limbic system or, more casually, the primitive lizard brain—is far older and more developed. It’s strong enough to override logic and drive our decisions, leaving rational thought to, well, rationalize our choices after the fact.

This is as true in the B2B realm as it is for consumers. The business purchasing process, governed as it is by requests for proposals, structured procurement processes, and permission gating, is designed to ensure that the people with spending authority make the most sensible deals possible. However, research shows that even in this supposedly rational process, the relationship with the seller is still more influential than product quality in driving customer commitment and loyalty.


Baba Shiv, a professor of marketing at Stanford University’s Graduate School of Business, studies how the emotional brain shapes decisions and experiences. In a popular TED Talk, he says that people in the process of making decisions fall into one of two mindsets: Type 1, which is stressed and wants to feel comforted and safe, and Type 2, which is bored or eager and wants to explore and take action.

People can move between these two mindsets, he says, but in both cases, the emotional brain is in control. Influencing it means first delivering a message that soothes or motivates, depending on the mindset the person happens to be in at the moment and only then presenting the logical argument to help rationalize the action.

In the digital economy, working with those tendencies means designing digital experiences with the full awareness that people will not evaluate them objectively, says Ravi Dhar, director of the Center for Customer Insights at the Yale School of Management. Since any experience’s greatest subjective impact in retrospect depends on what happens at the beginning, the end, and the peaks in between, companies need to design digital experiences to optimize those moments—to rationally design experiences for limited rationality.
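
One common formalization of that idea is the peak-end heuristic from behavioral economics. The minimal sketch below, with made-up moment-by-moment ratings, shows how two journeys with identical averages can be remembered very differently.

```python
# Minimal sketch of the peak-end heuristic: remembered satisfaction is
# approximated by the peak moment and the final moment, not the overall
# average. The moment ratings below are hypothetical.

def peak_end_score(moment_ratings):
    """Approximate the remembered experience as the mean of peak and end."""
    return (max(moment_ratings) + moment_ratings[-1]) / 2.0

steady = [5, 5, 5, 5, 5]          # flat journey
strong_finish = [3, 4, 5, 5, 8]   # rocky start, memorable ending

print(sum(steady) / len(steady), sum(strong_finish) / len(strong_finish))  # both average 5.0
print(peak_end_score(steady), peak_end_score(strong_finish))               # 5.0 vs. 8.0
```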

This often involves making multiple small changes in the way options are presented well before the final nudge into making a purchase. A paper that Dhar co-authored for McKinsey offers the example of a media company that puts most of its content behind a paywall but offers free access to a limited number of articles a month as an incentive to drive subscriptions.

Many nonsubscribers reached their limit of free articles in the morning, but they were least likely to respond to a subscription offer generated by the paywall at that hour, because they were reading just before rushing out the door for the day. When the company delayed offers until later in the day, when readers were less distracted, successful subscription conversions increased.

Pre-selecting default options for necessary choices is another way companies can design digital experiences to follow customers’ preference for the path of least resistance. “We know from a decade of research that…defaults are a de facto nudge,” Dhar says.

For example, many online retailers set a default shipping option because customers have to choose a way to receive their packages and are more likely to passively allow the default option than actively choose another one. Similarly, he says, customers are more likely to enroll in a program when the default choice is set to accept it rather than to opt out.
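
A minimal sketch of that default-option pattern follows; the shipping options and fallback logic are illustrative assumptions, not any vendor’s actual implementation.

```python
# Minimal sketch of a default-option nudge in a checkout flow.
# The option names and the choice of default are hypothetical.

SHIPPING_OPTIONS = ("standard", "expedited", "overnight")
DEFAULT_OPTION = "standard"   # pre-selected; the customer must act to change it

def resolve_shipping(customer_choice=None):
    """Return the customer's explicit choice, or fall back to the default."""
    if customer_choice in SHIPPING_OPTIONS:
        return customer_choice
    return DEFAULT_OPTION

print(resolve_shipping())             # no action taken -> "standard"
print(resolve_shipping("overnight"))  # explicit deviation from the default
```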

Another intriguing possibility lies in the way customers react differently to on-screen information based on how that information is presented. Even minor tweaks can have a disproportionate impact on the choices people make, as explained in depth by University of California, Los Angeles, behavioral economist Shlomo Benartzi in his 2015 book, The Smarter Screen.

A few of the conclusions Benartzi reached: items at the center of a laptop screen draw more attention than those at the edges. Those on the upper left of a screen split into quadrants attract more attention than those on the lower left. And intriguingly, demographics are important variables.

Benartzi cites research showing that people over 40 prefer more visually complicated, text-heavy screens than younger people, who are drawn to saturated colors and large images. Women like screens that use a lot of different colors, including pastels, while men prefer primary colors on a grey or white background. People in Malaysia like lots of color; people in Germany don’t.

This suggests companies need to design their online experiences very differently for middle-aged women than they do for teenage boys. And, as Benartzi writes, “it’s easy to imagine a future in which each Internet user has his or her own ‘aesthetic algorithm,’ customizing the appearance of every site they see.”

Applying behavioral psychology to the digital experience in more sophisticated ways will require additional formal research into recommendation algorithms, predictions, and other applications of customer data science, says Jim Guszcza, PhD, chief U.S. data scientist for Deloitte Consulting.

In fact, given customers’ tendency to make the fastest decisions, Guszcza believes that in some cases, companies may want to consider making choice environments more difficult to navigate— a process he calls “disfluencing”—in high-stakes situations, like making an important medical decision or an irreversible big-ticket purchase. Choosing a harder-to-read font and a layout that requires more time to navigate forces customers to work harder to process the information, sending a subtle signal that it deserves their close attention.

That said, a company can’t apply behavioral psychology to deliver a digital experience if customers don’t engage with its site or mobile app in the first place. Addressing this often means making the process as convenient as possible, itself a behavioral nudge.

A digital solution that’s easy to use and search, offers a variety of choices pre-screened for relevance, and provides a friction-free transaction process is the equivalent of putting a product at eye level—and that applies far beyond retail. Consider the Global Entry program, which streamlines border crossings into the U.S. for pre-approved international travelers. Members can skip long passport control lines in favor of scanning their passports and answering a few questions at a touchscreen kiosk. To date, 1.8 million people have decided this convenience far outweighs the slow pace of approvals.


The basics of influencing irrational customers are essentially the same whether they’re taking place in a store or on a screen. A business still needs to know who its customers are, understand their needs and motivations, and give them a reason to buy.

And despite the accelerating shift to digital commerce, we still live in a physical world. “There’s no divide between old-style analog retail and new-style digital retail,” Berens says. “Increasingly, the two are overlapping. One of the things we’ve seen for years is that people go into a store with their phones, shop for a better price, and buy online. Or vice versa: they shop online and then go to a store to negotiate for a better deal.”

Still, digital increases the number of touchpoints from which the business can gather, cluster, and filter more types of data to make great suggestions that delight and surprise customers. That’s why the hottest word in marketing today is omnichannel. Bringing behavioral psychology to bear on the right person in the right place in the right way at the right time requires companies to design customer experiences that bridge multiple channels, on- and offline.

Amazon, for example, is known for its friction-free online purchasing. The company’s pilot store in Seattle has no lines or checkout counters, extending the brand experience into the physical world in a way that aligns with what customers already expect of it, Dhar says.

Omnichannel helps counter some people’s tendency to believe their purchasing decision isn’t truly well informed unless they can see, touch, hear, and in some cases taste and smell a product. Until we have ubiquitous access to virtual reality systems with full haptic feedback, the best way to address these concerns is by providing personalized, timely, relevant information and feedback in the moment through whatever channel is appropriate. That could be an automated call center that answers frequently asked questions, a video that shows a product from every angle, or a demonstration wizard built into the product. Any of these channels could also suggest the customer visit the nearest store to receive help from a human.


The omnichannel approach gives businesses plenty of opportunities to apply subtle nudges across physical and digital channels. For example, a supermarket chain could use store-club card data to push personalized offers to customers’ smartphones while they shop. “If the data tells them that your goal is to feed a family while balancing nutrition and cost, they could send you an e-coupon offering a discount on a brand of breakfast cereal that tastes like what you usually buy but contains half the sugar,” Guszcza says.

Similarly, a car insurance company could provide periodic feedback to policyholders through an app or even the digital screens in their cars, he suggests. “Getting a warning that you’re more aggressive than 90% of comparable drivers and three tips to avoid risk and lower your rates would not only incentivize the driver to be more careful for financial reasons but reduce claims and make the road safer for everyone.”

Digital channels can also show shoppers what similar people or organizations are buying, let them solicit feedback from colleagues or friends, and read reviews from other people who have made the same purchases. This leverages one of the most familiar forms of behavioral psychology—reinforcement from peers—and reassures buyers with Shiv’s Type 1 mindset that they’re making a choice that meets their needs or encourages those with the Type 2 mindset to move forward with the purchase. The rational mind only has to ask at the end of the process “Am I getting the best deal?” And as Guszcza points out, “If you can create solutions that use behavioral design and digital technology to turn my personal data into insight to reach my goals, you’ve increased the value of your engagement with me so much that I might even be willing to pay you more.”


Many transactions take place through corporate procurement systems that allow a company to leverage not just its own purchasing patterns but all the data in a marketplace specifically designed to facilitate enterprise purchasing. Machine learning can leverage this vast database of information to provide the necessary nudge to optimize purchasing patterns, when to buy, how best to negotiate, and more. To some extent, this is an attempt to eliminate psychology and make choices more rational.

B2B spending is tied into financial systems and processes, logistics systems, transportation systems, and other operational requirements in a way no consumer spending can be. A B2B decision is less about making a purchase that satisfies a desire than it is about making a purchase that keeps the company functioning.

That said, the decision still isn’t entirely rational, Berens says. When organizations have to choose among vendors offering relatively similar products and services, they generally opt for the vendor whose salespeople they like the best.

This means B2B companies have to make sure they meet or exceed parity with competitors on product quality, pricing, and time to delivery to satisfy all the rational requirements of the decision process. Only then can they bring behavioral psychology to bear by delivering consistently superior customer service, starting as soon as the customer hits their app or website and spreading out positive interactions all the way through post-purchase support. Finishing strong with a satisfied customer reinforces the relationship with a business customer just as much as it does with a consumer.


The best nudges make the customer relationship easy and enjoyable by providing experiences that are effortless and fun to choose, on- or offline, Dhar says. What sets the digital nudge apart in accommodating irrational customers is its ability to turn data about them and their journey into more effective, personalized persuasion even in the absence of the human touch.

Yet the subtle art of influencing customers isn’t just about making a sale, and it certainly shouldn’t be about persuading people to act against their own best interests, as Nudge co-author Thaler reminds audiences by exhorting them to “nudge for good.”

Guszcza, who talks about influencing people to make the choices they would make if only they had unlimited rationality, says companies that leverage behavioral psychology in their digital experiences should do so with an eye to creating positive impact for the customer, the company, and, where appropriate, the society.

In keeping with that ethos, any customer experience designed along behavioral lines has to include the option of letting the customer make a different choice, such as presenting a confirmation screen at the end of the purchase process with the cold, hard numbers and letting them opt out of the transaction altogether.

“A nudge is directing people in a certain direction,” Dhar says. “But for an ethical vendor, the only right direction to nudge is the right direction as judged by the customers themselves.” D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Sam Yen is Chief Design Officer and Managing Director at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.



Digitalist Magazine


Why big data needs a unified theory of everything

April 10, 2016   Big Data

As I learned from my work in flight dynamics, to keep an airplane flying safely, you have to predict the likelihood of equipment failure. And today we do that by combining various data sets with real-world knowledge, such as the laws of physics.

Integrating these two sets of information — data and human knowledge — automatically is a relatively new idea and practice. It involves combining human knowledge with a multitude of data sets via data analytics and artificial intelligence to potentially answer critical questions (such as how to cure a specific type of cancer). As a systems scientist who has worked in areas such as robotics and distributed autonomous systems, I see how this integration has changed many industries. And I believe there is a lot more we can do.

Take medicine, for example. The immense amount of patient data, trial data, medical literature, and knowledge of key functions like metabolic and genetic pathways could give us tremendous insight if it was available for mining and analysis. If we could overlay all of this data and knowledge with analytics and artificial intelligence (AI) technology, we could solve challenges that today seem out of our reach.

I’ve been exploring this frontier for quite a few years now – both personally and professionally. During my years of training and continuing into my early career, my father was diagnosed with a sequence of chronic conditions, starting with a brain tumor when he was only 40 years old. Later, a small but unfortunate car accident injured the same area of scalp that had been weakened by radio- and chemotherapy. Then he developed cardiovascular issues resulting from repeated use of anesthesia, and lastly he was diagnosed with chronic lymphocytic leukemia. This unique combination of conditions (comorbidities) meant it was extremely difficult to get insight into his situation. My family and I desperately wanted to learn more about his medical issues and to understand how others have dealt with similar diagnoses; we wanted to completely immerse ourselves in the latest medications and treatment options, learn the potential adverse and side effects of the medications, understand the interactions among the comorbidities and medications, and understand how new medical discoveries could be relevant to his conditions.

But the information we were looking for was difficult to source and didn’t exist in a form that could be readily analyzed.

Each of my father’s conditions was being treated in isolation, with no insight into drug interactions. A phenytoin-warfarin interaction was just one of the many potential dangers of this lack of insight. And doctors were unsure of how to adjust the dosages of each of my father’s medications to minimize their adverse and side effects, which turned out to be a big problem.

We also had no predictive knowledge of what to expect next.

My father’s situation is a frighteningly common one. Comorbidities — cases in which patients have two or more chronic conditions — were named the 21st century challenge for healthy aging by the “White House Conference on Aging” in 2014. In developed nations, about one in four adults have at least two chronic conditions, and more than half of older adults have three or more chronic conditions. In the United States, the $2 trillion healthcare industry spends 71¢ of every dollar on treating individuals with comorbidities. In Medicare spending, the amount rises to 93¢ of every dollar.

And comorbidities pose huge challenges to clinicians, who must be cognizant of many layers of care and complexities involved in treating these patients. These cohorts of patients are excluded from most clinical trials. In particular, it is quite difficult to design hypothesis tests due to the heterogeneity and diverse set of possibilities, and it is expensive to run the trials. So even the medical community must rely heavily on observational data and analytical tools from data mining and machine learning algorithms.

But what if we were able to form deep partnerships across medicine and data science in order to bring together the vast set of medical knowledge, patient data, and analytics? I wanted to find out.

As my family struggled to learn more about and track my father’s medical conditions, I was able to get hold of some public medical data. Putting my science hat on, I started to mine these data sets in my after-work hours and weekends using data analytics techniques. And before I knew it, this became my full-time profession at PARC. My work on comorbidities provides a view of how this new field of data analytics works, the partnerships that could arise, and the disruptive changes it will bring.

AI can integrate medical knowledge with data analytics

With help from new regulations and incentive programs as well as new technological advancements, we have access to more digital healthcare records than at any time in the past. Healthcare data sets consist of both structured and unstructured information. Rich Electronic Medical Record (EMR) data sets exist, which include personal and family medical history, treatments, procedures, laboratory tests, large collections of complex physiological information, medical imaging data, genomics, and socio-economic and behavioral data. The data captures a variety of layers — from molecular information and genomics to pathophysiologic responses to diagnoses and procedures to data from self-quantified devices.

Recently, I was fortunate enough to get access to a rich longitudinal inpatient EMR data set with more than nine million unique patients. I started by looking at what clusters of comorbidities co-occur, why, and how these clusters vary as a function of different patient populations and other covariates such as age, gender, ethnicity, environment, and socio-economic factors. I applied advanced statistical methods to create a map of causal relationships between different diseases. Leveraging temporal data led to developing mathematical disease progression models. But something was not quite right.
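
As a heavily simplified sketch of that first step, the snippet below counts which diagnosis codes co-occur within the same patients. The toy records and ICD-style codes are illustrative only; real EMR analysis would involve far richer data, covariates, and statistical controls.

```python
# Minimal sketch: comorbidity co-occurrence counts from toy diagnosis records.
# Patient records and ICD-style codes below are purely illustrative.
from collections import Counter
from itertools import combinations

records = [                                # (patient_id, diagnosis_code)
    (1, "E11"), (1, "I10"), (1, "N18"),    # diabetes, hypertension, kidney disease
    (2, "E11"), (2, "N18"),
    (3, "I10"), (3, "E78"),                # hypertension, hyperlipidemia
    (4, "E11"), (4, "I10"),
]

codes_by_patient = {}
for patient_id, code in records:
    codes_by_patient.setdefault(patient_id, set()).add(code)

cooccurrence = Counter()
for codes in codes_by_patient.values():
    for pair in combinations(sorted(codes), 2):
        cooccurrence[pair] += 1

for pair, count in cooccurrence.most_common():
    print(pair, count)   # e.g. ('E11', 'I10') 2 -- diabetes and hypertension co-occur twice
```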

First, no matter how good EMR data is, medical data is noisy and biased in most cases. The complex nature of the factors involved in transforming the verbal information exchange between patients and physicians into written information on medical charts and from there to International Classification of Disease (ICD) codes used in EMR data leads to enormous coding errors. In addition, different hospitals have different coding quality standards. The medical claims are the backbone of EMR data, but they are collected for billing purposes, which brings yet another source of bias and noise into the data. Coders, hospital administrators, health providers, payers, and patients have different perspectives and expectations when it comes to the medical data. This multi-faceted nature of medical data makes a big impact on the way data is collected and how it will be mined. Inventing algorithms that measure and quantify the quality of data from different resources, and filtering noise and bias from the data will be an inevitable part of working with medical data.

Besides the quality of data, there was something more fundamentally wrong about using only EMR data. For example, my causal inference algorithms resulted in noisy and often invalidated relationships between comorbidities. I tried to validate and explain the results by talking to medical doctors and researchers as well as reviewing the extensive medical knowledge in literature and other databases.

Going through this process led me to a “eureka moment”: If we can automatically integrate our accumulated experiences around the globe together with the long history of medicine, we could:

  1. Identify interesting and yet non-intuitive insights that would help health providers to effectively choose appropriate treatment plans
  2. Generate hypotheses for medical researchers that would expedite knowledge discovery
  3. Develop actionable information for patients and family members to efficiently manage the comorbidities.

Medicine has perhaps one of the longest histories among the different branches of science. The accumulated knowledge in literature and medical and pharma trials is enormous today. Medical knowledge will continue to expand. Big data in medicine can give us interesting insights only if it goes hand-in-hand with the medical knowledge. Looking for causal relationships between different diseases in big EMR data will lead to robust results only if the existing medical knowledge, e.g. causal relationships between diabetes and kidney diseases, is incorporated into our machine learning algorithms. This is all fantastic, but the challenge is that medical knowledge is captured in different ontologies and representations (text, pathways, images, etc.). Additionally, combining medical knowledge is complicated because each source describes a different level of the human system. Some may describe high-level functions, others may describe organ-level functions, and others may focus on the subcell level, describing DNA, RNA, and proteins. So an important part of this process is inventing AI machinery that can assimilate all of this disparate information.

Consider the types of problems we could address from both a patient’s and scientist’s perspective:

Patient perspective: A tremendous amount of data from patient histories combined with medical knowledge can be used to identify clusters of comorbidities and their past and future progression trajectories. Then, patients can be classified based on the comorbidities and the trajectories they follow. This approach will help both patients and doctors to summarize experiences and figure out what to expect next and which treatment plan is the most effective.

Scientist perspective: We can exploit commonalities in trajectories to provide evidence for interactions between comorbidities and generate scientific hypotheses. The goal is to achieve meaningful and actionable insights through a successful marriage of artificial intelligence/machine learning and medicine. In order to perform data-driven analysis, we need to address such challenges as integrating multiple data types, dealing with missing data, and handling irregularly sampled and biased data. Automatic integration of data and medical knowledge is a challenging and yet promising scientific question. While these challenges need to be taken into account by computational scientists working with healthcare data, a larger problem involves how best to ensure the hypotheses posed and types of knowledge discoveries sought are relevant to the healthcare community.

Given the expanding reach of medical data, we are entering a new age of intelligent medicine. Machine learning is the core technology enabling this development, but it will be critical for domain experts to understand and trust the results of machine learning algorithms. Current machine learning techniques produce models that are opaque, non-intuitive, and difficult for experts to rely on in their decision processes. But if we can integrate medical data and human knowledge, we can deliver explainable/interpretable intelligence to health providers and medical researchers.

I hope we can start to leverage the power of the experiences of all patients combined with the long history of medical knowledge to improve the quality of care for individual patients. The process would have to begin with a new generation of partnerships between data science and the keepers of knowledge.

As I said above, this approach isn’t just relevant to the medical world. It could be used to solve complex problems across a variety of fields. When this happens, a new wave of disruption will take place in the form of data analytics married to human knowledge.

Marzieh Nabi is a research scientist and technical lead at PARC, a Xerox company, with a background that encompasses control, optimization, networked dynamics systems, robotics, and flight dynamics. She is interested in applications of these tools in energy, transportation, aerospace, multi-agent and autonomous systems, and healthcare.





Big Data – VentureBeat
