Are You Engineering the Customer Experience Out of Your Business?

Automation. Robots. Technology taking our jobs. I defy you to pick up a business magazine and avoid this topic — it will be in there somewhere. However, there’s another theme you won’t be able to avoid: the need to focus on the customer experience.

These two trends are in tension much of the time. They don’t have to be, but due to most businesses’ seemingly inescapable need to focus on themselves instead of the customer, that’s often the case.

I ran across a prime example recently. I’ll admit it: I eat fast food sometimes — I eat healthy at home, and I have to balance it out once in a while. As a result, I noticed when the local McDonald’s was closed for three weeks for remodeling. When it re-opened, I visited. As someone who pays attention to customer experience, I was astounded — so much so that I went back two weeks later to take notes.

So Long, No. 4

Instead of a counter where your order was taken, there were four large touchscreens, so inconspicuously placed that the workers who used to be behind the counters were out in the lobby directing customers to them (and then offering guidance on how to use them).

The screens were about two feet wide and three and a half feet tall — so tall you had to stand back in order to see the entire surface. Even then, the interfaces were so poorly designed that you had to scroll down on some screens (because 36 inches just isn’t enough space, apparently).

To order what used to be a “No. 4,” I had to hunt down the right icons and push buttons 11 times. Once to indicate whether I was eating in or taking the food out, once to select chicken (vs. hamburgers, salads, etc.), once for the kind of chicken, once for the size of the meal, once for the kind of drink, once for the kind of sauce, once for the OTHER kind of sauce (?!?), once to confirm that was what I wanted, once to key in my table number, once to specify my payment type, and once to finalize payment.

There was a payment card reader at the kiosk. If you wanted to pay with cash, you had to flag down an employee and have them ring you up at the counter. You may remember the counter — it’s where I could have asked a human for a “Number 4, medium, for here,” and been done ordering.

When the food was brought out, there was no drink — I had not realized that when I pushed the button for a Coke, I should have known to turn around and get a cup from the dispenser behind me. I had to go get my drink after the woman running the food out pointed it out to me.

I get that the idea here is to reduce the number of employees in order to have a positive impact on the bottom line — but that wasn’t happening. There were people helping customers with the kiosks and people running food to the tables — the same people who had been behind the counter a month earlier.

The whole point of a restaurant of this nature is that you get in, order your food, and eat as quickly as possible. It's a mundane customer experience, to be sure, but it sets expectations. The kiosks blew up my two-second ordering experience into a four-minute adventure in navigating a not-very-well-designed set of touchscreens. And the process for getting my own drink was never explained at all.

Everything about the old way of ordering is better. Will the savings in headcount pay off if your customers stop coming in?

Keep Automation Simple

The experience was vaguely reminiscent of the supermarket self-checkout, which was supposed to do away with checkout clerks but instead created a new position for them as helpers for the befuddled self-checkouters struggling with a poorly designed process. (They also try to prevent clever customers from exploiting the machines for fun and profit.)

Another aspect of this type of automation is that it takes the bat out of your employees’ hands. They can’t add to the customer experience with a little of their own personality if the technology takes their place and only allows them to participate when the customer is frustrated.

The best automation, when it comes to preserving the customer experience, is designed to automate the simplest part of the transactions. Think about airline check-in kiosks: they retrieve your flight information, print boarding passes and generate tags for your luggage, but you still have to show your ID and hand your bags to people behind the counter, and those people still tell you the gate number and wish you well on your flight. Your last contact with the airline before your wait at the gate is with a person, not with a terminal.

If you feel like you must automate customer-facing activities, map them out and decide which ones automation can do best — not to your benefit, but to the benefit of the customer experience. If you identify activities that aren't suited to your machines, have your people handle them. And as you think through the process, remember: automation should be used to help your customer-facing employees deliver better experiences, not to assist them out the door.

Chris Bucholtz has been an ECT News Network columnist since 2009. His focus is on CRM, sales and marketing software, and the interface between people and technology. A noted speaker and author, Chris has covered the CRM space for 10 years.


CRM Buyer

Uniqueifier considerations and error 666

This post is intended to shed some light on uniqueifiers and on table designs that rely on them.

First, some quick background on the subject.

A uniqueifier (or "uniquifier," as reported by SQL Server internal tools) has been used in the engine for a long time (since SQL Server 7.0). Even though it is well known and referenced in books and blogs, the documentation clearly states that it is never exposed externally:

“If the clustered index is not created with the UNIQUE property, the Database Engine automatically adds a 4-byte uniqueifier column to the table. When it is required, the Database Engine automatically adds a uniqueifier value to a row to make each key unique. This column and its values are used internally and cannot be seen or accessed by users.”

While it's unlikely that you will ever face an issue related to uniqueifiers, we have seen rare cases where a customer reaches the uniqueifier limit of 2,147,483,648, generating error 666.

Msg 666, Level 16, State 2, Line 1

The maximum system-generated unique value for a duplicate group was exceeded for index with partition ID . Dropping and re-creating the index may resolve this; otherwise, use another clustering key.

The error message is clear about the actions you should take to resolve the issue.

Dropping and re-creating the index may resolve the problem if you don't have 2,147,483,648 records with the same key value. Recreating the index allows the uniqueifier to be reset, giving you some time to review the table design before reaching the limit again.

As of February 2018, the design goal for the storage engine is not to reset uniqueifiers during rebuilds. Ideally, then, rebuilding the index would not reset uniqueifiers, and the issue would recur as soon as you inserted new data with a key value whose uniqueifiers were exhausted. Current engine behavior differs in one specific case, however: the statement ALTER INDEX ALL ON <table> REBUILD WITH (ONLINE = ON) does reset the uniqueifiers (in all versions from SQL Server 2005 through SQL Server 2017).

Important: This behavior is not documented and can change in future versions, so our recommendation is to review your table design rather than rely on it.

On the table design aspect, based on the second recommendation in the error message, our question is: "Is it good design to choose a non-unique clustering key that could have several million or billion duplicate key values?"

The short answer is probably NO. Of course, we know that when it comes to table design there is usually not a right or wrong answer, but the majority of cases should not rely heavily on uniqueifiers.

While most cases will likely have at most hundreds or thousands of duplicate records for a single value, and it is straightforward to resolve error 666 when you have fewer duplicate rows than the uniqueifier limit, executing the required steps can still cause some downtime. Consequently, the best course of action is to review tables that rely on uniqueifiers and proactively work to improve their design.
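To make the mechanism concrete, here is a rough Python sketch of the idea: a per-key counter that disambiguates duplicate clustered-key values and fails once the 4-byte ceiling is reached. This is a mental model only, not how the SQL Server storage engine is actually implemented.

```python
# Conceptual model of a uniqueifier: a per-key counter appended to
# duplicate clustered-key values. Illustration only -- not the real
# SQL Server storage internals.

UNIQUEIFIER_LIMIT = 2_147_483_648  # 4-byte limit cited above


class ClusteredIndex:
    def __init__(self):
        # key value -> next uniqueifier to hand out for that key
        self._next_uniqueifier = {}

    def insert(self, key):
        n = self._next_uniqueifier.get(key, 0)
        if n >= UNIQUEIFIER_LIMIT:
            # Analogue of error 666: no more unique values for this key
            raise OverflowError(f"uniqueifier exhausted for key {key!r}")
        self._next_uniqueifier[key] = n + 1
        # The stored key is the pair (key, uniqueifier); the first row
        # for a key effectively carries uniqueifier 0.
        return (key, n)


idx = ClusteredIndex()
print(idx.insert("A"))  # ('A', 0)
print(idx.insert("A"))  # ('A', 1)
print(idx.insert("B"))  # ('B', 0)
```

Note how the counter is per key value, which is why rebuilding (resetting the counters) only buys time: a key with billions of duplicates will simply exhaust its counter again.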

We hope this helps you better understand uniqueifiers and prompts you to review your table designs, so you can avoid error 666 in your production environment.

If you want more information about uniqueifiers, see example 03 from the companion content for Chapter 6 (Index Internals) of the SQL Server 2008 Internals book.



CSS SQL Server Engineers

Battle of Shanhai Pass in 1644 marked the end of the Ming…

The Battle of Shanhai Pass in 1644 marked the end of the Ming Dynasty and the beginning of the Shun Dynasty, which lasted only a year and resulted in the establishment of the Qing Dynasty.



A Historian Walks into a Bar . . .

How to Bulk Update CRM Records Using the Bulk Workflow Execution Tool


That’s right – no more being limited to mass updating only 250 records at once in Dynamics 365! The Bulk Workflow Execution is a great tool which allows users to run an On-Demand Workflow against a set of records pulled from a System or Personal View, all in bulk! You can find the “Bulk Workflow Execution” tool in the Plug-in Store using the XRMToolBox application.

Let’s look at a quick example of how the tool works. Say that you need to change the Owner of 10,000+ Case records. Follow the steps below:

1. Create an On-demand Workflow.


2. Create a Personal View or use an existing System View to pull your data set.


3. Open XRMToolBox.

4. Scroll and click the Bulk Workflow Execution.

5. Click Yes to connect to your Dynamics 365 organization.


6. Type your Organization URL.


7. Type your Username/Password and click Connect.


8. Type a Name for your organization after it successfully connects. NOTE: once the organization connection is established, XRMToolBox will open a new tab with the tool name and organization name you provided.

9. Select the On-Demand Workflow you created from the dropdown and wait for the tool to retrieve the applicable Views.


10. From the View list, select the System View or Personal View you are running the On-Demand Workflow against. Note: you should see the FetchXML Query populated in the area to the right after you select the View from the list.

11. Click Validate Query to verify the number of records being pulled in the data set.


12. Determine your Batch Size and Interval Delay.
Note: if unsure, keep the default Batch Size of "200" and Interval Delay of "0" seconds.

13. When ready, click Start Workflows.


14. The tool will run and will inform you of progress.


15. When finished, the tool will display a small window indicating the time it completed, how many records the workflow ran against, and the number of errors that occurred during execution.


A good rule of thumb is to allow about two seconds per record: that's 30 records per minute, or 1,800 records per hour. Remember that this tool is completely automated, so while the job runs you can leave your computer unattended or let it run in the background while you multitask.
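At two seconds per record, estimating how long a large job will take is simple arithmetic; here's a tiny helper to do the math (my own convenience function, not part of the XrmToolBox tool):

```python
# Back-of-the-envelope runtime estimate at ~2 seconds per record,
# the rule of thumb mentioned above.

def estimated_runtime(record_count, seconds_per_record=2):
    """Return a rough 'Xh Ym' duration string for a bulk workflow run."""
    total_seconds = record_count * seconds_per_record
    hours, remainder = divmod(total_seconds, 3600)
    minutes = remainder // 60
    return f"{hours}h {minutes}m"


print(estimated_runtime(10_000))  # 10,000 records -> "5h 33m"
```

For the 10,000+ Case scenario above, that's over five hours, which is worth knowing before you kick the job off on a Friday afternoon.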

That’s it! For more Tips & Tricks and other educational materials on Dynamics 365, check out our blog.

Happy Dynamics 365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Power Query (M)agic: Dynamically remove leading rows AND columns

Does Power Query sometimes seem too rigid? It’s great for defining a sequence of transformations to clean your data but requires the incoming data to be in a consistent format. Today, I want to talk about a technique for dynamically handling sheets with different structures.

You can download a sample .pbix to follow along with me here.

Let’s look at my sample data. Suppose my sales manager sends me a monthly sales report structured similarly to that below:


This table looks pretty clean, right? Using Power Query, you can load this sheet, remove the first two rows and columns, unpivot the data, and you have the numbers you’re looking for in tabular format.

Patting yourself on the back, you file the report away until February rolls around. The Sales Manager sends you the new numbers, you hit refresh, and… an error! Peeking at the data, you see that he changed the layout of the file and also added new salespeople to the matrix!



As enjoyable as it would be just to tell the manager to stick to a consistent format, sometimes this isn’t realistic or feasible. So, let’s look at how we can use Power Query to handle this chaos:

  1.  Identify the header name for the column that demarcates the beginning of the matrix.
  2.  Automatically detect rows up to that header row.
  3.  Automatically detect columns up to that header column.
  4.  Remove the extra rows and columns, leaving you with a squeaky-clean dataset.
  5.  Unpivot the matrix for each sheet and append into a single dataset.


Looking back at our sample data, you can see that there is a common cell in all our sheets that lies at the upper left of the data we want to extract. In this case, the target cell is the “Region” header. We want to remove all rows ABOVE that cell and all columns LEFT of that cell without hard-coding the location of that cell.
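Before building this in M, the logic is easy to prototype in plain Python (an illustrative sketch with made-up sample data, not the Power Query solution itself): scan the sheet for the target header cell, slice away everything above and to the left of it, then unpivot.

```python
# Plain-Python sketch of the dynamic-trim technique: locate the
# "Region" header cell, drop the junk rows above it and the junk
# columns left of it, then unpivot the remaining matrix.

def trim_and_unpivot(sheet, target="Region"):
    # Locate the target header cell: row index r, column index c.
    r, c = next(
        (i, row.index(target))
        for i, row in enumerate(sheet)
        if target in row
    )
    # Keep only rows from the header down, columns from the header right.
    matrix = [row[c:] for row in sheet[r:]]
    header, *body = matrix
    # Unpivot: one (region, salesperson, value) record per data cell.
    return [
        (row[0], header[j], row[j])
        for row in body
        for j in range(1, len(header))
    ]


# Hypothetical sheet with a leading title row and a blank leading column.
sheet = [
    ["Monthly Sales", None, None, None],
    [None, None, None, None],
    [None, "Region", "Ann", "Bob"],
    [None, "North", 10, 20],
    [None, "South", 30, 40],
]
print(trim_and_unpivot(sheet))
# [('North', 'Ann', 10), ('North', 'Bob', 20), ('South', 'Ann', 30), ('South', 'Bob', 40)]
```

The Power Query steps that follow implement exactly this: find the index of the target row, find the index of the target column, skip up to each, and unpivot.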


We need some way to identify where the header row starts so that we can remove rows up to that point. You would think this functionality is built in by default, but surprisingly it is not! Luckily, our friend Ken Puls came up with a solution for this, which I've adapted slightly for our purposes.

Load your first worksheet into Power Query, add an Index column, and filter the table to the target value from step 1:

  1. Add Column > Index column > From 0.
  2. Add a custom step (click the fx button in the formula bar).
  3. Replace the formula with some custom M code: Table.FindText(#"Starting Table w/Index", "Region"). Name the step "Filtered to Header Row."


Note that you'll want to replace "Region" with the target header name from step 1. Table.FindText() scans the entire table for the target text and returns any row containing that value, so be careful that your dataset doesn't have the exact target word in other places!

Now we have our header row isolated along with the Index value for that row. Rename the step to "Filtered to Header Row," as we'll come back to it shortly.



Let’s move on to the more difficult part: dynamically removing leading columns. We have a bunch of columns, and we want to eliminate the first X, up to our target column. We’ll leverage the technique above and add some P3 secret sauce.

First, transpose the #"Filtered to Header Row" step and add an index. That turns the single-row table into a single-column table we can use to identify the columns to remove.

  1. Transform > Transpose.
  2. Add an index column: Add Column > Index column > From 1.
  3. To handle blank columns in the header row (always a possibility in dirty data), add a custom column that checks if the column list has any nulls: Add Column > Custom Column > if [Column1] = null then "Column" & Number.ToText([Index]) else [Column1].

Our goal is to delete all columns left of the “Region” column (or above “Region” in the transposed table) so let’s find the index of that row:

  1. Right Click Column1 > Text Filters > Equals > “Region”.


We’re building upon Ken’s technique of finding the Index that corresponds to a target cell but this time with a transposed table. Since we’ll reference this number a couple of times, let’s Drill Down to just the Index number so that we have an easily referenceable step:

  1. Right-Click on the Index value > Drill Down.
  2. Rename that step to “TargetColumnIndex”.

Now, jump around a bit and reference the original column list and filter it down to include ONLY the rows that have an index number less than the target column.

  1. Click the fx button to insert a New Step.
  2. Revise the M code to point to the full column list: = Table.SelectRows(#"Added ColumnName", each [Index] < #"TargetColumnIndex").

Let's break down what we're doing here: the outer Table.SelectRows filters the inner table #"Added ColumnName" down to all rows that have an Index less than the "TargetColumnIndex" value we isolated a couple of steps ago.


Finally, remove the helper columns keeping only “ColumnName,” and you have a nice list of columns to exclude!


We now have all the pieces we need to eliminate our junk rows and columns! Let’s jump back to our original query and clean it up.

Create a new step and change its code to reference the Starting Table:

  1. Click fx > rename the step to "Back to Starting Table" > change the code to = #"Starting Table".
  2. Home > “Remove Top Rows.” Enter any value for the parameter.
  3. Edit the M code directly and change the 2nd parameter: Table.Skip(#"Back to Starting Table", #"Filtered to Header Row"[Index]{0}). Here we reference the step where we isolated the header row number earlier.
  4. Home > Use First Rows as Headers.


Boom! We’ve dynamically eliminated the top rows!

Now for the final act, we'll tweak Table.RemoveColumns (the function Power Query uses when you choose "Remove Columns") to take a dynamic parameter! Remember that list of columns we generated earlier, the columns we want to remove? That's what we'll feed into Table.RemoveColumns.

First, select any one of the junk columns and right-click > “Remove Columns.” Take a look at the formula that Power Query generated.

  1.  Table.RemoveColumns(#"Promoted Headers",{"Column1"}).

We know that Table.RemoveColumns requires a list for its second argument, so we can reference the list of columns to remove from an earlier step:

  1. Table.ToList(#"Columns to Remove").

Now we’re left with a clean matrix which we can easily unpivot without any problems.

    1. Right-click the Region column > Unpivot Other Columns.
    2. Rename columns something useful.


STEP 5: Convert magic query to function

The final step is to convert your magic query into a function so that you can apply this dynamic transformation to each file or worksheet that needs to be unpivoted.


Using the technique of identifying a target row and column, you can create extremely powerful and dynamic queries that can handle input files that aren’t always perfect, because let’s face it, you can’t always rely on training or data validation to prevent end users from modifying your perfect templates. Our job as data modelers is to make the end user experience as friendly as possible by foreseeing and handling exceptions.

If you have any ideas on how you might use this in your reports, please feel free to share in the comments section!

We get it:  you probably arrived here via Google, and now that you’ve got what you needed, you’re leaving. And we’re TOTALLY cool with that – we love what we do more than enough to keep writing free content.  And besides, what’s good for the community (and the soul) is good for the brand.

But before you leave, we DO want you to know that instead of struggling your way through a project on your own, you can hire us to accelerate you into the future. Like, 88 mph in a flux-capacitor-equipped DeLorean – THAT kind of acceleration. C’mon McFly. Where you’re going, you won’t need roads.



Expert Interview (Part 2): Cloudera’s Mike Olson on the Gartner Hype Cycle

At the Cloudera Sessions event in Munich, Germany, Paige Roberts of Syncsort sat down with Mike Olson, Chief Strategy Officer of Cloudera. In the first part of the interview, Mike Olson went into what’s new at Cloudera, how machine learning is evolving, and the adoption of the Cloud in organizations.

In this part he talks about Gartner’s latest hype cycle, and where he sees things going.

Paige:   Did you see Gartner's latest hype cycle? They say that Hadoop will be obsolete before it reaches the plateau of productivity.

Mike:    Yes, and I’ll say that Gartner’s conclusions on Big Data just don’t match ours. We’ve got lots of serious customers doing really mission critical production workloads on our platform. I’m not sure who they’re talking to that’s leading to these conclusions. I will say that if you view the Big Data landscape as really just Hadoop, there’s all kinds of reasons to be skeptical, right?


Especially if you just look at it as MapReduce and HDFS.

That’s right and it’s perfectly fair to say those are awful alternatives to traditional relational databases. In fact, it’s legit to say there’s going to be a place for Oracle, SAP HANA, Teradata, Microsoft SQL Server, and DB2 Parallel Edition for the long term.  Scale out platforms are never going to be good at online transaction processing.

Distributed transactions have been hard forever and nothing about Hadoop makes it easier, but using tools like Impala to do high-performance analytic queries gives companies an alternative for certain parts of their traditional relational workloads on the scale-out platform. We’re not just bullish, we’ve been quite successful in delivering those capabilities to the enterprise.

The Gartner hype cycle, if you look at the terminology, there’s the peak of inflated expectations and then the trough of disillusionment, and then the plateau of productivity. And maybe Gartner’s current down outlook is because right now, we’re in the trough, and it’s the plateau of productivity broadly across the industry we have to get to. We’ve said publicly that we’ve got more than a thousand customers, more than 600 in the Global 8,000 running this platform in production for a bunch of very demanding workloads.

I have to wonder if they’re looking at the Hadoop of 10 years ago, as opposed to now. It used to be you had just MapReduce and HDFS which was really limited, but now it’s 25 different projects including Spark, and all these other capabilities, and that’s a completely different kind of distribution.

Frankly I think that if you look at Hadoop as just Hadoop, then there’s a bunch of stuff it doesn’t do. But the ecosystem has evolved way beyond that.

Yeah, it's growing all the time. Actually, I do a "What is Hadoop" presentation, and it starts with a slide that gives the basics of Hadoop 1.x: "Here's this cool thing, and let me explain it to you." Then it shows the ecosystem progressing slide after slide: it grew, and it grew, and it grew some more.

HDFS and MapReduce are always going to be part of our platform. They’re really important. But, you can now spin a cluster infrastructure running on Microsoft Azure using ADLS object store and Spark running on top of that and there’s no HDFS or MapReduce anywhere near that thing.

It is no longer the be-all and end-all of Hadoop.

It’s a much more expansive and capable ecosystem than it used to be.

Tune in for the final installment of this interview, where Mike Olson shares his view on women in tech and explains the difference between Cloudera Altus and Director.

Make sure to check out our latest eBook, 6 Key Questions About Getting More from Your Mainframe with Big Data Technologies, and learn what you need to ask yourself to get around the challenges, and reap the promised benefits of your next Big Data project.


Syncsort + Trillium Software Blog

Are you ready for GDPR? It’s coming.

As you’ve probably heard, on May 25, 2018, a European privacy law, the General Data Protection Regulation (GDPR), is due to take effect. The GDPR imposes new rules on companies, government agencies, non-profits, and other organizations that offer goods and services to people in the European Union (EU), or that collect and analyze data tied to EU residents. The GDPR applies no matter where you are located.


There are a few things that GDPR will change:

  • Personal privacy rights: Individuals will have the right to access their personal data, correct errors in their personal data, erase their personal data, object to the processing of their personal data, or export their personal data.
  • Added controls and notifications: Organizations will be required to protect personal data using appropriate security measures, notify authorities of personal data breaches, obtain appropriate consents for processing data, and keep records detailing data processing.
  • Transparent policies: Organizations must provide clear notice of data collection, outline processing purposes and use cases, and define data retention and deletion policies.
  • IT and training requirements: Organizations will need to train privacy personnel and employees, audit and update data policies, employ a Data Protection Officer (if required), and create and manage compliant vendor contracts.

In short, GDPR demands stricter controls on where personal data is stored and how it is used. It will bring better data governance tools for improved transparency, recordkeeping, and reporting. Finally, it will improve data policies to give data subjects greater control and to ensure lawful processing.


Preparing for the GDPR is a business-wide challenge that will take time, tools, processes, and expertise. Preparations may require significant changes to how you conduct your business and to customers’ privacy and data management practices. The requirements are complicated and each organization’s path to readiness will be unique, so don’t wait until May to begin preparing.

We’re here to help. Our team of data and technical experts can assess your readiness and help you determine the best path forward to ensure you can continue to serve your customers.

Contact BroadPoint today for more information and to assess your readiness.


CRM Software Blog | Dynamics 365

Top Ten Digitalist Magazine Posts Of The Week [February 19, 2018]

In the tech world in 2017, several trends emerged as signals amid the noise, signifying much larger changes to come.

As we noted in last year’s More Than Noise list, things are changing—and the changes are occurring in ways that don’t necessarily fit into the prevailing narrative.

While many of 2017’s signals have a dark tint to them, perhaps reflecting the times we live in, we have sought out some rays of light to illuminate the way forward. The following signals differ considerably, but understanding them can help guide businesses in the right direction for 2018 and beyond.


When a team of psychologists, linguists, and software engineers created Woebot, an AI chatbot that helps people learn cognitive behavioral therapy techniques for managing mental health issues like anxiety and depression, they did something unusual, at least when it comes to chatbots: they submitted it for peer review.

Stanford University researchers recruited a sample group of 70 college-age participants on social media to take part in a randomized control study of Woebot. The researchers found that their creation was useful for improving anxiety and depression symptoms. A study of the user interaction with the bot was submitted for peer review and published in the Journal of Medical Internet Research Mental Health in June 2017.

While Woebot may not revolutionize the field of psychology, it could change the way we view AI development. Well-known figures such as Elon Musk and Bill Gates have expressed concerns that artificial intelligence is essentially ungovernable. Peer review, such as with the Stanford study, is one way to approach this challenge and figure out how to properly evaluate and find a place for these software programs.

The healthcare community could be onto something. We’ve already seen instances where AI chatbots have spun out of control, such as when internet trolls trained Microsoft’s Tay to become a hate-spewing misanthrope. Bots are only as good as their design; making sure they stay on message and don’t act in unexpected ways is crucial.

This is especially true in healthcare. When chatbots are offering therapeutic services, they must be properly designed, vetted, and tested to maintain patient safety.

It may be prudent to apply the same level of caution to a business setting. By treating chatbots as if they’re akin to medicine or drugs, we have a model for thorough vetting that, while not perfect, is generally effective and time tested.

It may seem like overkill to think of chatbots that manage pizza orders or help resolve parking tickets as potential health threats. But it’s already clear that AI can have unintended side effects that could extend far beyond Tay’s loathsome behavior.

For example, in July, Facebook shut down an experiment where it challenged two AIs to negotiate with each other over a trade. When the experiment began, the two chatbots quickly went rogue, developing linguistic shortcuts to reduce negotiating time and leaving their creators unable to understand what they were saying.

The implications are chilling. Do we want AIs interacting in a secret language because designers didn’t fully understand what they were designing?

In this context, the healthcare community’s conservative approach doesn’t seem so farfetched. Woebot could ultimately become an example of the kind of oversight that’s needed for all AIs.

Meanwhile, it’s clear that chatbots have great potential in healthcare—not just for treating mental health issues but for helping patients understand symptoms, build treatment regimens, and more. They could also help unclog barriers to healthcare, which is plagued worldwide by high prices, long wait times, and other challenges. While they are not a substitute for actual humans, chatbots can be used by anyone with a computer or smartphone, 24 hours a day, seven days a week, regardless of financial status.

Finding the right governance for AI development won’t happen overnight. But peer review, extensive internal quality analysis, and other processes will go a long way to ensuring bots function as expected. Otherwise, companies and their customers could pay a big price.


Elon Musk is an expert at dominating the news cycle with his sci-fi premonitions about space travel and high-speed hyperloops. However, he captured media attention in Australia in April 2017 for something much more down to earth: how to deal with blackouts and power outages.

In 2016, a massive blackout hit the state of South Australia following a storm. Although power was restored quickly in Adelaide, the capital, people in the wide stretches of arid desert that surround it spent days waiting for the power to return. That hit South Australia’s wine and livestock industries especially hard.

South Australia’s electrical grid currently gets more than half of its energy from wind and solar, with coal and gas plants acting as backups for when the sun hides or the wind doesn’t blow, according to ABC News Australia. But this network is vulnerable to sudden loss of generation—which is exactly what happened in the storm that caused the 2016 blackout, when tornadoes ripped through some key transmission lines. Getting the system back on stable footing has been an issue ever since.

Displaying his usual talent for showmanship, Musk stepped in and promised to build the world’s largest battery to store backup energy for the network—and he pledged to complete it within 100 days of signing the contract or the battery would be free. Pen met paper with South Australia and French utility Neoen in September. As of press time in November, construction was underway.

For South Australia, the Tesla deal offers an easy and secure way to store renewable energy. Once completed, Tesla’s 129 MWh battery will be the most powerful battery system in the world, exceeding the current record holder by 60%, according to Gizmodo. The battery, which is stationed at a wind farm, will cover temporary drops in wind power and kick in to help conventional gas and coal plants balance generation with demand across the network. South Australian citizens and politicians largely support the project, which Tesla claims will be able to power 30,000 homes.

Until Musk made his bold promise, batteries did not figure much in renewable energy networks, mostly because they just weren’t good enough. They hold limited charge and are difficult to build and manage. Utilities also worry about relying on the same lithium-ion battery technology as cellphone makers like Samsung, whose Galaxy Note 7 had to be recalled in 2016 after some defective batteries burst into flames, according to CNET.

However, when made right, the batteries are safe. It’s just that they’ve traditionally been too expensive for large-scale uses such as renewable power storage. But battery innovations such as Tesla’s could radically change how we power the economy. According to a study that appeared this year in Nature, the continued drop in the cost of battery storage has made renewable energy price-competitive with traditional fossil fuels.

This is a massive shift. Or, as David Roberts of news site Vox puts it, “Batteries are soon going to disrupt power markets at all scales.” Furthermore, if the cost of batteries continues to drop, supply chains could experience radical energy cost savings. This could disrupt energy utilities, manufacturing, transportation, and construction, to name just a few, and create many opportunities while changing established business models. (For more on how renewable energy will affect business, read the feature “Tick Tock” in this issue.)

Battery research and development has become big business. Thanks to electric cars and powerful smartphones, there has been incredible pressure to make more powerful batteries that last longer between charges.

The proof of this is in the R&D funding pudding. A Brookings Institution report notes that both the Chinese and U.S. governments offer generous subsidies for lithium-ion battery advancement. Automakers such as Daimler and BMW have established divisions marketing residential and commercial energy storage products. Boeing, Airbus, Rolls-Royce, and General Electric are all experimenting with various electric propulsion systems for aircraft—which means that hybrid airplanes are also a possibility.

Meanwhile, governments around the world are accelerating battery research investment by banning internal combustion vehicles. Britain, France, India, and Norway are seeking to go all electric as early as 2025 and by 2040 at the latest.

In the meantime, expect huge investment and new battery innovation from interested parties across industries that all share a stake in the outcome. This past September, for example, Volkswagen announced a €50 billion research investment in batteries to help bring 300 electric vehicle models to market by 2030.


At first, it sounds like a narrative device from a science fiction novel or a particularly bad urban legend.

Powerful cameras in several Chinese cities capture photographs of jaywalkers as they cross the street and, several minutes later, display their photograph, name, and home address on a large screen posted at the intersection. Several days later, a summons appears in the offender’s mailbox demanding payment of a fine or fulfillment of community service.

As Orwellian as it seems, this technology is very real for residents of Jinan and several other Chinese cities. According to a Xinhua interview with Li Yong of the Jinan traffic police, “Since the new technology has been adopted, the cases of jaywalking have been reduced from 200 to 20 each day at the major intersection of Jingshi and Shungeng roads.”

The sophisticated cameras and facial recognition systems already used in China—and their near–real-time public shaming—are an example of how machine learning, mobile phone surveillance, and internet activity tracking are being used to censor and control populations. Most worryingly, the prospect of real-time surveillance makes running surveillance states such as the former East Germany and current North Korea much more financially efficient.

According to a 2015 discussion paper by the Institute for the Study of Labor, a German research center, by the 1980s almost 0.6% of the East German population was directly employed by the Stasi, the country’s state security service and secret police—1 for every 166 citizens. Unofficial informers added roughly another 1.5% of the population (1 for every 66 citizens), representing a massive economic drain. Automated, real-time, algorithm-driven monitoring could potentially drive the cost of controlling the population down substantially in police states—and elsewhere.

We could see a radical new era of censorship that is much more manipulative than anything that has come before. Previously, dissidents were identified when investigators manually combed through photos, read writings, or listened in on phone calls. Real-time algorithmic monitoring means that acts of perceived defiance can be identified and deleted in the moment and their perpetrators marked for swift judgment before they can make an impression on others.

Businesses need to be aware of the wider trend toward real-time, automated censorship and how it might be used in both commercial and governmental settings. These tools can easily be used in countries with unstable political dynamics and could become a real concern for businesses that operate across borders. Businesses must learn to educate and protect employees when technology can censor and punish in real time.

Indeed, the technologies used for this kind of repression could be easily adapted from those that have already been developed for businesses. For instance, both Facebook and Google use near–real-time facial identification algorithms that automatically identify people in images uploaded by users—which helps the companies build out their social graphs and target users with profitable advertisements. Automated algorithms also flag Facebook posts that potentially violate the company’s terms of service.

China is already using these technologies to control its own people in ways that are largely hidden to outsiders.

According to a report by the University of Toronto’s Citizen Lab, the popular Chinese social network WeChat operates under a policy its authors call “One App, Two Systems.” Users with Chinese phone numbers are subjected to dynamic keyword censorship that changes depending on current events and whether a user is in a private chat or in a group. Depending on the political winds, users are blocked from accessing a range of websites that report critically on China through WeChat’s internal browser. Non-Chinese users, however, are not subject to any of these restrictions.
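Citizen Lab’s description suggests a simple underlying mechanism: check each outgoing message against a blocklist chosen by the sender’s account region and chat context, then silently drop any match. The sketch below illustrates that idea in Python; the keywords, region codes, and function names are invented for illustration and are not taken from WeChat’s actual implementation.

```python
# Toy sketch of context-dependent keyword filtering, loosely modeled on
# Citizen Lab's description of WeChat's "One App, Two Systems" policy.
# All keywords and account attributes below are hypothetical examples.

BLOCKLISTS = {
    "group": {"banned-topic-a", "banned-topic-b", "banned-topic-c"},
    "private": {"banned-topic-a"},  # private chats use a narrower list
}

def should_block(message: str, chat_type: str, account_region: str) -> bool:
    """Return True if the message would be silently dropped."""
    if account_region != "CN":      # non-Chinese accounts: no filtering
        return False
    text = message.lower()
    blocklist = BLOCKLISTS.get(chat_type, set())
    return any(keyword in text for keyword in blocklist)

def deliver(message: str, chat_type: str, account_region: str):
    # Blocked messages are dropped with no notification to the sender,
    # mirroring the "invisible" censorship the report describes.
    if should_block(message, chat_type, account_region):
        return None
    return message
```

Because the blocklists are plain data, they can be swapped out at any time, which is what makes the censorship “dynamic”: the same message may pass one week and vanish the next.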

The censorship is also designed to be invisible. Messages are blocked without any user notification, and China has intermittently blocked WhatsApp and other foreign social networks. As a result, Chinese users are steered toward national social networks, which are more compliant with government pressure.

China’s policies play into a larger global trend: the nationalization of the internet. China, Russia, the European Union, and the United States have all adopted different approaches to censorship, user privacy, and surveillance. Although there are social networks such as WeChat or Russia’s VKontakte that are popular in primarily one country, nationalizing the internet challenges users of multinational services such as Facebook and YouTube. These different approaches, which impact everything from data safe harbor laws to legal consequences for posting inflammatory material, have implications for businesses working in multiple countries, as well.

For instance, Twitter is legally obligated to hide Nazi and neo-fascist imagery and some tweets in Germany and France—but not elsewhere. YouTube was officially banned in Turkey for two years because of videos a Turkish court deemed “insulting to the memory of Mustafa Kemal Atatürk,” father of modern Turkey. In Russia, Google must keep Russian users’ personal data on servers located inside Russia to comply with government policy.

While China is a pioneer in the field of instant censorship, tech companies in the United States are matching China’s progress, which could potentially have a chilling effect on democracy. In 2016, Apple applied for a patent on technology that censors audio streams in real time—automating the previously manual process of censoring curse words in streaming audio.


In March, after U.S. President Donald Trump told Fox News, “I think maybe I wouldn’t be [president] if it wasn’t for Twitter,” Twitter founder Evan “Ev” Williams did something highly unusual for the creator of a massive social network.

He apologized.

Speaking with David Streitfeld of The New York Times, Williams said, “It’s a very bad thing, Twitter’s role in that. If it’s true that he wouldn’t be president if it weren’t for Twitter, then yeah, I’m sorry.”

Entrepreneurs tend to be very proud of their innovations. Williams, however, offers a far more ambivalent response to his creation’s success. Much of the 2016 presidential election’s rancor was fueled by Twitter, and the instant gratification of Twitter attracts trolls, bullies, and bigots just as easily as it attracts politicians, celebrities, comedians, and sports fans.

Services such as Twitter, Facebook, YouTube, and Instagram are designed through a mix of look and feel, algorithmic wizardry, and psychological techniques to hang on to users for as long as possible—which helps the services sell more advertisements and make more money. Toxic political discourse and online harassment are unintended side effects of the economic-driven urge to keep users engaged no matter what.

Keeping users’ eyeballs on their screens requires endless hours of multivariate testing, user research, and algorithm refinement. For instance, Casey Newton of tech publication The Verge notes that Google Brain, Google’s AI division, plays a key part in generating YouTube’s video recommendations.

“Before, if I watch this video from a comedian, our recommendations were pretty good at saying, here’s another one just like it,” Jim McFadden, the technical lead for YouTube recommendations, told Newton. “But the Google Brain model figures out other comedians who are similar but not exactly the same—even more adjacent relationships. It’s able to see patterns that are less obvious.”
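The “adjacent relationships” McFadden describes are what embedding-based recommenders capture: items are mapped to vectors, and candidates are ranked by vector similarity, so related-but-not-identical items still score highly. The toy sketch below illustrates the idea; the channel names and three-dimensional vectors are invented for illustration (real systems learn far higher-dimensional embeddings with neural networks).

```python
import math

# Hypothetical 3-dimensional "taste" embeddings for a few channels.
# In a production system these vectors would be learned from watch history.
EMBEDDINGS = {
    "comedian_a": (0.9, 0.1, 0.0),
    "comedian_b": (0.8, 0.3, 0.1),  # similar but not identical
    "late_night": (0.6, 0.5, 0.2),  # adjacent: comedy-flavored talk show
    "cooking":    (0.0, 0.2, 0.9),  # unrelated
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def recommend(watched: str, k: int = 2):
    """Rank all other items by similarity to what the user just watched."""
    target = EMBEDDINGS[watched]
    scored = sorted(
        ((cosine(target, vec), name)
         for name, vec in EMBEDDINGS.items() if name != watched),
        reverse=True,
    )
    return [name for _, name in scored[:k]]
```

Under these made-up vectors, watching `comedian_a` surfaces `comedian_b` first but also the adjacent `late_night`, which a duplicate-matching recommender would miss.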

A never-ending flow of content that is interesting without being repetitive is harder to resist. With users glued to online services, addiction and other behavioral problems occur to an unhealthy degree. According to a 2016 poll by nonprofit research company Common Sense Media, 50% of American teenagers believe they are addicted to their smartphones.

This pattern is extending into the workplace. Seventy-five percent of companies told research company Harris Poll in 2016 that two or more hours a day are lost in productivity because employees are distracted. The number one reason? Cellphones and texting, according to 55% of those companies surveyed. Another 41% pointed to the internet.

Tristan Harris, a former design ethicist at Google, argues that many product designers for online services try to exploit psychological vulnerabilities in a bid to keep users engaged for longer periods. Harris refers to an iPhone as “a slot machine in my pocket” and argues that user interface (UI) and user experience (UX) designers need to adopt something akin to a Hippocratic Oath to stop exploiting users’ psychological vulnerabilities.

In fact, there is an entire school of study devoted to “dark UX”: small design tweaks that nudge users toward more profitable behavior. These can be as innocuous as a “Buy Now” button in a visually pleasing color, or as controversial as Facebook’s 2012 experiment, in which the company tweaked its algorithm to show a randomly selected group of almost 700,000 users (who had not given their permission) newsfeeds that skewed more positive for some and more negative for others, in order to gauge the impact on their emotional states, according to an article in Wired.

As computers, smartphones, and televisions come ever closer to convergence, these issues matter increasingly to businesses. Some of the universal side effects of addiction are lost productivity at work and poor health. Businesses should offer training and help for employees who can’t stop checking their smartphones.

Mindfulness-centered mobile apps such as Headspace, Calm, and Forest offer one way to break the habit. Users can also choose to break internet addiction by going for a walk, turning their computers off, or using tools like StayFocusd or Freedom to block addictive websites or apps.

Most importantly, companies in the business of creating tech products need to design software and hardware that discourages addictive behavior. This means avoiding bad designs that emphasize engagement metrics over human health. A world of advertising preroll showing up on smart refrigerator touchscreens at 2 a.m. benefits no one.

According to a 2014 study in Cyberpsychology, Behavior and Social Networking, approximately 6% of the world’s population suffers from internet addiction to one degree or another. As more users in emerging economies gain access to cheap data, smartphones, and laptops, that percentage will only increase. For businesses, getting a head start on stopping internet addiction will make employees happier and more productive.

About the Authors

Maurizio Cattaneo is Director, Delivery Execution, Energy, and Natural Resources, at SAP.

David Delaney is Global Vice President and Chief Medical Officer, SAP Health.

Volker Hildebrand is Global Vice President for SAP Hybris solutions.

Neal Ungerleider is a Los Angeles-based technology journalist and consultant.

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

Farewell Pentaho


Dear Kettle friends,

12 years ago I joined a wonderful team of people at Pentaho who thought they could make a real change in the world of business analytics. At that point I had recently open sourced my own data integration tool (then still called ‘ETL’), Kettle, and so I joined in the role of Chief Architect of Data Integration. The title sounded great, and the job included everything from writing articles (and a book) to massive amounts of coding, testing, software releases, giving support, doing training, workshops, … In other words, life was simply doing everything I possibly and impossibly could to make our software succeed when deployed by our users. With Kettle now one of the most popular data integration tools on the planet, I think it’s safe to say that this goal has been reached and that it’s time for me to move on.

I don’t just want to announce my exit from Pentaho/Hitachi Vantara; I would also like to thank all the people involved in making our success happen. First and foremost, I want to express my gratitude to the founders (Richard, Doug, James, Marc, …) for including a crazy Belgian like myself on the team, but I also want to extend my warmest thanks to everyone I got to become friends with at Pentaho for their always positive and constructive attitude. Without exaggeration, I can say it’s been a lot of fun.

I would also explicitly like to thank the whole community of users of Kettle (now called Pentaho Data Integration). Without your invaluable support in the form of new plugins, bug reports, documentation, forum posts, talks, … we could never have pulled off what we did in the past 12 years! I hope we will continue to meet at one of the many popular community events.

Finally I want to thank everyone at Hitachi and Hitachi Vantara for being such a positive and welcoming group of people. I know that Kettle is used all over Hitachi and I’m quite confident this piece of software will not let you down any time soon.

Now I’m going to go skiing for a week and when I get back it’s time to hunt for a new job. I can’t wait to see what impossible problems need solving out there…



Matt Casters on Data Integration

Pig Farts


Pig squeezes out gas while making room for food.


October 09, 2009

