Tag Archives: Challenges

Expert Interview (Part 2): James Kobielus on Reasons for Data Scientist Insomnia including Neural Network Development Challenges

In the first half of our two-part conversation with Wikibon lead analyst James Kobielus (@jameskobielus), he discussed the incredible impact of machine learning in helping organizations make better business decisions and be more productive. In today’s Part 2, he addresses what aspects of machine learning should be keeping data scientists up at night. (Hint: neural networks)

Several Challenges Involved with Developing Neural Networks

Developing these algorithms is not without its challenges, Kobielus says.

The first major challenge is finding data.

Algorithms can’t do magic unless they’ve been “trained.” And in order to train them, the algorithms require fresh data. But acquiring this training data set is a big hurdle for developers.

For eCommerce sites, this is less of a problem – they have their own data in the form of transaction histories, site visits and customer information that can be used to train the model and determine how predictive it is.


But the process of amassing those training data sets when you don’t have data is trickier – developers have to rely upon commercial data sets that they’ve purchased or open source data sets.

After getting the training data, which might come from a dozen different sources, the next challenge is aggregating it so the data can be harmonized with a common set of variables. Another challenge is having the ability to cleanse data to make sure it’s free of contradictions and inconsistencies. All this takes time and resources in the form of databases, storage, processing and data engineers. This process is expensive but essential. (For more on this, read Uniting Data Quality and Data Integration)
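To make the aggregation and cleansing steps concrete, here is a minimal pandas sketch; the source names, columns, and rules are invented purely for illustration:

```python
import pandas as pd

# Two hypothetical source extracts with inconsistent schemas.
web = pd.DataFrame({"cust_id": [1, 2], "purchase_amt": ["10.5", "20"], "region": ["NE", "ne"]})
store = pd.DataFrame({"customer": [2, 3], "amount": [15.0, None], "region": ["NE", "SW"]})

# Harmonize to a common set of variables.
web = web.rename(columns={"cust_id": "customer", "purchase_amt": "amount"})
combined = pd.concat([web, store], ignore_index=True)

# Cleanse: consistent types and casing, drop records with missing amounts.
combined["amount"] = pd.to_numeric(combined["amount"], errors="coerce")
combined["region"] = combined["region"].str.upper()
combined = combined.dropna(subset=["amount"])
```

Real pipelines add entity resolution, deduplication and validation rules on top of this, which is where the databases, storage and data engineers come in.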

Third, organizations need data scientists, who are expensive resources. They need to find enough people to manage the whole process – from building to training to evaluating to governing.

“Finding the right people with the right skills, recruiting the right people is absolutely essential,” Kobielus says.

Before jumping into machine learning, organizations should also make sure it makes sense for their business strategies.

Industries like finance and marketing have made a clear case for implementing Big Data. In finance, it allows firms to do high-level analysis to detect things like fraud. And in marketing, for instance, CMOs found it useful to develop algorithms that let them conduct sentiment analysis on social media.

There are a lot of uses for it to be sure, Kobielus says, but there are methods for deriving insights from data that don’t involve neural networks. It’s up to the business to determine whether using neural networks is overkill for their purposes.

“It’s not the only way to skin these cats,” he says.

If you already have the tools in place, then it probably makes sense to keep using them. Or, if you find traditional tools can’t address needs like transcription or facial recognition, then it probably makes sense to go to a newer form of machine learning.

What Should Really Be Keeping Data Scientists Up at Night 

While those in the tech industry might be fretting over whether AI will displace the gainfully employed or that there’s a skills deficit in the field, Kobielus has other worries related to data science.

For one, the algorithms used for machine learning and AI are really complex and they drive so many decisions and processes in our lives.

“What if something goes wrong? What if a self-driving vehicle crashes? What if the algorithm does something nefarious in your bank account? How can society mitigate the risks?” Kobielus asks.

When there’s a negative outcome, the question is: who’s responsible? The person who wrote the algorithm? The data engineer? The business analyst who defined the features?

These are the questions that should keep data scientists, businesses, and lawyers up at night. And the answers aren’t clear-cut.

In order to start answering some of these questions, there needs to be algorithmic transparency, so that there can be algorithmic accountability.

Ultimately, everyone is responsible for the outcome.

There’s a huge legal gray area when it comes to machine learning because the models used are probabilistic and you can’t predict every single execution path for a given probabilistic application built on ML.


“There’s a limit beyond which you can anticipate the particular action of a particular algorithm at a particular time,” Kobielus says.

For algorithmic accountability, there need to be audit trails. But an audit log for any given application has the potential to be larger than all the databases on Earth. Not just that, but how would you roll it up into a coherent narrative to hand to a jury?
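As a sketch of what one entry in such an audit trail might capture (the fields and names here are illustrative, not any particular framework's):

```python
import hashlib
import json
import time

def audit_record(model_version: str, features: dict, prediction) -> dict:
    """Capture enough context to reconstruct one automated decision later."""
    record = {
        "ts": time.time(),
        "model_version": model_version,
        "features": features,
        "prediction": prediction,
    }
    # A digest over the record makes individual entries tamper-evident.
    record["digest"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

# Hypothetical usage: log every scoring call alongside the model version.
entry = audit_record("credit-risk-v3", {"income": 52000, "utilization": 0.4}, "approve")
```

Even a record this small, written for every scoring call of a high-volume application, shows why the storage and summarization problems Kobielus describes are real.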

“Algorithmic accountability should keep people up at night,” he says.

Just as he said concerns about automation are overblown, Kobielus says it’s also unnecessary to worry that there aren’t enough skilled data scientists working today.

Data science is getting easier.

In the early days of the web, developers had to know underlying protocols like HTTP, but today nobody needs to worry about the protocol plumbing anymore. It will be the same for machine learning, Kobielus says. Increasingly, the underlying complexity is being abstracted away by higher-level tools that are more user-friendly.

“More and more, these things can be done by average knowledge workers, and it will be executed by the underlying infrastructure,” he says.

Does Kobielus worry about the job security of data scientists then? Not really. He believes data science automation tools will allow data scientists to do more with less, and hopefully free them to develop their skills in more challenging and creative realms.

For 5 key trends to watch for in the next 12 months, check out our new report: 2018 Big Data Trends: Liberate, Integrate & Trust


Syncsort + Trillium Software Blog

The Four Challenges to Growth – and How Buster + Punch is Tackling Them

Posted by David Turner, Senior Marketing Director, EMEA, Oracle NetSuite

When Martin Preen was brought in by the founders of Buster + Punch, his task was simply expressed, if challenging to deliver. As CEO, his mission was to help build the “world’s first home fashion label” and plot a path to international expansion.

Launched in an East London garage by industrial designer Massimo ‘Buster’ Minale, Buster + Punch creates cutting-edge products, ranging from lighting and home hardware to custom motorcycles. Since 2013, it has become a “cult-favourite interior design brand.”

Critical acclaim was followed by financial success. Turnover has doubled every year, and the company now has operations in seven office locations across three continents. But growth hasn’t been straightforward. Like any start-up, it has had to manage the multiple complexities that come with international expansion, an increase in personnel and a change in expectations.

Preen outlined the four key challenges to growth and how to tackle them:

1. Managing cash

Preen, a former investment banker, said: “Most people who do a start-up know that you’ve got to keep an eye on the cash or you’re screwed.” In other words, cash is king. But managing the cash is not just about reconciling the bank account – it’s about having a complete understanding of the supply chain, payment terms and many other aspects of business finance. Understanding requires visibility.

2. Reading the data

“If cash is king, data is, too,” said Preen. To manage cash requires reading the data. “One of the big challenges for us was that while in London we had a system that was just okay; in the US we had nothing; and in Asia we just had spreadsheets. Trying to pull something together to see where we were took an age,” explained Preen. “If you haven’t got that visibility about where you are, it’s exceptionally difficult to make really, really good decisions.”

Earlier this year, Preen and his team selected and implemented NetSuite OneWorld to give it a real-time view across its operations from manufacturing and warehouses to supply chain and customer service. The implementation of OneWorld – which supports 190 currencies, more than 20 languages, automated tax calculation and reporting in more than 100 countries, and customer transactions in more than 200 countries – means Buster + Punch can more accurately forecast international product demand, scaling its manufacturing and delivery processes accordingly.

3. People and operational control

As a start-up grows, the culture begins to change. “The core people at the beginning know each other, they do everything together and they know what they have got to do,” explained Preen. He calls those early recruits “X-Men of business”: people with a clear major talent who are put in a position to multitask at a stage where getting the job done is more important than defining precise roles and responsibilities. As the business expands, however, this ad hoc approach becomes incompatible with long-term growth. While the founding members of the team try to hold on to the old way of doing things, “the new people don’t quite know what … their remit is.” It is adapting to this new “business-as-usual” that is most difficult for employees and business leaders alike.

Preen said NetSuite’s capabilities will “make us smarter.” It means using workflow features to define the team’s roles and responsibilities. It means being able to manage all operations on a single cloud platform. And it means being able to draw on data from manufacturing plants in Asia, physical locations in Sweden and the UK and retail channels across the world. Where a lack of visibility once made it difficult for the company to make good decisions, this newfound global clarity is game-changing.

“We had a whole hairball infrastructure of systems and spreadsheets across various sites,” recalled Preen. “In order to bring that together we wanted to look at a technology that would run in all of our different locations and be able to consolidate that data across three legal entities as well. And we came across NetSuite. We loved the dashboard views and the workflow reporting.”

4. Keeping it fun

In pursuit of business discipline, organisations can lose the spark of ingenuity that made them successful in the first place. “The challenge for me,” said Preen, “is how do we put a level of control into the structure to allow the business to grow but also keep it fun?” For fun, read product innovation and what Preen called, “involving ourselves in things that are a little bit of rock and roll.” This includes motorcycle customisation, joining forces with Rolls Royce to create a contemporary version of the Edison light bulb and an association with the Q Awards, not just as a sponsor but as the maker of the awards themselves.

Learn more about NetSuite for the retail industry.

Posted on Mon, December 11, 2017 by NetSuite


The NetSuite Blog

Modelling Deposit Price Elasticity: Challenges and Approach

This is the third in a series of blogs on deposit pricing, focusing on price elasticity modelling approaches and challenges.

The goal of any deposit price optimization solution is to make data-driven pricing decisions to manage portfolio balances and trade these off against the associated costs. These solutions should allow a pricing manager to prepare and run what-if analyses to assess the impact of pricing strategies, competitor price actions or movements in central bank base rates.

Fundamental to these solutions are price-elasticity models that capture and predict customer behavior as a response to pricing and other non-price factors. In this blog, we discuss the challenges and solution approaches for the development of robust price-elasticity models.

Price Response Signal

Price sensitivity can be measured with regard to product rate, market ranking, competitor rates or even interest paid on other products in the portfolio. The modelling challenge is not only to measure price sensitivity accurately, but to capture as much richness in pricing behavior as possible, e.g., variations in price sensitivity across different segments.

For example, ultra-low central bank base rates have become the new normal in the US and Europe, and it can be a challenge to isolate the price signal. There may be limited price variation in the modelling period, or one-time shocks caused by non-price factors (such as Brexit) may drive balance flows. Previous experience with price-sensitivity models across different markets and interest-rate environments, along with careful transformation of price-related variables, provides an anchor to avoid misdiagnosing the price signal.
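As a toy illustration of isolating a price signal (a sketch of one common approach, not FICO's methodology): regress log balance inflows on the spread between the product rate and the competitor average, so the slope approximates an elasticity. All numbers below are invented.

```python
import numpy as np

# Illustrative monthly observations: spread of product rate over the
# competitor average (percentage points) and net balance inflow.
spread = np.array([-0.30, -0.10, 0.00, 0.15, 0.25, 0.40])
inflow = np.array([12.1, 14.0, 15.2, 17.8, 19.5, 23.0])

# Semi-log model: log(inflow) = a + b * spread, so b approximates the
# proportional change in flows per percentage-point change in spread.
b, a = np.polyfit(spread, np.log(inflow), 1)
print(f"elasticity: {b:.2f}; a +10bp spread lifts flows ~{100 * (np.exp(b * 0.10) - 1):.1f}%")
```

Production models add many non-price factors (seasonality, marketing, base-rate moves) and are fitted per segment, which is exactly where the data-granularity trade-offs discussed below come in.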

Cannibalization

In order to truly understand the impact of pricing decisions, the flows between individual products and segments must be understood. This allows the prediction of the balance distribution by product and requires a tried and tested process for inferring balance flows. Understanding and predicting balance flows across products in turn allows pricing managers to assess the impact of cannibalization on pricing decisions and overall portfolio revenue.

Data Availability & Granularity

One of the biggest challenges for the development of price sensitivity models is data availability. The development of overly complex models with insufficient data results in the often-cited adage of garbage-in, garbage-out, so it is important to ensure that the modelling approach is appropriate to the data available.

Finer granularity, where more modelling segments are used, introduces richer pricing behavior and allows greater insights into pricing decisions. This must be balanced with the need for sufficient deposits within each segment to ensure a statistically significant price signal can be measured.

Another consideration concerns the model resolution: An established deposit portfolio with relatively stable balances might only re-price a few times a year and a monthly granularity is sufficient. On the other hand, a smaller bank that needs to make shorter-term funding decisions might buy balances through the introduction of market-leading rates. This type of pricing action typically occurs more frequently and would therefore need a weekly granularity.

Modelling Methodologies

Depending on a bank’s objectives and the availability of data, there are a number of modelling approaches that might be considered.

[Figure: Deposit Price Elasticity Modelling Approaches]

Model Management

It is vitally important that all stakeholders from the pricing analyst to senior management have confidence in pricing models. The best way to achieve this is to ensure full transparency of the models, methodology and the factors that drive predictions.

With a full understanding of model drivers, the business can justify why particular pricing decisions are made to internal stakeholders and also answer any challenges posed by external regulators that a “black box” solution could not.

An ongoing model management process should monitor model performance against established accuracy thresholds to guide model recalibration and redevelopment. This ensures that models respond to changes in the market and provides a mechanism whereby the impact of recent pricing decisions feeds back into the price-sensitivity models.

Conclusion

In order to develop deposit price sensitivity models, careful consideration is needed to evaluate an organization’s requirements and mitigate some of the challenges discussed here. The deployment of such models offers substantial improvements over approaches that rely exclusively on expert judgment. It allows banks to more effectively manage their deposit portfolio, better understand their customers’ behavior and preferences, and identify new revenue opportunities. It is also a critical component on the journey towards full price optimization, where banks derive all the benefits that data-driven models have to offer.


FICO

Overcome 3 Challenges Faced by Professional Services Firms with a Specific CRM Solution


The flexibility of a CRM solution such as Microsoft Dynamics 365 allows you to turn it into the heart of your organization’s operations. You can then use it as the entryway from which all your activities can be completed. It can also integrate a solution specifically designed for your industry, allowing you to complete all your tasks from a single interface that’s intuitive, familiar and user-friendly. With a solution that incorporates their specific business rules and needs, professional services firms can overcome several of the challenges they encounter daily.

1. Have only one version of the truth across your organization

One of the challenges that many professional services firms must face is the use of non-integrated applications and Excel spreadsheets. The result: resources don’t always have access to the latest information or the most recent version of a document, which can lead to discrepancies regarding facts or activity status. By centralizing all your information with a specific solution integrated with your CRM system, not only do you ensure access for resources across your entire organization, you also offer them complete visibility into all aspects of your business.

As such, your mandates, invoices and WIPs, as well as customer activities and all communications exchanged with them, can be accessed from a single source updated in real time. The associates have increased visibility on their mandates and their clients, as well as on their complete history and profitability. By having all this data on hand, they can also better serve their clients, exceed their expectations and ensure their satisfaction.

2. Shorten your invoicing cycle

Maintain the balance between finances and operations by leveraging the flexibility of your CRM system and the control of your ERP solution. With a complete, bidirectional integration, information is transferred from one system to the other in real time. This not only avoids constant back-and-forth communications between your associates and the accounting team, but it also eliminates manual data entry and reduces the risk of errors and double entries.

Various invoice templates can be used and combined from the CRM, then transferred to the accounting system. This allows you to create invoices that meet all your clients’ criteria and that fit the requirements of the different services offered by your organization, while also adapting them to your organization’s branding. Lastly, having access to your financial data in real time means that you can invoice faster. This way, you improve your cash flow while also reducing the risk of write-offs and bad debts.

3. Optimize the utilization and productivity rates of your resources

By using workflows to automate certain processes, you can facilitate your associates’ tasks and activities. For example, they can be notified to approve the budget when a new mandate is created. Among the specific functionalities of our solution, a resource planning tool also makes it easy to view which resources are currently assigned and which are available to work on new mandates. This tool thus makes it possible to optimize resource utilization and better plan hiring. Lastly, the solution allows you to budget the potential billable hours per resource per month and to track them throughout the year.

One last challenge that many professional services firms must face is the difficulty of staying on top of technological trends to meet the constantly evolving needs of their clients. Scalable and flexible, the Microsoft Dynamics 365 platform makes it possible to catch up on this technological lag, while a specific solution such as JOVACO’s accounting solution improves efficiency and performance across your entire organization.

By JOVACO Solutions, Microsoft Dynamics 365 specialist in Quebec


CRM Software Blog | Dynamics 365

Expert Interview (Part 2): Robert Corace of SoftServe Discusses Digital Transformation Challenges

In Part 1, Robert talked about the critical importance of digital transformation for organizations. In Part 2, he highlights the results of recent research on digital transformation, with a focus on the common challenges organizations face. He also provides some examples of innovative strategies that companies such as Netflix and Amazon are using to tackle these digital transformation challenges.

What are the most common frustrations or challenges your clients come to you to solve? How do you help them?

As our recent research showed, security is the chief concern and the biggest challenge to solve. As I already mentioned, data mining and analytics is also a struggle for many, as well as experience design and organizational inflexibility.

On a higher, more strategic level, though, many companies understand that they need to transform, but they lack clear vision into what areas they need to focus on, where to start, and how to move forward with their transformative initiatives in the fastest and most efficient way. That’s why we have SoftServe Labs, in fact, to help our clients with research and proof-of-concept before they make large investments.

I wouldn’t describe these as purely challenges, though, as these companies also stand to gain a lot. Digital asset management, cloud computing, mobile technologies, and the Internet of Things (IoT), approached as part of digital transformation efforts, can bring a lot of benefits to consumer-facing operations, retail, the finance and banking sector, and many others.

What are some of the digital transformation challenges facing organizations today in harnessing their data?

Numerous security breaches and hacking attacks serve as proof that we haven’t yet solved the security challenges facing all businesses, small and large. Privacy is also a big concern, especially when it comes to access to personal data in healthcare, education, state and government organizations, etc.


Data security is one of the common digital transformation challenges facing businesses today.

Another aspect is legacy software that cannot handle the volumes of data requiring daily processing; it can’t all be replaced within a couple of days, given the financial and resource strain that would put on organizations. Artificial Intelligence (AI), though hugely promising, is not yet at the stage where it can automate decision-making for truly impactful processes beyond initial analysis. However, it can facilitate and speed them up considerably.

Related: The Future of Artificial Intelligence in Sales and Account Planning

It is also very important to remember that harnessing data is not an end in itself, but rather a means to help organizations achieve their business and strategic goals. And the consumer – a human being – is at the heart of all of it. So, no purely technical solution, no matter how powerful or innovative, will bring true value if it’s not applied correctly or as part of a well-thought-out, comprehensive strategy.

What organizations do you feel have been especially forward-thinking and/or innovative at leveraging their data? What can we learn from them to solve our own digital transformation challenges?

Well, when it comes to leveraging data and personalization, giants like Google and Netflix immediately come to mind. It’s interesting how, by thoroughly analyzing data and making the right predictions, Netflix managed to reduce the range of content available on its platform while improving customer satisfaction.

And look how Amazon is using data from different sensors and machine learning to disrupt the grocery business with their “Amazon Go” retail store.

When it comes to attracting new customers, which is also a challenge for traditional companies, I like the example of L’inizio Pizza Bar in New York. Their manager decided to attract Pokémon Go players to the place, and he spent just $10 to have Pokémon characters lured to his restaurant. Business went up by 75 percent. So, it’s never about technology or software only; it’s about innovative thinking and human ingenuity.

How can organizations manage their data assets more efficiently and effectively? What should their data management strategies include?

With the “Internet of Everything” and connected devices blurring the lines between office and home, and between working hours and the workplace, data assets need to be secure and protected, accessible from a variety of devices and in different formats, and easily searchable.


For some organizations – most likely in the government sector, finance, insurance, etc. – it will require switching to an intranet to secure their assets from any unauthorized access or potential loss of information. For others, where remote access from any place, at any time, is a higher priority, omni-channel access and compatibility will be the key focus. The challenges here include the already-discussed legacy software and integration issues.

According to IDC research, by 2022 almost all data – 93 percent – in the digital universe will be unstructured. It will also, most likely, be content in different formats, including audio and video files, images, interactive content, etc. Not only will this require greater storage and processing capacity, it also means that this data will need to be easily searchable and user-friendly if we want it to be used rather than merely stored.

When it comes to customer-facing content, another requirement is consistency across various channels. On the whole, when it comes to data, the current leaders in asset management are platform providers. With these platforms, instead of building their own solutions from scratch, which is a costly and time-consuming approach, businesses can quickly customize and scale a ready-made solution, adding and discarding additional features depending on their current needs.

What are some of the most exciting Big Data trends or innovations you’re following right now? Why do they interest you?

SoftServe’s 2016 Big Data survey showed 62 percent of organizations expect to implement machine learning by 2018, so apparently machine learning and Artificial Intelligence are huge Big Data trends we’re following right now. Chatbots as a customer-facing form of AI technology have gained momentum and are quickly becoming an area of huge interest for all kinds of user support activities.

But from a high-level perspective it’s nothing new, really. Once again, it’s all focused around building a better, different experience for a consumer, so machine learning, AI and chatbots are in fact just new(ish), possibly more effective ways to achieve the same goal: leveraging data to improve customer experience and stay relevant in an increasingly competitive marketplace.

For more on challenges driving digital transformations, download the eBook “Hadoop Perspectives for 2017” which offers an in-depth look at the results of Syncsort’s annual Hadoop survey, including five trends to watch for in 2017.


Syncsort blog

Overcoming Technical Challenges in Large Mainframe to Hadoop Implementations

In the past couple of years, we’ve seen tremendous growth in demand for Syncsort DMX-h from large enterprises looking to leverage their Mainframe data on Hadoop. Syncsort DMX-h is currently the recognized leader in bringing Mainframe data into Big Data platforms as part of Hadoop implementation.

A large percentage of these customers have come to us after a recommendation from one of the Hadoop vendors or from a Systems Integrator, the hands-on people who really get the technical challenges of this type of integration. Some people might wonder why the leaders in this industry recommend Syncsort, rather than some of the giants in Data Integration. What kind of problems trip those giants up, and what does Syncsort have under the hood that makes the hands-on Hadoop professionals recommend it?

To get an idea of the challenges that Syncsort takes on, let’s look at some Hadoop implementation examples and what technical problems they faced.

Use Case 1: How Mainframe Data Can Go Native on Hadoop

A large enterprise was masking sensitive data on the Mainframe to use in mainframe application testing. This used a lot of expensive CPU time. They were looking to save the cost of MIPS by doing the masking on Hadoop, then bringing the masked data back to the Mainframe.

DMX-h has a unique capability to move Mainframe data to be processed in Hadoop without doing any conversion. By preserving the original Mainframe EBCDIC encoding and data formats such as Packed-Decimal, DMX-h could process the files in Hadoop without any loss of precision or changes in data structure. The original COBOL copybook was used by DMX-h to understand the data, including Occurs Depending On and Redefines definitions. After masking sensitive parts on Hadoop, that same copybook still matched the data when it was moved back to the Mainframe.


Keeping the data in its original Mainframe format also helped with governance, compliance and auditing. There is no need to justify data changes when the data is simply moved to another processing system, not altered.

Use Case 2: Efficiently Combining Complex Mainframe Data with Diverse Data Sources for Big Data Analytics on Hadoop

Another common use case Syncsort DMX-h handles routinely is moving Mainframe data to Hadoop so it can be combined with other data sources and be part of analytics run in MapReduce or Spark. We’ve seen plenty of situations where a customer has tried different solutions to ingest Mainframe data but hit roadblocks that stalled its Hadoop implementation, and where DMX-h easily handled the situation.

In one example, a customer had nested Occurs Depending On clauses in their COBOL copybooks. Their existing solution was expanding each of the records to the maximum occurrence length. This was causing the size of data to blow up hugely on Hadoop, and the ingestion was painfully slow. With DMX-h, the records were kept at their intended size and the ingestion proceeded at a much faster rate. The ingestion completed in under 10 minutes, as opposed to 4 hours with the existing solution.

In another example, the customer had VSAM data with many segments. Their existing solution was reading the VSAM file once for each segment, which was taking a lot of time and using up expensive processing on the Mainframe. If a VSAM file had, for instance, 5 segments, the other solution had to read that same file 5 times over. DMX-h reads the VSAM file only once, partitioning the data by segment ID using a field on the copybook and then splitting the data into separate segments, allowing each segment to be mapped by a different copybook for further processing.


Use Case 3: Simplifying Mainframe Data Access

We have helped users who discovered that accessing Mainframe data isn’t as easy as other data formats. For example, they might have found that their tool doesn’t handle Packed-Decimal or LOW-VALUES in COBOL, or cannot transfer data securely from the Mainframe using Connect:Direct or FTPS.
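To make the Packed-Decimal point concrete, here is a minimal Python sketch of decoding a COMP-3 field and an EBCDIC string. It illustrates the formats only and is not DMX-h code:

```python
import codecs

def unpack_comp3(raw: bytes, scale: int = 0):
    """Decode an IBM packed-decimal (COMP-3) field: two BCD digits
    per byte, with the final low nibble holding the sign."""
    digits, sign = [], 1
    for i, byte in enumerate(raw):
        hi, lo = byte >> 4, byte & 0x0F
        if i < len(raw) - 1:
            digits += [hi, lo]
        else:
            digits.append(hi)               # last byte: one digit plus sign nibble
            sign = -1 if lo == 0x0D else 1  # 0xD negative; 0xC/0xF positive
    value = sign * int("".join(map(str, digits)))
    return value / 10**scale if scale else value

# A PIC S9(5)V99 COMP-3 value of +12345.67 is stored in the 4 bytes below.
print(unpack_comp3(bytes([0x12, 0x34, 0x56, 0x7C]), scale=2))  # 12345.67
# EBCDIC text decodes with the cp037 codec rather than ASCII.
print(codecs.decode(b"\xc8\x85\x93\x93\x96", "cp037"))         # Hello
```

Tools that assume ASCII text files trip over exactly these encodings, which is why LOW-VALUES (binary zeros) and packed fields cause so much trouble downstream.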

Another big challenge during Hadoop implementation is getting the COBOL copybook to match the Mainframe data. The original COBOL developers may be long gone, and there is no one around who can fix the copybook. Our Professional Services team sees that all the time, and helps enterprises correct copybook problems so the value of the data can be fully realized.

In these and other practical situations, customers have told us they chose DMX-h as the faster, easier to use, and more cost-effective solution. Mainframe sources can be accessed and processed with DMX-h with just a few mouse clicks. The simplicity makes hard problems look easy.

We’ve spent a lot of time listening to our customers’ pains and aspirations for their Mainframe data. Our 40+ years of Mainframe and Data Integration expertise, combined with active involvement in the Apache Hadoop community, resulted in DMX-h’s strong Mainframe Access and Integration functionality. Another thing that sets us apart is our mature set of security-related features. Our ease of integration with Kerberos and easy handling of encrypted or compressed data make a huge difference in production implementations.

For some more specifics, read where Arnie Farrelly, VP of Global Support and Services, has recounted some of what his team has experienced when working with large enterprises trying to leverage their Mainframe data in Hadoop.

And here is a short video demonstrating how to use DMX-h to access Mainframe data and integrate it with Hadoop.


Syncsort blog

Lack of skills remains one of the biggest data science challenges


The shortage of trained data scientists remains at the top of the list of data science challenges enterprises face today, according to new research from TDWI.

“We hear constantly that the biggest challenge any organization faces in a data science environment is finding the right skills,” said Fern Halper, vice president and research director at TDWI, based in Renton, Wash., in a webcast highlighting the recent findings.

The research surveyed more than 300 enterprises on their experiences with big data and data science. The two topics are increasingly blending into one another, as organizations need workers who can make sense of the massive troves of data they’ve been collecting over the last several years.

Other common challenges cited by survey respondents included lack of clarity around who owns certain data, lack of understanding of big data tools, lack of enterprise architectures needed to harness big data, security and privacy concerns, and insufficient governance protocols.

The technology piece appeared particularly vexing. Halper said many new tools have emerged within the last few years, including Hadoop, Spark, Python and others, and enterprises are having a hard time staying on top of all these rapid developments.

“Some respondents thought there were too many technologies and a lot of hype out there,” she said. “They didn’t know what to do. Others thought things are changing so fast, and they’re not nimble enough to maintain the best architectures.”

For now, enterprises are sticking with the tools they know, in part, to address these data science challenges. About 80% of survey respondents said they currently use data warehouse tools as their primary data source. For analysis, simple query and data visualization tools top the list of most used. Over the next two years, data warehouse tools will remain prominent, but the top two technologies enterprises plan to add during that time are Hadoop and open source R.

Halper said the results show clear momentum around unstructured data querying and predictive analytics, including machine learning. At the same time, it doesn’t look like these emerging tools and practices are going to completely unseat more tried-and-true tools in the foreseeable future.

“The data warehouse isn’t going away, but it’s being supplemented by these other types of platforms, creating an ecosystem,” she said. “There’s lots of momentum around predictive analytics. It’s a hot technology, and machine learning is making it hotter.”



SearchBusinessAnalytics: BI, CPM and analytics news, tips and resources

Can Artificial Intelligence Address The Challenges Of An Aging Population?

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s app store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. All the transactions that take place are recorded in this ledger, which lives on many computers. High-grade encryption makes it nearly impossible for someone to cheat the system.
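A toy Python sketch of that hash-chaining idea (illustrative only, not any real blockchain's code): each block commits to the previous block's hash, so altering any part of history breaks verification.

```python
import hashlib
import json
import time

def block_hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    body = {"prev": chain[-1]["hash"] if chain else "0" * 64,
            "ts": time.time(), "txs": transactions}
    chain.append({**body, "hash": block_hash(body)})

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False                      # block contents were altered
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False                      # chain linkage broken
    return True

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
assert verify(chain)
chain[0]["txs"][0]["amount"] = 500  # tamper with history...
assert not verify(chain)            # ...and verification fails
```

Real systems distribute this ledger across many nodes and add consensus rules, which is what makes cheating impractical rather than merely detectable.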

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (for which the fund is named). The DAO was a crowdfunded venture capital fund using cryptocurrency for investments and run through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, the DAO revealed that an individual exploited a vulnerability in the company’s smart contract code to take control of nearly $60 million worth of the company’s digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.
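As a small illustration of building on such an API, here is a Python sketch against the publicly documented Google Maps Geocoding endpoint; the key is a placeholder and error handling is minimal:

```python
import requests

def geocode(address: str, api_key: str) -> tuple[float, float]:
    """Look up latitude/longitude for an address via the Google Geocoding API."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    result = resp.json()["results"][0]   # take the best match
    location = result["geometry"]["location"]
    return location["lat"], location["lng"]

# lat, lng = geocode("1600 Amphitheatre Parkway, Mountain View, CA", "YOUR_API_KEY")
```

A few lines like these are all it takes to location-enable an app, which is precisely why opening the API unleashed hundreds of inventive products.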

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for all citizens, as reported by BBC News. Switzerland was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world. Other countries—including the United States—are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because a major political party did not introduce it, and citizens are only beginning to consider the potential implications of digitization on their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether or not state-funded support is the answer is just one of the questions that must be answered. The votes and pilots underway make it clear that governments will have to respond with some policy measures. The question is: What will those measures be? The larger impact of mass job displacement, what future employment conditions might look like, and what the responsibilities of institutions are in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, that term first coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses (cochlear implants) have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer to help people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave-readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their device the best. “Instead of celebrating the human body moving under its own power,” said a recent article in the IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.”

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

Navigate the challenges of location-based technologies


As the popularity of Google Maps, Snapchat and Pokémon Go has made clear, location-based technologies have revolutionized how people use mobile phones. By 2018, there will be 2.5 billion smartphone users worldwide, according to eMarketer Inc. They will be navigating to restaurants, Snapchatting their vacations, checking movies playing at the nearest cinema, RSVPing on Meetup or looking up product availability at area Walmart stores.

According to the Pew Research Center, 90% of smartphone owners use them to get information related to their location. Now, companies are starting to tap the location-based services (LBS) on consumers’ phones in order to send them relevant offers and messages. According to the Location Based Marketing Association’s (LBMA) latest trends report, 75% of marketers believe that location-based marketing is an important business issue for 2016.

Location-based technologies use wireless transmission, such as between a smartphone and a beacon or Wi-Fi access point, to pinpoint a user’s location. A mobile app that has access to a phone’s location services can provide navigation as well as location-specific content, like coupons or product reviews. In fact, there are myriad uses for LBS in marketing, advertising and customer engagement.

MEPLAN GmbH, a German trade services provider, created the expoNAVIGATION app to help conference attendees find their favorite exhibitors faster. A user searches a database of exhibitors and enters a list of those he wants to visit. The app uses beacons to plot the shortest route around the floor, thus optimizing the customer’s time and, hopefully, boosting sales for exhibitors.

The Aquarium of Western Australia (AQWA), based in Hillarys, Australia, has a mobile app that guides visitors along several themed tours (like the Shipwreck Coast or Animal Extremes tours) with interactive activities for kids. Created by Apps Ppl, a developer of cloud-based mobile apps, the AQWA app is part of a larger mobile app — “Everythere” — that tourists use to research activities around Perth and to get directions.

“Location is the only piece of data that lets you know where people are throughout the day so you can engage [with] them.” – Asif Khan, president, Location Based Marketing Association

But it’s the ability to combine location data with other customer information collected from a mobile app or store loyalty program that has the biggest potential for personalizing how businesses engage with their customers. People often use smartphones to browse the web, make online purchases and pay at the checkout counter. That information, and more, can be accessed and used to understand the buyer’s habits and shopping preferences.

“[LBS] can tie your entire marketing strategy together,” said Asif Khan, president of the LBMA. “We’re using location to blend brick and mortar with e-commerce and digital.”

The data can also be aggregated and combined with other consumer information and used to analyze consumer behaviors and trends.
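As a purely hypothetical sketch of that kind of combination, the following joins app-reported store visits with loyalty-program attributes; all record shapes and field names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical records: store-visit events from the app, and loyalty profiles.
visits = [
    {"customer_id": 1, "store": "Downtown", "minutes": 12},
    {"customer_id": 1, "store": "Downtown", "minutes": 35},
    {"customer_id": 2, "store": "Airport", "minutes": 4},
]
loyalty = {1: {"segment": "frequent buyer"}, 2: {"segment": "new customer"}}

# Aggregate visit behavior per customer, then enrich with loyalty attributes.
stats = defaultdict(lambda: {"visits": 0, "total_minutes": 0})
for v in visits:
    s = stats[v["customer_id"]]
    s["visits"] += 1
    s["total_minutes"] += v["minutes"]

for cid, s in stats.items():
    profile = loyalty.get(cid, {})
    print(cid, s, profile.get("segment", "unknown"))
```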

Investment in location-based technologies will rise significantly in the near future, according to Juniper Research Ltd., based in Hampshire, England. The firm expects the LBS market to jump from $12.2 billion in 2014 to $43.3 billion by 2019, with context-aware mobile services being the main driving force.
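Taken at face value, those figures imply a compound annual growth rate of roughly 29% over the five-year span:

```python
# Implied compound annual growth rate from $12.2B (2014) to $43.3B (2019).
cagr = (43.3 / 12.2) ** (1 / 5) - 1
print(f"{cagr:.1%}")  # ~28.8%
```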

Potholes in the road to location-based technologies

Despite that interest, businesses have been cautious about adopting LBS. Forrester Research’s report “Make Smart Wireless Location Technology Decisions” found that just 3% of businesses surveyed were actually using beacons, while another 11% were piloting them.

This is because location-based services are actually a collection of technologies — some old, some new and others still in research and development. It’s a rapidly developing market, one that can quickly confound an unsuspecting marketer or business owner.

The most common technologies are the following:

GPS: GPS is commonly used for maps and other outdoor navigation. Its signals can’t penetrate walls and aren’t accurate enough for use in small spaces, so it isn’t used for indoor tracking.

Wi-Fi: Already generally offered free to customers, Wi-Fi is often used for simple tracking of store traffic. One downside is that it can’t reliably identify unique individuals on iPhones, which randomize their Wi-Fi hardware addresses, and accuracy can vary.

Bluetooth beacons: Used primarily for indoor navigation, beacons have an accuracy range of one to several meters, depending on the product and whether fingerprinting or triangulation techniques are also used (a minimal sketch of the triangulation approach appears below). Available in sizes as small as a matchbook, they can be hidden behind pictures or in lights. According to Forrester’s June 2016 report “Make Smart Wireless Location Decisions,” some beacons can send only basic data and can’t accept updates, while others are more flexible. Beacons also require maintenance.

“You have to place them, manage them, change batteries in them. They’re operationally intensive,” explained Andre Kindness, principal analyst at Forrester, which estimated an annual maintenance cost of $240,000 for keeping beacons operational in a 1,000-square-meter store versus $60,000 for Wi-Fi. On the other hand, Wi-Fi nodes run $900 apiece, according to the same study.
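The triangulation mentioned above typically works in two steps: each beacon’s received signal strength (RSSI) is converted to an approximate distance with a log-distance path-loss model, and three or more such distances are then intersected. A minimal sketch; the calibration values (RSSI at one meter, path-loss exponent) are illustrative and vary by beacon and environment:

```python
def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model: estimate distance (meters) from an RSSI
    reading. tx_power is the calibrated RSSI at 1 m; n is the path-loss
    exponent (roughly 2 in open space, higher indoors)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(b1, b2, b3):
    """Each argument is ((x, y), distance). Subtracting the first circle
    equation from the other two yields a 2x2 linear system in (x, y)."""
    (x1, y1), d1 = b1
    (x2, y2), d2 = b2
    (x3, y3), d3 = b3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    return ((c1 * a22 - c2 * a12) / det, (a11 * c2 - a21 * c1) / det)

print(rssi_to_distance(-73))              # ≈ 5.0 m with the default calibration
print(trilaterate(((0, 0), 5.0),
                  ((10, 0), 8.06),
                  ((0, 10), 6.71)))       # ≈ (3.0, 4.0)
```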

Two emerging technologies are visible light, emitted via smart LED lights and potentially capable of tracking with an accuracy of a few centimeters, and ultrasound waves, which send out chirps that are picked up by a phone’s audio receiver.

Both are promising technologies, said Bruce Krulwich, chief analyst at New York-based Grizzly Analytics Ltd., which specializes in mobile technologies like location-based services and IoT. Both have drawbacks as well, however. Visible light requires businesses to replace their lighting with smart LED lights and controllers, while ultrasound can be confused by other ambient sounds.

With more work, however, both may achieve performance better than today’s Bluetooth-based solutions, said Krulwich.

Fear of Big Brother

Consumer concerns about privacy are another challenge for location-based technologies. To work well, the apps need access to the phone’s location services, and consumers can deny apps access. According to the Pew Research Center, over one-third of adults and 46% of teenagers turn off location services due to fears over privacy.

The Los Angeles County Museum of Art’s (LACMA) mobile app requires both Bluetooth and location services to be fully functional. To encourage participation, the museum precedes the usual terse system prompts asking users whether to share location data with its own friendlier message: “Would you like to receive location-based data?”

“We created it to be less intimidating,” said Tomas Garcia, digital media product developer at LACMA. He added that it’s also faster than using individual service prompts.
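This “ask in your own words first” pattern, sometimes called a pre-permission prompt, is straightforward to implement: show the app’s own dialog, and only trigger the operating system’s permission prompt if the user has already agreed. The sketch below uses hypothetical placeholder functions for the UI and OS calls, since the real ones depend entirely on the platform:

```python
def show_custom_dialog(message):
    # Hypothetical stand-in for the app's own, friendlier dialog.
    return input(f"{message} (y/n) ").strip().lower() == "y"

def request_os_location_permission():
    # Hypothetical stand-in for the platform's system permission prompt,
    # which on most mobile OSes can effectively be asked only once.
    print("[system] Allow this app to access your location?")
    return True

def enable_location_features():
    """Pre-permission flow: ask in the app's own words first, and spend the
    one-shot OS prompt only on users who have already said yes."""
    if show_custom_dialog("Would you like to receive location-based data?"):
        return request_os_location_permission()
    return False  # declined in-app; the real OS prompt was never consumed
```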

In fact, users will give up their location data for the right incentive. A Forrester brief, “Fuel Contextual Marketing with Location Data,” found that most phone owners would do so in exchange for benefits like discounts, a loyalty program, rewards for visiting a store or to get navigational aid in a store.

A location-based future?

Location is rapidly becoming one of the most valuable pieces of information for consumer marketing.

Social media platforms and many mobile apps, like weather and news, already collect a user’s location information, said Khan, and they make it available in aggregate form to advertisers.

“We describe location as the cookie for the physical world. Location is the only piece of data that lets you know where people are throughout the day so you can engage [with] them,” he said.

Behavioral data, such as location, is fast becoming more important in marketing than standard demographics, said Maribel Lopez, head of mobile marketing research firm Lopez Research, based in San Francisco.

“Behavioral demographics are much more interesting,” said Lopez. “You may find that Android users do this, iOS users do that, people who are in my store 10 minutes do one thing, while those who stay much longer do another.”
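Dwell-time segmentation of the kind Lopez describes can be as simple as bucketing the elapsed time between a visitor’s first and last in-store pings. A minimal sketch over made-up data, with an arbitrary 10-minute threshold:

```python
# Made-up location pings: (visitor_id, minutes since store entry detected).
pings = [("a", 0), ("a", 3), ("a", 9), ("b", 0), ("b", 22), ("c", 1)]

# Track each visitor's first and last ping times.
first_last = {}
for vid, t in pings:
    lo, hi = first_last.get(vid, (t, t))
    first_last[vid] = (min(lo, t), max(hi, t))

def segment(dwell_minutes):
    return "browser" if dwell_minutes < 10 else "engaged shopper"

for vid, (lo, hi) in first_last.items():
    print(vid, hi - lo, segment(hi - lo))
# a: 9 min -> browser; b: 22 min -> engaged shopper; c: 0 min -> browser
```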

But that also means marketers must think through their messages to target customers’ preferences without making them feel stalked.

“The greatest challenge will be to figure out what messages you want to send and where,” said Lopez. “It’s the most contextual engagement you can have, and people expect engagement, not a generic message or coupon.” 


Cognitive computing applications present new business challenges


The Dutch banking and financial services company Rabobank had been doing predictive analytics for several years, maintaining models that projected which customers might default on their mortgage or abandon an account application midway through the process.

These were mainly built around structured data analysis, but in 2015, the analytics team started receiving more requests for models that would have to delve into unstructured data. Enter cognitive computing applications.
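Rabobank’s work, described below, was built on IBM’s Watson platform. As a generic illustration of what a free-text model of this kind does, here is a minimal text classifier using scikit-learn, with invented training examples rather than anything from Rabobank; a real model would need far more data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented customer feedback labeled by topic, purely for illustration.
texts = [
    "The mortgage application form kept timing out",
    "I could not finish my account application online",
    "Great service, the advisor explained everything clearly",
    "Very happy with how quickly my question was answered",
]
labels = ["complaint", "complaint", "praise", "praise"]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["the online form crashed before I could finish"]))
# likely ['complaint']
```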

In a presentation at IBM’s World of Watson event in Las Vegas, Muriel Serrurier Schepper, business consultant for advanced analytics at Rabobank, said her team started seeing more demand for analytics of customer feedback and mortgage files, both of which are unstructured free-text files. This demanded a new technology — in this case, IBM’s Watson platform — but also a new approach to thinking about analytics.

Cognitive apps are more a business than a tech challenge

A big part of the growing excitement behind artificial intelligence (AI) and cognitive computing is the possibility of simple tools. IBM and other vendors, such as Microsoft, Amazon and Facebook, are introducing tools that promise cognitive capabilities without requiring the user to be an expert programmer or data scientist. So, in some cases, the biggest hurdles to overcome for enterprises looking to adopt cognitive tools are organizational, not technical.

At Rabobank, Serrurier Schepper helped create a group dedicated to all things AI. This group identifies use cases, selects technologies and shares information about projects throughout the organization. She said this kind of centralized approach is important in this time of rapid development of AI tools.

Around the time the group was created, Serrurier Schepper said she was seeing lines of business talking with technology vendors about AI tools. She was worried that the situation could lead to duplicated work, siloed projects and inflated expectations. The centralized approach has helped mitigate these problems by allowing people who know the technology to lead projects.

“AI is everywhere, and people think it’s so fantastic. And these companies, including IBM, come in and then you go to do a project and see that it’s not really that great yet,” Serrurier Schepper said. “You have to train a model, and it takes time.”

After building a centralized AI unit, teams should look for quick wins and then publicize their success, Serrurier Schepper said. Models may take a long time to train, but once they’re delivering strong results, sharing this with the rest of the company can help build support for future initiatives.

“We’ve been a bit under the radar at our company,” Serrurier Schepper said. “We didn’t want everyone wondering what’s going on as our models learn. But now, we’re out there telling people what we can do.”

Pick the right use case for cognitive tools

Choosing the right use cases for cognitive computing applications is also important. There is a general notion that AI software can perform just about any task. And while that may be the ultimate goal of the technology, today’s tools are still a long way from it. Enterprises need to identify business problems where the technology is competent, and that’s not always a simple proposition.

“Sometimes, it’s tough, because with most of these problems, you first have to get your hands dirty in the data before seeing if there’s any value there,” Gianluca Antonini, director of IT at Swiss Re, said in a presentation at the conference. “The business case isn’t always clear.”

The Zurich-based reinsurance company has rolled out Watson-based cognitive applications in several areas, including enterprise search, claims processing and chatbots that serve as internal assistants, as well as customer service agents.

To address the challenge of identifying viable use cases, Swiss Re has set up an internal analytics consulting service within the IT department. This team sits down with lines of business to determine whether they have a need for cognitive computing applications, and together the two sides of the house shape projects. The analytics team then develops proof-of-concept projects, which may or may not succeed. Projects have to perform well in this stage before being rolled out in a wider production environment.

Businesses need to accept a certain rate of failure with AI projects if they want to reap the benefits of the technology. In a panel discussion, Abhijit Singh, head of the business technology group at ICICI Bank Ltd., said experimental projects are just as likely to turn into tools that reshape the business as they are to fail completely. Enterprises need to continue supporting these initiatives, even if the immediate payoff isn’t clear.

“If you get too hung up on ROI, you’ll never do anything,” he said.

Singh and his team recently developed a chatbot for use in customer service built around a homegrown cognitive system. He said it started as an experiment and nobody bothered projecting a return.

It’s now performing well and is an example of the kind of benefit businesses can reap if they remain open to the possibilities of cognitive computing applications. “There are things coming that are opening up huge opportunities,” he said.
