Tag Archives: Challenges

Data Integration Challenges in a Siloed World

Modernizing your infrastructure and operations means breaking down “silos” — including those that hamper your data integration processes. Here’s a look at the silos that typically stand in the way of data integration, and what businesses can do to tear them down.

“Breaking down silos” is lingo that you’ll hear if you follow the DevOps movement. Part of the point of DevOps is to eliminate the barriers that typically prevent different types of IT staff — such as the development and the IT Ops teams — from collaborating with each other.


According to the DevOps mantra, everyone should work in close coordination, rather than having each team operate in its own silo. Silos stifle innovation, make automation difficult and lead to the loss of important information as data is transferred between teams.

Silos and Data Integration


Although the DevOps movement focuses primarily on software development and delivery, rather than data operations, the value of tearing down silos is not limited to the world of DevOps.

The same concept can be applied to data integration operations — especially if you embrace the DataOps mantra, which extends DevOps thinking into the world of data management.

After all, the typical business’s data operations tend to be “siloed” for a number of reasons:

  • Businesses have many discrete sources of data, ranging from server and network logs to website logs, digital transaction records and perhaps even ink-and-paper files. Because each type of data originates from a different source, building a single process for integrating all of the data into a common pipeline can be challenging.
  • Different teams within the organization tend to produce different types of information, and they may not share it with each other. For example, your marketing department might store data related to customer engagement in a recent offline ad campaign. That data might be able to provide insights to website designers, who could use it to determine how best to engage customers online. But chances are that your marketing team and web design team don’t communicate much, or share data with each other on a routine basis.
  • Modern IT infrastructure tends to be quite diverse. Your business may use a combination of on-premises and cloud servers, with multiple operating systems, Web servers and so on in the mix. Each part of your infrastructure produces logs and other types of data in its own format, making integration hard.
  • Recent data and historical data are usually stored in different locations. You probably archive historical data after a certain period, for example, possibly off-site. In contrast, real-time data and near-real-time data may remain in their original data sources. This also leads to data silos because data is stored in different places depending on its age.

How do you destroy these silos? The short answer is data integration.


Data integration refers to the process of collecting data from disparate sources and turning it into actionable information. Data integration typically involves data aggregation, data transformations, and data visualizations.

The end goal of data integration is to turn data into information that can deliver meaningful insights to people reviewing it, without forcing them to think too hard in order to find those insights.
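To make that flow concrete, here is a minimal sketch of the aggregate, transform, and summarize steps using Python and pandas. The file names and column names (web_logs.csv, crm_orders.json, customer_id, traffic_source, order_value) are hypothetical, invented for the example rather than taken from any particular product.

```python
import pandas as pd

# Collect data from two disparate, hypothetical sources.
web_logs = pd.read_csv("web_logs.csv")        # e.g. website logs
crm_orders = pd.read_json("crm_orders.json")  # e.g. digital transaction records

# Transform: harmonize the join key so the two sources line up.
web_logs["customer_id"] = web_logs["customer_id"].astype(str)
crm_orders["customer_id"] = crm_orders["customer_id"].astype(str)

# Integrate: combine both sources into a single customer-level view.
combined = web_logs.merge(crm_orders, on="customer_id", how="inner")

# Turn the integrated data into actionable information: revenue by traffic source.
insight = combined.groupby("traffic_source")["order_value"].sum().sort_values(ascending=False)
print(insight)
```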

If your data remains siloed, data integration is nigh impossible. You can’t achieve easy, obvious insights if you have to look in multiple places to find data, or if complementary data produced by different teams or machines is never combined.

Nor can you integrate data effectively if your organization is siloed. Everyone should be able to view collective data insights in the same place and at the same time, so that they can also communicate in the same place and at the same time.

Conclusion

In short, your data, and the teams that work with it, are probably spread across disparate locations. In other words, they are siloed.

Deriving maximum value from your data requires breaking down those silos through data integration.

For more Big Data insights, check out our recent webcast, 2018 Big Data Trends: Liberate, Integrate, and Trust Your Data, to see what every business needs to know in the upcoming year.


Syncsort + Trillium Software Blog

Challenges For Sustainable Digital Ethics

In 2013, several UK supermarket chains discovered that products they were selling as beef were actually made at least partly—and in some cases, entirely—from horsemeat. The resulting uproar led to a series of product recalls, prompted stricter food testing, and spurred the European food industry to take a closer look at how unlabeled or mislabeled ingredients were finding their way into the food chain.

By 2020, a scandal like this will be eminently preventable.

The separation between bovine and equine will become immutable with Internet of Things (IoT) sensors, which will track the provenance and identity of every animal from stall to store, adding the data to a blockchain that anyone can check but no one can alter.

Food processing companies will be able to use that blockchain to confirm and label the contents of their products accordingly—down to the specific farms and animals represented in every individual package. That level of detail may be too much information for shoppers, but they will at least be able to trust that their meatballs come from the appropriate species.

The Spine of Digitalization


Keeping food safer and more traceable is just the beginning, however. Improvements in the supply chain, which have been incremental for decades despite billions of dollars of technology investments, are about to go exponential. Emerging technologies are converging to transform the supply chain from tactical to strategic, from an easily replicable commodity to a new source of competitive differentiation.

You may already be thinking about how to take advantage of blockchain technology, which makes data and transactions immutable, transparent, and verifiable (see “What Is Blockchain and How Does It Work?”). That will be a powerful tool to boost supply chain speed and efficiency—always a worthy goal, but hardly a disruptive one.

However, if you think of blockchain as the spine of digitalization and technologies such as AI, the IoT, 3D printing, autonomous vehicles, and drones as the limbs, you have a powerful supply chain body that can leapfrog ahead of its competition.

What Is Blockchain and How Does It Work?

Here’s why blockchain technology is critical to transforming the supply chain.

Blockchain is essentially a sequential, distributed ledger of transactions that is constantly updated on a global network of computers. The ownership and history of a transaction is embedded in the blockchain at the transaction’s earliest stages and verified at every subsequent stage.

A blockchain network uses vast amounts of computing power to encrypt the ledger as it’s being written. This makes it possible for every computer in the network to verify the transactions safely and transparently. The more organizations that participate in the ledger, the more complex and secure the encryption becomes, making it increasingly tamperproof.
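To illustrate the idea of a ledger of linked, verifiable records, here is a toy Python sketch in which each block stores the hash of the previous block, so altering any earlier record invalidates everything after it. Real blockchain networks add consensus, digital signatures, and proof-of-work or similar mechanisms on top of this, and the shipment data below is made up.

```python
import hashlib
import json
import time

def make_block(transactions, previous_hash):
    # A block records its transactions plus the hash of the block before it.
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# Build a small chain of hypothetical shipment records.
genesis = make_block([{"item": "cattle #1042", "from": "farm", "to": "processor"}], "0" * 64)
block_2 = make_block([{"item": "cattle #1042", "from": "processor", "to": "retailer"}], genesis["hash"])

# Verification: any tampering with the genesis block changes its hash,
# which would no longer match what block_2 recorded.
assert block_2["previous_hash"] == genesis["hash"]
```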

Why does blockchain matter for the supply chain?

  • It enables the safe exchange of value without a central verifying partner, which makes transactions faster and less expensive.
  • It dramatically simplifies recordkeeping by establishing a single, authoritative view of the truth across all parties.
  • It builds a secure, immutable history and chain of custody as different parties handle the items being shipped, and it updates the relevant documentation.
  • By doing these things, blockchain allows companies to create smart contracts based on programmable business logic, which can execute themselves autonomously and thereby save time and money by reducing friction and intermediaries.

Hints of the Future

In the mid-1990s, when the World Wide Web was in its infancy, we had no idea that the internet would become so large and pervasive, nor that we’d find a way to carry it all in our pockets on small slabs of glass.

But we could tell that it had vast potential.

Today, with the combination of emerging technologies that promise to turbocharge digital transformation, we’re just beginning to see how we might turn the supply chain into a source of competitive advantage (see “What’s the Magic Combination?”).

What’s the Magic Combination?

Those who focus on blockchain in isolation will miss out on a much bigger supply chain opportunity.

Many experts believe emerging technologies will work with blockchain to digitalize the supply chain and create new business models:

  • Blockchain will provide the foundation of automated trust for all parties in the supply chain.
  • The IoT will link objects—from tiny devices to large machines—and generate data about status, locations, and transactions that will be recorded on the blockchain.
  • 3D printing will extend the supply chain to the customer’s doorstep with hyperlocal manufacturing of parts and products with IoT sensors built into the items and/or their packaging. Every manufactured object will be smart, connected, and able to communicate so that it can be tracked and traced as needed.
  • Big Data management tools will process all the information streaming in around the clock from IoT sensors.
  • AI and machine learning will analyze this enormous amount of data to reveal patterns and enable true predictability in every area of the supply chain.

Combining these technologies with powerful analytics tools to predict trends will make lack of visibility into the supply chain a thing of the past. Organizations will be able to examine a single machine across its entire lifecycle and identify areas where they can improve performance and increase return on investment. They’ll be able to follow and monitor every component of a product, from design through delivery and service. They’ll be able to trigger and track automated actions between and among partners and customers to provide customized transactions in real time based on real data.

After decades of talk about markets of one, companies will finally have the power to create them—at scale and profitably.

Amazon, for example, is becoming as much a logistics company as a retailer. Its ordering and delivery systems are so streamlined that its customers can launch and complete a same-day transaction with a push of a single IP-enabled button or a word to its ever-attentive AI device, Alexa. And this level of experimentation and innovation is bubbling up across industries.

Consider manufacturing, where the IoT is transforming automation inside already highly automated factories. Machine-to-machine communication is enabling robots to set up, provision, and unload equipment quickly and accurately with minimal human intervention. Meanwhile, sensors across the factory floor are already capable of gathering such information as how often each machine needs maintenance or how much raw material to order given current production trends.

Once they harvest enough data, businesses will be able to feed it through machine learning algorithms to identify trends that forecast future outcomes. At that point, the supply chain will start to become both automated and predictive. We’ll begin to see business models that include proactively scheduling maintenance, replacing parts just before they’re likely to break, and automatically ordering materials and initiating customer shipments.

Italian train operator Trenitalia, for example, has put IoT sensors on its locomotives and passenger cars and is using analytics and in-memory computing to gauge the health of its trains in real time, according to an article in Computer Weekly. “It is now possible to affordably collect huge amounts of data from hundreds of sensors in a single train, analyse that data in real time and detect problems before they actually happen,” Trenitalia’s CIO Danilo Gismondi told Computer Weekly.

The project, which is scheduled to be completed in 2018, will change Trenitalia’s business model, allowing it to schedule more trips and make each one more profitable. The railway company will be able to better plan parts inventories and determine which lines are consistently performing poorly and need upgrades. The new system will save €100 million a year, according to ARC Advisory Group.

New business models continue to evolve as 3D printers become more sophisticated and affordable, making it possible to move the end of the supply chain closer to the customer. Companies can design parts and products in materials ranging from carbon fiber to chocolate and then print those items in their warehouse, at a conveniently located third-party vendor, or even on the client’s premises.

In addition to minimizing their shipping expenses and reducing fulfillment time, companies will be able to offer more personalized or customized items affordably in small quantities. For example, clothing retailer Ministry of Supply recently installed a 3D printer at its Boston store that enables it to make an article of clothing to a customer’s specifications in under 90 minutes, according to an article in Forbes.

This kind of highly distributed manufacturing has potential across many industries. It could even create a market for secure manufacturing for highly regulated sectors, allowing a manufacturer to transmit encrypted templates to printers in tightly protected locations, for example.

Meanwhile, organizations are investigating ways of using blockchain technology to authenticate, track and trace, automate, and otherwise manage transactions and interactions, both internally and within their vendor and customer networks. The ability to collect data, record it on the blockchain for immediate verification, and make that trustworthy data available for any application delivers indisputable value in any business context. The supply chain will be no exception.


Blockchain Is the Change Driver

The supply chain is configured as we know it today because it’s impossible to create a contract that accounts for every possible contingency. Consider cross-border financial transfers, which are so complex and must meet so many regulations that they require a tremendous number of intermediaries to plug the gaps: lawyers, accountants, customer service reps, warehouse operators, bankers, and more. By reducing that complexity, blockchain technology makes intermediaries less necessary—a transformation that is revolutionary even when measured only in cost savings.

“If you’re selling 100 items a minute, 24 hours a day, reducing the cost of the supply chain by just $1 per item saves you more than $52.5 million a year,” notes Dirk Lonser, SAP go-to-market leader at DXC Technology, an IT services company. “By replacing manual processes and multiple peer-to-peer connections through fax or e-mail with a single medium where everyone can exchange verified information instantaneously, blockchain will boost profit margins exponentially without raising prices or even increasing individual productivity.”

But the potential for blockchain extends far beyond cost cutting and streamlining, says Irfan Khan, CEO of supply chain management consulting and systems integration firm Bristlecone, a Mahindra Group company. It will give companies ways to differentiate.

“Blockchain will let enterprises more accurately trace faulty parts or products from end users back to factories for recalls,” Khan says. “It will streamline supplier onboarding, contracting, and management by creating an integrated platform that the company’s entire network can access in real time. It will give vendors secure, transparent visibility into inventory 24×7. And at a time when counterfeiting is a real concern in multiple industries, it will make it easy for both retailers and customers to check product authenticity.”

Blockchain allows all the critical steps of the supply chain to go electronic and become irrefutably verifiable by all the critical parties within minutes: the seller and buyer, banks, logistics carriers, and import and export officials. Although the key parts of the process remain the same as in today’s analog supply chain, performing them electronically with blockchain technology shortens each stage from hours or days to seconds while eliminating reams of wasteful paperwork. With goods moving that quickly, companies have ample room for designing new business models around manufacturing, service, and delivery.


Challenges on the Path to Adoption

For all this to work, however, the data on the blockchain must be correct from the beginning. The pills, produce, or parts on the delivery truck need to be the same as the items listed on the manifest at the loading dock. Every use case assumes that the data is accurate—and that will only happen when everything that’s manufactured is smart, connected, and able to self-verify automatically with the help of machine learning tuned to detect errors and potential fraud.

Companies are already seeing the possibilities of applying this bundle of emerging technologies to the supply chain. IDC projects that by 2021, at least 25% of Forbes Global 2000 (G2000) companies will use blockchain services as a foundation for digital trust at scale; 30% of top global manufacturers and retailers will do so by 2020. IDC also predicts that by 2020, up to 10% of pilot and production blockchain-distributed ledgers will incorporate data from IoT sensors.

Despite IDC’s optimism, though, the biggest barrier to adoption is the early stage level of enterprise use cases, particularly around blockchain. Currently, the sole significant enterprise blockchain production system is the virtual currency Bitcoin, which has unfortunately been tainted by its associations with speculation, dubious financial transactions, and the so-called dark web.

The technology is still in a sufficiently early stage that there’s significant uncertainty about its ability to handle the massive amounts of data a global enterprise supply chain generates daily. Never mind that it’s completely unregulated, with no global standard. There’s also a critical global shortage of experts who can explain emerging technologies like blockchain, the IoT, and machine learning to nontechnology industries and educate organizations in how the technologies can improve their supply chain processes. Finally, there is concern about how blockchain’s complex algorithms gobble computing power—and electricity (see “Blockchain Blackouts”).

Blockchain Blackouts

Blockchain is a power glutton. Can technology mediate the issue?

A major concern today is the enormous carbon footprint of the networks creating and solving the algorithmic problems that keep blockchains secure. Although virtual currency enthusiasts claim the problem is overstated, Michael Reed, head of blockchain technology for Intel, has been widely quoted as saying that the energy demands of blockchains are a significant drain on the world’s electricity resources.

Indeed, Wired magazine has estimated that by July 2019, the Bitcoin network alone will require more energy than the entire United States currently uses and that by February 2020 it will use as much electricity as the entire world does today.

Still, computing power is becoming more energy efficient by the day and sticking with paperwork will become too slow, so experts—Intel’s Reed among them—consider this a solvable problem.

“We don’t know yet what the market will adopt. In a decade, it might be status quo or best practice, or it could be the next Betamax, a great technology for which there was no demand,” Lonser says. “Even highly regulated industries that need greater transparency in the entire supply chain are moving fairly slowly.”

Blockchain will require acceptance by a critical mass of companies, governments, and other organizations before it displaces paper documentation. It’s a chicken-and-egg issue: multiple companies need to adopt these technologies at the same time so they can build a blockchain to exchange information, yet getting multiple companies to do anything simultaneously is a challenge. Some early initiatives are already underway, though:

  • A London-based startup called Everledger is using blockchain and IoT technology to track the provenance, ownership, and lifecycles of valuable assets. The company began by tracking diamonds from mine to jewelry using roughly 200 different characteristics, with a goal of stopping both the demand for and the supply of “conflict diamonds”—diamonds mined in war zones and sold to finance insurgencies. It has since expanded to cover wine, artwork, and other high-value items to prevent fraud and verify authenticity.
  • In September 2017, SAP announced the creation of its SAP Leonardo Blockchain Co-Innovation program, a group of 27 enterprise customers interested in co-innovating around blockchain and creating business buy-in. The diverse group of participants includes management and technology services companies Capgemini and Deloitte, cosmetics company Natura Cosméticos S.A., and Moog Inc., a manufacturer of precision motion control systems.
  • Two of Europe’s largest shipping ports—Rotterdam and Antwerp—are working on blockchain projects to streamline interaction with port customers. The Antwerp terminal authority says eliminating paperwork could cut the costs of container transport by as much as 50%.
  • The Chinese online shopping behemoth Alibaba is experimenting with blockchain to verify the authenticity of food products and catch counterfeits before they endanger people’s health and lives.
  • Technology and transportation executives have teamed up to create the Blockchain in Transport Alliance (BiTA), a forum for developing blockchain standards and education for the freight industry.

It’s likely that the first blockchain-based enterprise supply chain use case will emerge in the next year among companies that see it as an opportunity to bolster their legal compliance and improve business processes. Once that happens, expect others to follow.


Customers Will Expect Change

It’s only a matter of time before the supply chain becomes a competitive driver. The question for today’s enterprises is how to prepare for the shift. Customers are going to expect constant, granular visibility into their transactions and faster, more customized service every step of the way. Organizations will need to be ready to meet those expectations.

If organizations have manual business processes that could never be automated before, now is the time to see if it’s possible. Organizations that have made initial investments in emerging technologies are looking at how their pilot projects are paying off and where they might extend to the supply chain. They are starting to think creatively about how to combine technologies to offer a product, service, or business model not possible before.

A manufacturer will load a self-driving truck with a 3D printer capable of creating a customer’s ordered item en route to delivering it. A vendor will capture the market for a socially responsible product by allowing its customers to track the product’s production and verify that none of its subcontractors use slave labor. And a supermarket chain will win over customers by persuading them that their choice of supermarket is also a choice between being certain of what’s in their food and simply hoping that what’s on the label matches what’s inside.

At that point, a smart supply chain won’t just be a competitive edge. It will become a competitive necessity. D!


About the Authors

Gil Perez is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Tom Raftery is Global Vice President, Futurist, and Internet of Things Evangelist, at SAP.

Hans Thalbauer is Senior Vice President, Internet of Things and Digital Supply Chain, at SAP.

Dan Wellers is Global Lead, Digital Futures, at SAP.

Fawn Fitter is a freelance writer specializing in business and technology.

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

Expert Interview (Part 2): James Kobielus on Reasons for Data Scientist Insomnia including Neural Network Development Challenges

In the first half of our two-part conversation with Wikibon lead analyst James Kobielus (@jameskobielus), he discussed the incredible impact of machine learning in helping organizations make better business decisions and be more productive. In today’s Part 2, he addresses what aspects of machine learning should be keeping data scientists up at night. (Hint: neural networks)

Several Challenges Involved with Developing Neural Networks

Developing these algorithms is not without its challenges, Kobielus says.

The first major challenge is finding data.

Algorithms can’t do magic unless they’ve been “trained.” And in order to train them, the algorithms require fresh data. But acquiring this training data set is a big hurdle for developers.

For eCommerce sites, this is less of a problem – they have their own data in the form of transaction histories, site visits and customer information that can be used to train the model and determine how predictive it is.


But the process of amassing those training data sets when you don’t have data is trickier – developers have to rely upon commercial data sets that they’ve purchased or open source data sets.

After getting the training data, which might come from a dozen different sources, the next challenge is aggregating it so the data can be harmonized with a common set of variables. Another challenge is having the ability to cleanse data to make sure it’s free of contradictions and inconsistencies. All this takes time and resources in the form of databases, storage, processing and data engineers. This process is expensive but essential. (For more on this, read Uniting Data Quality and Data Integration)
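As a rough illustration of that aggregation and harmonization step, the sketch below maps two hypothetical source files with different column names and units onto a common schema and applies basic cleansing. The file names, columns, and rules are invented for the example.

```python
import pandas as pd

# Each hypothetical source names the same variables differently.
sources = [
    ("vendor_a.csv", {"cust": "customer_id", "amt_usd": "amount"}),
    ("open_data.csv", {"customer": "customer_id", "amount_cents": "amount"}),
]

frames = []
for path, column_map in sources:
    df = pd.read_csv(path).rename(columns=column_map)
    if path == "open_data.csv":
        df["amount"] = df["amount"] / 100.0  # harmonize units: cents -> dollars
    frames.append(df[["customer_id", "amount"]])

# Aggregate all sources into one training set with a common set of variables.
training = pd.concat(frames, ignore_index=True)

# Basic cleansing: drop duplicates and obviously inconsistent records.
training = training.drop_duplicates()
training = training[training["amount"] >= 0]
```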

Third, organizations need data scientists, who are expensive resources. They need to find enough people to manage the whole process – from building to training to evaluating to governing.

“Finding the right people with the right skills, recruiting the right people is absolutely essential,” Kobielus says.

Before jumping into machine learning, organizations should also make sure it makes sense for their business strategies.

Industries like finance and marketing have made a clear case for themselves in implementing Big Data. In the case of finance, it allows them to do high-level analysis to detect things like fraud. And in marketing, for instance, CMOs have found it useful to develop algorithms that allow them to conduct sentiment analysis on social media.

There are a lot of uses for it to be sure, Kobielus says, but there are methods for deriving insights from data that don’t involve neural networks. It’s up to the business to determine whether using neural networks is overkill for their purposes.

“It’s not the only way to skin these cats,” he says.

If you already have the tools in place, then it probably makes sense to keep using them. Or, if you find traditional tools can’t address needs like transcription or facial recognition, then it probably makes sense to go to a newer form of machine learning.

What Should Really Be Keeping Data Scientists Up at Night 

While those in the tech industry might be fretting over whether AI will displace the gainfully employed or that there’s a skills deficit in the field, Kobielus has other worries related to data science.

For one, the algorithms used for machine learning and AI are really complex and they drive so many decisions and processes in our lives.

“What if something goes wrong? What if a self-driving vehicle crashes? What if the algorithm does something nefarious in your bank account? How can society mitigate the risks?” Kobielus asks.

When there’s a negative outcome, the question asked is who’s responsible. The person who wrote the algorithm? The data engineer? The business analyst who defined the features?

These are the questions that should keep data scientists, businesses, and lawyers up at night. And the answers aren’t clear-cut.

In order to start answering some of these questions, there needs to be algorithmic transparency, so that there can be algorithmic accountability.

Ultimately, everyone is responsible for the outcome.

There’s a huge legal gray area when it comes to machine learning because the models used are probabilistic and you can’t predict every single execution path for a given probabilistic application built on ML.


“There’s a limit beyond which you can anticipate the particular action of a particular algorithm at a particular time,” Kobielus says.

For algorithmic accountability, there need to be audit trails. But an audit log for any given application has the potential to be larger than all the databases on Earth. Not just that, but how would you roll it up into a coherent narrative to hand to a jury?
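One way to picture what a per-decision audit trail might involve is the sketch below: it wraps a hypothetical scoring model and writes one log line per prediction with a timestamp, model version, and a hash of the inputs. This is only an illustration of the bookkeeping required, and it also hints at why such logs grow so quickly.

```python
import hashlib
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="decision_audit.log", level=logging.INFO)

def audited_predict(model, model_version, features):
    """Score one case and append an audit record; `model` and `features` are hypothetical."""
    score = model.predict([list(features.values())])[0]  # assumes a scikit-learn-style model
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_hash": hashlib.sha256(json.dumps(features, sort_keys=True).encode()).hexdigest(),
        "features": features,
        "score": float(score),
    }
    logging.info(json.dumps(record))  # one log line per decision, for later review
    return score
```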

“Algorithmic accountability should keep people up at night,” he says.

Just as he said concerns about automation are overblown, Kobielus says it’s also unnecessary to worry that there aren’t enough skilled data scientists working today.

Data science is getting easier.

Back in the ’90s, developers had to know underlying protocols like HTTP, but today nobody needs to worry about the protocol plumbing anymore. It will be the same for machine learning, Kobielus says. Increasingly, the underlying data is being abstracted away by higher-level tools that are more user friendly.

“More and more, these things can be done by average knowledge workers, and it will be executed by underlying structure,” he says.

Does Kobielus worry about the job security of data scientists then? Not really. He believes data science automation tools will allow data scientists to do more with less and hopefully to develop their skills in more challenging and creative realms.

For 5 key trends to watch for in the next 12 months, check out our new report: 2018 Big Data Trends: Liberate, Integrate & Trust


Syncsort + Trillium Software Blog

The Four Challenges to Growth – and How Buster + Punch is Tackling Them

Posted by David Turner, Senior Marketing Director, EMEA, Oracle NetSuite

When Martin Preen was brought in by the founders of Buster + Punch, his task was simply expressed, if challenging to deliver. As CEO, his mission was to help build the “world’s first home fashion label” and plot a path to international expansion.

Launched in an East London garage by industrial designer Massimo ‘Buster’ Minale, Buster + Punch creates cutting-edge products, ranging from lighting and home hardware to custom motorcycles. Since 2013, it has become a “cult-favourite interior design brand.”

Critical acclaim was followed by financial success. Turnover has doubled every year and the company now has operations in seven office locations across three continents. But growth hasn’t been straightforward. Like any start-up, it has had to manage the multiple complexities that come with international expansion, an increase in personnel and a change in expectations.

Preen outlined the four key challenges to growth and how to tackle them:

1. Managing cash

Preen, a former investment banker, said: “Most people who do a start-up know that you’ve got to keep an eye on the cash or you’re screwed.” In other words, cash is king. But managing the cash is not just about reconciling the bank account – it’s about having a complete understanding of the supply chain, payment terms and many other aspects of business finance. Understanding requires visibility.

2. Reading the data

“If cash is king, data is, too,” said Preen. To manage cash requires reading the data. “One of the big challenges for us was that while in London we had a system that was just okay; in the US we had nothing; and in Asia we just had spreadsheets. Trying to pull something together to see where we were took an age,” explained Preen. “If you haven’t got that visibility about where you are, it’s exceptionally difficult to make really, really good decisions.”

Earlier this year, Preen and his team selected and implemented NetSuite OneWorld to give it a real-time view across its operations from manufacturing and warehouses to supply chain and customer service. The implementation of OneWorld – which supports 190 currencies, more than 20 languages, automated tax calculation and reporting in more than 100 countries, and customer transactions in more than 200 countries – means Buster + Punch can more accurately forecast international product demand, scaling its manufacturing and delivery processes accordingly.

3. People and operational control

As a start-up grows, the culture begins to change. “The core people at the beginning know each other, they do everything together and they know what they have got to do,” explained Preen. He calls those early recruits “X-Men of business,” people with a clear major talent but put in a position to multitask at a stage where getting the job done is more important than defining precise roles and responsibilities. As the business expands, however, this ad hoc approach becomes incompatible with long-term growth. While the founding members of the team try and hold on to the old way of doing things, “the new people don’t quite know what … their remit is.” It is adapting to this new “business-as-usual” that is most difficult for employees and business leaders alike.

Preen said NetSuite’s capabilities have the ability to “make us smarter.” It means using workflow features to define the team’s roles and responsibilities. It means being able to manage all operations on a single cloud platform. And it means being able to draw on data from manufacturing plants in Asia, physical locations in Sweden and the UK and retail channels across the world. Where lack of visibility made it difficult for the company to make good decisions, this newfound global clarity is game changing.

“We had a whole hairball infrastructure of systems and spreadsheets across various sites,” recalled Preen. “In order to bring that together we wanted to look at a technology that would run in all of our different locations and be able to consolidate that data across three legal entities as well. And we came across NetSuite. We loved the dashboard views and the workflow reporting.”

4. Keeping it fun

In pursuit of business discipline, organisations can lose the spark of ingenuity that made them successful in the first place. “The challenge for me,” said Preen, “is how do we put a level of control into the structure to allow the business to grow but also keep it fun?” For fun, read product innovation and what Preen called “involving ourselves in things that are a little bit of rock and roll.” This includes motorcycle customisation, joining forces with Rolls Royce to create a contemporary version of the Edison light bulb and an association with the Q Awards, not just as a sponsor but as the maker of the awards themselves.

Learn more about NetSuite for the retail industry.

Posted on Mon, December 11, 2017 by NetSuite


The NetSuite Blog

Modelling Deposit Price Elasticity: Challenges and Approach

This is the third in a series of blogs on deposit pricing, focusing on price elasticity modelling approaches and challenges.

The goal of any deposit price optimization solution is to make data-driven pricing decisions to manage portfolio balances and trade these off against the associated costs. These solutions should allow a pricing manager to prepare and run what-if analyses to assess the impact of pricing strategies, competitor price actions or movements in central bank base rates.

Fundamental to these solutions are price-elasticity models that capture and predict customer behavior as a response to pricing and other non-price factors. In this blog, we discuss the challenges and solution approaches for the development of robust price-elasticity models.

Price Response Signal

Price sensitivity can be measured with regard to product rate, market ranking, competitor rates or even interest paid on other products in the portfolio. The modelling challenge is not only to measure price sensitivity accurately, but to capture as much richness in pricing behavior as possible, e.g., variations in price sensitivity across different segments.

For example, ultra-low bank base rates have become the new normal in the US and Europe, and it can be a challenge to isolate the price signal. There may be limited price variation in the modelling period, or one-time shocks caused by non-price-related factors (such as Brexit) that drive balance flows. Previous experience with price sensitivity models across different markets and interest rate environments, along with careful transformation of price-related variables, provides an anchor to avoid misdiagnosis of the price signal.
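As a simplified illustration (not FICO’s methodology), the sketch below estimates a crude price-response signal by regressing balance flows on the product rate and a competitor rate. It assumes statsmodels is available and a hypothetical deposit_history.csv with the listed columns.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical history: one row per period and segment.
history = pd.read_csv("deposit_history.csv")  # columns: period, segment, rate, competitor_rate, net_flow

X = sm.add_constant(history[["rate", "competitor_rate"]])
y = history["net_flow"]

model = sm.OLS(y, X).fit()

# The coefficient on `rate` is a crude price-sensitivity estimate:
# the expected change in net balance flow per unit change in the offered rate.
print(model.params["rate"])
```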

Cannibalization

In order to truly understand the impact of pricing decisions, the flows between individual products and segments must be understood. This allows the prediction of the balance distribution by product and requires a tried and tested process for inferring balance flows. Understanding and predicting balance flows across products in turn allows pricing managers to assess the impact of cannibalization on pricing decisions and overall portfolio revenue.

Data Availability & Granularity

One of the biggest challenges for the development of price sensitivity models is data availability. The development of overly complex models with insufficient data results in the often-cited adage of garbage-in, garbage-out, so it is important to ensure that the modelling approach is appropriate to the data available.

Finer granularity, where more modelling segments are used, introduces richer pricing behavior and allows greater insights into pricing decisions. This must be balanced with the need for sufficient deposits within each segment to ensure a statistically significant price signal can be measured.

Another consideration concerns the model resolution: An established deposit portfolio with relatively stable balances might only re-price a few times a year and a monthly granularity is sufficient. On the other hand, a smaller bank that needs to make shorter-term funding decisions might buy balances through the introduction of market-leading rates. This type of pricing action typically occurs more frequently and would therefore need a weekly granularity.

Modelling Methodologies

Depending on a bank’s objectives and the availability of data, there are a number of modelling approaches that might be considered.

[Figure: Deposit price elasticity modelling approaches]

Model Management

It is vitally important that all stakeholders from the pricing analyst to senior management have confidence in pricing models. The best way to achieve this is to ensure full transparency of the models, methodology and the factors that drive predictions.

With a full understanding of model drivers, the business can justify why particular pricing decisions are made to internal stakeholders and also answer any challenges posed by external regulators that a “black box” solution could not.

An ongoing model management process should monitor model performance against established accuracy thresholds to guide model recalibration and redevelopment. This ensures that models respond to changes in the market and provides a mechanism whereby the impact of recent pricing decisions feed back into the price sensitivity models.

Conclusion

In order to develop deposit price sensitivity models, careful consideration is needed to evaluate an organization’s requirements and mitigate some of the challenges discussed here. The deployment of such models offers substantial improvements on approaches that rely exclusively on expert judgment. It allows banks to more effectively manage their deposit portfolio, better understand their customer behavior and preferences, and identify new revenue opportunities. It is also a critical component on the journey towards full price optimization where banks derive all the benefits that data driven models have to offer.


FICO

Overcome 3 Challenges Faced by Professional Services Firms with a Specific CRM Solution


The flexibility of a CRM solution such as Microsoft Dynamics 365 allows you to turn it into the heart of your organization’s operations. You can then use it as the entryway from which all your activities can be completed. It can also integrate a solution specifically designed for your industry, allowing you to complete all your tasks from a single interface that’s intuitive, familiar and user-friendly. With a solution that incorporates their specific business rules and needs, professional services firms can overcome several of the challenges they encounter daily.

1. Have only one version of the truth across your organization

One of the challenges that many professional services firms must face is the utilization of non-integrated applications and Excel spreadsheets. Result: resources don’t always have access to the latest information or the most recent version of a document, which can lead to discrepancies regarding facts or activity status. By centralizing all your information with a specific solution integrated to your CRM system, not only do you ensure access for resources across your entire organization, you also offer them complete visibility on all aspects of your business.

As such, your mandates, invoices and WIPs, as well as customer activities and all communications exchanged with them, can be accessed from a single source updated in real time. The associates have increased visibility on their mandates and their clients, as well as on their complete history and profitability. By having all this data on hand, they can also better serve their clients, exceed their expectations and ensure their satisfaction.

2. Shorten your invoicing cycle

Maintain the balance between finances and operations by leveraging the flexibility of your CRM system and the control of your ERP solution. With a complete, bidirectional integration, information is transferred from one system to the other in real time. This not only avoids constant back-and-forth communications between your associates and the accounting team, but it also eliminates manual data entry and reduces the risk of errors and double entries.

Various invoice templates can be used and combined from the CRM, then transferred to the accounting system. This allows you to create invoices that meet all your clients’ criteria and that fit the requirements of the different services offered by your organization, while also adapting them to your organization’s branding. Lastly, having access to your financial data in real time means that you can invoice faster. This way, you improve your cash flow while also reducing the risk of write-offs and bad debts.

3. Optimize the utilization and productivity rates of your resources

By using workflows to automate certain processes, you can facilitate your associates’ tasks and activities. For example, they can be notified to approve the budget when a new mandate is created. Among the specific functionalities of our solution, a resource planning tool also makes it possible to easily view the resources that are currently assigned and those that are available to work on new mandates. This tool thus makes it possible to optimize resource utilization and better plan hiring. Lastly, the solution allows you to budget the potential for billable hours per resource per month and to track them throughout the year.

One last challenge that many professional services firms must face is the difficulty of staying on top of technological trends to meet the constantly evolving needs of their clients. Scalable and flexible, the Microsoft Dynamics 365 platform makes it possible to catch up on this technological lag, while a specific solution such as JOVACO’s accounting solution improves efficiency and performance across your entire organization.

By JOVACO Solutions, Microsoft Dynamics 365 specialist in Quebec


CRM Software Blog | Dynamics 365

Expert Interview (Part 2): Robert Corace of SoftServe Discusses Digital Transformation Challenges

In Part 1, Robert talked about the critical importance of digital transformation for organizations. In Part 2, he highlights the results of recent research on digital transformation, with a focus on the common challenges organizations face. He also provides some examples of innovative strategies that companies such as Netflix and Amazon are using to tackle these digital transformation challenges.

What are the most common frustrations or challenges your clients are coming to you to solve? How do you help them?

As our recent research showed, security is the chief concern and the biggest challenge to solve. As I already mentioned, data mining and analytics is also a struggle for many, as well as experience design and organizational inflexibility.

On a higher, more strategic level, though, many companies understand that they need to transform, but they lack clear vision into what areas they need to focus on, where to start, and how to move forward with their transformative initiatives in the fastest and most efficient way. That’s why we have SoftServe Labs, in fact, to help our clients with research and proof-of-concept before they make large investments.

I wouldn’t describe these as purely challenges, though, as these companies also stand to gain a lot. Digital asset management, Cloud computing, mobile technologies, and the Internet of Things (IoT) approached as a part of digital transformation efforts can bring a lot of benefits to consumer facing operations, retail, the finance and banking sector, and many others.

What are some of the digital transformation challenges facing organizations today in harnessing their data?

Numerous security breaches and hacking attacks serve as a proof that we haven’t yet solved security challenges facing all businesses, small and large. Privacy is also a big concern, especially when it comes to access to personal data in healthcare, education, state and government organizations, etc.


Data security is one of the common digital transformation challenges facing businesses today.

Another aspect of it is legacy software that cannot handle the amounts of data that require daily processing, and it can’t all be replaced within a couple of days due to the financial and resource strain that would put on organizations. Artificial Intelligence (AI), though hugely promising, is not yet at the stage where it can automate decision making for truly impactful processes beyond initial analysis. However, it can facilitate and speed them up considerably.

Related: The Future of Artificial Intelligence in Sales and Account Planning

It also is very important to remember that harnessing data is not an end in itself, but rather a means to help organizations achieve their business and strategic goals. And the consumer – a human being – is at the heart of all of it. So, no purely technical solution, no matter how powerful or innovative, will bring true value if it’s not applied correctly or as part of a well-thought-out and comprehensive strategy.

What organizations do you feel have been especially forward-thinking and/or innovative at leveraging their data to solve? What can we learn from them to solve our own digital transformation challenges?

Well, when it comes to leveraging data and personalization, giants like Google and Netflix immediately come to mind. It’s interesting how, by thoroughly analyzing data and making the right predictions, Netflix managed to reduce the range of content available on its platform while improving customer satisfaction.

And look how Amazon is using data from different sensors and machine learning to disrupt the grocery business with their “Amazon Go” retail store.

When it comes to attracting new customers, which is also a challenge for traditional companies, I like the example of L’inizio Pizza Bar in New York. Their manager decided to attract Pokémon Go players to the place, and he spent just $10 to have Pokémon characters lured to his restaurant. The business went up by 75 percent. So, it’s never about technology or software only, it’s about innovative thinking and human ingenuity.

How can organizations manage their data assets more efficiently and effectively? What should their data management strategies include?

With the “Internet of Everything” and connected everything blurring the concepts of office and home devices, as well as working hours and workplace, data assets need to be secure, protected, accessible from a variety of devices and formats, and easily searchable.


For some organizations – most likely in the government sector, finance, insurance, etc. – it will require switching to an intranet to secure their assets from any unauthorized access or potential loss of information. For others, where remote access from any place at any time is a higher priority, omni-channel access and compatibility will be the key focus. The challenges here include the already discussed legacy software and integration issues.

According to IDC research, by 2022 almost all data – 93 percent – in the digital universe will be unstructured. It will also, most likely, be content in different formats, including audio and video files, images, interactive content, etc. Not only will this require greater storage and processing capacity, it also means that this data will need to be easily searchable and user friendly if we want it to be used rather than merely stored.

When it comes to customer-facing content, another requirement is consistency across various channels. On the whole, when it comes to data, the current leaders in asset management are platform providers. With these platforms, instead of building their own solutions from scratch, which is a costly and time-consuming approach, businesses can quickly customize and scale a ready-made solution, adding and discarding additional features depending on their current needs.

What are some of the most exciting Big Data trends or innovations you’re following right now? Why do they interest you?

SoftServe’s 2016 Big Data survey showed 62 percent of organizations expect to implement machine learning by 2018, so apparently machine learning and Artificial Intelligence are huge Big Data trends we’re following right now. Chatbots as a customer-facing form of AI technology have gained momentum and are quickly becoming an area of huge interest for all kinds of user support activities.

But from a high-level perspective it’s nothing new, really. Once again, it’s all focused around building a better, different experience for a consumer, so machine learning, AI and chatbots are in fact just new(ish), possibly more effective ways to achieve the same goal: leveraging data to improve customer experience and stay relevant in an increasingly competitive marketplace.

For more on challenges driving digital transformations, download the eBook “Hadoop Perspectives for 2017” which offers an in-depth look at the results of Syncsort’s annual Hadoop survey, including five trends to watch for in 2017.


Syncsort blog

Overcoming Technical Challenges in Large Mainframe to Hadoop Implementations

In the past couple of years, we’ve seen tremendous growth in demand for Syncsort DMX-h from large enterprises looking to leverage their Mainframe data on Hadoop. Syncsort DMX-h is currently the recognized leader in bringing Mainframe data into Big Data platforms as part of Hadoop implementations.

A large percentage of these customers have come to us after a recommendation from one of the Hadoop vendors or from a Systems Integrator, the hands-on people who really get the technical challenges of this type of integration. Some people might wonder why the leaders in this industry recommend Syncsort, rather than some of the giants in Data Integration. What kind of problems trip those giants up, and what does Syncsort have under the hood that makes the hands-on Hadoop professionals recommend it?

To get an idea of the challenges that Syncsort takes on, let’s look at some Hadoop implementation examples and what technical problems they faced.

Use Case 1: How Mainframe Data Can Go Native on Hadoop

A large enterprise was masking sensitive data on the Mainframe to use in mainframe application testing. This used a lot of expensive CPU time. They were looking to save the cost of MIPS by doing the masking on Hadoop, then bringing the masked data back to the Mainframe.

DMX-h has a unique capability to move Mainframe data to be processed in Hadoop without doing any conversion. By preserving the original Mainframe EBCDIC encoding and data formats such as Packed-Decimal, DMX-h could process the files in Hadoop without any loss of precision or changes in data structure. The original COBOL copybook was used by DMX-h to understand the data, including Occurs Depending On and Redefines definitions. After masking sensitive parts on Hadoop, that same copybook still matched the data when it was moved back to the Mainframe.
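To show the kind of mainframe-specific detail involved, here is a toy Python decoder for a single COBOL COMP-3 (packed-decimal) field. It is only an illustration of the format, not DMX-h’s implementation; the point is that such fields can be carried through processing in their original representation and still be interpreted exactly.

```python
def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode a packed-decimal field: two digits per byte, sign in the last nibble."""
    digits = []
    for byte in raw[:-1]:
        digits.append(str(byte >> 4))
        digits.append(str(byte & 0x0F))
    last = raw[-1]
    digits.append(str(last >> 4))
    sign_nibble = last & 0x0F  # 0xD means negative; 0xC or 0xF means positive
    value = int("".join(digits))
    if sign_nibble == 0x0D:
        value = -value
    return value / (10 ** scale)

# Bytes 0x12 0x34 0x5C pack the digits 12345 with a positive sign; scale 2 -> 123.45
print(unpack_comp3(bytes([0x12, 0x34, 0x5C]), scale=2))
```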


Keeping the data in its original Mainframe format also helped with governance, compliance and auditing. There is no need to justify data changes when the data is simply moved to another processing system, not altered.

Use Case 2: Efficiently Combining Complex Mainframe Data with Diverse Data Sources for Big Data Analytics on Hadoop

Another common use case Syncsort DMX-h handles routinely is moving Mainframe data to Hadoop so it can be combined with other data sources and be part of analytics run in MapReduce or Spark. We’ve seen plenty of situations where a customer has tried different solutions to ingest Mainframe data but has hit roadblocks that stalled its Hadoop implementation, situations that DMX-h handles easily.

In one example, a customer had nested Occurs Depending On clauses in their COBOL copybooks. Their existing solution was expanding each of the records to the maximum occurrence length. This was causing the size of data to blow up hugely on Hadoop, and the ingestion was painfully slow. With DMX-h, the records were kept at their intended size and the ingestion proceeded at a much faster rate. The ingestion completed in under 10 minutes, as opposed to 4 hours with the existing solution.

In another example, the customer had VSAM data with many segments. Their existing solution was reading the VSAM file once for each segment, which was taking a lot of time and using up expensive processing on the Mainframe. If a VSAM file had, for instance, 5 segments, the other solution had to read that same file 5 times over. DMX-h reads VSAM only once, partitioning the data by segment ID using a field on the copybook, and then splits the data into separate segments, allowing each segment to be mapped by a different copybook for further processing.
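The single-pass approach can be pictured with the small sketch below: each record is read once and routed to a per-segment bucket based on a segment-ID field, after which each bucket can be mapped with its own copybook. The record layout and field position here are hypothetical.

```python
from collections import defaultdict

def split_by_segment(records, seg_id_slice=slice(0, 2)):
    """Route each record to a per-segment bucket in a single pass over the data."""
    segments = defaultdict(list)
    for record in records:
        seg_id = record[seg_id_slice]      # hypothetical: segment ID in the first two characters
        segments[seg_id].append(record)    # each bucket can then use its own copybook layout
    return segments

sample = ["01CUSTOMER-ROOT...", "02ADDRESS-SEG....", "01CUSTOMER-ROOT..."]
print({seg: len(recs) for seg, recs in split_by_segment(sample).items()})
```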


Use Case 3: Simplifying Mainframe Data Access

We have helped users who discovered that accessing Mainframe data isn’t as easy as accessing other data formats. For example, they might have found that their tool doesn’t handle Packed-Decimal or LOW-VALUES in COBOL, or can’t transfer data securely from the Mainframe using Connect:Direct or FTPS.
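As a small illustration of one such pitfall, COBOL LOW-VALUES are X'00' bytes. A generic tool that treats the field as ordinary text ends up carrying NUL characters rather than recognizing an empty field (the field below is hypothetical, purely for illustration):

```python
EBCDIC = "cp037"                                  # Python's built-in EBCDIC (US/Canada) codec

raw_field = b"\x00" * 8                           # an 8-byte field initialized to LOW-VALUES
text = raw_field.decode(EBCDIC)

print(text == "\x00" * 8)                         # True: the NUL bytes survive decoding
print(repr(text.strip()))                         # strip() removes whitespace, not NULs
```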

Another big challenge during Hadoop implementation is getting the COBOL copybook to match the Mainframe data. The original COBOL developers may be long gone, and there is no one around who can fix the copybook. Our Professional Services team sees that all the time, and helps enterprises correct copybook problems so the value of the data can be fully realized.

In these and other practical situations, customers have told us they chose DMX-h as the faster, easier-to-use, and more cost-effective solution. Mainframe sources can be accessed and processed with DMX-h in just a few mouse clicks. That simplicity makes hard problems look easy.

We’ve spent a lot of time listening to our customers’ pains and aspirations for their Mainframe data. Our 40+ years of Mainframe and Data Integration expertise, combined with active involvement in the Apache Hadoop community, resulted in DMX-h’s strong Mainframe access and integration functionality. Another thing that sets us apart is our mature security feature set: easy integration with Kerberos and straightforward handling of encrypted or compressed data make a huge difference in production implementations.

For more specifics, read the post in which Arnie Farrelly, VP of Global Support and Services, recounts some of what his team has experienced working with large enterprises trying to leverage their Mainframe data in Hadoop.

And here is a short video demonstrating how to use DMX-h to access Mainframe data and integrate it with Hadoop.


Source: Syncsort blog

Lack of skills remains one of the biggest data science challenges


The shortage of trained data scientists remains at the top of the list of data science challenges enterprises face today, according to new research from TDWI.

“We hear constantly that the biggest challenge any organization faces in a data science environment is finding the right skills,” said Fern Halper, vice president and research director at TDWI, based in Renton, Wash., in a webcast highlighting the recent findings.

The research surveyed more than 300 enterprises on their experiences with big data and data science. The two topics are increasingly blending into one another, as organizations need workers who can make sense of the massive troves of data they’ve been collecting over the last several years.

Other common challenges cited by survey respondents included lack of clarity around who owns certain data, lack of understanding of big data tools, lack of enterprise architectures needed to harness big data, security and privacy concerns, and insufficient governance protocols.

The technology piece appeared particularly vexing. Halper said many new tools have emerged within the last few years, including Hadoop, Spark, Python and others, and enterprises are having a hard time staying on top of all these rapid developments.

“Some respondents thought there were too many technologies and a lot of hype out there,” she said. “They didn’t know what to do. Others thought things are changing so fast, and they’re not nimble enough to maintain the best architectures.”

For now, enterprises are sticking with the tools they know, in part, to address these data science challenges. About 80% of survey respondents said they currently use data warehouse tools as their primary data source. For analysis, simple query and data visualization tools top the list of most used. Over the next two years, data warehouse tools will remain prominent, but the top two technologies enterprises plan to add during that time are Hadoop and open source R.

Halper said the results show clear momentum around unstructured data querying and predictive analytics, including machine learning. At the same time, it doesn’t look like these emerging tools and practices are going to completely unseat more tried-and-true tools in the foreseeable future.

“The data warehouse isn’t going away, but it’s being supplanted by these other types of platforms and creating an ecosystem,” she said. “There’s lots of momentum around predictive analytics. It’s a hot technology, and machine learning is making it hotter.”



Source: SearchBusinessAnalytics

Can Artificial Intelligence Address The Challenges Of An Aging Population?

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s App Store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. All the transactions that take place are recorded in this ledger, which lives on many computers. Strong cryptography makes it nearly impossible for someone to cheat the system.
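The tamper resistance comes from chaining: each block commits to a cryptographic hash of the block before it, so rewriting any historical entry breaks every hash that follows. Here is a minimal Python sketch of that one idea; it deliberately leaves out the distributed consensus, mining, and signatures that a real network adds.

```python
# Minimal hash-chaining sketch, not a full blockchain implementation.
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64       # genesis block has no parent
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

chain = []
append_block(chain, ["alice pays bob 5"])
append_block(chain, ["bob pays carol 2"])
print(verify(chain))                                          # True
chain[0]["transactions"][0] = "alice pays bob 500"            # tamper with history
print(verify(chain))                                          # False: the edit is detectable
```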

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (for which the fund is named). The DAO was a crowdfunded venture capital fund using cryptocurrency for investments and run through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, the DAO revealed that an individual exploited a vulnerability in the company’s smart contract code to take control of nearly $60 million worth of the company’s digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.
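To see what consuming such an API looks like in practice, here is a minimal geocoding call against Google’s public Maps Geocoding web service using Python’s requests library. The endpoint and response shape follow Google’s published documentation as I understand it, and YOUR_API_KEY is a placeholder you would replace with a real key.

```python
import requests

resp = requests.get(
    "https://maps.googleapis.com/maps/api/geocode/json",
    params={"address": "1600 Amphitheatre Parkway, Mountain View, CA",
            "key": "YOUR_API_KEY"},                  # placeholder credential
    timeout=10,
)
resp.raise_for_status()
results = resp.json().get("results", [])
if results:
    location = results[0]["geometry"]["location"]
    print(location["lat"], location["lng"])          # coordinates returned by the service
```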

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for everyone in the country, as reported by BBC News. Switzerland was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world, and other countries, including the United States, are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because a major political party did not introduce it, and citizens are only beginning to consider the potential implications of digitization on their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether or not state-funded support is the answer is only one of the open questions. The votes and pilots underway make it clear that governments will have to respond with some policy measures; the question is what those measures will be. The larger impact of mass job displacement, what future employment conditions might look like, and what responsibilities institutions have in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, a term first coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.
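At a very high level, that decoding step is a supervised pattern-recognition problem: map a window of recorded neural activity to the movement the user intended, then drive the stimulation hardware. The Python sketch below is purely conceptual, with random stand-in data and an off-the-shelf classifier; it is not the research team’s actual algorithm, which is far more sophisticated.

```python
# Conceptual sketch only: hypothetical data, generic classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 96))      # 200 recorded windows x 96 electrode channels (made up)
y_train = rng.integers(0, 2, size=200)    # labels: 0 = "rest", 1 = "grasp"

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

new_window = rng.normal(size=(1, 96))     # a fresh window of neural activity
if decoder.predict(new_window)[0] == 1:
    print("drive the grasp stimulation pattern")   # in the real system: signal the electrode sleeve
else:
    print("no movement intended")
```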

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses such as cochlear implants have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer to help people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their devices the best. “Instead of celebrating the human body moving under its own power,” said a recent article in IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.”

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Source: Digitalist Magazine