Tag Archives: Speed

New BPM, CRM Features Aim to Speed Strategy Execution

Bpm’online on Thursday introduced software update 7.11 for its intelligent customer relationship management and business process management platform, with more than 200 improvements designed to accelerate strategy execution.

Among the key new features:

  • Machine learning capabilities and predictive algorithms that offer users relevant information and automation within a specific context based on analysis of historical data. These additions let the platform make intelligent recommendations across a variety of business applications, ranging from automatically identifying customer needs to specifying an opportunity category, or predicting the best agent to resolve an open case;
  • A revamped campaign engine, which makes marketing process automation simple and speedy. Its new interface lets users design highly personalized campaigns, set up complex branching based on multiple criteria, and create entrance and exit conditions for campaign participants;
  • Extended BPM and case management capabilities for streamlined process automation, including the ability to run processes for selected records in any section of the system; capabilities to deeply configure the rules and conditions of transition between all stages; and tools to effortlessly track all tasks associated with a business process;
  • Mobile app enhancements, including extended filtering capabilities and revamped customizable push notifications for events, system updates and other items; and
  • One-click Bpm’online marketplace integration, which lets users easily install or add a wide variety of applications, add-ons or process templates directly from the marketplace.

The company provides a unified platform for marketing, sales and service.

200 New Features – Twice

Bpm’online rolled out its previous update, 7.10, this spring — also with 200 enhancements and updates, and featuring enhanced tools to help businesses accelerate strategy execution.

Two hundred “is not a magic number,” said Matthew Tharp, Bpm’online’s chief evangelist.

“It happens to be the number of updates and enhancements we were able to release for 7.11,” he told CRM Buyer. “We’re determined with each release to address as many requests and great ideas as possible.”

Bpm’online is “an agile organization, so we do updates every three weeks,” Tharp said. The company delivers three to four major releases each year that have “more substantial features and functionality.”

Machine Learning and AI

Bpm’online developed its machine learning and AI capabilities in-house using open libraries and algorithms, Tharp noted.

“Because of the strengths of the company’s BPM capabilities, it’s well positioned to use AI to automate not just recommendations, but processes for customers,” noted Rebecca Wettemann, VP of research at Nucleus Research.

“Key to getting value out of AI is operationalizing it on a broad scale,” she told CRM Buyer. “Customer device data, for example, is a ripe territory for that.”

Giving Marketing a Boost

Update 7.11’s revamped campaign engine “can work with relevant marketing applications and can orchestrate across channels for a comprehensive marketing stack,” Tharp said.

The campaign engine has open application programming interfaces that let customers add custom elements as desired.

“Revamping of the marketing engine is great news, as it should drive down the learning curve for marketers and make them more productive in the long run,” Wettemann said.

For the Enterprise

Bpm’online developed its product with the complex architectures of large enterprises in mind, Tharp noted.

“Speed is … a major challenge for larger organizations, which have more systems and more complexity, and need technology that enables them to move as fast as smaller [companies] when it comes to changing or adding new process automation, or integrating new applications into an existing process,” he explained.

“The ability to scale and deliver not just implementations, but add-on products and capabilities, depends on the strength of the ecosystem,” Wettemann remarked. “Bpm’online’s advances in this area will make it easier for partners to monetize their intellectual property and strengthen partner relationships.”

Competitors for BPM deals include Pega and Appian, Tharp said. For CRM deals, Bpm’online’s competitors include Salesforce and Microsoft.

Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including InformationWeek and Computerworld. He is the author of two books on client/server technology.
Email Richard.


CRM Buyer

3 Keys to Customer Satisfaction: Speed, Efficiency, Knowledge

Customers want fast service or support from knowledgeable people where, when and how they prefer to receive it, based on results of a study the CMO Council published Tuesday.

Together with SAP Hybris, the CMO Council last year conducted an online survey of 2,000 respondents, equally divided between men and women. Fifty percent were in the United States, and 25 percent each resided in Canada and Europe.

Among the findings:

  • 52 percent mentioned fast response time as a key attribute of an exceptional customer experience;
  • 47 percent said knowledgeable staff, ready to assist whenever and wherever needed, was key;
  • 38 percent wanted an actual person to speak with at any time and place;
  • 38 percent wanted information when and where they needed it;
  • 9 percent wanted brand-developed social communities; and
  • 8 percent wanted always-on automated services.

Consumers have a shortlist of critical channels they expect to have access to, the survey found, including the company’s website, email, a phone number, and a knowledgeable person to speak with.

“The mindsets of consumers — whether B2B or B2C — are shifting,” said Liz Miller, SVP of marketing at the CMO Council.

Marketers “have to start asking, ‘Are we set up to be a responsive organization that looks at data, looks at analytics, understands what’s coming in through CRM and is able to reflect that back through all touchpoints, including physical ones, quickly? Or are we simply waiting to react?'” she told CRM Buyer.


Serve Us Well or Die!

Angry customers hurt brands. The survey identified the following behaviors:

  • 47 percent of respondents said they would stop doing business with a brand if they were continually frustrated;
  • 33 percent were annoyed by slow service, or by dealing with reps who knew nothing about their history or purchases;
  • 32 percent said they would email a company to complain; and
  • 29 percent said they would tell their family and friends about their bad experience.

That’s a possibility that Warrantech, which provides extended service plans and warranties nationwide, is well aware of.

The company on Tuesday announced a partnership with mobile workforce management firm ServicePower, which will let it instantly connect customers who have service needs with available repair technicians.

“Prompt response time is critical in our line of business,” said Brian Weaver, director of service operations at Warrantech.

The team-up is expected to reduce overall customer turnaround time by up to 10 percent across all verticals and “ultimately translate into commensurate lifts in clients’ sales and customer retention,” he told CRM Buyer.

The Customer Wish Conundrum

One problem marketers face is that customers appear to have conflicting desires, as shown by the survey results:

  • 36 percent of respondents were angry about not being recognized for their loyalty;
  • 12 percent wanted companies to recognize their history with the brand at any touchpoint;
  • 10 percent wanted multiple touchpoints; and
  • 23 percent felt they were being followed online.

That poses a conundrum for marketers.

“This survey did indicate the respondents wanted knowledgeable staff — and for that, I’d argue part of the knowledge is knowing about the customer, not just the product, so that the advice can directly address the customer’s unique problem,” said Rob Enderle, principal analyst at the Enderle Group.

“Knowing about the customer improves the quality of the engagement,” he told CRM Buyer.

That said, consumers don’t care how companies do what they do — they only care about the results, the CMO Council’s Miller noted.

“It’s a sausage factory. Consumers don’t care how you make them, they just want tasty sausage. At the end of the day, consumers want to be treated like persons,” she said.

“Each firm needs to survey their own customers, assess the impact and costs of changes, then formulate a strategy that applies uniquely to them,” Enderle suggested. “Some may find that the cost/benefit ratio is still better with automation.”



CRM Buyer

SuiteWorld 2017 Product Company Keynote: Full Speed to Next


Posted by Barney Beal, Content Director

In the SuiteWorld 2017 manufacturing and wholesale distribution keynote, NetSuite Executive Vice President Jim McGeever details the challenges facing product-focused companies, what’s in store for the industry and how NetSuite is helping innovative manufacturers and distributors achieve their goals, including the SuiteSuccess industry solutions for manufacturing and wholesale distribution.

McGeever is joined on stage by R “Ray” Wang, Principal Analyst and Chairman with Constellation Research; Paul Farrell, NetSuite’s Vice President of Product Marketing; Bruce Capagli, COO of Precision Disposables, a manufacturer and distributor of protective apparel; Gary Anderson, Vice President of Operations for GT Golf Supply, a golf supply company; Marissa Kinsley, NetSuite’s Industry Marketing Lead for Manufacturing and Wholesale Distribution; Barth Thielen, CFO and COO of Blue Microphones, a microphone and headphone company; and Bill Apgood Jr., CFO and COO of RST Brands, a manufacturer and distributor of furniture.



The NetSuite Blog

Full Speed to Next: SuiteWorld Reveals the Future for Manufacturers and Distributors

Posted by Maggie Miller, Senior Commerce Content Manager

Faster time to market, a need for rapid implementations and the shift to omnichannel are just a few of the things NetSuite is helping manufacturers and wholesale distributors account for and adapt to. All three took center stage at the industry-specific keynote address at the annual SuiteWorld 2017 conference recently.

Indeed, rapid change is a significant hurdle for all companies in the space. As R “Ray” Wang, Principal Analyst and CEO of Constellation Research, noted in the keynote, 52 percent of the companies in the Fortune 500 have either gone bankrupt, been acquired or merged, or fallen off the list since 2000. These organizations failed to respond quickly to change.

In order to show the depth of change happening right now, Wang highlighted nine trends he’s seeing in the industry.


NetSuite outlined what it’s doing to tackle these industry challenges, as well as how its customers are using the NetSuite solution to realize what is next for their businesses.

Having the Right Systems in Place

A unified cloud solution enables businesses to grow, scale and adapt to change. It eliminates the hassles of managing disparate systems and allows companies to streamline the business and get a single source of truth.

To help manufacturers and distributors build the right technology foundation, NetSuite unveiled SuiteSuccess for both Manufacturing and Wholesale Distribution. NetSuite’s industry teams leverage SuiteSuccess to focus on the unique industry challenges in Manufacturing and Distribution so that companies have faster time to value, increased business efficiency, flexibility and greater customer success.

Bruce Capagli, COO of Precision Disposables, a manufacturer and distributor of protective apparel for industrial and healthcare markets, implemented NetSuite in 59 days with the SuiteSuccess model. Capagli said the biggest benefit of SuiteSuccess was having a direct relationship with NetSuite experts.

“The NetSuite group really took the time to understand our business, our needs and where we want to go in the future to make those proper recommendations, and create that roadmap for how we implement,” Capagli said.

Precision Disposables is currently in phase two of SuiteSuccess, implementing NetSuite warehouse management and SuiteCommerce.

Supply Chain Operational Excellence

NetSuite is providing new and enhanced functionality within its software to help bridge the gap between the operational and financial sides of the supply chain, helping companies improve operations and go to market faster. Some of these new features include better product data management, a supply chain control tower, container management, cross-subsidiary fulfillment, and improvements to the intercompany framework and journal.

Bart Thielen, CFO/COO at Blue Microphones, said his company has cut manual processes within the global supply chain with the NetSuite platform.

“When we brought in NetSuite, one of the things we looked at was the portal,” said Thielen. “We now have our 3PLs use NetSuite as the portal to pull, fulfill and invoice orders.”

The next step for Blue Microphones is implementing SuiteCommerce for B2C ecommerce. The goal is to better understand its customers and capture that data in one central location.

“We expect in five years to double our size,” said Thielen. “With NetSuite it’s scalable and we can grow with this system for a long time.”

Seamless Omnichannel Customer Experiences

RST Brands, a manufacturer and distributor of furniture and lifestyle décor, is pioneering the omnichannel experience for the industry. To provide that seamless experience while expanding from a strictly B2B company to one that also offers B2C ecommerce, RST Brands traded in its home-grown websites for SuiteCommerce Advanced.

“The nice thing about SuiteCommerce Advanced is that the sites are now one,” said Bill Apgood, Jr. CFO/COO of RST Brands. “The data is live inside of NetSuite which provides our customers a better experience.”

Utilizing the Bronto commerce marketing automation solution, RST Brands can send automated, targeted emails to customers who leave products in their online shopping carts.

“We have a 60 percent open rate on our email,” said Apgood. “Of that 60 percent, 30 percent click on the link within Bronto and 13 percent order the product in the abandoned cart — which is great.”

RST Brands also has a unique relationship with Costco. Members in over 450 stores can test out RST Brands’ furniture on display inside the store, and when they’re ready to purchase, they text a code and have it shipped directly to them. After the product ships, Costco members receive an email (via Bronto) explaining important product and contact information.

To learn more about what NetSuite is doing to support the manufacturing and wholesale distribution industry, watch the on-demand keynote session from SuiteWorld17.


The NetSuite Blog

It’s Full Speed to Next for Manufacturers and Distributors at SuiteWorld 2017


Posted by Marissa Kinsley, Manufacturing and Distribution Industry Marketing Lead

NetSuite has been in the business of cloud ERP for nearly 20 years – from our beginnings as a startup above a small hair salon, to taking the company public and being acquired, NetSuite has seen plenty of change, with growth and innovation as the main constants. While some companies may have become complacent over those 20 years, NetSuite instead continues to throttle forward.

At this year’s SuiteWorld conference, we are offering more content than ever before to address the specific challenges and opportunities that the manufacturing and distribution industries face to help take businesses to the next level. Between our industry-specific keynote, executive meeting opportunities, demonstrations and hundreds of expert-led breakout sessions, we have the tools and ERP specialists you need to get your business on track for your most successful year yet.

To start off SuiteWorld 2017 right, join us at 1:15 p.m. on Tuesday for the manufacturing and distribution keynote featuring key industry leaders from Constellation Research and NetSuite experts, as well as NetSuite customers Blue Microphones, RST Brands, Precision Disposables, and GT Golf Supplies. During this year’s manufacturing and distribution keynote, titled “Full Speed to Next: The Future for Product Companies,” you will learn about…

Join us for the Manufacturing and Distribution Industry Keynote on Tuesday, April 25th at 1:15 p.m., chat with experts at the Manufacturing and Distribution Birds of a Feather Luncheon on Wednesday, April 26th at 12:00 p.m., and create a breakout session schedule that fits your business interests to hear more about the key trends affecting manufacturing and distribution companies, see how businesses are capitalizing on them, and learn how NetSuite customers are taking control of their own success.

Register now for SuiteWorld 2017 to take getting ahead, and staying there, into your own hands.

Posted on Thu, April 6, 2017, by NetSuite


The NetSuite Blog

How can I speed up this function?

FastFanoBroadening[y_, enestep_, n1_, n2_, G1offset_, G1grad_, q1_,
   G2offset_, G2grad_, q2_] :=
  Module[{y1, y2, x1long, x2long, y1Salami, y2Salami, G1, G2, Fano1,
    Fano2, y1conv, y2conv, youtfront, youtend, youtinter1, youtinter2,
    youtinter, i, j, k},
   (*Trim the two working ranges*)
   y1 = Take[y, n1];
   y2 = Take[y, -n2];
   x1long = enestep*Range[-n1, n1 - 1];
   x2long = enestep*Range[-n2, n2 - 1];
   y1conv = 0*y1;
   y2conv = 0*y2;
   (*"Salami" Fano convolution with gradually varying width on both ranges*)
   Do[
    y1Salami = SparseArray[{}, 3*n1];
    y1Salami = ReplacePart[y1Salami, n1 + j -> y1[[j]]];
    G1 = G1grad*(j - 1) + G1offset;
    If[G1 <= 0.05, G1 = 0.05];
    Fano1 =
     2/(G1*(q1^2 + 1)*
         Pi)*((q1 + x1long/(G1/2))^2/(1 + (x1long/(G1/2))^2) - 1);
    y1conv =
     y1conv + enestep*Drop[ListConvolve[Fano1, y1Salami], 1],
    {j, 1, n1}];
   Do[
    y2Salami = SparseArray[{}, 3*n2];
    y2Salami = ReplacePart[y2Salami, n2 + k -> y2[[k]]];
    G2 = G2grad*(k - 1) + G2offset;
    If[G2 <= 0.05, G2 = 0.05];
    Fano2 =
     2/(G2*(q2^2 + 1)*
         Pi)*((q2 + x2long/(G2/2))^2/(1 + (x2long/(G2/2))^2) - 1);
    y2conv =
     y2conv + enestep*Drop[ListConvolve[Fano2, y2Salami], 1],
    {k, 1, n2}];
   (*Stitch the two ranges together, blending gradually in the overlap*)
   youtfront = Take[y1conv, Length[y] - n2];
   youtend = Take[y2conv, -(Length[y] - n1)];
   youtinter1 = Drop[y1conv, Length[youtfront]];
   youtinter2 = Drop[y2conv, -Length[youtend]];
   youtinter = 0*youtinter1;
   For[i = 1, i <= Length[youtinter], i++,
    youtinter[[i]] = (Length[youtinter] - i)/(Length[youtinter] - 1)*
       youtinter1[[i]] + (i - 1)/(Length[youtinter] - 1)*
       youtinter2[[i]]];
   Flatten[{youtfront, youtinter, youtend}]];

I need your help to make this function about five times faster. I tried parallelization and compilation but had no luck with either. Perhaps you have some advice on how to optimize it. Thanks.
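One observation: each pass through the loop builds a length-3n sparse vector and runs a full ListConvolve just to extract a single shifted copy of the kernel, so most of the work is redundant. A standard restructuring is to precompute one kernel column per sample and apply the whole variable-width broadening as a single matrix-vector product. Here is a minimal sketch of that idea in NumPy terms rather than Mathematica (the indexing conventions differ from ListConvolve, the two-range stitching is omitted, and all names are illustrative):

```python
import numpy as np

def fano(x, gamma, q):
    # Fano lineshape from the question, with the same 0.05 width floor
    gamma = max(gamma, 0.05)
    eps = x / (gamma / 2.0)
    return 2.0 / (gamma * (q**2 + 1) * np.pi) * ((q + eps)**2 / (1.0 + eps**2) - 1.0)

def fano_broaden(y, enestep, g_offset, g_grad, q):
    # Variable-width broadening: out[i] = enestep * sum_j K[i, j] * y[j],
    # where column j holds a Fano profile whose width grows with j.
    n = len(y)
    xi = enestep * np.arange(n)
    K = np.empty((n, n))
    for j in range(n):
        K[:, j] = fano(xi - xi[j], g_grad * j + g_offset, q)
    return enestep * (K @ y)
```

If memory allows, the kernel matrix only needs to be built once per parameter set, so repeated evaluations (e.g. inside a fit) amortize its cost. In Mathematica the same structure can be expressed with Table and Dot, which dispatch to optimized numerical linear algebra.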


Recent Questions – Mathematica Stack Exchange

Integrated Planning And Simulation: Delivering New Insights At High Speed

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s app store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. All the transactions that take place are recorded in this ledger, which lives on many computers. High-grade encryption makes it nearly impossible for someone to cheat the system.

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (the concept for which the fund is named). The DAO was a crowdfunded venture capital fund using cryptocurrency for investments and run through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, The DAO revealed that an individual had exploited a vulnerability in the fund’s smart contract code to take control of nearly $60 million worth of its digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.
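The specific weakness exploited in the DAO was a reentrancy bug: the contract sent funds out before updating its own ledger, so the recipient could call back in and be paid again. The pattern is easy to illustrate with a toy ledger in plain Python (a sketch of the bug class only, not actual Solidity or EVM code; all names are hypothetical):

```python
class ToyVault:
    """Toy ledger illustrating the withdraw-before-debit (reentrancy) bug."""

    def __init__(self):
        self.balances = {}

    def deposit(self, who, amount):
        self.balances[who] = self.balances.get(who, 0) + amount

    def withdraw_buggy(self, who, send):
        # Bug: the external call runs before the balance is zeroed, so a
        # malicious `send` callback can re-enter and be paid repeatedly.
        if self.balances.get(who, 0) > 0:
            send(self.balances[who])
            self.balances[who] = 0

    def withdraw_safe(self, who, send):
        # Checks-effects-interactions: update the ledger first, then pay out.
        amount = self.balances.get(who, 0)
        self.balances[who] = 0
        if amount > 0:
            send(amount)

# An "attacker" whose payout callback re-enters the vault a few times
vault = ToyVault()
vault.deposit("attacker", 100)
stolen = []

def attack(amount):
    stolen.append(amount)
    if len(stolen) < 3:  # bounded here only so the demo terminates
        vault.withdraw_buggy("attacker", attack)

vault.withdraw_buggy("attacker", attack)
```

Running the same callback against withdraw_safe pays out only once, because the ledger is debited before the external call; that ordering discipline is the kind of diligence the "nuclear power reactor" comparison is pointing at.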

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for all citizens, as reported by BBC News. Switzerland was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world. Other countries—including the United States—are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because a major political party did not introduce it, and citizens are only beginning to consider the potential implications of digitization on their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether or not state-funded support is the answer is just one of the questions that must be answered. The votes and pilots underway make it clear that governments will have to respond with some policy measures. The question is: What will those measures be? The larger impact of mass job displacement, what future employment conditions might look like, and what the responsibilities of institutions are in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, a term coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.
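The real decoder is a machine-learned model far beyond this, but the core separate-signal-from-noise idea can be illustrated with a toy sketch: smooth noisy readings, then check whether the smoothed trace crosses an activity threshold. All values and thresholds here are invented for illustration.

```python
# Toy illustration of pulling an intent "signal" out of noisy readings:
# smooth with a moving average, then threshold the smoothed trace.

def moving_average(samples, window=3):
    """Smooth a sequence by averaging each run of `window` samples."""
    return [
        sum(samples[i:i + window]) / window
        for i in range(len(samples) - window + 1)
    ]

def detect_intent(samples, threshold=0.5):
    """Report intent when the smoothed signal exceeds the threshold."""
    return any(v > threshold for v in moving_average(samples))

noise = [0.1, -0.2, 0.15, 0.0, -0.1]   # jitter only: no movement intent
burst = [0.1, 0.6, 0.7, 0.8, 0.2]      # sustained activity: intent present
```

Smoothing first means a single noisy spike is not enough to trigger movement; only sustained activity crosses the threshold.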

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses (cochlear implants) have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer to help people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their devices the best. “Instead of celebrating the human body moving under its own power,” said a recent article in IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.”

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.



Digitalist Magazine

Users look to real-time streaming to speed up big data analytics


NEW YORK — For more organizations, there’s no time like the present to process and analyze the information flowing into their big data systems. And IT vendors increasingly are releasing technologies that facilitate the real-time streaming analytics process.

Comcast Corp. is among the real-time vanguard. The TV and movie conglomerate is on the verge of expanding a Hadoop cluster used by its data science team from 300 compute nodes to 480. In addition, Comcast plans to upgrade the system to include Apache Kudu, an open source data store designed for use in real-time analytics applications involving streaming data that’s updated frequently.

“For us, the update ability is a very big thing,” said Kiran Muglurmath, executive director of data science and big data analytics at the Philadelphia-based company. The Hadoop cluster, set up earlier this year, already contains more than a petabyte of information — for example, data collected from set-top boxes on the TV viewing activities of Comcast customers and the operations of the boxes themselves. But Muglurmath’s team needs to keep the data as up-to-date as possible for effective analysis, which means updating individual records via table scans as new information comes in.

Sridhar Alla, director of big data architecture at Comcast, said doing so takes “an immense amount of time” in the Hadoop Distributed File System (HDFS) and its companion HBase database — too long to be feasible at petabyte scale. Kudu, on the other hand, has significantly accelerated the process in a proof-of-concept project over the past three months. In one test, for example, it scanned more than two million rows of data per second. “It’s writing the data as fast as the disks can handle,” Alla said during a session at Strata + Hadoop World 2016 here this week.
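The contrast Alla describes comes down to storage layout. This is not the Kudu API, but a minimal sketch of why a key-indexed store makes record updates cheap, while an append-only file system (the HDFS model) forces readers to reconcile stale versions:

```python
# Illustrative contrast between append-only storage (HDFS-style) and a
# primary-key-indexed store (Kudu-style). Not real Kudu or HDFS code.

class AppendOnlyStore:
    """Append-only: updates pile up in the log; reads must find the newest."""
    def __init__(self):
        self.log = []

    def write(self, key, value):
        self.log.append((key, value))   # old versions linger in the log

    def read(self, key):
        latest = None
        for k, v in self.log:           # full scan to find the latest version
            if k == key:
                latest = v
        return latest

class KeyIndexedStore:
    """Key-indexed: each row is addressable by primary key, so upserts are O(1)."""
    def __init__(self):
        self.rows = {}

    def upsert(self, key, value):
        self.rows[key] = value          # update in place, no rewrite or dedup

    def read(self, key):
        return self.rows.get(key)
```

At petabyte scale the full-scan-and-reconcile cost of the first model is what makes frequent record updates impractical, which is the gap Kudu targets.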

Real-time waiting game comes to an end

The Kudu technology was created last year by Hadoop vendor Cloudera Inc. and then open sourced. The Apache Software Foundation last week released Kudu 1.0.0, the first production version — a step that Comcast was waiting for before going live with its Kudu deployment.

The expansion of the Cloudera-based Hadoop cluster should be completed by the end of October, Muglurmath said after the conference session. Kudu will be configured on all of the compute nodes along with HDFS, which will continue to be used to store other types of data. The data science team also plans to use Impala, a SQL-on-Hadoop query engine developed by Cloudera, to join together data from HDFS and Kudu for analysis.

Dell EMC, the data storage unit of IT vendor Dell Technologies, is also going down the real-time streaming path to support its internal analytics efforts.


The IT team is using the Spark processing engine and other data ingestion tools to funnel real-time data on interactions with customers into a combination of databases — Cassandra, GemFire, MemSQL and PostgreSQL. Automated algorithms are then run against the data to generate up-to-the-minute customer experience scores that help guide Dell EMC’s salesforce in selling tech-support subscription renewals, said Darryl Smith, chief data platform architect at the Hopkinton, Mass.-based organization.

The customer interaction data is also fed into a Hadoop data lake, but that’s for longer-term customer profiling and trend analysis. For the customer scoring application, “you couldn’t just throw all the data in Hadoop and say ‘Go at it’ [to the sales reps],” Smith said. “It’s a different thing to take real-time data and do actionable analytics on it.”
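Dell EMC hasn't published its scoring algorithm, but an "up-to-the-minute" score over a stream of interactions is commonly built as an exponentially weighted average, where recent events dominate. The event names and weights below are invented for illustration:

```python
# Hypothetical sketch of a streaming customer experience score: blend each
# new interaction into a running exponentially weighted average (EWMA).

EVENT_SCORES = {               # invented event impacts, in [-1, 1]
    "ticket_resolved": 1.0,
    "ticket_opened": -0.5,
    "escalation": -1.0,
}

def update_score(current, event, alpha=0.3):
    """Blend the newest event into the running score; higher alpha = more reactive."""
    impact = EVENT_SCORES.get(event, 0.0)
    return (1 - alpha) * current + alpha * impact

score = 0.0
for event in ["ticket_opened", "escalation", "ticket_resolved", "ticket_resolved"]:
    score = update_score(score, event)
```

Because each update touches only the previous score and the new event, the score can be refreshed on every arriving interaction without rescanning history, which is what makes it usable in real time.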

That does mean the same data is being processed and stored in different locations within Dell EMC’s big data architecture, but Smith doesn’t see that as a bad thing. “And it’s not just because I work for a storage company,” he joked. “If you’re going to get value out of the data, you’re going to need to store it in multiple places, because you’re going to consume it in different ways.”

One of the real-time streaming processes adopted by Dell EMC uses the open source Kafka message queueing tool to push data into MemSQL, an in-memory database designed for real-time applications. Vendor MemSQL Inc. this week released a version 5.5 update that incorporates the Kafka connectivity into a feature for creating data pipelines with exactly-once semantics — meaning that data transmissions are processed only once, with guaranteed delivery and no data loss along the way. Smith said such a guarantee is “absolutely critical” for the kind of real-time analytics Dell EMC is looking to do.
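The mechanism behind exactly-once pipelines is typically to commit the consumed offset and the derived rows in a single database transaction, so a replayed message can be detected and skipped. The sketch below illustrates that idea with SQLite standing in for the database; it is not MemSQL code or the real Kafka client.

```python
# Illustrative exactly-once consumer: the event row and the consumer's
# offset are committed atomically, so replays are detected and skipped.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (offset INTEGER PRIMARY KEY, payload TEXT)")
db.execute("CREATE TABLE progress (partition INTEGER PRIMARY KEY, last_offset INTEGER)")
db.execute("INSERT INTO progress VALUES (0, -1)")
db.commit()

def consume(partition, offset, payload):
    """Apply a message exactly once; return False if it was already applied."""
    (last,) = db.execute(
        "SELECT last_offset FROM progress WHERE partition = ?", (partition,)
    ).fetchone()
    if offset <= last:
        return False                     # replayed message: already applied
    with db:                             # atomic: event + offset commit together
        db.execute("INSERT INTO events VALUES (?, ?)", (offset, payload))
        db.execute(
            "UPDATE progress SET last_offset = ? WHERE partition = ?",
            (offset, partition),
        )
    return True
```

If the process crashes before the transaction commits, neither the row nor the offset advances, so the message is safely reprocessed on restart; if it crashes after, the replay is skipped. That is the "processed only once, with guaranteed delivery" property.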

Living with some real-time data loss

Guaranteed data delivery isn’t a necessity for eBay Inc., though. The online auction and e-commerce company uses Pulsar, an open source stream processing and analytics technology it created, to analyze data on user activities in order to drive personalization of the eBay website for individual visitors. In creating and expanding the real-time architecture over the past three years, eBay’s IT team decided it didn’t have to spend extra development money to build a delivery guarantee into the data pipeline.

“For our use cases, we can afford to lose a little bit of the data,” said Tony Ng, director of engineering for user behavior analytics and other data services at eBay. But Ng’s team does have to keep on its toes as data flows in. For example, one of the goals is to detect bots on the site and separate out the activity data they generate so it doesn’t skew the personalization process for real users. That requires frequent updates to the bot-detection rules built into eBay’s analytics algorithms, Ng said.
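eBay's actual rules aren't public, but a common baseline for this kind of rule-based screening is a rate threshold over a sliding window: a visitor firing implausibly many events in a short span is flagged and its events excluded from personalization. The window size and threshold below are invented:

```python
# Toy rate-based bot screen: flag a visitor that generates more than
# MAX_EVENTS events within a WINDOW_SECONDS sliding window.

from collections import defaultdict, deque

WINDOW_SECONDS = 10        # invented tuning values
MAX_EVENTS = 5

recent = defaultdict(deque)   # visitor id -> timestamps of recent events

def is_bot(visitor, timestamp):
    """Record an event and report whether the visitor now looks like a bot."""
    q = recent[visitor]
    q.append(timestamp)
    while q and q[0] <= timestamp - WINDOW_SECONDS:
        q.popleft()                         # evict events outside the window
    return len(q) > MAX_EVENTS
```

Keeping the rule this cheap matters in a streaming setting: it runs per event, in constant amortized time, and the threshold constants can be retuned frequently as bot behavior shifts, which matches the frequent rule updates Ng describes.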

The San Jose, Calif., company’s real-time streaming setup also includes Kafka as a transport mechanism, plus several other open source technologies — Storm, Kylin and Druid — for processing and storing data. Ng noted that the streaming operations are a lot different from the batch data loading eBay does into its Hadoop clusters and Teradata data warehouse for other analytics uses.

“There are some constraints on how much processing you can do on the data,” he said. It is eventually cleaned up and consolidated in batch mode for downstream analytics applications — “but the things that need to be real time, we want to keep real time.”

Putting together a real-time data streaming and analytics architecture is a complicated process in and of itself, said Mark Madsen, president of data management and analytics consultancy Third Nature Inc. in Portland, Ore.

Users can also tap a variety of other streaming technologies — for example, Spark’s Spark Streaming module and Apache Flink, an upstart alternative to Spark that was released in a commercial version this month by lead developer Data Artisans GmbH. But a lot of assembly is typically required to combine different tools into a functional platform. “It’s a build-to-order problem,” Madsen said. “[Individual IT vendors] carve out a piece of the problem, but it’s hard for them to carve out the whole problem.”


SearchBusinessAnalytics: BI, CPM and analytics news, tips and resources

Clothing company benefits from speed of modern BI reporting tools


Modern BI reporting tools are all about speed.

That’s the lesson recently learned by Arc’teryx, an outerwear manufacturer and retailer based in Vancouver, B.C. The company recently overhauled a reporting system that helps identify potential locations for new retail stores. The initial goal was to make the process more automated. It has since turned BI tasks that once took hours into much quicker jobs.

“It’s been massive time savings,” William Jackson, manager of business intelligence at Arc’teryx, said.

The new system is built around Qlik Sense, software from QlikTech. The team also assessed software from Tableau and MicroStrategy, but settled on Qlik because the vendor was willing to help Arc’teryx work through a proof-of-concept project prior to purchase. The system also leverages Qlik DataMarket, a prepopulated, preformatted catalog of external data sources. From it, Arc’teryx pulls in data about local weather trends and economic indicators for specific regions and combines it with its own sales data. The BI team then builds reports that assess locations where the company is considering expanding, looking for areas that match up with typical customer demographics.
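Arc'teryx's actual model isn't described, but the shape of such a site-scoring report is straightforward: normalize a few external indicators, weight them, and rank candidate regions. Every field name, weight, and figure below is invented for illustration:

```python
# Hypothetical site-scoring sketch: blend external indicators (weather,
# economics) with how closely a region matches the typical customer.

def score_location(region):
    """Weighted blend of normalized indicators; all weights are invented."""
    weights = {"cold_days": 0.4, "income_index": 0.35, "demo_match": 0.25}
    normalized = {
        "cold_days": min(region["cold_days_per_year"] / 120, 1.0),
        "income_index": min(region["median_income"] / 100_000, 1.0),
        "demo_match": region["customer_demo_overlap"],  # 0..1, from sales data
    }
    return sum(weights[k] * normalized[k] for k in weights)

candidates = {
    "Denver":  {"cold_days_per_year": 150, "median_income": 80_000,
                "customer_demo_overlap": 0.8},
    "Phoenix": {"cold_days_per_year": 10,  "median_income": 70_000,
                "customer_demo_overlap": 0.4},
}
best = max(candidates, key=lambda c: score_location(candidates[c]))
```

The payoff of automating the feed is exactly what Jackson describes: once the external indicators arrive consistently formatted, rescoring every candidate region is a computation that runs in seconds rather than an hours-long manual ETL job.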

Prior to implementing Qlik Sense, these reports were created through a combination of SAP Crystal Reports and IBM Cognos. Data was pulled from web sources. But Jackson said the process of collecting data and putting it through the necessary extract, transform and load processes could take hours, which made it hard to get insights into the hands of the executives.

“We were doing this previously, but it was a lot of scraping data from the internet manually,” he said. “Because you’re pulling all that data from all these sources, you can’t be sure of the consistency.”

Most employees have embraced the new BI reporting tools. It helps that most of the workers have youthful attitudes and are generally ready to take on new technologies, particularly ones that can save them time, Jackson said.

Users have also taken to the sandboxing feature of the new software, Jackson said. Each report that Jackson and his team put out comes with a separate tab containing a blank workspace. This allows users to play around with the data in the report, test out new ways of looking at it or see if combining different variables or data sets into one view makes sense. They can do this without disturbing the predefined reports Jackson’s team puts out. Periodically the BI team meets with business users to see if there’s anything they regularly build in this workspace that might be worth institutionalizing.

Another key factor in getting strong adoption of the new way of doing things was executive support. The company’s leaders liked the ability to drill down into reports and the speed with which they were able to get insights, Jackson said. Once they picked up the tool, the rest of the workforce followed.

“Because we had the buy-in at the executive level, it drove adoption through the organization,” Jackson said. 



Boston looks to location analysis to speed fire, EMT response


Imagine emergency responders racing to a call with an interactive map in their hands that uses location analysis to show them the location of fire hydrants, biological and industrial hazards, and heavy traffic.

The city of Boston is developing just such a system that takes advantage of a growing trove of location data. The city, which has rolled out the system to dispatchers and will soon have it in the hands of first responders, hopes the tool will cut emergency response times and improve safety for responders and residents.

“From the old [computer aided dispatch] system we had a couple years ago, we had limitations,” John McKenna, senior fire alarm operator at Boston Fire Department, said. “To go to this now, it’s just that much better.”

McKenna started the project on his own about a year ago, essentially dumping location data for fire hydrants into a spreadsheet to help dispatchers. The problem was that different city information systems stored hydrant locations in different formats, which meant dispatchers were unable to give responders information about hydrants, let alone other potential hazards. McKenna was trying to standardize this.

Then an analytics fellow working for City Hall called to ask if the fire department was working on anything he could help with. He and McKenna connected and formalized the project, getting IT and data analysis staff from City Hall involved. This allowed the project to go from a siloed spreadsheet to something that could be used by all dispatchers.

The location analysis tool went live the week of March 7. Currently, dispatchers are the primary users. They look up addresses to which responders are called and then radio information about hazards to the response teams. The tool is still being developed, though, and the team hopes, within the next year or two, they’ll be able to put it in the hands of emergency responders, allowing them to check data about the locations they’re headed to themselves.

Loosine Vartani, a data visualization analyst with the city of Boston who managed the project, said data from various city of Boston data stores, like the permitting and assessing offices, is joined using Python and then pulled into an Esri ArcGIS database. This database stores the location data of things such as fire hydrants and potential hazards throughout the city. It then connects to a Google Maps API, which serves as the front-end visualization.
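The normalization step McKenna ran into, where different city systems store the same hydrant locations in different formats, is the kind of thing that Python joining code has to handle before anything reaches the GIS layer. The record formats below are invented; this only sketches the coerce-to-one-schema pattern:

```python
# Illustrative sketch: coerce heterogeneous hydrant records from different
# city systems into one (id, lat, lon) schema before loading into GIS.

def normalize(record):
    """Convert a hydrant record from either known source format to (id, lat, lon)."""
    if "latitude" in record:                       # hypothetical system A: named fields
        return record["id"], record["latitude"], record["longitude"]
    if "coords" in record:                         # hypothetical system B: "lat,lon" string
        lat, lon = map(float, record["coords"].split(","))
        return record["id"], lat, lon
    raise ValueError(f"unknown hydrant record format: {record}")

system_a = [{"id": "H1", "latitude": 42.36, "longitude": -71.06}]
system_b = [{"id": "H2", "coords": "42.35,-71.07"}]

hydrants = [normalize(r) for r in system_a + system_b]
```

Raising on unknown formats, rather than guessing, is the safer choice in a dispatch system: a silently misplaced hydrant is worse than a record that fails loudly during the load.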

Vartani said one major reason for choosing Google Maps to visualize this location data is that it’s so ubiquitous and most people have some level of familiarity and comfort with it.

This is a major focus of much of the city’s data initiatives. Jascha Franklin-Hodge, the city’s CIO, said he and his team are looking at ways to “break apart giant platform lock-ins.” They want to build as much as they can using simple, widely available tools. For example, Franklin-Hodge and his team are currently migrating an HR system from a large Oracle installation to something customized around the open source content management system Drupal.

On the emergency response project, Franklin-Hodge said this lighter-touch approach was a good fit. It takes location data from disparate databases and leverages it for location analysis, something that is far from its original intended purpose, but nevertheless adds value.

“This is a bit of a departure from how government tech gets built,” he said. “The normal way is very heavyweight and purpose-built. This project went from idea to deployment in 10 months.”

