Live Product Innovation, Part 1: The Role Of In-Memory Computing

In Part 1 of this series, we look at how in-memory computing affects live product innovation. In Part 2, we’ll explore the impact of the Internet of Things (IoT) and Big Data.

In our previous series, we examined key drivers changing the way we use product data throughout the enterprise. Digitalization of customer experience, distributed manufacturing and engineering, and IoT and networked processes are transforming business processes and models. The proliferation of data – and the commoditization of technology to capture, store, share, analyze, predict, and learn from data – are enabling “live” product lifecycle management (PLM).

As a result, product data can no longer reside only in engineering systems. Instead, it must synchronize with and facilitate physical and digital twins throughout the product lifecycle. Ultimately, it must extend beyond PLM to manufacturing, cost management, order fulfillment, and service delivery. Likewise, the R&D process must leverage input from manufacturing, customer service, market trends, and more.

What makes this live product innovation possible? The answer is in-memory computing. Or I should say, the right in-memory computing, because not all in-memory platforms are created equal.

Greater volumes, greater speed

Described simply, in-memory computing keeps data in main memory and processes it there, rather than repeatedly reading it from disk. This allows for orders-of-magnitude faster analysis of vast data volumes. But to extend PLM beyond R&D and engineering, in-memory platforms must also allow:

  • Real-time data access across enterprise domains, with no locking
  • Analytics directly on line-item-level data, without the need to extract to data warehouses
  • A massive reduction of data footprint – typically up to a factor of 10
  • Simplified coding through the removal of database aggregates

With these capabilities, data sets previously segmented because of volume and performance concerns can be left intact. That means data from manufacturing, maintenance, warranty, service, finance, CRM, and marketing can potentially be deployed in one architecture and analyzed in real time across functions.

It also means R&D and engineering can access previously untapped information sources in real time to support product decisions, without having to pre-aggregate the relevant business data and assemble huge, purpose-built data warehouses.
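To make this concrete, here is a minimal sketch that uses Python's built-in sqlite3 module in its in-memory mode as a stand-in for a full in-memory platform. Line-item warranty and service records from two domains are aggregated on the fly and joined in a single query, with no pre-built aggregate tables and no extract into a separate data warehouse (the tables, names, and figures are invented for illustration):

    import sqlite3

    # sqlite3's ":memory:" mode stands in for an in-memory platform here:
    # all data stays in RAM and is queried at line-item level.
    conn = sqlite3.connect(":memory:")

    conn.executescript("""
    CREATE TABLE warranty_claims (claim_id INTEGER, material TEXT, supplier TEXT, cost REAL);
    CREATE TABLE service_orders  (order_id INTEGER, material TEXT, labor_hours REAL);
    """)

    conn.executemany(
        "INSERT INTO warranty_claims VALUES (?, ?, ?, ?)",
        [(1, "PUMP-100", "Acme", 420.0),
         (2, "PUMP-100", "Borg", 380.0),
         (3, "VALVE-7",  "Acme", 150.0)],
    )
    conn.executemany(
        "INSERT INTO service_orders VALUES (?, ?, ?)",
        [(10, "PUMP-100", 3.5), (11, "VALVE-7", 1.0)],
    )

    # Aggregate line items from two domains on the fly and join the results:
    # no pre-built aggregate tables, no extract into a separate warehouse.
    query = """
    SELECT w.material, w.claims, w.warranty_cost, s.service_hours
    FROM (SELECT material, COUNT(*) AS claims, SUM(cost) AS warranty_cost
          FROM warranty_claims GROUP BY material) AS w
    JOIN (SELECT material, SUM(labor_hours) AS service_hours
          FROM service_orders GROUP BY material) AS s
      ON s.material = w.material
    """
    for material, claims, warranty_cost, service_hours in conn.execute(query):
        print(material, claims, warranty_cost, service_hours)

A production-grade in-memory column store does the same thing at vastly larger volumes, and its columnar compression is where much of the data-footprint reduction mentioned above typically comes from.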

Other capabilities of an effective in-memory platform include:

  • Tight connection of manufacturing, supply chain, maintenance, and service data and processes
  • An open integration framework for cross-discipline authoring tools (see the sketch after this list)
  • The ability of 3D models to be part of the user interface (UI) across enterprise processes
  • The ability to manage complex integrations of CAD and authoring tools
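
As a hypothetical illustration of what an open integration framework can look like (the class and method names below are invented, not those of any specific product), each authoring tool supplies a connector that implements a common interface and registers with the platform, so new CAD or simulation tools can be added without changing the core:

    from typing import Protocol

    class AuthoringToolConnector(Protocol):
        """Common contract every tool connector fulfils (illustrative)."""
        def export_item(self, item_id: str) -> dict: ...

    class MechanicalCadConnector:
        def export_item(self, item_id: str) -> dict:
            # A real connector would call the CAD tool's own API here.
            return {"id": item_id, "source": "mcad", "payload": "geometry..."}

    class SimulationConnector:
        def export_item(self, item_id: str) -> dict:
            return {"id": item_id, "source": "simulation", "payload": "results..."}

    # The platform depends only on the shared interface; new tools are
    # added by registering another connector, not by changing the core.
    REGISTRY: dict[str, AuthoringToolConnector] = {
        "mcad": MechanicalCadConnector(),
        "simulation": SimulationConnector(),
    }

    def pull_design_data(tool: str, item_id: str) -> dict:
        return REGISTRY[tool].export_item(item_id)

    print(pull_design_data("mcad", "PUMP-100"))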

Thanks for the in-memory

What are some outcomes of these capabilities? Product cost is increasingly important to understand early, because it dictates the profitability of a service-based contract. By leveraging the simulation and predictive capabilities of an effective in-memory platform – and by integrating business data from an ERP environment – engineering can understand, simulate, and predict variations in design costs based on materials, routes, suppliers, and production techniques. It can achieve this in real time, either before the product is launched or during product redesign.
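
As a toy illustration of this kind of cost simulation (the design options, base costs, and variation ranges are invented), the sketch below draws unit-cost samples for alternative material/supplier/route combinations and compares their distributions:

    import random
    import statistics

    # Invented design alternatives: base unit cost plus a relative
    # uncertainty band (supplier price volatility, scrap, logistics).
    OPTIONS = {
        "aluminium / supplier A / route 1": (12.40, 0.08),
        "aluminium / supplier B / route 2": (11.90, 0.15),
        "composite / supplier C / route 1": (14.10, 0.05),
    }

    def simulate_unit_cost(base, spread, runs=10_000):
        """Draw unit-cost samples around the base cost with the given spread."""
        return [random.gauss(base, base * spread) for _ in range(runs)]

    for name, (base, spread) in OPTIONS.items():
        samples = sorted(simulate_unit_cost(base, spread))
        mean = statistics.mean(samples)
        p95 = samples[int(0.95 * len(samples))]
        print(f"{name}: mean {mean:.2f}, 95th percentile {p95:.2f}")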

For project and program management, simulating the effect of a design initiative on an existing fleet of assets allows much better understanding of the impact on new and existing projects and programs. Will a new project delay a critical stage gate? Will a seemingly minor change in resource allocation require more labor at higher cost? These factors can significantly alter cost projections across a portfolio.

Finally, understanding system reliability in the field, as well as product quality throughout the manufacturing process, allows designers to correlate data from materials, suppliers, producers, and processes. In the past, this kind of correlation was extremely cumbersome.
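
As a simple illustration of the kind of correlation this enables, the snippet below groups invented field-failure records by supplier and production line to surface where failure rates diverge:

    from collections import defaultdict

    # Invented field records: (supplier, production line, failed in warranty?)
    records = [
        ("Acme", "Line 1", False), ("Acme", "Line 1", True),
        ("Acme", "Line 2", False), ("Borg", "Line 1", False),
        ("Borg", "Line 2", True),  ("Borg", "Line 2", True),
    ]

    totals = defaultdict(int)
    failures = defaultdict(int)
    for supplier, line, failed in records:
        totals[(supplier, line)] += 1
        failures[(supplier, line)] += failed

    for key in sorted(totals):
        print(f"{key[0]} / {key[1]}: failure rate {failures[key] / totals[key]:.0%}")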

All this might sound like powered-up analytics, and to some extent it is. But if you add integration of PLM, ERP, CRM, supplier relationship management (SRM), and enterprise asset management (EAM), you gain an incredible ability to rapidly move from insight to action. A common platform allows far more agile management of interactions across silos. And simplification of user interactions and the underlying data model allows massively accelerated process modifications, as well as solution development and extension.

Want to learn more about live product innovation? Join us for “Logistics & SCM, PLM, Manufacturing, and Procurement 2017,” held March 6-8 in Orlando, Fla. I’ll be presenting on “The Impact of IoT on Product Design.” Hope to see you there.

How else is SAP innovating in supply chain? Learn more at SAP.com or follow us on @SCMatSAP for the latest news.
