We know, we know. Data is a strategic asset.
By now, it’s clear to CIOs everywhere that data is a strategic asset; they’re hearing it from technology vendors in the ERP space, from strategic management consulting firms like McKinsey & Company, and from big data analytics companies like Pentaho. But knowing that data matters is one thing; companies still wrestle with how best to access, prepare and govern that data to create a positive impact on their business.
One of the biggest challenges companies face today is integrating, prepping and governing siloed data, particularly when the data sits in multiple locations across the organization (and often the world) and is stored on different platforms, as is common in M&A situations. Last year, Forrester Consulting conducted a study, commissioned by Pentaho, that examined the ability of businesses in the U.S. and the UK to keep up with changing data needs while keeping that data secure. Over half of respondents (52 percent) reported using 50 or more distinct data sources to enable analytics capabilities. About a third (34 percent) blend 100 or more data sources, and 12 percent blend 1,000 or more.
Traditionally, getting disparate data management platforms to interact has been difficult, which is understandable as platform vendors fight to gain and hold market share. However, with the advent of cloud computing and big data, vendor lock-in is no longer acceptable (and perhaps no longer feasible). Last week, SAP owned the enterprise software news cycle during its annual SAPPHIRE NOW customer event. CEO Bill McDermott kicked off his keynote by underlining that SAP would be a kinder, gentler and more empathetic company, focused on driving customer success. SAP announced partnerships and integrations with Microsoft Azure and AWS, as well as a more open HANA platform for app development and data integration.
While it’s encouraging to see the ecosystem integrating and collaborating to meet customer requirements, we still have a long road ahead. Just as we saw with the legacy ERP market a decade ago, the lack of interoperability between public cloud offerings such as AWS and Azure is now creating the same discussions and concerns at the customer level.
As an industry, to truly unlock the value of big (all) data, we have to find a better way to cooperate and integrate: an open, secure, flexible data architecture that fits customers’ heterogeneous environments and can accommodate whatever data needs they have. The IoT market has a particular stake in this issue, since its value proposition rests on analyzing the data created by billions of connected devices to improve the customer experience.
Customers can’t wait while the ecosystem struggles to figure out open standards. They need to devise a data integration and quality strategy ASAP, so they can use data to grow their business and drive a competitive advantage today. Pentaho has an early lead in understanding how emerging trends like IoT and the convergence of Cloud and Big Data impact enterprise IT strategies. Our customers are at the front lines of big data in the enterprise, driving best practices in data quality and integration strategies.
In partnership with Dan Woods and CITO Research, we’ve built a guide to help CIOs, CTOs and other IT and business professionals overcome these barriers to implementation and drive a successful integration. At the end of the day, the data challenges don’t change the fact that the companies that embrace the strategic use of data will have the competitive edge to be the ultimate winners in their industries.
By Rosanne Saccone, Chief Marketing Officer at Pentaho, a Hitachi Group Company