Tag Archives: Mainstream

Notes from 2017 MIT CDOIQ Symposium: CDO’s, and Data, go Mainstream

At the conclusion of this year’s annual MIT Chief Data Officer and Information Quality Symposium (MITCDOIQ), held July 12-14 in Cambridge, MA, the organizers noted that the focus of the event was at last shifting from the rise and role of the CDO to the key, strategic business initiatives undertaken by CDOs and their organizations. But this shift was evident right from the start of the event – one featured session called this process “Shifting from Survival to Success.”

Front and center were key topics around the business value of data: building out the data infrastructure and democratizing data, the data privacy requirements of GDPR, the value and use of machine learning and advanced analytics to drive business initiatives and achieve better business outcomes, and adaptation to the accelerating pace of change. One panel of CDOs – including Christina Clark of General Electric (GE), Mark Ramsey of GlaxoSmithKline, and Venkat Varadachary of American Express – addressed a number of these points in detail.


Managing Data for Business Value and Growth

Though the CDO role emerged out of the financial crisis and the need for broader controls around regulatory requirements and data governance, that role has changed, and most new CDOs are now focused on business value and revenue growth. This doesn’t mean that the foundation and fundamentals of data governance and compliance are forgotten or dropped. In fact, those aspects are critical to this shift, as business growth depends on an organization’s ability to structure and manage data for speed and flexibility. The approaches taken to succeed may vary considerably, though.


One participant noted the example of Google absorbing YouTube.  Google took 2 years to build the infrastructure and information needed to effectively monetize the volumes of data acquired.  In the case of financial services firms, that approach is rare, particularly with mergers.  In those cases, the customer support and client relationship aspects are critical and it is more important to leave the systems in place and develop an approach to span across those systems, even where they have similar data.

The participants particularly noted that the benefits of this data-driven business approach are large, but that there is a need for a clear vision of what the organization wants to do and address.  This includes establishing a scorecard with quantifiable metrics for business value.

Some of these may be as straightforward as identifying how many revenue-generating use cases have been deployed, or measuring how much faster deployment is with new approaches to data delivery. Often it is the critical data elements, perhaps no more than 50 per subject area, that provide the key metrics around operational value (including the business processes that will break if the data is incorrect), weighed against the costs to acquire, manage, and consume that data.


I was on hand with colleagues at MITCDOIQ to demonstrate how our Trillium data quality software can help organizations’ data governance initiatives.

The CDO of GlaxoSmithKline noted a goal to cut the time to discover new drugs from eight years to one, and to transform the pharmaceutical industry by leveraging sensor-based and genomics data. Many steps are needed to reach that business goal, including standardizing internal data and being able to connect internal data to new external sources.

Democratization of Data

For individuals in an organization to be effective, they need trusted data at hand so they can move forward with speed and efficiency. Data scientists are just one part of that equation; sales, marketing, operations, and others in the lines of business all need data. Getting data into the hands of employees, even if the data is imperfect, is valuable – it creates incentives sooner, as people can see the data issues and work to solve them. Helping these people access and use data not only democratizes the data, but also lets them act in a more agile way, with faster time-to-value.

At the same time, it is important to remember that data must be served in a manner that is consumable by these varied users. Some will want visualizations and dashboards, some need alerts and notifications for faster action, some need data in Excel, and some couldn’t care less about visualization and want access via tools such as Python.

To achieve this democratization, it’s important then to understand what data people want access to, how it may be delivered and consumed, and how individuals can accelerate this process.  Shifting the cultural mindset to a process of collecting and accessing data rather than modeling and structuring the data first helps to more readily identify where the business challenges are and how data may be applied to solve the issues and drive value.

Barbara Latulippe, CDO of Dell, reiterated many of the panel’s themes in her MITCDOIQ presentation on “Governance and Stewardship in the Big Data Era.” She noted that the data scientists in her organization were struggling to find data – in one case, it took a data scientist 35 phone calls to establish the full context around the data! Democratizing data means making it easy to find, easy to understand, and easy to assess for trust and quality.

One metric for Dell is simply reducing the time needed to find and consume data, with a goal of moving from 70% of a data scientist’s available time spent finding data (a regularly reported statistic) to 30%. Achieving this requires data governance, echoing the earlier panel’s comment that governance is foundational to success in this area. Dell’s approach follows a Lean Data Governance model, a practice that Trillium Software has noted in the past: start small, show success, visualize results, and break down silos by showing others “what’s in it for them.”

Finding Data Skills, Building Data Literacy

On the final day of MITCDOIQ, Natalie Evans Harris, VP of Ecosystem Development at The Impact Lab, discussed the perceived issue in finding individuals with the data skills needed to help organizations achieve business value and growth.  She noted that this is often a “signaling” problem.

Organizations are misguided when they focus on finding a single “data scientist” who can understand and communicate with the business while also finding and accessing data, testing hypotheses, building algorithms and models, and turning these into ongoing, executable frameworks. What organizations need to focus on is bringing together teams with the mix of skills that empowers everyone involved to move the organization forward. This is the approach noted by Booz Allen Hamilton in their Field Guide to Data Science.

It’s important to remember that the range of skills needed to work effectively with data exists across many individuals. The question is whether we are really looking for specialists, or whether we should take advantage of existing competencies (biologists, linguists, and others can do data science) and blend them with subject matter experts who understand the business, recognize business opportunities, and can present ideas in a way that makes sense for the organization.

My own topic at MITCDOIQ, Finding Relevance in the Big Data World, touched on an aspect of data literacy: how to decide what data is important, i.e. relevant, for a given business initiative. Wolf Ruzicka, Chairman of the Board at EastBanc Technologies, put it well in his June 1, 2017 blog post “Grow A Data Tree Out Of The ‘Big Data’ Swamp”: “If you don’t know what you want to get out of the data, how can you know what data you need – and what insight you’re looking for?”

A fundamental step in bringing data into the mainstream, then, is ensuring that the individuals working with the data establish a goal (whether generating new revenue, meeting compliance requirements such as GDPR, or reducing operational costs). Only with a business goal in mind can you test hypotheses, evaluate and measure data, and determine whether the data is fit for purpose. The results must be documented so they can be communicated through a repeatable data governance process. Such a process should start small, but it provides a way to build a practice, show success, and create business value while democratizing and measuring the data used and highlighting which data has value for which business purpose.

As Harris noted, it’s important to address change management services and processes, particularly to understand how people can use, interpret, and understand their data and their dashboards.  This means not only thinking about data literacy, but building data literacy!


From MITCDOIQ: Lessons Learned in Dell’s CDO/Data Governance Journey

Data-driven Success

As the CDO panel noted, having both data governance and data science teams together in the organization helps ensure that regulatory obligations are met while building for growth. It’s the underlying foundation needed to succeed at data-driven initiatives. Getting people fully bought in is hard and requires culture change, but that is part of the CDO’s work. The shift is evident even in the topics at MITCDOIQ – the focus is no longer on creating a CDO office but on sharing stories of organizational change and the adoption of fundamental data-driven processes and data literacy.

Discover the new rules for how data is moved, manipulated, and cleansed – download our new eBook The New Rules for Your Data Landscape today!


Syncsort + Trillium Software Blog

Microsoft OneNote goes mainstream after long adolescence

Microsoft OneNote is a powerful tool for content management and collaboration, but many think of it as a lightweight, personal notetaking tool rather than an enterprise-ready content and collaboration technology.

OneNote has been misunderstood in part because of rapid evolution and in part because of Microsoft’s Office 365 strategy.

OneNote’s walk in the Windows wilderness

Although notetaking and sharing apps are new to many people, OneNote has been available for over a dozen years. Microsoft OneNote was announced by Bill Gates in late 2002 as “an application designed to allow people to capture notes in one place and then organize and use them more effectively” and had its first release as part of Office 2003.

In many ways, the release was a break from the rest of Microsoft Office, with a flexible, page-centric notebook model that had more in common with web-based authoring than with the file model of Word, Excel and PowerPoint. OneNote also did away with the traditional File/Save model, automatically saving all changes. With OneNote 2007, Microsoft made it possible to seamlessly share notes among multiple people, anticipating the concurrent-authoring approach later seen in Google Apps, Microsoft Office Online, Quip and other tools. Overall, the original OneNote conceptual model required some unlearning for most users.

The biggest challenges for OneNote during its first years on the market can be summed up in a single word: Windows. OneNote was initially stigmatized by its close association with Microsoft’s Tablet PC. OneNote could have been a killer app for the Windows XP Tablet PC platform, but Tablet PC failed to gain widespread market success. As Microsoft’s platform group struggled with a series of problematic releases of Windows for PCs and mobile devices — e.g., Windows Vista and Windows Mobile — OneNote remained popular with a small, but enthusiastic user base.

There was also ambiguity about where OneNote fit relative to other Microsoft content and collaboration offerings such as SharePoint and Groove. Microsoft didn’t explain how OneNote related to wikis and other collaborative workspaces within SharePoint, and it filled the central content and collaboration role OneNote could have played through its 2005 acquisition of Groove Networks.

To add further complexity, OneNote was not easy to acquire. It was initially available as a standalone app, later as part of the Home and Student and highest-end enterprise editions of the Office suite (Office 2007), and finally available for all usage scenarios in 2015.

As Microsoft OneNote struggled, Evernote, a notetaking app supporting Windows and other leading platforms, rapidly attracted a large user base. Figure 1 shows a Google Trends snapshot of the relative interest in Evernote (in blue) and OneNote (in red) from 2004 to early 2013:

Figure 1: Google Trends of Evernote interest vs. OneNote interest (captured 7/2/2016).

Although Evernote was less feature-rich than OneNote and had — and still has — relatively primitive notebook-sharing capabilities, its freemium business model and early support for popular smartphone and tablet platforms made it a compelling alternative. Other content and collaboration apps came and went during this period, including Google Wave, demonstrating OneNote wasn’t the only app that struggled to win mainstream market momentum.

OneNote in Microsoft’s cloud-first, mobile-first era

There were some causes for cautious optimism in the OneNote community, including the introduction of OneNote Mobile for iPhone, OneNote for iPad and OneNote for Android. However, more dramatic changes came with Microsoft’s 2014 cloud-first, mobile-first strategic shift and its expanded commitment to sincerely support all leading platforms. The moderate success of Microsoft Surface eventually helped to expand OneNote’s market, delivering on much of the earlier Tablet PC vision.

OneNote was expensive and difficult to acquire for many years relative to alternatives such as Evernote, but it’s now freely available on a wide variety of platforms (see Figure 2).

Figure 2: OneNote platform support as of June 2016.

OneNote’s web support, as part of Microsoft’s Office Online service, is another aspect of making OneNote broadly accessible. With Microsoft OneNote Online, for example, anyone with permission to use a shared notebook can work from any internet-connected device using a modern browser, so collaborating via OneNote does not require an app download.

In another important milestone, a Mac version of OneNote was released in early 2014. It’s not yet as feature-rich as the full version of OneNote for Windows, but it’s a solid first release and likely adequate for many users who are primarily interested in OneNote for personal information management.

OneNote also had a starring role in Apple’s introduction of the iPad Pro in November 2015. As highlighted in “Microsoft Office Apps are Ready for the iPad Pro,” the original OneNote vision has perhaps finally come to full fruition on the iPad Pro. OneNote for Windows is still the most feature-rich version, fully leveraging the Surface as a tablet-laptop hybrid, but OneNote on the iPad Pro is a compelling, state-of-the-art combination.

Microsoft’s cloud infrastructure services have also matured since the Windows Live Mesh days. OneNote notebooks can now be shared via OneDrive as well as SharePoint – on-premises SharePoint Server or SharePoint Online. While notebook sharing with earlier releases of OneNote required several not entirely intuitive steps, the latest OneNote apps make it simple to share and to work with personal or shared notebooks across multiple devices.

Revisiting the competitive dynamics, Evernote has faltered since its popularity peaked several years ago and is now considered somewhat bloated and buggy by users. Evernote significantly increased its pricing in June 2016, making a premium Evernote subscription as expensive as a personal Office 365 subscription; a full-featured Evernote now costs as much as a license for Word, Excel, PowerPoint and OneNote (for Windows or Mac OS), along with a terabyte of OneDrive storage. OneNote, by contrast, is now free on all platforms. Figure 3 illustrates Evernote’s problems with a Google Trends snapshot extending from 2013 to mid-2016 (again with Evernote in blue and OneNote in red).

Figure 3: Google Trends snapshot covering Evernote’s decline in popularity (captured 7/2/2016).

OneNote still lags Evernote in Google Trends as of mid-2016, but is gaining momentum while Evernote strives to recover from several years of mixed results. There are competitors, including Apple Notes, Google Keep and Wunderlist (perhaps somewhat paradoxically acquired by Microsoft in mid-2015). None come close to matching the capabilities of OneNote as it continues to evolve.

Prominently positioned for continued growth

While for years OneNote seemed like a second-class citizen in the Office app family, Microsoft has expanded and clarified OneNote’s role. OneNote Class Notebook provides a powerful example of how OneNote can advance content and collaboration tools. Along with other Office apps, OneNote is gaining a unified multi-platform extension model: developers will be able to create OneNote add-ins that work across the Windows, Mac OS X, mobile and browser versions of OneNote.

OneNote’s role within the Office family is also much clearer these days, with shared OneNote notebooks automatically provisioned for each new SharePoint team site, Office 365 Group and Planner plan. While there was once ambiguity about OneNote’s role relative to SharePoint, for example, the modern SharePoint navigation menu in Office 365 indicates that OneNote is part of the SharePoint story (see Figure 4).

Figure 4: SharePoint Online and iOS SharePoint mobile navigation menus.

After a long and sometimes winding road since its first release in 2003, OneNote now reflects the best of the revitalized and cloud-first, mobile-first Microsoft and is likely to continue winning loyal users for years to come. If you’re interested in exploring a more modern approach to content and collaboration, compared to traditional document-centric alternatives, Microsoft OneNote is a great place to start.



ECM, collaboration and search news and features

Mainstream Hadoop users zero in on business benefits of big data


The need to prove the business benefits of big data applications and platforms has taken center stage in a growing number of mainstream organizations, and it isn’t always an easy task for IT and analytics managers.

For example, a big data deployment wasn’t a slam-dunk decision for Blue Cross Blue Shield of Michigan.

“For a lot of organizations like ours, big data has not yet become a core foundation of running the business,” said Beata Puncevic, director of analytics, data engineering and data management at the medical insurer. “When you go in and talk to a lot of [executives] about investing in a big data platform, it completely does not resonate with the challenges of the day.”

At Blue Cross and other healthcare businesses, those challenges include low profit margins that don’t leave a lot of money for technology innovation, plus resource and skill-set issues, and a relatively conservative culture, according to Puncevic. As a result, she and her colleagues had to put in some extra effort to get approval and funding for a Hadoop data lake that went into use in May.

Puncevic set up a team to develop an ROI framework for the data lake project, with metrics on the projected big data benefits based on before and after calculations. In building the business case, she also focused on three IT-related improvements: reducing data processing and management costs, enabling more insightful analytics, and creating a more agile and adaptable technology architecture.

In addition, Puncevic said she worked to obtain corporate-level funding for the initial rollout and subsequent project phases, “so we don’t have to worry about getting funding from individual business units” for different aspects of the big data initiative.

The strategy worked, and the Detroit-based insurer is on a path to fully constructing the big data platform over the next three to five years. The benefits of big data are “potentially tremendous” for the healthcare industry as a whole, Puncevic said at Hadoop Summit 2016 in San Jose, Calif., last week. Besides lower IT expenses, she cited an opportunity to reduce healthcare costs, while also improving the quality of patient care and boosting preventive medicine efforts — all through better analytics.

On the road to big data benefits

The value of big data is definitely real for Progressive Casualty Insurance Co. and its auto policy customers, said Brian Durkin, an innovation strategist in the company’s enterprise architecture group. Progressive uses a Hadoop cluster partly to power its Snapshot program, which awards discounts to safe drivers based on operational data collected from their vehicles. The insurer has handed out more than $560 million worth of discounts since launching the program in 2008, Durkin said in another conference session.

“It’s not some little science experiment that we’re running,” he said. “We’re fully invested in it, and it means a lot to our customers.”

To track participating drivers and calculate discounts, huge volumes of data get processed and analyzed in the cluster, which, like the one at Blue Cross, is based on the Hortonworks Hadoop distribution. Progressive has collected data on 2.4 billion trips, and it retains all of the information. For analyzing driving patterns to identify bad habits drivers can be alerted to, “it’s the older data that’s more valuable,” Durkin said. “So, we have to keep everything and analyze everything.”

Crunching the data requires a lot of processing resources, and Progressive has deployed various advanced analytics tools for its data scientists to use, including SAS, the R programming language and H2O. But business executives have been willing to foot the bill, said Pawan Divakarla, data and analytics business leader at the Mayfield Village, Ohio, insurer.

“It’s a very data-driven company,” he said. “We want people to have intuition and ideas, but they need to prove them out with data.”

Hadoop’s higher-value proposition

Retailer Macy’s Inc. runs a mix of BI and analytics applications off of a Hortonworks-based Hadoop system to support marketing, merchandising, product management and other business operations. On a daily basis, thousands of business users access hundreds of BI dashboards fed by the cluster — making it a key component in decision making, said Seetha Chakrapany, director of marketing analytics and customer relationship management systems at Cincinnati-based Macy’s.

“You don’t want to just see Hadoop as a cheap storage solution,” Chakrapany said. “Its value is much higher than that.”


Hadoop is still maturing and has “a lot of rough edges,” he cautioned, saying new users should expect some instability and missing IT management functionality. “If you come in with the typical IT mindset that this has to be rock-solid, it’s not going to be the right [technology].” Nonetheless, he said he thinks Hadoop “could truly be an enterprise data analytics platform” for Macy’s.

But Chakrapany isn’t taking the benefits of big data analytics and Hadoop-based BI applications for granted. Last year, he set up a team of evangelists to sell the merits of the big data environment internally and lobby more business units to use it. His group also tracks the business benefits generated by the Hadoop platform, in both qualitative and quantitative ways.

“We don’t want to just be counting the number of users, the number of queries, how much data [is analyzed] — those are just numbers,” Chakrapany said. “The key piece is, what has this done for the business?”



SearchBusinessAnalytics: BI, CPM and analytics news, tips and resources

Is PHP mainstream?

History of PHP

PHP stands for “PHP: Hypertext Preprocessor,” a recursive acronym. PHP is a server-side scripting language created by Rasmus Lerdorf in 1994. In 1997, Zeev Suraski and Andi Gutmans rewrote the parser from scratch to create PHP version 3.


PHP is essentially a server-side programming language designed for web development. It is also used as a general purpose programming language.

Since every programming language has its own strengths and weaknesses, we have organized the PHP story around ‘Why PHP’ and ‘Why Not PHP’.

All said, the disadvantages of PHP are easily outweighed by the advantages. However, as with most technologies, different schools of thought prevail: some people consider PHP to have direct disadvantages, while others see it as a language with a few indirect ones.

Current state of PHP

Today, PHP is installed on millions of servers worldwide. W3Techs – which provides technology analysis and surveys – reports that PHP is used by 81.9% of all websites whose server-side programming language is known.

PHP has a very wide presence which can be seen almost everywhere globally. A highly recommended language, it has powered many well-known applications.

The current version of PHP is 5.6, and the next major version, PHP 7, is due very soon. Some PHP functions have been criticized on security grounds; however, most releases include fixes for security issues, which shows that PHP is actively working to improve the security of its code.

 Why PHP?

PHP is an open-source language, which means there are no licensing costs for using PHP and its features. PHP is also quick and easy to learn, making it a programmer’s delight: anyone who starts using it can quickly become productive.

It supports roughly 23 different databases. Some of the prominent supported databases include IBM DB2, MongoDB (both the stable legacy Mongo driver and the newer beta MongoDB driver), mSQL, Microsoft SQL Server (via the Mssql and SQLSRV drivers), MySQL, Oracle (OCI8), Paradox, PostgreSQL, SQLite, SQLite3, and Sybase.
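
As a rough illustration, here is a minimal sketch of querying one of these databases through PHP’s PDO abstraction layer; the host, database name, credentials and the customers table are placeholders invented for this example, not details from the article.

<?php
// Minimal PDO sketch; the host, database, credentials and the "customers" table are placeholders.
$dsn = 'mysql:host=localhost;dbname=example_db;charset=utf8';

try {
    $pdo = new PDO($dsn, 'example_user', 'example_password');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION); // throw exceptions on errors

    // A prepared statement keeps the query safe from SQL injection.
    $stmt = $pdo->prepare('SELECT id, name FROM customers WHERE country = ?');
    $stmt->execute(array('US'));

    foreach ($stmt as $row) {
        echo $row['name'], "\n";
    }
} catch (PDOException $e) {
    echo 'Connection failed: ', $e->getMessage(), "\n";
}

Swapping the DSN prefix (for example to pgsql: or sqlite:) is usually all that is needed to target a different database through PDO, while drivers such as mysqli, OCI8 and SQLSRV expose their own dedicated APIs.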

What makes PHP unique?

PHP is uniquely positioned in many ways. You can deploy PHP scripts on many different server platforms, since the language runs on multiple operating systems and is supported by almost all web servers.

In addition, the PHP community has a large number of contributors and users, many of whom provide valuable information and support. There are also lots of good frameworks and CMSes built on PHP, and the language has a great number of available extensions and libraries.

PHP has been a natural choice for many because it supports rapid development. Any change in the PHP code is immediately reflected in the browser; you do not need to compile PHP code before it executes.
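
As a small illustration of that edit-and-refresh workflow, a script like the following (hello.php is a hypothetical file name) is interpreted on every request, so saving a change and reloading the browser is enough to see it:

<?php
// hello.php (hypothetical file name): interpreted on every request, with no compile step.
$visitor = isset($_GET['name']) ? $_GET['name'] : 'world';
echo '<h1>Hello, ' . htmlspecialchars($visitor) . '!</h1>';
echo '<p>Generated at ' . date('Y-m-d H:i:s') . '</p>';

During development such a script can be served with PHP’s built-in web server (php -S localhost:8000) and simply refreshed in the browser after each save.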

For companies that do not want to compromise on security, PHP allows code to be executed in restricted environments to make it more secure.

Why Not PHP?

PHP is widely used for low-traffic sites but has generally not been the first choice for sites with very heavy traffic, where scaling and managing a PHP application can be more difficult.

As PHP is an open-source language, anyone can view and change its source code and customize it, which can lead to code security issues. In such a scenario, the chances of your site getting hacked go up greatly unless you take proper care of security.

PHP is not well suited to very large-scale applications, as it is hard to maintain their modules and sources (PHP frameworks are now working hard to remove this disadvantage).

PHP is a ‘loosely typed’ language. For example, in PHP the strings “2000” and “2e3” compare as equal, because both are implicitly cast to a floating point number. In such contexts PHP can give you unexpected results, whereas stricter languages would report an error.
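
A short sketch of that comparison; the comments show what PHP prints, with the loose == operator treating both numeric strings as the number 2000 and the strict === operator also checking the exact string form:

<?php
// Loose (==) vs. strict (===) comparison of numeric strings.
var_dump("2000" == "2e3");   // bool(true)  -- both strings are compared as the number 2000
var_dump("2000" === "2e3");  // bool(false) -- strict comparison also checks the exact string
var_dump((float) "2e3");     // float(2000)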

Where can I use PHP?

PHP can be used for most small to large-scale dynamic web applications. It is also a good way to get a quick start with programming, since it is a simple and approachable language.

If project time and cost are limited, PHP becomes a natural choice. PHP can also be used with most CMS sites.

PHP Roadmap in future

The future PHP roadmap includes some key changes that will improve the usability of PHP. The proposed versions are PHP 5.7 and then PHP 7, with substantial differences between the two releases. The releases are not expected to affect existing extensions and code, and at this stage the plans are still RFCs (Requests for Comments).

PHP 5.7 is expected in September 2015, and PHP 7 at the end of 2015 or in January 2016. PHP 7 will remove old PHP 4 support and extension code.

The new PHP release will definitely enhance its extension capabilities. In the future roadmap, PHP will try to keep a clear distinction between Zend extensions and PHP extensions. There are also plans to refactor error handlers so that extensions can stack them in the way zend_execute can be stacked.

One key change will be that users will be able to use the array_* functions with objects as well as arrays.

There will also be changes at the engine, design and debugging levels, which means you will see a lot of benefits in future versions of PHP.

Conclusion

PHP is the most extensively used server-side programming language. Around 80% of websites use PHP today – more than ASP.NET, Perl, Python, Java, Ruby and all other server-side languages combined. You can develop anything from small sites to very large-scale applications in PHP.

With a high number of extensions and growing PHP usage, every new release of PHP brings security fixes, enhancements and improvements to existing features.

References:

http://php.net/


e-Zest | India | USA | UK | Germany | Europe

Big Data Turning Point? Yes, Hortonworks’ IPO Filing Signals the Mainstream Momentum of Hadoop

I have always been fascinated by the intersection of technology and Wall Street.  Having worked as Chief Administrative Officer for equities at Lehman Brothers and recently as the CFO/CAO for global technology at NYSE Euronext, I got a bird’s eye view of how technology is driving a fundamental shift in the global economy.

On Wall Street, we had been hearing for the past couple of years how Hadoop was going to fundamentally help solve one of the biggest challenges that companies face: getting value from “big data.” Venture capital investors had seen the vision and had invested significantly to build this ecosystem, but Wall Street was still trying to understand its potential.

Until very recently, companies were largely just testing Hadoop. We at Syncsort, with our advanced Big Data offerings, have firsthand knowledge of how our clients are deploying their new Hadoop environments. Clients are now moving quickly to put Hadoop into production, given the maturity of the product and the significant cost and process improvements it provides. This tipping point is critical in the evolution of the product.

Hortonworks’ S-1 filing in advance of an IPO illustrates that the broader financial community is starting to understand the power of Hadoop and the importance of the Big Data revolution. More importantly, it will give investors access to the first pure-play Hadoop vendor in the market. Many large public companies have Hadoop-based products, but those represent a tiny fraction of their business, and the only real options in the public markets were companies like Splunk and Tableau, which don’t really provide Hadoop exposure. With the Hortonworks IPO, investors now have their chance to be part of a market-leading, pure-play Hadoop distribution.

Hortonworks and the other distribution companies like Cloudera and MapR are competing in a huge potential market – over $50 billion by 2018, as forecast by Wikibon. And Gartner predicts that Hadoop will be in two-thirds of advanced analytics products by 2015. Investors are starting to recognize the importance of this secular opportunity.

Notably, this data evolution is not vendor driven – it’s customer driven. From the conversations we had at the recent Strata conference to various studies by IDC and Wikibon and our own survey data, enterprises are looking to move data to Hadoop production environments and are earmarking budget to grow those initiatives through 2015 and beyond.

What will 2015/2016 hold for Hadoop and other innovative Big Data technologies?  I predict more IPO filings, live customer stories hitting the business and trade press, and Big Data becoming mainstream for every IT department, as customers look to derive strategic value from all of their data.


Syncsort blog