Category Archives: TIBCO Spotfire

WEBINAR SERIES: Simplify and Scale Your Smart Data Discovery


Join TIBCO for this 3-part webinar series where we help you simplify and scale your smart data discovery. We’ll walk you through the concepts, then do a live example using the technique and conclude with a live Q&A session. All webinars will be recorded, so bookmark this page to return for the on-demand content.

Register today.

AI-Driven Data Discovery and Cognitive Search to Accelerate Big Data Insights with TIBCO Spotfire Smart Data Catalog
Live Date: July 26, 2017   

Looking for the right data across departmental silos is not a good use of analysis time. The TIBCO Spotfire® Smart Data Catalog, a new data connectivity and management solution, makes it dramatically easier to search and automatically blend related data across multiple disparate data sources, including databases, data lakes, and files. Once found and blended, the catalog makes it easy to provision data to Spotfire for analysis. Learn more.

Unlocking the Value from Unstructured Data Goldmine with TIBCO Spotfire Data Catalog
Live Date: September 7, 2017  

Historically, it has been very difficult to unlock the value of information trapped in unstructured data. Tune in to this webcast to see how the Spotfire Smart Data Catalog can extract meaning and sentiment from millions of documents and make the results available to analyze in an easy-to-use visual analysis environment. Learn more.

AI & Machine Learning Powered Data Discovery with TIBCO Spotfire Data Catalog
Live Date: October 11, 2017   

The TIBCO Spotfire Smart Data Catalog uses artificial intelligence and machine learning methods to make it easy to search for and find the right data. It uses natural language processing, synonyms, and customer-specific taxonomies to increase the accuracy of search results. Machine learning is used to auto-join data sources, leveraging a patent-pending technology. Learn more.
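Synonym expansion is one simple way such accuracy gains can work: each query term is broadened with known equivalents before matching. The sketch below is purely illustrative (the catalog’s actual implementation is not public, and the synonym table here is hypothetical):

```python
# Hypothetical synonym table; a real catalog would load a customer-specific taxonomy.
SYNONYMS = {
    "revenue": {"sales", "turnover"},
    "customer": {"client", "account"},
}

def expand_query(terms):
    """Expand each search term with its known synonyms to improve recall."""
    expanded = set()
    for term in terms:
        term = term.lower()
        expanded.add(term)
        expanded.update(SYNONYMS.get(term, ()))
    return expanded

def search(documents, terms):
    """Return documents that contain any of the expanded terms."""
    wanted = expand_query(terms)
    return [doc for doc in documents if wanted & set(doc.lower().split())]
```

A query for "revenue" would then also match documents that only mention "sales" or "turnover".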

Register today.


The TIBCO Blog

TIBCO Named Market Leader in Middleware-as-a-Service for 2017


Integration is the lifeblood of today’s digital companies, and TIBCO is at the forefront, providing some of the best and most comprehensive integration solutions out there today.

According to Ovum’s latest research paper, “Ovum Decision Matrix: Selecting a Middleware-as-a-Service Suite, 2017–18,” TIBCO is one of today’s best vendors offering middleware-as-a-service (MWaaS). Download the report free, compliments of TIBCO: Ovum’s MWaaS report.

TIBCO got high scores on all four of the vendor selection criteria:

— Cloud integration
— API platform (both creation and management)
— B2B integration
— IoT application integration

As you can see, there are many components to a company’s integration needs, and TIBCO is helping companies support all aspects of their integration scenario with a comprehensive MWaaS suite.

An MWaaS suite includes TIBCO’s popular iPaaS as well as “apiPaaS, mobile backend as a service (MBaaS) and other cloud-based integration services, such as data-centric PaaS and cloud-based B2B integration services.”

Many companies today are not ready to move fully into the cloud, whether due to regulations, the age of core applications, or simple reluctance, but they still need to connect everything to get a complete view of their customers. They have what is known as a hybrid scenario: a combination of legacy systems that reside on-premises and newer tools that reside in the cloud. It’s more appropriate to consider a company’s entire integration needs, rather than just narrow use cases.

Ovum says companies should be looking at a vendor’s overall MWaaS offerings as opposed to just narrowly focusing on their iPaaS solution. An MWaaS will more appropriately address a digital company’s entire spectrum of integration needs.

Ovum graded TIBCO as one of the best choices out there to help companies with their entire spectrum of integration needs. Now is the perfect time to look at your entire integration environment and really assess your needs from the ground up. TIBCO can help. Contact us today.


Learn How to Integrate in Minutes with TIBCO Cloud Integration


In order to stay competitive, your company needs to become a fully connected, integrated organization; in other words, a digital business: an organization where technologies offer innovative ways to connect, collaborate, conduct business, and build bridges between people and devices.

The reality is, this transition is proving harder than expected for many. A lot of companies are finding that their systems can’t connect to today’s cutting edge, all-important cloud services because they are on-premises, behind firewalls, or legacy, and therefore not built with cloud in mind. What to do? How can your business remain agile and competitive in today’s landscape and easily and quickly transition into a digital organization without heavy costs?

Come check out TIBCO’s Cloud Integration in action and see how easy it is to integrate everything. The TIBCO Cloud Integration live demo series, starting July 11th, will run through the end of November. With this powerful yet easy to use integration tool, you don’t have to break up with your legacy applications and systems to transform into a digital company. TIBCO Cloud™ Integration allows users to quickly and easily connect all applications, systems, mobile devices, and partners both inside and outside your company’s network, regardless of how old they are or if they are cloud-native or not. And, you don’t have to be an IT expert to accomplish this. TIBCO Cloud Integration allows any user—from marketing, to development, to IT, to office hero—the ability to access integration functionality, all from a web browser.

See TIBCO Cloud Integration come to life and how it can:

  • Automate everyday workflows
  • Give you quick time to results
  • Provide a low barrier to entry (both with cost and ease of use)
  • Connect all of your SaaS apps without IT

These webinars are designed to be highly interactive, so bring your questions and use cases. If the first demo doesn’t work for you, sign up for a date that suits your calendar. We recommend you start your TIBCO Cloud Integration trial so you can follow along in real time. Looking forward to seeing you there!


3 Musketeers—IoT, Middleware, APIs


Over the next few years, anything that you can dream of being connected to the Internet will be, with an end goal of performing more automated, efficient tasks. The Internet of Things (IoT) has gained significant momentum in the last couple of years and is becoming one of the fastest growing technologies in the enterprise space. IoT will massively increase the amount of data available for analysis by organizations.

Middleware for IoT acts as a glue connecting the mixed domains of applications communicating over heterogeneous interfaces.

An IoT integration platform is required to collect, aggregate, and transform data from multiple sources before it can be analyzed. There should be one tool performing all the actions rather than having multiple heterogeneous services working in silos.

It is difficult to define and enforce a common standard among all the diverse devices belonging to different vendors and domains in IoT. With no common standards, the need for an integration platform is even more prominent: it acts as a common layer among the diverse sensors, devices, and applications, supporting numerous communication protocols, data types, and transport requirements.

Also, you need a digital fence around every IoT system, and middleware has a role to play in security as well. The integration platform should incorporate access control and data authentication mechanisms.

APIs should be the standard way of communication between devices and sensors. Enabling APIs in an IoT solution is very critical, as you do not want your IoT solution to be closed and not interoperable.

APIs can act as a standard gateway for device communication. They can be used for device registration and activation, providing a management interface for the sensors, and exposing a device’s capabilities. In an IoT world, a device can seldom perform its function on its own; it needs to talk to other devices and sensors. Having a middleware and API management layer to facilitate this communication is key.
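As a rough illustration of the registration and activation flow such an API layer might expose (a hypothetical sketch, not an actual TIBCO API):

```python
import uuid

class DeviceRegistry:
    """Minimal, illustrative device registry behind an IoT management API."""

    def __init__(self):
        self._devices = {}

    def register(self, name, protocol):
        """Register a device and return its assigned ID."""
        device_id = str(uuid.uuid4())
        self._devices[device_id] = {
            "name": name,
            "protocol": protocol,   # e.g. MQTT, CoAP, HTTP
            "active": False,
        }
        return device_id

    def activate(self, device_id):
        """Activate a previously registered device."""
        self._devices[device_id]["active"] = True

    def capabilities(self, device_id):
        """Expose the device's recorded metadata to API clients."""
        return dict(self._devices[device_id])
```

In practice each of these methods would sit behind a managed, authenticated API endpoint rather than an in-memory dictionary.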

Middleware and APIs are fundamental enablers of the Internet of Things; without these two, the unique characteristics of IoT can easily lead to catastrophe, especially when it comes to:

—Devices supporting multiple communication interfaces and standards
—Monitoring and managing multiple IoT devices
—Performance and scalability
—Developer and device registration and security
—Visibility and analytics

Gone are the days of stand-alone, rigid IoT solutions that cannot evolve over time. In a fast-moving world, applications require adaptability and open integrations. Combining IoT with middleware and APIs will lead to faster, smarter, scalable, and flexible solutions.

“All for one and one for all, united we stand divided we fall.” 

This is the motto of the French musketeers Athos, Porthos, and Aramis, who were inseparable and stayed loyal to each other through thick and thin in Alexandre Dumas’ The Three Musketeers. The same holds true for IoT solutions: together, IoT, middleware, and APIs can be the Athos, Porthos, and Aramis of the enterprise world.

Learn more about the TIBCO BusinessWorks Hybrid Integration Platform and check out TIBCO Mashery API Management Platform.


Harnessing Big Data to Build a Smarter Public Transport System


As first appeared in TODAY:

Public transport plays a pivotal role in the daily lives of Singaporeans. The vast majority of citizens depend on buses and the MRT to get them to work and home again, and to ensure they can move around the island efficiently and inexpensively throughout the day.

The government is faced with a constant dilemma—how to square the circle of a growing economy and population, citizen aspirations for car ownership, and extremely constrained land area. The policy answer, rightly, has been to expand and improve public transport, in ways that minimize land use and provide travellers with convenient, reliable, comfortable public transport options, available everywhere. The Ministry of Transport’s aim is for commuters to make use of public transport for 75% of morning peak hour journeys by 2030.

The challenge is in balancing supply against demand. Considerable growth in utilization and demand for the public transportation networks—fuelled by a growing economy and population, and the success of measures to reduce private transport—is happening right now. On the other hand, the completion of new MRT lines, both under construction and on the drawing board, is still some years in the future.

This situation is creating a social and economic problem today. Singaporeans are demanding better service, fewer breakdowns and ways to cope with the overcrowding during peak periods. The local authorities and service providers are struggling to optimize services, fleets, and manpower as well as manage contingencies.

The government has broadly shown itself to be tech-savvy, and open to the solutions to multiple issues offered by big data. The transport sector is no exception. In a speech at the Committee of Supply debate, Second Minister for Transport Ng Chee Meng said that big data and analytics will be used to improve train reliability as well as help public bus operators to track the location of their buses.

Ultimately, the objective is to integrate private transport data as well, leading to aggregated, comprehensive, and real-time data on road traffic.

Data analytics is clearly key to resolving Singapore’s transport woes; how can the science be applied?

A project at MIT’s Department of Urban Studies & Planning points the way. A number of papers published under this project show the need for every element of the public transport eco-system to be interconnected, to provide critical real-time data. Analysis of this data can serve to augment intelligence and manage anomalies in real-time.

Predictive maintenance, for example, can be scheduled to minimize vehicle breakdowns, the great bugbear of commuters. Data feeds on the areas and timings of regular traffic congestion can allow for the planning of more efficient bus routes as well as managing peak-period congestion at bus stops with more frequent services for popular routes.

This may sound esoteric but it is really not rocket science, and other countries are already using data analysis to help manage their public transport issues. What can we here in Singapore learn from best practices around the world that are alleviating these challenges for transport authorities, service providers, and consumers?

According to a report by McKinsey & Company, collection and strategic use of information can improve forecasting and help to nudge behavior in ways that improve the reliability of transport infrastructure and increase its efficiency and utilization.

The report cites as an example the fact that Israel has introduced a 13-mile fast lane on Highway 1 between Tel Aviv and Ben Gurion Airport. The lane uses a toll system that calculates fees based on traffic at the time of travel. To make it work, the system counts the cars on the road; it can also evaluate the space between cars to measure congestion. This is real-time pattern recognition of a very high order. The information is then put to use in a way that increases “throughput,” or the amount of traffic the road can bear. If traffic density is high, tolls are high; if there are few cars on the road, charges are cheap. This not only keeps toll revenues flowing, but also reduces congestion by “steering” demand.
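The pricing logic behind such a scheme can be sketched simply: measure traffic density, then map it to a toll that rises with congestion. The numbers below are purely illustrative; the report does not describe the actual Israeli tariff model:

```python
def dynamic_toll(cars_per_km, free_flow=20.0, jam=100.0,
                 min_toll=1.0, max_toll=15.0):
    """Map measured traffic density to a toll that rises with congestion.

    Below free-flow density the toll stays at the minimum; at jam density
    it caps at the maximum; in between it scales linearly.
    """
    if cars_per_km <= free_flow:
        return min_toll
    if cars_per_km >= jam:
        return max_toll
    frac = (cars_per_km - free_flow) / (jam - free_flow)
    return round(min_toll + frac * (max_toll - min_toll), 2)
```

With these illustrative parameters, a half-congested road (60 cars/km) would be priced midway between the minimum and maximum toll, which is exactly the "steering" of demand the report describes.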

Holland, too, is benefitting from the application of big data analysis. Dutch Railways is the principal passenger railway operator in the Netherlands, providing rail services on the Dutch main-rail network and international services to other European destinations. Running these vast networks gives Dutch Railways access to huge amounts of data, collected through intelligent train technology, ticketing systems, travel information, real-time monitoring, and services for maintenance and control unit staff.

Until now, train suppliers delivered all this IT, so each type of train had its own IT environment—making it difficult to work together and maintain each system. Dutch Railways had a vision to integrate all this information to deliver more reliable and better services to customers. Using streaming analytics, in-memory computing, integration, and messaging software, Dutch Railways is now able to provide real-time information about train services and maintenance scheduling. Commuters are also able to use a travel planner application to ensure a seamless and prompt commute.

The clear conclusion is that digitizing infrastructure networks can improve forecasting, promote reliability, and increase efficiency.

So, what are the next steps?

The Singapore government has already taken the first step to open up and share the data being gathered on transportation with start-ups and entrepreneurs, allowing them to explore and use it for the kind of innovation that the private sector excels at. The sharing of transport data among all the stakeholders—transport operators, system providers, and citizens alike—will, in our opinion, speed up the development of practical solutions to reduce congestion, improve waiting times, and overcome commuter inconvenience. Embracing technology in this area will not only improve our daily lives, but serve as an important step in our journey toward actualizing the Singapore Smart Nation Vision.


Tips & Tricks: Registering a New Data Function

Over the past few months, I have received numerous requests for information regarding the topic of advanced analytics. Many of these requests center around the concept of how to register a new data function using TIBCO Spotfire. Although this capability has been available in Spotfire for quite some time, it appears that many people are now discovering it for the first time as they approach business problems that are increasingly more complex.

Given the renewed interest in this topic, I have decided to post an example of how to register a new data function in Spotfire. I encourage you to use the data and code files included at the bottom of this post to follow along.  I believe you will find that the process is much simpler than you might expect. Once you master this process, an entire new vista of advanced analytics will be available to you.

Problem description: Register a new TERR data function that will allow any user to fit a Loess Smoothing function to a set of data. In this example, I assume you are using the Spotfire Analyst Client with the Spotfire Server up and running.

1. Open the file named “fuel_data.dxp”. This dataset contains information about various types of cars, their type, weight, displacement, fuel, and gas mileage.


2. Create a simple scatter plot to show the relationship between car weight and gas mileage. Clearly, there is a relationship between weight and mileage: heavier cars have lower gas mileage.


3. After reviewing the scatter plot, we decide that it might be helpful to draw a smoothing curve through the points to help see overall trends or relationships between these two variables. We decide to create a new Data Function that will calculate a Loess Smoothing Function.
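Loess works by fitting a small weighted regression around each point, with nearby points weighted most heavily. As a rough illustration of the idea (a simplified pure-Python sketch, not the TERR implementation this tutorial registers):

```python
def tricube(u):
    """Tricube kernel: nearby points get weight near 1, distant points get 0."""
    u = abs(u)
    return (1 - u**3)**3 if u < 1 else 0.0

def loess_smooth(x, y, span=0.5):
    """Fit a weighted straight line around each x[i]; return the fitted values."""
    n = len(x)
    k = max(2, int(span * n))  # size of each local neighbourhood
    fitted = []
    for xi in x:
        # bandwidth = distance to the k-th nearest neighbour of xi
        h = sorted(abs(xj - xi) for xj in x)[k - 1] or 1e-12
        w = [tricube((xj - xi) / h) for xj in x]
        # weighted least-squares line through the neighbourhood
        sw = sum(w)
        swx = sum(wi * xj for wi, xj in zip(w, x))
        swy = sum(wi * yj for wi, yj in zip(w, y))
        swxx = sum(wi * xj * xj for wi, xj in zip(w, x))
        swxy = sum(wi * xj * yj for wi, xj, yj in zip(w, x, y))
        denom = sw * swxx - swx * swx
        if abs(denom) < 1e-12:
            fitted.append(swy / sw)  # degenerate case: weighted mean
        else:
            slope = (sw * swxy - swx * swy) / denom
            intercept = (swy - slope * swx) / sw
            fitted.append(intercept + slope * xi)
    return fitted
```

The TERR `loess()` function used in the next step does the same kind of local fitting, with a more sophisticated algorithm.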

Begin by selecting “Register Data Functions” from the Tools menu.


4. Give your new Data Function a name. In this example, I have named it “My Loess Smoothing Function”. You can name it anything you wish. Enter the TERR code shown below (see TERR_Script.txt).


result.loess = loess( y ~ x )

result.table = data.frame( x = x, y = y,
    smooth = result.loess$fitted )

5. Click the “Input Parameters” tab and then click “Add”.


6. Complete the dialog box as shown below. Click “OK”. This action sets up the x-axis parameter that will be fed into the function.


7. Click the “Input Parameters” tab and then click “Add”.


8. Complete the dialog box as shown below. Click “OK”. This action sets up the y-axis parameter that will be fed into the function.


9. Click the “Output Parameters” tab and then click “Add”.


10. Complete the dialog box as shown. This will set up the result output table that will contain the smoothed data function.


11. Click the “Save As” button. Select the directory to put your new Data Function in. Click “Save”. When finished, click the “Close” button. You have now created a new Data Function which can be used at any time to plot a smoothing function over top of a set of data.


12. Open a new page in Spotfire. Add a line chart as shown.


13. Select “Data Function” from the “Insert” menu.


14. Find your new Data Function by name. Highlight it and then click “OK”.


15. Map the “x” variable to the weight column in your dataset as shown below. Be sure to check the “Refresh function automatically” and “(Active filtering scheme)” options as shown.


16. Map the “y” variable to the mileage column in your dataset as shown below. Be sure to check the “Refresh function automatically” and “(Active filtering scheme)” options as shown.


17. Click the “Output” tab and then map the “result.table” output as shown. When finished click “OK”.


18. Back on the line chart page, change the data source for the line chart to be “result.table”. This is the smoothing function output you just created when you inserted the data function.


19. Using the Data panel, place the “smooth” and “y” values on the vertical axis. Place the “x” values on the horizontal axis.


20. Remove the “Sum” aggregation from both variables on the vertical axis.


21. Go to “Properties” for the line chart. From the “Appearance” option choose to display markers. You will now see the original mileage and weight data with a Loess Smoothing Function plotted over top. As you filter data from the original “Fuel Data” data table the function will automatically recalculate and the graph will be updated in real time.

Congratulations, you just registered your first Data Function!

Try this and TIBCO Spotfire’s other great features in a free software trial. Start yours today.


WEBINAR SERIES: The Building Blocks of Data Science


Join us for this 5-part webinar series solving real data science questions using Spotfire and TERR. We’ll walk you through the concepts and then give live examples using the techniques. All webinars will be recorded, so bookmark this page to return for the on-demand content. Register today.

Beginning Topics: Getting Started with Statistica
Live Date: 7/12/2017

What is the value of a star on Yelp? In this webinar, we’ll answer real data science questions like this using Spotfire and TERR to make smarter decisions. For the cornerstone of our series on data science in Spotfire, we are going to begin with linear regression and how to use it. Learn more.
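Ordinary least squares is the workhorse behind questions like this. A minimal sketch of fitting a line to hypothetical star-rating vs. revenue data (the data and names are invented for illustration; in TERR you would simply call `lm(y ~ x)`):

```python
def fit_line(x, y):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical data: average Yelp stars vs. monthly revenue (in $k)
stars = [3.0, 3.5, 4.0, 4.5, 5.0]
revenue = [20, 24, 28, 32, 36]
intercept, slope = fit_line(stars, revenue)
# slope is then the estimated revenue value of one additional star
```

The slope coefficient is exactly the kind of quantity the webinar question asks about: the estimated change in revenue per extra star.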

Beginning Topics: Analytics: From Insight to Action with Statistica
Live Date: 8/16/2017

As our companies try to make sense of the data deluge, analytic tools have evolved to radically simplify the data aggregation, preparation, model building, and deployment process. This introductory webinar is geared towards beginners who want to learn more about the newest addition to the TIBCO Insight Platform—TIBCO Statistica—and how you can quickly create and deploy analytic models to drive change within your organization.

We will also share how TIBCO Statistica, Spotfire, and Streambase work in concert to get the most out of your data. Learn More.

Intermediate Topics: Inline TERR and Data Functions
Live Date: 9/27/2017

Do seasons affect my business? In this webinar, we’ll answer real data science questions like this using Spotfire and TERR to make smarter decisions. For our next topic, we are going to introduce TERR, the enterprise version of R. This will enable us to create data functions, and in this scenario we will look at smoothing, a technique for understanding trends. Learn more.

Advanced Topics: Classification with Spotfire
Live Date: 11/14/2017

How can I predict my customer base? In this webinar, we’ll answer real data science questions like this using Spotfire and TERR to make smarter decisions. For our next webinar, we’ll be managing a hotel’s marketing group, using classification methods inside of Spotfire. We’ll learn the basics of the technique and then look at examples of implementation in TERR. Learn more.

Advanced Topics: Machine Learning in Spotfire
Live Date: 1/24/2018

Can I predict the future price of my home? In this webinar, we’ll answer real data science questions like this using Spotfire and TERR to make smarter decisions. We’ll take the role of a real estate firm in this webinar, examining how to prospect for development projects.

Register today!


Melbourne Airport and Essent on Success with TIBCO


Melbourne Airport Launches Digital Transformation to Become a Smart City

Melbourne Airport expects to service about 64 million passengers by 2033, double today’s volume. “To service this growth, we need to use all our assets in a smarter fashion, and we think that technology, amongst other things, will allow us to move from a mini city to a smart city, and provide major value-add for our customers and the business,” says Vic Raymond, ICT Planning and Delivery manager.

Andrew May, senior consultant with Nukon, says, “We needed a platform that could support the strategic vision, not just now, but something that is going to be evolving 10 years from now, a foundation that is vendor agnostic and allows connecting a large number of devices, any device whether an IT system, a control system, or another type of asset, and then present that in a user friendly way. TIBCO technology is a core enabler.”

Learn more about the project’s architecture using TIBCO StreamBase, TIBCO Live Datamart, TIBCO ActiveMatrix BusinessWorks, and TIBCO Enterprise Message Service, and its timing, scope, and progress.

Essent Supplies Self-service Energy for Customer and Employee Satisfaction

Over the last 10 years, the energy market in the Netherlands has evolved dramatically. “Gas and electricity consumers in the Netherlands got the right to switch suppliers, the government unbundled retail from power generation functions, Essent was acquired by RWE, and we began offering services to the broader European market,” explains Senior Enterprise Architect Niels Wolf. “Like every organization, our goals are to increase revenue and make a profit, but also to give our customers self-service capabilities. We needed to adapt very quickly to what the customer wants. If you don’t adapt to their needs and wishes, your competitors will, and you will be out of business before you know it.

“From a technology perspective, we needed fast performance, and that meant decoupled systems for optimal flexibility.”

Learn how, relying on the TIBCO platform, Essent solved its legacy problems and launched cost-cutting customer self-service, fast innovation through partnerships, and fast performance that is increasing agility and bringing in IoT/sensor data for energy supply planning.


Announcing Jaspersoft 6.4: Embedded BI and Reporting Evolved


I’m pleased to announce our latest release, TIBCO Jaspersoft 6.4. Based on popular demand, the release has more than 100 quality, feature, and performance improvements across the entire product set. Capabilities span from more web-based reporting, enabling less technical users to create more formatted content, to multi-level hyperlink actions through dashboards, to new import-export features that streamline the management of content on the JasperReports Server.

Visualize.js continues to drive a modern, immersive user experience for application developers. We’ve added input control components and the ability to inherit CSS for styling web page content. Our product team continues to create a substantial body of new sample content to better enable developer success. Be sure to check out our latest content:

  • New live Visualize.js API samples guide
  • Visualize.js API samples on GitHub
  • Full sample application showcasing embedding on GitHub

Report developers will also find continued usability improvements in Jaspersoft Studio based upon popular demand from our commercial customers. The new release has completely re-worked UIs for defining HTML5 charts, with hundreds of properties at your fingertips for creating highly customized report content.

As always, we want to hear your feedback and see how you are using Jaspersoft to fit your embedded reporting and analytics needs. Comments? Drop me a line at


Data Reform to Drive Digital Transformation


As first appeared in Financial IT:

The General Data Protection Regulation (GDPR) narrative may often be framed around security breaches, but this headline-grabbing angle perhaps overlooks the new legislation’s broader role as a catalyst to support the digital transformation agenda at many organizations.

With reforms to the handling of personal data having far-reaching consequences and bringing a new legal framework to the UK, the EU, and across the globe, the stakes are high. At 4% of annual global turnover, potential fines for non-compliance dwarf the usual regulatory penalties, and in turn present the very real prospect of extinction for businesses that do not get a handle on what is expected.

It’s why visionary organizations are embracing the challenge by looking beyond box-ticking compliance to capitalize on the opportunity to align legislative adherence to their broader digital strategy.

In essence, this heightened governance will demand much more when it comes to the processing of consumer data; greater responsibility, accountability, and robust evidence of privacy controls, and for many this will mean a rethink of their entire ethos towards data protection. Now, with the compliance deadline looming, the need to review existing business processes and remedy any gaps intensifies. Experts predict that even the most digitally mature organizations could still be caught napping if they don’t make fundamental changes to their data infrastructure by May of next year.

Ultimately, those that use this development as an impetus for more innovative and ingrained digitalization will be better placed to foster an organizational culture of continuous improvement to enrich their offering and stay competitive. This will be critical to serve customers whose growing power is cemented by the new legislation. Now, with greater control over their data—including the right to request its deletion if there’s no compelling reason for an organization to carry on processing it—customers have much higher expectations as to how their data should be used. As well as enhanced transparency and protection around personal information, a more value-driven data exchange fuelled by innovation has increasingly become the norm in this relationship—be they business or private individuals, or partners in the broader supply chain.

Furthermore, with the success of the reforms resting on the extent to which consumers buy into the increased sharing of their personal data—albeit in new and controlled ways—the onus is on businesses to drive that buy-in. Yet while consumer trust is paramount, it remains elusive, as borne out by the most recent ICO survey, which found that 75% of adults in the UK don’t trust businesses with their personal data.

Overcoming an inherent suspicion means that all core processes must be underpinned by accountability, accuracy, and transparency, with a demonstrable understanding from the business of the risks that it creates for others and how these can be mitigated.

As a result, savvy organizations are turning to the solutions which help them negotiate this delicate balance between consumer data analysis, data governance, and privacy protection, without the need for compromise. It calls for the kind of astute data management which provides an up-to-date and single view of the customer with accessible reporting dashboards, while meeting the heightened security and privacy demands as their personal financial information is shared more broadly with a range of third parties.

Built-in governance features to control business process content become a critical component for auditors who require a 360-degree review of activity.  With greater visibility and control of operations, business processes can be consolidated and standardized to ensure best practice and absorb the new regulatory framework for GDPR, adhere to new policies, and ensure demonstrable compliance, all while driving a culture of continuous improvements and added value.

Other technologies and solutions aiming to drive greater collaboration internally will also reap dividends, as teams are equipped to define, simplify, share, and change their processes in minutes, not days, leave feedback, and approve changes. The result is that major operational changes around digital transformation can be communicated to the entire organization in a way that minimizes business disruption, reducing inefficiency and risk.

Furthermore, this technology plays a central role in data management and process change education for employees—steering them through the quagmire of user rights around consumer data, personal, and financial information. As we prepare for the seismic shift that GDPR represents, establishing a deep understanding across an organization is vital to facilitate the effective responses to areas of deficiency identified and actions to address them, including evidential, demonstrable progress to the regulator, when requested.

For savvy organizations GDPR can therefore unite the digitalization and compliance agendas, driving initiatives of mutual benefit for both the Chief Digital Officer and Chief Data Officer, the organization and ultimately, the consumer.

Discover more about TIBCO Spotfire‘s data analysis tools and try them for yourself, free.
