Category Archives: Data Mining
In January, TIBCO and Automated Insights announced a partnership to bring automated narrative explanations to TIBCO Spotfire dashboards. The integration of Automated Insights’ natural language generation platform, Wordsmith, with TIBCO’s Spotfire data visualization and analytics software enables customers to reach conclusions and make decisions for their companies faster and more consistently.
With the integration of Spotfire and Wordsmith, narratives that summarize and explain dashboards can be automated, reducing the time between the creation of data visualizations and actionable insight. The unique methods of data aggregation and dissection provided by TIBCO’s Spotfire platform enable users to quickly turn findings into appealing, effective visualizations. By connecting Wordsmith, Spotfire users can enhance dashboards with written narratives that ensure every person in the organization takes away the same understanding from these visualizations.
Complementing Spotfire’s visualizations with written text makes important metrics clear and highlights them according to the needs of each department, or even of individual teams within departments. The narratives display directly in the Spotfire dashboard and are customizable to the company’s internal lexicon. If the user drills down to specific metrics or filters in the dashboard, the narrative changes dynamically to summarize the highlighted portions of the visualization.
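The dynamic behavior described above can be made concrete with a toy example. This is a minimal sketch of template-based narrative generation, not the actual Wordsmith API; every function and metric name here is hypothetical. The point is simply that the narrative is regenerated whenever the filtered metrics change:

```python
# Hypothetical sketch of template-based narrative generation (not the
# Wordsmith API): the text is rebuilt each time the dashboard's filters
# change the underlying metrics.

def narrate(metrics):
    """Turn a dict of filtered dashboard metrics into a short narrative."""
    direction = "up" if metrics["revenue_change_pct"] >= 0 else "down"
    return (
        f"{metrics['region']} revenue is {direction} "
        f"{abs(metrics['revenue_change_pct']):.1f}% versus last quarter, "
        f"led by {metrics['top_product']}."
    )

# Simulate a user drilling down: each filter change yields a new narrative.
q_all = {"region": "EMEA", "revenue_change_pct": 4.2, "top_product": "Widgets"}
q_drill = {"region": "EMEA-North", "revenue_change_pct": -1.5, "top_product": "Gadgets"}

print(narrate(q_all))    # narrative for the full view
print(narrate(q_drill))  # narrative after drilling down to one region
```

A real system adds grammar handling, synonym variation, and the company’s own lexicon on top of this basic template idea.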
Why is this important? It eliminates the discrepancies that naturally arise when different individuals within a company interpret the same visualization, providing uniformity in the understanding of the company’s key metrics. It also frees up time for data analysts to work on higher-value projects, because they no longer have to write individual explanations for every visualization, department, or unique combination of data filters in the dashboard.
Additionally, the integration of the two platforms incorporates a virtual data analyst into every Spotfire dashboard, leading to faster, smarter decision-making that doesn’t waste internal resources. Connecting the two platforms makes analytics teams more efficient, cutting out the manual reporting that used to sit between dashboard creation and interpretation. It also reduces the time between data aggregation in Spotfire visualizations and action taken by executives after deriving meaning from those dashboards. Whatever the user’s level of data analytics and interpretive expertise, the text will adapt to the viewer’s selected filters and adjust the narrative to surface the desired insights.
As a result, your best analysts will regain the time previously spent writing internal reporting documents for every scenario and reader, and can instead tackle high-value projects.
Now, TIBCO Spotfire users can use their best data analyst’s expertise in their dashboards to ensure cohesive understanding of key metrics across the company. This integration of TIBCO’s Spotfire platform with Automated Insights’ Wordsmith enhances visualization dashboards with written summaries of key metrics, improving the business intelligence industry’s efficiency and effectiveness in acting upon important insights.
Do you think that FitBit could sell more than 60 million devices and maintain 20 million+ active users without help from data? Without data, those devices are just pieces of plastic. The kind of data experience delivered by companies like FitBit is “over-the-counter”: easy to digest and easy to draw insights from. Like over-the-counter medicine, over-the-counter data doesn’t require a prescription, is easy to take, and can be found at any local pharmacy.
So what does it take to deliver those kinds of insights in your own B2B applications? There are several areas to consider that set Embedded BI platforms, like TIBCO Jaspersoft, apart from regular BI and analytics solutions:
- Multi-Tenant Architecture: Embedded BI has to be “embedded” into something else—and most applications today are web based and increasingly multi-tenant. So, look for tools with modern multi-tenant capable backends that install next to your apps.
- Single Sign On: Your application needs to seamlessly work with your embedded BI environment and provide a frictionless experience to users as they hop between application and analytics.
- Performant and scalable architecture: New applications tend to be cloud-based, deployed on container technologies like Docker, and work with modern scalable systems like AWS.
- Why traditional BI deployments fail
- With more than 73 vendors in the BI and analytics market, what is the best way to select an embedded BI vendor?
- How to gain competitive advantage with data driven applications
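The multi-tenancy point above can be illustrated with a small sketch. This is not Jaspersoft’s implementation, just the core idea: a single shared backend serves many customer organizations, and every report query is transparently scoped to the requesting tenant so no customer can see another’s data. All names here are made up for illustration:

```python
# Illustrative multi-tenant filtering sketch (hypothetical data and names,
# not any vendor's implementation): one shared row store, with every
# report query scoped to the requesting tenant.

ROWS = [
    {"tenant": "acme", "metric": "orders", "value": 120},
    {"tenant": "acme", "metric": "returns", "value": 7},
    {"tenant": "globex", "metric": "orders", "value": 85},
]

def run_report(tenant_id, metric):
    """Return only the rows belonging to the requesting tenant."""
    return [r for r in ROWS if r["tenant"] == tenant_id and r["metric"] == metric]

print(run_report("acme", "orders"))   # acme sees only acme's rows
print(run_report("globex", "returns"))  # empty: globex has no returns data
```

In a production platform this scoping happens below the report author’s view, in the metadata layer or the database connection itself, so dashboards never need per-tenant logic.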
In Turkey, Kuveyt Turk operates as a participation bank, recruiting other banks to participate and share risks and profits. It conducts business in five segments: retail; small business; commercial; corporate; and treasury, international, and investment banking. The bank wanted to efficiently increase its trading volumes and customer base.
“Our objectives were to integrate our platforms and internal systems and transform to algorithmic trading,” says Senior Trader Abdulkerim Ozcan. “We had four basic decision-making points (user friendly interface, less code development, cost, and customer support) that we compared for two vendors, Oracle and TIBCO. Oracle was the first option because we had history. But our industry challenges included continuously changing and emerging markets and increasing requirements for pricing, volumes of data and transactions, and speed. And we lacked industry standards. We analyzed both companies using the four criteria and decided on TIBCO.”
Read about the successes Kuveyt Turk Bank had with its TIBCO systems, including a 2X increase in FX volumes and customers, new efficiencies, and reduced costs.
Join the TIBCO customer reference program to have your business transformation story shared globally with the technology industry, and trade and business press. Your story in print, web, and video format can boost your status as a thought leader and increase awareness with technology leaders, helping you raise your company visibility and attract and retain top talent. Email firstname.lastname@example.org today!
Little can rival the rise of Artificial Intelligence (AI) from a niche concern to technology’s hottest theme, transcending the pages of the technical press to sit at the heart of mainstream culture.
A certain breed of robotics has been a dominant force in this rise, bringing machine learning to the masses in the form of chatbots and avatars that feature in our homes and customer service experiences, as well as in banking and call centers. In doing so, a previously little-understood concept has become a daily transaction woven into the fabric of daily life, with a helping of futuristic gloss that excites and intrigues users and fuels an appetite for ever more ingenious applications and deployments.
However, where there is hype there can also be misconceptions. As a space, AI has been particularly vulnerable to some tired narratives over the years. How often do we see the rhetoric around AI crudely pitched as little more than a threat to jobs?
While we can’t dismiss the march of automation, this prevailing focus entirely overlooks the role the technology has to play in augmenting human intelligence, bolstering the capabilities and scope of what people can do rather than usurping them. Indeed, AI has the potential to create far more employment opportunities and entirely new jobs, the type of which we cannot begin to envisage or predict at present.
It can also be easy to focus solely on the application, the eye-catching form, the quirky features, the high-profile brand association and overlook the data story happening behind the scenes which is breathing life into all the iterations we see. Perhaps it’s no surprise that in my capacity as CTO at TIBCO I put forward data analytics as a powerful illustration of what can be achieved when AI and humans combine to drive capabilities and elevate the offering. My role to date has afforded me the privilege of seeing time and time again how this two-pronged approach makes for the most effective outcomes, adding significant value and transforming process efficiency.
Without being explicitly programmed to do so, or the need for continuous human intervention, AI algorithms can make recommendations and find patterns and trends to draw deeper and smarter insight from massive data sets expanding the possibilities of big data analytics while making them far more accessible.
Crucially, algorithms can’t be relied upon to find a definitive answer. Instead their predictive capabilities provide the heavy lifting, laying the groundwork ready for the human brain to apply its creativity to evaluate each suggestion before ultimately deciding on the best solution to the most demanding of problems. Without having to create an algorithm for each application, the data scientist is freed up to tackle more profound aspects of machine learning, such as deep learning and neural networks.
This optimized and nuanced result of balancing human insight with algorithmic action becomes ever more critical as we consider AI’s expanding role in future society through applications that have the power to transform our lives on a deeper, more profound level beyond the cool gadgets, music recommendations and weather updates from the Cortanas, Alexas, and Siris. That isn’t to be dismissive of these kinds of interventions that can bring both fun and efficiencies into our lives, but ultimately AI has a greater purpose, as a tool for higher level goals across major societal issues from climate change to sustainable cities.
It is why we are starting to see the term empathetic AI trickle down into the parlance around the subject, the very antithesis of the man-versus-machine debate that has framed this narrative for so long. A more humanistic approach to AI will increasingly be employed to meet the demands of a society faced with an ageing population and to ease the strain on a cash-strapped health service and time-poor health professionals.
One example will be remote monitoring systems that enable patients’ health to be assessed away from traditional care settings. Then there are the systems which exploit AI’s superiority over the human eye/brain in terms of pattern recognition, with images of patient symptoms sent to an AI-driven app to read and recommend the best course of action. And for many living with chronic conditions, health professionals acting on data from wearable or external sensors can discuss patient care via the television set, laptop, tablet computer or smartphone, all of which brings more speed, accessibility and convenience to patient care.
And this is just a small snapshot of how much more intelligently we can do things, all by making analytics smarter with AI. Discover more about augmenting intelligence with TIBCO here.
TIBCO Software Scores In Top 4 Across All Use Cases
As the BI market continues to evolve from its IT-centric roots to being heavily driven by lines of business, customers are demanding more.
More employees being empowered to make data driven decisions.
More flexibility to respond to business needs.
More valuable insights from more sophisticated questions.
More capability in a cohesive, enterprise-grade platform.
The Gartner Critical Capabilities report scores 26 vendors in the BI & Analytics space on 5 different use cases:
- Agile Centralized BI Provisioning
- Decentralized Analytics
- Governed Data Discovery
- OEM or Embedded BI
- Extranet Deployment
TIBCO Spotfire has received one of the top four scores across all use cases, and received the highest score for Decentralized Analytics!
We believe our success can be attributed to our focus on developing a data discovery solution that’s meant to grow with an organization’s needs. While there are many products out there for creating simple charts, Spotfire offers analysts a comprehensive, scalable platform for business intelligence.
Don’t let your BI investment hold you back. Get the whole report and discover how you can #DoMoreWithSpotfire!
This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from TIBCO Software, Inc.
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally, and is used herein with permission. All rights reserved.
The first electronic business reports required a highly skilled technician to physically patch wires into a board. The output was then printed on paper. It took months to get a new report written and even longer to provide data consumers with answers.
The world of enterprise reporting has moved on tremendously from the days of tearing off the perforated edges of continuous feed paper! Here are the four major game changers.
Game changer 1: Not all data is flat
Since the 1970s, when SQL was created, the dominant data model has been columns and rows, which are easy to summarize for reports. In the early 2010s, an explosion of new databases was born, led by NoSQL databases like MongoDB and non-flat storage options like JSON. Reporting had to learn to deal with new data models: if SQL is about rows and columns, NoSQL is about flexible data structures. Can your reporting tools handle those new data structures?
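The flat-versus-nested tension can be shown with a small sketch (Python here purely for illustration). A MongoDB-style document nests related data, but a classic row-and-column report needs it flattened first; one common convention is dotted column names:

```python
# Sketch: flatten a nested, MongoDB-style JSON document into a flat row
# with dotted column names, the shape a classic report expects.

def flatten(doc, prefix=""):
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{name}."))  # recurse into sub-documents
        else:
            row[name] = value
    return row

order = {
    "id": 1001,
    "customer": {"name": "Acme", "country": "DE"},
    "total": 250.0,
}
print(flatten(order))
```

Arrays and more complex shapes are deliberately ignored in this sketch; real reporting tools map them to joins or repeated rows.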
Game changer 2: Ditching the printer for interactivity
When reporting first became “a thing,” people didn’t sit in front of a monitor; they shuffled papers around their desks, and “Bring me those numbers” was a physical task. As soon as business users started seeing their reports on screen, they started to ask, “Why can’t I sort this by date?” Reports printed to PDF are becoming a thing of the past; even for the most basic reporting tasks, some level of interactivity is required.
Game changer 3: Near real time
Reports used to be an “end of the month” thing—a push from IT vs a pull from business. Again, this has shifted, and at the heart of this shift are APIs. These days anything can be consumed via an API, which means expectations from users have evolved from a monthly cadence to a whenever-I-need-it cadence. This puts a bigger strain on reporting systems and the databases they connect to, leading to a rise in the idea of treating reporting like any other repeatable DevOps process.
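One common pattern for serving a whenever-I-need-it cadence without hammering the underlying database is to cache a recently built report for a short time-to-live. The sketch below is a generic illustration of that idea, not a feature of any particular product; all names are hypothetical:

```python
import time

class CachedReport:
    """Regenerate a report on demand, but reuse a recent result to
    protect the database from bursts of identical API requests."""

    def __init__(self, build_fn, ttl_seconds=60):
        self.build_fn = build_fn  # the expensive query against the database
        self.ttl = ttl_seconds
        self._result = None
        self._built_at = None
        self.db_hits = 0          # how many times we actually hit the database

    def get(self, now=None):
        now = time.time() if now is None else now
        if self._result is None or now - self._built_at > self.ttl:
            self._result = self.build_fn()
            self._built_at = now
            self.db_hits += 1
        return self._result

report = CachedReport(lambda: {"orders": 120}, ttl_seconds=60)
report.get(now=0)      # first call hits the database
report.get(now=30)     # within the TTL: served from cache
report.get(now=120)    # TTL expired: rebuilt
print(report.db_hits)  # 2
```

In practice the same trade-off appears as materialized views, scheduled pre-aggregation, or a caching layer in front of the reporting API.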
Game changer 4: A sea of visualization types
When reporting was printed on paper, visualizations were extremely limited. If it didn’t fit into a bar, pie, or line chart then it didn’t get visualized. This has changed a lot too—with the advent of extensible visualization libraries, like D3.js, data is no longer bound to basic visualization types. Modern reporting tools should be extensible to allow for new visualization types, for example this solar correlation map.
Want to learn more about these changes in enterprise reporting and how they apply to your current reporting systems? Download O’Reilly Media’s new book: “Is Reporting Dead? The Evolution of a Classic Information Delivery Strategy”, written by Teodor Danciu, Founder & Chief Architect of the most popular reporting tool on the planet, JasperReports.
TIBCO Software has been selected for inclusion in KMWorld Magazine’s annual list of “100 Companies That Matter in Knowledge Management,” alongside companies such as Apple, Adobe, and Google. This is the eighth year TIBCO has been named to the list. TIBCO is recognized for its complete integration platform, which allows companies to interconnect everything, and for a broad analytics platform that augments intelligence.
Over the past decade and a half, KMWorld’s distinction has recognized companies that successfully blend agile innovation in the knowledge management (KM) space with a commitment to customer satisfaction. The annual list is compiled by the KMWorld network of knowledge management practitioners, theorists, analysts, and end users, and is intended to spark a larger discussion of the knowledgeable companies leading the way in innovation and customer service.
“The banner of knowledge management spans a wealth of territory to encompass solutions that range in functionality from the tried-and-true to the futuristic. Those designated to this year’s list of KMWorld 100 Companies That Matter in Knowledge Management run the gamut of capabilities, but share such similar characteristics as innovation, ingenuity, usefulness and resourcefulness,” says KMWorld Editor Sandra Haimila. “Moreover, the companies on this list create solutions that help their customers turn vast amounts of data into usable knowledge that they can leverage to enhance collaboration, gain insights and achieve their goals.”
KMWorld is the leading information provider serving the Knowledge Management systems market and covers the latest in Content, Document, and Knowledge Management, informing more than 30,000 subscribers about the components and processes, and the subsequent success stories, that together offer solutions for improving business performance. KMWorld is a publishing unit of Information Today, Inc.
To learn more about TIBCO and its platform, visit this page to find out how to interconnect everything and augment intelligence in your organization.
There are a lot of people hard at work building APIs who are rightfully focused on just getting the API right and not looking beyond that to what comes next. Well, what comes next is putting something in place to manage those APIs you just created, especially if you intend to expose them as public APIs, or even just share them with some of your B2B partners.
You might think this is an easy decision: just go out and get an API management product, right? Well, not necessarily. Depending on what you intend to do with the APIs, you may not need a full-blown API management platform. It might be good enough to just drop an on-premise API gateway in place and call it a day. This is a very common approach, especially when your APIs are targeted at an internal set of developers and you don’t plan to enable access outside your firewall.
However, if you intend to share those APIs with outside developers, you must treat those APIs like products (which they are) and put an API management platform in place that allows you to productize your APIs, manage your community of users, expose a portal for API access, secure the APIs, and manage and measure how your APIs are being used. What I’m describing here is related to a relatively new role that has emerged, called the API Product Manager. Putting an API management platform in place without an API product manager is like installing software in a datacenter with no system administrator. I wouldn’t recommend it.
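Two of the gateway duties named above, securing APIs and measuring their usage, can be sketched in a few lines. This is a toy illustration of the concepts, not any vendor’s gateway, and the keys, partners, and endpoints are invented:

```python
# Toy sketch of two core gateway duties: securing APIs (key checks) and
# measuring usage for reporting/billing. Hypothetical keys and partners.
from collections import Counter

API_KEYS = {"k-123": "partner-a", "k-456": "partner-b"}
usage = Counter()  # (partner, endpoint) -> request count

def handle(api_key, endpoint):
    """Reject unknown keys; meter and forward known callers."""
    partner = API_KEYS.get(api_key)
    if partner is None:
        return 401, "invalid API key"
    usage[(partner, endpoint)] += 1
    return 200, f"forwarded {endpoint} for {partner}"

print(handle("k-123", "/orders"))  # (200, 'forwarded /orders for partner-a')
print(handle("bogus", "/orders"))  # (401, 'invalid API key')
```

A real API management platform layers much more on top of this: OAuth flows, rate limiting and quotas, developer portals, and analytics dashboards over exactly this kind of usage data.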
So, bite the bullet and get yourself an API product manager. You’ve made a decision on the appropriate product to manage your API program and you’ve hired someone who’s accountable for the successful implementation and adoption of your APIs. Now what?
Now you need to decide how you’d like to deploy your gateway or API management solution. If it’s a standalone API gateway and not part of a larger API management product, the answer is clear: you deploy it to your data center and run it on-premise. But you may have other options if the gateway is a component of a broader SaaS-based API management platform, such as Mashery (TIBCO’s API management offering).
All SaaS offerings for API platforms include an embedded gateway capability and most people just use it that way. Policy definition and enforcement—both design and runtime—are handled in the SaaS environment. But for some companies, in industries that have strict security requirements due to internal policies or a regulatory framework, API traffic management through the gateway must be done on-premise.
Trying to achieve both of these objectives using a SaaS product sounds like a bit of an oxymoron, but it’s not. This is a hybrid approach to API management, where some of your functionality runs in the cloud and some runs on-premise, all in a single interconnected environment managed through a single management console. When someone buys a Mashery subscription, they have the option of deploying this hybrid implementation. The piece of software that runs on-premise is called Mashery Local.
Mashery is one of just a few vendors that supports a hybrid gateway deployment option. Many of our customers like this approach, because they get the benefits of a SaaS environment, but with the greater control that comes with running software on-premise.
On March 9th, TIBCO will be hosting a webinar where we will explore the concept of implementing an API platform in a hybrid fashion. We’ll get into some detail on how it works, what the benefits are, and how to determine whether this is the right approach for you. Register for the webinar today.