Monthly Archives: October 2016

Pentaho 7.0 a further instalment in the pursuit of excellence

As readers of my somewhat irregular contributions to the Bloor website will know, one of the products I keep coming back to in order to explain its latest steps is Pentaho. I believe that BI, or business analytics as Pentaho prefers to call it, is an ever more important component of an enterprise architecture, to the point where I wonder how the executives of some organisations actually manage when they have very limited insight into what has happened and almost no visibility of what might happen given current trends. Achieving a comprehensive BI solution requires more than just reporting and analysis tools; it is as much about data management as it is about reporting, and as such the stack required to realise the potential is of necessity complex and varied in its nature and content. Many organisations create such a stack by combining elements of several vendors’ offerings, often with very clunky interfaces as a consequence. Other vendors offer what at first sight appears to be a range of products that assemble into a comprehensive, integrated whole, but which in reality is also a collection of acquired elements with a GUI trying to mask what lies underneath.

Pentaho offer one of the most comprehensive toolkits and, in my opinion, one of the best integrated, but even then it can at times still be difficult to achieve what is required simply, and as we know complexity results in cost and is likely to introduce error. I am therefore delighted to be able to report that with Pentaho 7 they are aiming to simplify and streamline without any loss of capability, by abstracting above the level of the complexity, letting the technology address the complexity while the BI professionals focus on providing functionality and insight, i.e. the high-value-adding element of BI.

Pentaho, as I have said, is a very comprehensive solution: it handles the data pipeline from capture and integration through to the delivery of analytics. Now the user experience is being vastly improved, with the interfaces between the elements made seamless and the whole experience far smoother. At the same time, the arbitrary distinctions of the traditional pipeline, in which engineering precedes preparation and analytics comes last, are replaced by a pipeline which recognises that analytics is required at all stages, not just when the data is at rest but as it passes through any of the stages.

The presentation of results is also automated to ensure that charts are presented in ways that render them intelligible and easy on the eye. This is really useful, because we all get fed up when presented with a chart which, instead of replacing a thousand words, requires two thousand to explain what it is attempting to represent! I think the key to what Pentaho are achieving is that they are moving the focus away from the tools and towards the data: the data is driving what is happening. There will of course be technicians who decry this and claim that such moves are deskilling them, but the truth is that we need to fundamentally change the productivity of the BI process, shifting the focus away from the low-level technology and higher up the value chain, so that we get better results faster and concentrate more on using those results to change the business for the better. Actionable insight has to be timely as well as accurate to be of maximum value.

Other major enhancements include making Hadoop easier to deal with, again by abstracting above the level of the complexity. Pentaho have extended their Kerberos integration, bringing more secure big data integration within reach of enterprise users far more readily, and there are further enhancements to security for Hadoop. They have provided a drag-and-drop interface for developers to coordinate and schedule Spark applications, making it easier to apply those applications to a wider variety of application types and environments. Data preparation right through to visualisation is catered for in one tool, enabling a broader audience to explore a broader set of data in a more meaningful way as the technical barriers are removed. A further enhancement which looks to be really useful is metadata injection, where you define a template transformation, pick up the metadata at run time, and inject it into the downstream steps. So if you have disparate data sources and rules which you want to change dynamically, this enables you to do that without having to build and maintain a whole raft of transformations and jobs each time.
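To make the metadata injection idea a little more concrete, here is a minimal sketch of the general pattern in Python. It is only a conceptual analogue, not Pentaho Data Integration itself, and the function and field names are invented for illustration: one generic template transformation is reused against several disparate sources simply by injecting different metadata at run time.

def template_transform(rows, metadata):
    """Map raw source fields onto a common target layout using injected metadata."""
    field_map = metadata["field_map"]
    return [{target: row.get(source) for target, source in field_map.items()}
            for row in rows]

# Two disparate sources, one reusable template; only the injected metadata differs.
crm_rows    = [{"CUST_NO": "C001", "INVAMT": 125.0}]
legacy_rows = [{"cid": "C002", "total": 310.5}]

print(template_transform(crm_rows, {"field_map": {"customer_id": "CUST_NO", "amount": "INVAMT"}}))
print(template_transform(legacy_rows, {"field_map": {"customer_id": "cid", "amount": "total"}}))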

So, a whole hatful of the sorts of things that make this one of the most enterprise-ready BI stacks that I am aware of. This looks to be a major release, one which really sets out the Pentaho stall as a BI vendor of choice at the enterprise level, with integrated capability which is easier to use and more powerful out of the box than comparable offerings in the marketplace that still rely on skilled technicians to unite and enact their solutions.


Colbran South Africa

TIBCO Mashery Named a Leader in Gartner’s Magic Quadrant for Full Lifecycle API Management


We’re thrilled that Gartner has recognized TIBCO Mashery as a leader in Full Lifecycle API Management!

TIBCO believes this acknowledgment provides tangible evidence to support the strategic vision behind TIBCO’s acquisition of Mashery last year, and validates the combination of TIBCO and Mashery capabilities: a unified API platform that delivers strong API creation capabilities in addition to meeting the traditional needs of API management, creating real value for our customers.

We believe our position in the leader quadrant of this report clearly reflects the strength of our product offering, completeness of our vision, and ability to execute. But we could never land in this coveted spot without our strongest asset: our amazing, successful customers. The Gartner assessment process strives to reflect the raw reality of the market and is heavily based on customer feedback about vendor performance, strengths, and challenges. TIBCO Mashery customers have been vital partners on our journey as we have helped them transform their businesses to compete more effectively in the digital world through their API platforms.

We sincerely thank all of our customers for their continuing faith in us, and for relying on TIBCO Mashery to be a core component in their digital transformation. We are proud to be their partner and look forward to helping them maximize the value they derive from our solutions.

Download a complimentary copy of the Gartner report.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


The TIBCO Blog

Why a Semantic Layer Like Azure Analysis Services is Relevant

This is part 1 of 2 about Azure Analysis Services (Azure AS), which was announced a few days ago. Azure AS is a Platform-as-a-Service (PaaS) offering which is in public preview at the time of this writing (late October 2016).

Part 1: Why a Semantic Layer Like Azure Analysis Services is Relevant {you are here}

Part 2: Where Azure Analysis Services Fits Into BI & Analytics Architecture {coming soon}

Fundamentally, Analysis Services serves as a semantic layer (see below for further discussion of a semantic layer). Because the business intelligence industry now embraces an array of technology choices, sometimes it seems like a semantic layer is no longer valued like it once was. Well, my opinion is that for many businesses, a semantic layer is tremendously important to support the majority of business users who do *not* want to do their own data wrangling, data prep, and data modeling activities.


Let’s first do a super quick recap of what Analysis Services is.

Analysis Services Primer

Analysis Services can be categorized with competitor tools such as Business Objects Universe, Cognos Framework Manager, or AtScale.

Prior to the introduction of Azure AS, Analysis Services was available only as part of the SQL Server stack. Therefore, we now have SSAS, which is a component of SQL Server, and Azure AS, which is a separate cloud-based offering. The main difference is in where the model is deployed; the development experience is exactly the same. Prior to the release of Azure AS (PaaS), the only way to run Analysis Services in Azure was within a virtual machine (IaaS). If you’re searching for information online, be a little careful and watch the dates: you will easily run into older articles which are talking about running SSAS in a VM rather than this PaaS service. The good news is that nearly all of that information will still be relevant, since the development experience remains in SQL Server Data Tools; the primary difference is in deployment.

There are two modes to an Analysis Services instance: Tabular and Multidimensional.

  • Tabular is the newer in-memory data structure, based on DAX (Data Analysis eXpressions). Generally speaking, Tabular has an easier entry point though there is *definitely* a learning curve to DAX if you go beyond the simple stuff. 
  • Multidimensional is traditional OLAP so it’s been around a lot longer, has a few more advanced features, and is based on MDX (MultiDimensional eXpressions). Multidimensional models are less ‘forgiving’ than Tabular models with respect to certain design characteristics (such as classifying facts and dimensions).

The rule of thumb most BI folks follow these days is to look at using a Tabular model unless you find you need one of the features that Multidimensional provides that isn’t in Tabular yet. For this public preview of Azure AS, Tabular is the only mode supported initially. The product team has stated that Multidimensional will be considered for a future release, based on customer demand.

Regardless of whether you use Tabular or Multidimensional, your data is handled in one of two ways:

  • It can be imported into the Analysis Services in-memory model, which is also sometimes called cached mode. In this mode, Analysis Services holds a copy of the data and serves it up in response to user queries. An in-memory model provides an excellent user experience, assuming that enough memory is provided and the design is sound. The data contained in an in-memory model needs to be refreshed on a schedule, so there’s always some data latency.
  • The other option is to utilize DirectQuery mode. In this mode, queries are passed on to the underlying data source, so Analysis Services really holds metadata only. To get optimal performance with DirectQuery mode, the data source needs to be tuned for the performance of user queries, which is sometimes a challenge. The appeal of DirectQuery mode is access to real-time data, and not needing to populate another copy of the data in Analysis Services.

Analysis Services development is usually handled by BI/Analytics/IT professionals within SQL Server Data Tools in Visual Studio, and it is typically under source control such as TFS or GitHub. No changes to the development experience occur as part of the introduction of Azure AS. This is a good thing, but it’s important to be aware that development of a semantic model does take some time. It’s fast and easy to create an Azure AS instance in Azure, and it’s fast and easy to deploy the model, but development of the model itself takes time and know-how. The good news is there are tons of resources available for learning development techniques, such as dimensional modeling, which is optimal for reporting.

Analysis Services is considered an enterprise-level data modeling tool. Ideally it exposes a large amount of your corporate data to business users. The pricing of Azure AS reflects that it is an enterprise-level tool (you can see pricing in the Azure portal when you provision the service).

Benefits of a Semantic Layer

According to b-eye-network: “A semantic layer is a business representation of corporate data that helps end users access data using common business terms.”

A semantic layer sits between the original database and a reporting tool in order to assist business users with ease of reporting. It becomes the main entry point for data access for most business users when they are creating reports, dashboards, or running ad hoc queries.


Traditionally, we’ve built a semantic layer on top of a traditional data warehouse. That’s not a requirement, though it is certainly still a very valid and common design pattern. In Part 2 of this post we’ll talk about some variations on that theme to accommodate a modern data warehouse platform.

Though exposing reporting to users via a semantic layer is not an absolute “must” in today’s world of options, it facilitates reporting for things such as:

  • Data pre-integrated for users (ex: Salesforce data, invoice data, A/R data, and inventory data are all integrated for the end user to consume)
  • No joins or relationships for users to worry about (because they’ve all been handled in the data model)
  • Columns have all been renamed into business user-friendly names (ex: Invoice Amount instead of INVAMT)
  • Business logic and calculations have been centralized in the data model (which reduces the risk of calculations being done incorrectly)
  • Time-oriented calculations are included which are really powerful (ex: Sales Increase Year Over Year; Sales Year-To-Date; % of Change Since Prior Quarter)
  • Aggregation behavior has been set so reporting tools respond correctly (ex: sales $ sum up, but something like a customer number or invoice number does not try to aggregate)
  • Formatting has been specified so reporting tools handle it by default (ex: sales displays with the $ sign whereas units display with commas and no pennies)
  • Data security is incorporated (ex: standard row-level security, and/or exposing certain sensitive measurements to only authorized users)

In summary, a semantic layer is all about the convenience of business users so they can focus on getting the answer they need. It facilitates self-service reporting by providing a cleansed, understandable, trustworthy environment. A semantic layer can often support self-service analysis for 80% of a user base. After all, most functional users want to get the information quickly and then get on with their day. A semantic layer can also help the other 20% of users who have additional needs, though in all fairness those hard-core data analyst and data scientist types often prefer to go after the original source data; there’s certainly still value in them acquiring some of their data from the semantic layer as well. After all, why reinvent the wheel if the investment has been made?
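To make the list above a little more tangible, here is a toy sketch in Python using pandas. It is not Analysis Services, and the column names and calculation are invented for illustration; it simply shows the flavour of centralizing friendly names, a calculation, default aggregation behavior, and formatting in one place so that every report consumes the same definitions.

import pandas as pd

# Raw warehouse-style column names, as they might arrive from a source system.
raw = pd.DataFrame({
    "CUSTNO":  ["C001", "C001", "C002"],
    "INVAMT":  [1200.0, 800.0, 450.0],
    "INVCOST": [900.0, 600.0, 300.0],
})

# The "semantic" definitions live in one place, not in each individual report.
friendly_names = {"CUSTNO": "Customer Number", "INVAMT": "Invoice Amount", "INVCOST": "Invoice Cost"}
default_aggregation = {"Invoice Amount": "sum", "Invoice Cost": "sum"}  # a customer number never aggregates

model = raw.rename(columns=friendly_names)
model["Gross Margin"] = model["Invoice Amount"] - model["Invoice Cost"]  # centralized calculation
default_aggregation["Gross Margin"] = "sum"

# A "report" built on the model picks up the shared aggregation and formatting rules.
report = model.groupby("Customer Number").agg(default_aggregation)
print(report.apply(lambda col: col.map("${:,.0f}".format)))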

So, even though we have tons of options these days for information delivery, and a single semantic layer for an entire organization isn’t a silver bullet, I firmly believe that a semantic layer is certainly still a great tool in the toolbox. 

Next up is Part 2: Where Azure Analysis Services Fits Into BI & Analytics Architecture {coming soon}.

Finding More Info

Azure Documentation – What is Azure Analysis Services?

Feedback to the Product Team – Feedback on Azure AS

Channel 9 Videos: Azure AS Videos

You Might Also Like…

Data Lake Use Cases and Planning Considerations

Building Blocks of Cortana Intelligence Suite in Azure


Blog – SQL Chick

graceebooks: wwinterweb: Star Wars cast member yearbook photos…



About Blogtastic!

29/Michigan
Lyme Disease
Alkaline Trio, Bad Religion
Star Wars, Community, Jurassic Park, Guardians of the Galaxy
Science, Tech, Puns, Humor
Nerd, Gamer, Spoonie
World’s Nicest Badass
Hopeful Adventurer
Cunning Linguist
Awesome Individual
Stormpilot Trash


Blogtastic!

PowerPhone is Now Compatible with Skype for Business Phone Lines


Interested in PowerPhone, but don’t have or want to set up and configure a TAPI 2x driver? PowerPhone is now compatible with Skype for Business (formerly Lync) via an easy-to-use drop-down.

With this latest enhancement, there’s no need to fuss with code! For organizations using both Skype for Business and a TAPI 2x system, users can switch between Skype for Business or hard phone lines with ease. Once the PowerPhone Agent is launched, the available Skype for Business or TAPI 2x phone lines will be displayed in the “Select Phone Line” drop down field.


Find the new PowerPhone Agent for download on the PowerPhone webpage. Not a PowerPhone user? PowerPhone includes a free 30-day trial! Download the full PowerPhone solution and get started today!


PowerObjects- Bringing Focus to Dynamics CRM

Chatbots benefit the customer service market, but not a cure-all

Lately, the chatter on how to improve the customer service market and boost efficiency in contact centers has centered on chatbots. While fielding customer inquiries is a time-consuming task for agents, chatbots can help alleviate some of the strain through machine learning and automation.

Chatbots can take on some of those time-consuming inquiries by enabling customers to get rapid information and answers to their questions from an automated bot that is designed to mimic human responses through programming and repeated machine learning. Given that contact centers have long been considered a drain in the customer service market, finding ways to automate tasks is critical.

Chatbots are automated conversational tools embedded in messaging apps, like Facebook Messenger, that mimic human conversation, often relying on text analytics, artificial intelligence and machine learning. By automating tasks that were formerly the province of contact center agents, chatbots enable agents to focus their energy on higher-level tasks and make contact centers more efficient overall.

Bots aren’t a revolutionary step in self-service, but more of an evolutionary step. The concept is similar to interactive voice response (IVR), with a friendlier user interface and a more complex data-retrieval process.

Technology improvements help UI

Technology supporting IVR systems has limitations, resulting in customers needing to use touch-tone menus to self-direct their inquiry. The customer journey can be frustrating and circular: multiple layers of menus, multiple options on each menu, difficult-to-understand prompts and so on, all of which can make the customer service market challenging.

Technology has made major strides with the growth of artificial intelligence. As a result, customers no longer need to self-direct; instead, they can speak and write in a natural way, and the technology can understand the specific inquiry or question and respond in kind.

Chatbots can work across multiple channels of communication. They can understand natural language, spoken or written, so the customer experience is no different than interacting with a human being.

In the world of IVR, if a customer wants to find out the status of their next flight, it would go like this:

  • Listen to a menu, and press the appropriate phone key for flight information.
  • Listen to a request to enter the flight number and enter via phone keys.

It sounds simple, but listening to menus and selecting proper prompts can be a challenging process — especially if driving.

In the world of bots, checking on a flight’s status would go like this:

  • What is the status of my next flight?

It cannot get any simpler than that.
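As a rough illustration of the difference, here is a minimal Python sketch of the bot side of that exchange. It is only a toy: a real chatbot would use natural-language understanding and machine learning rather than keyword matching, and the flight data, field names and function names here are invented.

FLIGHT_STATUS = {"BA117": "on time, departing 14:35"}  # hypothetical back-end lookup

def handle_message(text, customer_profile):
    """Answer a free-form question if the intent is recognized, else decline."""
    text = text.lower()
    if "flight" in text and "status" in text:
        flight = customer_profile.get("next_flight")
        return f"Your next flight {flight} is {FLIGHT_STATUS.get(flight, 'not found')}."
    return None  # no confident match; hand off to a human (see the sketch further below)

print(handle_message("What is the status of my next flight?", {"next_flight": "BA117"}))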

Chatbots are innovative technology and can save time, but they also have limitations, including the following:

  • They may not be effective with complex customer inquiries. Similar to IVR, chatbots are best at handling simple inquiries. A chatbot can answer a question, such as, “Is my connecting flight on time?” When it comes to a more complex issue, such as reissuing a ticket, a human is most likely best suited to assist a customer.
  • They may not be effective in cultivating a customer relationship over time. Again, similar to IVR, chatbots are great at answering simple questions, but they cannot have a conversation with a customer — e.g., “I understand the weather is nice in Hawaii this time of the year.” Additionally, they cannot connect on a personal basis, which limits the customer relationship.
  • They can drive a poor customer experience when channel pivots are necessary. Processes must allow for the smooth handoff between a bot and a human. In some cases, a chatbot cannot provide a meaningful response to a customer inquiry, and the customer should be transferred over to a live agent. It’s critical to ensure a smooth pivot, not forcing the customer to repeat themselves and providing information to the agent so they understand the process the customer has already gone through.
  • They cannot interpret human speech with 100% accuracy. In some cases, chatbots do not interpret a customer inquiry correctly, whether it’s because of a customer’s accent, background noise or some other factor. Chatbot systems must continually learn from missed opportunities and improve effectiveness and accuracy.

Chatbots are a promising technology. They can respond to customer inquiries within several communication channels, and customers can use natural language, spoken or written, to interact. But, as with all technologies, there are limitations to bot automation’s effectiveness within the customer service market, and it is critical for organizations to build supportive processes to ensure customers have a fluid, positive experience.
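The channel-pivot point above is worth a small sketch of its own. The following hedged Python example, with invented names throughout, shows the shape of a smooth handoff: when the bot recognizes an inquiry it cannot handle, it escalates to a live agent and passes the conversation history along so the customer does not have to repeat themselves.

from dataclasses import dataclass, field

@dataclass
class Conversation:
    customer_id: str
    transcript: list = field(default_factory=list)

def escalate_to_agent(conversation):
    # The agent receives the full transcript, so the customer is not asked to start over.
    handoff = {"customer_id": conversation.customer_id, "history": list(conversation.transcript)}
    return f"Connecting you to an agent (context shared: {len(handoff['history'])} messages)."

def bot_reply(conversation, text):
    conversation.transcript.append(("customer", text))
    if "reissue" in text.lower():  # too complex for the bot; pivot to a human
        return escalate_to_agent(conversation)
    reply = "Your connecting flight is on time."
    conversation.transcript.append(("bot", reply))
    return reply

conv = Conversation("C002")
print(bot_reply(conv, "Is my connecting flight on time?"))
print(bot_reply(conv, "I need to reissue my ticket."))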



SearchCRM: News on CRM trends and technology

Why are the eigenvalues wrong?


Good day,

Please help me.

I compute eigenvalues using the function Eigenvalues.

Given the matrix

mato={{1/3, 0, 0, 0}, {0, 1/3, 1/3, 0}, {0, 1/3, 1/3, 0}, {0, 0, 0, 0}}

When I evaluate

Eigenvalues[mato]

I get {2/3, 1/3, 0, 0}, but this looks wrong to me.

I expect the eigenvalues to be 0 and 2/3.

How can I change the code below, given that I only use Eigenvalues to calculate the eigenvalues?

setone = Eigenvalues[mato]
settwo = -Total[If[# > 0, # Log2[#], 0] & /@ setone] // N;

Please help me.


Recent Questions – Mathematica Stack Exchange

Are Banks Embracing Public Cloud?


Banking is one of the most regulated industries in the world, often saddled with antiquated regulations that are not keeping up with the digital world. Many believe these regulations have kept banks away from the benefits of cloud computing, in particular services hosted on a public cloud. But that is changing.

In recent years, FICO has been democratising analytics and enabling organisations of all sizes to harness the power of analytics-based decisioning.  This process is driven by cloud computing.

The opportunity for enterprises to leverage cloud-based analytics is especially compelling in the area of real-time and near-real-time communications. Consumers demand multi-channel communications that are effective, mobile and in line with their individual needs and preferences.

Are banks ready to undergo the digital transformation of their business model in order to respond effectively to those needs whilst maintaining regulatory compliance? In short, yes – responding to customer needs has always been essential to maintaining relevance, and the ability to keep pace with the innovation of the digital world has become a matter of survival.

FICO Customer Communication Services, which work alongside existing systems and processes, have been helping improve customer communications for enterprises in all industry sectors. The growing ubiquity of mobile and app-driven digital communications demands a more flexible and cost-effective deployment of FICO’s communication solutions.

FICO has partnered with Amazon Web Services (AWS) to provide our clients with a highly scalable, rapidly deployable and cost-effective alternative to hosting solutions in your own environment.  Thanks to a secure hybrid cloud architecture that withstands the scrutiny of the toughest security accreditation standards, such as PCI and ISO27001, we are able to offer cost-effective communications services in most geographies.  Deployment, testing, and local telecom service connections are now faster, more flexible and less costly.  These benefits are top of mind for most financial institutions – and cloud computing offers unparalleled opportunities to realise them.

Banks are more than ready to adopt cloud solutions, and regulatory constraints are not as at odds with this technology as is widely believed, although there are no-go areas in some core banking functions. FICO has a lot of experience in this area and has been able to provide its clients with the power of analytics across all IT platforms and in diverse regulatory climates.

FICO Customer Communication Services are no exception and have been at the forefront of FICO’s cloud-based offering.  In fact, several FICO customers are moving their customer communications applications to AWS – ranging from marketing notifications and promotional offer messages to payment reminders and online affordability calculators.  The use cases include development and test environments, POC set-ups, communication services and live production use, reflecting the flexible and cost-effective deployments provided via AWS.

Cloud computing is transforming the world around us and FICO is at the forefront of the revolution, enabling organisations of all sizes to capitalise on the opportunities it offers. Over the coming months, FICO will bring more and more of our software portfolio to the cloud, powered by AWS.


FICO

Announcing SQL Server Management Studio 16.5 Release

Today, we are pleased to announce the latest generally available (GA) quality release of SQL Server Management Studio (SSMS) 16.5. This update features fixes for high DPI, reconnection, and many other issues. Check out the list of fixes below.

Get it here:                                                          

Download SSMS 16.5 release

  • The version number for the latest release is 13.0.16000.28

New in this release

  1. Fixed an issue where a crash could occur when a database with a table name containing “;:” was clicked on.
  2. Fixed an issue where changes made to the Model page in the AS Tabular Database Properties window would script out the original definition. Microsoft Connect Item: 3080744
  3. Fixed an issue where temporary files were added to the “Recent Files” list. Microsoft Connect Item: 2558789
  4. Fixed an issue where the “Manage Compression” menu item was disabled for user table nodes in the Object Explorer tree. Microsoft Connect Item: 3104616
  5. Fixed an issue where the user could not set the font size for Object Explorer, Registered Servers, Template Explorer, and Object Explorer Details; these explorers now use the Environment font. Microsoft Connect Item: 691432
  6. Fixed an issue where SSMS always reconnected to the default database when a connection was lost. Microsoft Connect Item: 3102337
  7. Fixed many high DPI issues in Policy Management and the query editor window, including the execution plan icons.
  8. Fixed an issue where the option to configure font and color for Extended Events was missing.
  9. Fixed an issue where SSMS could crash when closing the application or when trying to show the error dialog.

Please visit the SSMS download page for additional details, and to see the full changelog.

Known Issues
The full list of known issues is available in the release notes here.

Contact us
As always, if you have any questions or feedback, please visit our forum or Microsoft Connect page. You can also tweet our Engineering Manager at @sqltoolsguy on Twitter. We are fully committed to improving the SSMS experience and look forward to hearing from you!


SQL Server Release Services

[Webinar 11/03] Best-in-Class On-Premises Business Intelligence for Power BI from Pyramid Analytics

Join this webinar to learn more about the new functionality from Pyramid Analytics that allows organizations to publish Power BI Desktop content directly into BI Office’s on-premises solution.


The result is an end-to-end, integrated, on-premises analytics platform that allows companies to address security limitations, offer better manageability, and provide a rich overall analytics experience required for enterprise deployments.

Join us Thursday, Nov. 3rd at 10 am PDT to explore how Pyramid works with Microsoft to deliver best-in-class on-premises BI.

REGISTER NOW!


Microsoft Power BI Blog | Microsoft Power BI