Dynamic and Scalable : Pentaho 6.1 has arrived!

May 16, 2016   Pentaho

Hello Kettle and Pentaho fans!

Yes indeed, we’ve got another present for you in the form of a new Pentaho release: version 6.1.
This predictable, steady flow of releases has, in my opinion, pushed the popularity of PDI/Kettle over the years, so it’s great that we manage to keep it up.

There’s actually a ton of really nice stuff in version 6.1, so for a more complete recap I’ll refer you to my friend and PDI product manager Jens on his blog.

However, there are a few favorite PDI topics I would like to highlight…

Dynamic ETL

Doing dynamic ETL has been on my mind for a long time. In fact, we started working on this idea in the summer of 2010 so that we would have something to show at the Pentaho Community Meetup of that year in beautiful Cascais, Portugal. Back then I remember getting a lot of blank stares and uncomprehending grunts from the audience when I presented the idea. Over the last couple of years, however, dynamic ETL (or ETL metadata injection) has been a tremendous driver for solving the really complex cases out there in areas like big data, data ingestion and archiving, IoT, and many more. For a short video explaining a few of the driving principles behind the concept, see here:

More comprehensive material on the topic can be found here.

In any case, I’m really happy to see us keep up the continued investment to make metadata injection better and more widely supported. So in version 6.1 we’re adding support for a bunch of new steps, including:

  • Stream Lookup (!!!)
  • S3 Input and Output
  • Metadata Injection (try to keep your sanity while you’re wrestling with this recursive puzzle)
  • Excel Output
  • XML Output
  • Value Mapper
  • Google Analytics

It’s really nice to see these new improvements drive solutions across the Pentaho stack, helping out with Streamlined Data Refinery, auto-modeling and much more.  Jens has a tutorial on his blog with step by step instructions so make sure to check it out!
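In PDI the injection is done visually with the Metadata Injection step pushing field metadata into a template transformation; outside of Kettle, the underlying idea can be sketched in a few lines of Python. Everything below (the metadata layout, the `run_template` helper) is made up for illustration and is not PDI's actual API:

```python
import csv
import io

# A "template" pipeline: the steps are fixed, but the field layout is not.
# At runtime we inject metadata (field names, types, delimiter), the way
# PDI's Metadata Injection step pushes metadata into a template transformation.

def run_template(raw_text, injected_metadata):
    """Parse delimited text using field definitions supplied at runtime."""
    fields = injected_metadata["fields"]           # e.g. [("id", int), ("name", str)]
    delimiter = injected_metadata.get("delimiter", ",")
    reader = csv.reader(io.StringIO(raw_text), delimiter=delimiter)
    rows = []
    for record in reader:
        # Apply the injected names and type conversions to each raw value.
        row = {name: cast(value) for (name, cast), value in zip(fields, record)}
        rows.append(row)
    return rows

# The same template handles two completely different file layouts:
orders = run_template("1,widget\n2,gadget\n",
                      {"fields": [("id", int), ("product", str)]})
sensors = run_template("42.5;2016-05-16\n",
                       {"fields": [("temp", float), ("day", str)], "delimiter": ";"})
```

The point is that one template serves any number of source layouts, which is exactly what makes the pattern so useful for data ingestion and archiving at scale.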

Data Services

PDI Data Services is another of those core technologies that frankly takes time to mature and be accepted by the larger Pentaho community. However, I strongly feel that these technologies set us apart from everyone else in the DI/ETL market. In this case, simply being able to run standard SQL against a Kettle transformation is a game changer. As you can tell, I’m very happy to see the following advances piled on top of the improvements of the last couple of releases:

  • Extra Parameter Pushdown optimization for data services – You can improve the performance of your Pentaho Data Service through the new Parameter Pushdown optimization technique. It is helpful when your transformation contains a step that can be optimized, including input steps like REST, where a parameter in the URL can limit the results returned by a web service.
  • Driver Download for Data Services in Pentaho Data Integration – When connecting to a Pentaho Data Service from a non-Pentaho tool, you previously needed to manually download a Pentaho Data Service driver and install it. Now in 6.1, you can use the Driver Details dialog in Pentaho Data Integration to download the driver.
  • Pentaho Data Service as a Build Model source – You can use a Pentaho Data Service as the source in your Build Model job entry, which streamlines generating data models when you are working with virtual tables.
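To give a feel for why SQL over a transformation matters, here is a rough stdlib-only analogy (not how data services are implemented — a real data service runs the SQL against the live Kettle transformation, with no database in between): take a transformation's output stream, expose it as a virtual table, and query it with plain SQL. The transformation itself is faked with a Python generator:

```python
import sqlite3

def transformation_output():
    # Stand-in for a Kettle transformation's output row stream.
    yield ("widget", 3, 9.99)
    yield ("gadget", 1, 24.50)
    yield ("widget", 2, 9.99)

# Expose the stream as a "virtual table" and run standard SQL on it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, qty INTEGER, price REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transformation_output())

total = conn.execute(
    "SELECT product, SUM(qty * price) FROM sales GROUP BY product ORDER BY product"
).fetchall()
```

Any SQL-speaking tool can now aggregate, filter, and join data that only ever existed as rows flowing through a pipeline — that is the game changer.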

Virtual Data Sets overview in 6.1
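The Parameter Pushdown idea above can also be sketched in plain Python: a value from the SQL WHERE clause is mapped onto a transformation parameter, which is then substituted into a REST step's URL so that the web service does the filtering instead of the data service. The URL, column, and parameter names here are all hypothetical:

```python
import re
from urllib.parse import urlencode

URL_TEMPLATE = "https://api.example.com/orders?${QUERY}"   # hypothetical REST endpoint

def push_down(sql, column, parameter):
    """Extract `column = 'value'` from the WHERE clause and bind it to the
    named transformation parameter in the REST step's URL template."""
    match = re.search(rf"{column}\s*=\s*'([^']*)'", sql, re.IGNORECASE)
    if not match:
        # No matching predicate: fall back to fetching everything.
        return URL_TEMPLATE.replace("${QUERY}", "")
    query = urlencode({parameter: match.group(1)})
    return URL_TEMPLATE.replace("${QUERY}", query)

url = push_down("SELECT * FROM orders WHERE country = 'BE'", "country", "countryCode")
```

Instead of pulling the full result set and filtering afterwards, the predicate travels down to the input step, so far less data crosses the wire.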

Other noteworthy PDI improvements

As always, the change list for even a point release like 6.1 is rather large, but I just wanted to pick two improvements that I really like:

  • JSON Input: we made it a lot faster, and the step can now handle large files (hundreds of MBs) with 100% backward compatibility.
  • The transformation and job execution dialogs have been cleaned up!

The new run dialog in 6.1

I hope you’re all as excited as I am to see these improvements release after release after release…

As usual, please keep giving us feedback on the forums or through our JIRA case tracking system. This helps us keep our software stable in an ever-changing ICT landscape.

Cheers,
Matt


Matt Casters on Data Integration
