Kettle hard dep on 4.4.0? or use 7.1?


Having looked at the options at: , and based on @Paul_Avillach’s suggestion, I am going to start with Kettle simply to get some data in.

The Set-Up section recommends Pentaho Data Integration 4.4.0. Is this a hard dependency on 4.4.0, or simply the version that was current at the time of writing? That version was released 2012-11-29.

As of this posting, the current version is 7.1.


The current 7.1 should work.
Can you confirm?


I figured it wouldn’t hurt to try the current 7.1 and then report back.
Same with JDK 1.8.0 instead of JDK 7.
I’d be happy to update the wiki at the end of this.


I would use the recommended one. Just so you know, Pentaho doesn’t do the majority of the ETL work; a procedure in the database does the bulk of the loading process.

With all this said, I am not even sure the 18.1 quick start (which uses a 1.7.0.x database schema) is capable of using Kettle, as we have only used Kettle against the 1.0.GA schema.

Internally we just used a PL/SQL DB migration script between the two schemas. That is obviously not an actual raw-data-to-tranSMART/I2B2 ETL, which is what you are looking for and what I am currently working on.


The current tranSMART Kettle ETL does not yet support Oracle service names; it only works with an Oracle SID. Additionally, 18.1 has updated the schema to be i2b2-compliant, so the ETL may or may not work with the new schema; I am not sure. i2b2 data installs will work.
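For reference, the distinction shows up in the JDBC connection URL. These are the standard Oracle thin-driver forms (host, port, and names below are placeholders, not values from this thread):

```
# SID form (what the Kettle ETL expects):
jdbc:oracle:thin:@dbhost:1521:ORCLSID

# Service-name form (not yet supported by the ETL):
jdbc:oracle:thin:@//dbhost:1521/my_service
```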

Indeed, we’ll have to make our ETL services publicly available at some point.


@thomas_desain thanks for the heads-up, particularly re: the DB schema changes.

@andre_rosa hmmm, I’m missing that upload tab right now.

@thomas_desain is it possible to make your SQL <--> SQL transmogrifier available? I doubt there is anything proprietary or privileged in the DB loading script. I have more data in an SQLite DB than I have in the exported tables, so being able to do SQLite --> Oracle would be more ideal than loading the TSV/CSV files.
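In the meantime, one intermediate route is dumping each SQLite table to CSV and feeding that into the existing TSV/CSV loading path (or SQL*Loader on the Oracle side). A minimal sketch, using only the Python standard library; the database, table, and file names are hypothetical:

```python
import csv
import sqlite3


def export_table_to_csv(db_path, table, csv_path):
    """Dump one SQLite table to a CSV file, header row included."""
    conn = sqlite3.connect(db_path)
    try:
        # Table name comes from your own schema listing, not user input.
        cur = conn.execute(f"SELECT * FROM {table}")
        with open(csv_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow([col[0] for col in cur.description])  # column names
            writer.writerows(cur)  # data rows
    finally:
        conn.close()


# Hypothetical usage: export a "subjects" table for an Oracle-side load.
# export_table_to_csv("study.sqlite", "subjects", "subjects.csv")
```

This sidesteps type-mapping entirely (everything becomes text), so dates and numerics may still need massaging before they match the target Oracle columns.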


@andre_rosa @thomas_desain any chance of sharing a working data upload method or two?
Not being able to add data means the current release is just a demo.


@benc Your best bet for now is to install I2B2 Workbench.

Then try some of the data import plugins that were created for I2B2. You can probably find some here ( ).

We are not yet supporting raw-data loads into 18.1 I2B2/Transmart. We are using migration scripts from our current version of I2B2/Transmart (1.0.GA) to the new I2B2/Transmart (18.1). 18.1 uses the core I2B2 (1.7.0.x) data model, so any data loader that currently works with I2B2 should work with the current version of I2B2/Transmart.