phoenix-user mailing list archives

From "Daniel Klinger" ...@web-computing.de>
Subject Can't execute DML with Pentaho Data Integrator and Phoenix JDBC
Date Mon, 14 Mar 2016 13:55:21 GMT
Hello,


we are using the Phoenix Query Server (PQS) in an HDP 2.3.4 cluster. For ETL
we use the Pentaho Data Integrator (PDI). We have some Hive tables we want
to "copy" to Phoenix (HBase). My plan is to create the table via PDI, load
the data from Hive and insert (UPSERT) it into Phoenix. I'm using the
Phoenix thin client and I'm able to connect PDI to PQS via JDBC. The table
is created correctly by PDI.
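
To illustrate, the connection and table creation look roughly like this from
plain Java through the thin driver (a minimal sketch; host name, port and the
table definition are only placeholders, not our real setup):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreatePhoenixTable {
        public static void main(String[] args) throws Exception {
            // Thin-client driver shipped with the Phoenix Query Server client jar.
            Class.forName("org.apache.phoenix.queryserver.client.Driver");

            // Thin JDBC URL pointing at the PQS (default port 8765). Depending on
            // the Phoenix version, a ";serialization=..." parameter may have to
            // match the PQS configuration.
            String url = "jdbc:phoenix:thin:url=http://pqs-host.example.com:8765";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement()) {
                // Placeholder DDL; the real table is generated by PDI.
                stmt.executeUpdate(
                    "CREATE TABLE IF NOT EXISTS HIVE_COPY ("
                    + " ID BIGINT NOT NULL PRIMARY KEY,"
                    + " NAME VARCHAR,"
                    + " AMOUNT DECIMAL(10,2))");
            }
        }
    }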


The problem is the INSERT/UPSERT. I'm loading the data from Hive with a
SELECT statement. For every row selected from Hive (about 100,000 in total)
I'm creating an UPSERT statement with PDI and sending it to the PQS. The ETL
runs without an error, but the created table stays empty. If I execute the
SQL statements generated by PDI manually via sqlline-thin, all 100,000 rows
are inserted. So the SQL statements themselves are correct.
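
For comparison, the per-row UPSERT driven from plain JDBC would look roughly
like this (again a sketch with placeholder table and column names). As far as
I understand, a Phoenix connection does not auto-commit by default, so
upserted rows only become visible after commit() or with autoCommit enabled;
I'm not sure yet whether that is related to my problem:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class UpsertRows {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:phoenix:thin:url=http://pqs-host.example.com:8765";

            try (Connection conn = DriverManager.getConnection(url)) {
                // Phoenix defaults to autoCommit = false; without a commit the
                // upserted rows are never flushed to HBase.
                // conn.setAutoCommit(true);  // alternative to calling commit() below

                String sql = "UPSERT INTO HIVE_COPY (ID, NAME, AMOUNT) VALUES (?, ?, ?)";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    // In the real ETL these values come from the Hive SELECT, row by row.
                    ps.setLong(1, 1L);
                    ps.setString(2, "example");
                    ps.setBigDecimal(3, new java.math.BigDecimal("12.34"));
                    ps.executeUpdate();
                }
                conn.commit();  // make the upserts visible
            }
        }
    }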


My next idea was that the statements are not being sent to the PQS at all.
So I intentionally put some errors into the SQL statements. If I execute the
ETL now, I get an error back from the PQS (wrong data type, parse error and
so on). To me it looks like the correct SQL statement is sent to the PQS and
parsed, but not executed, and I have no idea why.


Thanks for helping.

