phoenix-user mailing list archives

From Riccardo Cardin <riccardo.car...@gmail.com>
Subject Using phoenix-spark plugin to insert an ARRAY Type
Date Thu, 30 Jul 2015 09:35:11 GMT
Hi all,

I have a problem with the phoenix-spark plugin. I have a Spark RDD that I
have to store inside an HBase table. We use the Apache Phoenix layer to
talk to the database. One column of the table is defined as an
UNSIGNED_SMALLINT ARRAY:

CREATE TABLE EXAMPLE (..., Col10 UNSIGNED_SMALLINT ARRAY, ...);

As stated in the Phoenix documentation, the ARRAY data type is backed by
java.sql.Array.
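
For reference, with plain JDBC I could build such an array from a Connection, roughly like this (a sketch; the "zkHost" connection string and the element type name are placeholders):

```scala
import java.sql.{Array => SqlArray, Connection, DriverManager}

// Open a Phoenix JDBC connection (requires the Phoenix driver on the classpath)
val conn: Connection = DriverManager.getConnection("jdbc:phoenix:zkHost")

// Box the Scala Ints so they can be passed as Object[] to createArrayOf
val values: Array[AnyRef] = Array(1, 2, 3).map(Int.box)

// createArrayOf takes the SQL type name of the elements and the element array
val sqlArray: SqlArray = conn.createArrayOf("UNSIGNED_SMALLINT", values)
```

But inside the function passed to rdd.map there is no Connection at hand, so I cannot follow this approach.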

I'm using the *phoenix-spark* plugin to save the data of the RDD into the
table. The problem is that I don't know how to create an instance of
java.sql.Array, since I don't have any Connection object available. An
extract of the code follows (in Scala):

// Map the RDD into an RDD of sequences or tuples
rdd.map {
  value =>
    (/* ... */
     value.getArray(),   // Array of Int to convert into a java.sql.Array
     /* ... */
    )}.saveToPhoenix("EXAMPLE", Seq(/* ... */, "Col10", /* ... */), conf, zkUrl)

What is the correct way to proceed? Is there a way to do what I need?

Best regards,

Riccardo
