Hi Riccardo,

For saving arrays, you can use the plain old Scala Array type. You can see the tests for an example:

Note that saving arrays is only supported in Phoenix 4.5.0, although the patch is quite small if you need to apply it yourself:
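A minimal sketch of the idea (column and field names here are hypothetical, and the saveToPhoenix call is shown only in a comment since it needs a live cluster): you shape each record into a tuple whose ARRAY column is just a plain Scala Array, and the plugin handles the conversion to the Phoenix ARRAY type for you.

```scala
// Sketch: phoenix-spark (4.5.0+) accepts a plain Scala Array for a
// Phoenix ARRAY column, so no java.sql.Array needs to be constructed.
case class Record(id: Long, values: Seq[Int])

val records = Seq(Record(1L, Seq(1, 2, 3)))

// Shape each record into the tuple that saveToPhoenix expects;
// the ARRAY column is simply an Array[Int].
val rows = records.map { r =>
  (r.id, r.values.toArray) // (primary key, ARRAY column)
}

// With a real RDD this would be followed by something like:
//   rdd.saveToPhoenix("EXAMPLE", Seq("ID", "COL10"), conf, zkUrl)
println(rows.head._2.mkString(",")) // prints "1,2,3"
```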

Good luck!


On Thu, Jul 30, 2015 at 5:35 AM, Riccardo Cardin <riccardo.cardin@gmail.com> wrote:

Hi all,

I have a problem with the phoenix-spark plugin. I have a Spark RDD that I have to store inside an HBase table. We use the Apache Phoenix layer to talk to the database. One column of the table is defined as an UNSIGNED_SMALLINT ARRAY:


As stated in the Phoenix documentation, the ARRAY data type is backed by java.sql.Array.

I'm using the phoenix-spark plugin to save the data of the RDD into the table. The problem is that I don't know how to create an instance of java.sql.Array, since I don't have any kind of Connection object. An extract of the code follows (the code is in Scala):

// Map the RDD into an RDD of sequences or tuples
rdd.map {
  value =>
    (/* ... */
     value.getArray(),   // array of Int to convert into a java.sql.Array
     /* ... */)
}.saveToPhoenix("EXAMPLE", Seq(/* ... */, "Col10", /* ... */), conf, zkUrl)

What is the correct way to proceed? Is there a way to do what I need?

Best regards,