I have a problem with the phoenix-spark plugin. I have a Spark RDD that I need to store in an HBase table. We use the Apache Phoenix layer to communicate with the database. One column of the table is defined as an UNSIGNED_SMALLINT ARRAY:
CREATE TABLE EXAMPLE (...,Col10 UNSIGNED_SMALLINT ARRAY,...);
As stated in the Phoenix documentation, the ARRAY data type is backed by java.sql.Array.
I'm using the phoenix-spark plugin to save the data of the RDD into the table. The problem is that I don't know how to create an instance of java.sql.Array, since I don't have any kind of Connection object. An extract of the code follows (the code is in Scala):
// Map the RDD into an RDD of sequences or tuples
rdd.map { value =>
  (/* ... */,
   value.getArray(), // Array of Int to convert into a java.sql.Array
   /* ... */)
}.saveToPhoenix("EXAMPLE", Seq(/* ... */, "Col10", /* ... */), conf, zkUrl)
What is the correct way to proceed? Is there a way to do what I need?
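For reference, here is a minimal sketch of the tuple shape I am trying to produce. `Record`, `toRow`, and the `"ID"` column name are hypothetical, and I am assuming (but have not confirmed) that saveToPhoenix could accept a plain Scala Array for the ARRAY column instead of a java.sql.Array:

```scala
// Hypothetical record type standing in for the elements of my RDD.
case class Record(id: Long, values: Array[Int])

// Map each record to the tuple shape that saveToPhoenix expects;
// assumption: the Array[Int] would map onto Col10 UNSIGNED_SMALLINT ARRAY.
def toRow(r: Record): (Long, Array[Int]) =
  (r.id, r.values)

val row = toRow(Record(1L, Array(10, 20)))
// With phoenix-spark this would then be something like:
// rdd.map(toRow).saveToPhoenix("EXAMPLE", Seq("ID", "Col10"), conf, zkUrl)
```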