phoenix-user mailing list archives

From Sateesh Karuturi <sateesh.karutu...@gmail.com>
Subject Re: write Dataframe to phoenix
Date Mon, 20 Mar 2017 08:13:46 GMT
Thanks for the response, NaHeon.

I added the phoenix-spark jar to pom.xml, and I am able to read data from
Phoenix.
The problem is that I get an exception while writing the DataFrame to Phoenix.

On Mon, Mar 20, 2017 at 1:23 PM, NaHeon Kim <honey.and.sw@gmail.com> wrote:

> Did you check that your project has a dependency on the phoenix-spark jar? : )
> See Spark setup at http://phoenix.apache.org/phoenix_spark.html
>
> Regards,
> NaHeon
>
> 2017-03-20 15:31 GMT+09:00 Sateesh Karuturi <sateesh.karuturi9@gmail.com>:
>
>>
>> I am trying to write a DataFrame to Phoenix.
>>
>> Here is my code:
>>
>> df.write.format("org.apache.phoenix.spark")
>>   .mode(SaveMode.Overwrite)
>>   .options(collection.immutable.Map(
>>     "zkUrl" -> "localhost:2181/hbase-unsecure",
>>     "table" -> "TEST"))
>>   .save()
>>
>> and I am getting the following exception:
>>
>>
>> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
>> stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 411, ip-xxxxx-xx-xxx.ap-southeast-1.compute.internal):
>> java.lang.RuntimeException: java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:2181:/hbase-unsecure;
>>         at org.apache.phoenix.mapreduce.PhoenixOutputFormat.getRecordWriter(PhoenixOutputFormat.java:58)
>>         at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1$anonfun$12.apply(PairRDDFunctions.scala:1030)
>>         at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1$anonfun$12.apply(PairRDDFunctions.scala:1014)
>>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>>         at org.apache.spark.scheduler.Task.run(Task.scala:88)
>>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>
>>
>>
>
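[Editor's note: the "No suitable driver found" failure in the trace above is raised on the executor, which suggests the Phoenix client jar is available at compile time (reads work through the driver) but not on the Spark executor classpath at runtime. The Phoenix Spark setup page linked earlier in the thread describes adding the client jar to both driver and executor classpaths; a minimal sketch of that configuration, with placeholder paths and versions that must match the local install:]

```
# spark-defaults.conf -- paths and version are placeholders for your installation
spark.driver.extraClassPath   /usr/lib/phoenix/phoenix-4.10.0-HBase-1.2-client.jar
spark.executor.extraClassPath /usr/lib/phoenix/phoenix-4.10.0-HBase-1.2-client.jar
```

The same jar can instead be shipped per-job with `spark-submit --jars <path-to-phoenix-client-jar>`, so long as it reaches the executors and not only the driver.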
