phoenix-user mailing list archives

From Divya Gehlot <divya.htco...@gmail.com>
Subject [Error] Spark - Save to Phoenix
Date Mon, 11 Apr 2016 09:17:07 GMT
Hi,
I am getting the error below when I try to save data to Phoenix.
Here are the cluster configuration and the steps I followed:
*Cluster configuration:*
Hortonworks (HDP) 2.3.4
Spark 1.5.2
Phoenix 4.4


*Table created in Phoenix:*
CREATE TABLE TEST (
    RPT_DATE varchar(100) PRIMARY KEY
);

*Spark Scala script:*
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.functions.lit

// readTable is a local helper that loads a Phoenix table into a
// DataFrame (a sketch of it follows this script)
val dfLCR = readTable(sqlContext, "", "TEST")
val schemaL = dfLCR.schema
val lcrReportPath = "/TestDivya/Spark/Results/TestData/"
val dfReadReport = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .schema(schemaL)
  .load(lcrReportPath)
dfReadReport.show()
val dfWidCol = dfReadReport.withColumn("RPT_DATE", lit("2015-01-01"))
val dfSelect = dfWidCol.select("RPT_DATE")
dfSelect.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .options(Map("zkUrl" -> "localhost", "table" -> "TEST"))
  .save()

*Command line to run the Spark Scala job:*
spark-shell --properties-file /TestDivya/Spark/Phoenix.properties \
  --jars /usr/hdp/2.3.4.0-3485/phoenix/lib/phoenix-spark-4.4.0.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/phoenix/phoenix-client.jar,/usr/hdp/2.3.4.0-3485/phoenix/phoenix-server.jar \
  --driver-class-path /usr/hdp/2.3.4.0-3485/phoenix/lib/phoenix-spark-4.4.0.2.3.4.0-3485.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/phoenix-server.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/phoenix-client-4.4.0.jar \
  --conf "spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar" \
  --packages com.databricks:spark-csv_2.10:1.4.0 \
  --master yarn-client \
  -i /TestDivya/Spark/WriteToPheonix.scala
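
Note: unlike --jars, the value of --driver-class-path (and of
spark.executor.extraClassPath) is a plain JVM classpath, so on Linux its
entries are separated by colons rather than commas. With colon separators
the flag above would read:

  --driver-class-path /usr/hdp/2.3.4.0-3485/phoenix/lib/phoenix-spark-4.4.0.2.3.4.0-3485.jar:/usr/hdp/2.3.4.0-3485/hbase/lib/phoenix-server.jar:/usr/hdp/2.3.4.0-3485/hbase/lib/phoenix-client-4.4.0.jar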


*Error:*

16/04/11 09:03:43 INFO TaskSetManager: Lost task 2.3 in stage 3.0 (TID 409)
on executor ip-xxx-xx-xx-xxx.ap-southeast-1.compute.internal:
java.lang.RuntimeException (java.sql.SQLException: No suitable driver found
for jdbc:phoenix:localhost:2181:/hbase-unsecure;) [duplicate 7]
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage
3.0 (TID 408, ip-172-31-22-135.ap-southeast-1.compute.internal):
java.lang.RuntimeException: java.sql.SQLException: No suitable driver found
for jdbc:phoenix:localhost:2181:/hbase-unsecure;
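
For reference, the driver that should handle jdbc:phoenix: URLs is
org.apache.phoenix.jdbc.PhoenixDriver, and the "on executor" line above
shows the failure happening in the executors, so the executor classpath
matters as much as the driver's. A quick driver-side sanity check in
spark-shell (a sketch only; it does not exercise the executor classpath)
would be:

// Force-load the Phoenix JDBC driver: Class.forName fails fast if the jar
// is missing, and loading the class registers it with java.sql.DriverManager.
Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")
val conn = java.sql.DriverManager.getConnection(
  "jdbc:phoenix:localhost:2181:/hbase-unsecure")
println(conn.getMetaData.getDatabaseProductName)  // prints the product name if the connection works
conn.close()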

Could somebody help me resolve this error?
What am I missing?

Thanks,
Divya
