phoenix-user mailing list archives

From 김영우 (YoungWoo Kim) <yw...@apache.org>
Subject Re: Phoenix 4.9.0 with Spark 2.0
Date Tue, 30 May 2017 13:32:21 GMT
Hi, Chaitanya

You should upgrade Phoenix version to 4.10.0 for Spark 2.x. See
https://issues.apache.org/jira/browse/PHOENIX-3333
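
For reference, once you are on Phoenix 4.10.0 the phoenix-spark integration can be used through the Spark 2.x DataFrameReader API. A minimal sketch in spark-shell, assuming the same table name and zkUrl from your mail (the underlying issue is that Spark 2.0 turned DataFrame into a type alias for Dataset[Row], so the Spark 1.x build of phoenix-spark fails with NoClassDefFoundError on that class):

```scala
// Sketch, assuming Phoenix 4.10.0+ and the table/zkUrl from the original mail.
// `spark` is the SparkSession provided by spark-shell in Spark 2.x.
val df = spark.read
  .format("org.apache.phoenix.spark")
  .option("table", "TBL")
  .option("zkUrl", "localhost:2181:/hbase")
  .load()

df.show()
```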

HTH,

Youngwoo

On Tue, May 30, 2017 at 9:46 PM, cmbendre <chaitanya.bendre@zeotap.com>
wrote:

> Hi,
>
> I am trying the Phoenix connector from Spark 2.0. I am using Phoenix 4.9.0 on
> EMR. My command to start the spark shell:
>
> ./bin/spark-shell --master local \
>   --jars /usr/lib/phoenix/phoenix-spark-4.9.0-HBase-1.2.jar \
>   --jars /usr/lib/phoenix/phoenix-client.jar \
>   --conf "spark.executor.extraClassPath=/usr/lib/phoenix/phoenix-client.jar" \
>   --conf "spark.driver.extraClassPath=/usr/lib/phoenix/phoenix-client.jar"
>
> This starts the spark shell. When I run the following code, it fails with the
> exception given below.
>
> val df = sqlContext.load(
>   "org.apache.phoenix.spark",
>   Map("table" -> "TBL", "zkUrl" -> "localhost:2181:/hbase")
> )
>
> Exception:
> java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
>   at java.lang.Class.getDeclaredMethods0(Native Method)
>   at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>   at java.lang.Class.getDeclaredMethod(Class.java:2128)
>   at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1475)
>   at java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:72)
>   at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:498)
>   at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:472)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:472)
>   at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:369)
>   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1134)
>   at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
>   at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
>   at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
>   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
>   at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
>   at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
>   at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
>   at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
>   at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:2101)
>   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:370)
>   at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:369)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:369)
>   at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:119)
>   at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:60)
>   at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
>   at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:389)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
>   at org.apache.spark.sql.SQLContext.load(SQLContext.scala:965)
>   ... 53 elided
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.DataFrame
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   ... 88 more
>
> Note: Spark 1.6.2 works fine with the same code, and JDBC connectivity also
> works fine.
>
> What am I missing?
>
> Thanks
> Chaitanya
>
