phoenix-user mailing list archives

From Nimrod Oren <nimrod.o...@veracity-group.com>
Subject FW: Failing on writing Dataframe to Phoenix
Date Wed, 15 Feb 2017 15:29:49 GMT
Hi,



I'm trying to write a simple DataFrame to Phoenix:

     df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
       Map("table" -> "TEST_SAVE", "zkUrl" -> "zk.internal:2181"))



I have the following in my pom.xml:

        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-spark</artifactId>
            <version>${phoenix-version}</version>
            <scope>provided</scope>
        </dependency>



and the phoenix-spark jar is listed in spark-defaults.conf on all servers.
However, I'm getting the following error:



Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/phoenix/util/SchemaUtil
        at org.apache.phoenix.spark.DataFrameFunctions$$anonfun$1.apply(DataFrameFunctions.scala:33)
        at org.apache.phoenix.spark.DataFrameFunctions$$anonfun$1.apply(DataFrameFunctions.scala:33)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:33)
        at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:47)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
        at org.apache.spark.sql.DataFrame.save(DataFrame.scala:2045)
        at com.pelephone.TrueCallLoader$.main(TrueCallLoader.scala:184)
        at com.pelephone.TrueCallLoader.main(TrueCallLoader.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.phoenix.util.SchemaUtil
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)



Am I missing something?
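For reference, the spark-defaults.conf entries I mean are of this form (the jar
path here is a placeholder for the actual install location on each server):

        spark.driver.extraClassPath     /path/to/phoenix-spark.jar
        spark.executor.extraClassPath   /path/to/phoenix-spark.jar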



Nimrod
