On Feb 8, 2016, at 11:04 PM, pierre lacave <email@example.com> wrote:
Haven't met that one.
According to SPARK-1867, the real issue is hidden.
I'd proceed by elimination; maybe try in local[*] mode first.

On Tue, 9 Feb 2016, 04:58 Benjamin Kim <firstname.lastname@example.org> wrote:

Pierre,

I got it to work using phoenix-4.7.0-HBase-1.0-client-spark.jar. But now I get this error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, prod-dc1-datanode151.pdc1i.gradientx.com): java.lang.IllegalStateException: unread block data

It happens when I do:

df.show()

Getting closer…

Thanks,
Ben

On Feb 8, 2016, at 2:57 PM, pierre lacave <email@example.com> wrote:
This is the wrong client jar; try the one named phoenix-4.7.0-HBase-1.1-client-spark.jar.

On Mon, 8 Feb 2016, 22:29 Benjamin Kim <firstname.lastname@example.org> wrote:

Hi Josh,

I tried again by putting the settings within spark-defaults.conf:

spark.driver.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
spark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar

I still get the same error using the code below.

import org.apache.phoenix.spark._
val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))

Can you tell me what else you're doing?

Thanks,
Ben

On Feb 8, 2016, at 1:44 PM, Josh Mahonin <email@example.com> wrote:

Hi Ben,

I'm not sure about the format of those command-line options you're passing. I've had success with spark-shell just by setting the 'spark.executor.extraClassPath' and 'spark.driver.extraClassPath' options on the Spark config, as per the docs.

I'm not sure if there's anything special needed for CDH or not, though. I also have a Docker image I've been toying with which has a working Spark/Phoenix setup using the Phoenix 4.7.0 RC and Spark 1.6.0. It might be a useful reference for you as well.
Good luck,

On Mon, Feb 8, 2016 at 4:29 PM, Benjamin Kim <firstname.lastname@example.org> wrote:

Hi Pierre,

I tried to run in spark-shell using Spark 1.6.0 by running this:

spark-shell --master yarn-client --driver-class-path /opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar --driver-java-options "-Dspark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.0-client.jar"

The version of HBase is the one in CDH 5.4.8, which is 1.0.0-cdh5.4.8.

When I get to the line:

val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))

I get this error:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$

Any ideas?

Thanks,
Ben

On Feb 5, 2016, at 1:36 PM, pierre lacave <email@example.com> wrote:

I don't know when the full release will be; RC1 just got pulled out, and RC2 is expected soon. You can find them here:

https://dist.apache.org/repos/dist/dev/phoenix/

There is a new phoenix-4.7.0-HBase-1.1-client-spark.jar that is all you need to have in the Spark classpath.

On Fri, Feb 5, 2016 at 9:28 PM, Benjamin Kim <firstname.lastname@example.org> wrote:

Hi Pierre,

When will I be able to download this version?

Thanks,
Ben
On Friday, February 5, 2016, pierre lacave <email@example.com> wrote:

This was addressed in Phoenix 4.7 (currently in RC).

On Fri, Feb 5, 2016 at 6:17 PM, Benjamin Kim <firstname.lastname@example.org> wrote:

I cannot get this plugin to work in CDH 5.4.8 using Phoenix 4.5.2 and Spark 1.6. When I try to launch spark-shell, I get:
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
I continue on and run the example code. When I get to the line below:
val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "TEST.MY_TEST", "zkUrl" -> "zookeeper1,zookeeper2,zookeeper3:2181"))
I get this error:
Can someone help?
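[Editor's note for later readers: piecing the thread together, the setup that eventually worked seems to be: use the Spark-specific Phoenix 4.7.0 client jar that matches the cluster's HBase version, put it on both the driver and executor classpaths, and load the table through the phoenix-spark datasource. A rough, untested sketch; the jar path, table name, and ZooKeeper quorum below are placeholders taken from the examples in the thread:

# spark-defaults.conf (path is a placeholder)
spark.driver.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.1-client-spark.jar
spark.executor.extraClassPath=/opt/tools/phoenix/phoenix-4.7.0-HBase-1.1-client-spark.jar

// Then, in spark-shell (try --master "local[*]" first, as suggested above,
// to rule out executor-classpath problems on the cluster):
import org.apache.phoenix.spark._
val df = sqlContext.load("org.apache.phoenix.spark",
  Map("table" -> "TEST.MY_TEST", "zkUrl" -> "zk1,zk2,zk3:2181"))
df.show()

If the client jar doesn't match the cluster's HBase version, classpath-style failures like the NoClassDefFoundError and "unread block data" errors above appear to be the usual symptom.]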