phoenix-user mailing list archives

From Josh Mahonin <jmaho...@gmail.com>
Subject Re: Getting Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: org.apache.phoenix.spark. Please find packages at http://spark-packages.org Exception
Date Thu, 16 Mar 2017 15:06:35 GMT
Hi Sateesh,

It seems you are missing the import that gives Spark visibility into the
"org.apache.phoenix.spark" package. From the documentation page:

import org.apache.phoenix.spark._

I'm not entirely sure how this works in Java, however. You might have some
luck with:

import static org.apache.phoenix.spark.*;

If you do get this working, please update the list. It would be nice to add
this to the existing documentation.
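One other thing worth checking, independent of the import (this is my guess,
not something from the Phoenix docs): Spark resolves the
"org.apache.phoenix.spark" format string by loading classes from that package
at runtime, so the phoenix-spark jar also has to be on the application
classpath. Here is a small self-contained sketch to verify that; the
DefaultSource class name is my assumption about what the phoenix-spark jar
ships, so confirm it against the jar you actually deploy:

```java
// Quick classpath sanity check, runnable on its own (no Spark needed).
public class CheckPhoenixSpark {

    // Returns true if the named class can be loaded from the classpath.
    static boolean onClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "org.apache.phoenix.spark.DefaultSource" is my guess at the
        // provider class inside the phoenix-spark jar; verify with
        // `jar tf phoenix-spark-4.8.0-HBase-1.2.jar | grep DefaultSource`.
        String provider = "org.apache.phoenix.spark.DefaultSource";
        System.out.println(provider
                + (onClasspath(provider) ? " is" : " is NOT")
                + " on the classpath");
    }
}
```

If that prints "is NOT", the fix is in packaging (e.g. spark-submit --jars or
a shaded jar) rather than in the source code.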

Josh



On Wed, Mar 15, 2017 at 2:06 PM, Sateesh Karuturi <
sateesh.karuturi9@gmail.com> wrote:

> Hello folks,
>
> I am trying to execute a sample Spark-Phoenix application,
> but I am getting this exception:
>
> Exception in thread "main" java.lang.ClassNotFoundException: Failed to
> find data source: org.apache.phoenix.spark. Please find packages at
> http://spark-packages.org
>
> here is my code:
>
> package com.inndata.spark.sparkphoenix;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.SparkContext;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.DataFrame;
> import org.apache.spark.sql.SQLContext;
>
> import java.io.Serializable;
>
> /**
>  *
>  */
> public class SparkConnection implements Serializable {
>
>     public static void main(String args[]) {
>         SparkConf sparkConf = new SparkConf();
>         sparkConf.setAppName("spark-phoenix-df");
>         sparkConf.setMaster("local[*]");
>         JavaSparkContext sc = new JavaSparkContext(sparkConf);
>         SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
>
>         DataFrame df = sqlContext.read()
>                 .format("org.apache.phoenix.spark")
>                 .option("table", "ORDERS")
>                 .option("zkUrl", "localhost:2181")
>                 .load();
>         df.count();
>
>     }
> }
>
> and here is my pom.xml:
>
> <dependency>
>       <groupId>org.apache.phoenix</groupId>
>       <artifactId>phoenix-core</artifactId>
>       <version>4.8.0-HBase-1.2</version>
>     </dependency>
>
>     <dependency>
>       <groupId>org.scala-lang</groupId>
>       <artifactId>scala-library</artifactId>
>       <version>2.10.6</version>
>       <scope>provided</scope>
>     </dependency>
>     <dependency>
>       <groupId>org.apache.phoenix</groupId>
>       <artifactId>phoenix-spark</artifactId>
>       <version>4.8.0-HBase-1.2</version>
>     </dependency>
>
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-core_2.10</artifactId>
>       <version>1.6.2</version>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-sql_2.10</artifactId>
>       <version>1.6.2</version>
>     </dependency>
>
>
>     <dependency>
>       <groupId>org.apache.hadoop</groupId>
>       <artifactId>hadoop-client</artifactId>
>       <version>2.7.3</version>
>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hadoop</groupId>
>       <artifactId>hadoop-common</artifactId>
>       <version>2.7.3</version>
>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hadoop</groupId>
>       <artifactId>hadoop-hdfs</artifactId>
>       <version>2.7.3</version>
>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-client</artifactId>
>       <version>1.2.4</version>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-hadoop-compat</artifactId>
>       <version>1.2.4</version>
>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-hadoop2-compat</artifactId>
>       <version>1.2.4</version>
>
>     </dependency>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-server</artifactId>
>       <version>1.2.4</version>
>     </dependency>
>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-it</artifactId>
>       <version>1.2.4</version>
>       <type>test-jar</type>
>     </dependency>
>
>
>     <dependency>
>       <groupId>junit</groupId>
>       <artifactId>junit</artifactId>
>       <version>3.8.1</version>
>       <scope>test</scope>
>     </dependency>
>   </dependencies>
>
>
> here is the stackoverflow link:
>
>
> http://stackoverflow.com/questions/42816998/getting-failed-to-find-data-source-org-apache-phoenix-spark-please-find-packag
>
> please help me out.
>
>
