phoenix-user mailing list archives

From talluri abhishek <>
Subject Re: phoenix-spark plugin with Spark 2.3
Date Wed, 09 Jan 2019 23:13:20 GMT
Thanks, Youngwoo.
If that's the case, how do we handle an upgrade from one Spark version to
another? Do we have to rebuild Phoenix each time, or is there a better way
to handle this?
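[A hedged sketch of the rebuild step for a Spark upgrade; the target version number is an illustrative assumption, not a value confirmed in this thread. Note that only the client-side jars are rebuilt here; data already stored in HBase is not touched by this step.]

```shell
# Rebuild Phoenix against the Spark version you are upgrading to.
# Run from a Phoenix source checkout; 2.3.0 is an assumed target version.
mvn clean package -DskipTests -Dspark.version=2.3.0
```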


On Wed, Jan 9, 2019 at 6:09 PM Youngwoo Kim (김영우) <> wrote:

> Hi Abhishek,
> Yes, you need to build the Phoenix packages with the proper Spark
> dependency to run on Spark 2.x, e.g., mvn clean package -Dspark.version=2.0.0
> There is also another Maven profile for Spark 1.6.
> So, if you would like to run the spark plugin on your Spark distro, you
> should build the Phoenix package with the appropriate Spark dependency.
> HTH,
> Youngwoo
> On Thu, Jan 10, 2019 at 6:32 AM talluri abhishek <> wrote:
>> Hi All,
>> When trying to use the phoenix-spark plugin with Spark 2.3, we are getting
>> a NoClassDefFoundError similar to the one described in the below JIRA.
>> We are using a Phoenix 4.14-cdh5.14 parcel alongside the 2.3.0.cloudera3
>> Spark release to reproduce this issue. The plugin works fine with the
>> Spark 1.6 that comes with the distribution, but not with the Spark 2.3
>> version installed through a CSD/parcel. Is there any way to make it work
>> with the given Phoenix 4.14 version and Spark 2.3 version? (We tried
>> passing the phoenix-client jar and the phoenix-spark jar via the driver
>> and executor extraClassPath.)
>> I could see that Phoenix 4.14-cdh5.14 is built against the CDH Spark
>> version, which is 1.6. Do we need to build against the required Spark
>> version in order to make it work, or am I missing something? The
>> docs/JIRA seem to suggest that the 4.10 version should be compatible
>> with Spark 1.3.1+ versions.
>> Any help would be appreciated.
>> Thanks,
>> Abhishek
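[The extraClassPath approach mentioned in the thread can be sketched as below. The jar paths and application name are assumptions for illustration; per the advice above, the jars on the classpath must come from a Phoenix build made against the Spark version actually in use, or the NoClassDefFoundError will persist.]

```shell
# Assumed paths; substitute the jars produced by your own Phoenix build.
PHOENIX_CLIENT=/opt/phoenix/phoenix-client.jar
PHOENIX_SPARK=/opt/phoenix/phoenix-spark.jar

# Put the jars on both the driver and executor classpaths.
spark-submit \
  --conf "spark.driver.extraClassPath=${PHOENIX_CLIENT}:${PHOENIX_SPARK}" \
  --conf "spark.executor.extraClassPath=${PHOENIX_CLIENT}:${PHOENIX_SPARK}" \
  your-app.jar
```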
