phoenix-user mailing list archives

From: Youngwoo Kim (김영우) <warwit...@gmail.com>
Subject: Re: phoenix-spark plugin with Spark 2.3
Date: Thu, 10 Jan 2019 01:42:45 GMT
Abhishek,

If you want to upgrade the Spark major version, you ONLY need to re-build
phoenix-client.jar for your Spark application. That means you don't need to
rebuild/upgrade the entire Phoenix package set on the cluster, and you don't
need to re-insert your data. Just make sure that the phoenix-client.jar on
Spark's extraClassPath is built against the proper Spark dependency version.
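
For example, in spark-defaults.conf (or as --conf options on spark-submit) the
settings would look roughly like this; the jar path below is just a placeholder
for wherever your rebuilt client jar lives:

  spark.driver.extraClassPath=/path/to/phoenix-<version>-client.jar
  spark.executor.extraClassPath=/path/to/phoenix-<version>-client.jar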

Thanks,
Youngwoo


On Thu, Jan 10, 2019 at 8:23 AM talluri abhishek <abhishektalluri@gmail.com>
wrote:

> Thanks Youngwoo.
> If that's the case, how do we handle an upgrade from one Spark version to
> another? Do we have to rebuild Phoenix and thereby re-insert all the data?
> Or is there a better way to handle this?
>
> Thanks
> Abhishek.
>
>
> On Wed, Jan 9, 2019 at 6:09 PM Youngwoo Kim (김영우) <ywkim@apache.org>
> wrote:
>
>> Hi Abhishek,
>>
>> Yes. You need to build the Phoenix packages with the proper Spark dependency
>> to run on Spark 2.x, e.g., mvn clean package -Dspark.version=2.0.0
>>
>> Also, there is a separate profile for Spark 1.6; see
>> https://github.com/apache/phoenix/blob/4.x-HBase-1.4/pom.xml#L1071
>>
>> So, if you would like to run the spark plugin on your Spark distro, you
>> should build the Phoenix package with the appropriate Spark dependency.
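>>
>> For instance, targeting a Spark 2.3 distro would look roughly like this
>> (the exact version string depends on your distribution):
>>
>>   mvn clean package -DskipTests -Dspark.version=2.3.0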
>>
>> HTH,
>> Youngwoo
>>
>> On Thu, Jan 10, 2019 at 6:32 AM talluri abhishek <abhishektalluri@gmail.com>
>> wrote:
>>
>>> Hi All,
>>>
>>> When trying to use the phoenix-spark plugin with Spark 2.3, we are getting
>>> a NoClassDefFoundError similar to the one described in the JIRA below:
>>> https://issues.apache.org/jira/browse/PHOENIX-3333
>>>
>>> We are using a Phoenix 4.14-cdh5.14 parcel alongside the 2.3.0.cloudera3
>>> Spark release to reproduce this issue. The plugin works fine with the
>>> Spark 1.6 that comes with the distribution, but not with the Spark 2.3
>>> version installed through a CSD/parcel. Is there any way to make it work
>>> with the given Phoenix 4.14 version and Spark 2.3? (We tried passing the
>>> phoenix-client jar and the phoenix-spark jar via the driver and executor
>>> extraClassPath.)
>>>
>>> I can see that Phoenix 4.14-cdh5.14 is built against the CDH Spark
>>> version, which is 1.6. Do we need to build against the required Spark
>>> version to make it work, or am I missing something? The docs/JIRA seem
>>> to suggest that the 4.10 version should be compatible with Spark 1.3.1+.
>>>
>>> Any help would be appreciated.
>>>
>>> Thanks,
>>> Abhishek
>>>
>>
