phoenix-user mailing list archives

From James Taylor <jamestay...@apache.org>
Subject Re: phoenix-spark error with index on target table
Date Sat, 13 Aug 2016 02:35:17 GMT
Thanks for letting us know, Nathan.

On Friday, August 12, 2016, Nathan Davis <nathan.davis@salesforce.com>
wrote:

> I was able to find a solution to this issue, so for posterity's sake here
> is the solution (perhaps more of a workaround):
>
> When executing the Spark driver (in my simple case spark-shell, but it should
> be the same with spark-submit), you need to provide hbase-protocol.jar
> both via `--jars` and in the `spark.executor.extraClassPath` setting via
> `--conf`. Example:
>
>> spark-shell --jars /usr/lib/hbase/hbase-protocol.jar,/usr/lib/phoenix/phoenix-spark-4.7.0-HBase-1.2.jar,/usr/lib/phoenix/phoenix-client.jar \
>>   --conf "spark.executor.extraClassPath=/usr/lib/hbase/hbase-protocol.jar"
>
>
> Then the HBaseZeroCopyByteString class resolves correctly.
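>
> The same executor setting should also work from conf/spark-defaults.conf
> instead of being passed with `--conf` on every invocation (a sketch, assuming
> the same jar location as above):
>
>   spark.executor.extraClassPath  /usr/lib/hbase/hbase-protocol.jar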
>
>
> On Fri, Aug 12, 2016 at 4:34 PM, Nathan Davis <nathan.davis@salesforce.com>
> wrote:
>
>> Hi All,
>> I am using phoenix-spark to write a DataFrame to a Phoenix table. All
>> works fine when just writing to a table alone. However, when I do the same
>> thing but with a global index on that table I get the following error. The
>> index is on two columns (varchar, date) with no includes.
>>
>> A little googling turns up some results about adding hbase-protocol.jar to
>> the Hadoop classpath, but I'm not sure which component needs it. I tried
>> adding that jar to the Spark driver and executor classpaths, but I still get
>> the same error. So I wonder if I need to add it to HBase's classpath...
>> Does this look familiar to anyone?
>>
>>
>>> java.sql.SQLException: java.util.concurrent.ExecutionException: java.lang.Exception: java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
>>> at org.apache.phoenix.cache.ServerCacheClient.addServerCache(ServerCacheClient.java:266)
>>> at org.apache.phoenix.index.IndexMetaDataCacheClient.addIndexMetadataCache(IndexMetaDataCacheClient.java:78)
>>> at org.apache.phoenix.execute.MutationState.setMetaDataOnMutations(MutationState.java:1068)
>>> at org.apache.phoenix.execute.MutationState.send(MutationState.java:918)
>>> at org.apache.phoenix.execute.MutationState.send(MutationState.java:1317)
>>> at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1149)
>>> at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:520)
>>> at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:517)
>>> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>> at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:517)
>>> at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:82)
>>> ... 13 more
>>
>>
>>
>> Thanks,
>>  -nathan
>>
>
>
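
For anyone searching for this error: below is a minimal sketch (Scala, for
spark-shell) of the kind of phoenix-spark DataFrame write that exercises this
code path when the target table has a global index. The table name, columns,
index name, and ZooKeeper quorum are made-up placeholders, not the schema from
the thread above.

  import org.apache.spark.sql.SaveMode

  // Hypothetical schema, e.g.:
  //   CREATE TABLE EVENTS (ID VARCHAR PRIMARY KEY, HOST VARCHAR, CREATED DATE)
  //   CREATE INDEX EVENTS_IDX ON EVENTS (HOST, CREATED)

  val df = sqlContext.createDataFrame(Seq(
    ("e1", "host-a", new java.sql.Date(System.currentTimeMillis))
  )).toDF("ID", "HOST", "CREATED")

  // phoenix-spark upserts the rows and expects SaveMode.Overwrite.
  // With a global index on the table, the commit also ships index metadata to
  // the region servers (the addServerCache call in the stack trace above),
  // which is where a missing hbase-protocol.jar on the executor classpath
  // surfaces as the HBaseZeroCopyByteString IllegalAccessError.
  df.write
    .format("org.apache.phoenix.spark")
    .mode(SaveMode.Overwrite)
    .option("table", "EVENTS")
    .option("zkUrl", "zk-host:2181") // placeholder ZooKeeper quorum
    .save()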
