Hi All,
I am using phoenix-spark to write a DataFrame to a Phoenix table. Everything works fine when writing to a plain table. However, when the table has a global index, I get the error below. The index is on two columns (varchar, date) with no included columns.
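For reference, the write looks roughly like this (`df` is the DataFrame in question; the table name and zkUrl are placeholders for my real values):

```scala
import org.apache.spark.sql.SaveMode

// Placeholder table name and ZooKeeper quorum.
df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)  // phoenix-spark requires Overwrite; it performs upserts
  .option("table", "MY_TABLE")
  .option("zkUrl", "zkhost:2181")
  .save()
```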

A little googling turns up some results about adding hbase-protocol.jar to the Hadoop classpath, but I'm not sure which component needs it. I tried adding that jar to the Spark driver and executor classpaths, but I still get the same error. So I wonder if I need to add it to HBase's classpath instead... Does this look familiar to anyone?
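Concretely, this is how I added the jar (the jar path is a placeholder for wherever it lives on my nodes):

```shell
# Attempt: put hbase-protocol.jar on both the driver and executor classpaths.
spark-submit \
  --conf spark.driver.extraClassPath=/opt/hbase/lib/hbase-protocol.jar \
  --conf spark.executor.extraClassPath=/opt/hbase/lib/hbase-protocol.jar \
  --class com.example.MyJob myjob.jar
```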


java.sql.SQLException: java.util.concurrent.ExecutionException: java.lang.Exception: java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
at org.apache.phoenix.cache.ServerCacheClient.addServerCache(ServerCacheClient.java:266)
at org.apache.phoenix.index.IndexMetaDataCacheClient.addIndexMetadataCache(IndexMetaDataCacheClient.java:78)
at org.apache.phoenix.execute.MutationState.setMetaDataOnMutations(MutationState.java:1068)
at org.apache.phoenix.execute.MutationState.send(MutationState.java:918)
at org.apache.phoenix.execute.MutationState.send(MutationState.java:1317)
at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1149)
at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:520)
at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:517)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:517)
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:82)
... 13 more


Thanks,
 -nathan