Your assumptions are not unreasonable :) Phoenix 5.0.x should certainly
work with HBase 2.0.x. Glad to see that it's been corrected already
(embarrassing that I don't even remember reviewing this).
Let me start a thread on dev@phoenix about a 5.0.1 or a 5.1.0. We need
to have a Phoenix 5.x that works with all of HBase 2.0.x (and hopefully
2.1.x too).
On 9/25/18 9:25 PM, Francis Chuang wrote:
> After some investigation, I found that Phoenix 5.0.0 is only compatible
> with HBase 2.0.0.
>
> In 2.0.1 and onward, compare(final Cell a, final Cell b) in
> CellComparatorImpl was changed to final:
> https://github.com/apache/hbase/blame/master/hbase-common/src/main/java/org/apache/hadoop/hbase/CellComparatorImpl.java#L67
>
> This change affected HBase 2.0.1 and 2.0.2.
>
> Phoenix 5.0.0 relies on overriding this method, so it breaks on those releases:
> https://github.com/apache/phoenix/blob/8a819c6c3b4befce190c6ac759f744df511de61d/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java#L84
>
> Fortunately, this is fixed in Phoenix master:
> https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/covered/data/IndexMemStore.java#L82
>
> The issue should be resolved in the next release of Phoenix.
>
> The problem is that I wrongly assumed that HBase's version numbers
> follow semver and that a patch release would not introduce breaking
> changes.
>
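For anyone hitting this with their own code against HBase coprocessor APIs: the VerifyError comes from subclassing a comparator and overriding a method that has since become final. Delegating to the comparator instead of extending it side-steps the problem, which is the spirit of the fix linked above. A minimal sketch of the delegation pattern, using simplified stand-in types (String instead of HBase's Cell, and a hypothetical BaseCellComparator instead of CellComparatorImpl):

```java
import java.util.Comparator;

// Stand-in for CellComparatorImpl, whose compare(Cell, Cell) is final
// as of HBase 2.0.1 (types simplified to String for illustration).
class BaseCellComparator {
    public final int compare(String a, String b) {
        return a.compareTo(b);
    }
}

// Extending BaseCellComparator and overriding compare would fail
// (VerifyError at class-load time when compiled against the older,
// non-final version). Wrapping it and delegating avoids the override:
class DelegatingCellComparator implements Comparator<String> {
    private final BaseCellComparator delegate = new BaseCellComparator();

    @Override
    public int compare(String a, String b) {
        // Any custom logic can run before or after the delegated call.
        return delegate.compare(a, b);
    }
}

public class DelegationSketch {
    public static void main(String[] args) {
        Comparator<String> c = new DelegatingCellComparator();
        System.out.println(c.compare("a", "b") < 0); // prints "true"
    }
}
```

Because the delegating class implements java.util.Comparator rather than extending the HBase class, it keeps working even when the underlying compare method's modifiers change between patch releases.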
> On 26/09/2018 1:04 AM, Jaanai Zhang wrote:
>>
>>
>> Is my method of installing HBase and Phoenix correct?
>>
>> Did you check which versions of HBase exist in your classpath?
>>
>> Is this a compatibility issue with Guava?
>>
>> It isn't an exception caused by a Guava incompatibility.
>>
>> ----------------------------------------
>> Jaanai Zhang
>> Best regards!
>>
>>
>>
>> Francis Chuang <francischuang@apache.org
>> <mailto:francischuang@apache.org>> wrote on Tue, 25 Sep 2018 at 8:25 PM:
>>
>> Thanks for taking a look, Jaanai!
>>
>> Is my method of installing HBase and Phoenix correct? See
>> https://github.com/Boostport/hbase-phoenix-all-in-one/blob/master/Dockerfile#L12
>>
>> Is this a compatibility issue with Guava?
>>
>> Francis
>>
>> On 25/09/2018 10:21 PM, Jaanai Zhang wrote:
>>>
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore$1
>>> overrides
>>> final method
>>> compare.(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;)I
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>> ...
>>> It looks like HBase's jars are incompatible.
>>>
>>> ----------------------------------------
>>> Jaanai Zhang
>>> Best regards!
>>>
>>>
>>>
>>> Francis Chuang <francischuang@apache.org
>>> <mailto:francischuang@apache.org>> wrote on Tue, 25 Sep 2018 at 8:06 PM:
>>>
>>> Hi All,
>>>
>>> I recently updated one of my Go apps to use Phoenix 5.0 with
>>> HBase
>>> 2.0.2. I am using my Phoenix + HBase all-in-one Docker image
>>> available
>>> here: https://github.com/Boostport/hbase-phoenix-all-in-one
>>>
>>> This is the log/output from the exception:
>>>
>>> RuntimeException: org.apache.phoenix.execute.CommitException:
>>> org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException:
>>>
>>> Failed 1 action:
>>> org.apache.phoenix.hbase.index.builder.IndexBuildingFailureException:
>>>
>>> Failed to build index for unexpected reason!
>>> at
>>> org.apache.phoenix.hbase.index.util.IndexManagementUtil.rethrowIndexingException(IndexManagementUtil.java:206)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:351)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1010)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost$ObserverOperationWithoutResult.callObserver(CoprocessorHost.java:540)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost.execOperation(CoprocessorHost.java:614)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preBatchMutate(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion$MutationBatchOperation.prepareMiniBatchOperations(HRegion.java:3487)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.doMiniBatchMutate(HRegion.java:3896)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3854)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3785)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:1027)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicBatchOp(RSRpcServices.java:959)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:922)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2666)
>>> at
>>> org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
>>> at
>>> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
>>> Caused by: java.lang.VerifyError: class
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore$1
>>> overrides
>>> final method
>>> compare.(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;)I
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>> at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>> at
>>> java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>> at
>>> java.security.AccessController.doPrivileged(Native Method)
>>> at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> at
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore.<init>(IndexMemStore.java:82)
>>> at
>>> org.apache.phoenix.hbase.index.covered.LocalTableState.<init>(LocalTableState.java:57)
>>> at
>>> org.apache.phoenix.hbase.index.covered.NonTxIndexBuilder.getIndexUpdate(NonTxIndexBuilder.java:52)
>>> at
>>> org.apache.phoenix.hbase.index.builder.IndexBuildManager.getIndexUpdate(IndexBuildManager.java:90)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutateWithExceptions(Indexer.java:503)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:348)
>>> ... 18 more
>>> : 1 time, servers with issues:
>>> 9ac923bd5c9f,16020,1537875547341
>>> -> CommitException:
>>> org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException:
>>>
>>> Failed 1 action:
>>> org.apache.phoenix.hbase.index.builder.IndexBuildingFailureException:
>>>
>>> Failed to build index for unexpected reason!
>>> at
>>> org.apache.phoenix.hbase.index.util.IndexManagementUtil.rethrowIndexingException(IndexManagementUtil.java:206)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:351)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1010)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost$ObserverOperationWithoutResult.callObserver(CoprocessorHost.java:540)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost.execOperation(CoprocessorHost.java:614)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preBatchMutate(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion$MutationBatchOperation.prepareMiniBatchOperations(HRegion.java:3487)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.doMiniBatchMutate(HRegion.java:3896)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3854)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3785)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:1027)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicBatchOp(RSRpcServices.java:959)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:922)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2666)
>>> at
>>> org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
>>> at
>>> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
>>> Caused by: java.lang.VerifyError: class
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore$1
>>> overrides
>>> final method
>>> compare.(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;)I
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>> at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>> at
>>> java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>> at
>>> java.security.AccessController.doPrivileged(Native Method)
>>> at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> at
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore.<init>(IndexMemStore.java:82)
>>> at
>>> org.apache.phoenix.hbase.index.covered.LocalTableState.<init>(LocalTableState.java:57)
>>> at
>>> org.apache.phoenix.hbase.index.covered.NonTxIndexBuilder.getIndexUpdate(NonTxIndexBuilder.java:52)
>>> at
>>> org.apache.phoenix.hbase.index.builder.IndexBuildManager.getIndexUpdate(IndexBuildManager.java:90)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutateWithExceptions(Indexer.java:503)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:348)
>>> ... 18 more
>>> : 1 time, servers with issues:
>>> 9ac923bd5c9f,16020,1537875547341
>>> -> RetriesExhaustedWithDetailsException: Failed 1 action:
>>> org.apache.phoenix.hbase.index.builder.IndexBuildingFailureException:
>>>
>>> Failed to build index for unexpected reason!
>>> at
>>> org.apache.phoenix.hbase.index.util.IndexManagementUtil.rethrowIndexingException(IndexManagementUtil.java:206)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:351)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1010)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$28.call(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost$ObserverOperationWithoutResult.callObserver(CoprocessorHost.java:540)
>>> at
>>> org.apache.hadoop.hbase.coprocessor.CoprocessorHost.execOperation(CoprocessorHost.java:614)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preBatchMutate(RegionCoprocessorHost.java:1007)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion$MutationBatchOperation.prepareMiniBatchOperations(HRegion.java:3487)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.doMiniBatchMutate(HRegion.java:3896)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3854)
>>> at
>>> org.apache.hadoop.hbase.regionserver.HRegion.batchMutate(HRegion.java:3785)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doBatchOp(RSRpcServices.java:1027)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicBatchOp(RSRpcServices.java:959)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:922)
>>> at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2666)
>>> at
>>> org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
>>> at
>>> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
>>> Caused by: java.lang.VerifyError: class
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore$1
>>> overrides
>>> final method
>>> compare.(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;)I
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>> at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>> at
>>> java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>> at
>>> java.security.AccessController.doPrivileged(Native Method)
>>> at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> at
>>> org.apache.phoenix.hbase.index.covered.data.IndexMemStore.<init>(IndexMemStore.java:82)
>>> at
>>> org.apache.phoenix.hbase.index.covered.LocalTableState.<init>(LocalTableState.java:57)
>>> at
>>> org.apache.phoenix.hbase.index.covered.NonTxIndexBuilder.getIndexUpdate(NonTxIndexBuilder.java:52)
>>> at
>>> org.apache.phoenix.hbase.index.builder.IndexBuildManager.getIndexUpdate(IndexBuildManager.java:90)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutateWithExceptions(Indexer.java:503)
>>> at
>>> org.apache.phoenix.hbase.index.Indexer.preBatchMutate(Indexer.java:348)
>>> ... 18 more
>>> : 1 time, servers with issues:
>>> 9ac923bd5c9f,16020,1537875547341
>>>
>>> I am not entirely sure if this is a bug, so I have not opened a
>>> JIRA yet.
>>> Due to Guava issues with Tephra (see TEPHRA-181), I replaced
>>> HBase's
>>> Guava 11.0.2 with Guava 13.0.1. See
>>> https://github.com/Boostport/hbase-phoenix-all-in-one/blob/master/Dockerfile#L43
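The Guava swap in that Dockerfile boils down to deleting the bundled jar and dropping the newer one into HBase's lib directory. A minimal sketch of the mechanics (the lib path is an assumption, and the `touch` calls stand in for the real jars; in the actual image the new jar is downloaded from Maven Central rather than created empty):

```shell
# Stand-in for HBase's lib directory, e.g. /opt/hbase/lib in a real image.
HBASE_LIB="$(mktemp -d)/hbase/lib"
mkdir -p "$HBASE_LIB"
touch "$HBASE_LIB/guava-11.0.2.jar"   # pretend this is the bundled Guava

# Remove the bundled Guava 11.0.2...
rm -f "$HBASE_LIB/guava-11.0.2.jar"

# ...and install 13.0.1. In a real build, fetch it instead of touching it:
#   curl -fsSL -o "$HBASE_LIB/guava-13.0.1.jar" \
#     https://repo1.maven.org/maven2/com/google/guava/guava/13.0.1/guava-13.0.1.jar
touch "$HBASE_LIB/guava-13.0.1.jar"

ls "$HBASE_LIB"
```

Since HBase puts every jar under lib/ on the region server classpath, whichever Guava version sits there is the one coprocessors like Phoenix and Tephra see at runtime.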
>>>
>>> Is Guava 13.0.1 compatible with this release (signaling that
>>> this might
>>> be caused by incompatible versions of Guava) or should I be
>>> investigating
>>> something else?
>>>
>>> Francis
>>>
>>
>