phoenix-user mailing list archives

From "Thomas D'Silva" <tdsi...@salesforce.com>
Subject Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63
Date Wed, 12 Sep 2018 20:36:40 GMT
Can you attach the schema of your table, and the explain plan for select *
from mytable?
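
For reference, both can be pulled from sqlline. A sketch, assuming the table
was created as "myTable" (the name and schema here are placeholders; adjust
to match your DDL):

```sql
-- Describe the table as Phoenix sees it (sqlline meta-command):
-- !describe "myTable"

-- Or read the column metadata straight from the Phoenix catalog:
SELECT COLUMN_NAME, COLUMN_FAMILY, DATA_TYPE
FROM SYSTEM.CATALOG
WHERE TABLE_NAME = 'myTable';

-- Query plan for the full scan:
EXPLAIN SELECT * FROM "myTable";
```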

On Tue, Sep 11, 2018 at 10:24 PM, Tanvi Bhandari <tanvi.bhandari@gmail.com>
wrote:

> " mapped hbase tables to phoenix and created them explicitly from phoenix
> sqlline client. I first created schema corresponding to namespace and then
> tables." By this statement, I meant the same. I re-created my tables
> since I had the DDLs with me.
>
> After that, I queried the count of records in my table - *select count(*)
> from "myTable";* - which returned 8 records (the expected result).
> But *select * from "myTable";* does not return any rows.
>
> On Wed, Sep 12, 2018 at 1:55 AM Thomas D'Silva <tdsilva@salesforce.com>
> wrote:
>
>> Since you dropped all the system tables, all the phoenix metadata was
>> lost. If you have the ddl statements used to create your tables, you can
>> try rerunning them.
>>
>> On Tue, Sep 11, 2018 at 9:32 AM, Tanvi Bhandari <tanvi.bhandari@gmail.com
>> > wrote:
>>
>>> Hi,
>>>
>>>
>>>
>>> I am trying to upgrade the Phoenix binaries in my setup from phoenix-4.6
>>> (where schemas were optional) to phoenix-4.14 (where schemas are
>>> mandatory).
>>>
>>> Earlier, I had the phoenix-4.6-hbase-1.1 binaries. When I run
>>> phoenix-4.14-hbase-1.3 on the same data, HBase comes up fine, but when I
>>> try to connect to Phoenix using the sqlline client, I get the following
>>> error on the *console*:
>>>
>>>
>>>
>>> 18/09/07 04:22:48 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
>>> org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM:CATALOG: 63
>>>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3572)
>>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16422)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
>>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
>>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>         at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 63
>>>         at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:517)
>>>         at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:421)
>>>         at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:406)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1046)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:587)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1305)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3568)
>>>         ... 10 more
>>>
>>>
>>>
>>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:326)
>>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1629)
>>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
>>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
>>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
>>>         at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
>>>         at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
>>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getVersion(MetaDataProtos.java:16739)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$5.call(ConnectionQueryServicesImpl.java:1271)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$5.call(ConnectionQueryServicesImpl.java:1263)
>>>         at org.apache.hadoop.hbase.client.HTable$15.call(HTable.java:1736)
>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>         at java.lang.Thread.run(Thread.java:745)
>>>
>>>
>>>
>>>
>>>
>>> *Region-server logs are as follows: *
>>>
>>> 2018-09-07 03:23:36,170 ERROR [B.defaultRpcServer.handler=1,queue=1,port=29062] coprocessor.MetaDataEndpointImpl: loading system catalog table inside getVersion failed
>>> java.lang.ArrayIndexOutOfBoundsException: 63
>>>         at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:517)
>>>         at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:421)
>>>         at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:406)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1046)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:587)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1305)
>>>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3568)
>>>         at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16422)
>>>         at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
>>>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
>>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>         at java.lang.Thread.run(Thread.java:745)
>>>
>>>
>>>
>>> Suspecting that something was wrong with the SYSTEM tables, I went ahead
>>> and dropped all SYSTEM tables from the hbase shell and tried connecting
>>> with the phoenix sqlline client again. This time the connection worked,
>>> but none of my tables were visible in the Phoenix shell; only the SYSTEM
>>> tables were. So I mapped the hbase tables to phoenix and created them
>>> explicitly from the phoenix sqlline client: I first created the schema
>>> corresponding to each namespace, and then the tables. After that my
>>> tables were visible in phoenix sqlline. A *select count(*)* query on my
>>> table returns the expected 8 records, but *select ** returns no records.
>>> Can someone tell me what I can do next in this case?
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Tanvi
>>>
>>>
>>>
>>
>>
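
The SYSTEM:CATALOG name in the error (colon-separated, rather than
SYSTEM.CATALOG) indicates the cluster has namespace mapping enabled. For
anyone reproducing this setup, namespace mapping in Phoenix 4.8+ is
controlled by properties in hbase-site.xml that must be set identically on
the client and on every region server. A sketch of the relevant fragment
(values shown are an assumption about this cluster, not confirmed in the
thread):

```xml
<!-- hbase-site.xml: must match on client AND server, or metadata
     lookups against SYSTEM:CATALOG fail -->
<property>
  <name>phoenix.schema.isNamespaceMappingEnabled</name>
  <value>true</value>
</property>
<property>
  <name>phoenix.schema.mapSystemTablesToNamespace</name>
  <value>true</value>
</property>
```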
