phoenix-user mailing list archives

From Siddharth Ubale <>
Subject ArrayIndexOutOfBounds exception
Date Tue, 11 Jul 2017 10:02:07 GMT

We are using a Phoenix table whose structure we constantly upgrade by altering the table
and adding new columns.
Roughly every 3 days, after some ALTER commands have been applied, the table becomes
unusable via Phoenix.
Based on online discussions, we were earlier under the impression that a duplicate column
gets created, causing a metadata issue that Phoenix is unable to manage at this stage.
Also, there exists an unresolved issue below about the same.

Can anyone tell me if they are facing the same issue, and what they have done to check this?
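One way to check the duplicate-column suspicion is to inspect the table's metadata in Phoenix's SYSTEM.CATALOG and look for column names that appear more than once. A minimal sketch follows; the SQL text in the comment and the helper function are illustrative assumptions (not from the original report), and only the local duplicate check on already-fetched rows is shown:

```python
# Hypothetical check for duplicate column entries in Phoenix metadata.
# In practice the rows would come from a query such as:
#   SELECT COLUMN_FAMILY, COLUMN_NAME FROM SYSTEM.CATALOG
#   WHERE TABLE_NAME = 'DATAWAREHOUSE3' AND COLUMN_NAME IS NOT NULL;
# Here we only demonstrate the duplicate detection on a fetched row list.

from collections import Counter

def find_duplicate_columns(rows):
    """rows: iterable of (column_family, column_name) tuples.
    Returns the (family, name) pairs that appear more than once."""
    counts = Counter(rows)
    return [col for col, n in counts.items() if n > 1]

# Made-up example metadata rows: CF1.COL_A is recorded twice.
rows = [("CF1", "COL_A"), ("CF1", "COL_B"), ("CF1", "COL_A")]
print(find_duplicate_columns(rows))  # → [('CF1', 'COL_A')]
```

If this reports duplicates for the affected table, that would support the metadata-corruption theory described above.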

Please find the stack trace for my problem below:

Error: org.apache.hadoop.hbase.DoNotRetryIOException: DATAWAREHOUSE3: null
                at org.apache.phoenix.util.ServerUtil.createIOException(
                at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(
                at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(
                at org.apache.hadoop.hbase.regionserver.HRegion.execService(
                at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(
                at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(
                at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(
                at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(
                at org.apache.hadoop.hbase.ipc.RpcExecutor$
Caused by: java.lang.ArrayIndexOutOfBoundsException

SQLState:  08000
ErrorCode: 101

Siddharth Ubale,
