Thanks for following up with the root cause. This bug has been fixed in PHOENIX-930 and will appear in the next release (4.8.1).

On Tue, Sep 6, 2016 at 7:42 PM, Yang Zhang <> wrote:

I have already handled this problem with your help, and I also learned what caused the mistake:
when I create a table with a duplicate column name (e.g. create table test(id integer primary key, name char(32), name char(64))),
Phoenix throws an exception, but I can still find the wrong table afterwards. This leaves my table not matching my metadata.
After that, I can't drop the table and get the DoNotRetryIOException.

I am not sure whether this is a known issue or whether a later version has already fixed it (my version is 4.4.0).
But in my opinion, Phoenix should not create a half-created table when the user's statement is invalid.

2016-09-07 9:21 GMT+08:00 James Taylor <>:
Sorry for the issue you've hit, Yang Zhang. You may need to do the following to recover:
- drop the table from the hbase shell
- create a snapshot of the SYSTEM.CATALOG table just in case
- delete the rows for the table from the SYSTEM.CATALOG table (i.e. issue a DELETE FROM SYSTEM.CATALOG WHERE TABLE_SCHEM = <your schema name> AND TABLE_NAME = <your table name>)
- bounce your cluster (since the SYSTEM.CATALOG table is cached)
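
Concretely, the steps would look something like the sketch below, where MY_SCHEMA and MY_TABLE are placeholders you'd replace with your own schema and table names:

```sql
-- 1) from the hbase shell: drop the physical table, then snapshot
--    SYSTEM.CATALOG as a backup before touching it:
--      disable 'MY_SCHEMA.MY_TABLE'
--      drop 'MY_SCHEMA.MY_TABLE'
--      snapshot 'SYSTEM.CATALOG', 'catalog_backup'
-- 2) from sqlline: delete the orphaned metadata rows for the table
--    (MY_SCHEMA / MY_TABLE are placeholders):
DELETE FROM SYSTEM.CATALOG
 WHERE TABLE_SCHEM = 'MY_SCHEMA'
   AND TABLE_NAME  = 'MY_TABLE';
-- 3) bounce the cluster so the cached SYSTEM.CATALOG entries are discarded
```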

On Tue, Sep 6, 2016 at 5:56 PM, Yang Zhang <> wrote:
I got the DoNotRetryIOException again.

This time I had just tried to create a table; when I tried to drop it, I got this exception.
Here is the exception stack:

Error: org.apache.hadoop.hbase.DoNotRetryIOException: MAGNETISM_MODEL: 6
at org.apache.phoenix.util.ServerUtil.createIOException(
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(
at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(
at org.apache.hadoop.hbase.regionserver.HRegion.execService(
at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(
at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(
at org.apache.hadoop.hbase.ipc.RpcExecutor$
Caused by: java.lang.ArrayIndexOutOfBoundsException: 6
at org.apache.phoenix.schema.PTableImpl.init(
at org.apache.phoenix.schema.PTableImpl.<init>(
at org.apache.phoenix.schema.PTableImpl.makePTable(
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doDropTable(
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(
... 10 more

SQLState:  08000
ErrorCode: 101

I still can't resolve this problem. Can anyone help me?

Thank you very much

2016-07-23 10:34 GMT+08:00 Yang Zhang <>:
Hello everyone

      I got an org.apache.hadoop.hbase.DoNotRetryIOException when using Phoenix. My version is phoenix-4.4.0-HBase-0.98-bin.
I created a table and upserted data into it. Later I wanted to modify it, so I executed (alter table drop column c1) and (alter table add c2 bigint). After that I upserted new data into the table for each row. I tried select * from my table and it succeeded.
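
Written out in full, the sequence I ran was roughly the following (TEST and the column names here are placeholders for my real ones):

```sql
-- placeholders: TEST is the table, c1 an existing column, c2 a new column
ALTER TABLE TEST DROP COLUMN c1;
ALTER TABLE TEST ADD c2 BIGINT;
-- then re-populate the new column row by row, e.g.:
UPSERT INTO TEST (id, c2) VALUES (1, 100);
```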

      But after a while, when I tried select * from it again, I got a DoNotRetryIOException at org.apache.phoenix.util.ServerUtil.createIOException( and a java.lang.ArrayIndexOutOfBoundsException: 10 at org.apache.phoenix.schema.PTableImpl.init(

      When I try to drop the table I still get the DoNotRetryIOException. By the way, has anyone tried dropping a Phoenix table from the hbase shell? I need to find some way to drop this table.

Thanks very much!