phoenix-user mailing list archives

From Ravi Kiran <maghamraviki...@gmail.com>
Subject Re: Can't drop table, help!
Date Sun, 15 Jun 2014 23:41:52 GMT
Hi Russell,
   When recreating the table, does it fail with a TABLE_ALREADY_EXIST
exception?

    If possible, can you please confirm whether you see the table 'DEV_HET_MEF'
from the ZooKeeper client (zkcli.sh):
      a)   hbase zkcli -server <host>:<port>
      b)   ls /hbase/table/
      If so, you can remove it by running rmr /hbase/table/DEV_HET_MEF
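For reference, the steps above as one sequence. This is a sketch that assumes
the default znode parent of /hbase; if your hbase-site.xml overrides
zookeeper.znode.parent, adjust the paths accordingly.

```shell
# 1. Open the ZooKeeper shell bundled with HBase (fill in your quorum host/port).
hbase zkcli -server <host>:<port>

# 2. Inside the shell, check whether a stale znode for the table exists.
ls /hbase/table/

# 3. If DEV_HET_MEF is listed, remove its znode recursively.
rmr /hbase/table/DEV_HET_MEF
```

Removing the znode only clears the stale table registration in ZooKeeper; the
Phoenix metadata rows in SYSTEM.TABLE still need to be cleaned up separately.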

Also, can you please let me know which version of Phoenix you are using, so
I can try to reproduce the problem.

Regards
Ravi



On Sun, Jun 15, 2014 at 12:58 PM, Russell Jurney <russell.jurney@gmail.com>
wrote:

> We created a table with duplicate fields by mistake.  Now we are unable to
> drop the table:
>
>
> 0: jdbc:phoenix:hiveapp1> drop table DEV_HET_MEF;
> Error: org.apache.hadoop.hbase.DoNotRetryIOException: DEV_HET_MEF: 28
> at
> com.salesforce.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:83)
> at
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:578)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:5490)
> at
> org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3720)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
> at
> org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1428)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 28
> at com.salesforce.phoenix.schema.PTableImpl.init(PTableImpl.java:213)
> at com.salesforce.phoenix.schema.PTableImpl.<init>(PTableImpl.java:181)
> at com.salesforce.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:176)
> at
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:381)
> at
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:215)
> at
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.doDropTable(MetaDataEndpointImpl.java:595)
> at
> com.salesforce.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:558)
> ... 12 more (state=08000,code=101)
>
>
> So I go to hbase shell and drop the table there...
>
> disable 'DEV_HET_MEF'
>
> drop 'DEV_HET_MEF'
>
>
> Finally, I scan 'SYSTEM.TABLE' and remove all rows corresponding to
> 'DEV_HET_MEF'.
>
> Normally this works, and the table should be gone! But this time, the
> table can't be recreated. What should I do? I tried restarting HBase, with
> no effect.
>
> Thanks!
> --
> Russell Jurney twitter.com/rjurney russell.jurney@gmail.com datasyndrome.
> com
>
