phoenix-user mailing list archives

From Dmitry Goldenberg <dgoldenberg...@gmail.com>
Subject Re: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.CATALOG
Date Wed, 04 Nov 2015 22:25:05 GMT
Thank you, Michael.

Yes, we've been looking at the HBase logs. What stands out is the stack
trace below. The http://hbase.apache.org/0.94/book.html doc notes that
"By default, hbase.rootdir is set to /tmp/hbase-${user.name} and similarly
so for the default ZooKeeper data location which means you'll lose all your
data whenever your server reboots unless you change it."

Is there a connection here? We have not been setting hbase.rootdir
explicitly in our hbase-site.xml file. Could it be the case that the box
got rebooted and we lost our HBase data?  I'd be perplexed if the *system*
HBase data got lost...  We've been running without setting hbase.rootdir
explicitly and with machine reboots.  The system has been OK, with no data
loss observed in our tables.  Is this still an hbase.rootdir problem?
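
In case it helps anyone else reading this: the override we'd presumably add is
along these lines. This is just a sketch; the namenode URI and the local
ZooKeeper path below are made-up examples and would need to point at durable
storage on the actual cluster:

```xml
<!-- hbase-site.xml: pin the HBase root dir and ZooKeeper data dir to
     persistent storage instead of the /tmp defaults.
     The HDFS URI and local path here are hypothetical examples. -->
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://namenode.example.com:9000/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/var/lib/zookeeper</value>
  </property>
</configuration>
```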

Stack trace:

2015-11-04 16:51:36,764 DEBUG [FifoRpcScheduler.handler1-thread-14] util.FSTableDescriptors: Exception during readTableDecriptor. Current table name = SYSTEM.CATALOG
org.apache.hadoop.hbase.TableInfoMissingException: No table descriptor file under hdfs://acme-qa1.acmeinc.com:9000/tmp/hbase-hbase/hbase/data/default/SYSTEM.CATALOG
    at org.apache.hadoop.hbase.util.FSTableDescriptors.getTableDescriptorFromFs(FSTableDescriptors.java:510)
    at org.apache.hadoop.hbase.util.FSTableDescriptors.getTableDescriptorFromFs(FSTableDescriptors.java:487)
    at org.apache.hadoop.hbase.util.FSTableDescriptors.get(FSTableDescriptors.java:173)
    at org.apache.hadoop.hbase.master.HMaster.getTableDescriptors(HMaster.java:2853)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:43236)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2109)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
    at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
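
For what it's worth, one quick way to check whether the descriptor file is
actually gone is to list the table directory the exception points at. A
sketch (the namenode URI is the one from our stack trace; the exact
`.tabledesc`/`.tableinfo.*` layout may vary by HBase version, so treat the
expected contents as an assumption):

```
# List the table directory named in the TableInfoMissingException; the
# .tabledesc subdirectory should contain a .tableinfo.* file when the
# table descriptor is intact.
hdfs dfs -ls -R hdfs://acme-qa1.acmeinc.com:9000/tmp/hbase-hbase/hbase/data/default/SYSTEM.CATALOG

# See what, if anything, survives under the /tmp-based rootdir at all.
hdfs dfs -ls hdfs://acme-qa1.acmeinc.com:9000/tmp/hbase-hbase
```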




On Wed, Nov 4, 2015 at 2:21 PM, Michael Mior <mmior@uwaterloo.ca> wrote:

> Check the logs on the HBase server. Often this is a result of some
> error there where the table couldn't be created correctly.
>
> --
> Michael Mior
> michael.mior@gmail.com
>
> 2015-11-04 14:13 GMT-05:00 Dmitry Goldenberg <dgoldenberg123@gmail.com>:
> > Could someone provide any insight on why we might be seeing the below
> > exception with Phoenix?  We're seeing this while trying to connect
> > programmatically in Java or when trying to run sqlline.
> >
> > We're running HBase 0.98.15 for Hadoop 2.
> >
> > Anything to try here? Workarounds, fixes, or troubleshooting tips?
> >
> > Thanks.
> >
> > Error: SYSTEM.CATALOG (state=08000,code=101)
> >
> > org.apache.phoenix.exception.PhoenixIOException: SYSTEM.CATALOG
> >     at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1022)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1230)
> >     at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:111)
> >     at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1696)
> >     at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:594)
> >     at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
> >     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:296)
> >     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:288)
> >     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> >     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:287)
> >     at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1086)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1841)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl$11.call(ConnectionQueryServicesImpl.java:1810)
> >     at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1810)
> >     at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
> >     at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:126)
> >     at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
> >     at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)
> >     at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)
> >     at sqlline.Commands.connect(Commands.java:1064)
> >     at sqlline.Commands.connect(Commands.java:996)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:497)
> >     at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
> >     at sqlline.SqlLine.dispatch(SqlLine.java:804)
> >     at sqlline.SqlLine.initArgs(SqlLine.java:588)
> >     at sqlline.SqlLine.begin(SqlLine.java:656)
> >     at sqlline.SqlLine.start(SqlLine.java:398)
> >     at sqlline.SqlLine.main(SqlLine.java:292)
> > Caused by: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.CATALOG
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1279)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1150)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1107)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionLocation(HConnectionManager.java:948)
> >     at org.apache.hadoop.hbase.client.HTable.getRegionLocation(HTable.java:522)
> >     at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:723)
> >     at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:695)
> >     at org.apache.hadoop.hbase.client.HTable.getStartKeysInRange(HTable.java:1641)
> >     at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1596)
> >     at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1577)
> >     at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1007)
> >     ... 31 more
>
