phoenix-user mailing list archives

From Chetan Khatri <chetan.opensou...@gmail.com>
Subject Re: Error at starting Phoenix shell with HBase
Date Wed, 18 Jan 2017 07:16:09 GMT
Hello Guys,

I am still stuck getting Phoenix to start on my pseudo-distributed Hadoop + Hive + Spark
cluster. Can anybody help me, please?

Thanks.

On Tue, Jan 17, 2017 at 2:03 AM, Chetan Khatri <chetan.opensource@gmail.com>
wrote:

> Hello Josh,
>
> Thank you for the reply. As you suggested:
>
> 1) phoenix-4.8.2-HBase-1.2-server.jar is in HBase/lib
> *Checked by:* bin/hbase classpath | grep 'phoenix'
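>
> (A slightly fuller version of that check, assuming $HBASE_HOME is the HBase
> install directory; the variable name is a placeholder, not a path from this
> thread:)
>
>     # the server jar should sit in HBase's lib directory
>     ls -l $HBASE_HOME/lib/phoenix-*-server.jar
>     # and show up as its own entry on the classpath
>     bin/hbase classpath | tr ':' '\n' | grep 'phoenix'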
>
>
> *2) Errors*
>
>
>
>
> *3) Nothing can be found in the HBase RegionServer logs*
>
>
>
> How can the above problem be resolved?
>
> Thanks.
>
>
> On Mon, Jan 16, 2017 at 10:22 PM, Josh Elser <elserj@apache.org> wrote:
>
>> Did you check the RegionServer logs, as I asked in my last message?
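>>
>> (A rough way to do that on a single-node install, assuming the default log
>> directory under $HBASE_HOME/logs; the variable and the file pattern are
>> assumptions:)
>>
>>     # recent errors or aborts across the HBase daemon logs
>>     grep -iE 'error|abort|exception' $HBASE_HOME/logs/hbase-*.log | tail -n 50
>>     # was the Phoenix system table ever assigned?
>>     grep -i 'SYSTEM.CATALOG' $HBASE_HOME/logs/hbase-*.log | tail -n 20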
>>
>> Chetan Khatri wrote:
>>
>>> Any updates on the above error, guys?
>>>
>>>
>>> On Fri, Jan 13, 2017 at 9:35 PM, Josh Elser <elserj@apache.org
>>> <mailto:elserj@apache.org>> wrote:
>>>
>>>     (-cc dev@phoenix)
>>>
>>>     phoenix-4.8.2-HBase-1.2-server.jar, found at the top level of the
>>>     Apache Phoenix 4.8.2 binary tarball, is the jar that is meant to be
>>>     deployed to the classpath of every HBase node.
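>>>
>>>     (A minimal sketch of that deployment step, assuming $PHOENIX_HOME points
>>>     at the extracted binary tarball and $HBASE_HOME at the HBase install;
>>>     both variable names are placeholders, not paths from this thread:)
>>>
>>>         # copy the server jar into HBase's lib directory on every node
>>>         cp $PHOENIX_HOME/phoenix-4.8.2-HBase-1.2-server.jar $HBASE_HOME/lib/
>>>         # restart HBase so the Phoenix coprocessors are actually loaded
>>>         $HBASE_HOME/bin/stop-hbase.sh && $HBASE_HOME/bin/start-hbase.sh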
>>>
>>>     I would check the RegionServer logs -- I'm guessing that it never
>>>     started correctly or failed. The error message says that certain
>>>     regions in the system were never assigned to a RegionServer, which
>>>     only happens in exceptional cases.
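>>>
>>>     (One way to confirm that from the HBase side, using the stock HBase
>>>     shell; the commands below are a sketch, not something from this thread:)
>>>
>>>         # which server, if any, is listed for each region in hbase:meta
>>>         echo "scan 'hbase:meta', {COLUMNS => 'info:server'}" | bin/hbase shell
>>>         # is the Phoenix system table present and enabled?
>>>         echo "is_enabled 'SYSTEM.CATALOG'" | bin/hbase shell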
>>>
>>>     Chetan Khatri wrote:
>>>
>>>         Hello Community,
>>>
>>>         I have installed and configured Apache Phoenix on Single Node
>>>         Ubuntu 16.04
>>>         machine:
>>>         - Hadoop 2.7
>>>         - HBase 1.2.4
>>>         - Phoenix 4.8.2-HBase-1.2
>>>
>>>         Copied phoenix-core-4.8.2-HBase-1.2.jar to hbase/lib and confirmed
>>>         it with bin/hbase classpath | grep 'phoenix'. I am using the embedded
>>>         ZooKeeper, so my hbase-site.xml looks like this:
>>>
>>>         <configuration>
>>>           <property>
>>>             <name>hbase.rootdir</name>
>>>             <value>file:///home/hduser/hbase</value>
>>>           </property>
>>>         </configuration>
>>>
>>>         I am able to read / write to HBase from shell and Apache Spark.
>>>
>>>         *Errors while accessing with sqlline:*
>>>
>>>
>>>         1) bin/sqlline.py localhost:2181
>>>
>>>         Errors:
>>>
>>>         1. The command made the process hang.
>>>         2. The following exception was thrown:
>>>         Error: ERROR 1102 (XCL02): Cannot get all table regions. (state=XCL02,code=1102)
>>>         java.sql.SQLException: ERROR 1102 (XCL02): Cannot get all table regions.
>>>         at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:455)
>>>         at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:546)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1162)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1068)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1388)
>>>         at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2298)
>>>         at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:940)
>>>         at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:193)
>>>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:344)
>>>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:332)
>>>         at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:331)
>>>         at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1423)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2352)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2291)
>>>         at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2291)
>>>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
>>>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
>>>         at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>>>         at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)
>>>         at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)
>>>         at sqlline.Commands.connect(Commands.java:1064)
>>>         at sqlline.Commands.connect(Commands.java:996)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>>         at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
>>>         at sqlline.SqlLine.dispatch(SqlLine.java:803)
>>>         at sqlline.SqlLine.initArgs(SqlLine.java:588)
>>>         at sqlline.SqlLine.begin(SqlLine.java:656)
>>>         at sqlline.SqlLine.start(SqlLine.java:398)
>>>         at sqlline.SqlLine.main(SqlLine.java:292)
>>>         Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: No server address listed in hbase:meta for region SYSTEM.CATALOG,,1484293041241.0b74311f417f83abe84ae1be4e823de8. containing row
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1318)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1181)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1152)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1136)
>>>         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:957)
>>>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:531)
>>>         ... 32 more
>>>         sqlline version 1.1.9
>>>
>>>         Kindly let me know how to fix this error.
>>>
>>>         Thanks,
>>>
>>>
>>>
>
