phoenix-user mailing list archives

From Amit Shah <amits...@gmail.com>
Subject Re: HBase Phoenix Integration
Date Tue, 01 Mar 2016 06:19:41 GMT
Hi Sergey,

I get a lot of compilation errors when I compile the source code from the
4.6-HBase-1.0 branch or the v4.7.0-HBase-1.0-rc3 tag. Note that the source
compiles successfully when the changes for the Cloudera-dependent versions are
not included. The only difference between the code changes suggested in the
StackOverflow post and mine is the Cloudera CDH version; I am using CDH 5.5.2.
I didn't quite follow the reason behind the code changes needed in Phoenix when
it is deployed on CDH.
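
For reference, this is roughly how I am building it; just a sketch, assuming
git and Maven 3, with the CDH-related pom edits from the StackOverflow post
applied before the build step:

  # sketch of my build steps (branch/tag names as above); the module layout
  # can differ between branches, hence the find at the end
  git clone https://github.com/apache/phoenix.git
  cd phoenix
  git checkout 4.6-HBase-1.0        # or: git checkout v4.7.0-HBase-1.0-rc3
  # apply the CDH dependency changes here, then build without tests
  mvn clean package -DskipTests
  # locate the built server jar that goes into HBase's lib directory
  find . -name "phoenix-*-server.jar"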

Thanks,
Amit.

On Tue, Mar 1, 2016 at 1:15 AM, Sergey Soldatov <sergeysoldatov@gmail.com>
wrote:

> Hi Amit,
>
> Switching to 4.3 means you need HBase 0.98. What kind of problem did you
> experience after building 4.6 from sources with the changes suggested on
> StackOverflow?
>
> Thanks,
> Sergey
>
> On Sun, Feb 28, 2016 at 10:49 PM, Amit Shah <amits.84@gmail.com> wrote:
> > An update -
> >
> > I was able to execute the "./sqlline.py <zookeeper-server-name>" command, but I
> > get the same exception as I mentioned earlier.
> >
> > Later I tried following the steps mentioned on this link with Phoenix 4.3.0,
> > but I still get an error, this time with a different stack trace (attached to
> > this mail).
> >
> > Any help would be appreciated
> >
> > On Sat, Feb 27, 2016 at 8:03 AM, Amit Shah <amits.84@gmail.com> wrote:
> >>
> >> Hi Murugesan,
> >>
> >> What preconditions would I need on the server to execute the Python
> >> script? I have Python 2.7.5 installed on the zookeeper server. If I just
> >> copy the sqlline script to the /etc/hbase/conf directory and execute it, I
> >> get the import error below. Note that this time I had the 4.5.2-HBase-1.0
> >> version of the Phoenix server and core jars in the HBase lib directory on
> >> the master and region servers.
> >>
> >> Traceback (most recent call last):
> >>   File "./sqlline.py", line 25, in <module>
> >>     import phoenix_utils
> >> ImportError: No module named phoenix_utils
> >>
> >> Pardon my limited knowledge of Python.
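> >>
> >> From the error it looks like sqlline.py imports a phoenix_utils module that
> >> ships next to it in the bin directory of the Phoenix binary tarball, so my
> >> next attempt is to run it from there instead of from a lone copied script; a
> >> rough sketch (assuming the 4.5.2-HBase-1.0 binary tarball):
> >>
> >> # untar the binary distribution on the zookeeper node and run sqlline.py
> >> # from its bin directory, where phoenix_utils.py also lives
> >> tar xzf phoenix-4.5.2-HBase-1.0-bin.tar.gz
> >> cd phoenix-4.5.2-HBase-1.0-bin/bin
> >> ./sqlline.py <zookeeper-server-name>:2181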
> >>
> >> Thanks,
> >> Amit
> >>
> >> On Fri, Feb 26, 2016 at 11:26 PM, Murugesan, Rani <ranmurug@visa.com>
> >> wrote:
> >>>
> >>> Did you test and confirm your phoenix shell from the zookeeper server?
> >>>
> >>> cd /etc/hbase/conf
> >>>
> >>> > phoenix-sqlline.py <zookeeperserver>:2181
> >>>
> >>> From: Amit Shah [mailto:amits.84@gmail.com]
> >>> Sent: Friday, February 26, 2016 4:45 AM
> >>> To: user@phoenix.apache.org
> >>> Subject: HBase Phoenix Integration
> >>>
> >>> Hello,
> >>>
> >>> I have been trying to install Phoenix on my Cloudera HBase cluster. The
> >>> Cloudera version is CDH 5.5.2, while the HBase version is 1.0.
> >>>
> >>> I copied the server and core jars (version 4.6-HBase-1.0) to the master and
> >>> region servers and restarted the HBase cluster. I copied the corresponding
> >>> client jar to my SQuirreL client, but I get an exception on connect, pasted
> >>> below. The connection URL is "jdbc:phoenix:<zookeeper-server-name>:2181".
> >>>
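> >>> Concretely, the copy steps were roughly as follows (a sketch only; the lib
> >>> path assumes a parcel-based CDH install, and the jar names are those from
> >>> the 4.6.0 binary distribution):
> >>>
> >>> # copy the Phoenix server and core jars into HBase's lib dir on the master
> >>> # and every region server, then restart HBase from Cloudera Manager
> >>> scp phoenix-4.6.0-HBase-1.0-server.jar phoenix-core-4.6.0-HBase-1.0.jar \
> >>>     <hbase-node>:/opt/cloudera/parcels/CDH/lib/hbase/lib/
> >>> # on the client machine, the client jar goes onto SQuirreL's classpath and
> >>> # the driver class registered there is org.apache.phoenix.jdbc.PhoenixDriver
> >>> cp phoenix-4.6.0-HBase-1.0-client.jar <squirrel-install-dir>/lib/
> >>>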
> >>> I even tried compiling the source by adding the Cloudera dependencies as
> >>> suggested in this post, but didn't succeed.
> >>>
> >>> Any suggestions to make this work?
> >>>
> >>> Thanks,
> >>> Amit.
> >>>
> >>> ________________________________________________________________
> >>>
> >>> Caused by:
> >>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
> >>> org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG:
> >>> org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
> >>>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:87)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1319)
> >>>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11715)
> >>>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7388)
> >>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1776)
> >>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1758)
> >>>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> >>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
> >>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
> >>>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> >>>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> >>>     at java.lang.Thread.run(Thread.java:745)
> >>> Caused by: java.lang.NoSuchMethodError:
> >>> org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:1016)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1092)
> >>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1266)
> >>>     ... 10 more
> >>>
> >>> P.S. - The full stack trace is attached to the mail.
> >>
> >>
> >
>
