phoenix-user mailing list archives

From "Fulin Sun" <su...@certusnet.com.cn>
Subject Re: Re: HBase Phoenix Integration
Date Tue, 01 Mar 2016 10:08:49 GMT
Hi, Amit
I successfully built the git repo with your file change, but when I try to use
sqlline to connect to Phoenix, I run into the following error:

Here dev-1, dev-2 and dev-3 are my ZooKeeper hosts. I can still see the region servers being killed
abnormally.

Did you hit this issue in your scenario? If so, please suggest how to resolve
it.

Thanks.

 Connecting to jdbc:phoenix:dev-1,dev-2,dev-3
16/03/01 18:04:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
Error: Failed after attempts=36, exceptions:
Tue Mar 01 18:05:30 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=69350:
row 'SYSTEM.SEQUENCE,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740,
hostname=dev-2,60020,1456826584858, seqNum=0 (state=08000,code=101)
org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36, exceptions:
Tue Mar 01 18:05:30 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=69350:
row 'SYSTEM.SEQUENCE,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740,
hostname=dev-2,60020,1456826584858, seqNum=0
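
A timeout like this usually means the client cannot reach the region server hosting hbase:meta (dev-2 above), which matches the region servers dying. A rough first check, assuming a typical CDH log layout (the log path below is a guess), would be:

# shows whether the master still sees all region servers
echo "status 'detailed'" | hbase shell
# on dev-2: look for the fatal error that killed its region server (log path is an assumption)
tail -n 200 /var/log/hbase/hbase-*-regionserver-dev-2.log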




From: Amit Shah
Date: 2016-03-01 18:00
To: user
Subject: Re: Re: HBase Phoenix Integration
Sure Sun. 
PFA.

Regards,
Amit.

On Tue, Mar 1, 2016 at 2:58 PM, Fulin Sun <sunfl@certusnet.com.cn> wrote:
Hi Amit,
Glad you found a temporary fix for that. Can you share the relevant Java file you modified?
Thanks a lot. 

Best,
Sun.






From: Amit Shah
Date: 2016-03-01 17:22
To: user
Subject: Re: RE: HBase Phoenix Integration
Hi All,

I had some success deploying Phoenix 4.6-HBase-1.0 on CDH 5.5.2. I resolved the compilation
errors by commenting out the usages of the BinaryCompatibleIndexKeyValueDecoder and BinaryCompatibleCompressedIndexKeyValueDecoder
classes, since they are only used in secondary indexing. This is a temporary fix, but it
works!
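
For reference, a rough sketch of locating those usages and rebuilding; the module path is an assumption, and the files to edit are whatever the grep turns up:

# find where the two decoder classes are referenced (module path assumed)
grep -rln "BinaryCompatible.*IndexKeyValueDecoder" phoenix-core/src/main/java
# rebuild without running the tests
mvn clean package -DskipTests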

Hope that helps.
I'm waiting to see the phoenix-for-cloudera repo updated for the latest Phoenix version, 4.7, especially
since 4.7 has some new features.

Thanks,
Amit.

On Tue, Mar 1, 2016 at 2:13 PM, Fulin Sun <sunfl@certusnet.com.cn> wrote:
No idea. I had only found that relatively recent post about supporting Phoenix on CDH
5.5.x.
However, I cannot connect to Phoenix by following the post's guide, and compiling the git
repo has also given me no luck.







From: Dor Ben Dov
Date: 2016-03-01 16:39
To: user@phoenix.apache.org
Subject: RE: Re: HBase Phoenix Integration
Sun,
That is an old post. Do you know whether there is any news about Cloudera adopting the new
Apache Phoenix 4.7?
 
Dor
 
From: Fulin Sun [mailto:sunfl@certusnet.com.cn] 
Sent: Tuesday, 01 March 2016 09:14
To: user; James Taylor
Subject: Re: Re: HBase Phoenix Integration
 
Hi Amit
Yeah, I meet the same error message when doing mvn package. I compared the Apache Phoenix
4.6.0-HBase-1.0
code with this repo's code for CDH 5.5.1, and I did not find the abstract class BinaryCompatiblePhoenixBaseDecoder
in the former.
Hope someone can explain this and provide a workaround.
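
(A quick way to double-check that comparison, assuming both source trees are checked out side by side; the directory names below are placeholders:)

# expected to match only in the CDH repo if the class really is absent upstream
grep -rl "BinaryCompatiblePhoenixBaseDecoder" apache-phoenix-4.6.0-HBase-1.0/ phoenix-for-cloudera/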
 
If there is no way to resolve this, I will keep using the Cloudera Labs Phoenix version from
here:
https://blog.cloudera.com/blog/2015/11/new-apache-phoenix-4-5-2-package-from-cloudera-labs/



Thanks,
Sun.
 




 
From: Amit Shah
Date: 2016-03-01 15:37
To: user; jamestaylor
Subject: Re: HBase Phoenix Integration
Hi James,
I get a compilation error along with multiple warnings when packaging the 4.6-HBase-1.0-cdh5.5
branch. The error is attached.
 
Also, I realized that the pom.xml indicates the branch targets Cloudera CDH 5.5.1. Do
you know if it would work with the latest CDH version, 5.5.2?
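
For what it's worth, a rough sketch of trying 5.5.2 by bumping the CDH version strings in the poms and rebuilding; this assumes the release is pinned as plain "cdh5.5.1" version strings, which is a guess:

# see where the branch pins the CDH release
grep -rn --include=pom.xml "cdh5.5.1" .
# bump it everywhere and rebuild (back up the poms first)
find . -name pom.xml -exec sed -i 's/cdh5\.5\.1/cdh5.5.2/g' {} +
mvn clean package -DskipTests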
 
Thanks,
Amit.
 
On Tue, Mar 1, 2016 at 12:05 PM, Dor Ben Dov <dor.ben-dov@amdocs.com> wrote:
James,
Do you have any problems working with the latest Phoenix on CDH 5.5.x?
 
Dor
 
From: James Taylor [mailto:jamestaylor@apache.org] 
Sent: Tuesday, 01 March 2016 08:24
To: user
Cc: Murugesan, Rani
Subject: Re: HBase Phoenix Integration
 
Hi Amit,
 
For Phoenix 4.6 on CDH, try using this git repo instead, courtesy of Andrew Purtell: https://github.com/chiastic-security/phoenix-for-cloudera/tree/4.6-HBase-1.0-cdh5.5
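
For reference, checking out and building that branch would look roughly like this (plain git and Maven commands; -DskipTests only speeds up the build):

git clone https://github.com/chiastic-security/phoenix-for-cloudera.git
cd phoenix-for-cloudera
git checkout 4.6-HBase-1.0-cdh5.5
mvn clean package -DskipTests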
 
Thanks,
James
 
 
 
On Mon, Feb 29, 2016 at 10:19 PM, Amit Shah <amits.84@gmail.com> wrote:
Hi Sergey,
 
I get a lot of compilation errors when I compile the source code for the 4.6-HBase-1.0 branch or
the v4.7.0-HBase-1.0-rc3 tag. Note that the source compiles successfully when the changes to include
the Cloudera-dependent versions are not applied. The only difference between the code changes
suggested in the Stack Overflow post and mine is the Cloudera CDH version; I am using CDH 5.5.2.
I didn't quite follow the reason behind the code changes needed in Phoenix when it is deployed on
CDH.
 
Thanks,
Amit.
 
On Tue, Mar 1, 2016 at 1:15 AM, Sergey Soldatov <sergeysoldatov@gmail.com> wrote:
Hi Amit,

Switching to 4.3 means you need HBase 0.98. What kind of problem did you
experience after building 4.6 from source with the changes suggested on
Stack Overflow?

Thanks,
Sergey

On Sun, Feb 28, 2016 at 10:49 PM, Amit Shah <amits.84@gmail.com> wrote:
> An update -
>
> I was able to execute "./sqlline.py <zookeeper-server-name>" command but I
> get the same exception as I mentioned earlier.
>
> Later I tried following the steps mentioned in this link with Phoenix 4.3.0,
> but I still get an error, this time with a different stack trace (attached to
> this mail).
>
> Any help would be appreciated
>
> On Sat, Feb 27, 2016 at 8:03 AM, Amit Shah <amits.84@gmail.com> wrote:
>>
>> Hi Murugesan,
>>
>> What preconditions would I need on the server to execute the Python
>> script? I have Python 2.7.5 installed on the ZooKeeper server. If I just
>> copy the sqlline script to the /etc/hbase/conf directory and execute it, I
>> get the import error below. Note that this time I had the 4.5.2-HBase-1.0 version of the
>> Phoenix server and core jars in the HBase lib directory on the master and region
>> servers.
>>
>> Traceback (most recent call last):
>>   File "./sqlline.py", line 25, in <module>
>>     import phoenix_utils
>> ImportError: No module named phoenix_utils
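>>
>> (sqlline.py imports phoenix_utils.py, which sits next to it in the bin/ directory of the
>> Phoenix binary distribution, so running the script from there, or copying phoenix_utils.py
>> along with it, should avoid this import error. A sketch with an assumed install path:)
>>
>> cd /path/to/phoenix-4.5.2-HBase-1.0-bin/bin   # assumed location of the unpacked tarball
>> ./sqlline.py <zookeeper-server-name>:2181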
>>
>> Pardon my limited knowledge of Python.
>>
>> Thanks,
>> Amit
>>
>> On Fri, Feb 26, 2016 at 11:26 PM, Murugesan, Rani <ranmurug@visa.com>
>> wrote:
>>>
>>> Did you test and confirm your phoenix shell from the zookeeper server?
>>>
>>> cd /etc/hbase/conf
>>>
>>> > phoenix-sqlline.py <zookeeperserver>:2181
>>>
>>>
>>>
>>>
>>>
>>> From: Amit Shah [mailto:amits.84@gmail.com]
>>> Sent: Friday, February 26, 2016 4:45 AM
>>> To: user@phoenix.apache.org
>>> Subject: HBase Phoenix Integration
>>>
>>>
>>>
>>> Hello,
>>>
>>>
>>>
>>> I have been trying to install Phoenix on my Cloudera HBase cluster.
>>> The Cloudera version is CDH 5.5.2 and the HBase version is 1.0.
>>>
>>>
>>>
>>> I copied the server and core jars (version 4.6-HBase-1.0) to the master and
>>> region servers and restarted the HBase cluster. I copied the corresponding
>>> client jar to my SQuirreL client, but I get an exception on connect, pasted
>>> below. The connection URL is “jdbc:phoenix:<zookeeper-server-name>:2181".
>>>
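>>> (For reference, that deployment step sketched with assumed CDH parcel paths; the main
>>> point is that the Phoenix server jar has to be in every region server's HBase lib
>>> directory before the restart:)
>>>
>>> # paths and jar name are assumptions; repeat on the master and every region server
>>> cp phoenix-4.6.0-HBase-1.0-server.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
>>> # then restart HBase, e.g. from Cloudera Manager
>>>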
>>> I even tried compiling the source by adding the Cloudera dependencies as
>>> suggested in this post, but didn't succeed.
>>>
>>>
>>>
>>> Any suggestions to make this work?
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Amit.
>>>
>>>
>>>
>>> ________________________________________________________________
>>>
>>>
>>>
>>> Caused by:
>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
>>> org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG:
>>> org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>>
>>>             at
>>> org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:87)
>>>
>>>             at
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1319)
>>>
>>>             at
>>> org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11715)
>>>
>>>             at
>>> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7388)
>>>
>>>             at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1776)
>>>
>>>             at
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1758)
>>>
>>>             at
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
>>>
>>>             at
>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
>>>
>>>             at
>>> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>>
>>>             at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>
>>>             at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>
>>>             at java.lang.Thread.run(Thread.java:745)
>>>
>>> Caused by: java.lang.NoSuchMethodError:
>>> org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>>
>>>             at
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:1016)
>>>
>>>             at
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1092)
>>>
>>>             at
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1266)
>>>
>>>             ... 10 more
>>>
>>>
>>>
>>> P.S. - The full stack trace is attached to the mail.
>>
>>
>
 
 
This message and the information contained herein is proprietary and confidential and subject
to the Amdocs policy statement, you may review at http://www.amdocs.com/email_disclaimer.asp

 

