phoenix-user mailing list archives

From: Biyuhao <byh0...@gmail.com>
Subject: Re: Phoenix-4.4 with CDH5.4.0
Date: Mon, 11 May 2015 06:46:35 GMT
Hi,

I think you can just install Phoenix from Cloudera Manager; it's very easy.
You can read the instructions here:
http://www.cloudera.com/content/cloudera/en/developers/home/cloudera-labs/apache-phoenix/install-apache-phoenix-cloudera-labs.pdf

http://blog.cloudera.com/blog/2015/05/apache-phoenix-joins-cloudera-labs/
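
If you stay with a manual install instead, a quick way to confirm the mismatch behind the NoSuchMethodError in the trace below is to reflect on the Scan class the region server actually loads. This is only a sketch (run it with the cluster's HBase client jar on the classpath): the descriptor (Z)Lorg/apache/hadoop/hbase/client/Scan; in the error means Phoenix 4.4 expects setRaw(boolean) to return Scan, so a different return type in the printout means the cluster's HBase jars are not binary-compatible with the Apache HBase 1.0 build Phoenix was compiled against.

import java.lang.reflect.Method;

// Sketch: report which setRaw signature the HBase jar on the classpath exposes.
// Reflection matches on name and parameter types only, so this finds setRaw
// even when its return type differs from the Scan return type Phoenix expects.
public class CheckScanSetRaw {
    public static void main(String[] args) throws Exception {
        Class<?> scan = Class.forName("org.apache.hadoop.hbase.client.Scan");
        Method m = scan.getMethod("setRaw", boolean.class);
        System.out.println("setRaw return type: " + m.getReturnType().getName());
    }
}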

2015-05-11 13:56 GMT+08:00 Anirudha Khanna <akhanna@marinsoftware.com>:

> I first tried the "server" jar but kept getting an exception like "No
> coprocessor $SomeCoprocessor available." So I checked which of the "server"
> jars actually contained the coprocessor classes and, based on that, picked
> the "runnable" jar.
>
> $ jar tf phoenix-server-4.4.0-HBase-1.0-runnable.jar | grep coprocessor | wc -l
> Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
>      490
> $ jar tf phoenix-server-client-4.4.0-HBase-1.0.jar | grep coprocessor | wc -l
> Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
>        0
> $ jar tf phoenix-server-4.4.0-HBase-1.0.jar | grep coprocessor | wc -l
> Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
>        0
>
> Is this just a case of wrong packaging or is this expected?
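>
> As a cross-check (only a sketch; the class name is one of the coprocessors
> from the stack traces below), you can also ask the JVM which jar it actually
> loads a coprocessor class from, which catches the case where a stale copy
> elsewhere on the classpath wins:
>
> // Sketch: print the jar a Phoenix coprocessor class is loaded from.
> // Run with the candidate jar on the classpath, e.g.
> //   java -cp .:phoenix-server-4.4.0-HBase-1.0-runnable.jar WhichJar
> public class WhichJar {
>     public static void main(String[] args) throws Exception {
>         Class<?> c = Class.forName(
>                 "org.apache.phoenix.coprocessor.MetaDataEndpointImpl");
>         System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
>     }
> }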
>
>
> -- Anirudha
>
> On Fri, May 8, 2015 at 8:50 PM, Nick Dimiduk <ndimiduk@gmail.com> wrote:
>
>> Phoenix-runnable is actually the uberjar for the query server. Instead,
>> you'll need the "server" jar. Sorry, these names are a bit confusing.
>>
>>
>> On Friday, May 8, 2015, Anirudha Khanna <akhanna@marinsoftware.com>
>> wrote:
>>
>>> Hi All,
>>>
>>> I am trying to deploy Phoenix-4.4.0 with HBase-1.0.0 from CDH5.4.0. From
>>> the Phoenix build, I copied phoenix-server-4.4.0-HBase-1.0-runnable.jar
>>> over to the HBase lib directory and was able to start the HBase cluster
>>> successfully.
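>>>
>>> (For reference, the programmatic equivalent of the sqlline.py connection
>>> below is a thick-driver Phoenix JDBC connect; opening the connection is
>>> what kicks off the SYSTEM.CATALOG creation that fails. A minimal sketch,
>>> with "zk1,zk2,zk3" standing in for the real ZooKeeper quorum and the
>>> Phoenix client jar on the classpath:)
>>>
>>> import java.sql.Connection;
>>> import java.sql.DriverManager;
>>>
>>> // Sketch: open a Phoenix JDBC connection against the ZooKeeper quorum.
>>> // The quorum below is a placeholder; the Phoenix client jar registers
>>> // its JDBC driver automatically when it is on the classpath.
>>> public class PhoenixConnectTest {
>>>     public static void main(String[] args) throws Exception {
>>>         try (Connection conn =
>>>                      DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3")) {
>>>             System.out.println("Connected: " + conn.getMetaData().getURL());
>>>         }
>>>     }
>>> }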
>>>
>>> But when I tried to connect to the cluster using sqlline.py
>>> $zookeeperQuorum, I got the following exception:
>>>
>>> Error: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>>     at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
>>>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
>>>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
>>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1740)
>>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1722)
>>>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31309)
>>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
>>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>     at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
>>>     ... 10 more (state=08000,code=101)
>>>
>>> Looking into the logs, the HBase master logs were clean (no exceptions),
>>> but the RegionServer had a similar exception:
>>>
>>> 2015-05-08 15:43:20,039 INFO  [PostOpenDeployTasks:f629ad7161c1b03ba7b0aa4459a85a60] hbase.MetaTableAccessor: Updated row SYSTEM.CATALOG,,1431079592560.f629ad7161c1b03ba7b0aa4459a85a60. with server=localhost,16201,1431079934872
>>> 2015-05-08 15:43:23,138 ERROR [B.defaultRpcServer.handler=4,queue=1,port=16201] coprocessor.MetaDataEndpointImpl: createTable failed
>>> java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
>>>     at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
>>>     at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
>>>     at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
>>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1740)
>>>     at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1722)
>>>     at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31309)
>>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
>>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
>>>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>     at java.lang.Thread.run(Thread.java:745)
>>>
>>>
>>> Any help in resolving this is greatly appreciated.
>>>
>>> Cheers,
>>> Anirudha
>>>
>>
>
