phoenix-user mailing list archives

From James Taylor <jamestay...@apache.org>
Subject Re: Kerberos Secure cluster and phoenix
Date Mon, 01 Sep 2014 22:09:39 GMT
Please try with the 4.1 jars in our binary distribution here:

http://phoenix.apache.org/download.html

Make sure to use the jars for the client and server in the hadoop2 directory.

Then follow the directions that Alex posted here:

http://bigdatanoob.blogspot.com/2013/09/connect-phoenix-to-secure-hbase-cluster.html

http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH5/latest/CDH5-Security-Guide/CDH5-Security-Guide.html

It sounds to me like there's a mismatch between your client and server jars.

Thanks,
James

On Mon, Sep 1, 2014 at 2:43 PM,  <Deepak_Gattala@dell.com> wrote:
> I am getting the following error and would really appreciate any comments.
>
> Error: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException (state=08000,code=101)
> org.apache.phoenix.exception.PhoenixIOException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException
>         at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:101)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:846)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1057)
>         at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>         at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>         at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1452)
>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>         at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>         at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>         at sqlline.SqlLine.dispatch(SqlLine.java:817)
>         at sqlline.SqlLine.initArgs(SqlLine.java:633)
>         at sqlline.SqlLine.begin(SqlLine.java:680)
>         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>         at sqlline.SqlLine.main(SqlLine.java:424)
> Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStub(HConnectionManager.java:1650)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(HConnectionManager.java:1676)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveMasterService(HConnectionManager.java:1884)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHTableDescriptor(HConnectionManager.java:2671)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:397)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:402)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:772)
>         ... 23 more
> Caused by: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException
>         at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1674)
>         at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1715)
>         at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:42561)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(HConnectionManager.java:1687)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(HConnectionManager.java:1596)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStub(HConnectionManager.java:1622)
>         ... 29 more
> Caused by: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException
>         at org.apache.hadoop.hbase.ipc.RpcClient.wrapException(RpcClient.java:1485)
>         at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1457)
>         at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1657)
>         ... 34 more
> Caused by: java.io.EOFException
>         at java.io.DataInputStream.readInt(DataInputStream.java:392)
>         at org.apache.hadoop.hbase.ipc.RpcClient$Connection.readResponse(RpcClient.java:1072)
>         at org.apache.hadoop.hbase.ipc.RpcClient$Connection.run(RpcClient.java:728)
> sqlline version 1.1.2
>
> -----Original Message-----
> From: James Taylor [mailto:jamestaylor@apache.org]
> Sent: Monday, September 1, 2014 4:35 PM
> To: user
> Subject: Re: Kerberos Secure cluster and phoenix
>
> In addition to the above, in our 3.1/4.1 release, you can pass through the principal and keytab file on the connection URL to connect to different secure clusters, like this:
>
> DriverManager.getConnection("jdbc:phoenix:h1,h2,h3:2181:user/principal:/user.keytab");
>
> The full URL is now of the form
> jdbc:phoenix:<quorum>:<port>:<rootNode>:<principal>:<keytabFile>
>
> where <port> and <rootNode> may be absent: <port> is taken to be present if it is a number, and <rootNode> if it begins with a '/'.
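To make the URL grammar above concrete, here is a hypothetical sketch (not Phoenix's actual parser; the class and method names are invented for illustration) of how such a URL could be assembled from its optional segments:

```java
import java.util.ArrayList;
import java.util.List;

public class PhoenixUrl {
    // Assemble a Phoenix JDBC URL; any argument may be null except quorum.
    // rootNode, when present, must begin with '/' so the driver can tell
    // it apart from a numeric port -- per the rule described above.
    public static String build(String quorum, Integer port, String rootNode,
                               String principal, String keytabFile) {
        List<String> parts = new ArrayList<>();
        parts.add("jdbc:phoenix:" + quorum);
        if (port != null) parts.add(String.valueOf(port));
        if (rootNode != null) parts.add(rootNode);
        if (principal != null) parts.add(principal);
        if (keytabFile != null) parts.add(keytabFile);
        return String.join(":", parts);
    }

    public static void main(String[] args) {
        // Reproduces the example URL from this thread:
        System.out.println(build("h1,h2,h3", 2181, null,
                                 "user/principal", "/user.keytab"));
        // jdbc:phoenix:h1,h2,h3:2181:user/principal:/user.keytab
    }
}
```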
>
> One other useful feature from this work, not related to connecting to a secure cluster: you may specify only the <principal>, which causes a different HConnection to be used (one per unique principal per cluster). In this way, you can pass through different HBase properties that apply to the HConnection (such as timeout parameters).
>
> For example:
>
> DriverManager.getConnection("jdbc:phoenix:h1:longRunning", props);
>
> where props would contain the HBase config parameters and timeout values for a "longRunning" connection, which could be completely different from a connection obtained through this URL:
>
> DriverManager.getConnection("jdbc:phoenix:h1:shortRunning", props);
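The two-connection idea above can be sketched as follows. The property names (hbase.rpc.timeout, phoenix.query.timeoutMs) are commonly used HBase/Phoenix settings but should be checked against your version; the class itself is invented for illustration:

```java
import java.util.Properties;

public class ConnectionProps {
    // Build a Properties bag carrying the given RPC/query timeout (in ms).
    public static Properties withTimeouts(String ms) {
        Properties props = new Properties();
        props.setProperty("hbase.rpc.timeout", ms);
        props.setProperty("phoenix.query.timeoutMs", ms);
        return props;
    }

    public static void main(String[] args) {
        Properties longRunning = withTimeouts("600000"); // 10 minutes
        Properties shortRunning = withTimeouts("10000"); // 10 seconds
        // Each URL below would get its own HConnection because the
        // principal segment ("longRunning" vs "shortRunning") differs:
        // DriverManager.getConnection("jdbc:phoenix:h1:longRunning", longRunning);
        // DriverManager.getConnection("jdbc:phoenix:h1:shortRunning", shortRunning);
    }
}
```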
>
> Thanks,
> James
>
> On Mon, Sep 1, 2014 at 2:13 PM, Alex Kamil <alex.kamil@gmail.com> wrote:
>> see
>> http://bigdatanoob.blogspot.com/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>
>> and
>> http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH5/latest/CDH5-Security-Guide/CDH5-Security-Guide.html
>>
>>
>> On Mon, Sep 1, 2014 at 5:02 PM, <Deepak_Gattala@dell.com> wrote:
>>>
>>> Hi all,
>>>
>>>
>>>
>>> Has anyone had success connecting Phoenix to a secure HBase
>>> Hadoop cluster? If so, could you please let me know the steps
>>> you took. I am on the most recent version of Phoenix, using
>>> Cloudera CDH 5.1 with HBase 0.98.
>>>
>>>
>>>
>>> Appreciate your help.
>>>
>>>
>>>
>>> Thanks
>>>
>>> Deepak Gattala
>>
>>
