phoenix-user mailing list archives

From <Deepak_Gatt...@Dell.com>
Subject RE: Kerberos Secure cluster and phoenix
Date Tue, 02 Sep 2014 04:30:43 GMT
Someone else with the same problem exists:

http://mail-archives.apache.org/mod_mbox/phoenix-user/201406.mbox/%3CCAPjB-CCjcO+u1KkeSS+6Pr9Jp_ogn5KNopwjmGuV1Ns-tG=XyQ@mail.gmail.com%3E

He is doing the same thing I am doing, but with no success.

Thanks
Deepak Gattala


From: Gattala, Deepak
Sent: Monday, September 1, 2014 11:10 PM
To: user@phoenix.apache.org
Subject: RE: Kerberos Secure cluster and phoenix

hbase(main):003:0> user_permission
User                                                   Table,Family,Qualifier:Permission
 hive                                                  hbase:acl,,: [Permission: actions=READ,WRITE,CREATE,ADMIN]
 deepak_gattala                                        hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
 Deepak_Gattala                                        hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
3 row(s) in 0.4530 seconds

hbase(main):006:0> user_permission 'SYSTEM.CATALOG'
User                                                   Table,Family,Qualifier:Permission
 deepak_gattala                                        SYSTEM.CATALOG,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
 Deepak_Gattala                                        SYSTEM.CATALOG,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
2 row(s) in 0.1640 seconds

hbase(main):007:0> user_permission 'SYSTEM.SEQUENCE'
User                                                   Table,Family,Qualifier:Permission
 deepak_gattala                                        SYSTEM.SEQUENCE,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
 Deepak_Gattala                                        SYSTEM.SEQUENCE,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
2 row(s) in 0.1470 seconds


From: Anil Gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 11:03 PM
To: user@phoenix.apache.org

Subject: Re: Kerberos Secure cluster and phoenix

Hi Deepak,

Can you paste the output of a scan of the _acl_ table?
HBase daemons are started by the unix hbase user, but you can add other users to the _acl_ table in HBase and use them to perform CRUD.
In CDH the 'hbase' user is a nologin user; that's why you are unable to su. You can modify the hbase user.
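For reference, a grant for such a user from the HBase shell could look like this (an illustrative transcript, not from the thread; 'RWXCA' stands for READ, WRITE, EXEC, CREATE, ADMIN, and the command must be run by a user that already holds ADMIN):

```text
hbase(main):001:0> scan 'hbase:acl'
hbase(main):002:0> grant 'some_user', 'RWXCA'
```

After the grant, user_permission should show the new action list for that user.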

Sent from my iPhone

On Sep 1, 2014, at 7:35 PM, Deepak_Gattala@Dell.com wrote:
I added the missing part, restarted the cluster, and reran sqlline; I got the same exact authentication error.

[root@ausgtmhadoop10 ~]su hbase
This account is currently not available.

I cannot su as hbase; I do not have that local user anymore after I enabled Kerberos.

Thanks
Deepak Gattala

From: Alex Kamil [mailto:alex.kamil@gmail.com]
Sent: Monday, September 1, 2014 9:08 PM
To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

also "hbase/" is missing in the principal name

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=false
  useTicketCache=true;
  keyTab="/tmp/hbase.keytab"
  principal="ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM";

};

On Mon, Sep 1, 2014 at 9:56 PM, Deepak_Gattala@dell.com wrote:
I am using 4.1, just downloaded after James gave me the link.

I am so sorry if I missed something on the classpath. Can you please let me know where and what I missed, so I can correct it?

'java -cp ".:/usr/lib/hadoop/client/*:/etc/zookeeper/conf:/etc/hbase/conf/:/etc/hadoop/conf/:/etc/hbase/conf/:/usr/lib/hbase/*:/usr/lib/hadoop/lib/*:/usr/lib/zookeeper/lib' + os.pathsep + phoenix_utils.phoenix_client_jar +

Please let me know

Thanks
Deepak Gattala

From: Anil Gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 8:53 PM

To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

Could you please tell us your Phoenix version?
Also, it seems like you didn't modify the classpath as I suggested. Could you please try that?

Sent from my iPhone

On Sep 1, 2014, at 6:43 PM, Deepak_Gattala@Dell.com wrote:
I changed things as per what you gave me.

[deepak_gattala@ausgtmhadoop10 ~]ls -ltra /tmp/hbase.keytab
-r-------- 1 deepak_gattala root 105 Sep  1 19:46 /tmp/hbase.keytab

[deepak_gattala@ausgtmhadoop10 ~]klist -kt /tmp/hbase.keytab
Keytab name: FILE:/tmp/hbase.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   1 08/04/14 08:46:24 hbase/ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM


This is my sqlline java command look like

java_cmd = 'java -cp ".:/usr/lib/hadoop/client/*:/etc/zookeeper/conf:/etc/hbase/conf/:/etc/hadoop/conf/:/etc/hbase/conf/:/usr/lib/hbase/*:/usr/lib/hadoop/lib/*:/usr/lib/zookeeper/lib' + os.pathsep + phoenix_utils.phoenix_client_jar + \
    '" -Dlog4j.configuration=file:' + \
    os.path.join(phoenix_utils.current_dir, "log4j.properties") + \
    ' -Djava.security.auth.login.config=file:/home/deepak_gattala/phoenix-4.1.0-bin/hadoop2/bin/jaas.conf' + \
    ' -Djavax.net.debug=ssl ' + \
    ' -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true ' + \
    ' -Djava.library.path=/usr/lib/hadoop/lib/native/:/usr/lib/hadoop/lib/:/usr/lib/hbase/lib/:/usr/lib/zookeeper/lib' + \
    " sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:" + sys.argv[1] + \
    " -n none -p none --color=" + colorSetting + " --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED " + sqlfile

[deepak_gattala@ausgtmhadoop10 ~]pwd
/home/deepak_gattala/phoenix-4.1.0-bin/hadoop2/bin
[deepak_gattala@ausgtmhadoop10 ~]cat jaas.conf

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=false
  useTicketCache=true;
  keyTab="/tmp/hbase.keytab"
  principal="ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM";

};


I got the SAME EXACT ERROR about authentication.

This is what I see in the hbase master logs.
2014-09-01 20:38:27,463 WARN org.apache.hadoop.ipc.RpcServer: RpcServer.listener,port=60000: count of bytes read: 0
org.apache.hadoop.security.AccessControlException: Authentication is required
        at org.apache.hadoop.hbase.ipc.RpcServer$Connection.readAndProcess(RpcServer.java:1448)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener.doRead(RpcServer.java:790)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.doRunLoop(RpcServer.java:581)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.run(RpcServer.java:556)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)


My hbase runs as user hbase, but I log in as deepak_gattala. Also, what CDH document are you referring to? Anything from Cloudera about how phoenix works with Kerberos?

Thanks
Deepak Gattala



From: Alex Kamil [mailto:alex.kamil@gmail.com]
Sent: Monday, September 1, 2014 8:23 PM
To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

- is your principal actually hbase/ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM? You can list the contents of the /tmp/hbase.keytab file with klist -kt to check: How to Display the Keylist (Principals) in a Keytab File, http://docs.oracle.com/cd/E19683-01/806-4078/6jd6cjs1q/index.html

- limit keytab permissions with: chmod 400 /tmp/hbase.keytab

- it worked for me when the phoenix client used zk-jaas.conf from the hbase/conf directory, exactly per the cdh doc, and logged in with the same user that runs hbase.
zk-jaas.conf:
Client {
      com.sun.security.auth.module.Krb5LoginModule required
      useKeyTab=true
      useTicketCache=false
      keyTab="/etc/hbase/conf/hbase.keytab"
      principal="hbase/fully.qualified.domain.name@<YOUR-REALM>";
   };




On Mon, Sep 1, 2014 at 8:56 PM, Deepak_Gattala@dell.com wrote:
Hi Alex/Anil,

Please kindly see what I am missing here. I know I am so close but overlooking something; please kindly advise.

./sqlline.py ausgtmhadoop10.us-poclab.dellpoc.com:2181:/hbase:hbase/ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM:/tmp/hbase.keytab

[deepak_gattala@ausgtmhadoop10 ~]klist
Ticket cache: FILE:/tmp/krb5cc_134810
Default principal: hbase/ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM

Valid starting     Expires            Service principal
09/01/14 19:47:50  09/02/14 05:47:50  krbtgt/US-POCLAB.DELLPOC.COM@US-POCLAB.DELLPOC.COM

And it failed with the same exact error:
14/09/01 19:50:56 ERROR client.HConnectionManager$HConnectionImplementation: Can't get connection to ZooKeeper: KeeperErrorCode = AuthFailed for /hbase.

I looked at the hbase logs and it complains about the same thing.

2014-09-01 19:54:33,341 WARN org.apache.hadoop.ipc.RpcServer: RpcServer.listener,port=60000: count of bytes read: 0
org.apache.hadoop.security.AccessControlException: Authentication is required
        at org.apache.hadoop.hbase.ipc.RpcServer$Connection.readAndProcess(RpcServer.java:1448)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener.doRead(RpcServer.java:790)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.doRunLoop(RpcServer.java:581)
        at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.run(RpcServer.java:556)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
(The same warning and stack trace repeat at 19:54:53 and 19:55:13.)

Thanks
Deepak Gattala

From: Alex Kamil [mailto:alex.kamil@gmail.com]
Sent: Monday, September 1, 2014 7:18 PM

To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

It looks like phoenix is trying to create the SYSTEM.CATALOG table (ensureTableCreated) but is not able to get through kerberos (the HConnectionManager error). I've seen this when supplying incorrect kerberos credentials or not using a keytab at all. You would see the exact reason if you enable kerberos debug mode.

On Mon, Sep 1, 2014 at 8:11 PM, Deepak_Gattala@dell.com wrote:
Hi Anil,

I did try that, and that was actually the error I forwarded to you.

[deepak_gattala@ausgtmhadoop10 ~]cat /etc/hbase/conf.cloudera.hbase1/jaas.conf

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=false
  useTicketCache=true;
};

Thanks for letting me know that it's not possible to do what I was trying with phoenix. I think currently I am only left with the option of sending the principal and keytab file.


I just want to give one last try with what I was doing, if you can help me understand what the error below is:

Error: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop08.us-poclab.dellpoc.com/192.168.1.102:60000 failed on local exception: java.io.EOFException (state=08000,code=101)
org.apache.phoenix.exception.PhoenixIOException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop08.us-poclab.dellpoc.com/192.168.1.102:60000 failed on local exception: java.io.EOFException
        at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:101)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:817)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1107)
        at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
        at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1315)
        at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:445)
        at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
        at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:256)
        at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:248)
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:246)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:960)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1519)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1489)
        at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1489)
       at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
        at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:129)
        at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
        at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
        at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
        at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
        at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
        at sqlline.SqlLine.dispatch(SqlLine.java:817)
        at sqlline.SqlLine.initArgs(SqlLine.java:633)
        at sqlline.SqlLine.begin(SqlLine.java:680)
        at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
        at sqlline.SqlLine.main(SqlLine.java:424)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop08.us-poclab.dellpoc.com/192.168.1.102:60000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStub(HConnectionManager.java:1650)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(HConnectionManager.java:1676)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveMasterService(HConnectionManager.java:1884)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHTableDescriptor(HConnectionManager.java:2671)
        at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:397)
        at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:402)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:746)
        ... 31 more
Caused by: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop08.us-poclab.dellpoc.com/192.168.1.102:60000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1674)
        at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1715)
        at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:42561)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(HConnectionManager.java:1687)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(HConnectionManager.java:1596)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStub(HConnectionManager.java:1622)
        ... 37 more
Caused by: java.io.IOException: Call to ausgtmhadoop08.us-poclab.dellpoc.com/192.168.1.102:60000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.hbase.ipc.RpcClient.wrapException(RpcClient.java:1485)
        at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1457)
        at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1657)
        ... 42 more
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:392)
        at org.apache.hadoop.hbase.ipc.RpcClient$Connection.readResponse(RpcClient.java:1072)
        at org.apache.hadoop.hbase.ipc.RpcClient$Connection.run(RpcClient.java:728)

Thanks
Deepak Gattala


From: anil gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 7:04 PM

To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

Hi Deepak,
The current feature only supports connecting using a keytab and principal.
IMO, you have the following options now:
1. Get a keytab generated and use the OOTB feature.
2. Try to inherit the secure session. In this case the Phoenix OOTB secure-connection feature will not play any role at all.
3. Enhance Phoenix to support your use case.
At present, #2 seems like a quick thing to try out.
Can you try using a jaas.conf file and a classpath similar to what I specified earlier? In this case just use "<zk>:<zk_port>:<root_dir>" to invoke sqlline.
Make sure "useTicketCache=true;" is set in your jaas.conf file. Also, make sure that you only have one jaas.conf file in your classpath.
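For option #2, a minimal ticket-cache-only jaas.conf would look like the following (a sketch matching the CDH client default; note that in JAAS syntax a single semicolon terminates the whole option list, so it belongs after the last option, and no keyTab/principal entries are needed when the ticket cache is used):

```text
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=false
  useTicketCache=true;
};
```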

~Anil

On Mon, Sep 1, 2014 at 4:48 PM, Deepak_Gattala@dell.com wrote:
Hello Anil, sorry for the confusion. I think the details below will give you some visibility; if you need any further information, let me know.

I just login to the edge node as deepak_gattala

And it will talk to my AD and figure out who I am, so when I do klist it already knows me; I do not have to do kinit –kt  hbase…….

[deepak_gattala@ausgtmhadoop10 ~]klist
Ticket cache: FILE:/tmp/krb5cc_134810
Default principal: Deepak_Gattala@AMER.DELL.COM

Valid starting     Expires            Service principal
09/01/14 15:37:40  09/02/14 01:37:40  krbtgt/AMER.DELL.COM@AMER.DELL.COM
09/01/14 15:37:40  09/02/14 01:37:40  krbtgt/DELL.COM@AMER.DELL.COM
09/01/14 15:37:41  09/02/14 01:37:40  krbtgt/DELLPOC.COM@DELL.COM
09/01/14 15:37:41  09/02/14 01:37:40  krbtgt/US-POCLAB.DELLPOC.COM@DELLPOC.COM
09/01/14 15:37:41  09/02/14 01:37:40  AUSGTMHADOOP10$@US-POCLAB.DELLPOC.COM

After that I can just run hbase shell and do whatever I ran. I do not have to use any hbase.keytab file to do so.

The goal is that we run things as NT/AD users like deepak_gattala, anil_kumar, james_tylor, etc.

I hope this helps, we do not want to use the keytab files of the services.

Thanks
Deepak Gattala

From: anil gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 6:43 PM

To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

Hi Deepak,
You need to use the following command to invoke sqlline when you want to use the OOTB feature:
sqlline.sh <zk>:<zk_port>:<root_dir>:<principal>:<keytab>
The Phoenix client does the authentication using the keytab and principal.
Do you do kinit before running "hbase shell"? I am assuming you have the keytab file on this box.
Can you provide the entire log from when you try to invoke phoenix?
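To illustrate how the pieces of that sqlline connection string line up, here is a sketch using the hostnames from this thread (purely illustrative; Phoenix's actual URL parsing may differ in detail):

```python
# Hypothetical Phoenix secure connection string: zookeeper quorum, port,
# root znode, kerberos principal, and keytab path, separated by colons.
conn = ("ausgtmhadoop10.us-poclab.dellpoc.com:2181:/hbase:"
        "hbase/ausgtmhadoop10.us-poclab.dellpoc.com@US-POCLAB.DELLPOC.COM:"
        "/tmp/hbase.keytab")

# The principal contains '/' and '@' but no ':', so a plain split is enough
# for this illustration.
zk, port, root, principal, keytab = conn.split(":")

print(principal)  # identity the client authenticates as
print(keytab)     # keytab file the key is read from
```

The point is that the client itself performs the Kerberos login from the last two fields, rather than relying on an existing session.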
Thanks,
Anil Gupta


On Mon, Sep 1, 2014 at 4:33 PM, Deepak_Gattala@dell.com wrote:
Hi Anil,

I am logging in the edge node as deepak_gattala and I am able to do this.

[deepak_gattala@ausgtmhadoop10 ~]hbase shell
14/09/01 18:31:11 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 0.98.1-cdh5.1.0, rUnknown, Sat Jul 12 08:20:49 PDT 2014

hbase(main):001:0> scan 'weblog'
ROW                                                    COLUMN+CELL
 row1                                                  column=stats:daily, timestamp=1405608596314, value=test-daily-value
 row1                                                  column=stats:monthly, timestamp=1405608606261, value=test-monthly-value
 row1                                                  column=stats:weekly, timestamp=1405608606216, value=test-weekly-value
 row2                                                  column=stats:weekly, timestamp=1405609101013, value=test-weekly-value
 row3                                                  column=stats:daily, timestamp=1406661440128, value=test-daily-value
3 row(s) in 5.9620 seconds

hbase(main):002:0> quit
[deepak_gattala@ausgtmhadoop10 ~]hadoop fs -ls /
Found 5 items
drwxrwxr-x   - hbase hbase               0 2014-09-01 17:38 /hbase
drwxrwxr-x   - solr  solr                0 2014-07-18 17:38 /solr
drwxrwxr-x   - hdfs  supergroup          0 2013-12-26 23:53 /system
drwxrwxrwt   - hdfs  supergroup          0 2014-09-01 18:20 /tmp
drwxrwxr-x   - hdfs  supergroup          0 2014-08-29 10:13 /user

Am I still required to pass the keytab file when I call sqlline, as you mentioned below:

Use following command to invoke sqlline:
sqlline.sh <zk>:<zk_port>:<root_dir>:<principal>:<keytab>

or can I just call it like sqlline <zk>:2181:/hbase?

Can you please clarify that?

Thanks
Deepak Gattala

From: anil gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 6:27 PM

To: user@phoenix.apache.org
Subject: Re: Kerberos Secure cluster and phoenix

Hi Deepak,
AFAIK, jaas.conf is not required when using the OOTB feature for connecting to a secure cluster.
It seems like the user connecting to the secure HBase cluster does not have proper permissions set up for the ZK znode. Can you check the permissions of the znode "/hbase" and make sure that the user has proper access?
Thanks,
Anil Gupta

On Mon, Sep 1, 2014 at 4:21 PM, Deepak_Gattala@dell.com wrote:
Hi Anil,

Thanks for sharing the details,

I am now getting the error like.

14/09/01 18:17:33 ERROR client.HConnectionManager$HConnectionImplementation: Can't get connection to ZooKeeper: KeeperErrorCode = AuthFailed for /hbase

Are you familiar with this? It looks like it's having issues communicating with Zookeeper. Any thoughts on it?

I think you need a jaas.conf file; is that not required any more? I don't see it in your script.

Thanks
Deepak Gattala

From: anil gupta [mailto:anilgupta84@gmail.com]
Sent: Monday, September 1, 2014 6:12 PM
To: user@phoenix.apache.org

Subject: Re: Kerberos Secure cluster and phoenix

Hi Deepak,
What version of phoenix are you using? Phoenix 3.1 and 4.1 support connecting to a secure Hadoop/HBase cluster out of the box (PHOENIX-19). Are you running HBase on a fully distributed cluster?
I would recommend that you use the phoenix-*-client-without-hbase.jar file.

Use following command to invoke sqlline:
sqlline.sh <zk>:<zk_port>:<root_dir>:<principal>:<keytab>
Last week I used the 3.1 release to connect to a secure HBase cluster running cdh4.6. Here is the bash script with the modified classpath:
---------------------------------------------------------------------------------------
#!/bin/bash
current_dir=$(cd $(dirname $0);pwd)
phoenix_jar_path="$current_dir/.."
phoenix_client_jar=$(find $phoenix_jar_path/phoenix-*-client-without-hbase.jar)


if [ -z "$1" ]
  then echo -e "Zookeeper not specified. \nUsage: sqlline.sh <zookeeper> <optional_sql_file> \nExample: \n 1. sqlline.sh localhost \n 2. sqlline.sh localhost ../examples/stock_symbol.sql";
  exit;
fi

if [ "$2" ]
  then sqlfile="--run=$2";
fi

echo Phoenix_Client_Jar=$phoenix_client_jar

java -cp "/etc/hbase/conf:.:../sqlline-1.1.2.jar:../jline-2.11.jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.15-cdh4.6.0-security.jar:/opt/cloudera/parcels/CDH/lib/hbase/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/*:/opt/cloudera/parcels/CDH/lib/hadoop/lib/*:../phoenix-core-3.1.0.jar:$phoenix_client_jar" \
  -Dlog4j.configuration=file:$current_dir/log4j.properties \
  sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 \
  -n none -p none --color=true --fastConnect=false --verbose=true \
  --isolation=TRANSACTION_READ_COMMITTED $sqlfile
---------------------------------------------------------------------------------------
Just modify the above script as per the 5.1 release of CDH and your environment setup. Let us know if it doesn't work.

Thanks,
Anil Gupta

On Mon, Sep 1, 2014 at 3:12 PM, Alex Kamil <alex.kamil@gmail.com> wrote:
Deepak,

also I'd first check whether hbase is working and accessible in secure mode with the same kerberos principal you use for the phoenix client:
- start hbase shell and see if you can run some commands in secure mode
- verify hbase, hadoop, and zookeeper are running in secure mode; are there any exceptions in the server logs?
- can you execute commands in the hdfs shell and with the zookeeper client?
- run kinit as shown in the cdh security guide for hbase; what do you see when you run klist?
- enable kerberos debug mode in sqlline.py, with something like

kerberos="-Djava.security.auth.login.config=/myapp/phoenix/bin/zk-jaas.conf -Dsun.security.krb5.debug=true -Djava.security.krb5.realm=MYDOMAIN -Djava.security.krb5.kdc=MYKDC -Djava.security.krb5.conf=/etc/krb5.conf"

java_cmd = 'java ' + kerberos + ' -classpath ".' + os.pathsep + extrajars + os.pathsep + phoenix_utils.phoenix_client_jar + \
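Assembled end to end, the resulting command string might look like this (a sketch with placeholder jar paths and zookeeper host; the real sqlline.py derives these values from phoenix_utils):

```python
import os

# Placeholder values standing in for the real environment (assumptions):
extrajars = "/etc/hbase/conf" + os.pathsep + "/etc/hadoop/conf"
phoenix_client_jar = "/opt/phoenix/phoenix-4.1.0-client-hadoop2.jar"

# Kerberos JVM flags, as in the snippet above.
kerberos = ("-Djava.security.auth.login.config=/myapp/phoenix/bin/zk-jaas.conf"
            " -Dsun.security.krb5.debug=true"
            " -Djava.security.krb5.conf=/etc/krb5.conf")

# Same shape as the sqlline.py pattern: flags first, then the classpath,
# then the sqlline main class and the JDBC URL.
java_cmd = ('java ' + kerberos
            + ' -classpath ".' + os.pathsep + extrajars
            + os.pathsep + phoenix_client_jar + '"'
            + ' sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver'
            + ' -u jdbc:phoenix:zk-host:2181:/hbase')

print(java_cmd)
```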

Alex

On Mon, Sep 1, 2014 at 6:09 PM, James Taylor <jamestaylor@apache.org> wrote:
Please try with the 4.1 jars in our binary distribution here:

http://phoenix.apache.org/download.html

Make sure to use the jars for the client and server in the hadoop2 directory.

Then follow the directions that Alex posted here:

http://bigdatanoob.blogspot.com/2013/09/connect-phoenix-to-secure-hbase-cluster.html

http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH5/latest/CDH5-Security-Guide/CDH5-Security-Guide.html

It sounds to me like there's a mismatch between your client and server jars.

Thanks,
James

On Mon, Sep 1, 2014 at 2:43 PM, Deepak_Gattala@dell.com wrote:
> I am getting the following error; I would really appreciate any comments, please.
>
> Error: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException (state=08000,code=101)
> org.apache.phoenix.exception.PhoenixIOException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException
>         at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:101)
>        at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:846)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1057)
>         at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>         at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>         at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1452)
>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>         at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>         at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>         at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>         at sqlline.SqlLine.dispatch(SqlLine.java:817)
>         at sqlline.SqlLine.initArgs(SqlLine.java:633)
>         at sqlline.SqlLine.begin(SqlLine.java:680)
>         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>         at sqlline.SqlLine.main(SqlLine.java:424)
> Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.io.IOException: Call to ausgtmhadoop10.us-poclab.dellpoc.com/192.168.1.100:60000 failed on local exception: java.io.EOFException

>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$StubMaker.makeStub(HConnectionManager.java:1650)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(HConnectionManager.java:1676)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveMasterService(HConnectionManager.java:1884)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHTableDescriptor(HConnectionManager.java:2671)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:397)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:402)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:772)
>
...

[Message clipped]
