phoenix-user mailing list archives

From Giuseppe Reina <g.re...@gmail.com>
Subject Re: Phoenix and Kerberos (No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt))
Date Tue, 17 Jun 2014 14:54:07 GMT
Thank you. I'll try that!


On Tue, Jun 17, 2014 at 3:50 PM, anil gupta <anilgupta84@gmail.com> wrote:

> Hi Giuseppe,
>
> The latest nightly builds of Phoenix have the patch for
> https://issues.apache.org/jira/browse/PHOENIX-19. If you pick up the
> latest nightly, it will be easier to connect to a secure cluster. You
> can find the nightly build for 3.0 here:
>
> https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/
>
> If you are using HBase 0.98, then try out the Phoenix 4.0 nightly. Let me know if
> you need further help.
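>
> As far as I understand, the patch also lets you pass the Kerberos principal and a
> keytab path directly in the JDBC URL, roughly along these lines (the port, principal
> and keytab path below are just placeholders for your own values):
>
> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:2181:/hbase-secure:myuser@MYREALM:/etc/security/keytabs/myuser.keytab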
>
> Thanks,
> Anil Gupta
>
>
> On Tue, Jun 17, 2014 at 6:35 AM, Giuseppe Reina <g.reina@gmail.com> wrote:
>
>> Hi all,
>>   I'm trying to make Phoenix work with HBase and Kerberos, but so far I
>> have had no luck. I'm currently using HDP 2.1 on CentOS 6.5 and following this
>> guide as a reference:
>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>> I'm able to use nearly all of the Hadoop services (MapReduce, ZooKeeper,
>> HBase, ...) with my user, but not Phoenix (note that I granted RWCA permissions
>> to my user in HBase, as shown below).
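>>
>> For reference, the grant was done from the hbase shell with something along these lines
>> (global scope, no table argument):
>>
>>> grant 'myuser', 'RWCA'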
>>
>> I don't see any problems with my TGT:
>>
>> [myuser@zk1.mydomain ~]$ klist -fae
>>> Ticket cache: FILE:/tmp/krb5cc_501
>>> Default principal: myuser@MYREALM
>>> Valid starting     Expires            Service principal
>>> 06/17/14 10:58:50  06/18/14 10:58:50  krbtgt/MYREALM@MYREALM
>>>  renew until 06/24/14 10:58:33, Flags: FRIT
>>> Etype (skey, tkt): des3-cbc-sha1, des3-cbc-sha1
>>> Addresses: (none)
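>>
>> As an extra sanity check, the Hadoop/HBase client layer can be asked directly whether it
>> picks up these credentials from the ticket cache. A minimal probe, assuming hbase-site.xml
>> and the client jars are on the classpath (the class name and messages are just illustrative):
>>
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.hbase.HBaseConfiguration;
>>> import org.apache.hadoop.security.UserGroupInformation;
>>>
>>> public class UgiProbe {
>>>     public static void main(String[] args) throws Exception {
>>>         // Load hbase-site.xml / core-site.xml so hbase.security.authentication=kerberos is honoured
>>>         Configuration conf = HBaseConfiguration.create();
>>>         UserGroupInformation.setConfiguration(conf);
>>>         // With Kerberos enabled, the login user is taken from the local ticket cache (the one klist shows)
>>>         UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
>>>         System.out.println("Login user: " + ugi.getUserName());
>>>         System.out.println("Auth method: " + ugi.getAuthenticationMethod());
>>>         System.out.println("Has Kerberos credentials: " + ugi.hasKerberosCredentials());
>>>     }
>>> }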
>>
>>
>> But when I launch the Phoenix sqlline client with krb5 debugging enabled, using the
>> following command:
>>
>>> [myuser@zk1.mydomain ~]$ java -cp
>>> ".:/usr/lib/phoenix/*:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure -n myuser
>>>  --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED
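>>
>> For reference, the hbase_client_jaas.conf passed above contains the usual ticket-cache based
>> Client section, roughly (reproduced from memory, so it may not match the file shipped by HDP
>> exactly):
>>
>>> Client {
>>>   com.sun.security.auth.module.Krb5LoginModule required
>>>   useKeyTab=false
>>>   useTicketCache=true;
>>> };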
>>
>>
>> I get the following error:
>>
>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>> issuing: !connect
>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure myuser ''
>>> org.apache.phoenix.jdbc.PhoenixDriver
>>> Connecting to
>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/usr/lib/phoenix/phoenix-4.0.0.2.1.2.0-402-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/lib/phoenix/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> Java config name: null
>>> Native config name: /etc/krb5.conf
>>> Loaded from native config
>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>> >>>DEBUG <CCacheInputStream> server principal is krbtgt/MYREALM@MYREALM
>>> >>>DEBUG <CCacheInputStream> key type: 16
>>> >>>DEBUG <CCacheInputStream> auth time: Tue Jun 17 10:58:33 UTC 2014
>>> >>>DEBUG <CCacheInputStream> start time: Tue Jun 17 10:58:50 UTC 2014
>>> >>>DEBUG <CCacheInputStream> end time: Wed Jun 18 10:58:50 UTC 2014
>>> >>>DEBUG <CCacheInputStream> renew_till time: Tue Jun 24 10:58:33 UTC 2014
>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>> Service ticket not found in the subject
>>> >>> Credentials acquireServiceCreds: same realm
>>> default etypes for default_tgs_enctypes: 16 1 3.
>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> >>> KdcAccessibility: reset
>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000, number
>>> of retries =3, #bytes=737
>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>> timeout=30000,Attempt =1, #bytes=737
>>> >>> KrbKdcReq send: #bytes read=714
>>> >>> KdcAccessibility: remove kerberos.mydomain
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> Krb5Context setting mySeqNumber to: 595457406
>>> Krb5Context setting peerSeqNumber to: 0
>>> Created InitSecContextToken:
>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1  ..n..j0..f......
>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01  ................
>>> [...]
>>> 0260: 7A 66 8D 83 5C 76 84 2E   09 6B E4 7E 3C 6C 7A 3A  zf..\v...k..<lz:
>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>> 04 00 ff ff ff ff 7b 14 41 17 41 6a d6 72 f2 55 a5 2f d1 95 c3 99 30 8f 00
>>> 95 9e 1a 23 b6 4b b5 5d 89 6e f5 b4 e6 5a 50 1d d3 01 01 00 00 04 04 04 04 ]
>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54 2d
>>> 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
>>> 00 ff ff ff ff 17 ec 99 7b 96 4e 05 41 26 5e 0b b5 b9 c6 5e c8 52 9b 14 69
>>> d1 43 7a fa bc 4b 75 fe 49 61 2b 99 52 13 c7 9d 01 01 00 00 68 64 70 2d 75
>>> 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>> Found ticket for myuser@MYREALM to go to zookeeper/zk1.mydomain@MYREALM
>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>> Service ticket not found in the subject
>>> >>> Credentials acquireServiceCreds: same realm
>>> default etypes for default_tgs_enctypes: 16 1 3.
>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000, number
>>> of retries =3, #bytes=737
>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>> timeout=30000,Attempt =1, #bytes=737
>>> >>> KrbKdcReq send: #bytes read=714
>>> >>> KdcAccessibility: remove kerberos.mydomain
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> Krb5Context setting mySeqNumber to: 284225265
>>> Krb5Context setting peerSeqNumber to: 0
>>> Created InitSecContextToken:
>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1  ..n..j0..f......
>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01  ................
>>> [...]
>>> 0260: 76 30 32 5D 70 32 BA 6F   1F E0 C7 8F 9C B4 24 73  v02]p2.o......$s
>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>> 04 00 ff ff ff ff 24 63 c2 63 f1 09 e4 a1 d9 8e 56 77 35 a4 c3 76 11 77 d7
>>> 30 a9 6b 15 4d ee d7 2d 5c 80 e8 28 1d 2a 75 ac 1c 01 01 00 00 04 04 04 04 ]
>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54 2d
>>> 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
>>> 00 ff ff ff ff 55 15 d9 ab fb 0f ac 4e a1 1f 2e 0b 89 ca 61 a0 5a d3 4e f6
>>> af 30 4f 6d 8f ad 2d 0c 9d b7 c4 be a7 b2 ac b5 01 01 00 00 68 64 70 2d 75
>>> 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>> 14/06/17 12:25:26 WARN ipc.RpcClient: Exception encountered while
>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>> level: Failed to find any Kerberos tgt)]
>>> 14/06/17 12:25:26 FATAL ipc.RpcClient: SASL authentication failed. The
>>> most likely cause is missing or invalid credentials. Consider 'kinit'.
>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>> find any Kerberos tgt)]
>>>  at
>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>> at
>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>  at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>  at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>> at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>  at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>> at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>  at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>> at
>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>  at
>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>> at
>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>  at
>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>  at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>  at
>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>  at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>  at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>  at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>> Caused by: GSSException: No valid credentials provided (Mechanism level:
>>> Failed to find any Kerberos tgt)
>>> at
>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>  at
>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>> at
>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>  at
>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>> at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>  at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>> at
>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>  ... 47 more
>>> 14/06/17 12:25:28 WARN ipc.RpcClient: Exception encountered while
>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>> level: Failed to find any Kerberos tgt)]
>>> 14/06/17 12:25:28 FATAL ipc.RpcClient: SASL authentication failed. The
>>> most likely cause is missing or invalid credentials. Consider 'kinit'.
>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>> find any Kerberos tgt)]
>>>  at
>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>> at
>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>  at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>  at
>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>  at
>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>> at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>  at
>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>> at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>  at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>> at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>  at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>> at
>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>  at
>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>> at
>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>  at
>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>  at
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>  at
>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>  at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>  at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>  at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>> Caused by: GSSException: No valid credentials provided (Mechanism level:
>>> Failed to find any Kerberos tgt)
>>> at
>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>  at
>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>> at
>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>  at
>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>> at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>  at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>> at
>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>  ... 47 more
>>
>>
>>
>> Can any of you help me with this problem?
>>
>> Kind Regards
>>
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>
