phoenix-user mailing list archives

From Josh Elser <josh.el...@gmail.com>
Subject Re: Problem connecting JDBC client to a secure cluster
Date Thu, 13 Apr 2017 15:46:44 GMT
Just some extra context here:

From your original message, you noted how the ZK connection succeeded 
but the HBase connection didn't. The JAAS configuration file you 
provided is *only* used by ZooKeeper. As you have eventually realized, 
hbase-site.xml is the configuration file which controls how the client 
connects to HBase.
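
For what it's worth, the client-side hbase-site.xml on a Kerberized
cluster needs at least something like the following (a minimal sketch;
the server principals must match what the cluster actually uses):

<configuration>
  <!-- tell the HBase client to use SASL/Kerberos for its RPCs -->
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <!-- server principals the client will authenticate against;
       _HOST is expanded to each node's hostname -->
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@HADOOP.INT</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@HADOOP.INT</value>
  </property>
</configuration>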

Nice write-up for the archive :)

rafa wrote:
> Hi all,
>
>
> I have been able to track down the origin of the problem and it is
> related to the hbase-site.xml not being loaded correctly by the
> application server.
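>
> A quick way to double-check what the application server actually
> loaded is to ask the HBase Configuration directly (a minimal sketch):
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
>
> public class CheckConf {
>     public static void main(String[] args) {
>         // HBaseConfiguration.create() reads hbase-site.xml from the
>         // classpath; on a secured cluster this should print "kerberos",
>         // while "simple" means the file was not found or not loaded
>         Configuration conf = HBaseConfiguration.create();
>         System.out.println(conf.get("hbase.security.authentication", "simple"));
>         // shows where, if anywhere, the file was picked up from
>         System.out.println(CheckConf.class.getResource("/hbase-site.xml"));
>     }
> }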
>
> Following the instructions given by Anil in this JIRA:
> https://issues.apache.org/jira/browse/PHOENIX-19 it was easy to
> reproduce:
>
> java -cp
> /tmp/testhbase2:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar
> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver  -u
> jdbc:phoenix:node-01u.xxxx.int:2181:phoenix@HADOOP.INT:/etc/security/keytabs/phoenix.keytab
> -n none -p none --color=true --fastConnect=false --verbose=true
> --incremental=false --isolation=TRANSACTION_READ_COMMITTED
>
>
> In /tmp/testhbase2 there are 3 files:
>
> -rw-r--r--   1 root root  4027 Apr 11 18:23 hdfs-site.xml
> -rw-r--r--   1 root root  3973 Apr 11 18:29 core-site.xml
> -rw-rw-rw-   1 root root  3924 Apr 11 18:49 hbase-site.xml
>
>
> a) If hdfs-site.xml is missing or invalid:
>
> It fails with Caused by: java.lang.IllegalArgumentException:
> java.net.UnknownHostException: nameservice1
>
> (with HA HDFS, hdfs-site.xml is needed to resolve the name service)
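>
> For reference, the name service is resolved from entries like these in
> hdfs-site.xml (a trimmed sketch; the namenode hostnames and port here
> are assumptions, not taken from our cluster):
>
> <property>
>   <name>dfs.nameservices</name>
>   <value>nameservice1</value>
> </property>
> <property>
>   <name>dfs.ha.namenodes.nameservice1</name>
>   <value>namenode1,namenode2</value>
> </property>
> <property>
>   <name>dfs.namenode.rpc-address.nameservice1.namenode1</name>
>   <value>node-01u.xxxx.int:8020</value>
> </property>
> <property>
>   <name>dfs.namenode.rpc-address.nameservice1.namenode2</name>
>   <value>node-02u.xxxx.int:8020</value>
> </property>
> <property>
>   <name>dfs.client.failover.proxy.provider.nameservice1</name>
>   <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
> </property>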
>
> b) If core-site.xml is missing or invalid:
>
>   17/04/11 19:05:01 WARN security.UserGroupInformation:
> PriviledgedActionException as:root (auth:SIMPLE)
> cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Failed to
> find any Kerberos tgt)]
> 17/04/11 19:05:01 WARN ipc.RpcClientImpl: Exception encountered while
> connecting to the server : javax.security.sasl.SaslException: GSS
> initiate failed [Caused by GSSException: No valid credentials provided
> (Mechanism level: Failed to find any Kerberos tgt)]
> 17/04/11 19:05:01 FATAL ipc.RpcClientImpl: SASL authentication failed.
> The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Failed to
> find any Kerberos tgt)]
>          at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>          at
> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
>
> ...
> Caused by: GSSException: No valid credentials provided (Mechanism level:
> Failed to find any Kerberos tgt)
>          at
> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>          at
> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>          at
> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>          at
> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>          at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>          at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>          at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
>
>
>
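> The "(auth:SIMPLE)" in the first warning above is the giveaway: without
> core-site.xml, Hadoop's UserGroupInformation never switches over to
> Kerberos. The relevant entries are roughly these two, with everything
> else left at the cluster defaults:
>
> <property>
>   <name>hadoop.security.authentication</name>
>   <value>kerberos</value>
> </property>
> <property>
>   <name>hadoop.security.authorization</name>
>   <value>true</value>
> </property>
>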
> c) If hbase-site.xml is missing or invalid:
>
> The ZooKeeper connection works fine, but the HBase Master one does not:
>
> java  -cp
> /tmp/testhbase2:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar
> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver  -u
> jdbc:phoenix:node-01u.xxxx.int:2181:phoenix@HADOOP.INT:/etc/security/keytabs/phoenix.keytab
> -n none -p none --color=true --fastConnect=false --verbose=true
> --incremental=false --isolation=TRANSACTION_READ_COMMITTED
> Setting property: [incremental, false]
> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
> issuing: !connect
> jdbc:phoenix:node-01u.xxxx.int:2181:phoenix@HADOOP.INT:/etc/security/keytabs/phoenix.keytab
> none none org.apache.phoenix.jdbc.PhoenixDriver
> Connecting to
> jdbc:phoenix:node-01u.xxxx.int:2181:phoenix@HADOOP.INT:/etc/security/keytabs/phoenix.keytab
> 17/04/11 19:06:38 INFO query.ConnectionQueryServicesImpl: Trying to
> connect to a secure cluster with keytab:/etc/security/keytabs/phoenix.keytab
> 17/04/11 19:06:38 INFO security.UserGroupInformation: Login successful
> for user phoenix@HADOOP.INT using keytab
> file /etc/security/keytabs/phoenix.keytab
> 17/04/11 19:06:38 INFO query.ConnectionQueryServicesImpl: Successfull
> login to secure cluster!!
> 17/04/11 19:06:38 INFO zookeeper.RecoverableZooKeeper: Process
> identifier=hconnection-0x6f9edfc9 connecting to ZooKeeper
> ensemble=node-01u.xxxx.int:2181
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:zookeeper.version=3.4.5-cdh5.7.0--1, built on 04/17/2016
> 08:12 GMT
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:host.name=node-01u.xxxx.int
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.version=1.7.0_131
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.vendor=Oracle Corporation
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.131.x86_64/jre
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.class.path=/tmp/testhbase2:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/lib/hadoop-hdfs-2.6.0-cdh5.7.0.jar:/opt/cloudera/parcels/CLABS_PHOENIX-4.7.0-1.clabs_phoenix1.3.0.p0.000/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.io.tmpdir=/tmp
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:java.compiler=<NA>
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:os.name=Linux
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:os.version=2.6.32-573.26.1.el6.x86_64
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:user.name=root
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:user.home=/root
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Client
> environment:user.dir=/tmp/testhbase2
> 17/04/11 19:06:38 INFO zookeeper.ZooKeeper: Initiating client
> connection, connectString=node-01u.xxxx.int:2181 sessionTimeout=90000
> watcher=hconnection-0x6f9edfc90x0, quorum=node-01u.xxxx.int:2181,
> baseZNode=/hbase
> 17/04/11 19:06:39 INFO zookeeper.ClientCnxn: Opening socket connection
> to server node-01u.xxxx.int/192.168.101.161:2181. Will not attempt to
> authenticate using SASL (unknown error)
> 17/04/11 19:06:39 INFO zookeeper.ClientCnxn: Socket connection
> established, initiating session, client: /192.168.101.161:33960,
> server: node-01u.xxxx.int/192.168.101.161:2181
> 17/04/11 19:06:39 INFO zookeeper.ClientCnxn: Session establishment
> complete on server node-01u.xxxx.int/192.168.101.161:2181, sessionid =
> 0x15afb9d0dee8c0f, negotiated timeout = 60000
> 17/04/11 19:06:39 WARN util.NativeCodeLoader: Unable to load
> native-hadoop library for your platform... using builtin-java classes
> where applicable
> 17/04/11 19:06:39 INFO metrics.Metrics: Initializing metrics system: phoenix
> 17/04/11 19:06:39 WARN impl.MetricsConfig: Cannot locate configuration:
> tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
> 17/04/11 19:06:39 INFO impl.MetricsSystemImpl: Scheduled snapshot period
> at 10 second(s).
> 17/04/11 19:06:39 INFO impl.MetricsSystemImpl: phoenix metrics system
> started
> 17/04/11 19:06:39 INFO Configuration.deprecation: hadoop.native.lib is
> deprecated. Instead, use io.native.lib.available
> 17/04/11 19:07:28 INFO client.RpcRetryingCaller: Call exception,
> tries=10, retries=35, started=48483 ms ago, cancelled=false, msg=
> 17/04/11 19:07:48 INFO client.RpcRetryingCaller: Call exception,
> tries=11, retries=35, started=68653 ms ago, cancelled=false, msg=
>
>
> HBase Master log:
>
> 2017-04-11 19:06:40,212 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
> RpcServer.listener,port=60000: Caught exception while
> reading:Authentication is required
> 2017-04-11 19:06:40,418 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
> RpcServer.listener,port=60000: Caught exception while
> reading:Authentication is required
> 2017-04-11 19:06:40,726 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
> RpcServer.listener,port=60000: Caught exception while
> reading:Authentication is required
> 2017-04-11 19:06:41,233 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
> RpcServer.listener,port=60000: Caught exception while
> reading:Authentication is required
>
>
> Thanks !!
> Best Regards,
> rafa
>
>
>
>
> On Tue, Apr 11, 2017 at 4:05 PM, rafa <rafa13@gmail.com> wrote:
>
>     Hi everybody !,
>
>     We have a CDH 5.8 kerberized cluster in which we have installed
>     Apache Phoenix 4.7 (via CLABS parcel). Everything works as expected.
>     The only problem we are facing is when trying to connect a
>     WebLogic Server to Apache Phoenix via the fat client.
>
>     The needed files are added to the classpath: hbase-site.xml,
>     core-site.xml and hdfs-site.xml.
>
>     Jaas.conf used:
>
>     Client {
>     com.sun.security.auth.module.Krb5LoginModule required
>     principal="phoenix@HADOOP.INT"
>     useKeyTab=true
>     keyTab=phoenix.keytab
>     storeKey=true
>     debug=true;
>     };
>
>     JDBC URL used:
>     jdbc:phoenix:node-01u.xxxx.int:2181:hbase/phoenix@HADOOP.INT:/wldoms/domcb1arqu/phoenix.keytab
>
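>     In code terms, the connect attempt from the application boils down
>     to plain JDBC with that URL. A minimal sketch (values copied from
>     the URL above; Phoenix performs the keytab login itself via
>     UserGroupInformation before opening the HBase connection):
>
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>
>     public class PhoenixConnect {
>         public static void main(String[] args) throws Exception {
>             // zk quorum : port : principal : keytab, packed into the URL
>             String url = "jdbc:phoenix:node-01u.xxxx.int:2181"
>                 + ":hbase/phoenix@HADOOP.INT"
>                 + ":/wldoms/domcb1arqu/phoenix.keytab";
>             // the Phoenix client jar registers the driver via the JDBC
>             // service loader, so no explicit Class.forName() is needed
>             try (Connection conn = DriverManager.getConnection(url)) {
>                 System.out.println("connected: " + !conn.isClosed());
>             }
>         }
>     }
>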
>     The secure connection is made correctly to ZooKeeper, but it never
>     succeeds when connecting to the HBase Master:
>
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:java.io.tmpdir=/tmp
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:java.compiler=<NA>
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:os.name=Linux
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:os.arch=amd64
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:os.version=2.6.32-642.11.1.el6.x86_64
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:user.name=weblogic
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:user.home=/home/weblogic
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Client
>     environment:user.dir=/wldoms/dominios/domcb1arqu
>     17/04/10 12:38:23 INFO zookeeper.ZooKeeper: Initiating client
>     connection, connectString=node-01u.xxxx.int:2181 sessionTimeout=90000
>     watcher=hconnection-0x589619380x0, quorum=node-01u.xxxx.int:2181,
>     baseZNode=/hbase
>     17/04/10 12:38:23 INFO zookeeper.Login: successfully logged in.
>     17/04/10 12:38:23 INFO zookeeper.Login: TGT refresh thread started.
>     17/04/10 12:38:23 INFO zookeeper.Login: TGT valid starting
>     at:        Mon Apr 10 12:38:23 CEST 2017
>     17/04/10 12:38:23 INFO zookeeper.Login: TGT
>     expires:                  Tue Apr 11 12:38:23 CEST 2017
>     17/04/10 12:38:23 INFO zookeeper.Login: TGT refresh sleeping until:
>     Tue Apr 11 08:20:46 CEST 2017
>     17/04/10 12:38:23 INFO client.ZooKeeperSaslClient: Client will use
>     GSSAPI as SASL mechanism.
>     17/04/10 12:38:23 INFO zookeeper.ClientCnxn: Opening socket
>     connection to server node-01u.xxxx.int/192.168.101.161:2181. Will
>     attempt to SASL-authenticate using Login Context section 'Client'
>     17/04/10 12:38:23 INFO zookeeper.ClientCnxn: Socket connection
>     established, initiating session, client: /192.168.60.6:49232,
>     server: node-01u.xxxx.int/192.168.101.161:2181
>     17/04/10 12:38:23 INFO zookeeper.ClientCnxn: Session establishment
>     complete on server node-01u.xxxx.int/192.168.101.161:2181,
>     sessionid = 0x15afb9d0dee82a6, negotiated timeout = 60000
>     17/04/10 12:38:24 INFO metrics.Metrics: Initializing metrics system:
>     phoenix
>     17/04/10 12:38:24 WARN impl.MetricsConfig: Cannot locate
>     configuration: tried
>     hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
>     17/04/10 12:38:24 INFO impl.MetricsSystemImpl: Scheduled snapshot
>     period at 10 second(s).
>     17/04/10 12:38:24 INFO impl.MetricsSystemImpl: phoenix metrics
>     system started
>     17/04/10 12:38:24 INFO Configuration.deprecation: hadoop.native.lib
>     is deprecated. Instead, use io.native.lib.available
>     17/04/10 12:39:13 INFO client.RpcRetryingCaller: Call exception,
>     tries=10, retries=35, started=48396 ms ago, cancelled=false, msg=
>     17/04/10 12:39:33 INFO client.RpcRetryingCaller: Call exception,
>     tries=11, retries=35, started=68572 ms ago, cancelled=false, msg=
>     17/04/10 12:39:53 INFO client.RpcRetryingCaller: Call exception,
>     tries=12, retries=35, started=88752 ms ago, cancelled=false, msg=
>     17/04/10 12:40:13 INFO client.RpcRetryingCaller: Call exception,
>     tries=13, retries=35, started=108791 ms ago, cancelled=false, msg=
>
>
>     It keeps retrying until it finally fails.
>     ....
>
>     We obtain:
>
>     Mon Apr 10 12:38:24 CEST 2017,
>     RpcRetryingCaller{globalStartTime=1491820704684, pause=100,
>     retries=35}, org.apache.hadoop.hbase.MasterNotRunningException:
>     com.google.protobuf.ServiceException:
>     org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call
>     to node-05u.xxxx.int/192.168.101.167:60000 failed on local
>     exception:
>     org.apache.hadoop.hbase.exceptions.ConnectionClosingException:
>     Connection to node-05u.xxxx.int/192.168.101.167:60000 is closing.
>     Call id=0, waitTime=32
>     Mon Apr 10 12:38:25 CEST 2017,
>     RpcRetryingCaller{globalStartTime=1491820704684, pause=100,
>     retries=35}, org.apache.hadoop.hbase.MasterNotRunningException:
>     com.google.protobuf.ServiceException:
>     org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call
>     to node-05u.xxxx.int/192.168.101.167:60000 failed on local
>     exception:
>     org.apache.hadoop.hbase.exceptions.ConnectionClosingException:
>     Connection to node-05u.xxxx.int/192.168.101.167:60000 is closing.
>     Call id=1, waitTime=6
>
>
>     The connectivity to the HBase Master (port 60000) is OK from the
>     WLS machine.
>
>     Looking at the HBase Master logs, we see that the HBase Master is
>     responding every time with an "Authentication is required" error:
>
>
>     2017-04-10 12:47:15,849 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     RpcServer.listener,port=60000: connection from 192.168.60.6:34380;
>     # active connections: 5
>     2017-04-10 12:47:15,849 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     RpcServer.listener,port=60000: Caught exception while
>     reading:Authentication is required
>     2017-04-10 12:47:15,849 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     RpcServer.listener,port=60000: DISCONNECTING client
>     192.168.60.6:34380 because read count=-1. Number of active
>     connections:
>
>     Executing an "hbase shell" manually inside the cluster, after
>     obtaining a ticket with the same keytab (the two commands are
>     sketched after the log below), we see:
>
>     2017-04-11 12:33:12,319 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     RpcServer.listener,port=60000: connection from
>     192.168.101.161:60370; # active connections: 5
>     2017-04-11 12:33:12,330 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Kerberos principal name is hbase/node-05u.xxxx.int@HADOOP.INT
>     2017-04-11 12:33:12,330 DEBUG
>     org.apache.hadoop.security.UserGroupInformation: PrivilegedAction
>     as:hbase/node-05u.xxxx.int@HADOOP.INT (auth:KERBEROS)
>     from:org.apache.hadoop.hbase.ipc.RpcServer$Connection.saslReadAndProcess(RpcServer.java:1354)
>     2017-04-11 12:33:12,331 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Created SASL server with mechanism = GSSAPI
>     2017-04-11 12:33:12,331 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Have read input token of size 640 for processing by
>     saslServer.evaluateResponse()
>     2017-04-11 12:33:12,333 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Will send token of size 108 from saslServer.
>     2017-04-11 12:33:12,335 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Have read input token of size 0 for processing by
>     saslServer.evaluateResponse()
>     2017-04-11 12:33:12,335 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Will send token of size 32 from saslServer.
>     2017-04-11 12:33:12,336 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     Have read input token of size 32 for processing by
>     saslServer.evaluateResponse()
>     2017-04-11 12:33:12,336 DEBUG
>     org.apache.hadoop.hbase.security.HBaseSaslRpcServer: SASL server
>     GSSAPI callback: setting canonicalized client ID: phoenix@HADOOP.INT
>     2017-04-11 12:33:12,336 DEBUG org.apache.hadoop.hbase.ipc.RpcServer:
>     SASL server context established. Authenticated client:
>     phoenix@HADOOP.INT (auth:KERBEROS). Negotiated QoP is auth
>     2017-04-11 12:33:12,336 INFO
>     SecurityLogger.org.apache.hadoop.hbase.Server: Auth successful for
>     phoenix@HADOOP.INT (auth:KERBEROS)
>     2017-04-11 12:33:12,338 INFO
>     SecurityLogger.org.apache.hadoop.hbase.Server: Connection from
>     192.168.101.161 port: 60370 with version info: version:
>     "1.2.0-cdh5.8.0" url:
>     "file:///data/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hbase-1.2.0-cdh5.8.0"
>     revision: "Unknown" user: "jenkins" date: "Tue Jul 12 16:09:11 PDT
>     2016" src_checksum: "b910b34d6127cf42495e0a8bf37a0e9e"
>     2017-04-11 12:33:12,338 INFO
>     SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager:
>     Authorization successful for phoenix@HADOOP.INT (auth:KERBEROS) for
>     protocol=interface
>     org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingInterface
>
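>     For reference, the manual check above was roughly the following,
>     with the principal and keytab used throughout this thread:
>
>     kinit -kt /etc/security/keytabs/phoenix.keytab phoenix@HADOOP.INT
>     hbase shell
>
>     i.e. a plain keytab login followed by the stock shell.
>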
>     It seems that the JDBC driver is not trying to authenticate to
>     HBase. Perhaps some of you have faced a similar situation or could
>     point me in a new direction.
>
>     Thank you very much for your help !
>     Best Regards,
>     rafa.
>
>
>
