phoenix-user mailing list archives

From Aaron Molitor <amoli...@splicemachine.com>
Subject Re: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
Date Thu, 18 Aug 2016 14:12:16 GMT
Tom, 

While you're waiting for Cloudera, you may be able to gain some deeper insight by looking
around here: https://github.com/cloudera/hbase/tree/cdh5-1.2.0_5.7.2

In my experience, all of the platform vendors (CDH, HDP, MapR) tend to incorporate patches
into the published versions of the components in their deployments. The general idea makes
sense to me, but it can be challenging when one of those patches includes a breaking API
change.

Good Luck!
-Aaron


> On Aug 18, 2016, at 06:02, Squires, Tom (ELS-LON) <tom.squires@elsevier.com> wrote:
> 
> Hi,
> 
> We got to the bottom of this. The core of the issue is in the Cloudera version of HBase
> (hbase-client specifically): 1.2.0-cdh5.7.2 (see
> https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh5_maven_repo_57x.html).
> 
> It looks like the 1.2.0-cdh5.7.2 version of hbase-client contains a different version of
> the problematic HTableDescriptor class: the setValue(String, String) method has a different
> return type in 1.2.0-cdh5.7.2 than it does in stock 1.2.0.
> 
> Phoenix's IndexTool expects the 1.2.0 version (which is bundled with the Phoenix client
> jar), but the Cloudera version was higher up on the classpath, which caused the error in
> my email below.
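> 
> To confirm which copy wins, it can help to dump the classpath that the hbase launcher
> builds and check the ordering (assuming it lists individual jars; the grep pattern is
> only illustrative):
> 
>     # whichever matching jar appears first is the one HTableDescriptor loads from
>     hbase classpath | tr ':' '\n' | grep -n -e 'phoenix.*client' -e 'hbase-client'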
> 
> Our workaround was to rename the Phoenix client jar so that it appears first on the
> classpath and is therefore the jar from which the class gets loaded, giving us the version
> the IndexTool expects.
> 
> We plan on talking with Cloudera to gain a better understanding of the differences between
> hbase-client 1.2.0 and 1.2.0-cdh5.7.2; we'll update this thread if anything comes of that
> discussion.
> 
> Regards,
> Tom
> 
> From: Squires, Tom (ELS-LON)
> Sent: 18 August 2016 10:11
> To: user@phoenix.apache.org
> Cc: Narros, Eduardo (ELS-LON)
> Subject: Unable to create secondary index with IndexTool - "java.lang.NoSuchMethodError:
org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;"
>  
> Hi,
> 
> I am trying to use the org.apache.phoenix.mapreduce.index.IndexTool to create a secondary
index on a table in our Phoenix cluster. We are using HBase 1.2 on Cloudera CDH 5.7.2.
> 
> I downloaded the Phoenix 4.8.0 for HBase 1.2 binaries so that we had a version of IndexTool
that is compatible with our HBase version, but I am getting the following error when running
the tool:
> 
> java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
> 
> This looks to me like the IndexTool is expecting a different version of HBase.
> 
> Can anyone please advise? I have pasted shell output below.
> 
> Many thanks,
> Tom
> 
> [ec2-user@ip-10-0-0-229 ~]$ hbase version
> HBase 1.2.0-cdh5.7.2
> Source code repository file:///data/jenkins/workspace/generic-package-centos64-7-0/topdir/BUILD/hbase-1.2.0-cdh5.7.2
revision=Unknown
> Compiled by jenkins on Fri Jul 22 12:21:17 PDT 2016
> From source with checksum bc88ae0a54f047ea2506e04326e55353
> [ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema
MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path  /home/user/hadoop
> Error: Could not find or load main class org.apache.phoenix.mapreduce.index.IndexTool
> [ec2-user@ip-10-0-0-229 ~]$ find apache-phoenix-4.8.0-HBase-1.2-bin/ -name "*.jar" |
xargs grep IndexTool.class
> Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar matches
> Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar matches
> Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar matches
> Binary file apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar matches
> [ec2-user@ip-10-0-0-229 ~]$ sudo cp apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-server.jar
apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-core-4.8.0-HBase-1.2.jar apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-client.jar
apache-phoenix-4.8.0-HBase-1.2-bin/phoenix-4.8.0-HBase-1.2-hive.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
> [ec2-user@ip-10-0-0-229 ~]$ hbase org.apache.phoenix.mapreduce.index.IndexTool --schema
MENDELEY --data-table DOCUMENTS --index-table INDEX_PROFILE_ID --output-path  /home/user/hadoop
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/lib/phoenix-4.8.0-HBase-1.2-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 16/08/18 05:01:56 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x63cd9962
connecting to ZooKeeper ensemble=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965,
built on 02/20/2014 09:09 GMT
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:host.name=ip-10-0-0-229.eu-west-1.compute.internal
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_67-cloudera/jre
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.class.path=<removed>
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hadoop/lib/native:/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hbase/bin/../lib/native/Linux-amd64-64
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-327.el7.x86_64
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.name=ec2-user
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ec2-user
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ec2-user
> 16/08/18 05:01:56 INFO zookeeper.ZooKeeper: Initiating client connection,connectString=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181
sessionTimeout=60000 watcher=hconnection-0x63cd99620x0,quorum=52.49.131.199:2181,52.48.211.214:2181,52.210.25.200:2181,
baseZNode=/hbase
> 16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Opening socket connection to server 52.210.25.200/52.210.25.200:2181.
Will not attempt to authenticate using SASL (unknown error)
> 16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Socket connection established to 52.210.25.200/52.210.25.200:2181,
initiating session
> 16/08/18 05:01:56 INFO zookeeper.ClientCnxn: Session establishment complete on server
52.210.25.200/52.210.25.200:2181, sessionid = 0x35678e40dae4cee, negotiated timeout = 60000
> 16/08/18 05:01:57 INFO metrics.Metrics: Initializing metrics system: phoenix
> 16/08/18 05:01:57 WARN impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
> 16/08/18 05:01:57 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> 16/08/18 05:01:57 INFO impl.MetricsSystemImpl: phoenix metrics system started
> 16/08/18 05:01:58 INFO Configuration.deprecation: hadoop.native.lib is deprecated. Instead,
use io.native.lib.available
> Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.setValue(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/hadoop/hbase/HTableDescriptor;
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.generateTableDescriptor(ConnectionQueryServicesImpl.java:756)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1020)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1396)
>     at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2302)
>     at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:922)
>     at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1421)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2353)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2300)
>     at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2300)
>     at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:231)
>     at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:144)
>     at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>     at java.sql.DriverManager.getConnection(DriverManager.java:571)
>     at java.sql.DriverManager.getConnection(DriverManager.java:187)
>     at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
>     at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
>     at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
>     at org.apache.phoenix.mapreduce.index.IndexTool.run(IndexTool.java:188)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>     at org.apache.phoenix.mapreduce.index.IndexTool.main(IndexTool.java:394)
> [ec2-user@ip-10-0-0-229 ~]$
> 
> 
> 
> Elsevier Limited. Registered Office: The Boulevard, Langford Lane, Kidlington, Oxford,
OX5 1GB, United Kingdom, Registration No. 1982084, Registered in England and Wales.

