phoenix-user mailing list archives

From "Riesland, Zack" <Zack.Riesl...@sensus.com>
Subject RE: CsvBulkUpload not working after upgrade to 4.6
Date Fri, 11 Dec 2015 13:01:18 GMT
Thanks James,

I did. Unfortunately, Hortonworks only supports software that they provide in their stack,
and they only provide 4.2.2 for our version of the HDP stack. Even if we upgrade to the latest
HDP, we would only get support for Phoenix 4.4.

FWIW, I am also unable to connect Aqua Data Studio to Phoenix using the new client jar (phoenix-4.6.0-HBase-0.98-client.jar).

When I try, I get the stack trace below.

However, I am still able to connect using the old 4.2.2 jar.


java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
                at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:396)
                at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:298)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:179)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1919)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1898)
                at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1898)
                at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
                at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
                at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
                at \\...\\ .\\हिñçêČάй語简�?한\\.Jᠻꐎꎂᢋ 9.KX(Unknown Source)
                at \\...\\ .\\हिñçêČάй語简�?한\\.Jᠻꐎꎂᢋ 9.au(Unknown Source)
                at \\...\\ .\\हिñçêČάй語简�?한\\.Jᠻꐎꎂᢋ 9.getConnection(Unknown Source)
                at \\...\\ .\\हिñçêČάй語简�?한\\.Jᠻꐎꎂᢋ 9.getConnection(Unknown Source)
                at com.aquafold.datastudio.mainframe.dialog.connection.diagnostics.yꑅꀱꏓᜪ import.dv(Unknown Source)
                at com.aquafold.datastudio.mainframe.dialog.connection.diagnostics.yꑅꀱꏓᜪ import.b(Unknown Source)
                at \\...\\ .\\हिñçêČάй語简�?한\\.bᡲꐢꐟꄦ 5 5.d(Unknown Source)
                at \\...\\ .\\हिñçêČάй語简�?한\\.bᡲꐢꐟꄦ 5 5.b(Unknown Source)
                at com.aquafold.datastudio.mainframe.dialog.connection.diagnostics.PingDialog$3.runTest(Unknown Source)
                at com.aquafold.datastudio.mainframe.dialog.connection.diagnostics.PingDialog$2.run(Unknown Source)
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
                at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:426)
                at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:319)
                at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:292)
                at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:296)
                ... 18 more
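
Stripped of the Aqua Data Studio layers, that connection attempt is just a plain JDBC call against the Phoenix driver, so a minimal standalone check with only the 4.6 client jar on the classpath would look roughly like the sketch below. This is illustrative only: the ZooKeeper quorum, port, and znode in the URL are placeholders, not the actual cluster values.

// Minimal sketch, assuming phoenix-4.6.0-HBase-0.98-client.jar is the only
// Phoenix/HBase jar on the classpath; quorum/port/znode below are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Phoenix JDBC URL format: jdbc:phoenix:<zookeeper quorum>[:<port>[:<root znode>]]
        String url = "jdbc:phoenix:zk-host1,zk-host2,zk-host3:2181:/hbase";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // SYSTEM.CATALOG exists in every Phoenix install, so it makes a safe smoke test
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM SYSTEM.CATALOG")) {
            rs.next();
            System.out.println("Connected; SYSTEM.CATALOG rows: " + rs.getLong(1));
        }
    }
}

If that fails with the same ERROR 103, it would point at the client jar / cluster combination rather than anything specific to Aqua Data Studio.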



From: James Taylor [mailto:jamestaylor@apache.org]
Sent: Wednesday, December 09, 2015 2:31 PM
To: user
Subject: Re: CsvBulkUpload not working after upgrade to 4.6

Zack,
Have you asked Hortonworks through your support channel? This sounds like an issue related
to the HDP version you have - you need to confirm with them that upgrading to Phoenix 4.6.0
will work (and if there are any extra steps you need to take).

Thanks,
James



On Wed, Dec 9, 2015 at 10:41 AM, Riesland, Zack <Zack.Riesland@sensus.com>
wrote:
Thanks Samarth,

I’m running HBase 0.98.4.2.2.8.0-3150 and Phoenix 4.6.0-HBase-0.98.

The hbase stuff is there via the HDP 2.2.8 install. It worked before upgrading to 4.6.

From: Samarth Jain [mailto:samarth@apache.org]
Sent: Wednesday, December 09, 2015 1:29 PM
To: user@phoenix.apache.org
Subject: Re: CsvBulkUpload not working after upgrade to 4.6

Zack,

What version of HBase are you running? And which version of Phoenix (specifically 4.6-0.98
version or 4.6-1.x version)? FWIW, I don't see the MetaRegionTracker.java file in HBase branches
1.x and master. Maybe you don't have the right hbase-client jar in place?

- Samarth

On Wed, Dec 9, 2015 at 4:30 AM, Riesland, Zack <Zack.Riesland@sensus.com>
wrote:
This morning I tried running the same operation from a data node as well as a name node, where
phoenix 4.2 is completely gone, and I get the exact same error.



From: Riesland, Zack
Sent: Tuesday, December 08, 2015 8:42 PM
To: user@phoenix.apache.org
Subject: CsvBulkUpload not working after upgrade to 4.6

I upgraded our cluster from 4.2.2 to 4.6.

After a few hiccups, everything seems to be working: I can connect to and interact with the DB using Aqua Studio, my web code that queries Phoenix works with the new client jar, and my Java code that connects to and interacts with the DB works with the new client jar as well.

With one exception: Csv Bulk Upload does not work with the new client jar – only with the
old one (4.2.2.blah).

On my edge node, where I run this from, I upgraded phoenix using the same script. /usr/hdp/current/phoenix-client
now points to a folder full of 4.6 stuff. Permissions all seem to be correct. However, the
command below fails with the error below.

If I replace the reference to the (4.6) phoenix-client.jar and point explicitly to the old 4.2 client jar instead, the command still works.

Any ideas or suggestions?

Thanks!



HADOOP_CLASSPATH=/usr/hdp/current/hbase-master/conf/:/usr/hdp/current/hbase-master/lib/hbase-protocol.jar
hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool
-Dfs.permissions.umask-mode=000 --z <server info> --table <table> --input <input>
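
For completeness, the same load can also be driven from Java rather than through the hadoop jar wrapper. The sketch below is illustrative only; it assumes CsvBulkLoadTool's usual Hadoop Tool interface, and the quorum, table name, and input path are placeholders.

// Sketch only: programmatic equivalent of the hadoop jar ... CsvBulkLoadTool command above.
// The quorum, table name, and input path are placeholders, and --zookeeper is assumed to be
// the long form of the -z option.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.util.ToolRunner;
import org.apache.phoenix.mapreduce.CsvBulkLoadTool;

public class RunCsvBulkLoad {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // picks up hbase-site.xml from the classpath
        int rc = ToolRunner.run(conf, new CsvBulkLoadTool(), new String[] {
                "--zookeeper", "zk-host:2181",
                "--table", "MY_TABLE",
                "--input", "/user/me/input.csv"
        });
        System.exit(rc);
    }
}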

15/12/08 15:37:01 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=<connect string> sessionTimeout=120000 watcher=hconnection-0x7728c6760x0, quorum=<stuff>
15/12/08 15:37:01 INFO zookeeper.ClientCnxn: Opening socket connection to .... Will not attempt to authenticate using SASL (unknown error)
15/12/08 15:37:01 INFO zookeeper.ClientCnxn: Socket connection established to ..., initiating session
15/12/08 15:37:01 INFO zookeeper.ClientCnxn: Session establishment complete on server..., sessionid = 0x25110076f7fbf25, negotiated timeout = 40000
15/12/08 15:37:01 INFO client.HConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
15/12/08 15:37:01 INFO client.HConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x25110076f7fbf25
15/12/08 15:37:01 INFO zookeeper.ZooKeeper: Session: 0x25110076f7fbf25 closed
15/12/08 15:37:01 INFO zookeeper.ClientCnxn: EventThread shut down
15/12/08 15:37:01 INFO client.HConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x35102a35baf2c0c
15/12/08 15:37:01 INFO zookeeper.ZooKeeper: Session: 0x35102a35baf2c0c closed
15/12/08 15:37:01 INFO zookeeper.ClientCnxn: EventThread shut down
Exception in thread "main" java.sql.SQLException: ERROR 2006 (INT08): Incompatible jars detected between client and server. Ensure that phoenix.jar is put on the classpath of HBase in every region server: org.apache.hadoop.hbase.protobuf.generated.ZooKeeperProtos$MetaRegionServer.hasState()Z
                at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:396)
                at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1000)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:879)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1225)
                at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:113)
                at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2013)
                at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:785)
                at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
                at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:319)
                at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:311)
                at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
                at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:309)
                at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1368)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1929)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1898)
                at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1898)
                at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
                at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
                at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
                at java.sql.DriverManager.getConnection(DriverManager.java:571)
                at java.sql.DriverManager.getConnection(DriverManager.java:187)
                at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:301)
                at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:292)
                at org.apache.phoenix.mapreduce.CsvBulkLoadTool.loadData(CsvBulkLoadTool.java:211)
                at org.apache.phoenix.mapreduce.CsvBulkLoadTool.run(CsvBulkLoadTool.java:184)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
                at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:99)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:606)
                at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
                at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.protobuf.generated.ZooKeeperProtos$MetaRegionServer.hasState()Z
                at org.apache.hadoop.hbase.zookeeper.MetaRegionTracker.getMetaRegionState(MetaRegionTracker.java:219)
                at org.apache.hadoop.hbase.zookeeper.MetaRegionTracker.blockUntilAvailable(MetaRegionTracker.java:204)
                at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getMetaRegionLocation(ZooKeeperRegistry.java:58)
                at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1157)
                at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1249)
                at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1160)
                at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1117)
                at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionLocation(HConnectionManager.java:958)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:439)
                at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:953)
                ... 33 more
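
The NoSuchMethodError at the bottom suggests that the copy of ZooKeeperProtos$MetaRegionServer that actually got loaded comes from an hbase-protocol jar that lacks the hasState() method the 4.6 client's bundled HBase code expects, which matches the "incompatible jars" message. One way to see which jar is winning on the classpath is to ask the JVM directly; this is only a diagnostic sketch (the class name is taken from the error above, everything else is standard Java reflection):

// Diagnostic sketch: print which jar the conflicting protobuf class was loaded from,
// and whether that copy actually has the hasState() method the 4.6 client expects.
// Run it with the same HADOOP_CLASSPATH as the failing command, then again with the 4.2 setup.
import org.apache.hadoop.hbase.protobuf.generated.ZooKeeperProtos;

public class WhichJar {
    public static void main(String[] args) {
        Class<?> clazz = ZooKeeperProtos.MetaRegionServer.class;
        // getCodeSource() can be null for bootstrap classes, but for a class that came
        // from a jar on the classpath it returns that jar's location.
        System.out.println(clazz.getName() + " loaded from: "
                + clazz.getProtectionDomain().getCodeSource().getLocation());
        try {
            clazz.getMethod("hasState");
            System.out.println("hasState() is present in this copy");
        } catch (NoSuchMethodException e) {
            System.out.println("hasState() is missing -- this hbase-protocol jar is older than the 4.6 client expects");
        }
    }
}

Comparing the two locations should show whether the 4.6 client jar and the cluster's HBase jars are pulling in different hbase-protocol versions.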

