phoenix-user mailing list archives

From: Benjamin Kim <bbuil...@gmail.com>
Subject: Re: Spark Plugin Information
Date: Sat, 09 Apr 2016 19:09:33 GMT
Josh,

For my tests, I’m passing the ZooKeeper quorum URL:

"zkUrl" -> "prod-nj3-hbase-master001.pnj3i.gradientx.com,prod-nj3-namenode001.pnj3i.gradientx.com,prod-nj3-namenode002.pnj3i.gradientx.com:2181”

Is this correct?

Thanks,
Ben


> On Apr 9, 2016, at 8:06 AM, Josh Mahonin <jmahonin@gmail.com> wrote:
> 
> Hi Ben,
> 
> It looks like a connection URL issue. Are you passing the correct 'zkUrl' parameter, or do you have the HBase ZooKeeper quorum defined in an hbase-site.xml available on the classpath?
> 
> If you're able to connect to Phoenix using JDBC, you should be able to take the JDBC URL, pop off the 'jdbc:phoenix:' prefix, and use it as the 'zkUrl' option.
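> 
> For example, something like this (a quick sketch; the host names are placeholders):
> 
>     // JDBC URL:     jdbc:phoenix:zk1,zk2,zk3:2181
>     val zkUrl = "jdbc:phoenix:zk1,zk2,zk3:2181".stripPrefix("jdbc:phoenix:")
>     // zkUrl is now: zk1,zk2,zk3:2181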
> 
> Josh
> 
> On Fri, Apr 8, 2016 at 6:47 PM, Benjamin Kim <bbuild11@gmail.com> wrote:
> Hi Josh,
> 
> I am using CDH 5.5.2 with HBase 1.0.0, Phoenix 4.5.2, and Spark 1.6.0. I looked up the error and found others hitting the same issue, which led me to ask the question. I’ll try the Phoenix 4.7.0 client jar and see what happens.
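> 
> If it helps, my rough plan for that (a sketch; the jar path is a placeholder for wherever I end up putting it) is to add the client jar to both the driver and executor classpaths in spark-defaults.conf:
> 
>     spark.driver.extraClassPath   /opt/phoenix/phoenix-4.7.0-HBase-1.0-client.jar
>     spark.executor.extraClassPath /opt/phoenix/phoenix-4.7.0-HBase-1.0-client.jar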
> 
> The error I am getting is:
> 
> java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:388)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:296)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:179)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1917)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
> 	at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
> 	at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
> 	at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
> 	at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:571)
> 	at java.sql.DriverManager.getConnection(DriverManager.java:187)
> 	at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:93)
> 	at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
> 	at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
> 	at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:280)
> 	at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:101)
> 	at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
> 	at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
> 	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
> 	at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1153)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
> 	at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
> 	at $iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
> 	at $iwC$$iwC$$iwC.<init>(<console>:50)
> 	at $iwC$$iwC.<init>(<console>:52)
> 	at $iwC.<init>(<console>:54)
> 	at <init>(<console>:56)
> 	at .<init>(<console>:60)
> 	at .<clinit>(<console>)
> 	at .<init>(<console>:7)
> 	at .<clinit>(<console>)
> 	at $print(<console>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> 	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> 	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:875)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> 	at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
> 	at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> 	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> 	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> 	at org.apache.spark.repl.Main$.main(Main.scala:31)
> 	at org.apache.spark.repl.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
> 	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
> 	at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:414)
> 	at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:323)
> 	at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
> 	at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:294)
> 	... 73 more
> Caused by: java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
> 	... 78 more
> Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
> 	at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
> 	at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
> 	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2317)
> 	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:688)
> 	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
> 	... 83 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ClientRpcControllerFactory
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:190)
> 	at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
> 	... 87 more
> 
> Thanks,
> Ben
> 
> 
>> On Apr 8, 2016, at 2:23 PM, Josh Mahonin <jmahonin@gmail.com> wrote:
>> 
>> Hi Ben,
>> 
>> If you have a reproducible test case, please file a JIRA for it. The documentation (https://phoenix.apache.org/phoenix_spark.html) is accurate and verified up to Phoenix 4.7.0 and Spark 1.6.0.
>> 
>> Although not supported by the Phoenix project at large, you may find this Docker image useful as a configuration reference:
>> https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
>> 
>> Good luck!
>> 
>> Josh
>> 
>> On Fri, Apr 8, 2016 at 3:11 PM, Benjamin Kim <bbuild11@gmail.com> wrote:
>> Is there an update or patch coming to Spark or the Spark plugin? I see that the Spark plugin does not work because HBase classes are missing from the Spark assembly jar. So, when Spark does reflection, it looks for HBase client classes only in the Spark assembly jar, not in the Phoenix plugin jar. Is this true?
>> 
>> If someone can enlighten me on this topic, please let me know.
>> 
>> Thanks,
>> Ben
>> 

