phoenix-user mailing list archives

From Nathan Davis <nathan.da...@salesforce.com>
Subject Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark
Date Wed, 10 Aug 2016 13:07:47 GMT
Thanks Josh. I tried that out (adding just the phoenix-client-spark JAR to
the classpath) and got the same error.
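
Concretely, by "adding it to the classpath" I mean launching the shell with
something like the following (the JAR name and path are just from my setup,
so adjust as needed):

  spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 \
    --driver-class-path /root/jars/phoenix-4.7.0-HBase-1.2-client-spark.jar \
    --conf spark.executor.extraClassPath=/root/jars/phoenix-4.7.0-HBase-1.2-client-spark.jar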

I wonder if the issue is that I'm running on EMR 5 with HBase 1.2. The JARs
I'm using are copied over from the HBase master node, because there is no
4.7.0-HBase-1.2 artifact set in Maven. Is the phoenix-spark integration
confirmed to work in 4.7 against HBase 1.2?


On Tue, Aug 9, 2016 at 7:37 PM, Josh Mahonin <jmahonin@gmail.com> wrote:

> Hi Nathan,
>
> That's a new error to me. I've heard some people have had luck passing
> the phoenix-spark and phoenix-client JARs using the --jars option, but
> the recommended procedure is to put the *phoenix-client-spark* JAR on the
> Spark driver and executor classpaths via configuration. [1]
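>
> As a rough sketch, that configuration is a couple of lines in
> spark-defaults.conf (the path here is an example; point it at wherever
> your phoenix-client-spark JAR actually lives):
>
>   spark.driver.extraClassPath    /path/to/phoenix-4.7.0-HBase-1.2-client-spark.jar
>   spark.executor.extraClassPath  /path/to/phoenix-4.7.0-HBase-1.2-client-spark.jar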
>
> As a reference, here's a Docker image with a working configuration as well
> [2]
>
> Good luck,
>
> Josh
>
> [1] https://phoenix.apache.org/phoenix_spark.html
> [2] https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
>
> On Tue, Aug 9, 2016 at 2:20 PM, Nathan Davis <nathan.davis@salesforce.com>
> wrote:
>
>> I am trying to create a simple POC of the Spark / Phoenix integration.
>> The operation is:
>>
>> val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "SIMPLE_TABLE", "zkUrl" -> "some-name:2181"))
>>
>>
>> The error I get from that is:
>>
>> org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SYSTEM.CATALOG
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:542)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1113)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1033)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
>>   at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:120)
>>   at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
>>   at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
>>   at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>   at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
>>   at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
>>   at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>   at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
>>   at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
>>   at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
>>   at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
>>   at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
>>   at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
>>   at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
>>   at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>   at java.sql.DriverManager.getConnection(DriverManager.java:208)
>>   at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
>>
>>
>> This is in a spark-shell session started with the command:
>>
>> spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 --jars /root/jars/phoenix-spark-4.7.0-HBase-1.2.jar,/root/jars/phoenix-4.7.0-HBase-1.2-client.jar
>>
>>
>>
>> Using both sqlline.py and hbase shell I can see that SYSTEM.CATALOG
>> clearly exists and has the table metadata I'd expect.
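>>
>> For example:
>>
>>   sqlline> !tables
>>   hbase(main):001:0> list
>>   hbase(main):002:0> scan 'SYSTEM.CATALOG', {LIMIT => 5}
>>
>> all list the table and return the metadata rows I'd expect.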
>>
>> What am I doing wrong here?
>>
>> Thanks,
>> -nathan
