phoenix-user mailing list archives

From Josh Mahonin <jmaho...@gmail.com>
Subject Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark
Date Tue, 09 Aug 2016 23:37:14 GMT
Hi Nathan,

That's a new error to me. I've heard of some people having luck passing
the phoenix-spark and phoenix-client JARs with the --jars option, but the
recommended procedure is to make sure the *phoenix-client-spark* JAR is on
both the Spark driver and executor classpaths via configuration. [1]

As a reference, here's a Docker image with a working configuration as well
[2]

Good luck,

Josh

[1] https://phoenix.apache.org/phoenix_spark.html
[2] https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
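In case it helps, the classpath configuration described in [1] usually ends up looking something like the snippet below in spark-defaults.conf. The JAR path and filename here are illustrative assumptions; use the client-spark JAR that ships with your Phoenix distribution.

```shell
# conf/spark-defaults.conf
# Path/filename below are examples -- point these at the phoenix-client-spark
# JAR from your own Phoenix install, on every node.
spark.driver.extraClassPath   /opt/phoenix/phoenix-4.7.0-HBase-1.2-client-spark.jar
spark.executor.extraClassPath /opt/phoenix/phoenix-4.7.0-HBase-1.2-client-spark.jar
```

With that in place you shouldn't need --jars at all when launching spark-shell.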

On Tue, Aug 9, 2016 at 2:20 PM, Nathan Davis <nathan.davis@salesforce.com>
wrote:

> I am trying to create a simple POC of the Spark / Phoenix integration. The
> operation is:
>
> val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "SIMPLE_TABLE", "zkUrl" -> "some-name:2181"))
>
>
> The error I get from that is:
>
> org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SYSTEM.CATALOG
>
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:542)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1113)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1033)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
> at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:120)
> at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
> at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
> at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
> at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
> at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
> at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
> at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
> at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
> at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
> at java.sql.DriverManager.getConnection(DriverManager.java:664)
> at java.sql.DriverManager.getConnection(DriverManager.java:208)
> at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
>
>
> This is in a spark-shell session started with command:
>
> spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 --jars /root/jars/phoenix-spark-4.7.0-HBase-1.2.jar,/root/jars/phoenix-4.7.0-HBase-1.2-client.jar
>
> Using both sqlline.py and hbase shell I can see that SYSTEM.CATALOG
> clearly exists and has the table metadata I'd expect.
>
> What am I doing wrong here?
>
> Thanks,
> -nathan
