I am trying to create a simple POC of the Spark / Phoenix integration. The operation is: 

val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "SIMPLE_TABLE", "zkUrl" -> "some-name:2181"))
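
For reference, I believe the equivalent read through the DataFrameReader API is the following (just a sketch; SIMPLE_TABLE and some-name:2181 are the same table and ZooKeeper quorum as above, and the load call is what I've actually been running):

val df = sqlContext.read
  .format("org.apache.phoenix.spark")  // Phoenix Spark data source
  .option("table", "SIMPLE_TABLE")     // Phoenix table to read
  .option("zkUrl", "some-name:2181")   // ZooKeeper quorum
  .load()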

The error I get from that load call is:

org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SYSTEM.CATALOG
at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:542)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1113)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1033)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:120)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)
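
Since the failure happens while Phoenix initializes the connection and reads SYSTEM.CATALOG, I'd expect a bare JDBC connection against the same quorum to hit the same code path. A minimal check from the same spark-shell session (assuming the client jar passed via --jars is on the driver classpath) would be something like:

import java.sql.DriverManager

// Open a plain Phoenix JDBC connection against the same quorum
val conn = DriverManager.getConnection("jdbc:phoenix:some-name:2181")

// If this works, SYSTEM.CATALOG is readable with this classpath/config
val rs = conn.createStatement()
  .executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5")
while (rs.next()) println(rs.getString("TABLE_NAME"))
conn.close()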

This is in a spark-shell session started with the command:

spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 --jars /root/jars/phoenix-spark-4.7.0-HBase-1.2.jar,/root/jars/phoenix-4.7.0-HBase-1.2-client.jar
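
One thing I'm not sure about (pure speculation on my part): whether the HBase client configuration also needs to be on the Spark classpath, e.g. if hbase.zookeeper.znode.parent is set to something non-default on this cluster. A variant of the launch command that adds it would look like this, where /etc/hbase/conf is only my guess at where hbase-site.xml lives:

spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 \
  --jars /root/jars/phoenix-spark-4.7.0-HBase-1.2.jar,/root/jars/phoenix-4.7.0-HBase-1.2-client.jar \
  --driver-class-path /etc/hbase/conf \
  --conf spark.executor.extraClassPath=/etc/hbase/conf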


Using both sqlline.py and the hbase shell, I can see that SYSTEM.CATALOG clearly exists and contains the table metadata I'd expect.
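
Roughly, those checks looked like this (same quorum as the zkUrl above):

./sqlline.py some-name:2181
0: jdbc:phoenix:some-name:2181> SELECT TABLE_SCHEM, TABLE_NAME FROM SYSTEM.CATALOG LIMIT 10;

hbase shell
hbase(main):001:0> scan 'SYSTEM.CATALOG', {LIMIT => 1}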

What am I doing wrong here?

Thanks,
-nathan