phoenix-user mailing list archives

From Kanagha <>
Subject Java equivalent for phoenixTableAsDataFrame scala- phoenix-spark plugin
Date Wed, 02 Aug 2017 18:53:59 GMT

I'm trying to find the right approach for reading from a multi-tenant Phoenix
table via the phoenix-spark plugin in Java.

For Scala, I see the following example:

test("Can read from tenant-specific table as DataFrame") {
    val sqlContext = new SQLContext(sc)
    val df = sqlContext.phoenixTableAsDataFrame(
      Seq(OrgIdCol, TenantOnlyCol),
      zkUrl = Some(quorumAddress),
      tenantId = Some(TenantId),
      conf = hbaseConfiguration)

    // There should only be 1 row upserted in tenantSetup.sql
    val count = df.count()
    count shouldEqual 1L
}

For Java, I see the following snippet:

    ..."org.apache.phoenix.spark")
        .option("table", "tableName")
        .option("zkUrl", "")
        .load();

But is there an option to pass in a URL parameter with the tenantId, similar
to how it is done via JDBC?
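For comparison, over plain JDBC the tenant is supplied via the TenantId
connection property rather than baked into the URL path. A minimal Java sketch
of that JDBC side (the quorum address, table name, and tenant value below are
placeholders, not from this thread):

```java
import java.util.Properties;

public class TenantConnection {
    // Build the connection properties that scope a Phoenix JDBC
    // connection to one tenant ("TenantId" is the Phoenix property name).
    static Properties forTenant(String tenantId) {
        Properties props = new Properties();
        props.setProperty("TenantId", tenantId);
        return props;
    }

    public static void main(String[] args) throws Exception {
        Properties props = forTenant("myTenant"); // placeholder tenant id
        System.out.println(props.getProperty("TenantId"));
        // Actually connecting requires a running cluster; the quorum
        // address here is a placeholder:
        // java.sql.Connection conn = java.sql.DriverManager.getConnection(
        //     "jdbc:phoenix:zkHost:2181", props);
        // All queries on conn are then implicitly scoped to that tenant.
    }
}
```

Whether the DataFrameReader accepts an equivalent option is exactly the open
question here.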

Appreciate any inputs or suggestions!
