phoenix-user mailing list archives

From Sateesh Karuturi <sateesh.karutu...@gmail.com>
Subject Phoenix-spark read example in spark-2.0.0
Date Sat, 18 Mar 2017 20:58:18 GMT
Hello friends,

I am very new to Apache Phoenix, and I just ran the sample phoenix-spark
read example on Spark 1.6; it worked. Now I want to run the same example on
Spark 2.0.0. Does Phoenix provide support for Spark 2.0.0?

Previously, in Spark 1.6, I used this code:

DataFrame fromPhx = context.read().format("org.apache.phoenix.spark")
        .options(ImmutableMap.of(
                "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                "zkUrl", "jdbc:phoenix:localhost:2181",
                "table", "SAMPLE"))
        .load();


In Spark 2.0.0 I have:

org.apache.spark.sql.Dataset<Row> df = spark.read().format("org.apache.phoenix.spark")
        .options(ImmutableMap.of(
                "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                "zkUrl", "jdbc:phoenix:localhost:2181",
                "table", "SAMPLE"))
        .load();
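As an aside, the options map does not have to come from Guava's ImmutableMap; in Spark 2.x, DataFrameReader.options(...) also accepts a plain java.util.Map<String, String>. A minimal sketch of building the same options without the Guava dependency (the zkUrl and table values here are just the ones from my example):

```java
import java.util.HashMap;
import java.util.Map;

public class PhoenixOptions {
    // Build the phoenix-spark read options as a plain java.util.Map,
    // suitable for passing to spark.read().options(...).
    public static Map<String, String> phoenixReadOptions(String zkUrl, String table) {
        Map<String, String> opts = new HashMap<>();
        opts.put("driver", "org.apache.phoenix.jdbc.PhoenixDriver");
        opts.put("zkUrl", zkUrl);
        opts.put("table", table);
        return opts;
    }

    public static void main(String[] args) {
        Map<String, String> opts =
                phoenixReadOptions("jdbc:phoenix:localhost:2181", "SAMPLE");
        // These are exactly the three keys the phoenix-spark reader expects.
        System.out.println(opts.get("driver"));
        System.out.println(opts.get("zkUrl"));
        System.out.println(opts.get("table"));
    }
}
```
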


Is this correct, or do I need to change the code?

Please help me out.
