phoenix-user mailing list archives

From Divya Gehlot <>
Subject Fwd: [HELP:]Save Spark Dataframe in Phoenix Table
Date Sat, 09 Apr 2016 13:05:49 GMT
Reposting for the benefit of other users
---------- Forwarded message ----------
From: Divya Gehlot <>
Date: 8 April 2016 at 19:54
Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table
To: Josh Mahonin <>

Hi Josh,
I am doing it in the same manner as described in the Phoenix Spark documentation,
using the latest version of HDP 2.3.4.
If this were a version mismatch or missing phoenix-spark support, it should have
thrown an error on read as well,
but reads are working fine as expected.
I will pass on the code snippets once I log on to my system.
In the meantime, I would like to clarify the zkUrl parameter. If I build it
from the HBaseConfiguration, passing the ZooKeeper quorum, znode, and port,
it throws an error: for example, localhost:2181/hbase-unsecure,
where localhost gets replaced by the full quorum,
like quorum1,quorum2:2181/hbase-unsecure.

For now I am just providing the IP address of my HBase master.

I feel I am not on the right track, so I am asking for help:
how do I connect to Phoenix through Spark on a Hadoop cluster?
Thanks for the help.
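
Roughly, what I mean by building it from the HBaseConfiguration is something
like the sketch below (the property names are the standard HBase ones; the
quorum, port, and znode values shown in comments are placeholders for my
cluster's values):

```scala
// Sketch: derive phoenix-spark's zkUrl from an HBaseConfiguration instead
// of hard-coding the HBase master IP. Quorum/port/znode values are
// placeholders; the property names are standard HBase configuration keys.
import org.apache.hadoop.hbase.HBaseConfiguration

val conf   = HBaseConfiguration.create()
val quorum = conf.get("hbase.zookeeper.quorum")                 // e.g. "quorum1,quorum2"
val port   = conf.get("hbase.zookeeper.property.clientPort", "2181")
val znode  = conf.get("zookeeper.znode.parent", "/hbase")       // "/hbase-unsecure" on HDP
val zkUrl  = s"$quorum:$port:$znode"
```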
On Apr 8, 2016 7:06 PM, "Josh Mahonin" <> wrote:

> Hi Divya,
> That's strange. Are you able to post a snippet of your code to look at?
> And are you sure that you're saving the dataframes as per the docs?
> Depending on your HDP version, it may or may not actually have
> phoenix-spark support. Double-check that your Spark configuration is setup
> with the right worker/driver classpath settings, and that the phoenix JARs
> contain the necessary phoenix-spark classes
> (e.g. org.apache.phoenix.spark.PhoenixRelation). If not, I suggest
> following up with Hortonworks.
> Josh
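
A quick way to perform the class check Josh suggests, assuming you can open a
spark-shell on the cluster (the class name is the one he cites; anything else
here is illustrative):

```scala
// Run in spark-shell: verifies the phoenix-spark classes are visible on the
// driver classpath. A ClassNotFoundException means the Phoenix client JAR is
// missing from spark.driver.extraClassPath / spark.executor.extraClassPath.
try {
  Class.forName("org.apache.phoenix.spark.PhoenixRelation")
  println("phoenix-spark support found")
} catch {
  case _: ClassNotFoundException =>
    println("phoenix-spark classes missing; check your classpath settings")
}
```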
> On Fri, Apr 8, 2016 at 1:22 AM, Divya Gehlot <>
> wrote:
>> Hi,
>> I have a Hortonworks Hadoop cluster with the following configuration:
>> Spark 1.5.2
>> HBASE 1.1.x
>> Phoenix 4.4
>> I am able to connect to Phoenix through a JDBC connection and can read
>> the Phoenix tables.
>> But while writing the data back to a Phoenix table,
>> I get the following error:
>> org.apache.spark.sql.AnalysisException:
>> org.apache.phoenix.spark.DefaultSource does not allow user-specified
>> schemas.;
>> Can anybody help in resolving the above error, or suggest another way of
>> saving Spark DataFrames to Phoenix?
>> Would really appreciate the help.
>> Thanks,
>> Divya
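
For readers finding this thread later: the documented save path for
phoenix-spark 4.4 looks roughly like the sketch below. The table name and
quorum are placeholders, and the target table must already exist. Note that
the connector only accepts SaveMode.Overwrite, and that no explicit
`.schema(...)` call should appear on the read or write path, since the
AnalysisException above is Spark's standard complaint when a user-specified
schema reaches a data source that does not support one.

```scala
// Sketch only: assumes TABLE1 already exists in Phoenix with columns
// matching the DataFrame, and that the phoenix-client JAR is on the Spark
// driver and worker classpaths. zkUrl is a placeholder quorum string.
import org.apache.spark.sql.{DataFrame, SaveMode}

def saveToPhoenix(df: DataFrame): Unit = {
  df.write
    .format("org.apache.phoenix.spark")
    .mode(SaveMode.Overwrite)  // the only mode phoenix-spark accepts
    .option("table", "TABLE1")
    .option("zkUrl", "quorum1,quorum2:2181:/hbase-unsecure")
    .save()
}
```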
