phoenix-user mailing list archives

From Josh Elser <els...@apache.org>
Subject Re: Phoenix Mapreduce
Date Fri, 29 Dec 2017 15:18:06 GMT
Hey Anil,

Check out the MultiHfileOutputFormat class.

You can see how AbstractBulkLoadTool invokes it inside the `submitJob` 
method.
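
For illustration, here is a rough sketch of the pattern AbstractBulkLoadTool follows, adapted to reading through Phoenix and writing HFiles instead of committing upserts. This is not the actual Phoenix code; the mapper, writable, table names, and paths (`MyWritable`, `MyMapper`, `SOURCE_TABLE`, `TARGET_TABLE`, `/tmp/hfiles`) are hypothetical placeholders, and the job obviously needs a live HBase/Phoenix cluster to run:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;

public class PhoenixToHFilesJob {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "phoenix-to-hfiles");
        job.setJarByClass(PhoenixToHFilesJob.class);

        // Read input rows through Phoenix rather than raw HBase scans.
        // MyWritable is a hypothetical DBWritable for the source schema.
        PhoenixMapReduceUtil.setInput(job, MyWritable.class,
                "SOURCE_TABLE", "SELECT ID, VAL FROM SOURCE_TABLE");

        // MyMapper (hypothetical) emits ImmutableBytesWritable / KeyValue
        // pairs keyed by the target row key, as HFileOutputFormat2 expects.
        job.setMapperClass(MyMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);

        TableName target = TableName.valueOf("TARGET_TABLE");
        try (Connection hconn = ConnectionFactory.createConnection(conf);
             Table table = hconn.getTable(target);
             RegionLocator locator = hconn.getRegionLocator(target)) {
            // Configures the reducer, partitioner, and output format so the
            // emitted KeyValues land in HFiles aligned to region boundaries.
            HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
            FileOutputFormat.setOutputPath(job, new Path("/tmp/hfiles"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
}
```

Once the job finishes, the HFiles under the output path can be moved into the table with the standard HBase bulk-load step (e.g. the `completebulkload` tool), which avoids the per-batch commit cost entirely.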

On 12/28/17 5:33 AM, Anil wrote:
> Hi Team,
> 
> I was looking at PhoenixOutputFormat and PhoenixRecordWriter.java and 
> could not see where the connection's autocommit is set to false. Did I 
> miss something here?
> 
> Is there any way to read from a Phoenix table and create HFiles for 
> bulk import, instead of committing every batch of records?
> 
> I have written a MapReduce job to create datasets for my target table. 
> Loading the data into the target table takes a long time, and I want to 
> reduce that time by avoiding statement execution and frequent commits.
> 
> Any help would be appreciated. thanks.
> 
> Thanks,
> Anil
> 
> 
