phoenix-user mailing list archives

From "Ciureanu, Constantin (GfK)" <>
Subject MapReduce bulk load into Phoenix table
Date Tue, 13 Jan 2015 09:12:13 GMT
Hello all,

Since Phoenix JDBC is slow for me (a single machine manages only ~1000-1500 rows/sec), I am
also reading up on loading data into Phoenix via MapReduce.

So far I understand the flow as: the Key + List<[Key,Value]> entries to be inserted into the
HBase table are obtained via a "dummy" Phoenix connection, those rows are then written out as
HFiles, and after the MR job finishes the HFiles are bulk-loaded into HBase in the usual way.
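In case it helps anyone else reading along, here is a sketch of the "dummy connection" step as I understand it. This is not code from any particular job; the class and method names (`PhoenixRuntime.getUncommittedDataIterator`) are from the Phoenix 4.x client API and may differ in other versions, and `MY_TABLE` / the ZooKeeper quorum are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Iterator;
import java.util.List;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.util.Pair;
import org.apache.phoenix.util.PhoenixRuntime;

public class DummyConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder ZK quorum; nothing is ever committed over this connection.
        Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181");
        conn.setAutoCommit(false); // keep the mutations client-side only

        PreparedStatement ps = conn.prepareStatement(
                "UPSERT INTO MY_TABLE (ID, NAME) VALUES (?, ?)");
        ps.setLong(1, 1L);
        ps.setString(2, "example");
        ps.execute();

        // Drain the uncommitted mutations as row-key -> KeyValue lists; in a real
        // mapper each KeyValue would be emitted for HFileOutputFormat instead.
        Iterator<Pair<byte[], List<KeyValue>>> it =
                PhoenixRuntime.getUncommittedDataIterator(conn);
        while (it.hasNext()) {
            Pair<byte[], List<KeyValue>> row = it.next();
            // e.g. context.write(new ImmutableBytesWritable(row.getFirst()), kv)
        }

        conn.rollback(); // discard: the cluster never sees these rows
        conn.close();
    }
}
```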

My question: is there a better / faster approach? I assume this one cannot reach the maximum
speed at which data can be loaded into a Phoenix / HBase table.
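For context, the closest thing I have found to a ready-made solution is the CsvBulkLoadTool
shipped with Phoenix (4.0+), which wraps this same HFile flow. A typical invocation, with
placeholder table name, input path, ZK quorum, and jar version, would be something like:

```shell
# Placeholder table/paths/quorum; the phoenix client jar version varies per install.
HADOOP_CLASSPATH=$(hbase classpath) hadoop jar phoenix-4.2.2-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table MY_TABLE \
    --input /data/my_table.csv \
    --zookeeper zk-host:2181
```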

Also, I would like to find better / newer sample code than this one:

Thank you,