phoenix-user mailing list archives

From local host <>
Subject Re: Analyse the Phoenix-inserted data using Pig
Date Mon, 24 Mar 2014 16:21:27 GMT
Thanks, Ravi, for the update.

If you can share the expected release date for PhoenixHbaseLoader, that
would be great.

I am eager to use Phoenix in my current project, but first I want to
understand the extra work Phoenix does while inserting records into an
HBase table, so that I can freely use other batch-analysis tools such as
Pig, Impala, and Hive.
*In short, I want to know whether Phoenix and these other tools are
interoperable.*
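[For readers landing on this thread: the core of the interoperability
question is that Phoenix writes column values in its own typed binary
serialization rather than as printable strings, so a plain Pig or Hive scan
over the underlying HBase table sees opaque bytes unless it decodes them
with Phoenix's rules. The sketch below is modeled on Phoenix's INTEGER
encoding as commonly described (a big-endian value with the sign bit
flipped so byte order matches numeric order); `encode_int`/`decode_int`
are illustrative names, not Phoenix APIs.]

```python
import struct

def encode_int(v: int) -> bytes:
    """Sketch of a Phoenix-style INTEGER encoding: 4-byte big-endian
    with the sign bit flipped, so unsigned byte-wise comparison of the
    stored cells matches signed numeric order."""
    return struct.pack(">I", (v & 0xFFFFFFFF) ^ 0x80000000)

def decode_int(b: bytes) -> int:
    """Invert encode_int: flip the sign bit back and re-sign the value."""
    u = struct.unpack(">I", b)[0] ^ 0x80000000
    return u - 0x100000000 if u >= 0x80000000 else u

# Byte order tracks numeric order, but a raw HBase/Pig reader sees
# binary, not strings -- hence the need for a Phoenix-aware loader:
assert encode_int(-1) < encode_int(0) < encode_int(1)
assert decode_int(encode_int(-42)) == -42
```

This is why a Phoenix-aware loader (such as the PhoenixHbaseLoader
discussed below) is the safer path than reading the table bytes directly.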

On Fri, Mar 21, 2014 at 7:37 PM, Ravi Kiran <> wrote:

> Hi
>    We are currently working on a PhoenixHbaseLoader to load data
> from HBase using Pig.
> Regards
> Ravi
> On Sat, Mar 22, 2014 at 5:19 AM, local host <> wrote:
>> Hey All,
>> *How can I analyze HBase data that was inserted through the Phoenix
>> JDBC driver, using Pig?*
>> I want to do batch processing on the HBase data using Pig and correct
>> the maintained counters.
>> In short, I want to know what extra work Phoenix does in an HBase table
>> at insertion time that requires extra steps when I analyze the data with
>> other MapReduce tools such as Hive, Pig, Drill, etc.
>> --UniLocal
