phoenix-user mailing list archives

From Moustafa Aboul Atta <m.aboula...@gmail.com>
Subject Re: Reading Numerical Values Written Through Hive
Date Thu, 27 Nov 2014 09:01:45 GMT
UNSIGNED_INT and UNSIGNED_LONG do indeed work. Thanks
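
For anyone hitting this later, the mismatch can be sketched in a few lines. The sketch below assumes the usual behavior: Hive's HBaseStorageHandler writes ints via HBase's Bytes.toBytes (4-byte big-endian two's complement), Phoenix's signed INTEGER flips the sign bit so values sort correctly as unsigned bytes, and Phoenix's UNSIGNED_INT is plain big-endian. The helper names here are illustrative, not actual Phoenix APIs:

```python
import struct

def hbase_bytes_int(v):
    # What HBase's Bytes.toBytes(int) produces:
    # 4-byte big-endian two's complement.
    return struct.pack(">i", v)

def phoenix_integer(v):
    # Phoenix INTEGER (assumed encoding): big-endian with the sign
    # bit flipped, so negative values sort before positive ones
    # when compared as raw bytes.
    return struct.pack(">I", (v ^ (1 << 31)) & 0xFFFFFFFF)

def phoenix_unsigned_int(v):
    # Phoenix UNSIGNED_INT: plain big-endian, non-negative only.
    if v < 0:
        raise ValueError("UNSIGNED_INT cannot hold negative values")
    return struct.pack(">I", v)

print(hbase_bytes_int(42).hex())       # 0000002a
print(phoenix_integer(42).hex())       # 8000002a  (high bit differs)
print(phoenix_unsigned_int(42).hex())  # 0000002a  (matches Hive's bytes)
```

So for non-negative values the Hive-written bytes line up with UNSIGNED_INT, while signed INTEGER disagrees on every value because of the flipped sign bit, which is why the signed mappings read garbage.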

On Mon, Nov 24, 2014 at 4:51 PM, Abe Weinograd <abe@flonet.com> wrote:

> Phoenix serializes INTEGER and BIGINT differently than Bytes.toBytes() does.
>
> I believe this will only work if you use UNSIGNED_INT and UNSIGNED_LONG,
> but that requires that you have no negative numbers.
>
> Abe
>
> On Mon, Nov 24, 2014 at 8:47 AM, Moustafa Aboul Atta <
> m.aboulatta@gmail.com> wrote:
>
>> Hello, I have Hive running on top of HBase through
>> org.apache.hadoop.hive.hbase.HBaseStorageHandler.
>>
>> String columns in Hive are mapped to string columns in HBase, which are
>> mapped to VARCHAR columns in Phoenix.
>> Numeric columns (INT, BIGINT) are mapped to binary columns in HBase,
>> which are mapped to INTEGER and BIGINT in Phoenix.
>>
>> When I query an existing HBase table through Phoenix whose data was
>> inserted via Hive, strings are read properly, but numeric values are not.
>> I am not sure what the problem is. It's not overflow, because the types
>> are configured properly. I suspect it may be an endianness problem, but
>> I can't find any concrete lead. Values in HBase are stored as big endian.
>>
>> Any insights would be highly appreciated. Thanks.
>>
>> --
>> Best Regards,
>> Moustafa Aboul Atta
>>
>
>


-- 
Best Regards,
Moustafa Aboul Atta
