phoenix-user mailing list archives

From Abe Weinograd <...@flonet.com>
Subject Re: Reading Numerical Values Written Through Hive
Date Mon, 24 Nov 2014 14:51:30 GMT
Phoenix serializes INTEGER and BIGINT differently than HBase's Bytes.toBytes(): for signed types, Phoenix flips the sign bit so that values sort correctly as raw bytes.

I believe this will only work if you use UNSIGNED_INT and UNSIGNED_LONG,
whose serialization matches Bytes.toBytes(), but that requires that you
never store negative numbers.
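The mismatch above can be sketched in plain Java. This is a minimal illustration, not Phoenix's actual serializer classes: the sign-bit flip reflects my understanding of how Phoenix encodes signed INTEGER values, and the method names here are purely illustrative.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class IntEncodingSketch {

    // Hive/HBase style: Bytes.toBytes(int) writes big-endian two's complement.
    static byte[] hbaseBytes(int v) {
        return ByteBuffer.allocate(4).putInt(v).array();
    }

    // Phoenix signed INTEGER (sketch): same big-endian layout, but with the
    // sign bit inverted so that negative values sort before positive ones
    // when the bytes are compared as unsigned.
    static byte[] phoenixBytes(int v) {
        return ByteBuffer.allocate(4).putInt(v ^ Integer.MIN_VALUE).array();
    }

    public static void main(String[] args) {
        // The same logical value 1 produces different bytes under each scheme,
        // which is why Phoenix misreads integers that Hive wrote.
        System.out.println(Arrays.toString(hbaseBytes(1)));   // [0, 0, 0, 1]
        System.out.println(Arrays.toString(phoenixBytes(1))); // [-128, 0, 0, 1]
    }
}
```

Reading Hive-written bytes as a Phoenix INTEGER effectively applies the sign-bit flip in reverse, so every value comes back shifted by 2^31; the UNSIGNED_INT encoding omits the flip, which is why it lines up with Bytes.toBytes() for non-negative values.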

Abe

On Mon, Nov 24, 2014 at 8:47 AM, Moustafa Aboul Atta <m.aboulatta@gmail.com>
wrote:

> Hello, I have Hive running on top of HBase through
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.
>
> String columns in Hive are mapped to string columns in HBase, which are
> mapped to VARCHAR columns in Phoenix.
> Numeric columns (INT, BIGINT) are mapped to binary columns in HBase, which
> are mapped to INTEGER and BIGINT columns in Phoenix.
>
> When I query an existing HBase table through Phoenix whose data was
> inserted via Hive, strings are read properly, but numerical values are
> not. I am not sure what the problem is. It is not overflow, because the
> types are configured properly. I suspect it may be an endianness problem,
> but I can't find any concrete lead. Values in HBase are stored big-endian.
>
> Any insights would be highly appreciated. Thanks.
>
> --
> Best Regards,
> Moustafa Aboul Atta
>
