phoenix-user mailing list archives

From: Ravi Kiran <maghamraviki...@gmail.com>
Subject: Re: Phoenix Pig Integration: Error in readFields java.lang.NegativeArraySizeException: -1
Date: Tue, 01 Jul 2014 22:16:41 GMT
Hi Anil,

   True. It shouldn't be a problem.
   I will test it and let you know soon.

Regards
Ravi


On Tue, Jul 1, 2014 at 3:00 PM, anil gupta <anilgupta84@gmail.com> wrote:

> One more question:
> The views that I am creating only have a few columns of the original
> HBase tables. I don't think that should be a problem, right?
> For example: one row in an HBase table can have 100 columns, but my
> Phoenix view maps only 10 of them.
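>
> Something like this sketch (table and column names here are made up just
> to illustrate the subset mapping; my actual DDL is in the message below):
> CREATE VIEW "wide_table" (
>     pk VARCHAR PRIMARY KEY,
>     "cf"."col_1" VARCHAR,
>     "cf"."col_2" BIGINT);
> -- the underlying HBase table may hold 100 columns; the view maps only 3.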
>
>
> On Tue, Jul 1, 2014 at 2:58 PM, anil gupta <anilgupta84@gmail.com> wrote:
>
>> Hi Ravi,
>>
>> Here is the sample DDL for the table:
>> CREATE VIEW test_table (
>>     Rowkey_Prefix_String VARCHAR NOT NULL, rev_time BIGINT NOT NULL,
>>     ora_id BIGINT NOT NULL, src_db_id SMALLINT, t.big_decimal_val DECIMAL,
>>     UNSIGNED_DATE_COLUMN UNSIGNED_DATE, t.OID VARCHAR
>>     CONSTRAINT pk PRIMARY KEY (Rowkey_Prefix_String, rev_time, ora_id));
>>
>>
>>
>> On Tue, Jul 1, 2014 at 2:48 PM, Ravi Kiran <maghamravikiran@gmail.com>
>> wrote:
>>
>>> Hi Anil,
>>>    Can you please share the table DDL to help me write tests to see
>>> where the issue is?
>>>
>>> Regards
>>> Ravi
>>>
>>>
>>> On Tue, Jul 1, 2014 at 1:58 PM, anil gupta <anilgupta84@gmail.com>
>>> wrote:
>>>
>>>> I tried one more script:
>>>> A = load 'hbase://query/SELECT * from test_table' using
>>>> org.apache.phoenix.pig.PhoenixHBaseLoader('ZK');
>>>> grpd = GROUP A BY Rowkey_Prefix_String;
>>>> cnt = FOREACH grpd GENERATE group AS Rowkey_Prefix_String, COUNT(A);
>>>> DUMP cnt;
>>>>
>>>> Rowkey_Prefix_String is the prefix of my composite rowkey.
>>>>
>>>> Again, I got the same error:
>>>> ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable - Error in readFields
>>>> java.io.IOException: Error in readFields
>>>>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:698)
>>>>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:596)
>>>>     at org.apache.hadoop.hbase.client.coprocessor.ExecResult.readFields(ExecResult.java:83)
>>>>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>>>>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:333)
>>>>     at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:383)
>>>>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
>>>> Caused by: java.lang.NegativeArraySizeException: -1
>>>>     at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:175)
>>>>     at org.apache.phoenix.schema.PColumnImpl.readFields(PColumnImpl.java:157)
>>>>     at org.apache.phoenix.schema.PTableImpl.readFields(PTableImpl.java:721)
>>>>     at org.apache.phoenix.coprocessor.MetaDataProtocol$MetaDataMutationResult.readFields(MetaDataProtocol.java:161)
>>>>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>>>>     ... 6 more
>>>>
>>>> I am able to query this table from sqlline.
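>>>> For reference, this is roughly how I query it from sqlline ('ZK' is a
>>>> placeholder for my actual ZooKeeper quorum):
>>>> $ sqlline.py ZK
>>>> 0: jdbc:phoenix:ZK> SELECT * FROM test_table LIMIT 10;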
>>>>
>>>> Thanks,
>>>> Anil Gupta
>>>>
>>>>
>>>>
>>>>
>>>> On Tue, Jul 1, 2014 at 1:51 PM, anil gupta <anilgupta84@gmail.com>
>>>> wrote:
>>>>
>>>> > Hi All,
>>>> >
>>>> > I am running HBase 0.94.15 and the latest Phoenix 3.1 nightly build. I
>>>> > have to use Pig on Phoenix views. When I run the job, I get the
>>>> > following error:
>>>> > ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable - Error in readFields
>>>> > java.lang.NegativeArraySizeException: -1
>>>> >     at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:175)
>>>> >     at org.apache.phoenix.schema.PColumnImpl.readFields(PColumnImpl.java:157)
>>>> >     at org.apache.phoenix.schema.PTableImpl.readFields(PTableImpl.java:721)
>>>> >     at org.apache.phoenix.coprocessor.MetaDataProtocol$MetaDataMutationResult.readFields(MetaDataProtocol.java:161)
>>>> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>>>> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:596)
>>>> >     at org.apache.hadoop.hbase.client.coprocessor.ExecResult.readFields(ExecResult.java:83)
>>>> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>>>> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:333)
>>>> >     at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:383)
>>>> >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
>>>> >
>>>> > Here is my Pig script:
>>>> > A = load 'hbase://query/SELECT * from test_table' using
>>>> > org.apache.phoenix.pig.PhoenixHBaseLoader('ZK');
>>>> > grpd = GROUP A BY UNSIGNED_DATE_COLUMN;
>>>> > cnt = FOREACH grpd GENERATE group AS UNSIGNED_DATE_COLUMN, COUNT(A);
>>>> > DUMP cnt;
>>>> >
>>>> > Can anyone tell me what the issue is over here? I suspect that maybe
>>>> > the UNSIGNED_DATE column type is not supported in the Pig integration?
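>>>> >
>>>> > If UNSIGNED_DATE is the culprit, I can try excluding that column from
>>>> > the query as a quick check, e.g. (column names per my view DDL):
>>>> > A = load 'hbase://query/SELECT Rowkey_Prefix_String, rev_time, ora_id FROM test_table' using org.apache.phoenix.pig.PhoenixHBaseLoader('ZK');
>>>> > grpd = GROUP A BY Rowkey_Prefix_String;
>>>> > cnt = FOREACH grpd GENERATE group, COUNT(A);
>>>> > DUMP cnt;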
>>>> >
>>>> >
>>>> > --
>>>> > Thanks & Regards,
>>>> > Anil Gupta
>>>> >
>>>>
>>>>
>>>>
>>>> --
>>>> Thanks & Regards,
>>>> Anil Gupta
>>>>
>>>
>>>
>>
>>
>> --
>> Thanks & Regards,
>> Anil Gupta
>>
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>
