phoenix-user mailing list archives

From Ravi Kiran <maghamraviki...@gmail.com>
Subject Re: Phoenix Pig Integration: Error in readFields java.lang.NegativeArraySizeException: -1
Date Tue, 01 Jul 2014 21:48:30 GMT
Hi Anil,
   Can you please share the table DDL to help me write tests to see where
the issue is?
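For reference, this is roughly the shape I would guess from your description
(just a hypothetical sketch; apart from test_table, Rowkey_Prefix_String and
the UNSIGNED_DATE column you mentioned, every name below is made up):

    -- hypothetical schema: composite rowkey led by a VARCHAR prefix,
    -- plus an UNSIGNED_DATE column
    CREATE TABLE test_table (
        rowkey_prefix_string VARCHAR NOT NULL,
        event_id VARCHAR NOT NULL,
        unsigned_date_column UNSIGNED_DATE,
        some_value VARCHAR,
        CONSTRAINT pk PRIMARY KEY (rowkey_prefix_string, event_id)
    );

If you are actually querying a Phoenix view over an existing HBase table, the
CREATE VIEW statement for it would be just as useful.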

Regards
Ravi


On Tue, Jul 1, 2014 at 1:58 PM, anil gupta <anilgupta84@gmail.com> wrote:

> I tried one more script:
> A = load 'hbase://query/SELECT * from test_table' using
>     org.apache.phoenix.pig.PhoenixHBaseLoader('ZK');
> grpd = GROUP A BY Rowkey_Prefix_String;
> cnt = FOREACH grpd GENERATE group AS Rowkey_Prefix_String,COUNT(A);
> DUMP cnt;
>
> Rowkey_Prefix_String is the prefix of my composite rowkey.
>
> Again, I got the same error:
> ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable - Error in readFields
> java.io.IOException: Error in readFields
>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:698)
>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:596)
>     at org.apache.hadoop.hbase.client.coprocessor.ExecResult.readFields(ExecResult.java:83)
>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:333)
>     at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:383)
>     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
> Caused by: java.lang.NegativeArraySizeException: -1
>     at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:175)
>     at org.apache.phoenix.schema.PColumnImpl.readFields(PColumnImpl.java:157)
>     at org.apache.phoenix.schema.PTableImpl.readFields(PTableImpl.java:721)
>     at org.apache.phoenix.coprocessor.MetaDataProtocol$MetaDataMutationResult.readFields(MetaDataProtocol.java:161)
>     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
>     ... 6 more
>
> I am able to query this table from sqlline.
>
> Thanks,
> Anil Gupta
>
>
>
>
> On Tue, Jul 1, 2014 at 1:51 PM, anil gupta <anilgupta84@gmail.com> wrote:
>
> > Hi All,
> >
> > I am running HBase 0.94.15 and the latest Phoenix 3.1 nightly build. I have
> > to use Pig on Phoenix views. When I run the job, I get the following error:
> > ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable - Error in readFields
> > java.lang.NegativeArraySizeException: -1
> >     at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:175)
> >     at org.apache.phoenix.schema.PColumnImpl.readFields(PColumnImpl.java:157)
> >     at org.apache.phoenix.schema.PTableImpl.readFields(PTableImpl.java:721)
> >     at org.apache.phoenix.coprocessor.MetaDataProtocol$MetaDataMutationResult.readFields(MetaDataProtocol.java:161)
> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:596)
> >     at org.apache.hadoop.hbase.client.coprocessor.ExecResult.readFields(ExecResult.java:83)
> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:692)
> >     at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:333)
> >     at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.receiveResponse(SecureClient.java:383)
> >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
> >
> > Here is my pig script:
> > A = load 'hbase://query/SELECT * from test_table' using
> > org.apache.phoenix.pig.PhoenixHBaseLoader('ZK');
> > grpd = GROUP A BY UNSIGNED_DATE_COLUMN;
> > cnt = FOREACH grpd GENERATE group AS UNSIGNED_DATE_COLUMN,COUNT(A);
> > DUMP cnt;
> >
> > Can anyone tell me what the issue is here? I suspect that maybe the
> > UNSIGNED_DATE type column is not supported by the Pig integration?
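> > One untested idea I might try to narrow it down (not a confirmed workaround):
> > replace the SELECT * in the load string with an explicit column list that leaves
> > the UNSIGNED_DATE column out, and see whether the error still shows up. Something
> > like the query below, where EVENT_ID just stands in for whatever other columns
> > the table really has:
> >
> >     -- hypothetical column list; only the prefix column is a real name from above
> >     SELECT ROWKEY_PREFIX_STRING, EVENT_ID from test_table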
> >
> >
> > --
> > Thanks & Regards,
> > Anil Gupta
> >
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>
