phoenix-user mailing list archives

From Abe Weinograd <...@flonet.com>
Subject Re: table seems to get corrupt
Date Tue, 05 Aug 2014 22:32:53 GMT
Any idea why this is happening? All region servers have the Phoenix jars.
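One sanity check that seems worth doing is to confirm that the filter class the error complains about is actually inside the Phoenix jar each region server loads; a minimal sketch (the jar path below is a placeholder for wherever the jar lives on your install):

import java.util.jar.JarFile;

public class CheckPhoenixJar {
    public static void main(String[] args) throws Exception {
        // Placeholder path; point it at the Phoenix jar deployed on the region server.
        String jarPath = args.length > 0 ? args[0] : "/usr/lib/hbase/lib/phoenix-server.jar";
        String entry = "org/apache/phoenix/filter/ColumnProjectionFilter.class";
        try (JarFile jar = new JarFile(jarPath)) {
            System.out.println(entry + (jar.getJarEntry(entry) != null ? " is present" : " is MISSING"));
        }
    }
}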

On Sunday, August 3, 2014, Abe Weinograd <abe@flonet.com> wrote:

> We load the data via MapReduce directly into HFiles.  This has worked
> well for us for a while.  Recently, while loading a table with 99 columns
> and a 3-column composite key, the table became unqueryable at some point
> during or after the load.
>
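> For context, the hand-off of the generated HFiles into HBase looks roughly
> like the standard completebulkload step; a sketch only (table name and
> HFile directory below are placeholders):
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.client.HTable;
> import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
>
> public class CompleteBulkLoad {
>     public static void main(String[] args) throws Exception {
>         Configuration conf = HBaseConfiguration.create();
>         // The HFiles themselves are written by our own MapReduce job; this step
>         // just asks the region servers to adopt them. Names are placeholders.
>         HTable table = new HTable(conf, "MY_TABLE");
>         new LoadIncrementalHFiles(conf).doBulkLoad(new Path("/tmp/hfiles"), table);
>         table.close();
>     }
> }
>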
> When trying to run an aggregation (GROUP BY and COUNT(1)), SQuirreL dumps
> out java.lang.IllegalStateException: Expected single, aggregated KeyValue
> from coprocessor, but instead received
> keyvalues={\x00\x00\x00M\x00\x00\x01G\x92I\xEB\xEB3fcf9e6d-01f9-4bb9-9d08-f8cc95c00f28/0:ACTION_END/1406909122452/Put/vlen=8/ts=0/value=
>
> When trying to do a SELECT * LIMIT 1, we
> get org.apache.phoenix.exception.PhoenixIOException:
> org.apache.phoenix.exception.PhoenixIOException: IPC server unable to read
> call parameters: Can't find class
> org.apache.phoenix.filter.ColumnProjectionFilter
>
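> Roughly the shapes of the two statements that fail (table and column names
> below are placeholders), e.g. straight over the Phoenix JDBC driver rather
> than through SQuirreL:
>
> import java.sql.Connection;
> import java.sql.DriverManager;
> import java.sql.ResultSet;
> import java.sql.Statement;
>
> public class QueryShapes {
>     public static void main(String[] args) throws Exception {
>         // The Phoenix JDBC URL points at the ZooKeeper quorum; host is a placeholder.
>         Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
>         try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host");
>              Statement stmt = conn.createStatement()) {
>             // Aggregation of the GROUP BY / COUNT(1) shape that hits the coprocessor error.
>             ResultSet agg = stmt.executeQuery(
>                 "SELECT SOME_COL, COUNT(1) FROM MY_TABLE GROUP BY SOME_COL");
>             while (agg.next()) {
>                 System.out.println(agg.getString(1) + " -> " + agg.getLong(2));
>             }
>             // Simple projection that hits the ColumnProjectionFilter error.
>             ResultSet one = stmt.executeQuery("SELECT * FROM MY_TABLE LIMIT 1");
>             if (one.next()) {
>                 System.out.println(one.getString(1));
>             }
>         }
>     }
> }
>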
> This has happened a couple of times, and the only way we have found to get
> querying via Phoenix working again is to drop and recreate the table.  I can
> scan and work with the data fine directly through HBase and the HBase shell.
>
> I'm having trouble getting anywhere debugging this myself.  Any thoughts or
> help would be greatly appreciated.
>
> Abe
>


-- 
Sent from MetroMail
