phoenix-user mailing list archives

From Liren Sun <sunliren2...@gmail.com>
Subject Re: Problem after phoenix upgrade from 2.2.2 to 3.1
Date Wed, 15 Oct 2014 07:02:40 GMT
It turns out that the jar on one host was corrupted during the copy. After
copying it again and restarting, the problem is gone.
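
In case anyone else hits the same symptom: a corrupted copy can be ruled out
quickly by comparing a checksum of the jar on every RegionServer against the
original file. A minimal sketch (the helper class name is made up; the jar path
is passed as an argument):

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.security.MessageDigest;

    // Prints the SHA-256 of the given jar; run it against the copy on each
    // host and compare the output with the checksum of the original file.
    public class JarChecksum {
        public static void main(String[] args) throws Exception {
            Path jar = Paths.get(args[0]);
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            try (InputStream in = Files.newInputStream(jar)) {
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) > 0; ) {
                    sha.update(buf, 0, n);
                }
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : sha.digest()) {
                hex.append(String.format("%02x", b));
            }
            System.out.println(hex + "  " + jar);
        }
    }

A mismatching checksum on one host points straight at a bad copy.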

On Tue, Oct 14, 2014 at 11:32 PM, Liren Sun <sunliren2005@gmail.com> wrote:

> Taking dev off the thread.
>
> I saw this in my RegionServer log, but I am sure the jar is on the classpath;
> I verified it by running hbase classpath.
>
> 2014-10-15 06:21:44,011 WARN org.apache.hadoop.ipc.HBaseServer: Unable to read call parameters for client 10.9.216.33
> java.io.IOException: Error in readFields
>         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:699)
>         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:126)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1311)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1226)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:748)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:539)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:514)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> Caused by: java.lang.RuntimeException: Can't find class org.apache.phoenix.filter.SkipScanFilter
>         at org.apache.hadoop.hbase.client.Scan.createForName(Scan.java:584)
>         at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:602)
>         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:696)
>         ... 9 more
> 2014-10-15 06:22:48,567 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
> java.lang.RuntimeException: Can't find class org.apache.phoenix.filter.SkipScanFilter
>         at org.apache.hadoop.hbase.client.Scan.createForName(Scan.java:584)
>         at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:602)
>         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:696)
>         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:126)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1311)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1226)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:748)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:539)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:514)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
>
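
For the "Can't find class org.apache.phoenix.filter.SkipScanFilter" error above,
one quick sanity check is to confirm the class is actually loadable with the
RegionServer's classpath, not just that the jar file is present. A rough sketch
(the helper class name is made up; compile it and run it on the affected host
with java -cp ".:$(hbase classpath)" ClassCheck):

    // Tries to load the Phoenix filter class with whatever classpath the JVM
    // was started with; a corrupted jar typically fails here even though the
    // file exists on disk.
    public class ClassCheck {
        public static void main(String[] args) {
            String name = "org.apache.phoenix.filter.SkipScanFilter";
            try {
                Class.forName(name);
                System.out.println("OK: " + name + " is loadable");
            } catch (Throwable t) {
                System.out.println("FAILED to load " + name + ": " + t);
            }
        }
    }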
>
> On Tue, Oct 14, 2014 at 9:54 PM, Liren Sun <sunliren2005@gmail.com> wrote:
>
>> Hi,
>> We upgraded our Phoenix today from 2.2.2 to 3.1. Since the auto-upgrade was
>> not working in 3.1, I did an upgrade from 2.2.2 to 3.0 first. The initial
>> connection to Phoenix seemed to set up the metadata: I could see the
>> upgraded table's attributes changed to the new package, and I could see the
>> table schema through !table in sqlline. Without making a query against the
>> upgraded table in 3.0, I then upgraded to Phoenix 3.1.
>> Now, when I try to query the table, the query takes a long time and
>> eventually dies with this exception:
>> +----------------------+
>> java.lang.RuntimeException: org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=14, exceptions:
>> Wed Oct 15 02:44:54 UTC 2014, org.apache.hadoop.hbase.client.ScannerCallable@da77dcf, java.io.IOException: IPC server unable to read call parameters: Error in readFields
>> Wed Oct 15 02:44:55 UTC 2014, org.apache.hadoop.hbase.client.ScannerCallable@da77dcf, java.io.IOException: IPC server unable to read call parameters: Error in readFields
>>
>>         at sqlline.SqlLine$IncrementalRows.hasNext(SqlLine.java:2440)
>>         at sqlline.SqlLine$TableOutputFormat.print(SqlLine.java:2074)
>>         at sqlline.SqlLine.print(SqlLine.java:1735)
>>         at sqlline.SqlLine$Commands.execute(SqlLine.java:3683)
>>         at sqlline.SqlLine$Commands.sql(SqlLine.java:3584)
>>         at sqlline.SqlLine.dispatch(SqlLine.java:821)
>>         at sqlline.SqlLine.begin(SqlLine.java:699)
>>         at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>         at sqlline.SqlLine.main(SqlLine.java:424)
>>
>> I noticed the date type changed to UNSIGNED_DATE. We have a date column in
>> our table's row key, and it is the first column (there is a salt-bucket key
>> in front of it). We use the TO_DATE function in the WHERE clause of our
>> queries.
>>
>>
>> Thanks
>> Leo
>>
>>
>>
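
For context, the kind of query described above (TO_DATE on a leading date
row-key column of a salted table) looks roughly like this over Phoenix JDBC.
The table, column, and ZooKeeper quorum names below are placeholders, not our
real schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Opens a Phoenix JDBC connection and filters on a DATE row-key column
    // using TO_DATE, similar in shape to the queries that hung.
    public class DateFilterQuery {
        public static void main(String[] args) throws Exception {
            // Explicitly load the driver in case JDBC auto-registration is unavailable.
            Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT * FROM EVENTS"
                     + " WHERE EVENT_DATE >= TO_DATE('2014-10-01 00:00:00')"
                     + " LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }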
