phoenix-user mailing list archives

From 梁鹏程 <lian...@hotmail.com>
Subject Re: phoenix timeout exception
Date Tue, 31 Mar 2015 09:37:07 GMT
My hbase-site.xml parameters look like this. Is this OK?

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.regionserver.wal.codec</name>
    <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
  </property>
  <property>
    <name>phoenix.query.timeoutMs</name>
    <value>6000000</value>
  </property>
</configuration>
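Side note: if you want to sanity-check which timeout properties a given hbase-site.xml actually sets before restarting anything, the file is plain XML and easy to parse. A minimal sketch using only Python's standard library (the embedded config just mirrors the snippet above; this is not Phoenix or HBase API code):

```python
# Sketch: list the timeout-related properties an hbase-site.xml sets.
# The embedded CONFIG mirrors the snippet above; in practice you would
# read the real file from the Phoenix bin directory (client side).
import xml.etree.ElementTree as ET

CONFIG = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>phoenix.query.timeoutMs</name>
    <value>6000000</value>
  </property>
  <property>
    <name>hbase.rpc.timeout</name>
    <value>6000000</value>
  </property>
</configuration>"""

def timeout_properties(xml_text):
    """Return {name: value} for every property whose name mentions 'timeout'."""
    root = ET.fromstring(xml_text)
    return {
        prop.findtext("name"): prop.findtext("value")
        for prop in root.iter("property")
        if "timeout" in (prop.findtext("name") or "").lower()
    }

print(timeout_properties(CONFIG))
# {'phoenix.query.timeoutMs': '6000000', 'hbase.rpc.timeout': '6000000'}
```

As noted earlier in this thread, the copy that matters for phoenix.query.timeoutMs is the client-side hbase-site.xml (the one in the Phoenix bin directory that sqlline picks up), so that is the file worth checking.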






Regards,
Ben Liang

> On Mar 31, 2015, at 17:11, 丁桂涛(桂花) <dingguitao@baixing.com> wrote:
> 
> Add the following parameter to the hbase-site.xml in the phoenix bin directory.
> 
> <property>
>   <name>phoenix.query.timeoutMs</name>
>   <value>6000000</value>
> </property>
> 
> 
> On Tue, Mar 31, 2015 at 5:06 PM, 梁鹏程 <liangpc@hotmail.com> wrote:
> Could you give me clear steps to follow?
> I have already set hbase.rpc.timeout = 3600000 in the server-side hbase-site.xml, but it still prints the same exception message.
> 
> 
> 
> Regards,
> Ben Liang
> 
>> On Mar 31, 2015, at 15:23, Puneet Kumar Ojha <puneet.kumar@pubmatic.com> wrote:
>> 
>> As per the error below, increase the timeout from 60000 to 600,000 in the config properties.
>>  
>> From: outlook_c086a38934715939@outlook.com [mailto:outlook_c086a38934715939@outlook.com] On Behalf Of Ben Liang
>> Sent: Tuesday, March 31, 2015 12:29 PM
>> To: user@phoenix.apache.org
>> Subject: phoenix timeout exception
>>  
>>  
>> Hi all,
>>       I'm counting the rows of an HBase table (dw.DM_T, >= 290,000,000 rows). It sometimes succeeds and sometimes fails with the exception below.
>>       
>>       Could you help me solve it?
>>       
>>       Thanks.
>>  
>> 0: jdbc:phoenix:mvxl0490> select count(sales_id) from dw.DM_T;
>> +------------------------------------------+
>> |             COUNT(SALES_ID)       |
>> +------------------------------------------+
>> 15/03/31 14:38:39 WARN client.ScannerCallable: Ignore, probably already closed
>> java.io.IOException: Call to mvxl0663/10.16.1.237:60020 failed on local exception: org.apache.hadoop.hbase.ipc.RpcClient$CallTimeoutException: Call id=2487, waitTime=68225, rpcTimetout=60000
>>         at org.apache.hadoop.hbase.ipc.RpcClient.wrapException(RpcClient.java:1489)
>>         at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1461)
>>         at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
>>         at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:30387)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:291)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:160)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:117)
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:93)
>>         at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:246)
>>         at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:439)
>>         at org.apache.phoenix.iterate.ScanningResultIterator.next(ScanningResultIterator.java:47)
>>         at org.apache.phoenix.iterate.TableResultIterator.next(TableResultIterator.java:104)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:106)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:73)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:67)
>>         at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:92)
>>         at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:83)
>>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.hadoop.hbase.ipc.RpcClient$CallTimeoutException: Call id=2487, waitTime=68225, rpcTimetout=60000
>>         at org.apache.hadoop.hbase.ipc.RpcClient$Connection.cleanupCalls(RpcClient.java:1194)
>>         at org.apache.hadoop.hbase.ipc.RpcClient$Connection.readResponse(RpcClient.java:1138)
>>         at org.apache.hadoop.hbase.ipc.RpcClient$Connection.run(RpcClient.java:727)
>> 15/03/31 14:41:55 WARN client.ScannerCallable: Ignore, probably already closed
>> org.apache.hadoop.hbase.UnknownScannerException: org.apache.hadoop.hbase.UnknownScannerException: Name: 198, already closed?
>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3145)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
>>         at java.lang.Thread.run(Thread.java:745)
>>  
>>         at sun.reflect.GeneratedConstructorAccessor11.newInstance(Unknown Source)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:284)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:293)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:160)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:117)
>>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:93)
>>         at org.apache.hadoop.hbase.client.ClientScanner.close(ClientScanner.java:457)
>>         at org.apache.phoenix.iterate.ScanningResultIterator.close(ScanningResultIterator.java:41)
>>         at org.apache.phoenix.iterate.TableResultIterator.close(TableResultIterator.java:92)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:125)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:73)
>>         at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:67)
>>         at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:92)
>>         at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:83)
>>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.UnknownScannerException): org.apache.hadoop.hbase.UnknownScannerException: Name: 198, already closed?
>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3145)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
>>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
>>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
>>         at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
>>         at java.lang.Thread.run(Thread.java:745)
>>  
>>         at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1457)
>>         at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
>>         at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
>>         at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:30387)
>>         at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:291)
>>         ... 16 more
>> java.lang.RuntimeException: org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: 270225ms passed since the last invocation, timeout is currently set to 60000
>>         at sqlline.IncrementalRows.hasNext(IncrementalRows.java:73)
>>         at sqlline.TableOutputFormat.print(TableOutputFormat.java:33)
>>         at sqlline.SqlLine.print(SqlLine.java:1653)
>>         at sqlline.Commands.execute(Commands.java:833)
>>         at sqlline.Commands.sql(Commands.java:732)
>>         at sqlline.SqlLine.dispatch(SqlLine.java:808)
>>         at sqlline.SqlLine.begin(SqlLine.java:681)
>>         at sqlline.SqlLine.start(SqlLine.java:398)
>>         at sqlline.SqlLine.main(SqlLine.java:292)
>>  
>>  
>> Regards,
>> Ben Liang
> 
> 

