phoenix-user mailing list archives

From Nathan Davis <nathan.da...@salesforce.com>
Subject phoenix.query.maxServerCacheBytes not used
Date Tue, 19 Jul 2016 20:59:51 GMT
Hi,
I am running a standalone HBase locally with Phoenix installed by dropping
the jars into the HBase lib directory. I have added the following to my
hbase-site.xml and restarted HBase:

  <property>
    <name>phoenix.query.maxServerCacheBytes</name>
    <value>419430400</value>
  </property>
  <property>
    <name>phoenix.query.maxGlobalMemoryPercentage</name>
    <value>25</value>
  </property>


However, I am still getting the following error when doing a regular inner
join against a roughly 5-million-row RHS table. (Notice that the error still
says "...maximum allowed size (104857600 bytes)", i.e. the 100MB default,
even though I have changed that setting to 400MB):

java.sql.SQLException: Encountered exception in sub plan [0] execution.
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:193)
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:138)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:276)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:261)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:260)
    at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:248)
    at org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:172)
    at org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:177)
    at org.apache.phoenix.jdbc.PhoenixConnection.executeStatements(PhoenixConnection.java:354)
    at org.apache.phoenix.util.PhoenixRuntime.executeStatements(PhoenixRuntime.java:298)
    at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:243)
Caused by: org.apache.phoenix.join.MaxServerCacheSizeExceededException: Size of hash cache (104857638 bytes) exceeds the maximum allowed size (104857600 bytes)
    at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:110)
    at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:83)
    at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:381)
    at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:162)
    at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:158)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
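
For context, the join itself is nothing exotic. My real schema is different,
but the statement is shaped roughly like the sketch below (table and column
names are placeholders), run as a prepared statement the same way the trace
shows:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class JoinShape {
    public static void main(String[] args) throws Exception {
        // Placeholder shape of the failing statement: BIG_FACT is the LHS,
        // RHS_DIM stands in for the ~5M-row right-hand side that Phoenix
        // serializes into the hash cache.
        String sql = "SELECT f.ID, f.VAL, d.LABEL "
                   + "FROM BIG_FACT f "
                   + "INNER JOIN RHS_DIM d ON f.DIM_ID = d.ID";
        // Placeholder URL for a local standalone HBase.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
             PreparedStatement stmt = conn.prepareStatement(sql);
             ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                // The MaxServerCacheSizeExceededException fires while the RHS
                // hash cache is being built, before any rows come back here.
            }
        }
    }
}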


It seems like my `maxServerCacheBytes` setting is not getting picked up,
but I'm not sure why. I'm pretty new to Phoenix, so I'm sure it's something
simple...
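
In case it helps narrow things down, I figure one sanity check would be to
hand the same settings straight to the client as connection properties
instead of relying on hbase-site.xml; something like the sketch below (the
JDBC URL is a placeholder for my local standalone setup). If the 100MB limit
no longer trips, then it's presumably whichever hbase-site.xml the client
reads that is the problem:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class MaxCacheOverrideCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Same values as in hbase-site.xml, but handed directly to the
        // Phoenix client so no xml file lookup is involved.
        props.setProperty("phoenix.query.maxServerCacheBytes", "419430400");
        props.setProperty("phoenix.query.maxGlobalMemoryPercentage", "25");
        // Placeholder URL for a local standalone HBase.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost", props)) {
            // run the failing join here and see whether the
            // 104857600-byte limit still appears in the error
        }
    }
}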

Thanks up front for the help!

-Nathan Davis
