phoenix-user mailing list archives

From Nanda <tnkish...@gmail.com>
Subject org.apache.phoenix.join.MaxServerCacheSizeExceededException
Date Wed, 10 Feb 2016 12:37:02 GMT
Hi,

I am using HDP 2.3.0 with Phoenix 4.4, and I quite often get the exception below:

Caused by: java.sql.SQLException: Encountered exception in sub plan [0]
execution.
        at
org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:156)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:251)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:241)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:240)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:1223)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
com.brocade.nva.dataaccess.AbstractDAO.getResultSet(AbstractDAO.java:388)
~[nvp-data-access-1.0-SNAPSHOT.jar:na]
        at
com.brocade.nva.dataaccess.HistoryDAO.getSummaryTOP10ReportDetails(HistoryDAO.java:306)
~[nvp-data-access-1.0-SNAPSHOT.jar:na]
        ... 75 common frames omitted
Caused by: org.apache.phoenix.join.MaxServerCacheSizeExceededException:
Size of hash cache (104857651 bytes) exceeds the maximum allowed size
(104857600 bytes)
        at
org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:109)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:82)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:338)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:135)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
~[na:1.8.0_40]
        at
org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:172)
~[phoenix-core-4.4.0-HBase-1.1.jar:4.4.0-HBase-1.1]
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[na:1.8.0_40]


Below are the params I am using.

Server-side properties:
phoenix.coprocessor.maxServerCacheTimeToLiveMs=180000
phoenix.groupby.maxCacheSize=1572864000
phoenix.query.maxGlobalMemoryPercentage=60
phoenix.query.maxGlobalMemorySize=4096000000
phoenix.stats.guidepost.width=524288000


Client-side properties:
hbase.client.scanner.timeout.period=180000
phoenix.query.spoolThresholdBytes=1048576000
phoenix.query.timeoutMs=180000
phoenix.query.threadPoolSize=240
phoenix.query.maxGlobalMemoryPercentage=60
phoenix.query.maxServerCacheBytes=1048576810
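
For reference, the limit reported in the exception (104857600 bytes) is the 100 MB default for phoenix.query.maxServerCacheBytes, which suggests the override above may not be getting picked up by the client. Client-side Phoenix properties are typically set in the hbase-site.xml on the client's classpath; a sketch, assuming that file location:

```xml
<!-- hbase-site.xml on the client classpath (assumed location) -->
<configuration>
  <!-- Raise the per-query hash-join cache limit from the 100 MB default -->
  <property>
    <name>phoenix.query.maxServerCacheBytes</name>
    <value>1048576810</value>
  </property>
</configuration>
```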


My HBase heap is set to 4 GB.

Is there some property I need to set explicitly for this?

Thanks,
Nanda
