phoenix-user mailing list archives

From Josh Elser <josh.el...@gmail.com>
Subject Re: Hash join out of memory error
Date Mon, 31 Oct 2016 22:34:05 GMT
Is the directory containing the hbase-site.xml you modified included on 
your overridden CLASSPATH? How are you running this query -- is that 
directory on the classpath for that program?
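
If you're unsure which copy the JVM actually resolves, something like the
following standalone sketch (the class name is mine, not part of your
program) lists every hbase-site.xml visible on the classpath, in
resolution order; the first entry is the one HBaseConfiguration will load:

import java.net.URL;
import java.util.Enumeration;

public class WhichHbaseSite {
    public static void main(String[] args) throws Exception {
        // Ask the context classloader for every hbase-site.xml it can see.
        Enumeration<URL> urls = Thread.currentThread()
                .getContextClassLoader()
                .getResources("hbase-site.xml");
        if (!urls.hasMoreElements()) {
            System.out.println("hbase-site.xml is not on the classpath");
        }
        while (urls.hasMoreElements()) {
            // The first URL printed wins; if it isn't the file you edited,
            // your override never takes effect.
            System.out.println(urls.nextElement());
        }
    }
}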

ashish tapdiya wrote:
> Query:
>
> SELECT /*+ NO_STAR_JOIN */ IP, RANK, TOTAL
> FROM (SELECT SOURCEIPADDR AS IP, AVG(PAGERANK) AS RANK, SUM(ADREVENUE) AS TOTAL
>       FROM uservisits_sf1 AS UV
>       INNER JOIN rankings_sf1 AS R ON R.PAGEURL = UV.DESTINATIONURL
>       WHERE UV.VISITDATE >= TO_DATE('2000-01-15')
>         AND UV.VISITDATE < TO_DATE('2000-01-22')
>       GROUP BY UV.SOURCEIPADDR)
> ORDER BY TOTAL DESC LIMIT 1
>
> Execution plan:
>
> CLIENT 2-CHUNK PARALLEL 2-WAY FULL SCAN OVER USERVISITS_SF1
>     SERVER FILTER BY (VISITDATE >= DATE '2000-01-15 00:00:00.000' AND VISITDATE < DATE '2000-01-22 00:00:00.000')
>     SERVER AGGREGATE INTO DISTINCT ROWS BY [UV.SOURCEIPADDR]
> CLIENT MERGE SORT
> CLIENT TOP 1 ROW SORTED BY [SUM(UV.ADREVENUE) DESC]
>     PARALLEL INNER-JOIN TABLE 0
>         CLIENT 2-CHUNK PARALLEL 2-WAY ROUND ROBIN FULL SCAN OVER RANKINGS_SF1
>     DYNAMIC SERVER FILTER BY UV.DESTINATIONURL IN (R.PAGEURL)
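
The PARALLEL INNER-JOIN TABLE 0 step above is the problem area: for a hash
join, the client scans RANKINGS_SF1, serializes the whole thing into a hash
cache, and ships it to every region server, so the entire right-hand side
has to pass through that client-side serialization. If the build side is
too large for that, the USE_SORT_MERGE_JOIN hint makes Phoenix merge-join
the two sorted scans instead of building a cache at all. A sketch of what
that might look like (the class name and the zk-host JDBC URL are
placeholders, not from your program):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExecQuerySMJ {
    public static void main(String[] args) throws Exception {
        // Same query as before, but hinted to use a sort-merge join so no
        // hash cache is serialized on the client.
        String sql =
            "SELECT /*+ USE_SORT_MERGE_JOIN */ IP, RANK, TOTAL FROM ("
          + "SELECT SOURCEIPADDR AS IP, AVG(PAGERANK) AS RANK, "
          + "SUM(ADREVENUE) AS TOTAL FROM uservisits_sf1 AS UV "
          + "INNER JOIN rankings_sf1 AS R ON R.PAGEURL = UV.DESTINATIONURL "
          + "WHERE UV.VISITDATE >= TO_DATE('2000-01-15') "
          + "AND UV.VISITDATE < TO_DATE('2000-01-22') "
          + "GROUP BY UV.SOURCEIPADDR) ORDER BY TOTAL DESC LIMIT 1";
        // "zk-host" is a placeholder for your ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString("IP") + "\t"
                        + rs.getBigDecimal("RANK") + "\t"
                        + rs.getBigDecimal("TOTAL"));
            }
        }
    }
}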
>
> Error in the client:
>
> java.sql.SQLException: Encountered exception in sub plan [0] execution.
>          at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:199)
>          at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:143)
>          at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:138)
>          at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:281)
>          at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:266)
>          at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>          at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:265)
>          at org.apache.phoenix.jdbc.PhoenixPreparedStatement.executeQuery(PhoenixPreparedStatement.java:186)
>          at execQuery.execQueryHJ3(execQuery.java:152)
>          at execQuery.main(execQuery.java:25)
> Caused by: java.lang.OutOfMemoryError
>          at java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:123)
>          at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:117)
>          at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
>          at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
>          at java.io.DataOutputStream.write(DataOutputStream.java:107)
>          at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:153)
>          at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:108)
>          at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:83)
>          at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:385)
>          at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:167)
>          at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:163)
>          at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>          at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
>          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>          at java.lang.Thread.run(Thread.java:745)
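
Also worth noting from the trace itself: the OutOfMemoryError comes out of
ByteArrayOutputStream.hugeCapacity, which means the hash cache is being
serialized into a single byte[]. Java arrays are indexed by int, so one
buffer tops out just under 2 GB; once the requested capacity wraps
negative, hugeCapacity throws OutOfMemoryError regardless of -Xmx. With a
build side of roughly 2.9 GB (the figure you give below), this fails no
matter how large the heap is. A small arithmetic sketch (the class name is
mine; the 2.9 GB value is taken from your mail):

public class ArrayCapSketch {
    public static void main(String[] args) {
        long serializedCache = 2_900_000_000L;   // ~2.9 GB rankings_sf1
        long maxArray = Integer.MAX_VALUE;       // hard cap on byte[] length
        System.out.println("needed bytes : " + serializedCache);
        System.out.println("byte[] cap   : " + maxArray);
        System.out.println("fits in one array? " + (serializedCache <= maxArray));
        // ByteArrayOutputStream tracks its size as an int, so a request
        // past 2^31-1 wraps negative and hugeCapacity() throws OOM:
        System.out.println("int-truncated request: " + (int) serializedCache);
    }
}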
>
> The rankings_sf1 table (which has to be serialized on the client) is
> 2.9 GB. The Java client driver is started with the -Xmx12g flag. The
> Phoenix version is 4.8.1. The cluster has 3 nodes (2 slaves). The
> relevant hbase-site.xml configuration on both the client and the servers
> is as follows:
>
> <property>
>   <name>phoenix.query.maxServerCacheBytes</name>
>   <value>14088576000</value>
> </property>
> <property>
>   <name>phoenix.query.maxGlobalMemoryPercentage</name>
>   <value>90</value>
> </property>
>
>
> The bin directory contains hbase-site.xml, and its path is specified in
> .bashrc as follows:
>
> export CLASSPATH=$CLASSPATH:/home/ubuntu/phoenix/bin:/home/ubuntu/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
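
Your phoenix-4.8.1-HBase-1.2-client.jar sits after the bin directory on
that CLASSPATH, so the bin copy of hbase-site.xml should win -- but that
depends on how execQuery is actually launched, since a wrapper script or an
explicit -cp overrides the $CLASSPATH environment variable entirely. A
quick way to check is to print the effective value straight out of the
HBase configuration (standalone sketch, class name is mine):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class PrintCacheBytes {
    public static void main(String[] args) {
        // HBaseConfiguration.create() loads hbase-site.xml from the
        // classpath. If this prints null instead of 14088576000, the file
        // you edited is not the one being picked up.
        Configuration conf = HBaseConfiguration.create();
        System.out.println(conf.get("phoenix.query.maxServerCacheBytes"));
    }
}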
>
>
> Is this a bug, or am I missing some configuration that is causing this
> error?
>
> Thanks,
> Ashish
