Hi Maryann,
The sizes of each table in my query are as follows:
customer - 24 MB
orders - 164 MB
lineitem - 725 MB
supplier - 1.4 MB
nation - 2.2 KB
region - 400 B
The heap size of my region servers is 4 GB.
Thank you,
David
On Mon, Feb 24, 2014 at 5:58 PM, Maryann Xue <maryann.xue@gmail.com> wrote:
> Hi David,
>
> What are sizes of each table in your query? And what is the heap size of
> your region server?
>
>
> Thanks,
> Maryann
>
>
> On Mon, Feb 24, 2014 at 5:50 PM, David Wang <davidwang400@gmail.com> wrote:
>
>> Hi,
>>
>> I had a question about running queries containing inner joins on Phoenix,
>> but wasn't sure where to post.
>> I downloaded phoenix-master (3.0.0) and inserted 10 MB of data
>> through psql.sh. I found I was able to run a query
>> containing inner joins in sqlline.
>>
>> The query I tried:
>>
>> select n_name, sum(l_extendedprice * (1 - l_discount)) as revenue
>> from customer
>> inner join orders on c_custkey = o_custkey
>> inner join lineitem on l_orderkey = o_orderkey
>> inner join supplier on l_suppkey = s_suppkey and c_nationkey = s_nationkey
>> inner join nation on s_nationkey = n_nationkey
>> inner join region on n_regionkey = r_regionkey
>> where r_name = 'AMERICA'
>> and o_orderdate >= '1993-01-01' and o_orderdate < '1994-01-01'
>> group by n_name
>> order by revenue desc;
>>
>> But when I increased the data size from 10 MB to 1 GB and tried to run
>> the same query, I got the following error:
>> Error: Encountered exception in hash plan execution. (state=,code=0)
>>
>> I have included the path to snappy-java-1.0.3.2.jar in sqlline.sh as
>> follows:
>> java -cp
>> ".:$phoenix_client_jar:/usr/local/hbase-0.94.8/lib/snappy-java-1.0.3.2.jar"
>> -Dlog4j.configuration=file:$current_dir/log4j.properties sqlline.SqlLine
>> -d com.salesforce.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none
>> -p
>> none --color=true --fastConnect=false --verbose=true
>> --isolation=TRANSACTION_READ_COMMITTED $sqlfile
>>
>> Do you have any idea where the problem might be?
>>
>> Thank you so much,
>>
>> David
>>
>
>
>
> --
> Thanks,
> Maryann
>