phoenix-user mailing list archives

From: Gabriel Reid <gabriel.r...@gmail.com>
Subject: Re: select w/ limit hanging on large tables
Date: Wed, 13 May 2015 05:27:00 GMT
Hi Kiru,

How many regions are there on this table?

Could you also share some information on the schema of the table (e.g. how
many columns are defined)?

Does a "limit 10" query also hang in this table?

Could you also elaborate a bit on the issues you were running into when
loading data into the table? Were there performance issues, or were things
not working at all?

- Gabriel
On Tue, May 12, 2015 at 23:56 Kiru Pakkirisamy <kirupakkirisamy@yahoo.com>
wrote:

> We are trying to benchmark/test Phoenix with large tables.
> A 'select * from table1 limit 100000' hangs on a 1.4 billion row table (in
> sqlline.py or SQuirreL)
> The same select of 1 million rows works on a smaller table (300 million rows).
> Mainly we wanted to create a smaller version of the 1.4 billion row table and
> ran into this issue.
> Any ideas why this is happening?
> We had quite a few problems crossing the 1 billion row mark even when loading
> the table (using CsvBulkLoadTool).
> We are also wondering whether our HBase is configured correctly.
> Any tips on HBase configuration for loading/running Phoenix would be highly
> appreciated as well.
> (We are on HBase 0.98.12 and Phoenix 4.3.1)
>
> Regards,
> - kiru
>
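
A few reference points for the questions in the quoted mail, in case they help
narrow things down (rough sketches only: TABLE1, the paths, the ZooKeeper
quorum, and the timeout values are placeholders to adapt to your cluster):

  -- 1. The plan for the hanging statement, from sqlline.py; this should show
  --    whether the limit is applied server-side or the client is scanning far
  --    more than expected:
  EXPLAIN SELECT * FROM TABLE1 LIMIT 100000;

  # 2. A typical CsvBulkLoadTool invocation, for comparison:
  hadoop jar phoenix-4.3.1-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table TABLE1 \
    --input /data/table1.csv \
    --zookeeper zkhost:2181

  # 3. Client/server timeouts worth checking in hbase-site.xml when large
  #    scans appear to hang (milliseconds; values are illustrative only):
  #      phoenix.query.timeoutMs               600000
  #      hbase.rpc.timeout                     600000
  #      hbase.client.scanner.timeout.period   600000

The EXPLAIN output in particular would be useful to include in a reply.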
