Have you looked at paging [1] using Phoenix's row-value constructors together with the LIMIT clause? That might be what you are looking for.

[1] http://phoenix.apache.org/paged.html
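The pattern from [1] looks roughly like the following (table and column names here are made up for illustration): you order by your key columns, and each subsequent page resumes after the last row of the previous page using a row value constructor, so the server can seek directly instead of buffering everything:

```sql
-- First page: order by the key columns and cap the page size
SELECT title, author FROM books
ORDER BY title, author
LIMIT 20;

-- Next page: resume after the last row returned by the previous page,
-- comparing both columns at once with a row value constructor
SELECT title, author FROM books
WHERE (title, author) > ('last_title_seen', 'last_author_seen')
ORDER BY title, author
LIMIT 20;
```

With this approach the client only ever holds one page in memory, which should sidestep the buffering problem regardless of driver.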


On Mon, May 18, 2015 at 6:46 AM, Isart Montane <isart.montane@gmail.com> wrote:

The company I work for is performing some tests on Phoenix with NodeJS. For simple queries I didn't have any problems, but as soon as I start to use our app I get "process out of memory" errors on the client when I run queries that return a large number of rows (i.e. 400k). I think the problem is that the client tries to buffer all the results in RAM, and that kills it. The same query runs fine when I run it with sqlline.

So, is there a way to tell the client to stream the results (or batch them) instead of buffering them all? Is raising the client's memory limit the only solution?

I'm using phoenix-4.3.1 and https://github.com/gaodazhu/phoenix-client as the NodeJS driver.


Isart Montane