phoenix-user mailing list archives

From Isart Montane <>
Subject Re: Phoenix and NodeJS
Date Mon, 18 May 2015 14:30:33 GMT
Hi Eli,

Thanks a lot for your answer. That might be a workaround, but I was hoping
for a more generic fix I can apply at the driver/Phoenix level, since that
approach would require a lot of changes to my code.

Any clue why it works with sqlline but not through the Node driver?

On Mon, May 18, 2015 at 4:20 PM, Eli Levine <> wrote:

> Have you looked at paging [1] using Phoenix's row-value constructors
> together with the LIMIT clause? That might be what you are looking for.
> [1]
> Eli
> On Mon, May 18, 2015 at 6:46 AM, Isart Montane <>
> wrote:
>> Hi,
>> the company I work for is running some tests on Phoenix with NodeJS.
>> For simple queries I didn't have any problems, but as soon as I start to
>> use our app I get "process out of memory" errors on the client when I
>> run queries that return a large number of rows (e.g. 400k). I think the
>> problem is that the client tries to buffer all the results in RAM, and
>> that kills it. The same query runs fine when I run it with sqlline.
>> So, is there a way to tell the client to stream the results (or batch
>> them) instead of buffering them all? Is raising the client's memory
>> limit the only solution?
>> I'm using phoenix-4.3.1 and
>> as the NodeJS driver
>> Thanks,
>> Isart Montane
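
For context, the keyset paging Eli suggests (a row-value constructor plus
LIMIT) might look roughly like the sketch below from Node. The table and
column names (EVENTS, HOST, TS) and the runQuery helper are hypothetical
stand-ins for whatever schema and driver API you actually use:

```javascript
// Sketch of keyset paging with a Phoenix row-value constructor (RVC).
// EVENTS/HOST/TS and runQuery are illustrative assumptions, not a real API.
const PAGE_SIZE = 5000;

function pageSql(lastKey) {
  // (HOST, TS) > (?, ?) resumes the scan just past the previous page's
  // last row; LIMIT bounds how many rows the client holds at once.
  return lastKey
    ? `SELECT HOST, TS FROM EVENTS WHERE (HOST, TS) > (?, ?) ORDER BY HOST, TS LIMIT ${PAGE_SIZE}`
    : `SELECT HOST, TS FROM EVENTS ORDER BY HOST, TS LIMIT ${PAGE_SIZE}`;
}

// Fetch pages until one comes back short, handing each row to onRow as it
// arrives instead of buffering the whole 400k-row result set in memory.
function fetchAll(runQuery, onRow) {
  let lastKey = null;
  for (;;) {
    const rows = runQuery(pageSql(lastKey), lastKey || []);
    rows.forEach(onRow);
    if (rows.length < PAGE_SIZE) break;
    const last = rows[rows.length - 1];
    lastKey = [last.HOST, last.TS];
  }
}
```

The trade-off is exactly the one raised above: this keeps client memory
bounded at one page, but it has to be wired into the application's query
code rather than being a driver-level switch.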
