phoenix-user mailing list archives

From: Eli Levine <>
Subject: Re: Phoenix and NodeJS
Date: Mon, 18 May 2015 15:43:34 GMT
I don't have info on what your app does with results from Phoenix. If the
app is constructing some sort of object representations from Phoenix
results and holding on to them, I would look at what the memory footprint
of that is. I know this isn't very helpful but at this point I would try to
dig deeper into your app and the NodeJS driver rather than Phoenix, since
you mentioned the same queries run fine in sqlline.
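
That said, if it turns out the driver really does buffer the whole result
set, you could page from the client using the row-value constructor
approach from my earlier message (quoted below). A rough sketch in Node --
the query() helper and the schema are stand-ins, since I don't know the
exact API your driver exposes:

    // Page through the table in fixed-size batches instead of selecting
    // everything at once. query(sql, params, cb) is a stand-in for your
    // driver's query call; table and column names are invented.
    var BATCH_SIZE = 1000;

    function fetchFrom(lastId, onRow, done) {
      query(
        'SELECT id, payload FROM my_table WHERE (id) > (?) ' +
        'ORDER BY id LIMIT ' + BATCH_SIZE,
        [lastId],
        function (err, rows) {
          if (err) return done(err);
          rows.forEach(onRow);                          // handle each row, then drop the batch
          if (rows.length < BATCH_SIZE) return done();  // last page reached
          fetchFrom(rows[rows.length - 1].id, onRow, done);
        }
      );
    }

    fetchFrom(0, processRow, function (err) { /* finished or failed */ });

That way only one batch is ever held in memory at a time.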

On Mon, May 18, 2015 at 7:30 AM, Isart Montane <> wrote:

> Hi Eli,
> Thanks a lot for your answer. That might be a workaround, but I was hoping
> for a more generic answer I can apply to the driver/Phoenix, since that
> approach would require a lot of changes to my code.
> Any clue why it works with sqlline but not through the node driver?
> On Mon, May 18, 2015 at 4:20 PM, Eli Levine <> wrote:
>> Have you looked at paging [1] using Phoenix's row-value constructors
>> together with the LIMIT clause? That might be what you are looking for.
>> [1]
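>> For illustration, with a table whose primary key is (k1, k2), fetching
>> one page would look something like this (schema invented for the example):
>>
>>     SELECT k1, k2, val
>>     FROM my_table
>>     WHERE (k1, k2) > (?, ?)   -- key of the last row from the previous page
>>     ORDER BY k1, k2
>>     LIMIT 1000;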
>> Eli
>> On Mon, May 18, 2015 at 6:46 AM, Isart Montane <>
>> wrote:
>>> Hi,
>>> The company I work for is performing some tests on Phoenix with NodeJS.
>>> For simple queries I didn't have any problems, but as soon as I start
>>> using our app I'm getting "process out of memory" errors on the client
>>> when I run queries that return a large number of rows (e.g. 400k). I
>>> think the problem is that the client tries to buffer all the results in
>>> RAM, and that kills it. The same query runs fine when I run it with
>>> sqlline.
>>> So, is there a way to tell the client to stream the results (or batch
>>> them) instead of buffering them all? Is raising the client's memory
>>> limit the only solution?
>>> I'm using phoenix-4.3.1 and
>>> as the NodeJS driver
>>> Thanks,
>>> Isart Montane
