madlib-user mailing list archives

From Nandish Jayaram <njaya...@pivotal.io>
Subject Re: Out of memory for neural network
Date Fri, 27 Jul 2018 15:05:29 GMT
Hi Luyao,

Could you include details about your database settings (by the way, are you using Postgres or
Greenplum?) and the size of the training dataset?
If your dataset is publicly available, please also share where we can access it.
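
For reference, something like the following should report the relevant numbers in Postgres. The table name `train_data_sub` is assumed from the `mlp_classification` call in your message; `pg_size_pretty`, `pg_total_relation_size`, and `SHOW` are standard Postgres facilities:

```sql
-- Table size on disk and row count (table name assumed from your call)
SELECT pg_size_pretty(pg_total_relation_size('train_data_sub')) AS table_size,
       count(*) AS num_rows
FROM train_data_sub;

-- Memory-related settings that matter for OOM diagnosis
SHOW shared_buffers;
SHOW work_mem;
```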

NJ

Sent from my iPhone

> On Jul 27, 2018, at 7:41 AM, LUYAO CHEN <luyao_chen@hotmail.com> wrote:
> 
> Dear user community,
> 
> I would like to report a problem with the neural network module. I am using a machine with 16 GB of RAM.
> 
> After some iterations (~100), I got the error below:
> 
> ERROR:  spiexceptions.OutOfMemory: out of memory
> DETAIL:  Failed on request of size 32800.
> CONTEXT:  Traceback (most recent call last):
>   PL/Python function "mlp_classification", line 36, in <module>
>     grouping_col
>   PL/Python function "mlp_classification", line 45, in wrapper
>   PL/Python function "mlp_classification", line 325, in mlp
>   PL/Python function "mlp_classification", line 580, in update
> PL/Python function "mlp_classification"
> 
> Below is the command:
> 
> SELECT madlib.mlp_classification(
>     'train_data_sub',      -- Source table 
>     'mlp_model',      -- Destination table
>     'features',     -- Input features
>     'positive',     -- Label
>     ARRAY[5],         -- Number of units per layer
>     'learning_rate_init=0.003,
>     n_iterations=500,
>     tolerance=0',     -- Optimizer params
>     'tanh',           -- Activation function
>     NULL,             -- Default weight (1)
>     FALSE,            -- No warm start
>     true,             -- verbose
>     'case_icd'         -- Grouping
> );
> 
> Is this a bug, or is it just caused by the data size?
> 
> Regards,
> Luyao Chen
> 
