phoenix-user mailing list archives

From Vikas Agarwal <vi...@infoobjects.com>
Subject Re: java.lang.OutOfMemoryError: unable to create new native thread
Date Sat, 18 Oct 2014 03:07:20 GMT
This error means the OS refused to allocate a native thread; it is not a heap
problem. Most likely some other process on the same machine is creating many
threads and hitting the per-user limit, which is 1024 by default (check with
ulimit -a). That process may be a database with a thread leak (we had a
similar issue with Cassandra) or one of your own utility processes running in
the background.
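
If you want to rule out the client JVM itself, a rough sketch like the one
below (ThreadWatch is a made-up name, not part of Phoenix or your script)
logs the live and peak thread counts via ThreadMXBean; call it at the top of
the populator's main() and watch whether the peak keeps climbing toward the
ulimit.

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    // Diagnostic sketch only: prints the JVM's thread counts every 5 seconds.
    public class ThreadWatch {
        public static void start() {
            final ThreadMXBean mx = ManagementFactory.getThreadMXBean();
            Thread watcher = new Thread(new Runnable() {
                public void run() {
                    while (true) {
                        System.out.printf("threads live=%d peak=%d daemon=%d%n",
                                mx.getThreadCount(),
                                mx.getPeakThreadCount(),
                                mx.getDaemonThreadCount());
                        try {
                            Thread.sleep(5000L);
                        } catch (InterruptedException e) {
                            return; // stop quietly if interrupted
                        }
                    }
                }
            }, "thread-watch");
            watcher.setDaemon(true); // don't keep the JVM alive just for the watcher
            watcher.start();
        }
    }

If those counts stay flat while the error still happens, the limit is being
consumed by other processes on the box; counting all threads on the host
(for example with ps -eLf | wc -l) usually points at the culprit.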

On Sat, Oct 18, 2014 at 4:07 AM, ashish tapdiya <ashishtapdiya@gmail.com>
wrote:

> Hi,
>
> I am getting a weird error:
>
>
> java.sql.SQLException: java.lang.OutOfMemoryError: unable to create new native thread
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:887)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1019)
>         at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:303)
>         at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:272)
>         at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:289)
>         at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:210)
>         at org.apache.phoenix.compile.FromCompiler.getResolverForMutation(FromCompiler.java:184)
>         at org.apache.phoenix.compile.UpsertCompiler.compile(UpsertCompiler.java:241)
>         at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:442)
>         at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:433)
>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:250)
>         at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:242)
>         at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:241)
>         at org.apache.phoenix.jdbc.PhoenixPreparedStatement.executeUpdate(PhoenixPreparedStatement.java:168)
>         at PrepareLockTables.populateLockTables(PrepareLockTables.java:215)
>         at PrepareLockTables.main(PrepareLockTables.java:36)
> Caused by: java.lang.OutOfMemoryError: unable to create new native thread
>         at java.lang.Thread.start0(Native Method)
>         at java.lang.Thread.start(Thread.java:714)
>         at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:949)
>         at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1360)
>         at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:132)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processExecs(HConnectionManager.java:1599)
>         at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:867)
>         ... 16 more
>
>
>
> I have written a simple database population script that populates around
> 100k rows in a table. I am not creating any threads in my script; however,
> I get this error.
>
> The cluster was working fine for 5 months; however, all of a sudden I am
> getting this client error.
>
> ~Ashish
>



-- 
Regards,
Vikas Agarwal
91 – 9928301411

InfoObjects, Inc.
Execution Matters
http://www.infoobjects.com
2041 Mission College Boulevard, #280
Santa Clara, CA 95054
+1 (408) 988-2000 Work
+1 (408) 716-2726 Fax
