Hi Ralph,

Try increasing the ulimits for open files (ulimit -n) and max user processes (ulimit -u) for the following users (a sketch of how follows the list):

hbase
hdfs
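
For example (a sketch only; the 32768 values are illustrative, and how limits are applied varies by distribution), check what the daemon users currently get, then raise the limits persistently in /etc/security/limits.conf:

# Check the current limits for the daemon users
sudo -u hbase bash -c 'ulimit -n; ulimit -u'
sudo -u hdfs bash -c 'ulimit -n; ulimit -u'

# /etc/security/limits.conf entries (32768 is a common starting
# point for HBase/HDFS clusters, not a mandated value)
hbase  -  nofile  32768
hbase  -  nproc   32768
hdfs   -  nofile  32768
hdfs   -  nproc   32768

Note the daemons (and the NodeManagers running the MR tasks) pick up the new limits only after a restart.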


Regards,
Ankit Singhal


On Tue, Jul 7, 2015 at 4:13 AM, Perko, Ralph J <Ralph.Perko@pnnl.gov> wrote:
Hi,  

I am using a Pig script to regularly load data into HBase/Phoenix. Recently, "OutOfMemory – unable to create new native thread" errors have cropped up in the Pig-generated MR job, specifically in the Mapper task. I have not seen this before, and it only occurs on this one data load; other similar scripts complete successfully. We also recently upgraded to Phoenix 4.4. My understanding is that this error is less about memory and more about resource availability from the OS.
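
One way to sanity-check that theory on a node with a failing mapper (the <pid> is a placeholder for the task's YarnChild process, and the task may run as the yarn or mapred user depending on the setup):

# Thread count of the task JVM
grep Threads /proc/<pid>/status

# Max processes/threads allowed for the user running the task
# (on Linux, nproc counts all of a user's threads together)
sudo -u yarn bash -c 'ulimit -u'

If the user's total thread count is close to that limit, pthread_create fails and the JVM surfaces it as exactly this OutOfMemoryError.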

The attached PDF contains some of the relevant log entries from the MR job.

Any thoughts on what could be causing this?

Thanks,
Ralph


Phoenix 4.4.0
HBase 0.98.4

Pig script:

register $phoenix_jar;
register $project_jar;
register $piggybank_jar;

SET job.name 'Load $data into $table_name';

Z = load '$data' USING org.apache.pig.piggybank.storage.CSVExcelStorage() as (
    f:chararray,
    r:int,
...
);

X = FILTER Z BY f IS NOT NULL AND r IS NOT NULL AND fst IS NOT NULL;

D = foreach X generate
    fst,
    gov.pnnl.pig.Format(f),
    r,
...
;

STORE D INTO 'hbase://$table_name/FST,F,R,...' USING org.apache.phoenix.pig.PhoenixHBaseStorage('$zookeeper', '-batchSize 500');