phoenix-user mailing list archives

From "Bulvik, Noam" <Noam.Bul...@teoco.com>
Subject RE: CSV Bulkload in oozie
Date Fri, 30 Jan 2015 08:57:11 GMT
We are using it via Oozie.
We had the same issue and solved it by setting the default file permissions in HDFS so that all users
have access to the files.
I guess there is a more advanced solution, but for us this was enough.
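A sketch of the permissions approach described above. Both the umask value and the staging path are illustrative assumptions, not details from this thread; the staging directory is whatever the bulk load tool writes its HFiles to on your cluster:

```shell
# Sketch only: two ways to make bulk-load output readable by other users
# (e.g. the user the HBase region servers run as).

# (a) Cluster-wide default: set the file-creation umask in core-site.xml,
#     e.g. fs.permissions.umask-mode = 000, so new files are world-readable.

# (b) One-off fix: open up the staging directory after the MapReduce job
#     writes the HFiles (path is a placeholder).
hdfs dfs -chmod -R 777 /tmp/bulkload-staging
```

Option (a) affects every file created on the cluster, so (b) is the narrower change if the problem is limited to the bulk-load output.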

-----Original Message-----
From: Ganesh R [rganesh84@yahoo.co.in]
Received: Friday, 30 Jan 2015, 9:33
To: user@phoenix.apache.org [user@phoenix.apache.org]
Subject: CSV Bulkload in oozie

Has anyone tried using the CSV bulk load tool in Oozie?
I am trying to use CsvBulkLoadTool in Oozie. It works from the command line with the hadoop jar
command, but when I run it in Oozie as a java action, it fails to load the table at the very end.
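For context, roughly the kind of command-line invocation that works directly (the jar version, table name, paths, and ZooKeeper host are illustrative placeholders, not details from this message):

```shell
# Run the Phoenix CSV bulk load tool directly with hadoop jar.
# All names below are example values; substitute your own.
hadoop jar phoenix-4.2.2-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table MYTABLE \
    --input /data/example.csv \
    --zookeeper zk-host:2181
```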

- The actual MapReduce job shows 100% success, exactly reflecting the number of rows to be
upserted (matching the input CSV file).
- I have hbase-protocol.jar in the Oozie share library. I also tried adding this jar to the
distributed cache, but no luck.
- Alternatively, it works when I convert it to a shell action and execute hadoop jar through
Oozie.
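A minimal sketch of the shell-action workaround mentioned above, assuming a wrapper script (here hypothetically named bulkload.sh) that runs the hadoop jar command; action names, transitions, and the script name are all illustrative:

```xml
<!-- Illustrative Oozie workflow fragment: run the bulk load via a shell
     action instead of a java action. Names and paths are placeholders. -->
<action name="csv-bulkload">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>bulkload.sh</exec>
        <!-- Ship the wrapper script with the workflow application -->
        <file>${wf:appPath()}/bulkload.sh#bulkload.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>
```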

I see FileNotFoundException written in the logs over and over again, as shown below.
Your input on this would be much appreciated.

2015-01-30 01:32:40,223 INFO [phoenix-2-thread-0] org.apache.phoenix.mapreduce.CsvBulkLoadTool:
Loading HFiles from /tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/OOZIE
2015-01-30 01:32:40,255 WARN [phoenix-2-thread-0] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles:
Skipping non-directory file:/tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/OOZIE/_SUCCESS
2015-01-30 01:32:40,314 INFO [LoadIncrementalHFiles-8] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles:
Trying to load hfile=file:/tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/OOZIE first=\x0D1000000011001\x001001\x00\x80\x00\x01K=Ix\x00
last=\x0D1000000011001\x001001\x00\x80\x00\x01K=Ix\x00
2015-01-30 01:32:40,315 INFO [LoadIncrementalHFiles-10] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles:
Trying to load hfile=file:/tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/OOZIE first=\x101000000011002\x001001\x00\x80\x00\x01K=Ix\x00
last=\x101000000011002\x001003\x00\x80\x00\x01K2\xFC\xC0\x00
2015-01-30 01:41:50,602 ERROR [LoadIncrementalHFiles-13] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles:
Encountered unrecoverable error from region server
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Fri Jan 30 01:32:40 EST 2015, org.apache.hadoop.hbase.client.RpcRetryingCaller@549072f7, java.io.FileNotFoundException:
java.io.FileNotFoundException: File file:/tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/
2015-01-30 01:41:50,602 ERROR [LoadIncrementalHFiles-13] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles:
Encountered unrecoverable error from region server
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Fri Jan 30 01:32:40 EST 2015, org.apache.hadoop.hbase.client.RpcRetryingCaller@549072f7, java.io.FileNotFoundException:
java.io.FileNotFoundException: File file:/tmp/6cbb4bad-6f26-4546-a831-dd0db13c6d88/

Information in this e-mail and its attachments is confidential and privileged under the TEOCO
confidentiality terms that can be reviewed here<http://www.teoco.com/email-disclaimer>.
