We are running the CSV bulk loader on Phoenix 4.5 with CDH 5.4 and it works, but with one problem: the load job hangs at mapreduce.LoadIncrementalHFiles: Trying to load hfile .. until we give the HDFS directory holding the HFiles (under /tmp) write permissions.
We set the umask to 000, but that did not help.
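For reference, a sketch of what we are doing today. The output directory and table/input paths below are hypothetical placeholders, not our actual paths, and the umask is passed via the standard Hadoop property fs.permissions.umask-mode:

```shell
# Current manual workaround (hypothetical output dir): open up the
# HFile staging directory so LoadIncrementalHFiles can proceed.
hadoop fs -chmod -R 777 /tmp/example-bulkload-output

# What we expected to work instead: relax the umask for the job itself
# (table name and input path are placeholders for illustration).
hadoop jar phoenix-4.5.0-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  -Dfs.permissions.umask-mode=000 \
  --table EXAMPLE_TABLE \
  --input /data/example.csv
```

Even with the umask set as above, the job still blocks until we chmod the directory by hand.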
Any idea how this should be fixed?