phoenix-user mailing list archives

From "Riesland, Zack" <>
Subject RE: Permissions Question
Date Tue, 07 Jul 2015 14:53:07 GMT
Thanks Krishna,

The HFiles are stored in, for example, /tmp/daa6119d-f49e-485e-a6fe-1405d9c3f2a4/<structure
based on table name>

‘tmp’ is owned by ‘hdfs’ in group ‘hdfs’.

‘daa6119d-f49e-485e-a6fe-1405d9c3f2a4’ is owned by my script user (‘user1’ for example)
in group ‘hdfs’.

I cannot run the script as ‘hbase’, and the name of the folder (‘daa6119d-f49e-485e-a6fe-1405d9c3f2a4’
in this case) will change each time I run the jar, so explicitly doing a chown on that folder
won’t help.

Do you know what change I need to make to ‘user1’ so that HFiles created by that account
can be loaded into HBase?
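In case it helps to make this concrete, here is a sketch of two approaches people commonly use for this kind of bulk-load permission problem. OUTPUT_DIR is a placeholder for the per-run folder your script creates; whether ACLs are available depends on your cluster config (dfs.namenode.acls.enabled), so treat both as assumptions to verify:

```shell
# Placeholder: the generated output directory from the current run.
OUTPUT_DIR=/tmp/<generated-folder>

# Option 1: after the MapReduce phase finishes, open up the output tree
# so the 'hbase' user can move the HFiles into place during the load phase.
hdfs dfs -chmod -R 777 "$OUTPUT_DIR"

# Option 2: if HDFS ACLs are enabled on the cluster, grant 'hbase' access
# without changing the base permissions for everyone else.
hdfs dfs -setfacl -R -m user:hbase:rwx "$OUTPUT_DIR"
```

Since the folder name changes per run, your script would need to capture the generated path and apply one of these commands before the load step.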

From: Krishna []
Sent: Monday, July 06, 2015 3:11 PM
Subject: Re: Permissions Question

The directory containing the HFiles should be owned by the 'hbase' user, and ownership can
be set using the 'chown' command.
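For example (a sketch, using the directory path from your message; this must be run as an HDFS superuser such as 'hdfs'):

```shell
# Hand the bulk-load output directory to the 'hbase' user so the
# load phase can move the HFiles into HBase's directories.
hdfs dfs -chown -R hbase:hbase /tmp/daa6119d-f49e-485e-a6fe-1405d9c3f2a4
```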

On Mon, Jul 6, 2015 at 7:12 AM, Riesland, Zack <<>> wrote:
I’ve been running CsvBulkLoader as ‘hbase’ and that has worked well.

But I now need to integrate with some scripts that will be run as another user.

When I run under a different account, the CsvBulkLoader runs and creates the HFiles, but then
encounters permission issues attempting to write the data to HBase.

Can someone point me in the right direction for solving this?

How can I give a different user permission to write into HBase?
