phoenix-user mailing list archives

From cmbendre <chaitanya.ben...@zeotap.com>
Subject Async Index Creation fails due to permission issue
Date Tue, 23 May 2017 12:43:58 GMT
I created an ASYNC index and ran the IndexTool MapReduce job to populate it.
Here is the command I used:

hbase org.apache.phoenix.mapreduce.index.IndexTool --data-table MYTABLE
--index-table MYTABLE_GLOBAL_INDEX --output-path MYTABLE_GLOBAL_INDEX_HFILE
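
(The output path above is relative, so the generated HFiles land under
/user/hadoop/ on HDFS; their ownership can be checked with a listing such as
the one below.)

hdfs dfs -ls -R /user/hadoop/MYTABLE_GLOBAL_INDEX_HFILE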

I can see that the index HFiles are created successfully on HDFS, but the
job then fails with permission errors. The files are created as the "hadoop"
user and carry no permissions for the "hbase" user. Here is the error I get:

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
Permission denied: user=hbase, access=EXECUTE,
inode="/user/hadoop/MYTABLE_GLOBAL_INDEX_HFILE/MYTABLE_GLOBAL_INDEX/0/a4c9888f8e284158bfb79b30b2cdee82":hadoop:hadoop:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:320)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1728)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1686)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1830)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:588)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)

The hack I am using right now is to set the permissions on these files
manually while the IndexTool job is running. Is there a better way?
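
For reference, the manual fix I run is roughly the following (the exact
ownership/mode is just what happened to work here):

# hand the generated output over to the hbase user ...
hdfs dfs -chown -R hbase:hbase /user/hadoop/MYTABLE_GLOBAL_INDEX_HFILE
# ... or keep "hadoop" as the owner and just make the directories
# traversable/readable for other users
hdfs dfs -chmod -R 755 /user/hadoop/MYTABLE_GLOBAL_INDEX_HFILE

Would submitting the IndexTool job as the "hbase" user instead be the
recommended approach?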



