Hadoop impersonation not handling permissions


Hadoop impersonation not handling permissions

Harinder Singh
Hi, I am using Hadoop proxy user/impersonation to access a directory that the superuser has access to, but I am getting permission errors when the proxied user tries to access it:

Say user "a" is a superuser and "b" is trying to access a directory on its behalf. User "b" does not have permission on the directory, but user "a" does. So shouldn't "b" be able to access that directory as well? Below is the exception I am getting:

org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke 11: Exception <- abc-cdh-n1/192.168.*.*:8020: getListing {org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=b, access=READ_EXECUTE, inode="/foo/one":hdfs:supergroup:drwx------
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:168)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3530)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3513)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:3484)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6624)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5135)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:5096)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:888)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:336)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:630)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2211)


My superuser is hdfs, and I am calling UserGroupInformation.loginUserFromKeytabAndReturnUGI(user, keyTabPath) with the hdfs principal as the user. I don't have ACLs enabled, and I have added the proxy-user settings as well (* for hdfs).
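
For reference, the call path looks roughly like this (simplified; the principal, keytab path, and directory are placeholders for my real values, and the proxy-user settings I mentioned are the usual hadoop.proxyuser.hdfs.hosts / hadoop.proxyuser.hdfs.groups entries set to * in core-site.xml):

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ProxyListingTest {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Log in as the superuser (hdfs) from its keytab.
            UserGroupInformation hdfsUgi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "hdfs@EXAMPLE.COM", "/path/to/hdfs.keytab");

            // Impersonate user "b" on top of the hdfs login.
            UserGroupInformation proxyUgi = UserGroupInformation.createProxyUser("b", hdfsUgi);

            // This listing is the call that fails with the AccessControlException above.
            FileStatus[] listing = proxyUgi.doAs(
                    (PrivilegedExceptionAction<FileStatus[]>) () ->
                            FileSystem.get(conf).listStatus(new Path("/foo/one")));

            for (FileStatus status : listing) {
                System.out.println(status.getPath());
            }
        }
    }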

Can someone tell me what I am missing here?

Re: Hadoop impersonation not handling permissions

Wei-Chiu Chuang-2
Pretty sure this is the expected behavior.
From the stack trace, your impersonation is configured correctly (i.e., it successfully performs the operation on behalf of user b); the problem is that the file's permissions don't allow b to access it.
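
Put differently: the hadoop.proxyuser.* settings only control who is allowed to impersonate whom; once the impersonated call reaches the NameNode, the permission check is done against the effective (doAs) user, not the real Kerberos user. A rough sketch (hdfsUgi stands for your keytab-login UGI; the names are just placeholders):

    // Real user: hdfs (from the keytab). Effective user: b.
    UserGroupInformation proxyUgi = UserGroupInformation.createProxyUser("b", hdfsUgi);

    proxyUgi.getUserName();                  // "b"    -> the identity the NameNode authorizes
    proxyUgi.getRealUser().getUserName();    // "hdfs" -> only checked against the proxyuser rules

    // So for /foo/one with mode drwx------ owned by hdfs, a doAs(proxyUgi)
    // listing is denied for b, while the same listing run directly under
    // hdfsUgi (the HDFS superuser) would succeed.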


--
A very happy Hadoop contributor

Re: Hadoop impersonation not handling permissions

Harinder Singh
Maybe my understanding is not correct. If hdfs has access to the directory and "b" is trying to access it while hdfs is impersonating "b", shouldn't that access be allowed, since hdfs is the one impersonating "b"?

Thanks
Harinder
