I have changed my Hive execution engine to Tez and want to run queries with Tez, but queries only succeed for the hadoop and hive users; when I switch to my own user (user51) in Beeline or Hue, the query fails. The same query runs fine as user51 when the Hive engine is MR.
Below are the scenarios, followed by the error/debug logs.
Working for all users
SET hive.execution.engine=mr;
SELECT count(*) FROM db.mytable;
Working only for the hadoop and hive users
SET hive.execution.engine=tez;
SELECT count(*) FROM db.mytable;
Query fails for other users (e.g. user51)
SET hive.execution.engine=tez;
SELECT count(*) FROM db.mytable;
Error log
INFO [HiveServer2-Background-Pool: Thread-643([])]: client.TezClientUtils (TezClientUtils.java:setupTezJarsLocalResources(178)) - Using tez.lib.uris value from configuration: hdfs:///apps/tez/tez.tar.gz
INFO [HiveServer2-Background-Pool: Thread-643([])]: client.TezClientUtils (TezClientUtils.java:setupTezJarsLocalResources(180)) - Using tez.lib.uris.classpath value from configuration: null
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #952 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #952
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 1ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #953 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #953
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 1ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #954 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #954
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 0ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #955 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #955
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 1ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #956 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #956
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 1ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #957 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #957
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 0ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #958 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #958
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: ipc.ProtobufRpcEngine (ProtobufRpcEngine.java:invoke(248)) - Call: getFileInfo took 1ms
DEBUG [IPC Parameter Sending Thread #13([])]: ipc.Client (Client.java:run(1117)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 sending #959 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
DEBUG [IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51([])]: ipc.Client (Client.java:receiveRpcResponse(1171)) - IPC Client (1005331061) connection to ip-10-4-22-xx.ec2.internal/10.4.xx.xx:8020 from user51 got value #959
DEBUG [HiveServer2-Background-Pool: Thread-643([])]: retry.RetryInvocationHandler (RetryInvocationHandler.java:handleException(366)) - Exception while invoking call #959 ClientNamenodeProtocolTranslatorPB.getFileInfo over null. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException: java.lang.NullPointerException
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:796) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_222]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_222]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) [hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) [hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) [hadoop-common-2.8.3-amzn-1.jar:?]
at com.sun.proxy.$Proxy32.getFileInfo(Unknown Source) [?:?]
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1717) [hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1437) [hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1434) [hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) [hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1449) [hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.tez.client.TezClientUtils.checkAncestorPermissionsForAllUsers(TezClientUtils.java:1036) [tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClientUtils.addLocalResources(TezClientUtils.java:275) [tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:183) [tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:1057) [tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClient.start(TezClient.java:447) [tez-api-0.8.4.jar:0.8.4]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:376) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:323) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolManager$TezSessionPoolSession.openInternal(TezSessionPoolManager.java:703) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:196) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.updateSession(TezTask.java:303) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:168) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_222]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_222]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) [hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_222]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_222]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]
ERROR [HiveServer2-Background-Pool: Thread-643([])]: exec.Task (TezTask.java:execute(230)) - Failed to execute tez graph.
org.apache.hadoop.ipc.RemoteException: java.lang.NullPointerException
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:796) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_222]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_222]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at com.sun.proxy.$Proxy32.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1717) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1437) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1434) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1449) ~[hadoop-hdfs-client-2.8.3-amzn-1.jar:?]
at org.apache.tez.client.TezClientUtils.checkAncestorPermissionsForAllUsers(TezClientUtils.java:1036) ~[tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClientUtils.addLocalResources(TezClientUtils.java:275) ~[tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:183) ~[tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:1057) ~[tez-api-0.8.4.jar:0.8.4]
at org.apache.tez.client.TezClient.start(TezClient.java:447) ~[tez-api-0.8.4.jar:0.8.4]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:376) ~[hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:323) ~[hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolManager$TezSessionPoolSession.openInternal(TezSessionPoolManager.java:703) ~[hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:196) ~[hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.updateSession(TezTask.java:303) ~[hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:168) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232) [hive-exec-2.3.2-amzn-0.jar:2.3.2-amzn-0]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_222]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_222]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) [hadoop-common-2.8.3-amzn-1.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362) [hive-service-2.3.2-amzn-2.jar:2.3.2-amzn-2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_222]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_222]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]
ERROR [HiveServer2-Background-Pool: Thread-643([])]: ql.Driver (SessionState.java:printError(1126)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
I don't know what is happening here. Can anyone help?
Finally, I found the solution. We had added an extra HDFS authorization property to hdfs-site.xml, and when a query runs on the Tez engine, Tez creates some temporary files and directories in HDFS that this authorizer interfered with. So I removed the additional property below from hdfs-site.xml and restarted the Hadoop services.
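For context on why only some users failed: the stack trace shows the error surfacing in TezClientUtils.checkAncestorPermissionsForAllUsers, which walks every ancestor directory of the tez.lib.uris path and checks that each one is traversable by all users. The real check is Java code inside Tez; the sketch below is only an illustrative Python approximation of that ancestor walk and permission test:

```python
import posixpath
import stat

def ancestor_dirs(path):
    """Return every ancestor directory of an absolute path, root first.

    For "/apps/tez/tez.tar.gz" this yields "/", "/apps", "/apps/tez" --
    the directories Tez inspects before localizing the tarball for the
    submitting user.
    """
    dirs = []
    current = posixpath.dirname(path)
    while True:
        dirs.append(current)
        parent = posixpath.dirname(current)
        if parent == current:  # reached the root
            break
        current = parent
    return list(reversed(dirs))

def world_traversable(mode):
    """True if the "others" execute bit is set on a directory mode."""
    return bool(mode & stat.S_IXOTH)
```

If any ancestor of hdfs:///apps/tez/tez.tar.gz is not traversable by all users (e.g. mode 750 instead of 755), non-privileged users cannot localize the Tez libraries, which matches the symptom of only hadoop and hive succeeding.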
Additional property
<property>
  <name>dfs.namenode.inode.attributes.provider.class</name>
  <value>org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer</value>
</property>
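If you want to confirm whether this property is present before editing the file, you can scan hdfs-site.xml for it. A small sketch (the config file path varies by distribution; /etc/hadoop/conf/hdfs-site.xml is typical, and the sample string here is just an inline stand-in for that file):

```python
import xml.etree.ElementTree as ET

RANGER_PROP = "dfs.namenode.inode.attributes.provider.class"

def find_property(xml_text, name):
    """Return the <value> of a named Hadoop property, or None if absent."""
    root = ET.fromstring(xml_text)
    for prop in root.iter("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

# Minimal stand-in for the contents of hdfs-site.xml:
sample = """<configuration>
  <property>
    <name>dfs.namenode.inode.attributes.provider.class</name>
    <value>org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer</value>
  </property>
</configuration>"""

print(find_property(sample, RANGER_PROP))
```

If this prints the Ranger authorizer class, the property is active and removing it (followed by a NameNode restart) should restore Tez submissions for ordinary users.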
Hope this helps someone.