JobHistoryServer fails to start when restarting the YARN component in CDH


Problem description

When restarting the JobHistoryServer role of the YARN component in CDH, the following errors were encountered.


Error 1:

Stopping JobHistory
10:44:40.553 AM INFO AbstractService
Service org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.tryCreatingHistoryDirs(HistoryFileManager.java:696)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.createHistoryDirs(HistoryFileManager.java:630)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:591)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:108)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:150)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:226)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:236)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied. user=mapred is not the owner of inode=/user/history/done
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:303)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:270)
    at org.apache.sentry.hdfs.SentryINodeAttributesProvider$SentryPermissionEnforcer.checkPermission(SentryINodeAttributesProvider.java:86)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1855)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1839)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:1784)
    at org.apache.hadoop.hdfs.server.namenode.FSDirAttrOp.setPermission(FSDirAttrOp.java:64)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1861)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:856)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:509)
    ...

Error 2:

ERROR JobHistoryServer
Error starting JobHistoryServer
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.tryCreatingHistoryDirs(HistoryFileManager.java:696)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.createHistoryDirs(HistoryFileManager.java:630)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:591)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:108)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:150)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:226)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:236)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied. user=mapred is not the owner of inode=/user/history/done
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:303)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:270)
    at org.apache.sentry.hdfs.SentryINodeAttributesProvider$SentryPermissionEnforcer.checkPermission(SentryINodeAttributesProvider.java:86)
    ...

Cause analysis:

Error 1:

When creating the /user/history/done directory, the mapred user does not have sufficient permissions. To resolve this, we need to make sure the mapred user has the correct permissions on the /user/history/done directory.

Error 2:

At startup, JobHistoryServer fails while creating (initializing) the hdfs://nameservice1:8020/user/history/done directory, because the current user mapred is not the owner of the directory and therefore lacks the required permissions.


Solution:

1. Check the current user's permissions:

Use the following command to view the permissions set on the /user/history/done directory (the -d flag shows the directory itself rather than listing its contents).

hdfs dfs -ls -d /user/history/done

Confirm whether the mapred user is the owner of the directory, or at least has write permission on it.
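For reference, a superuser check like the following can be used if the mapred account itself cannot list the path; the sudo -u hdfs prefix assumes a typical CDH node where the hdfs superuser account is available, and the parent directory is included for comparison.

sudo -u hdfs hdfs dfs -ls -d /user/history /user/history/done

In the output, the first column shows the permission bits and the third and fourth columns show the owner and group; in a default CDH layout both directories are expected to be owned by mapred with group hadoop.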

2. Modify the directory permissions:

If the mapred user is not the owner of the directory, use the following command to change the ownership of the directory to mapred.

hdfs dfs -chown mapred /user/history/done

If the mapred user also needs write permission, use the following command (mode 755 gives the owner read, write, and execute permission, and everyone else read and execute).

hdfs dfs -chmod 755 /user/history/done
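Note that changing the owner of an HDFS path requires the HDFS superuser, so the chown above is normally run as the hdfs user rather than as mapred. A sketch of the same fix run that way, with a recursive flag in case subdirectories under the done directory also carry the wrong owner (the hadoop group is an assumption based on the group that appears later in this article):

sudo -u hdfs hdfs dfs -chown -R mapred:hadoop /user/history/done
sudo -u hdfs hdfs dfs -chmod 755 /user/history/done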

After making these changes, restart the role, and JobHistoryServer starts successfully.

Exception after JobHistoryServer starts successfully

After JobHistoryServer started successfully, the role log showed the following exception.

ERROR	HistoryFileManager	
Error while trying to scan the directory hdfs://nameservice1:8020/user/history/done_intermediate/hive
org.apache.hadoop.security.AccessControlException: Permission denied: user=mapred, access=READ_EXECUTE, inode="/user/history/done_intermediate/hive":hive:hadoop:drwxrwx---

This error indicates that the mapred user does not have sufficient permission to read and execute (list) the directory /user/history/done_intermediate/hive. The directory's permissions are set to drwxrwx---, which means:

The directory owner (hive) has read, write, and execute permission.

The directory's group (hadoop) has read, write, and execute permission.

Other users have no permission at all.

Fix:

Change the directory permissions: use the HDFS command line to change the permissions on the directory so that the mapred user can read and execute its contents. Changing the mode to drwxrwxr-x (775) gives all users read and execute permission, while only the owner and group keep write permission.

hdfs dfs -chmod 775 /user/history/done_intermediate/hive
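Each user that has submitted jobs gets its own subdirectory under done_intermediate, so directories for other users (not only hive) may show the same drwxrwx--- pattern. A sketch that applies the same mode to every existing subdirectory, assuming that layout and run as the hdfs superuser (the quoted glob is expanded by the HDFS shell, not by the local shell):

sudo -u hdfs hdfs dfs -chmod 775 '/user/history/done_intermediate/*'

An alternative to loosening the mode is to make sure mapred resolves as a member of the hadoop group on the NameNode, since that group already has full access to these directories.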

