JobHistoryServer fails to start when restarting the YARN component in CDH

2024-06-17

Problem Description

When restarting the JobHistoryServer service of the YARN component in CDH, the following errors occurred.


Error 1:

Stopping JobHistory
10:44:40.553 AM    INFO    AbstractService
Service org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.tryCreatingHistoryDirs(HistoryFileManager.java:696)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.createHistoryDirs(HistoryFileManager.java:630)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:591)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:108)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:150)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:226)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:236)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied. user=mapred is not the owner of inode=/user/history/done
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:303)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:270)
    at org.apache.sentry.hdfs.SentryINodeAttributesProvider$SentryPermissionEnforcer.checkPermission(SentryINodeAttributesProvider.java:86)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1855)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1839)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:1784)
    at org.apache.hadoop.hdfs.server.namenode.FSDirAttrOp.setPermission(FSDirAttrOp.java:64)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1861)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:856)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:509)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
    at org.apache.hadoop.ipc.Client.call(Client.java:1445)
    at org.apache.hadoop.ipc.Client.call(Client.java:1355)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
    at com.sun.proxy.$Proxy17.setPermission(Unknown Source)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy17.setPermission(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.setPermis

Error 2:

ERROR    JobHistoryServer
Error starting JobHistoryServer
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://nameservice1:8020/user/history/done]
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.tryCreatingHistoryDirs(HistoryFileManager.java:696)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.createHistoryDirs(HistoryFileManager.java:630)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:591)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:108)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:150)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:226)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:236)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied. user=mapred is not the owner of inode=/user/history/done
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:303)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:270)
    at org.apache.sentry.hdfs.SentryINodeAttributesProvider$SentryPermissionEnforcer.checkPermission(SentryINodeAttributesProvider.java:86)
    ...
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Serve

Cause Analysis:

Error 1:

While setting up the /user/history/done directory, the mapred user does not have sufficient permission: the Caused by line shows the NameNode rejecting a setPermission call because mapred is not the owner of the inode. To resolve this, we need to make sure the mapred user has the correct ownership and permissions on /user/history/done.

Error 2:

This error occurs because, at startup, the JobHistoryServer cannot finish creating the hdfs://nameservice1:8020/user/history/done directory: the current user, mapred, is not the owner of the directory and therefore does not have sufficient permission.
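
The done directory in the error message comes from the job history configuration. In CDH these settings are managed through Cloudera Manager, but they correspond to the standard Hadoop properties mapreduce.jobhistory.done-dir and mapreduce.jobhistory.intermediate-done-dir. As a rough check (the path below is the usual CDH client configuration location and may differ on your cluster), you can grep the generated mapred-site.xml:

# Assumed CDH client config location; adjust to your cluster's configuration directory.
grep -A1 -E 'mapreduce\.jobhistory\.(intermediate-)?done-dir' /etc/hadoop/conf/mapred-site.xml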


Solution:

1. Check the current permissions:

Use the following command to view the ownership and permissions of the /user/history/done directory itself (the -d flag lists the directory entry rather than its contents).

hdfs dfs -ls -d /user/history/done

Confirm whether the mapred user is the owner of the directory, or at least has write permission on it.
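
For reference, the output looks something like the line below (this sample is purely illustrative); the third and fourth columns are the owner and group. An owner other than mapred, as in this sample, is exactly what produces the error above.

drwxrwx---   - hdfs supergroup          0 2024-06-17 10:30 /user/history/done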

2. Change the directory ownership and permissions:

If the mapred user is not the owner of the directory, use the following command to change the ownership to mapred.

hdfs dfs -chown mapred /user/history/done

If the mapred user needs write permission, use the following command to grant it.

hdfs dfs -chmod 755 /user/history/done
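
Note that only the HDFS superuser (or the current owner) is allowed to change ownership, so in practice these commands are usually run as the hdfs user; on a Kerberos-secured cluster you would first authenticate with an appropriate keytab. A minimal sketch, assuming hdfs is the superuser account on the cluster:

# Run as the HDFS superuser; -R also fixes any existing subdirectories under done/.
sudo -u hdfs hdfs dfs -chown -R mapred /user/history/done
sudo -u hdfs hdfs dfs -chmod 755 /user/history/done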

After that, restart the service; the JobHistoryServer now starts successfully.

Exception after the JobHistoryServer starts successfully

After the JobHistoryServer started successfully, checking the role log showed that the following exception was being reported.

ERROR	HistoryFileManager	
Error while trying to scan the directory hdfs://nameservice1:8020/user/history/done_intermediate/hive
org.apache.hadoop.security.AccessControlException: Permission denied: user=mapred, access=READ_EXECUTE, inode="/user/history/done_intermediate/hive":hive:hadoop:drwxrwx---

This error indicates that the mapred user does not have sufficient permission to read and execute (list) the contents of the directory /user/history/done_intermediate/hive. The directory's permissions are set to drwxrwx---, which means:

The directory owner (hive) has read, write, and execute permission.

The directory's group (hadoop) has read, write, and execute permission.

Other users have no permissions at all.
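
To see which per-user subdirectories under done_intermediate carry this restrictive permission set (other users' directories may trigger the same scan error), list the intermediate directory with a user that is allowed to read it, for example the HDFS superuser:

sudo -u hdfs hdfs dfs -ls /user/history/done_intermediate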

Fix:

Change the directory permissions: use the HDFS command line to change the permissions on the HDFS directory so that the mapred user can read and execute (list) its contents. The permissions can be changed to drwxrwxr-x, so that all users have read and execute permission while only the owner and group have write permission.

hdfs dfs -chmod 775 /user/history/done_intermediate/hive
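
If other user subdirectories under done_intermediate report the same scan error, the same change can be applied to all of them at once. A minimal sketch, assuming the command is run as the HDFS superuser (hdfs dfs expands the quoted glob itself):

# Apply the same permissions to every existing per-user subdirectory.
sudo -u hdfs hdfs dfs -chmod 775 '/user/history/done_intermediate/*'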