After enabling Kerberos authentication on Hadoop, `hadoop fs -ls` fails on the command line — looking for help

Hadoop version: Apache Hadoop 2.7.3, JDK 1.7

Running `hadoop fs -ls` produces the following error:

[hadoop@hadoop01 native]$ hadoop fs -ls
17/08/01 01:33:36 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "hadoop01/192.168.148.129"; destination host is: "hadoop01":9000;
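As an aside, applications can sidestep the ticket-cache lookup that fails here by logging in from a keytab through Hadoop's `UserGroupInformation` API. A minimal sketch; the keytab path is an assumption for this cluster, and the principal is the one from the question:

```java
// Sketch: explicit keytab login for a Hadoop client, bypassing the
// credential cache entirely. Keytab path is hypothetical.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Log in from a keytab instead of the ticket cache.
        UserGroupInformation.loginUserFromKeytab(
                "hadoop/hadoop01@HADOOP.COM",
                "/etc/security/keytabs/hadoop.keytab");
        try (FileSystem fs = FileSystem.get(conf)) {
            for (FileStatus st : fs.listStatus(new Path("/"))) {
                System.out.println(st.getPath());
            }
        }
    }
}
```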

`klist` shows that the credential cache does exist:
[hadoop@hadoop01 native]$ klist
Ticket cache: KEYRING:persistent:1001:1001
Default principal: hadoop/hadoop01@HADOOP.COM

Valid starting Expires Service principal
08/01/2017 01:12:54 08/02/2017 01:12:54 krbtgt/HADOOP.COM@HADOOP.COM
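Note the cache type in that output: `KEYRING:persistent:1001:1001`. The JDK's Kerberos implementation, which `hadoop fs` runs on, only reads `FILE:`-type credential caches, so a ticket stored in the kernel keyring is invisible to it even though `klist` sees it fine. One way to confirm this without editing krb5.conf is to force a file-based cache for the session; a sketch assuming an MIT Kerberos client and the same principal:

```shell
# Point KRB5CCNAME at a file-based cache and re-authenticate into it.
# Java's GSS-API can read FILE: caches but not kernel KEYRING caches.
export KRB5CCNAME=FILE:/tmp/krb5cc_$(id -u)
kinit hadoop/hadoop01@HADOOP.COM
klist           # Ticket cache should now show FILE:/tmp/krb5cc_<uid>
hadoop fs -ls   # retry the failing command against the file-based cache
```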

The web UI at http://192.168.148.129:50070/dfshealth.html#tab-overview also works fine:
Configured Capacity: 55.38 GB
DFS Used: 16 KB (0%)
Non DFS Used: 11.4 GB
DFS Remaining: 43.99 GB (79.42%)
Block Pool Used: 16 KB (0%)
DataNodes usages% (Min/Median/Max/stdDev): 0.00% / 0.00% / 0.00% / 0.00%
Live Nodes 2 (Decommissioned: 0)
Dead Nodes 0 (Decommissioned: 0)
Decommissioning Nodes 0
Total Datanode Volume Failures 0 (0 B)
Number of Under-Replicated Blocks 0
Number of Blocks Pending Deletion 0
Block Deletion Start Time 2017/8/1 10:12:21 AM

1 answer

Since no one has answered after all this time, answering my own question:
When configuring krb5.conf, comment out the `default_ccache_name` setting. Hadoop locates the credential cache based on this value: with `default_ccache_name = KEYRING:persistent:%{uid}`, tickets go into the kernel keyring, which the JDK's Kerberos library cannot read. Commenting the line out falls back to the default file-based cache (`/tmp/krb5cc_<uid>`), which Java can find.

[libdefaults]
dns_lookup_realm = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
default_realm = HADOOP.COM
#default_ccache_name = KEYRING:persistent:%{uid}
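After commenting out that line, re-authenticate so the new ticket lands in the file-based default cache. A sketch of the steps on an MIT Kerberos client, using the principal from this cluster:

```shell
kdestroy -A                        # drop the old keyring-based tickets
kinit hadoop/hadoop01@HADOOP.COM   # new TGT lands in FILE:/tmp/krb5cc_<uid>
klist                              # verify: "Ticket cache: FILE:/tmp/krb5cc_1001"
hadoop fs -ls                      # should now find the TGT and succeed
```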
