Access Hadoop with Kerberos failed
I installed Hadoop and Kerberos, but when I execute hadoop fs -ls / the following error occurs:
[dannil@ozcluster06 logs]$ hadoop fs -ls /
16/09/13 11:34:39 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "localhost/127.0.0.1"; destination host is: "192.168.168.46":9000;
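To get more detail on why the GSS handshake fails, one option is to rerun the command with the JDK's Kerberos debug flag turned on (a minimal sketch, assuming a bash shell and that hadoop-env.sh does not overwrite HADOOP_OPTS):

# Print the JDK's Kerberos negotiation details for this one command
HADOOP_OPTS="-Dsun.security.krb5.debug=true" hadoop fs -ls /

This usually shows which credential cache the client opened and which principal, if any, it found there.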
I can see with jps that the DataNode and NameNode have started:
20963 DataNode
21413 SecondaryNameNode
20474 NameNode
22906 Jps
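Since a secured NameNode and DataNode log their own keytab login at startup, it may also be worth checking the daemon logs for that message (a sketch; the exact log file name depends on your user and hostname):

# Look for the UserGroupInformation keytab-login line in the NameNode log
grep -i "login successful" /home/dannil/hadoop-2.7.1/logs/hadoop-dannil-namenode-*.log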
I added the principals hdfs/oz.flex@OZ.FLEX and HTTP/oz.flex@OZ.FLEX, then used
xst -norandkey -k hdfs.keytab hdfs/oz.flex@OZ.FLEX HTTP/oz.flex@OZ.FLEX
to generate hdfs.keytab.
kadmin.local: listprincs
HTTP/oz.flex@OZ.FLEX
K/M@OZ.FLEX
dannil/admin@OZ.FLEX
hdfs/oz.flex@OZ.FLEX
kadmin/admin@OZ.FLEX
kadmin/changepw@OZ.FLEX
kadmin/ozcluster06@OZ.FLEX
kiprop/ozcluster06@OZ.FLEX
krbtgt/OZ.FLEX@OZ.FLEX
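To rule out a problem with the exported keytab itself, its entries, key versions, and encryption types can be listed (a sketch, using the same path as in the kinit command below):

# Show principals, kvno, timestamps and enctypes stored in the keytab
klist -k -t -e /home/dannil/hadoop-2.7.1/hdfs.keytab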
Then I executed kinit -kt /home/dannil/hadoop-2.7.1/hdfs.keytab hdfs/oz.flex
and I can see the ticket status:
[dannil@ozcluster06 ~]$ klist
Ticket cache: KEYRING:persistent:1000:krb_ccache_4h73pla
Default principal: hdfs/oz.flex@OZ.FLEX
Valid starting       Expires              Service principal
2016-09-13T10:47:06  2016-09-14T10:47:06  krbtgt/OZ.FLEX@OZ.FLEX
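One thing I would double-check here (an assumption on my part, not something visible in the logs): the TGT above sits in a KEYRING:persistent credential cache, and the JVM's Kerberos implementation generally reads only FILE-based caches, so the hadoop client may simply not see this ticket. A quick test is to kinit into a plain file cache and retry (the cache path below is only illustrative):

# Switch this shell to a file-based credential cache and re-authenticate
export KRB5CCNAME=FILE:/tmp/krb5cc_dannil
kinit -kt /home/dannil/hadoop-2.7.1/hdfs.keytab hdfs/oz.flex
klist
hadoop fs -ls /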
These are my Hadoop configuration values:
core-site.xml:
fs.defaultFS=hdfs://192.168.168.46:9000
hadoop.security.authentication=kerberos
hadoop.security.authorization=true
hdfs-site.xml:
dfs.replication=1
dfs.permissions=false
dfs.block.access.token.enable=true
dfs.namenode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.namenode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.namenode.kerberos.internal.spnego.principal=HTTP/oz.flex@OZ.FLEX
dfs.secondary.namenode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.secondary.namenode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.secondary.namenode.kerberos.internal.spnego.principal=HTTP/oz.flex@OZ.FLEX
dfs.datanode.data.dir.perm=700
dfs.datanode.address=0.0.0.0:61004
dfs.datanode.http.address=0.0.0.0:61006
dfs.datanode.keytab.file=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.datanode.kerberos.principal=hdfs/oz.flex@OZ.FLEX
dfs.https.port=50470
dfs.https.address=0.0.0.0:50470
dfs.webhdfs.enabled=true
dfs.web.authentication.kerberos.principal=HTTP/oz.flex@OZ.FLEX
dfs.web.authentication.kerberos.keytab=/home/dannil/hadoop-2.7.1/hdfs.keytab
dfs.http.policy=HTTPS_ONLY
dfs.data.transfer.protection=integrity
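The listings above are summaries of the XML files; to confirm the values the client actually resolves at runtime, they can be read back with hdfs getconf (a sketch, using keys quoted above):

# Verify the effective client-side configuration
hdfs getconf -confKey fs.defaultFS
hdfs getconf -confKey hadoop.security.authentication
hdfs getconf -confKey dfs.namenode.kerberos.principal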
How did this error occur, and how should I solve the problem?