hadoop-user mailing list archives

From Daryn Sharp <da...@yahoo-inc.com>
Subject Re: 'Can't get service ticket for: host/0.0.0.0' when running hdfs with kerberos
Date Fri, 14 Sep 2012 15:25:54 GMT
Is your default kerberos realm set to "EXAMPLETEST.COM"?  If
not, have you tried grepping your confs for "EXAMPLETEST.COM"?
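(A quick way to do that grep, sketched here against a throwaway directory so it runs anywhere; on the cluster, point `conf_dir` at the real conf directory, e.g. /usr/hadoop/hadoop-1.0.1/conf:)

```shell
# Search every config file for the unexpected realm string; any hit shows
# where "EXAMPLETEST.COM" is coming from. The throwaway dir below only
# stands in for the real conf directory so the sketch is self-contained.
conf_dir=$(mktemp -d)
printf '<value>hdfs/_HOST@EXAMPLETEST.COM</value>\n' > "$conf_dir/hdfs-site.xml"
grep -Rl "EXAMPLETEST.COM" "$conf_dir"
```

It is also worth checking the client-side default realm, e.g. `grep default_realm /etc/krb5.conf`.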

Daryn

On Sep 12, 2012, at 5:37 PM, jack chrispoo wrote:

Hi,

I'm using Hadoop 1.0.1. I tried to follow https://ccp.cloudera.com/display/CDHDOC/Configuring+Hadoop+Security+in+CDH3+%28KSSL%29
to configure Hadoop with Kerberos authentication. I configured the KDC, added hdfs, mapred,
and host principals for each node to Kerberos, and deployed the keytabs to each node.

I modified core-site.xml, hdfs-site.xml, and hadoop-env.sh as below, and then tried to start dfs
using sudo hadoop_dir/bin/start-dfs.sh

The NameNode and DataNodes started without error, and from namenode:50070 I can see that all
DataNodes are live. I can create directories and ls in hdfs using the hadoop command. But one thing
I'm confused about: earlier, when I started hdfs without Kerberos, 'jps' would show on the namenode
a pid with 'NameNode':

  3239 NameNode

and on the datanode a pid with 'DataNode':

  24307 DataNode

But now 'jps' shows a pid with 'NameNode' on the namenode,

  3239 NameNode

but a pid without any name on the datanode:

  # jps
  2931 Jps
  2684

I guess this process 2684 is the DataNode, because if I run 'sudo hadoop_dir/bin/stop-dfs.sh'
the process goes away. Has anyone seen this before? Why doesn't it show 'DataNode'?
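(As a sketch of how to check what an unnamed jps entry is: jps typically prints no main-class name for JVMs launched through jsvc, which is how a secure DataNode is started when HADOOP_SECURE_DN_USER is set, but the full command line still identifies the process. The pid below is demoed on the current shell; substitute the pid that jps printed, e.g. 2684.)

```shell
# Show the full command line for an otherwise-unnamed pid from jps.
# Demo on our own pid so the sketch runs anywhere; on the datanode,
# set pid=2684 (the number jps reported).
pid=$$
ps -p "$pid" -o args=
```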

Also, a while after I started hdfs, NameNode's log showed some error:

2012-09-12 14:31:06,335 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException
as:host/node0@EXAMPLETEST.COM cause:java.io.IOException:
Can't get service ticket for: host/0.0.0.0
2012-09-12 14:31:06,335 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException
as:host/node0@EXAMPLETEST.COM cause:java.io.IOException: Can't
get service ticket for: host/0.0.0.0
2012-09-12 14:31:06,358 WARN org.mortbay.log: /getimage: java.io.IOException: GetImage failed.
java.io.IOException: Can't get service ticket for: host/0.0.0.0
        at org.apache.hadoop.security.SecurityUtil.fetchServiceTicket(SecurityUtil.java:138)
        at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.getFileClient(TransferFsImage.java:158)
        at org.apache.hadoop.hdfs.server.namenode.GetImageServlet$1$1.run(GetImageServlet.java:88)
        at org.apache.hadoop.hdfs.server.namenode.GetImageServlet$1$1.run(GetImageServlet.java:85)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.hdfs.server.namenode.GetImageServlet$1.run(GetImageServlet.java:85)
        at org.apache.hadoop.hdfs.server.namenode.GetImageServlet$1.run(GetImageServlet.java:70)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.hdfs.server.namenode.GetImageServlet.doGet(GetImageServlet.java:70)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1221)
        at org.apache.hadoop.security.Krb5AndCertsSslSocketConnector$Krb5SslFilter.doFilter(Krb5AndCertsSslSocketConnector.java:221)
.......

It seems like the namenode is trying to get a kerberos ticket for the datanode (in hdfs-site.xml,
dfs.datanode.address is set to 0.0.0.0:1004 and dfs.datanode.http.address
to 0.0.0.0:1006) but failing. From what I found by googling, the 0.0.0.0 in the principal
is said to be related to reverse DNS; from my node I can use 'host ip-address' to get
the host name, so reverse DNS should be working. So what could have caused these errors?
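(A quick forward-then-reverse check along those lines, sketched with getent; this is an illustration of the consistency being checked, not Hadoop's exact lookup path:)

```shell
# Forward and reverse DNS should agree for each node; when they don't,
# a principal like host/0.0.0.0 can be requested instead of host/<fqdn>.
fqdn=$(hostname -f)
ip=$(getent hosts "$fqdn" | awk '{print $1}')
echo "forward: $fqdn -> ${ip:-<no address>}"
# Reverse: this should map back to $fqdn.
[ -n "$ip" ] && getent hosts "$ip" || true
```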

Please give me some clue to this,
Thanks!
jack


Configuration:

added to core-site.xml:
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value> <!-- A value of "simple" would disable security.
-->
</property>

<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

to hdfs-site.xml:
  <!-- General HDFS security config -->
  <property>
    <name>dfs.block.access.token.enable</name>
    <value>true</value>
  </property>

  <!-- NameNode security config -->
  <property>
    <name>dfs.https.address</name>
    <value>c10i-bl0.us.oracle.com:50470</value>
  </property>
  <property>
    <name>dfs.https.port</name>
    <value>50470</value>
  </property>
  <property>
    <name>dfs.namenode.keytab.file</name>
    <value>/usr/hadoop/hadoop-1.0.1/conf/hdfs.keytab</value> <!-- path to the
HDFS keytab -->
  </property>
  <property>
    <name>dfs.namenode.kerberos.principal</name>
    <value>hdfs/_HOST@CLOUDPERF.COM</value>
  </property>
  <property>
    <name>dfs.namenode.kerberos.https.principal</name>
    <value>host/_HOST@CLOUDPERF.COM</value>
  </property>

  <!-- Secondary NameNode security config -->
  <property>
    <name>dfs.secondary.https.address</name>
    <value>c10i-bl0.us.oracle.com:50495</value>
  </property>
  <property>
    <name>dfs.secondary.https.port</name>
    <value>50495</value>
  </property>
  <property>
    <name>dfs.secondary.namenode.keytab.file</name>
    <value>/usr/hadoop/hadoop-1.0.1/conf/hdfs.keytab</value> <!-- path to the
HDFS keytab -->
  </property>
  <property>
    <name>dfs.secondary.namenode.kerberos.principal</name>
    <value>hdfs/_HOST@CLOUDPERF.COM</value>
  </property>
  <property>
    <name>dfs.secondary.namenode.kerberos.https.principal</name>
    <value>host/_HOST@CLOUDPERF.COM</value>
  </property>

  <!-- DataNode security config -->
  <property>
    <name>dfs.datanode.data.dir.perm</name>
    <value>700</value>
  </property>
  <property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:1004</value>
  </property>
  <property>
    <name>dfs.datanode.http.address</name>
    <value>0.0.0.0:1006</value>
  </property>
  <property>
    <name>dfs.datanode.keytab.file</name>
    <value>/usr/hadoop/hadoop-1.0.1/conf/hdfs.keytab</value> <!-- path to the
HDFS keytab -->
  </property>
  <property>
    <name>dfs.datanode.kerberos.principal</name>
    <value>hdfs/_HOST@CLOUDPERF.COM</value>
  </property>
  <property>
    <name>dfs.datanode.kerberos.https.principal</name>
    <value>host/_HOST@CLOUDPERF.COM</value>
  </property>
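(For reference: the _HOST token in these principals is expanded at startup to each node's canonical hostname, so a single config file can be shared cluster-wide. A rough sketch of the substitution, as an illustration rather than Hadoop's exact code path:)

```shell
# Illustrate _HOST expansion: Hadoop substitutes the node's canonical
# hostname into the configured principal at startup.
principal="hdfs/_HOST@CLOUDPERF.COM"
echo "$principal" | sed "s/_HOST/$(hostname -f)/"
```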

and to hadoop-env.sh:
export HADOOP_SECURE_DN_USER=hdfs



