hadoop-hdfs-issues mailing list archives

From "prophy Yan (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HDFS-7657) Failed to start datanode in secure mode with ClassNotFoundException, but the jar is on the classpath
Date Thu, 22 Jan 2015 08:54:34 GMT

     [ https://issues.apache.org/jira/browse/HDFS-7657?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

prophy Yan resolved HDFS-7657.
------------------------------
    Resolution: Fixed

This was a jsvc problem. I built jsvc from the commons-daemon source, and that resolved it.
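
For reference, building jsvc from the commons-daemon native source usually goes along these lines. This is a sketch only: the version (chosen to match the commons-daemon-1.0.15.jar visible on the classpath below) and the install path are illustrative, not taken from this report.

    # Sketch: rebuild jsvc from the commons-daemon native source.
    # Adjust version, URL and paths for your own system.
    wget https://archive.apache.org/dist/commons/daemon/source/commons-daemon-1.0.15-src.tar.gz
    tar xzf commons-daemon-1.0.15-src.tar.gz
    cd commons-daemon-1.0.15-src/src/native/unix
    ./configure --with-java=$JAVA_HOME   # requires gcc and make
    make
    # Make Hadoop use the new binary, e.g. by replacing the old one
    # (destination path is an assumption based on the command below):
    cp jsvc /usr/bin/jsvc

Alternatively, point JSVC_HOME in hadoop-env.sh at the directory containing the freshly built binary.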

> Failed to start datanode in secure mode with ClassNotFoundException, but the jar is on the classpath
> ----------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-7657
>                 URL: https://issues.apache.org/jira/browse/HDFS-7657
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: datanode
>    Affects Versions: 2.6.0
>         Environment: centos 6.5
> hadoop-2.6.0
>            Reporter: prophy Yan
>
> I use root to start the datanode in secure mode, but it doesn't start.
> In jsvc.err, the error is:
> "Service exit with a return value of 3
> java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter"
> but the jar file is on the classpath.
> I set $HADOOP_HOME/bin/hdfs to debug mode, and it shows the command is:
> /usr/bin/jsvc -Dproc_datanode -outfile /var/log/hadoop//jsvc.out -errfile /var/log/hadoop//jsvc.err
> -pidfile /var/run/hadoop/hadoop_secure_dn.pid -nodetach -user hdfs -cp '/usr/lib/hadoop/commons-daemon-1.0.15.jar:/etc/hadoop:/usr/lib/hadoop/share/hadoop/common/lib/*:/usr/lib/hadoop/share/hadoop/common/*:/usr/lib/hadoop/share/hadoop/hdfs:/usr/lib/hadoop/share/hadoop/hdfs/lib/*:/usr/lib/hadoop/share/hadoop/hdfs/*:/usr/lib/hadoop/share/hadoop/yarn/lib/*:/usr/lib/hadoop/share/hadoop/yarn/*:/usr/lib/hadoop/share/hadoop/mapreduce/lib/*:/usr/lib/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
> -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop -Dhadoop.log.file=hadoop.log
> -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native
> -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/
> -Dhadoop.id.str=hdfs -jvm server -Dhadoop.security.logger=ERROR,RFAS -Dhadoop.security.logger=INFO,NullAppender
> org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter
> i add "for f in /usr/lib/hadoop/share/hadoop/common/*.jar; do
>   export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$f
> done
> .......,and so on
> "  to the hadoop-env.sh, then the datanode can start
> so is this mean the jsvc can not recognize the -cp 'hadoop_home/lib/*'?
> it just can recognize the -cp 'hadoop_home/lib/hadoop-hdfs.jar', like this?
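
The symptom described above is consistent with a jsvc binary that does not expand wildcard classpath entries. Entries like 'lib/*' are expanded by the java launcher, not by the JVM itself; jsvc starts the JVM through the JNI invocation API, so it has to expand them itself, and older jsvc builds reportedly do not, passing the literal '*' through so the class is never found. One illustrative way to check whether the wildcards are the variable (a sketch, not part of this report):

    # Sketch: load the same class through the plain java launcher, which
    # always expands '*' classpath entries. This will not start a secure
    # datanode (SecureDataNodeStarter has no main method), but a
    # "main method not found" error here means the class resolved, while
    # jsvc on the same classpath throws ClassNotFoundException; that
    # difference would point at the jsvc binary.
    java -cp '/usr/lib/hadoop/share/hadoop/hdfs/*:/usr/lib/hadoop/share/hadoop/common/*' \
      org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter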



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
