hadoop-common-user mailing list archives

From deepak rosario tharigopla <rozartharigo...@gmail.com>
Subject Re: stop-dfs.sh does not work
Date Wed, 10 Jul 2013 06:03:16 GMT
You can browse to the JDK root directory and check whether jps is there
(JDK 1.6 ships with jps, but OpenJDK does not, and Sun JDK 6 is
preferable for Hadoop). Simply run jps and it will list all the Java
processes running on the machine.

jps is a good, handy command. You can add the following to the .bashrc
file in /home/<user>/:
unalias jps 2> /dev/null
alias jps="/usr/lib/jvm/jdk1.6.0_43/bin/jps"

Then log out and log back in, so that you can use jps from anywhere
without having to change into the JDK directory to execute it.
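Once jps works, a gentler alternative to kill -9 is to look up the
daemon's PID with jps and send a plain SIGTERM first. This is only a
sketch of that idea (not from the original mail); it assumes jps is on
the PATH and a NameNode is running:

```shell
# Find the NameNode PID via jps (match the class name exactly, so
# SecondaryNameNode is not picked up), then try a graceful kill.
pid=$(jps | awk '$2 == "NameNode" {print $1}' | head -n 1)
if [ -n "$pid" ]; then
  kill "$pid"          # SIGTERM: lets the daemon shut down cleanly
else
  echo "no NameNode process found"
fi
```

Only fall back to kill -9 if the daemon ignores the plain kill.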

On Wed, Jul 10, 2013 at 12:30 AM, YouPeng Yang <yypvsxf19870706@gmail.com>wrote:

> Hi users,
>     I start my HDFS using start-dfs.sh, and the nodes start
> successfully.
> However, stop-dfs.sh does not work when I want to stop HDFS.
> It shows: no namenode to stop
>           no datanode to stop.
> I have to stop it with the command: kill -9 pid.
> So I wonder why stop-dfs.sh no longer works.
> Best regards

Thanks & Regards
Deepak Rosario Pancras
