hadoop-common-user mailing list archives

From "Richard Zhang" <richardtec...@gmail.com>
Subject Re: Help: how to check the active datanodes?
Date Fri, 04 Jul 2008 02:11:41 GMT
Thanks Mafish. That page can be used to track the status of the nodes.
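
In case it helps anyone else hitting this: you can also sanity-check that page from
the command line. This is only a rough sketch, assuming dfs.http.address was left at
its usual default (port 50070) and using "namenode-host" as a placeholder for the real
NameNode hostname; the exact page name (dfshealth.jsp here) may also differ between
Hadoop versions, so adjust all of these for your cluster.

    # Fetch the HDFS health page served from the NameNode's embedded web server.
    # The port comes from the dfs.http.address property (50070 unless overridden).
    curl http://namenode-host:50070/dfshealth.jsp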

On Thu, Jul 3, 2008 at 6:43 PM, Mafish Liu <mafish@gmail.com> wrote:

> Hi, Zhang:
>   Once you start Hadoop with the start-all.sh script, a Hadoop status page
> can be accessed at http://namenode-ip:port/dfshealth. The port is specified
> by
>
>    <name>dfs.http.address</name>
>
> in your hadoop-default.xml.
>    If the datanodes' status is not as expected, you need to check the log
> files. They show the details of the failure.
>
> On Fri, Jul 4, 2008 at 4:17 AM, Richard Zhang <richardtechzh@gmail.com>
> wrote:
>
> > Hi guys:
> > I am running Hadoop on an 8-node cluster. I use start-all.sh to start
> > Hadoop, and it shows that all 8 data nodes are started. However, when I
> > use bin/hadoop dfsadmin -report to check the status of the data nodes,
> > it shows that only one data node (the one on the same host as the name
> > node) is active. How can we know precisely whether all the data nodes
> > are active? Has anyone dealt with this before?
> > Thanks.
> > Richard
> >
>
>
>
> --
> Mafish@gmail.com
> Institute of Computing Technology, Chinese Academy of Sciences, Beijing.
>
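
P.S. For the dfsadmin -report side of this, here is a rough sketch of counting the
datanodes the NameNode currently lists. It assumes each datanode entry in the report
output begins with a "Name:" line, so double-check the count against the full report
before relying on it.

    # Print the full cluster report from the NameNode.
    bin/hadoop dfsadmin -report

    # Count the per-datanode "Name:" entries in that report.
    bin/hadoop dfsadmin -report | grep -c "^Name:"

If the count stays at 1 while all 8 datanodes were started, the datanode log files
Mafish mentioned are the next place to look.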
