hadoop-hdfs-user mailing list archives

From Vitaliy Semochkin <vitaliy...@gmail.com>
Subject Re: default masters/slaves content question
Date Fri, 25 Jun 2010 09:19:57 GMT
Thank you very much for the reply.

Is there any reason the masters file contains localhost in the default
installation of Hadoop?
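
For context, a rough sketch of the two conf files in question (assuming the
standard conf/ layout shipped with the tarball; as explained below,
start-dfs.sh starts the secondary namenode on every host in masters and a
datanode on every host in slaves; the multi-node hostnames are only
placeholders):

    conf/masters:
        localhost

    conf/slaves:
        localhost

while a small multi-node cluster might instead use something like:

    conf/masters:
        checkpoint-host

    conf/slaves:
        datanode1
        datanode2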

On Thu, Jun 24, 2010 at 10:56 PM, Jitendra Nath Pandey <jitendra@yahoo-inc.com> wrote:

> The start-dfs.sh script will try to start the secondary namenode on the node
> listed in the masters file.
>
> Datanodes don’t connect to the secondary namenode. The slaves file should
> contain the hostnames where you want to start a datanode; it can be
> localhost.
>
>
> On 6/24/10 2:56 AM, "Vitaliy Semochkin" <vitaliy.se@gmail.com> wrote:
>
> Hi,
>
> In the default installation, the Hadoop masters and slaves files contain localhost.
> Am I correct that the masters file contains the list of SECONDARY namenodes?
>
> If so, will the localhost node try to start a secondary namenode even if it
> already has one?
> Moreover, will datanodes try to contact themselves in order to reach the
> secondary namenode on localhost?
>
> Should I keep localhost in slaves if I want to run a datanode on the same
> server where I start the cluster and run the namenode?
>
> This is my first Hadoop experience.
> Thanks in advance.
>
> Vitaliy S
>
>
