hadoop-mapreduce-user mailing list archives

From Chris MacKenzie <stu...@chrismackenziephotography.co.uk>
Subject Re: Hadoop YARN Cluster Setup Questions
Date Sat, 23 Aug 2014 16:44:32 GMT
Hi,

The requirement is simply to have the slaves and masters files on the resource manager; they are
used by the shell scripts that start the daemons :-)
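
For illustration, a minimal sketch of what that looks like (the hostnames are made up, and the paths assume a standard Hadoop 2.x layout under $HADOOP_HOME):

```shell
# etc/hadoop/slaves -- one worker hostname per line (hypothetical hosts):
#   slave1
#   slave2
#   slave3
#
# etc/hadoop/masters -- historically listed the secondary namenode host.
#
# The start scripts read the slaves file and ssh to each listed host:
$HADOOP_HOME/sbin/start-dfs.sh    # starts the namenode plus the datanodes
$HADOOP_HOME/sbin/start-yarn.sh   # starts the resourcemanager plus the nodemanagers
```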

Sent from my iPhone

> On 23 Aug 2014, at 16:02, "S.L" <simpleliving016@gmail.com> wrote:
> 
> Ok, I'll copy the slaves file to the other slave nodes as well.
> 
> What about the masters file though?
> 
> Sent from my HTC
> 
> ----- Reply message -----
> From: "rab ra" <rabmdu@gmail.com>
> To: "user@hadoop.apache.org" <user@hadoop.apache.org>
> Subject: Hadoop YARN Cluster Setup Questions
> Date: Sat, Aug 23, 2014 5:03 AM
> 
> Hi,
> 
> 1. Typically, we copy the slaves file to all the participating nodes, though
> I do not have a concrete theory to back this up. At least, this is what I was
> doing in Hadoop 1.2, and I am doing the same in Hadoop 2.x.
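
As a sketch, copying the file out to every node could look like the loop below (it assumes passwordless ssh from the master and that $HADOOP_CONF_DIR points at the config directory on every host):

```shell
# Push the local slaves file to each worker listed in it.
for host in $(cat "$HADOOP_CONF_DIR/slaves"); do
  scp "$HADOOP_CONF_DIR/slaves" "$host:$HADOOP_CONF_DIR/"
done
```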
> 
> 2. I think you should investigate the YARN GUI and see how many maps it
> has spawned. There is a high possibility that both maps are running in
> parallel on the same node. Since there are two splits, there will be two
> map processes, and one node is capable of handling more than one map.
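
The split-to-map relationship above can be checked with back-of-the-envelope arithmetic (the 200 MB file size is a made-up example; 128 MB is the default HDFS block size in Hadoop 2.x):

```shell
# A MapReduce job gets one map task per input split, and the split count
# is roughly ceil(file_size / block_size).
file_size=$((200 * 1024 * 1024))    # hypothetical 200 MB input file
block_size=$((128 * 1024 * 1024))   # default HDFS block size in Hadoop 2.x
splits=$(( (file_size + block_size - 1) / block_size ))  # ceiling division
echo "$splits"                      # prints 2
```

Two splits means two map tasks, but nothing forces YARN to place them on different nodes.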
> 
> 3. There may be no replicas of the input file stored, and since it is small
> it fits in a single block on one node.
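
One way to see where the blocks and replicas of the input actually live is `hdfs fsck` (the path below is a placeholder for your job's input):

```shell
# Show files, block IDs, and the datanodes holding each replica.
hdfs fsck /user/someuser/input -files -blocks -locations

# Report the configured replication factor for comparison.
hdfs getconf -confKey dfs.replication
```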
> 
> These are a few hints which might help you.
> 
> regards
> rab
> 
> 
> 
> On Sat, Aug 23, 2014 at 12:26 PM, S.L <simpleliving016@gmail.com> wrote:
> 
> > Hi Folks,
> >
> > I was not able to find a clear answer to this. I know that on the master
> > node we need to have a slaves file listing all the slaves, but do the
> > slave nodes need a masters file listing the single name node (I am not
> > using a secondary name node)? I only have the slaves file on the master
> > node.
> >
> > The reason I ask is that when I submit a Hadoop job, even though the
> > input is being split into 2 parts, only one data node is assigned
> > applications; the other two (I have three) are not being assigned any.
> >
> > Thanks in advance!
> >
