hadoop-common-user mailing list archives

From Raghu Angadi <rang...@yahoo-inc.com>
Subject Re: adding datanodes on the fly?
Date Tue, 17 Jul 2007 17:49:50 GMT

You can take a look at start-dfs.sh to see what it does.

Pretty much: $ ssh datanode 'cd dir; bin/hadoop-daemon.sh start datanode'

You are strongly encouraged to experiment with the scripts to see what 
they do. When something does not seem to work, also check the 
corresponding log file in the logs/ directory.

> The documentation says to start DFS from the namenode which will startup all
> the datanodes.

This is for the simple, common case.

Raghu.

> Thanks,
> Ankur
> 
> -----Original Message-----
> From: Raghu Angadi [mailto:rangadi@yahoo-inc.com] 
> Sent: Tuesday, July 17, 2007 1:33 PM
> To: hadoop-user@lucene.apache.org
> Subject: Re: adding datanodes on the fly?
> 
> Ankur Sethi wrote:
>> How are datanodes added?  Do they get added and started only at start of
> DFS
>> filesystem?  Can they be added while hadoop fs is running by editing slaves
>> file or does hadoop have to be restarted?
> 
> To add more datanodes, you can just bring up new datanodes with the 
> right config at any time; the namenode will accept them whenever they 
> register.
> 
> The 'slaves' file is used only by scripts like bin/start-dfs.sh, 
> bin/stop-dfs.sh, etc. So adding new datanodes to slaves just makes 
> restarts and the like easier to manage.
> 
> Raghu.
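
To make the point about the slaves file concrete, here is a minimal sketch of how a start script might walk it. The file contents and hostnames below are made up for illustration, and the actual ssh invocation is shown only as a comment; the real start-dfs.sh differs in detail.

```shell
# Hypothetical sketch: iterate a conf/slaves file, one host per line.
# Hostnames are placeholders; the real script runs hadoop-daemon.sh
# over ssh on each listed host.
mkdir -p conf
printf 'node1\nnode2\nnode3\n' > conf/slaves

while read -r host; do
  # Real invocation (not run here):
  #   ssh "$host" 'cd dir; bin/hadoop-daemon.sh start datanode'
  echo "would start datanode on $host"
done < conf/slaves
```

Since the namenode accepts new datanodes at any time, appending a host to the slaves file and starting its daemon by hand has the same effect as listing it before startup; the file only matters to the start/stop scripts.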

