hadoop-hdfs-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Run multiple HDFS instances
Date Thu, 18 Apr 2013 09:22:59 GMT
Yes you can, but if you want the scripts to work, you should have them
use a different PID directory (I think it's called HADOOP_PID_DIR)
every time you invoke them.
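For example, a sketch of that approach (the paths below are illustrative assumptions, not Hadoop defaults):

```shell
# Give each HDFS instance its own config and PID directory so the
# wrapper scripts track the right daemons instead of refusing to start.
export HADOOP_CONF_DIR=/tmp/hadoop-instance2/conf   # per-instance configs (assumed path)
export HADOOP_PID_DIR=/tmp/hadoop-instance2/pids    # per-instance PID files (assumed path)
mkdir -p "$HADOOP_PID_DIR"
# Only attempt the start if the Hadoop scripts are on the PATH.
command -v start-dfs.sh >/dev/null 2>&1 && start-dfs.sh
```

With a distinct HADOOP_PID_DIR per invocation, the "running as process NNNN. Stop it first." check no longer sees the first instance's PID files.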

I instead prefer to start the daemons via their direct commands, such
as "hdfs namenode" and so on, moving them to the background with a
redirect for logging.
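A sketch of that direct approach (the helper function, log paths, and PID-file locations are my own illustrative assumptions):

```shell
# Launch an HDFS daemon directly, backgrounded, with its own log file,
# bypassing the wrapper scripts and their PID-file check entirely.
start_daemon() {
  local name=$1 log=$2
  # nohup keeps the daemon alive after the shell exits;
  # stdout and stderr both go to the per-daemon log file.
  nohup hdfs "$name" > "$log" 2>&1 &
  echo $! > "/tmp/hdfs-${name}.pid"   # remember the PID for a later kill
}

# Guard so the sketch is a no-op on machines without Hadoop installed.
if command -v hdfs >/dev/null 2>&1; then
  start_daemon namenode /tmp/nn-instance2.log
  start_daemon datanode /tmp/dn-instance2.log
fi
```

Shutting an instance down is then just `kill "$(cat /tmp/hdfs-namenode.pid)"` and the same for the datanode.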

On Thu, Apr 18, 2013 at 2:34 PM, Lixiang Ao <aolixiang@gmail.com> wrote:
> Hi all,
> Can I run multiple HDFS instances, that is, n separate namenodes and n
> datanodes, on a single machine?
> I've modified core-site.xml and hdfs-site.xml to avoid port and file
> conflicts between the HDFS instances, but when I started the second one,
> I got these errors:
> Starting namenodes on [localhost]
> localhost: namenode running as process 20544. Stop it first.
> localhost: datanode running as process 20786. Stop it first.
> Starting secondary namenodes []
> secondarynamenode running as process 21074. Stop it first.
> Is there a way to solve this?
> Thank you in advance,
> Lixiang Ao

Harsh J
