hadoop-hdfs-user mailing list archives

From Brahma Reddy <brahmared...@huawei.com>
Subject RE: How to run multiple data nodes and multiple task trackers on single server.
Date Mon, 04 Jul 2011 04:41:09 GMT
Hi Xiaobo Gu,

We can run multiple data nodes on a single machine by doing the following:

i) In hadoop/conf/hadoop-env.sh

Give each instance its own directory for the following property.

For Example:

# The directory where PID files are stored (default: /tmp)
export HADOOP_PID_DIR=/var/processname
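As a concrete sketch, the hadoop-env.sh of a second datanode instance might set something like the following; the directory paths here are assumptions for illustration, not fixed names:

```shell
# Hypothetical hadoop-env.sh fragment for a SECOND datanode instance.
# Paths are illustrative; any per-instance directories will do, and they
# must exist (with write permission) before the daemon starts.
export HADOOP_PID_DIR=/var/hadoop/dn2/pids   # default would be /tmp
export HADOOP_LOG_DIR=/var/hadoop/dn2/logs   # keep the logs separate too
```

Each instance reading a different hadoop-env.sh gets its own PID and log directories, so the daemons do not clobber each other's PID files.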

ii) We have to assign each instance its own listening ports.

For example, using free ports available on the machine, set the following
differently for each instance:

The address where the datanode server listens

The datanode HTTP server address and port

The datanode IPC server address and port

Similarly, we can give different ports for the namenode, secondary
namenode, jobtracker and tasktracker.
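As a sketch, the per-instance hdfs-site.xml for a second datanode could look like the following. The property names are the standard ones for Hadoop of this vintage; the port numbers and the storage path are arbitrary free choices for illustration, not values from the original message:

```xml
<!-- Hypothetical hdfs-site.xml fragment for a SECOND datanode instance -->
<property>
  <!-- the address where the datanode server listens -->
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:50110</value>
</property>
<property>
  <!-- the datanode HTTP server address and port -->
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:50175</value>
</property>
<property>
  <!-- the datanode IPC server address and port -->
  <name>dfs.datanode.ipc.address</name>
  <value>0.0.0.0:50120</value>
</property>
<property>
  <!-- each instance also needs its own storage directory -->
  <name>dfs.data.dir</name>
  <value>/var/hadoop/dn2/data</value>
</property>
```

Only the ports and directories need to differ between instances; everything else can be shared.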


-----Original Message-----
From: Xiaobo Gu [mailto:guxiaobo1982@gmail.com] 
Sent: Friday, July 01, 2011 9:05 PM
To: hdfs-user@hadoop.apache.org
Subject: How to run multiple data nodes and multiple task trackers on single server

   I am following the guides in
with Hadoop on Solaris 10u9 X64, but I have revised the
hadoop-daemon.sh call to

hadoop-daemon.sh  $1 datanode $DN_CONF_OPTS

But it failed with the following error messages:

-bash-3.00$ run-additionalDN.sh start 1
starting datanode, logging to
Usage: java DataNode
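A "Usage: java DataNode" message generally means the DataNode class received arguments it did not recognize, e.g. when $DN_CONF_OPTS is empty or malformed. One common pattern, sketched here with assumed script and directory names, is to point hadoop-daemon.sh at a per-instance configuration directory via its --config flag instead of passing raw options through to DataNode:

```shell
#!/bin/sh
# Hypothetical run-additionalDN.sh sketch: select a per-instance conf
# directory with hadoop-daemon.sh's --config flag rather than handing
# unrecognized options to the DataNode class.
cmd=${1:-start}                    # start or stop
n=${2:-1}                          # instance number
conf_dir=/etc/hadoop/conf-dn$n     # assumed per-instance conf directory
# Echoed rather than executed in this sketch; drop the echo on a real node.
echo hadoop-daemon.sh --config "$conf_dir" "$cmd" datanode
```

The conf directory would hold the instance's own hadoop-env.sh and hdfs-site.xml with distinct ports and paths.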


Xiaobo Gu
