hadoop-common-user mailing list archives

From Rajeev Gupta <graj...@in.ibm.com>
Subject Re: Hadoop Eclipse Plugin
Date Thu, 18 Jun 2009 10:32:08 GMT
You need to give:
1) Map/Reduce master host: the host where start-mapred.sh was run (the JobTracker).
2) Map/Reduce master port: 19001 (see your hadoop-site.xml file).
3) DFS master host: the host where start-dfs.sh was run (the NameNode).
4) DFS master port: 19000.
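For reference, the two ports above come from the fs.default.name and mapred.job.tracker properties in hadoop-site.xml. A fragment matching those values might look like the following sketch (the hostname master-host is a placeholder; substitute your own master's name, and note your ports may differ):

```xml
<configuration>
  <!-- NameNode address: the "DFS master" host and port in the plugin -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master-host:19000</value>
  </property>
  <!-- JobTracker address: the "Map/Reduce master" host and port in the plugin -->
  <property>
    <name>mapred.job.tracker</name>
    <value>master-host:19001</value>
  </property>
</configuration>
```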

These parameters should be sufficient to access HDFS. You may also need to set some advanced parameters to grant your Windows user permissions on the hosts where Hadoop is running.
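In the plugin's Advanced parameters tab, the setting that usually matters for permissions is hadoop.job.ugi, which names the user and group the plugin connects as. Treat the exact property name and format as an assumption for your Hadoop version; a typical entry looks like:

```
# Advanced parameter in the Eclipse plugin (assumed format: user,group)
hadoop.job.ugi = hadoop-user,hadoop-group
```

Set the user to the account that owns the HDFS directories you want to browse from Windows.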

Thanks and regards.
-Rajeev Gupta

From: praveen.yarlagadda@gmail.com
To: core-user@hadoop.apache.org
Date: 06/18/2009 08:39 AM
Subject: Hadoop Eclipse Plugin

I have a problem configuring Hadoop Map/Reduce plugin with Eclipse.

Setup Details:

I have a namenode, a jobtracker and two data nodes, all running on Ubuntu.
My setup works fine with the example programs. I want to connect to this setup
from Eclipse.

namenode - port 54310
jobtracker - port 54311
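Presumably these ports come from fs.default.name and mapred.job.tracker in the cluster's hadoop-site.xml; under that assumption the relevant entries would look something like this (hostnames are placeholders):

```xml
<!-- NameNode listening on 54310 -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:54310</value>
</property>
<!-- JobTracker listening on 54311 -->
<property>
  <name>mapred.job.tracker</name>
  <value>jobtracker-host:54311</value>
</property>
```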

I run Eclipse on a different Windows machine. I want to configure the Map/Reduce
plugin in Eclipse so that I can access HDFS from Windows.

Map/Reduce master
Host - With jobtracker IP, it did not work
Port - With jobtracker port, it did not work

DFS master
Host - With namenode IP, it did not work
Port - With namenode port, it did not work

I tried the other combination too, giving namenode details for the Map/Reduce
master and jobtracker details for the DFS master. That did not work either.

If anyone has configured the plugin with Eclipse, please let me know. Even a
pointer to how to configure it would be highly appreciated.

