hadoop-common-user mailing list archives

From "Arijit Mukherjee" <armukher...@connectivasystems.com>
Subject RE: Hadoop eclipse plugin
Date Tue, 26 Aug 2008 06:22:50 GMT
Hi Christophe
 
Thanks for the prompt reply. I've tried these two ports before, but I get
this exception:
 
Cannot connect to the Map/Reduce location: hadoop@localhost
java.io.IOException: Unknown protocol to name node: org.apache.hadoop.mapred.JobSubmissionProtocol
 at org.apache.hadoop.dfs.NameNode.getProtocolVersion(NameNode.java:84)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:585)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:452)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:888)
 
Do I need to change anything on the advanced tab? Or leave them as
default?
 
Regards
Arijit

PS: For some reason I've been getting a "552 spam score (5.6) exceeded
threshold" error every time I replied to this message.

Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt Lake
Kolkata 700 091, India
Phone: +91 (0)33 23577531/32 x 107
http://www.connectivasystems.com


-----Original Message-----
From: christophe.taton@gmail.com [mailto:christophe.taton@gmail.com] On
Behalf Of Christophe Taton
Sent: Tuesday, August 26, 2008 11:27 AM
To: core-user@hadoop.apache.org; armukherjee@connectivasystems.com
Subject: Re: Hadoop eclipse plugin


Hi Arijit,

On Tue, Aug 26, 2008 at 7:06 AM, Arijit Mukherjee <
armukherjee@connectivasystems.com> wrote:

> Hi All
>
> Has anyone tried the Eclipse plugin for Hadoop? I've been able to add
> the plugin to Eclipse; however, I see a couple of problems there -
> probably I did something wrong.
>
> (1) When I try to add a new hadoop location, the plugin is never able 
> to find it. I'm not sure which ports to put inside the config box - 
> I've tried several of them including the namenode, jobtracker etc - 
> but it always fails to add the hadoop server. Following is the output 
> of netstat which I used to check the ports -
>
> strawberry:/home/hadoop> netstat -plten | grep java
> tcp        0      0 :::8354                     :::*
> LISTEN      507        21343      8878/java
> tcp        0      0 :::19300                    :::*
> LISTEN      507        21418      8954/java
> tcp        0      0 ::ffff:127.0.0.1:54310      :::*
> LISTEN      507        20857      8564/java
> tcp        0      0 ::ffff:127.0.0.1:54311      :::*
> LISTEN      507        21421      8954/java
> tcp        0      0 :::50090                    :::*
> LISTEN      507        21369      8878/java
> tcp        0      0 :::50060                    :::*
> LISTEN      507        21712      9118/java
> tcp        0      0 :::50030                    :::*
> LISTEN      507        21471      8954/java
> tcp        0      0 ::ffff:127.0.0.1:53522      :::*
> LISTEN      507        21717      9118/java
> tcp        0      0 :::5205                     :::*
> LISTEN      507        20854      8564/java
> tcp        0      0 :::50070                    :::*
> LISTEN      507        21005      8564/java
>
> And the processes are running as -
>
> strawberry:/home/hadoop> jps
> 8878 SecondaryNameNode
> 8564 NameNode
> 8954 JobTracker
> 9216 Jps
> 8079
> 9118 TaskTracker
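
The two listings can be cross-referenced: the last column of `netstat -plten`
is the owning PID, and `jps` maps PIDs to daemon names. A small sketch that
replays the sample output above (on a live system you would pipe in the real
`netstat -plten | grep java` and `jps` output instead of these variables):

```shell
# Sample data from this thread; substitute real command output on a live host.
jps_out="8878 SecondaryNameNode
8564 NameNode
8954 JobTracker
9118 TaskTracker"

netstat_out="tcp 0 0 ::ffff:127.0.0.1:54310 :::* LISTEN 507 20857 8564/java
tcp 0 0 ::ffff:127.0.0.1:54311 :::* LISTEN 507 21421 8954/java"

# For each listening socket, strip "/java" from the PID column and look the
# PID up in the jps output to name the daemon that owns the port.
echo "$netstat_out" | while read -r proto _ _ addr _ _ _ _ prog; do
  pid=${prog%%/*}                                  # "8564/java" -> "8564"
  name=$(echo "$jps_out" | awk -v p="$pid" '$1 == p {print $2}')
  echo "port ${addr##*:} -> $name"
done
# prints:
#   port 54310 -> NameNode
#   port 54311 -> JobTracker
```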


As far as I can see from your configuration:
The namenode is listening on localhost:54310 (and its web interface is
listening on port *:50070). The jobtracker is listening on
localhost:54311 (and its web interface is listening on port *:50030).
(Matching the netstat and jps output: pid 8564 is the NameNode and owns
54310; pid 8954 is the JobTracker and owns 54311.)

This means that you need to set up the Eclipse plugin as follows:
 - map/reduce master: hostname = localhost (or 127.0.0.1) and port =
54311
 - distributed file system master: hostname = localhost and port = 54310

This should work. You need to use hostname = localhost as your daemons
listen on 127.0.0.1 only (or else you need to update your
hadoop-site.xml so they listen on other network interfaces).
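
For reference, a sketch of the corresponding hadoop-site.xml entries
(0.18-era property names; the port assignments follow the netstat and jps
output above, where pid 8564 = NameNode on 54310 and pid 8954 = JobTracker
on 54311 - the file name assumes a standard single-node setup):

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- NameNode / distributed file system master -->
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <!-- JobTracker / map-reduce master -->
    <value>localhost:54311</value>
  </property>
</configuration>
```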


>
> (2) If I create a MapReduce project and add the Mapper and Reducer 
> classes, the signatures of the generated classes are different from 
> what's in the Hadoop APIs - as a result, there's a compilation error.


This is probably because the plugin has not been updated for many weeks
now and is out of sync with the Hadoop APIs, which have changed in the
meantime. I hope to resume working actively on it soon (within a couple
of weeks at most). Until then you cannot use the generated classes and
need to write everything by hand, sorry...
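
Writing the classes by hand against the current (0.18-era)
org.apache.hadoop.mapred API looks roughly like this word-count-style
sketch - a minimal example, not plugin output; the class names are
illustrative and it needs the Hadoop core jar on the classpath to compile:

```java
import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class WordCount {

  // Old-style mapper: extend MapReduceBase and implement the generic
  // Mapper interface; output goes through an OutputCollector.
  public static class Map extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(LongWritable key, Text value,
        OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        output.collect(word, ONE);
      }
    }
  }

  // Old-style reducer: values arrive as an Iterator, not an Iterable.
  public static class Reduce extends MapReduceBase
      implements Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values,
        OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
      int sum = 0;
      while (values.hasNext()) {
        sum += values.next().get();
      }
      output.collect(key, new IntWritable(sum));
    }
  }
}
```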



> Can anyone please point me in the right direction? I've also raised 
> these questions on the plugin forum, but thought someone here may be 
> able to help as well.
>

Hope this helps,
Cheers,
Christophe


