hadoop-common-user mailing list archives

From Christophe Taton <christophe.ta...@gmail.com>
Subject Re: Hadoop Eclipse plugin java.net.UnknownHostException
Date Mon, 12 Nov 2007 06:57:38 GMT
Hi Bob,

You need to fill in both the Host/IP of the Hadoop master and its home
directory (the home directory for Hadoop is the one on the other
machine). The current version of the plug-in SSHes to the host running
Hadoop and runs shell commands there, which is why it needs both the IP
and the path to Hadoop.
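In case it helps, you can sanity-check the two settings from a terminal before filling in the dialog. Here is a rough sketch; `habuntu-vm`, `guest`, and `/home/guest/hadoop` are placeholder values for your VM's host, user, and Hadoop install path, not anything the plug-in requires:

```shell
#!/bin/sh
# Placeholder values -- substitute your own VM host/IP and Hadoop path.
HADOOP_HOST=habuntu-vm           # the VM's hostname or IP, *not* left blank
HADOOP_HOME=/home/guest/hadoop   # the install path on the VM, not on the Mac

# The plug-in needs a non-empty host (an empty one is what surfaces as
# "UnknownHostException: null") and an absolute remote path.
check_settings() {
  host=$1; home=$2
  [ -n "$host" ] || { echo "host is empty"; return 1; }
  case $home in
    /*) ;;
    *)  echo "Hadoop home '$home' is not an absolute path"; return 1 ;;
  esac
  echo "settings look plausible"
}

check_settings "$HADOOP_HOST" "$HADOOP_HOME"

# With plausible settings, you can also confirm by hand what the plug-in
# relies on: SSH in and check the launcher script exists on the VM, e.g.
#   ssh guest@$HADOOP_HOST "test -x $HADOOP_HOME/bin/hadoop" && echo ok
```

If the SSH check at the end fails, the plug-in's validation will fail too, regardless of what you type into the dialog.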

Just in case, which version of the plug-in did you download?


Bob Futrelle wrote:
> Topic: Hadoop Eclipse plugin java.net.UnknownHostException
> In the Eclipse plugin for Hadoop (downloaded and installed today),
> creating a New Hadoop Server Location failed to validate.
> Likewise, in the Project Explorer, trying to load the
> DFS @ Hadoop Server failed; a Refresh DFS Children dialogue
> reported,
> "Refreshing DFS node failed: java.net.UnknownHostException: null"
> The Details button in that dialogue was dimmed.
> I was following the directions in the PDF: "Running Habuntu: the
> Hadoop Cluster Image".
> Habuntu is running as a VM in VMware Fusion.
> The Cheat Sheet tells me to fill in a "Hadoop Home Directory", but
> that directory is on another machine, my Habuntu VM, not my Mac.
> On the other hand, the Server dialog suggests that I can enter an
> IP and an installation directory, which I had done (though it then
> failed to validate).
> I can connect to the VM with no problems via ssh, e.g.,
> ssh guest@
> and have also transferred files.
> My system:
> Mac OS X 10.4.10
> MacBook Pro 2.33 GHz Intel Core 2 Duo
> Java version
> Eclipse 3.2.2
> Googling didn't help me with the above.
>  Thanks,
>  - Bob Futrelle
> (I get the digest, so you can cc me, if you would.)
