hadoop-common-dev mailing list archives

From "Zhixuan Zhu" <z...@calpont.com>
Subject RE: Having problem to run hadoop with a jar file
Date Thu, 12 Apr 2012 15:24:33 GMT
Hi Bernd,

Thanks for the prompt reply!

My mapred-site.xml has the following contents:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->


I think if it were a protocol issue, I should see a similar error when
including the customized API files directly in my driver project. But it
works fine that way. I'm at my wit's end. Please advise...
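
For reference, my driver is structured roughly like the sketch below. It is
a simplified sketch, not the exact code: MyMapper and the job name are
placeholder names and the job setup is trimmed down, but the ToolRunner /
JobClient.runJob flow matches the stack trace quoted below.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Simplified driver sketch (old 0.20 mapred API). MyMapper is a placeholder
// for the real mapper class; MyFileInputFormat and MyDBOutputFormat are the
// two classes that now live in the external jar.
public class OutputTest extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // Passing OutputTest.class tells Hadoop which jar contains the driver.
        JobConf conf = new JobConf(getConf(), OutputTest.class);
        conf.setJobName("output-test");

        conf.setInputFormat(MyFileInputFormat.class);
        conf.setOutputFormat(MyDBOutputFormat.class);
        conf.setMapperClass(MyMapper.class);

        // This is the call that fails with the EOFException below.
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner/GenericOptionsParser also handle -conf and -libjars.
        int res = ToolRunner.run(new Configuration(), new OutputTest(), args);
        System.exit(res);
    }
}

I run it from Eclipse on the Windows host against the single-node Linux
cluster.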


-----Original Message-----
From: Bernd Fondermann [mailto:bf_jak@brainlounge.de] 
Sent: Thursday, April 12, 2012 2:35 AM
To: common-dev@hadoop.apache.org
Subject: Re: Having problem to run hadoop with a jar file


Verify that port 9001 is actually an RPC port that Hadoop uses for
communication. A common error is to specify the port where Hadoop's
web UIs are running; there, HTTP is served, not the Hadoop RPC protocol.

In short: check that you are using the right port in all XML config
properties and in-code URLs.
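
For example, a client-side check along these lines should end up printing
addresses that use the RPC ports. The host name and the 9000/9001 ports
below are placeholders (the conventional values from the 0.20 docs), so
substitute whatever your cluster's core-site.xml and mapred-site.xml
actually declare:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapred.JobConf;

public class ClientConfigCheck {
    public static void main(String[] args) {
        // Placeholder host/ports for illustration only. They must match the
        // RPC ports the NameNode and JobTracker listen on, NOT the web UI
        // ports (50070 = NameNode UI, 50030 = JobTracker UI in 0.20.x).
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://srvswint4.calpont.com:9000");
        conf.set("mapred.job.tracker", "srvswint4.calpont.com:9001");

        JobConf job = new JobConf(conf);
        System.out.println("fs.default.name    = " + job.get("fs.default.name"));
        System.out.println("mapred.job.tracker = " + job.get("mapred.job.tracker"));
    }
}

If those values end up pointing at 50030 or 50070 instead, the client is
talking to a web UI, which can surface as exactly this kind of EOFException
during getProtocolVersion.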


On 12.04.12 07:38, Zhixuan Zhu wrote:
> Hi there,
> I'm having an issue when running Hadoop with an external jar file.
> My testing environment:
> Host machine: Windows, Eclipse 3.3 with hadoop-0.20.2 and plugin, JRE 6
> Single-node Hadoop cluster: Linux, hadoop-0.20.2, JDK 1.6.0_26
> I wrote MyFileInputFormat.class and MyDBOutputFormat.class, which extend
> the two DB format classes to customize my database fetching and storing.
> When I add the two files to the same project as my driver and mapper
> classes, everything works fine. But when I build the two files into a
> separate jar file and make it an external library to my driver project
> (which no longer contains the two DB files), I get the errors below. I
> put the jar file both on my driver's build path and in the $hadoop/lib
> of my hadoop node. I googled, and people said this could be caused by a
> mismatch between the Hadoop versions of the jar and the driver, but I'm
> using exactly the same version on both sides. Any suggestions are highly
> appreciated.
> Thanks very much,
> Grace
> java.io.IOException: Call to srvswint4.calpont.com/
> failed on local exception: java.io.EOFException
> 	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1089)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1057)
> 	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
> 	at org.apache.hadoop.mapred.$Proxy0.getProtocolVersion(Unknown
> Source)
> 	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:369)
> 	at
> org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:486)
> 	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:471)
> 	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:456)
> 	at
> org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1196)
> 	at OutputTest.run(OutputTest.java:98)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> 	at OutputTest.main(OutputTest.java:108)
> Caused by: java.io.EOFException
> 	at java.io.DataInputStream.readInt(Unknown Source)
> 	at
> 	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:689)
