hadoop-common-user mailing list archives

From: Daniel Wressle <wres...@opera.com>
Subject: External jars revisited.
Date: Mon, 08 Oct 2007 08:22:31 GMT
Hello Hadoopers!

I have just recently started using Hadoop and I have a question that has 
puzzled me for a couple of days now.

I have already browsed the mailing list and found some relevant posts, especially
http://mail-archives.apache.org/mod_mbox/lucene-hadoop-user/200708.mbox/%3c84ad79bb0708131649x3b94cc18x7a0910090f06a1e7@mail.gmail.com%3e,
but the solution eludes me.

My Map/Reduce job relies on external jars, so I modified my Ant script to include
them in the lib/ directory of my job jar. So far, so good: the job runs without any
issues as long as it runs on my local machine only.
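
For reference, the packaging is done roughly along these lines (the target, property
and file names here are simplified placeholders, not my actual build file):

    <jar destfile="${build.dir}/myjob.jar">
        <!-- the job's own compiled classes -->
        <fileset dir="${classes.dir}"/>
        <!-- bundle the external dependency jars under lib/ inside the job jar -->
        <zipfileset dir="${lib.dir}" prefix="lib" includes="*.jar"/>
    </jar>

My understanding is that Hadoop unpacks the job jar on each task tracker and adds
anything under lib/ to the task's classpath, which is why I expected this to just work.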

However, adding a second machine to the mini-cluster presents the following problem:
a NullPointerException is thrown as soon as I call any method of a class imported
from the external jars. Please note that this happens only on the other machine; the
maps on my main machine, the one I submit the job from, proceed without any warnings.

The actual log output from Hadoop is:

    java.lang.NullPointerException at xxx.xxx.xxx (Unknown Source)

My jar file contains all the necessary jars in the lib/ directory. Do I 
need to place them somewhere else on the slaves in order for my 
submitted job to be able to use them?

Any pointers would be much appreciated.




