hadoop-common-user mailing list archives

From Dennis Kubes <ku...@apache.org>
Subject Re: External jars revisited.
Date Mon, 08 Oct 2007 20:13:22 GMT
This won't solve your current error, but I should have a revised patch 
for HADOOP-1622, which allows multiple resources (including jars) to be 
passed to Hadoop jobs, finished and posted this afternoon.

Dennis Kubes

Ted Dunning wrote:
> Can you show us the lines in your code where you construct the JobConf?
> If you don't include a class in that constructor call, then Hadoop doesn't
> have enough of a hint to find your jar files.
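Ted's point above can be sketched as follows. This is a minimal sketch using the old `org.apache.hadoop.mapred` API that was current at the time; the class name `MyJob` and the job settings are hypothetical, not taken from Daniel's code:

```java
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MyJob {
    public static void main(String[] args) throws Exception {
        // Passing the job class to the JobConf constructor lets Hadoop
        // locate the jar file that contains it and ship that jar to the
        // tasktrackers. Without it, the other machines in the cluster
        // may never receive your classes, and calls into them fail.
        JobConf conf = new JobConf(MyJob.class);
        conf.setJobName("myjob"); // hypothetical job name

        JobClient.runJob(conf);
    }
}
```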
> On 10/8/07 12:03 PM, "Christophe Taton" <christophe.taton@gmail.com> wrote:
>> Hi Daniel,
>> Can you try to build and run a single jar file which contains all
>> required class files directly (i.e. without including jar files inside
>> the job jar file)?
>> This should prevent classloading problems. If the error still persists,
>> then you might suspect other problems.
>> Chris
>> Daniel Wressle wrote:
>>> Hello Hadoopers!
>>> I have just recently started using Hadoop and I have a question that
>>> has puzzled me for a couple of days now.
>>> I have already browsed the mailing list and found some relevant posts,
>>> especially
>>> http://mail-archives.apache.org/mod_mbox/lucene-hadoop-user/200708.mbox/%3c84ad79bb0708131649x3b94cc18x7a0910090f06a1e7@mail.gmail.com%3e,
>>> but the solution eludes me.
>>> My Map/Reduce job relies on external jars and I had to modify my ant
>>> script to include them in the lib/ directory of my jar file. So far,
>>> so good. The job runs without any issues when I issue the job on my
>>> local machine only.
>>> However, adding a second machine to the mini-cluster presents the
>>> following problem: a NullPointerException being thrown as soon as I
>>> call any function within a class I have imported from the external
>>> jars. Please note that this will only happen on the other machine, the
>>> maps on my main machine, which I submit the job on, will proceed
>>> without any warnings.
>>> java.lang.NullPointerException at xxx.xxx.xxx (Unknown Source) is the
>>> actual log output from hadoop.
>>> My jar file contains all the necessary jars in the lib/ directory. Do
>>> I need to place them somewhere else on the slaves in order for my
>>> submitted job to be able to use them?
>>> Any pointers would be much appreciated.
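The packaging step Daniel describes, bundling dependency jars under lib/ inside the job jar, can be sketched as an Ant fragment. The target name and directory layout below are illustrative assumptions, not taken from his actual build script:

```xml
<!-- Illustrative target; build/classes and lib/ paths are assumptions. -->
<target name="jar" depends="compile">
  <jar destfile="build/myjob.jar">
    <fileset dir="build/classes"/>
    <!-- Jars placed under lib/ inside the job jar are added to the
         task classpath when the job runs on the cluster. -->
    <zipfileset dir="lib" includes="*.jar" prefix="lib"/>
  </jar>
</target>
```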
