mahout-user mailing list archives

From: Sean Owen <sro...@gmail.com>
Subject: Re: ClassNotFoundException while using RecommenderJob
Date: Thu, 15 Mar 2012 10:44:00 GMT
You would still need to use the 'job' file generated by the build to
get an artifact with all the dependencies.
You don't need to add Guava as a dependency; it already is one. It's
the job file that you're missing.
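
For what it's worth, here is a minimal sketch of launching the
distributed job from Java, roughly as you describe doing. It assumes the
mahout-core-*-job.jar is on your program's classpath (so Hadoop ships the
bundled dependencies, including Guava, to the task nodes); the class and
file names here are placeholders, and you should check the option names
against your Mahout version:

    import org.apache.hadoop.util.ToolRunner;
    import org.apache.mahout.cf.taste.hadoop.item.RecommenderJob;

    public class RunRecommender {
      public static void main(String[] args) throws Exception {
        // Runs the fully distributed item-based recommender. The jar
        // containing RecommenderJob must be the 'job' artifact, or the
        // mappers won't see Guava and you get exactly your exception.
        ToolRunner.run(new RecommenderJob(), new String[] {
            "--input", "input.txt",    // userID,itemID[,pref] lines on HDFS
            "--output", "output",      // HDFS output directory
            "--usersFile", "users.txt" // optional: users to recommend for
        });
      }
    }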

There are two RecommenderJobs. One is what I call pseudo-distributed,
yes. The other is a fully distributed item-based recommender.
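
To be concrete (worth double-checking against your Mahout version): the
pseudo-distributed one is org.apache.mahout.cf.taste.hadoop.pseudo.RecommenderJob,
and the fully distributed item-based one is
org.apache.mahout.cf.taste.hadoop.item.RecommenderJob. Your stack trace
shows classes from the latter's package, so that is the one you are
running.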

Sean

On Thu, Mar 15, 2012 at 10:01 AM, Janina <mail4janina@googlemail.com> wrote:
> Thanks for your fast answer.
>
> I didn't add the jar manually; I added the dependency to the
> pom.xml. I tried it with and without the dependency, and with different
> versions of it, but the error message stayed the same.
>
> But isn't the RecommenderJob meant to run a pseudo-distributed
> recommender on a Hadoop cluster? Am I misunderstanding something? Or is
> there another way to run recommendations on a Hadoop cluster? I have
> read that only the clustering and classification parts of Mahout can
> really be distributed on a Hadoop cluster.
>
> 2012/3/15 Sean Owen <srowen@gmail.com>
>
>> You shouldn't have to add anything to your jar if you use the
>> supplied 'job' file, which contains all transitive dependencies.
>> If you do add your own jars, I think you need to unpack and repack
>> them, not put them into the overall jar as a jar file, even with a
>> MANIFEST.MF entry. I am not sure that works on Hadoop.
>>
>> On Thu, Mar 15, 2012 at 9:42 AM, Janina <mail4janina@googlemail.com>
>> wrote:
>> > Hi all,
>> >
>> > I am trying to run a RecommenderJob from a Java program. I have added the
>> > files users.txt and input.txt to a Hadoop VM and use the run method of
>> > RecommenderJob to start the computation. But the following error occurs
>> > while the MapReduce job is running:
>> >
>> > Error: java.lang.ClassNotFoundException: com.google.common.primitives.Longs
>> >   at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> >   at java.security.AccessController.doPrivileged(Native Method)
>> >   at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> >   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> >   at org.apache.mahout.cf.taste.hadoop.TasteHadoopUtils.idToIndex(TasteHadoopUtils.java:61)
>> >   at org.apache.mahout.cf.taste.hadoop.item.ItemIDIndexMapper.map(ItemIDIndexMapper.java:48)
>> >   at org.apache.mahout.cf.taste.hadoop.item.ItemIDIndexMapper.map(ItemIDIndexMapper.java:31)
>> >   at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >   at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
>> >   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
>> >   at org.apache.hadoop.mapred.Child.main(Child.java:170)
>> >
>> > I have explicitly added the required guava-r09.jar to my jar, which is
>> > also on the Hadoop cluster.
>> > This may be a stupid question, but does anyone know where this error
>> > comes from? That would help me a lot.
>> >
>> > Thanks and greetings,
>> > Janina
>>
