hadoop-common-user mailing list archives

From Ted Dunning <tdunn...@veoh.com>
Subject Re: can't find class ImmutableBytesWritable
Date Mon, 04 Feb 2008 18:07:39 GMT

If you add the jars to the lib directory in the job jar, you may have to
modify the manifest of the job jar to reference those jar files.

If you repack all of the jar contents into a single collection of class
files, you shouldn't need to do that.
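
For example, the job jar could be laid out either way with something
roughly like this (the jar names, the classes/ directory, and the paths
are only placeholders, not taken from your build):

    # Option 1: put the dependency jars under lib/ inside the job jar and
    # point the jar manifest at them
    mkdir -p job/lib
    cp /path/to/hbase.jar job/lib/
    printf 'Class-Path: lib/hbase.jar\n' > manifest.txt
    jar cfm pagefreshness.jar manifest.txt -C classes . -C job lib

    # Option 2: unpack everything and repack it as one flat jar of class
    # files (no manifest changes needed in that case)
    mkdir -p merged
    (cd merged && jar xf /path/to/hbase.jar)
    cp -R classes/* merged/
    jar cf pagefreshness.jar -C merged .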


On 2/4/08 10:04 AM, "Marc Harris" <mharris@jumptap.com> wrote:

> Some more analysis (sorry for not doing this all before sending the
> e-mail, but it didn't occur to me).
> 
> I currently include all the hbase classes in the job jar file. Since I
> am using maven, and maven has very much its own idea of how to deal with
> third party jar files, the only way I could find to do this was
> unpacking all the jar files into their class files and repacking them
> all together as one big jar file. This is in contrast to the suggestion
> in http://wiki.apache.org/hadoop/Hbase/MapReduce to put the hbase jar
> files in the lib directory of the job jar file.
> 
> When I added the hbase jar file to HADOOP_CLASSPATH, the problem went
> away.
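> 
> Concretely, that meant setting something like the following in
> conf/hadoop-env.sh before launching the job (the path here is just a
> placeholder):
> 
>     export HADOOP_CLASSPATH=/path/to/hbase.jar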
> 
> Is this the correct behavior, or should adding the classes to the job
> jar have worked?
> 
> - Marc
> 
> 
> On Mon, 2008-02-04 at 12:24 -0500, Marc Harris wrote:
> 
>> Sorry, my third paragraph got totally bungled. I wish I could type (and
>> read, apparently). It should have said
>> 
>> I get the following exception after my map tasks have run and before my
>> reduce tasks are called. (I am running in the local runner attached to a
>> debugger).
>> And I forgot to say, I am running 0.16.0, downloaded from
>> http://people.apache.org/~nigel/hadoop-0.16.0-candidate-0/
>> 
>> - Marc
>> 
>> 
>> On Mon, 2008-02-04 at 12:09 -0500, Marc Harris wrote:
>> 
>>> Sorry for the cross-post to hadoop and hbase. Is the hbase-user group
>>> active yet? I haven't got any e-mails from it.
>>> 
>>> I am having a problem with hbase mapreduce.
>>> 
>>> I get the following exception my map tasks have run and before and
>>> reduce tasks are called. (I am runnig in the local runner attached to a
>>> debugger).
>>> 
>>> Here is the exception:
>>> 
>>> java.io.IOException: can't find class:
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable because
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable
>>>         at
>>> org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java
>>> :210)
>>>         at
>>> org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:145)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.spill(MapTask.java:554)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.sortAndSpillToDisk(MapTask.java:497)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.flush(MapTask.java:713)
>>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:209)
>>>         at org.apache.hadoop.mapred.LocalJobRunner
>>> $Job.run(LocalJobRunner.java:132)
>>> Exception in thread "main" java.io.IOException: Job failed!
>>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:894)
>>>         at
>>> com.jumptap.msi.processors.pagefreshness.PageFreshness.main(PageFreshness.ja
>>> va:298)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
>>> .java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:585)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>> 
>>> Since the code in AbstractMapWritable wraps the ClassNotFoundException
>>> incorrectly, I changed it and got a marginally more helpful stack trace:
>>> 
>>> java.io.IOException: can't find class:
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable because
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable
>>>         at
>>> org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java
>>> :210)
>>>         at
>>> org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:145)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.spill(MapTask.java:554)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.sortAndSpillToDisk(MapTask.java:497)
>>>         at org.apache.hadoop.mapred.MapTask
>>> $MapOutputBuffer.flush(MapTask.java:713)
>>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:209)
>>>         at org.apache.hadoop.mapred.LocalJobRunner
>>> $Job.run(LocalJobRunner.java:132)
>>> Caused by: java.lang.ClassNotFoundException:
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
>>>         at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
>>>         at java.lang.Class.forName0(Native Method)
>>>         at java.lang.Class.forName(Class.java:164)
>>>         at
>>> org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java
>>> :207)
>>>         ... 6 more
>>> 
>>> What is slightly odd is that if I put a breakpoint on some code in
>>> org.apache.hadoop.hbase.io.ImmutableBytesWritable, I find that it is
>>> loaded successfully well
>>> before the mapreduce task runs. Does the mapreduce task run in some
>>> special context that does not have access to ImmutableBytesWritable
>>> somehow? I don't know what "java.security.AccessController.doPrivileged"
>>> does but it sounds relevant.
>>> 
>>> Here is my mapreduce task. I removed what I believe to be irrelevant
>>> bits.
>>> 
>>> Any help you could give would be appreciated as I'm completely stuck
>>> now.
>>> Thanks
>>> - Marc
>>> 
>>> import  . . .
>>> 
>>> public class PageFreshness
>>> {
>>>     . . . 
>>>     public static class MapStep extends TableMap<Text, MapWritable>
>>>     {
>>>         public void map(HStoreKey hStoreKey,
>>>                         MapWritable inputFields,
>>>                         OutputCollector<Text, MapWritable>
>>> outputCollector,
>>>                         Reporter reporter) throws IOException
>>>         {
>>>             Text key = hStoreKey.getRow();
>>>              . . .
>>>             Text otherKey = new Text(key);
>>> 
>>>             outputCollector.collect(otherKey, inputFields);
>>>         }
>>>     }
>>> 
>>>     public static class ReduceStep extends TableReduce<Text,
>>> MapWritable>
>>>     {
>>>         private JobConf job;
>>> 
>>>         public void configure(JobConf job)
>>>         {
>>>             super.configure(job);
>>>             this.job = job;
>>>         }
>>> 
>>>         public void reduce(Text pageKeyText,
>>>                            Iterator<MapWritable> iterator,
>>>                            OutputCollector<Text, MapWritable>
>>> outputCollector,
>>>                            Reporter reporter) throws IOException
>>>         {
>>>            . . .
>>>            // Doesn't matter what's in here; the exception is thrown
>>> before we get here
>>>         }
>>> 
>>>     }
>>> 
>>>     enum TimeUnit
>>>     {
>>>         m(      60*1000),
>>>         h(   60*60*1000),
>>>         d(24*60*60*1000);
>>> 
>>>         private int millisMultiple;
>>> 
>>>         TimeUnit(int millisMultiple)
>>>         {
>>>             this.millisMultiple = millisMultiple;
>>>         }
>>>     }
>>> 
>>>     public static void main(String[] args) throws Exception
>>>     {
>>>         Configuration conf = new HBaseConfiguration();
>>>         conf.set("mapred.job.tracker", "local");
>>> 
>>>         . . . // set up some parameters such as minKeepCount
>>> 
>>>         JobConf jobConf = new JobConf(conf, PageFreshness.class);
>>>         jobConf.setJobName("Page Freshness processor");
>>>         jobConf.setInt(KEEP_COUNT, minKeepCount);
>>>         jobConf.setLong(KEEP_TIME, minKeepTime *
>>> minKeepTimeUnit.millisMultiple);
>>> 
>>>         String table = "pagefetch";
>>>         String columns = PageFetchTable.INFO_FETCHSTATUS + " " +
>>>                          PageFetchTable.INFO_FETCHDATE + " " +
>>>                          PageFetchTable.INFO_LASTMODIFIED + " " +
>>>                          PageFetchTable.CHANGEDATA_SIGNATURE;
>>>         TableMap.initJob(table, columns, MapStep.class, jobConf);
>>>         TableReduce.initJob("page", ReduceStep.class, jobConf);
>>> 
>>>         JobClient.runJob(jobConf);
>>>     }
>>> 
>>> }
>>> 
>>> 

