hadoop-common-user mailing list archives

From Marc Harris <mhar...@jumptap.com>
Subject can't find class ImmutableBytesWritable
Date Mon, 04 Feb 2008 17:09:53 GMT
Sorry for the cross-post to hadoop and hbase. Is the hbase-user group
active yet? I haven't got any e-mails from it.

I am having a problem with HBase MapReduce.

I get the following exception after my map tasks have run and before the
reduce tasks are called. (I am running in the local runner, attached to a
debugger.)

Here is the exception:

java.io.IOException: can't find class: org.apache.hadoop.hbase.io.ImmutableBytesWritable because org.apache.hadoop.hbase.io.ImmutableBytesWritable
        at org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java:210)
        at org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:145)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.spill(MapTask.java:554)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpillToDisk(MapTask.java:497)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:713)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:209)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:132)
Exception in thread "main" java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:894)
        at com.jumptap.msi.processors.pagefreshness.PageFreshness.main(PageFreshness.java:298)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)

Since the code in AbstractMapWritable wraps the ClassNotFoundException without
keeping it as the cause, I changed it and got a marginally more helpful stack
trace (the change itself is sketched after the trace):

java.io.IOException: can't find class: org.apache.hadoop.hbase.io.ImmutableBytesWritable because org.apache.hadoop.hbase.io.ImmutableBytesWritable
        at org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java:210)
        at org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:145)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.spill(MapTask.java:554)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpillToDisk(MapTask.java:497)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:713)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:209)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:132)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:164)
        at org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java:207)
        ... 6 more
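
For what it is worth, the change was just to attach the ClassNotFoundException
as the cause of the wrapping IOException; the stock code only appends
e.getMessage(), which for a ClassNotFoundException is the class name again,
hence the odd "because ..." text above. Roughly like this (paraphrased as a
tiny standalone example rather than the actual diff; the class name
WrapCauseExample is mine):

import java.io.IOException;

public class WrapCauseExample {
    // Same shape as the catch block in AbstractMapWritable.readFields, but
    // with initCause() so the ClassNotFoundException shows up as "Caused by:".
    static Class<?> loadClass(String className) throws IOException {
        try {
            return Class.forName(className);
        } catch (ClassNotFoundException e) {
            IOException ioe = new IOException("can't find class: " + className
                + " because " + e.getMessage());
            ioe.initCause(e); // the line the stock code is missing
            throw ioe;
        }
    }

    public static void main(String[] args) throws IOException {
        loadClass("org.apache.hadoop.hbase.io.ImmutableBytesWritable");
    }
}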

What is slightly odd is that if I put a breakpoint on some code in
org.apache.hadoop.hbase.io.ImmutableBytesWritable, I find that it is loaded
successfully well before the mapreduce task runs. Does the mapreduce task run
in some special context that does not have access to ImmutableBytesWritable
somehow? I don't know what "java.security.AccessController.doPrivileged"
does, but it sounds relevant.
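
In case it is useful, this is the kind of check I can drop in at that
breakpoint (or at the top of map()) to compare classloaders. It is only a
diagnostic sketch, plain Java, and the variable names are mine:

            // Which loader is in effect here, and can it see the HBase class?
            String name = "org.apache.hadoop.hbase.io.ImmutableBytesWritable";
            System.out.println("context classloader: "
                + Thread.currentThread().getContextClassLoader());
            System.out.println("this class's loader: " + getClass().getClassLoader());
            try {
                Class.forName(name);
                System.out.println("Class.forName can see " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("Class.forName can NOT see " + name);
            }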

Here is my mapreduce task. I removed what I believe to be irrelevant
bits.

Any help you could give would be appreciated as I'm completely stuck
now.
Thanks
- Marc

import  . . .

public class PageFreshness
{
    . . . 
    public static class MapStep extends TableMap<Text, MapWritable>
    {
        public void map(HStoreKey hStoreKey,
                        MapWritable inputFields,
                        OutputCollector<Text, MapWritable> outputCollector,
                        Reporter reporter) throws IOException
        {
            Text key = hStoreKey.getRow();
             . . . 
            Text otherKey = new Text(key);

            outputCollector.collect(otherKey, inputFields);
        }
    }

    public static class ReduceStep extends TableReduce<Text, MapWritable>
    {
        private JobConf job;

        public void configure(JobConf job)
        {
            super.configure(job);
            this.job = job;
        }

        public void reduce(Text pageKeyText,
                           Iterator<MapWritable> iterator,
                           OutputCollector<Text, MapWritable> outputCollector,
                           Reporter reporter) throws IOException
        {
           . . .
           // Doesn't matter what's in here; the exception is thrown before we get here
        }

    }

    enum TimeUnit
    {
        m(      60*1000),
        h(   60*60*1000),
        d(24*60*60*1000);

        private int millisMultiple;

        TimeUnit(int millisMultiple)
        {
            this.millisMultiple = millisMultiple;
        }
    }

    public static void main(String[] args) throws Exception
    {
        Configuration conf = new HBaseConfiguration();
        conf.set("mapred.job.tracker", "local");

        . . . // set up some parameters such as minKeepCount

        JobConf jobConf = new JobConf(conf, PageFreshness.class);
        jobConf.setJobName("Page Freshness processor");
        jobConf.setInt(KEEP_COUNT, minKeepCount);
        jobConf.setLong(KEEP_TIME, minKeepTime * minKeepTimeUnit.millisMultiple);

        String table = "pagefetch";
        String columns = PageFetchTable.INFO_FETCHSTATUS + " " +
                         PageFetchTable.INFO_FETCHDATE + " " +
                         PageFetchTable.INFO_LASTMODIFIED + " " +
                         PageFetchTable.CHANGEDATA_SIGNATURE;
        TableMap.initJob(table, columns, MapStep.class, jobConf);
        TableReduce.initJob("page", ReduceStep.class, jobConf);

        JobClient.runJob(jobConf);
    }

}


