hadoop-common-user mailing list archives

From Saptarshi Guha <saptarshi.g...@gmail.com>
Subject Re: InputSplits, Serializers in Hadoop 0.20
Date Mon, 10 Aug 2009 16:10:50 GMT
Fixed. InputSplits in 0.20 should implement Writable.
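For the record, a minimal sketch of what that fix looks like. The class name MySplit, its fields, and the file path are hypothetical; in real code the class would extend org.apache.hadoop.mapreduce.InputSplit and declare `implements Writable`. The plain-JDK round trip below only demonstrates the write/readFields contract that SerializationFactory's WritableSerialization relies on:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Hypothetical split covering a byte range of one file. In Hadoop 0.20 this
// would extend org.apache.hadoop.mapreduce.InputSplit and implement
// org.apache.hadoop.io.Writable; only the Writable methods are sketched here.
public class MySplit {
  String file;   // path of the file this split covers
  long start;    // byte offset where the split begins
  long length;   // number of bytes in the split

  // Writable types need a no-arg constructor so Hadoop can
  // instantiate them reflectively before calling readFields.
  public MySplit() {}

  public MySplit(String file, long start, long length) {
    this.file = file;
    this.start = start;
    this.length = length;
  }

  // Writable.write: serialize every field of the split.
  public void write(DataOutput out) throws IOException {
    out.writeUTF(file);
    out.writeLong(start);
    out.writeLong(length);
  }

  // Writable.readFields: deserialize in exactly the same field order.
  public void readFields(DataInput in) throws IOException {
    file = in.readUTF();
    start = in.readLong();
    length = in.readLong();
  }

  public static void main(String[] args) throws IOException {
    // Round-trip the split through a byte buffer, much as the
    // JobClient does when writing splits to the job's split file.
    MySplit original = new MySplit("/data/part-00000", 128L, 4096L);
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    original.write(new DataOutputStream(bos));
    MySplit copy = new MySplit();
    copy.readFields(new DataInputStream(
        new ByteArrayInputStream(bos.toByteArray())));
    System.out.println(copy.file + " " + copy.start + " " + copy.length);
  }
}
```

Without the Writable implementation, SerializationFactory.getSerializer finds no serialization that accepts the split class, which produces the failure in writeNewSplits quoted below.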

On Mon, Aug 10, 2009 at 11:49 AM, Saptarshi
Guha<saptarshi.guha@gmail.com> wrote:
> Hello,
> In my custom InputFormat written using the new Hadoop 0.20 API, I get
> the following error:
>        at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
>        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:899)
>        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
>        at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
>        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
>
>
> The code in writeNewSplits which causes this is the last line
>
> ...
>  try {
>      if (array.length != 0) {
>        DataOutputBuffer buffer = new DataOutputBuffer();
>        RawSplit rawSplit = new RawSplit();
>        SerializationFactory factory = new SerializationFactory(conf);
>
>        Serializer<T> serializer =
>          factory.getSerializer((Class<T>) array[0].getClass());
> ...
>
> My InputSplit class has the read and write methods, but I can't quite
> figure out what is causing this error.
>
> Thank you in advance
> Saptarshi
>
