hadoop-mapreduce-user mailing list archives

From Guruprasad DV <dv.gurupra...@gmail.com>
Subject implementing a generic list writable
Date Tue, 03 Jul 2012 07:55:35 GMT
Hi,

I am working on building a MapReduce pipeline of jobs (with one MR job's
output feeding into another as input). The values being passed around are
fairly complex: there are lists of different types, and hash maps whose
values are lists. The Hadoop API does not seem to have a ListWritable. I am
trying to write a generic one, but it seems I can't instantiate a generic
type in my readFields implementation unless I pass in the class type
itself:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.Writable;

public class ListWritable<T extends Writable> implements Writable {
    private List<T> list;
    private Class<T> clazz;

    public ListWritable(Class<T> clazz) {
        this.clazz = clazz;
        this.list = new ArrayList<T>();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(list.size());
        for (T element : list) {
            element.write(out);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        int count = in.readInt();
        this.list = new ArrayList<T>();
        for (int i = 0; i < count; i++) {
            try {
                T obj = clazz.newInstance();
                obj.readFields(in);
                list.add(obj);
            } catch (InstantiationException e) {
                e.printStackTrace();
            } catch (IllegalAccessException e) {
                e.printStackTrace();
            }
        }
    }
}


But Hadoop requires all Writables to have a no-argument constructor so it
can read the values back. Has anybody tried to do the same and solved this
problem?
TIA.
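[Editor's note: one possible workaround, sketched below under stated assumptions, is to serialize the element class name into the stream itself. ListWritable then keeps the no-argument constructor Hadoop needs and recovers the element class via Class.forName() in readFields(). The snippet is self-contained for illustration only: the nested Writable interface is a minimal stand-in for org.apache.hadoop.io.Writable so it compiles without Hadoop on the classpath, and IntElement is a hypothetical element type. With real Hadoop you would implement the real interface instead; Hadoop's own ArrayWritable sidesteps the same issue by having subclasses hard-code the element class in their no-arg constructor.]

```java
import java.io.*;
import java.util.*;

public class ListWritableSketch {
    // Minimal stand-in for org.apache.hadoop.io.Writable, so this sketch
    // compiles and runs without Hadoop on the classpath.
    public interface Writable {
        void write(DataOutput out) throws IOException;
        void readFields(DataInput in) throws IOException;
    }

    // A generic list writable that records the concrete element class name
    // in the stream, so a no-argument constructor suffices to deserialize.
    public static class ListWritable<T extends Writable> implements Writable {
        private List<T> list = new ArrayList<>();

        public ListWritable() {}  // the no-arg constructor Hadoop requires

        public void add(T element) { list.add(element); }
        public List<T> get() { return list; }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeInt(list.size());
            if (!list.isEmpty()) {
                // Record the element class once, up front.
                out.writeUTF(list.get(0).getClass().getName());
            }
            for (T element : list) {
                element.write(out);
            }
        }

        @Override
        @SuppressWarnings("unchecked")
        public void readFields(DataInput in) throws IOException {
            int count = in.readInt();
            list = new ArrayList<>();
            if (count == 0) return;
            String className = in.readUTF();
            try {
                Class<T> clazz = (Class<T>) Class.forName(className);
                for (int i = 0; i < count; i++) {
                    T obj = clazz.getDeclaredConstructor().newInstance();
                    obj.readFields(in);
                    list.add(obj);
                }
            } catch (ReflectiveOperationException e) {
                // Surface reflection failures instead of swallowing them.
                throw new IOException("Cannot instantiate " + className, e);
            }
        }
    }

    // Hypothetical element type with the required no-arg constructor.
    public static class IntElement implements Writable {
        int value;
        public IntElement() {}
        public IntElement(int value) { this.value = value; }
        @Override public void write(DataOutput out) throws IOException { out.writeInt(value); }
        @Override public void readFields(DataInput in) throws IOException { value = in.readInt(); }
    }

    public static void main(String[] args) throws IOException {
        ListWritable<IntElement> original = new ListWritable<>();
        original.add(new IntElement(7));
        original.add(new IntElement(42));

        // Round-trip through a byte buffer, as Hadoop would over the wire.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        original.write(new DataOutputStream(bytes));

        ListWritable<IntElement> copy = new ListWritable<>();
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));

        System.out.println(copy.get().size());        // 2
        System.out.println(copy.get().get(1).value);  // 42
    }
}
```

Note the trade-off: this sketch assumes all elements share one concrete class (it writes the first element's class name for the whole list), and an empty list carries no type information at all. For heterogeneous lists you would need to write a class name per element, at a cost in stream size.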

-- 
Thanks and regards,
guru
