From: Guruprasad DV <dv.guruprasad@gmail.com>
To: mapreduce-user@hadoop.apache.org
Date: Tue, 3 Jul 2012 13:25:35 +0530
Subject: implementing a generic list writable

Hi,

I am building a MapReduce pipeline of jobs, with one MR job's output feeding into the next as input. The values being passed around are fairly complex: there are lists of different types, and hash maps whose values are lists. The Hadoop API does not seem to provide a ListWritable. I am trying to write a generic one, but it seems I can't instantiate a generic type in my readFields implementation unless I pass in the class object itself:

    public class ListWritable<T extends Writable> implements Writable {
        private List<T> list;
        private Class<T> clazz;

        public ListWritable(Class<T> clazz) {
            this.clazz = clazz;
            list = new ArrayList<T>();
        }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeInt(list.size());
            for (T element : list) {
                element.write(out);
            }
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            int count = in.readInt();
            this.list = new ArrayList<T>();
            for (int i = 0; i < count; i++) {
                try {
                    T obj = clazz.newInstance();
                    obj.readFields(in);
                    list.add(obj);
                } catch (InstantiationException e) {
                    e.printStackTrace();
                } catch (IllegalAccessException e) {
                    e.printStackTrace();
                }
            }
        }
    }

But Hadoop requires all Writables to have a no-argument constructor so it can read the values back. Has anybody tried to do the same and solved this problem? TIA.

--
Thanks and regards,
guru
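One way around the no-argument-constructor requirement is to serialize the element class name alongside the data, so readFields can re-create the elements reflectively without a constructor argument. Below is a minimal self-contained sketch of that idea (this ListWritable is a hypothetical design, not a Hadoop API; the Writable interface here is a local stand-in for org.apache.hadoop.io.Writable so the example compiles without Hadoop on the classpath, and IntElem is an illustrative element type mirroring IntWritable):

```java
import java.io.*;
import java.util.*;

// Stand-in for org.apache.hadoop.io.Writable so this sketch is
// self-contained; in a real job, implement the Hadoop interface instead.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// A Writable list with a no-arg constructor: write() records the concrete
// element class name once, before the elements, so readFields() can
// reconstruct elements reflectively (hypothetical design, not a Hadoop API).
class ListWritable<T extends Writable> implements Writable {
    private List<T> list = new ArrayList<>();

    public ListWritable() {}   // the no-arg constructor Hadoop requires

    public void add(T element) { list.add(element); }
    public List<T> get() { return list; }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(list.size());
        if (!list.isEmpty()) {
            out.writeUTF(list.get(0).getClass().getName());
            for (T element : list) {
                element.write(out);
            }
        }
    }

    @Override
    @SuppressWarnings("unchecked")
    public void readFields(DataInput in) throws IOException {
        int count = in.readInt();
        list = new ArrayList<>(count);
        if (count == 0) return;
        String className = in.readUTF();
        try {
            Class<T> clazz = (Class<T>) Class.forName(className);
            for (int i = 0; i < count; i++) {
                T obj = clazz.getDeclaredConstructor().newInstance();
                obj.readFields(in);
                list.add(obj);
            }
        } catch (ReflectiveOperationException e) {
            // Surface reflection failures as IOException rather than
            // swallowing them with printStackTrace().
            throw new IOException("cannot re-create element type " + className, e);
        }
    }
}

// Minimal element type for the demo, mirroring Hadoop's IntWritable.
class IntElem implements Writable {
    int value;
    public IntElem() {}
    public IntElem(int v) { value = v; }
    @Override public void write(DataOutput out) throws IOException { out.writeInt(value); }
    @Override public void readFields(DataInput in) throws IOException { value = in.readInt(); }
}

public class Main {
    // Round-trip a list through a byte stream, as Hadoop's serializer would.
    public static List<Integer> demo() throws IOException {
        ListWritable<IntElem> original = new ListWritable<>();
        original.add(new IntElem(1));
        original.add(new IntElem(2));

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        original.write(new DataOutputStream(bytes));

        ListWritable<IntElem> copy = new ListWritable<>();   // no-arg ctor
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));

        List<Integer> values = new ArrayList<>();
        for (IntElem e : copy.get()) values.add(e.value);
        return values;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo());   // prints [1, 2]
    }
}
```

The cost of this approach is a small per-list overhead (the class name string) and the restriction that all elements share one concrete type; Hadoop's own ArrayWritable and GenericWritable make similar trade-offs.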
