Subject: Re: Reducer to output only json
From: Niels Basjes <niels@basj.es>
To: user@hadoop.apache.org
Date: Tue, 4 Jun 2013 22:08:49 +0200

Have you tried something like this? (I do not have a PC here to check this code.)

    context.write(NullWritable.get(), new Text(jsn.toString()));

On Jun 4, 2013 8:10 PM, "Chengi Liu" <chengi.liu.86@gmail.com> wrote:
> Hi,
>
> I have the following reducer class:
>
> public static class TokenCounterReducer
>     extends Reducer<Text, Text, Text, Text> {
>   public void reduce(Text key, Iterable<Text> values, Context context)
>       throws IOException, InterruptedException {
>
>     // String[] fields = s.split("\t", -1);
>     JSONObject jsn = new JSONObject();
>     int sum = 0;
>     for (Text value : values) {
>       String[] vals = value.toString().split("\t");
>       String[] targetNodes = vals[0].split(",", -1);
>       jsn.put("source", vals[1]);
>       jsn.put("target", targetNodes);
>       // sum += value.get();
>     }
>     // context.write(key, new Text(sum));
>   }
> }
>
> I want to save that JSON to HDFS.
>
> It was very trivial in Hadoop Streaming, but how do I do it in Hadoop Java?
> Thanks
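For completeness, below is a minimal, untested sketch of the whole reducer along the lines Niels suggests. The class name JsonReducer is made up for illustration, and it assumes the org.json JSONObject; swap in whatever JSON library is actually on the classpath. The essential points are the NullWritable output key and NullWritable.get().

import java.io.IOException;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.json.JSONObject; // assumption: org.json; use your own JSON library if different

// Emits one JSON object per key. Using NullWritable as the output key means
// the part files on HDFS contain only the JSON strings, one per line.
public class JsonReducer extends Reducer<Text, Text, NullWritable, Text> {

    @Override
    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        JSONObject jsn = new JSONObject();
        for (Text value : values) {
            String[] vals = value.toString().split("\t");
            String[] targetNodes = vals[0].split(",", -1);
            jsn.put("source", vals[1]);
            jsn.put("target", targetNodes); // with org.json the String[] is serialized as a JSON array
        }
        // NullWritable.get() returns the singleton instance; passing the class
        // name NullWritable itself (as in the one-liner above) will not compile.
        context.write(NullWritable.get(), new Text(jsn.toString()));
    }
}

The job setup has to match the reducer's output types: job.setOutputKeyClass(NullWritable.class) and job.setOutputValueClass(Text.class). With the default TextOutputFormat, a NullWritable key is skipped, so each line of the output files in the job's HDFS output directory is just the JSON string.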
