From: Mohammad Tariq <dontariq@gmail.com>
Date: Wed, 5 Jun 2013 01:45:51 +0530
Subject: Re: Reducer to output only json
To: user@hadoop.apache.org

Yes...This should do the trick.
Warm Regards,
Tariq
cloudfront.blogspot.com


On Wed, Jun 5, 2013 at 1:38 AM, Niels Basjes wrote:

> Have you tried something like this (I do not have a PC here to check this
> code)?
>
>     context.write(NullWritable.get(), new Text(jsn.toString()));
>
> On Jun 4, 2013 8:10 PM, "Chengi Liu" wrote:
>
>> Hi,
>>
>> I have the following reducer class:
>>
>>     public static class TokenCounterReducer
>>         extends Reducer<Text, Text, Text, Text> {
>>       public void reduce(Text key, Iterable<Text> values, Context context)
>>           throws IOException, InterruptedException {
>>
>>         // String[] fields = s.split("\t", -1)
>>         JSONObject jsn = new JSONObject();
>>         int sum = 0;
>>         for (Text value : values) {
>>           String[] vals = value.toString().split("\t");
>>           String[] targetNodes = vals[0].toString().split(",", -1);
>>           jsn.put("source", vals[1]);
>>           jsn.put("target", targetNodes);
>>           // sum += value.get();
>>         }
>>         // context.write(key, new Text(sum));
>>       }
>>     }
>>
>> I want to save that JSON to HDFS.
>>
>> It was very trivial in Hadoop streaming, but how do I do it in Hadoop
>> Java?
>> Thanks
>>
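For completeness, here is a minimal, untested sketch of what Niels suggests: the reducer emits a NullWritable key so the part files written to HDFS contain only the JSON strings. It assumes the org.json library (JSONObject/JSONArray) and the new org.apache.hadoop.mapreduce API; the class name JsonOutputReducer and the value layout ("targets<TAB>source") are taken from the code in the original question and are placeholders, not a confirmed design.

    import java.io.IOException;

    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.json.JSONArray;
    import org.json.JSONException;
    import org.json.JSONObject;

    public class JsonOutputReducer
        extends Reducer<Text, Text, NullWritable, Text> {

      @Override
      public void reduce(Text key, Iterable<Text> values, Context context)
          throws IOException, InterruptedException {
        for (Text value : values) {
          // Assumes each value looks like "targets<TAB>source", as in the original post.
          String[] vals = value.toString().split("\t");
          String[] targetNodes = vals[0].split(",", -1);

          // Build the JSON record; use an explicit JSONArray so "target" is a proper array.
          JSONObject jsn = new JSONObject();
          JSONArray targets = new JSONArray();
          for (String t : targetNodes) {
            targets.put(t);
          }
          try {
            jsn.put("source", vals[1]);
            jsn.put("target", targets);
          } catch (JSONException e) {
            throw new IOException("Could not build JSON record", e);
          }

          // NullWritable.get() returns the singleton instance; TextOutputFormat skips a
          // NullWritable key, so each output line is just the JSON string.
          context.write(NullWritable.get(), new Text(jsn.toString()));
        }
      }
    }

In the driver you would also set job.setOutputKeyClass(NullWritable.class) and job.setOutputValueClass(Text.class), plus job.setMapOutputKeyClass(Text.class) and job.setMapOutputValueClass(Text.class), since the map output types now differ from the final output types.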