Date: Tue, 4 Jun 2013 11:09:38 -0700
Subject: Reducer to output only json
From: Chengi Liu <chengi.liu.86@gmail.com>
To: user@hadoop.apache.org

Hi,

I have the following reducer class:

public static class TokenCounterReducer
    extends Reducer<Text, Text, Text, Text> {

  public void reduce(Text key, Iterable<Text> values, Context context)
      throws IOException, InterruptedException {
    // String[] fields = s.split("\t", -1)
    JSONObject jsn = new JSONObject();
    int sum = 0;
    for (Text value : values) {
      String[] vals = value.toString().split("\t");
      String[] targetNodes = vals[0].split(",", -1);
      jsn.put("source", vals[1]);
      jsn.put("target", targetNodes);
      // sum += value.get();
    }
    // context.write(key, new Text(sum));
  }
}

I want to save that JSON to HDFS. It was very trivial in Hadoop streaming, but how do I do it in Hadoop Java?

Thanks
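
One way this can look (a minimal sketch, not your actual job: it assumes Hadoop 2.x, that JSONObject is org.json's, and made-up driver/mapper names and paths). In the reducer, serialize the JSONObject and emit it; whatever the reducer passes to context.write is stored by the framework as part-r-NNNNN files under the job's HDFS output directory:

    // inside reduce(), after the for loop: emit the serialized JSON
    context.write(key, new Text(jsn.toString()));

and in the driver, point the output format at an HDFS path:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

    public class JsonOutputDriver {                        // hypothetical driver class
      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "reducer json output");
        job.setJarByClass(JsonOutputDriver.class);
        job.setMapperClass(TokenizerMapper.class);         // hypothetical mapper emitting Text/Text
        job.setReducerClass(TokenCounterReducer.class);    // the reducer shown above
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setOutputFormatClass(TextOutputFormat.class);  // writes one "key<TAB>json" line per context.write
        FileInputFormat.addInputPath(job, new Path(args[0]));      // input path on HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1]));    // output directory on HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

If you only want the JSON lines without the key, you can instead declare the reducer as Reducer<Text, Text, NullWritable, Text>, set NullWritable as the output key class, and call context.write(NullWritable.get(), new Text(jsn.toString())).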