Subject: Re: different input/output formats
From: Mark question
To: common-user@hadoop.apache.org
Date: Tue, 29 May 2012 14:15:33 -0700 (PDT)

Hi Samir, can you email me your main class ..
or, if you can check mine, it is as follows:

public class SortByNorm1 extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.printf("Usage: bin/hadoop jar norm1.jar <input path> <output path>\n");
            ToolRunner.printGenericCommandUsage(System.err);
            return -1;
        }
        JobConf conf = new JobConf(new Configuration(), SortByNorm1.class);
        conf.setJobName("SortDocByNorm1");
        conf.setMapperClass(Norm1Mapper.class);
        conf.setMapOutputKeyClass(FloatWritable.class);
        conf.setMapOutputValueClass(Text.class);
        conf.setNumReduceTasks(0);
        conf.setReducerClass(Norm1Reducer.class);
        conf.setOutputKeyClass(FloatWritable.class);
        conf.setOutputValueClass(Text.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(SequenceFileOutputFormat.class);

        TextInputFormat.addInputPath(conf, new Path(args[0]));
        SequenceFileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new SortByNorm1(), args);
        System.exit(exitCode);
    }
}

On Tue, May 29, 2012 at 1:55 PM, samir das mohapatra <samir.helpdoc@gmail.com> wrote:

> Hi Mark,
>    See the output for that same application.
>    I am not getting any error.
>
> On Wed, May 30, 2012 at 1:27 AM, Mark question wrote:
>
>> Hi guys, this is a very simple program trying to use TextInputFormat and
>> SequenceFileOutputFormat. It should be easy, but I get the same error.
>>
>> Here is my configuration:
>>
>> conf.setMapperClass(myMapper.class);
>> conf.setMapOutputKeyClass(FloatWritable.class);
>> conf.setMapOutputValueClass(Text.class);
>> conf.setNumReduceTasks(0);
>> conf.setOutputKeyClass(FloatWritable.class);
>> conf.setOutputValueClass(Text.class);
>>
>> conf.setInputFormat(TextInputFormat.class);
>> conf.setOutputFormat(SequenceFileOutputFormat.class);
>>
>> TextInputFormat.addInputPath(conf, new Path(args[0]));
>> SequenceFileOutputFormat.setOutputPath(conf, new Path(args[1]));
>>
>> The myMapper class is:
>>
>> public class myMapper extends MapReduceBase implements
>>         Mapper<LongWritable, Text, FloatWritable, Text> {
>>
>>     public void map(LongWritable offset, Text val,
>>             OutputCollector<FloatWritable, Text> output, Reporter reporter)
>>             throws IOException {
>>         output.collect(new FloatWritable(1), val);
>>     }
>> }
>>
>> But I get the following error:
>>
>> 12/05/29 12:54:31 INFO mapreduce.Job: Task Id :
>> attempt_201205260045_0032_m_000000_0, Status : FAILED
>> java.io.IOException: wrong key class: org.apache.hadoop.io.LongWritable is
>> not class org.apache.hadoop.io.FloatWritable
>>     at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:998)
>>     at org.apache.hadoop.mapred.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:75)
>>     at org.apache.hadoop.mapred.MapTask$DirectMapOutputCollector.collect(MapTask.java:705)
>>     at org.apache.hadoop.mapred.MapTask$OldOutputCollector.collect(MapTask.java:508)
>>     at filter.stat.cosine.preprocess.SortByNorm1$Norm1Mapper.map(SortByNorm1.java:59)
>>     at filter.stat.cosine.preprocess.SortByNorm1$Norm1Mapper.map(SortByNorm1.java:1)
>>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:397)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at org.apache.hadoop.security.Use
>>
>> Where is the writing of LongWritable
>> coming from?
>>
>> Thank you,
>> Mark
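
For reference, below is a minimal, self-contained sketch of an old-API map-only job that reads text and writes a SequenceFile with FloatWritable keys. The class, job, and jar names here (FloatKeyExample, FloatKeyMapper, "float-key-example", floatkey.jar) are made up for illustration and are not from this thread. The detail it tries to make visible: with setNumReduceTasks(0), map output goes straight through SequenceFileOutputFormat, and SequenceFile.Writer.append checks each emitted key against the key class the writer was created with (taken from conf.setOutputKeyClass), so every output.collect call must pass exactly that key type. Emitting the LongWritable offset as the key, for example, produces the same "wrong key class" IOException quoted above.

import java.io.IOException;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical class name, used only for this sketch.
public class FloatKeyExample extends Configured implements Tool {

    // Map-only job: with zero reducers, map output is written directly by
    // SequenceFileOutputFormat, so the key emitted here must be the same
    // class as conf.setOutputKeyClass() below (FloatWritable).
    public static class FloatKeyMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, FloatWritable, Text> {
        public void map(LongWritable offset, Text line,
                OutputCollector<FloatWritable, Text> out, Reporter reporter)
                throws IOException {
            // Emitting the LongWritable offset here instead would trigger
            // "wrong key class: LongWritable is not FloatWritable".
            out.collect(new FloatWritable(line.getLength()), line);
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.printf("Usage: bin/hadoop jar floatkey.jar FloatKeyExample <input path> <output path>\n");
            ToolRunner.printGenericCommandUsage(System.err);
            return -1;
        }
        JobConf conf = new JobConf(getConf(), FloatKeyExample.class);
        conf.setJobName("float-key-example");

        conf.setMapperClass(FloatKeyMapper.class);
        conf.setNumReduceTasks(0);                    // map-only
        conf.setOutputKeyClass(FloatWritable.class);  // SequenceFile key class
        conf.setOutputValueClass(Text.class);         // SequenceFile value class

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(SequenceFileOutputFormat.class);

        TextInputFormat.addInputPath(conf, new Path(args[0]));
        SequenceFileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new FloatKeyExample(), args));
    }
}

Assuming the class is packed into a jar named floatkey.jar, it could be run with
bin/hadoop jar floatkey.jar FloatKeyExample <input path> <output path>, and the
resulting SequenceFile inspected with hadoop fs -text on the part files in the
output directory.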