Date: Wed, 27 Feb 2013 11:33:16 +0400
Subject: java.lang.NumberFormatException and Thanks to Hemanth and Harsh
From: Fatih Haltas
To: user@hadoop.apache.org

Hi all,

First, I would like to thank you all, especially Hemanth and Harsh. I solved my earlier problem: it was indeed a Java version / Hadoop version incompatibility, and I can now run my compiled and jarred MapReduce program.

I have a different question now. I wrote a job that counts each IP's number of packets within a given time interval in NetFlow data. However, I am getting a java.lang.NumberFormatException.

*************************************************************************
1.
Here is my code in Java:
*************************************************************************

package org.myorg;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MapReduce extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        System.out.println("Debug1");
        if (args.length != 2) {
            System.err.println("Usage: MapReduce <input path> <output path>");
            ToolRunner.printGenericCommandUsage(System.err);
        }

        Job job = new Job();
        job.setJarByClass(MapReduce.class);
        System.out.println("Debug2");
        job.setJobName("MaximumPacketFlowIP");
        System.out.println("Debug3");

        FileInputFormat.addInputPath(job, new Path(args[0]));
        System.out.println("Debug8");
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.out.println("Debug9");

        job.setMapperClass(FlowPortMapper.class);
        System.out.println("Debug6");
        job.setReducerClass(FlowPortReducer.class);
        System.out.println("Debug7");

        job.setOutputKeyClass(Text.class);
        System.out.println("Debug4");
        job.setOutputValueClass(IntWritable.class);
        System.out.println("Debug5");

        //System.exit(job.waitForCompletion(true) ? 0 : 1);
        return job.waitForCompletion(true) ? 0 : 1;
    }

    /* ---------------------- main --------------------- */
    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MapReduce(), args);
        System.exit(exitCode);
    }

    /* ---------------------- Mapper ------------------- */
    static class FlowPortMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String flow = value.toString();
            long starttime = 0;
            long endtime = 0;
            long time1 = 1357289339;
            long time2 = 1357289342;
            StringTokenizer line = new StringTokenizer(flow);
            String internalip = "i";

            // Getting the internalip from flow
            if (line.hasMoreTokens())
                internalip = line.nextToken();

            // Getting the starttime and endtime from flow
            for (int i = 0; i < 9; i++)
                if (line.hasMoreTokens())
                    starttime = Long.parseLong(line.nextToken());
            if (line.hasMoreTokens())
                endtime = Long.parseLong(line.nextToken());

            // If the time is in the given interval then emit 1
            if (starttime >= time1 && endtime <= time2)
                context.write(new Text(internalip), new IntWritable(1));
        }
    }

    /* ---------------------- Reducer ------------------ */
    static class FlowPortReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int numberOfOccurence = 0;
            for (IntWritable value : values)
                numberOfOccurence += value.get();
            context.write(key, new IntWritable(numberOfOccurence));
        }
    }
}

******************************************************************************
2. Here is the error I am getting:
******************************************************************************

[hadoop@ADUAE042-LAP-V flowtimeclasses2602]$ hadoop jar flowtime2602_1841.jar org.myorg.MapReduce /user/hadoop/NetFlow2 test1106.out
Warning: $HADOOP_HOME is deprecated.

Debug1
Debug2
Debug3
Debug8
Debug9
Debug6
Debug7
Debug4
Debug5
13/02/27 10:56:00 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/02/27 10:56:00 INFO input.FileInputFormat: Total input paths to process : 1
13/02/27 10:56:00 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/02/27 10:56:00 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/27 10:56:00 INFO mapred.JobClient: Running job: job_201302261146_0014
13/02/27 10:56:01 INFO mapred.JobClient:  map 0% reduce 0%
13/02/27 10:56:13 INFO mapred.JobClient: Task Id : attempt_201302261146_0014_m_000000_0, Status : FAILED
java.lang.NumberFormatException: For input string: "Data"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Long.parseLong(Long.java:410)
        at java.lang.Long.parseLong(Long.java:468)
        at org.myorg.MapReduce$FlowPortMapper.map(MapReduce.java:106)
        at org.myorg.MapReduce$FlowPortMapper.map(MapReduce.java:86)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

(The same NumberFormatException and stack trace repeat for attempt_201302261146_0014_m_000000_1 at 10:56:20 and attempt_201302261146_0014_m_000000_2 at 10:56:28.)

13/02/27 10:56:40 INFO mapred.JobClient: Job complete: job_201302261146_0014
13/02/27 10:56:40 INFO mapred.JobClient: Counters: 8
13/02/27 10:56:40 INFO mapred.JobClient:   Job Counters
13/02/27 10:56:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=33174
13/02/27 10:56:40 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/02/27 10:56:40 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/02/27 10:56:40 INFO mapred.JobClient:     Rack-local map tasks=1
13/02/27 10:56:40 INFO mapred.JobClient:     Launched map tasks=4
13/02/27 10:56:40 INFO mapred.JobClient:     Data-local map tasks=3
13/02/27 10:56:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/02/27 10:56:40 INFO mapred.JobClient:     Failed map tasks=1

**********************************************************************************
3. As you can see, I added some debug print lines throughout the driver code.
**********************************************************************************
4. My data is NetFlow data. It was originally a .txt file; I put it into HDFS, where its info is as follows:
********************************************************
[hadoop@ADUAE042-LAP-V flowtimeclasses2602]$ hadoop dfs -lsr /user/hadoop/NetFlow2
Warning: $HADOOP_HOME is deprecated.

-rw-r--r--   2 hadoop supergroup   14187484 2013-02-26 12:03 /user/hadoop/NetFlow2
*********************************************************

What may be the reason for this, and how can I fix it?
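For context on the failure mode: "For input string: \"Data\"" means Long.parseLong received the literal word "Data", which suggests the input file may contain a non-numeric header line (and note that the brace-less for loop in the mapper applies Long.parseLong to every one of the nine tokens it visits, not just the last one). A minimal sketch of guarding the parse so malformed records are skipped instead of failing the task — the class and helper names here are hypothetical illustrations, not part of the original post:

```java
// Hypothetical helper, not from the original post: parse a token as a long,
// falling back to a sentinel when the token is not numeric (e.g. a header
// line whose fields are words like "Data").
public class SafeParse {
    static long parseLongOrDefault(String token, long fallback) {
        try {
            return Long.parseLong(token.trim());
        } catch (NumberFormatException e) {
            // Skip header / malformed records instead of killing the map task.
            return fallback;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseLongOrDefault("1357289339", -1L)); // prints 1357289339
        System.out.println(parseLongOrDefault("Data", -1L));       // prints -1
    }
}
```

In a mapper, a record whose timestamp fields come back as the sentinel could simply be ignored (return without writing), which is a common defensive pattern when input files mix header lines with data lines.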