From: cutting@apache.org
To: hadoop-commits@lucene.apache.org
Reply-To: hadoop-dev@lucene.apache.org
Subject: svn commit: r464718 - in /lucene/hadoop/trunk: CHANGES.txt src/java/org/apache/hadoop/mapred/JobInProgress.java src/webapps/job/analysejobhistory.jsp
Date: Mon, 16 Oct 2006 23:15:01 -0000
Message-Id: <20061016231502.2713A1A981A@eris.apache.org>

Author: cutting
Date: Mon Oct 16 16:15:00 2006
New Revision: 464718

URL: http://svn.apache.org/viewvc?view=rev&rev=464718
Log:
HADOOP-588.  Fix logging and accounting of failed tasks.  Contributed by Sanjay.

Modified:
    lucene/hadoop/trunk/CHANGES.txt
    lucene/hadoop/trunk/src/java/org/apache/hadoop/mapred/JobInProgress.java
    lucene/hadoop/trunk/src/webapps/job/analysejobhistory.jsp

Modified: lucene/hadoop/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/CHANGES.txt?view=diff&rev=464718&r1=464717&r2=464718
==============================================================================
--- lucene/hadoop/trunk/CHANGES.txt (original)
+++ lucene/hadoop/trunk/CHANGES.txt Mon Oct 16 16:15:00 2006
@@ -20,6 +20,9 @@
  5. HADOOP-514.  Make DFS heartbeat interval configurable.
     (Milind Bhandarkar via cutting)
 
+ 6. HADOOP-588.  Fix logging and accounting of failed tasks.
+    (Sanjay Dahiya via cutting)
+
 Release 0.7.1 - 2006-10-11
 

Modified: lucene/hadoop/trunk/src/java/org/apache/hadoop/mapred/JobInProgress.java
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/java/org/apache/hadoop/mapred/JobInProgress.java?view=diff&rev=464718&r1=464717&r2=464718
==============================================================================
--- lucene/hadoop/trunk/src/java/org/apache/hadoop/mapred/JobInProgress.java (original)
+++ lucene/hadoop/trunk/src/java/org/apache/hadoop/mapred/JobInProgress.java Mon Oct 16 16:15:00 2006
@@ -563,9 +563,8 @@
       for (int i = 0; i < reduces.length; i++) {
         reduces[i].kill();
       }
-      JobHistory.JobInfo.logFinished(this.status.getJobId(), finishTime, 
-          this.finishedMapTasks, this.finishedReduceTasks, failedMapTasks, 
-          failedReduceTasks);
+      JobHistory.JobInfo.logFailed(this.status.getJobId(), finishTime, 
+          this.finishedMapTasks, this.finishedReduceTasks);
       garbageCollect();
     }
   }
@@ -638,15 +637,15 @@
     //
     if (tip.isFailed()) {
       LOG.info("Aborting job " + profile.getJobId());
+      JobHistory.Task.logFailed(profile.getJobId(), tip.getTIPId(), 
+          tip.isMapTask() ? Values.MAP.name():Values.REDUCE.name(), 
+          System.currentTimeMillis(), status.getDiagnosticInfo());
       JobHistory.JobInfo.logFailed(profile.getJobId(), 
           System.currentTimeMillis(), this.finishedMapTasks, 
           this.finishedReduceTasks);
       kill();
     }
     jobtracker.removeTaskEntry(taskid);
-    JobHistory.Task.logFailed(profile.getJobId(), tip.getTIPId(), 
-        tip.isMapTask() ? Values.MAP.name():Values.REDUCE.name(), 
-        System.currentTimeMillis(), status.getDiagnosticInfo());
   }
 
   /**

Modified: lucene/hadoop/trunk/src/webapps/job/analysejobhistory.jsp
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/webapps/job/analysejobhistory.jsp?view=diff&rev=464718&r1=464717&r2=464718
==============================================================================
--- lucene/hadoop/trunk/src/webapps/job/analysejobhistory.jsp (original)
+++ lucene/hadoop/trunk/src/webapps/job/analysejobhistory.jsp Mon Oct 16 16:15:00 2006
@@ -38,14 +38,13 @@
 <%
-	Map tasks = job.getAllTasks(); 
-	int finishedMaps = job.getInt(Keys.FINISHED_MAPS) ;
-	int finishedReduces = job.getInt(Keys.FINISHED_REDUCES) ; 
-	if( finishedMaps == 0 || finishedReduces == 0 ){
+	if( ! Values.SUCCESS.name().equals(job.get(Keys.JOB_STATUS)) ){
 	  out.print("<h3>No Analysis available as job did not finish</h3>");
 	  return ;
 	}
-	
+	Map tasks = job.getAllTasks(); 
+	int finishedMaps = job.getInt(Keys.FINISHED_MAPS) ;
+	int finishedReduces = job.getInt(Keys.FINISHED_REDUCES) ; 
 	JobHistory.Task [] mapTasks = new JobHistory.Task[finishedMaps]; 
 	JobHistory.Task [] reduceTasks = new JobHistory.Task[finishedReduces]; 
 	int mapIndex = 0 , reduceIndex=0; 