Mailing-List: contact yarn-commits-help@hadoop.apache.org; run by ezmlm
Reply-To: yarn-commits@hadoop.apache.org
Subject: svn commit: r1453669 [1/2] - in /hadoop/common/branches/HDFS-2802/hadoop-yarn-project: ./ hadoop-yarn/ hadoop-yarn/bin/ hadoop-yarn/conf/ hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ hadoop-yarn/hadoop-yarn-applications/ hadoop...
Date: Thu, 07 Mar 2013 02:57:51 -0000
To: yarn-commits@hadoop.apache.org
From: szetszwo@apache.org
Message-Id: <20130307025753.DF6F32388BFF@eris.apache.org>

Author: szetszwo
Date: Thu Mar 7 02:57:40 2013
New Revision: 1453669

URL: http://svn.apache.org/r1453669
Log:
Merge r1449958 through r1453659 from trunk.

Added:
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/start-yarn.cmd
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/bin/start-yarn.cmd
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/stop-yarn.cmd
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/bin/stop-yarn.cmd
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn-config.cmd
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/bin/yarn-config.cmd
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn.cmd
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/bin/yarn.cmd
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/conf/yarn-env.cmd
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/conf/yarn-env.cmd
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/WindowsBasedProcessTree.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/WindowsBasedProcessTree.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/WindowsResourceCalculatorPlugin.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/WindowsResourceCalculatorPlugin.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestWindowsBasedProcessTree.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestWindowsBasedProcessTree.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestWindowsResourceCalculatorPlugin.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestWindowsResourceCalculatorPlugin.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/webapp/view/TestInfoBlock.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/webapp/view/TestInfoBlock.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/rmnode/UpdatedContainerInfo.java
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/rmnode/UpdatedContainerInfo.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-tests/src/test/resources/capacity-scheduler.xml
      - copied unchanged from r1453659, hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-tests/src/test/resources/capacity-scheduler.xml

Modified:
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/CHANGES.txt
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/pom.xml
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/ApplicationCLI.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/NodeCLI.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/cli/TestYarnCLI.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/logaggregation/AggregatedLogFormat.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ProcfsBasedProcessTree.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorPlugin.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorProcessTree.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/view/InfoBlock.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestProcfsBasedProcessTree.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestRackResolver.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/ContainerExecutor.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/LocalDirsHandlerService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationEventType.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationImpl.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/LogAggregationService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/util/ProcessIdFileReader.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/TestResourceLocalizationService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/TestLogAggregationService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/java/org/apache/hadoop/yarn/server/nodemanager/util/TestProcessIdFileReader.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/ClientRMService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/ResourceTrackerService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/rmapp/attempt/RMAppAttemptImpl.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/rmnode/RMNode.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/rmnode/RMNodeImpl.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/scheduler/capacity/CapacityScheduler.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/scheduler/event/NodeUpdateSchedulerEvent.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/scheduler/fair/FairScheduler.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/main/java/org/apache/hadoop/yarn/server/resourcemanager/scheduler/fifo/FifoScheduler.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/MockNodes.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/TestClientRMService.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/TestFifoScheduler.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/TestRMNodeTransitions.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/rmapp/attempt/TestRMAppAttemptTransitions.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/scheduler/fair/TestFairScheduler.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-tests/src/test/java/org/apache/hadoop/yarn/server/MiniYARNCluster.java
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/pom.xml
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/pom.xml
    hadoop/common/branches/HDFS-2802/hadoop-yarn-project/pom.xml

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/CHANGES.txt?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/CHANGES.txt (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/CHANGES.txt Thu Mar 7 02:57:40 2013
@@ -6,6 +6,10 @@ Trunk - Unreleased

   NEW FEATURES

+    HADOOP-8562. Enhancements to support Hadoop on Windows Server and Windows
+    Azure environments. (See breakdown of tasks below for subtasks and
+    contributors)
+
   IMPROVEMENTS

     YARN-84. Use Builder to build RPC server. (Brandon Li via suresh)

@@ -14,6 +18,36 @@ Trunk - Unreleased

   BUG FIXES

+  BREAKDOWN OF HADOOP-8562 SUBTASKS
+
+    YARN-158. Yarn creating package-info.java must not depend on sh.
+    (Chris Nauroth via suresh)
+
+    YARN-176. Some YARN tests fail to find winutils. (Chris Nauroth via suresh)
+
+    YARN-207. YARN distribution build fails on Windows. (Chris Nauroth via
+    suresh)
+
+    YARN-199. Yarn cmd line scripts for windows. (Ivan Mitic via suresh)
+
+    YARN-213. YARN build script would be more readable using abspath.
+    (Chris Nauroth via suresh)
+
+    YARN-233. Added support for running containers in MS Windows to YARN. (Chris
+    Nauroth via acmurthy)
+
+    YARN-234. Added support for process tree and resource calculator in MS Windows
+    to YARN. (Chris Nauroth via acmurthy)
+
+    YARN-259. Fix LocalDirsHandlerService to use Path rather than URIs. (Xuan
+    Gong via acmurthy)
+
+    YARN-316. YARN container launch may exceed maximum Windows command line
+    length due to long classpath. (Chris Nauroth via suresh)
+
+    YARN-359. Fixing commands for container signalling in Windows. (Chris Nauroth
+    via vinodkv)
+
 Release 2.0.4-beta - UNRELEASED

   INCOMPATIBLE CHANGES

@@ -22,6 +56,16 @@ Release 2.0.4-beta - UNRELEASED

   IMPROVEMENTS

+    YARN-365. Change NM heartbeat handling to not generate a scheduler event
+    on each heartbeat. (Xuan Gong via sseth)
+
+    YARN-380. Fix yarn node -status output to be better readable. (Omkar Vinit
+    Joshi via vinodkv)
+
+    YARN-410. Fixed RM UI so that the new lines diagnostics for a failed app on
+    the per-application page are translated to html line breaks. (Omkar Vinit
+    Joshi via vinodkv)
+
   OPTIMIZATIONS

   BUG FIXES

@@ -38,6 +82,18 @@ Release 2.0.4-beta - UNRELEASED

     YARN-391. Formatting fixes for LCEResourceHandler classes. (Steve Loughran
     via sseth)

+    YARN-390. ApplicationCLI and NodeCLI hard-coded platform-specific line
+    separator causes test failures on Windows. (Chris Nauroth via suresh)
+
+    YARN-406. Fix TestRackResolver to function in networks where "host1"
+    resolves to a valid host. (Hitesh Shah via sseth)
+
+    YARN-376. Fixes a bug which would prevent the NM knowing about completed
+    containers and applications. (Jason Lowe via sseth)
+
+    YARN-429. capacity-scheduler config missing from yarn-test artifact.
+    (sseth via hitesh)
+
 Release 2.0.3-alpha - 2013-02-06

   INCOMPATIBLE CHANGES

@@ -319,6 +375,12 @@ Release 0.23.7 - UNRELEASED

     YARN-236. RM should point tracking URL to RM web page when app fails to
     start (Jason Lowe via jeagles)

+    YARN-269. Resource Manager not logging the health_check_script result when
+    taking it out (Jason Lowe via kihwal)
+
+    YARN-227. Application expiration difficult to debug for end-users
+    (Jason Lowe via jeagles)
+
   OPTIMIZATIONS

     YARN-357. App submission should not be synchronized (daryn)

@@ -337,6 +399,15 @@ Release 0.23.7 - UNRELEASED

     YARN-400. RM can return null application resource usage report leading to
     NPE in client (Jason Lowe via tgraves)

+    YARN-426. Failure to download a public resource prevents further downloads
+    (Jason Lowe via bobby)
+
+    YARN-448. Remove unnecessary hflush from log aggregation (Kihwal Lee via
+    bobby)
+
+    YARN-345. Many InvalidStateTransitonException errors for ApplicationImpl
+    in Node Manager (Robert Parker via jlowe)
+
 Release 0.23.6 - UNRELEASED

   INCOMPATIBLE CHANGES

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/bin/yarn Thu Mar 7 02:57:40 2013
@@ -72,11 +72,6 @@ function print_usage(){
   echo "Most commands print help when invoked w/o parameters."
 }

-cygwin=false
-case "`uname`" in
-CYGWIN*) cygwin=true;;
-esac
-
 # if no args specified, show usage
 if [ $# = 0 ]; then
   print_usage

@@ -177,9 +172,6 @@ unset IFS

 # figure out which class to run
 if [ "$COMMAND" = "classpath" ] ; then
-  if $cygwin; then
-    CLASSPATH=`cygpath -p -w "$CLASSPATH"`
-  fi
   echo $CLASSPATH
   exit
 elif [ "$COMMAND" = "rmadmin" ] ; then

@@ -227,19 +219,6 @@ else
   CLASS=$COMMAND
 fi

-# cygwin path translation
-if $cygwin; then
-  CLASSPATH=`cygpath -p -w "$CLASSPATH"`
-  HADOOP_YARN_HOME=`cygpath -w "$HADOOP_YARN_HOME"`
-  YARN_LOG_DIR=`cygpath -w "$YARN_LOG_DIR"`
-  TOOL_PATH=`cygpath -p -w "$TOOL_PATH"`
-fi
-
-# cygwin path translation
-if $cygwin; then
-  JAVA_LIBRARY_PATH=`cygpath -p "$JAVA_LIBRARY_PATH"`
-fi
-
 YARN_OPTS="$YARN_OPTS -Dhadoop.log.dir=$YARN_LOG_DIR"
 YARN_OPTS="$YARN_OPTS -Dyarn.log.dir=$YARN_LOG_DIR"
 YARN_OPTS="$YARN_OPTS -Dhadoop.log.file=$YARN_LOGFILE"

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api/src/main/java/org/apache/hadoop/yarn/api/ApplicationConstants.java Thu Mar 7 02:57:40 2013
@@ -19,6 +19,7 @@ package org.apache.hadoop.yarn.api;

 import org.apache.hadoop.security.UserGroupInformation;
+import org.apache.hadoop.util.Shell;

 /**
  * This is the API for the applications comprising of constants that YARN sets
@@ -192,7 +193,11 @@ public interface ApplicationConstants {
     }

     public String $() {
-      return "$" + variable;
+      if (Shell.WINDOWS) {
+        return "%" + variable + "%";
+      } else {
+        return "$" + variable;
+      }
     }
   }
 }

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/pom.xml
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/pom.xml?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/pom.xml (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/pom.xml Thu Mar 7 02:57:40 2013
@@ -32,6 +32,28 @@
     hadoop-yarn-applications-distributedshell
     hadoop-yarn-applications-unmanaged-am-launcher
+
+  <build>
+    <plugins>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-surefire-plugin</artifactId>
+        <configuration>
+          <additionalClasspathElements>
+            <additionalClasspathElement>${basedir}/../../../../hadoop-common-project/hadoop-common/target</additionalClasspathElement>
+          </additionalClasspathElements>
+          <properties>
+            <property>
+              <name>listener</name>
+              <value>org.apache.hadoop.test.TimedOutTestsListener</value>
+            </property>
+          </properties>
+        </configuration>
+      </plugin>
+    </plugins>
+  </build>
+
   clover

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/ApplicationCLI.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/ApplicationCLI.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/ApplicationCLI.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/ApplicationCLI.java Thu Mar 7 02:57:40 2013
@@ -17,6 +17,8 @@
  */
 package org.apache.hadoop.yarn.client.cli;

+import java.io.ByteArrayOutputStream;
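The ApplicationConstants hunk above switches `Environment.$()` between POSIX (`$VAR`) and cmd.exe (`%VAR%`) syntax. A minimal standalone sketch of the same idiom follows; the class name `EnvVarSyntax` and the `os.name` check are invented here to stand in for `org.apache.hadoop.util.Shell.WINDOWS`, so this is an illustration of the technique rather than Hadoop code:

```java
// Standalone sketch of the cross-platform variable-reference idiom from the
// ApplicationConstants.$() change above. EnvVarSyntax and the os.name test
// are illustrative stand-ins, not Hadoop code.
public class EnvVarSyntax {
    // Shell.WINDOWS in Hadoop is essentially a check like this one.
    static final boolean WINDOWS =
        System.getProperty("os.name").startsWith("Windows");

    // %VAR% expands under cmd.exe; $VAR expands under POSIX shells.
    static String reference(String variable) {
        return WINDOWS ? "%" + variable + "%" : "$" + variable;
    }

    public static void main(String[] args) {
        System.out.println(reference("HADOOP_YARN_HOME"));
    }
}
```

Container launch scripts composed from such references then expand correctly under either shell without the caller hard-coding one syntax.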
+import java.io.IOException;
 import java.io.PrintWriter;
 import java.util.List;

@@ -31,7 +33,9 @@ import org.apache.hadoop.yarn.exceptions
 import org.apache.hadoop.yarn.util.ConverterUtils;

 public class ApplicationCLI extends YarnCLI {
-  private static final String APPLICATIONS_PATTERN = "%30s\t%20s\t%10s\t%10s\t%18s\t%18s\t%35s\n";
+  private static final String APPLICATIONS_PATTERN =
+    "%30s\t%20s\t%10s\t%10s\t%18s\t%18s\t%35s" +
+      System.getProperty("line.separator");

   public static void main(String[] args) throws Exception {
     ApplicationCLI cli = new ApplicationCLI();
@@ -123,37 +127,40 @@ public class Yarn
    * @throws YarnRemoteException
    */
   private void printApplicationReport(String applicationId)
-      throws YarnRemoteException {
+      throws YarnRemoteException, IOException {
     ApplicationReport appReport = client.getApplicationReport(ConverterUtils
         .toApplicationId(applicationId));
-    StringBuffer appReportStr = new StringBuffer();
+    // Use PrintWriter.println, which uses correct platform line ending.
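The comment above names the pattern used throughout these hunks: print into a `PrintWriter` over a `ByteArrayOutputStream` so that `println` emits the platform line separator instead of a hard-coded `"\n"`. A self-contained sketch under those assumptions (`ReportBuffer` and `buildReport` are made-up names; the fields only loosely mirror the real report):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintWriter;
import java.io.UnsupportedEncodingException;

public class ReportBuffer {
    // Build a multi-line report; println terminates each line with
    // System.getProperty("line.separator"): "\r\n" on Windows, "\n" elsewhere.
    static String buildReport() throws UnsupportedEncodingException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        PrintWriter pw = new PrintWriter(baos);
        pw.println("Application Report : ");
        pw.print("\tApplication-Id : ");
        pw.println("application_1234_0005");
        pw.close();  // flush the writer into baos before reading it back
        return baos.toString("UTF-8");
    }

    public static void main(String[] args) throws Exception {
        System.out.print(buildReport());
    }
}
```

A test that builds its expected string the same way, as the TestYarnCLI hunks below do, then passes on both Windows and POSIX systems, which is the point of YARN-390.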
+    ByteArrayOutputStream baos = new ByteArrayOutputStream();
+    PrintWriter appReportStr = new PrintWriter(baos);
     if (appReport != null) {
-      appReportStr.append("Application Report : ");
-      appReportStr.append("\n\tApplication-Id : ");
-      appReportStr.append(appReport.getApplicationId());
-      appReportStr.append("\n\tApplication-Name : ");
-      appReportStr.append(appReport.getName());
-      appReportStr.append("\n\tUser : ");
-      appReportStr.append(appReport.getUser());
-      appReportStr.append("\n\tQueue : ");
-      appReportStr.append(appReport.getQueue());
-      appReportStr.append("\n\tStart-Time : ");
-      appReportStr.append(appReport.getStartTime());
-      appReportStr.append("\n\tFinish-Time : ");
-      appReportStr.append(appReport.getFinishTime());
-      appReportStr.append("\n\tState : ");
-      appReportStr.append(appReport.getYarnApplicationState());
-      appReportStr.append("\n\tFinal-State : ");
-      appReportStr.append(appReport.getFinalApplicationStatus());
-      appReportStr.append("\n\tTracking-URL : ");
-      appReportStr.append(appReport.getOriginalTrackingUrl());
-      appReportStr.append("\n\tDiagnostics : ");
-      appReportStr.append(appReport.getDiagnostics());
+      appReportStr.println("Application Report : ");
+      appReportStr.print("\tApplication-Id : ");
+      appReportStr.println(appReport.getApplicationId());
+      appReportStr.print("\tApplication-Name : ");
+      appReportStr.println(appReport.getName());
+      appReportStr.print("\tUser : ");
+      appReportStr.println(appReport.getUser());
+      appReportStr.print("\tQueue : ");
+      appReportStr.println(appReport.getQueue());
+      appReportStr.print("\tStart-Time : ");
+      appReportStr.println(appReport.getStartTime());
+      appReportStr.print("\tFinish-Time : ");
+      appReportStr.println(appReport.getFinishTime());
+      appReportStr.print("\tState : ");
+      appReportStr.println(appReport.getYarnApplicationState());
+      appReportStr.print("\tFinal-State : ");
+      appReportStr.println(appReport.getFinalApplicationStatus());
+      appReportStr.print("\tTracking-URL : ");
+      appReportStr.println(appReport.getOriginalTrackingUrl());
+      appReportStr.print("\tDiagnostics : ");
+      appReportStr.print(appReport.getDiagnostics());
     } else {
-      appReportStr.append("Application with id '" + applicationId
+      appReportStr.print("Application with id '" + applicationId
           + "' doesn't exist in RM.");
     }
-    sysout.println(appReportStr.toString());
+    appReportStr.close();
+    sysout.println(baos.toString("UTF-8"));
   }
-}
\ No newline at end of file
+}

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/NodeCLI.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/NodeCLI.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/NodeCLI.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/main/java/org/apache/hadoop/yarn/client/cli/NodeCLI.java Thu Mar 7 02:57:40 2013
@@ -17,13 +17,17 @@
  */
 package org.apache.hadoop.yarn.client.cli;

+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
 import java.io.PrintWriter;
+import java.util.Date;
 import java.util.List;

 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.GnuParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
+import org.apache.commons.lang.time.DateFormatUtils;
 import org.apache.hadoop.util.ToolRunner;
 import org.apache.hadoop.yarn.api.records.NodeId;
 import org.apache.hadoop.yarn.api.records.NodeReport;
@@ -31,7 +35,9 @@ import org.apache.hadoop.yarn.exceptions
 import org.apache.hadoop.yarn.util.ConverterUtils;

 public class NodeCLI extends YarnCLI {
-  private static final String NODES_PATTERN = "%16s\t%10s\t%17s\t%26s\t%18s\n";
+  private static final String NODES_PATTERN = "%16s\t%10s\t%17s\t%26s\t%18s"
+      + System.getProperty("line.separator");
+
   public static void main(String[] args) throws Exception {
     NodeCLI cli = new NodeCLI();
     cli.setSysOutPrintStream(System.out);
@@ -100,48 +106,52 @@ public class NodeCLI extends YarnCLI {
    * @param nodeIdStr
    * @throws YarnRemoteException
    */
-  private void printNodeStatus(String nodeIdStr) throws YarnRemoteException {
+  private void printNodeStatus(String nodeIdStr) throws YarnRemoteException,
+      IOException {
     NodeId nodeId = ConverterUtils.toNodeId(nodeIdStr);
     List<NodeReport> nodesReport = client.getNodeReports();
-    StringBuffer nodeReportStr = new StringBuffer();
+    // Use PrintWriter.println, which uses correct platform line ending.
+    ByteArrayOutputStream baos = new ByteArrayOutputStream();
+    PrintWriter nodeReportStr = new PrintWriter(baos);
     NodeReport nodeReport = null;
     for (NodeReport report : nodesReport) {
       if (!report.getNodeId().equals(nodeId)) {
         continue;
       }
       nodeReport = report;
-      nodeReportStr.append("Node Report : ");
-      nodeReportStr.append("\n\tNode-Id : ");
-      nodeReportStr.append(nodeReport.getNodeId());
-      nodeReportStr.append("\n\tRack : ");
-      nodeReportStr.append(nodeReport.getRackName());
-      nodeReportStr.append("\n\tNode-State : ");
-      nodeReportStr.append(nodeReport.getNodeState());
-      nodeReportStr.append("\n\tNode-Http-Address : ");
-      nodeReportStr.append(nodeReport.getHttpAddress());
-      nodeReportStr.append("\n\tHealth-Status(isNodeHealthy) : ");
-      nodeReportStr.append(nodeReport.getNodeHealthStatus()
+      nodeReportStr.println("Node Report : ");
+      nodeReportStr.print("\tNode-Id : ");
+      nodeReportStr.println(nodeReport.getNodeId());
+      nodeReportStr.print("\tRack : ");
+      nodeReportStr.println(nodeReport.getRackName());
+      nodeReportStr.print("\tNode-State : ");
+      nodeReportStr.println(nodeReport.getNodeState());
+      nodeReportStr.print("\tNode-Http-Address : ");
+      nodeReportStr.println(nodeReport.getHttpAddress());
+      nodeReportStr.print("\tHealth-Status(isNodeHealthy) : ");
+      nodeReportStr.println(nodeReport.getNodeHealthStatus()
          .getIsNodeHealthy());
-      nodeReportStr.append("\n\tLast-Last-Health-Update : ");
-      nodeReportStr.append(nodeReport.getNodeHealthStatus()
-          .getLastHealthReportTime());
-      nodeReportStr.append("\n\tHealth-Report : ");
+      nodeReportStr.print("\tLast-Health-Update : ");
+      nodeReportStr.println(DateFormatUtils.format(
+          new Date(nodeReport.getNodeHealthStatus().
+              getLastHealthReportTime()),"E dd/MMM/yy hh:mm:ss:SSzz"));
+      nodeReportStr.print("\tHealth-Report : ");
       nodeReportStr
-          .append(nodeReport.getNodeHealthStatus().getHealthReport());
-      nodeReportStr.append("\n\tContainers : ");
-      nodeReportStr.append(nodeReport.getNumContainers());
-      nodeReportStr.append("\n\tMemory-Used : ");
-      nodeReportStr.append((nodeReport.getUsed() == null) ? "0M"
+          .println(nodeReport.getNodeHealthStatus().getHealthReport());
+      nodeReportStr.print("\tContainers : ");
+      nodeReportStr.println(nodeReport.getNumContainers());
+      nodeReportStr.print("\tMemory-Used : ");
+      nodeReportStr.println((nodeReport.getUsed() == null) ? "0M"
          : (nodeReport.getUsed().getMemory() + "M"));
-      nodeReportStr.append("\n\tMemory-Capacity : ");
-      nodeReportStr.append(nodeReport.getCapability().getMemory());
+      nodeReportStr.print("\tMemory-Capacity : ");
+      nodeReportStr.println(nodeReport.getCapability().getMemory());
     }
     if (nodeReport == null) {
-      nodeReportStr.append("Could not find the node report for node id : "
+      nodeReportStr.print("Could not find the node report for node id : "
          + nodeIdStr);
     }
-
-    sysout.println(nodeReportStr.toString());
+    nodeReportStr.close();
+    sysout.println(baos.toString("UTF-8"));
   }
-}
\ No newline at end of file
+}

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/cli/TestYarnCLI.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/cli/TestYarnCLI.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/cli/TestYarnCLI.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/src/test/java/org/apache/hadoop/yarn/client/cli/TestYarnCLI.java Thu Mar 7 02:57:40 2013
@@ -29,11 +29,14 @@ import static org.mockito.Mockito.when;

 import java.io.ByteArrayOutputStream;
 import java.io.PrintStream;
+import java.io.PrintWriter;
 import java.util.ArrayList;
+import java.util.Date;
 import java.util.List;

 import junit.framework.Assert;

+import org.apache.commons.lang.time.DateFormatUtils;
 import org.apache.hadoop.yarn.api.records.ApplicationId;
 import org.apache.hadoop.yarn.api.records.ApplicationReport;
 import org.apache.hadoop.yarn.api.records.FinalApplicationStatus;
@@ -79,12 +82,21 @@ public class TestYarnCLI {
     int result = cli.run(new String[] {
"-status", applicationId.toString() }); assertEquals(0, result); verify(client).getApplicationReport(applicationId); - String appReportStr = "Application Report : \n\t" - + "Application-Id : application_1234_0005\n\t" - + "Application-Name : appname\n\tUser : user\n\t" - + "Queue : queue\n\tStart-Time : 0\n\tFinish-Time : 0\n\t" - + "State : FINISHED\n\tFinal-State : SUCCEEDED\n\t" - + "Tracking-URL : N/A\n\tDiagnostics : diagnostics\n"; + ByteArrayOutputStream baos = new ByteArrayOutputStream(); + PrintWriter pw = new PrintWriter(baos); + pw.println("Application Report : "); + pw.println("\tApplication-Id : application_1234_0005"); + pw.println("\tApplication-Name : appname"); + pw.println("\tUser : user"); + pw.println("\tQueue : queue"); + pw.println("\tStart-Time : 0"); + pw.println("\tFinish-Time : 0"); + pw.println("\tState : FINISHED"); + pw.println("\tFinal-State : SUCCEEDED"); + pw.println("\tTracking-URL : N/A"); + pw.println("\tDiagnostics : diagnostics"); + pw.close(); + String appReportStr = baos.toString("UTF-8"); Assert.assertEquals(appReportStr, sysOutStream.toString()); verify(sysOut, times(1)).println(isA(String.class)); } @@ -105,16 +117,18 @@ public class TestYarnCLI { assertEquals(0, result); verify(client).getApplicationList(); - StringBuffer appsReportStrBuf = new StringBuffer(); - appsReportStrBuf.append("Total Applications:1\n"); - appsReportStrBuf - .append(" Application-Id\t Application-Name" - + "\t User\t Queue\t State\t " - + "Final-State\t Tracking-URL\n"); - appsReportStrBuf.append(" application_1234_0005\t " - + "appname\t user\t queue\t FINISHED\t " - + "SUCCEEDED\t N/A\n"); - Assert.assertEquals(appsReportStrBuf.toString(), sysOutStream.toString()); + ByteArrayOutputStream baos = new ByteArrayOutputStream(); + PrintWriter pw = new PrintWriter(baos); + pw.println("Total Applications:1"); + pw.print(" Application-Id\t Application-Name"); + pw.print("\t User\t Queue\t State\t "); + pw.println("Final-State\t Tracking-URL"); + 
pw.print(" application_1234_0005\t "); + pw.print("appname\t user\t queue\t FINISHED\t "); + pw.println("SUCCEEDED\t N/A"); + pw.close(); + String appsReportStr = baos.toString("UTF-8"); + Assert.assertEquals(appsReportStr, sysOutStream.toString()); verify(sysOut, times(1)).write(any(byte[].class), anyInt(), anyInt()); } @@ -137,18 +151,20 @@ public class TestYarnCLI { int result = cli.run(new String[] { "-list" }); assertEquals(0, result); verify(client).getNodeReports(); - StringBuffer nodesReportStr = new StringBuffer(); - nodesReportStr.append("Total Nodes:3"); - nodesReportStr - .append("\n Node-Id\tNode-State\tNode-Http-Address\t" - + "Health-Status(isNodeHealthy)\tRunning-Containers"); - nodesReportStr.append("\n host0:0\t RUNNING\t host1:8888" - + "\t false\t 0"); - nodesReportStr.append("\n host1:0\t RUNNING\t host1:8888" - + "\t false\t 0"); - nodesReportStr.append("\n host2:0\t RUNNING\t host1:8888" - + "\t false\t 0\n"); - Assert.assertEquals(nodesReportStr.toString(), sysOutStream.toString()); + ByteArrayOutputStream baos = new ByteArrayOutputStream(); + PrintWriter pw = new PrintWriter(baos); + pw.println("Total Nodes:3"); + pw.print(" Node-Id\tNode-State\tNode-Http-Address\t"); + pw.println("Health-Status(isNodeHealthy)\tRunning-Containers"); + pw.print(" host0:0\t RUNNING\t host1:8888"); + pw.println("\t false\t 0"); + pw.print(" host1:0\t RUNNING\t host1:8888"); + pw.println("\t false\t 0"); + pw.print(" host2:0\t RUNNING\t host1:8888"); + pw.println("\t false\t 0"); + pw.close(); + String nodesReportStr = baos.toString("UTF-8"); + Assert.assertEquals(nodesReportStr, sysOutStream.toString()); verify(sysOut, times(1)).write(any(byte[].class), anyInt(), anyInt()); } @@ -163,11 +179,22 @@ public class TestYarnCLI { int result = cli.run(new String[] { "-status", nodeId.toString() }); assertEquals(0, result); verify(client).getNodeReports(); - String nodeStatusStr = "Node Report : \n\tNode-Id : host0:0\n\t" - + "Rack : rack1\n\tNode-State : RUNNING\n\t" 
- + "Node-Http-Address : host1:8888\n\tHealth-Status(isNodeHealthy) " - + ": false\n\tLast-Last-Health-Update : 0\n\tHealth-Report : null" - + "\n\tContainers : 0\n\tMemory-Used : 0M\n\tMemory-Capacity : 0"; + ByteArrayOutputStream baos = new ByteArrayOutputStream(); + PrintWriter pw = new PrintWriter(baos); + pw.println("Node Report : "); + pw.println("\tNode-Id : host0:0"); + pw.println("\tRack : rack1"); + pw.println("\tNode-State : RUNNING"); + pw.println("\tNode-Http-Address : host1:8888"); + pw.println("\tHealth-Status(isNodeHealthy) : false"); + pw.println("\tLast-Health-Update : " + + DateFormatUtils.format(new Date(0), "E dd/MMM/yy hh:mm:ss:SSzz")); + pw.println("\tHealth-Report : null"); + pw.println("\tContainers : 0"); + pw.println("\tMemory-Used : 0M"); + pw.println("\tMemory-Capacity : 0"); + pw.close(); + String nodeStatusStr = baos.toString("UTF-8"); verify(sysOut, times(1)).println(isA(String.class)); verify(sysOut).println(nodeStatusStr); } @@ -225,4 +252,4 @@ public class TestYarnCLI { return cli; } -} \ No newline at end of file +} Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/logaggregation/AggregatedLogFormat.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/logaggregation/AggregatedLogFormat.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/logaggregation/AggregatedLogFormat.java (original) +++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/logaggregation/AggregatedLogFormat.java Thu Mar 7 02:57:40 2013 @@ -231,7 +231,6 @@ public class AggregatedLogFormat { out = 
         this.writer.prepareAppendValue(-1);
     out.writeInt(VERSION);
     out.close();
-    this.fsDataOStream.hflush();
   }
 
   public void writeApplicationOwner(String user) throws IOException {

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ProcfsBasedProcessTree.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ProcfsBasedProcessTree.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ProcfsBasedProcessTree.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ProcfsBasedProcessTree.java Thu Mar  7 02:57:40 2013
@@ -36,6 +36,7 @@ import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.classification.InterfaceAudience;
 import org.apache.hadoop.classification.InterfaceStability;
+import org.apache.hadoop.util.Shell;
 import org.apache.hadoop.util.Shell.ShellCommandExecutor;
 import org.apache.hadoop.util.StringUtils;
 
@@ -59,32 +60,30 @@ public class ProcfsBasedProcessTree exte
   public static final String PROCFS_STAT_FILE = "stat";
   public static final String PROCFS_CMDLINE_FILE = "cmdline";
   public static final long PAGE_SIZE;
-  static {
-    ShellCommandExecutor shellExecutor =
-        new ShellCommandExecutor(new String[]{"getconf", "PAGESIZE"});
-    long pageSize = -1;
-    try {
-      shellExecutor.execute();
-      pageSize = Long.parseLong(shellExecutor.getOutput().replace("\n", ""));
-    } catch (IOException e) {
-      LOG.error(StringUtils.stringifyException(e));
-    } finally {
-      PAGE_SIZE = pageSize;
-    }
-  }
   public static final long JIFFY_LENGTH_IN_MILLIS; // in millisecond
+  static {
-    ShellCommandExecutor shellExecutor =
-        new ShellCommandExecutor(new String[]{"getconf", "CLK_TCK"});
     long jiffiesPerSecond = -1;
+    long pageSize = -1;
     try {
-      shellExecutor.execute();
-      jiffiesPerSecond = Long.parseLong(shellExecutor.getOutput().replace("\n", ""));
+      if(Shell.LINUX) {
+        ShellCommandExecutor shellExecutorClk = new ShellCommandExecutor(
+            new String[] { "getconf", "CLK_TCK" });
+        shellExecutorClk.execute();
+        jiffiesPerSecond = Long.parseLong(shellExecutorClk.getOutput().replace("\n", ""));
+
+        ShellCommandExecutor shellExecutorPage = new ShellCommandExecutor(
+            new String[] { "getconf", "PAGESIZE" });
+        shellExecutorPage.execute();
+        pageSize = Long.parseLong(shellExecutorPage.getOutput().replace("\n", ""));
+
+      }
     } catch (IOException e) {
       LOG.error(StringUtils.stringifyException(e));
     } finally {
       JIFFY_LENGTH_IN_MILLIS = jiffiesPerSecond != -1 ?
           Math.round(1000D / jiffiesPerSecond) : -1;
+      PAGE_SIZE = pageSize;
     }
   }
 
@@ -126,8 +125,7 @@ public class ProcfsBasedProcessTree exte
    */
   public static boolean isAvailable() {
     try {
-      String osName = System.getProperty("os.name");
-      if (!osName.startsWith("Linux")) {
+      if (!Shell.LINUX) {
         LOG.info("ProcfsBasedProcessTree currently is supported only on "
             + "Linux.");
         return false;

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorPlugin.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorPlugin.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorPlugin.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorPlugin.java Thu Mar  7 02:57:40 2013
@@ -23,6 +23,7 @@ import org.apache.hadoop.classification.
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.conf.Configured;
 import org.apache.hadoop.util.ReflectionUtils;
+import org.apache.hadoop.util.Shell;
 
 /**
  * Plugin to calculate resource information on the system.
@@ -31,6 +32,18 @@ import org.apache.hadoop.util.Reflection
 @InterfaceAudience.Private
 @InterfaceStability.Unstable
 public abstract class ResourceCalculatorPlugin extends Configured {
+
+  protected String processPid = null;
+
+  /**
+   * set the pid of the process for which getProcResourceValues
+   * will be invoked
+   *
+   * @param pid
+   */
+  public void setProcessPid(String pid) {
+    processPid = pid;
+  }
 
   /**
    * Obtain the total size of the virtual memory present in the system.
@@ -109,10 +122,12 @@ public abstract class ResourceCalculator
 
     // No class given, try a os specific class
     try {
-      String osName = System.getProperty("os.name");
-      if (osName.startsWith("Linux")) {
+      if (Shell.LINUX) {
         return new LinuxResourceCalculatorPlugin();
       }
+      if (Shell.WINDOWS) {
+        return new WindowsResourceCalculatorPlugin();
+      }
     } catch (SecurityException se) {
       // Failed to get Operating System name.
       return null;

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorProcessTree.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorProcessTree.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorProcessTree.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ResourceCalculatorProcessTree.java Thu Mar  7 02:57:40 2013
@@ -145,14 +145,11 @@ public abstract class ResourceCalculator
     }
 
     // No class given, try a os specific class
-    try {
-      String osName = System.getProperty("os.name");
-      if (osName.startsWith("Linux")) {
-        return new ProcfsBasedProcessTree(pid);
-      }
-    } catch (SecurityException se) {
-      // Failed to get Operating System name.
-      return null;
+    if (ProcfsBasedProcessTree.isAvailable()) {
+      return new ProcfsBasedProcessTree(pid);
+    }
+    if (WindowsBasedProcessTree.isAvailable()) {
+      return new WindowsBasedProcessTree(pid);
     }
 
     // Not supported on this system.
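The ResourceCalculatorProcessTree hunk above replaces direct `os.name` string checks with per-implementation availability probes that the factory tries in turn. The following is a minimal standalone sketch of that dispatch pattern; the interface and class names here are stand-ins for illustration, not the real Hadoop types, and the availability flags stand in for `ProcfsBasedProcessTree.isAvailable()` / `WindowsBasedProcessTree.isAvailable()`.

```java
// Availability-based dispatch: each platform implementation reports whether it
// can run on this host, and the factory picks the first one that can.
interface ProcessTree {
    String describe();
}

class DispatchDemo {
    // Stand-ins for the real isAvailable() probes.
    static boolean procfsAvailable = false;
    static boolean windowsAvailable = false;

    static ProcessTree getProcessTree(final String pid) {
        if (procfsAvailable) {
            return new ProcessTree() {
                public String describe() { return "procfs:" + pid; }
            };
        }
        if (windowsAvailable) {
            return new ProcessTree() {
                public String describe() { return "windows:" + pid; }
            };
        }
        return null; // not supported on this system
    }

    public static void main(String[] args) {
        procfsAvailable = true;
        System.out.println(getProcessTree("100").describe());
    }
}
```

The benefit over the removed `os.name` check is that the OS test and the implementation live together, so adding a platform (as this patch does for Windows) means adding one probe rather than editing every caller.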
Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/view/InfoBlock.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/view/InfoBlock.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/view/InfoBlock.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/webapp/view/InfoBlock.java Thu Mar  7 02:57:40 2013
@@ -20,7 +20,11 @@ package org.apache.hadoop.yarn.webapp.vi
 import org.apache.hadoop.yarn.webapp.ResponseInfo;
 import org.apache.hadoop.yarn.webapp.hamlet.Hamlet;
-import org.apache.hadoop.yarn.webapp.hamlet.Hamlet.*;
+import org.apache.hadoop.yarn.webapp.hamlet.Hamlet.DIV;
+import org.apache.hadoop.yarn.webapp.hamlet.Hamlet.TABLE;
+import org.apache.hadoop.yarn.webapp.hamlet.Hamlet.TD;
+import org.apache.hadoop.yarn.webapp.hamlet.Hamlet.TR;
+
 import com.google.inject.Inject;
 
@@ -47,7 +51,19 @@ public class InfoBlock extends HtmlBlock
       String value = String.valueOf(item.value);
       if (item.url == null) {
         if (!item.isRaw) {
-          tr.td(value);
+          TD<TR<TABLE<DIV<Hamlet>>>> td = tr.td();
+          if ( value.lastIndexOf('\n') > 0) {
+            String []lines = value.split("\n");
+            DIV<TD<TR<TABLE<DIV<Hamlet>>>>> singleLineDiv;
+            for ( String line :lines) {
+              singleLineDiv = td.div();
+              singleLineDiv._r(line);
+              singleLineDiv._();
+            }
+          } else {
+            td._r(value);
+          }
+          td._();
         } else {
           tr.td()._r(value)._();
         }

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestProcfsBasedProcessTree.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestProcfsBasedProcessTree.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestProcfsBasedProcessTree.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestProcfsBasedProcessTree.java Thu Mar  7 02:57:40 2013
@@ -36,6 +36,7 @@ import org.apache.hadoop.fs.FileContext;
 import org.apache.hadoop.fs.FileUtil;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.util.StringUtils;
+import org.apache.hadoop.util.Shell;
 import org.apache.hadoop.util.Shell.ExitCodeException;
 import org.apache.hadoop.util.Shell.ShellCommandExecutor;
 import org.apache.hadoop.yarn.util.ProcfsBasedProcessTree;
@@ -104,17 +105,21 @@ public class TestProcfsBasedProcessTree
         new Path(TEST_ROOT_DIR.getAbsolutePath()), true);
   }
 
-  @Test
+  @Test (timeout = 30000)
   public void testProcessTree() throws Exception {
+    if (!Shell.LINUX) {
+      System.out
+          .println("ProcfsBasedProcessTree is not available on this system. Not testing");
+      return;
+
+    }
     try {
-      if (!ProcfsBasedProcessTree.isAvailable()) {
-        System.out
-            .println("ProcfsBasedProcessTree is not available on this system. Not testing");
-        return;
-      }
+      Assert.assertTrue(ProcfsBasedProcessTree.isAvailable());
     } catch (Exception e) {
       LOG.info(StringUtils.stringifyException(e));
+      Assert.assertTrue("ProcfsBaseProcessTree should be available on Linux",
+          false);
       return;
     }
     // create shell script
@@ -328,7 +333,7 @@ public class TestProcfsBasedProcessTree
    * @throws IOException if there was a problem setting up the
    * fake procfs directories or files.
    */
-  @Test
+  @Test (timeout = 30000)
   public void testCpuAndMemoryForProcessTree() throws IOException {
 
     // test processes
@@ -402,7 +407,7 @@ public class TestProcfsBasedProcessTree
    * @throws IOException if there was a problem setting up the
    * fake procfs directories or files.
    */
-  @Test
+  @Test (timeout = 30000)
   public void testMemForOlderProcesses() throws IOException {
     // initial list of processes
     String[] pids = { "100", "200", "300", "400" };
@@ -509,7 +514,7 @@ public class TestProcfsBasedProcessTree
    * @throws IOException if there was a problem setting up the
    * fake procfs directories or files.
    */
-  @Test
+  @Test (timeout = 30000)
   public void testDestroyProcessTree() throws IOException {
     // test process
     String pid = "100";
@@ -535,7 +540,7 @@ public class TestProcfsBasedProcessTree
    *
    * @throws IOException
    */
-  @Test
+  @Test (timeout = 30000)
   public void testProcessTreeDump() throws IOException {

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestRackResolver.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestRackResolver.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestRackResolver.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/test/java/org/apache/hadoop/yarn/util/TestRackResolver.java Thu Mar  7 02:57:40 2013
@@ -18,9 +18,13 @@
 
 package org.apache.hadoop.yarn.util;
 
+import java.net.InetAddress;
+import java.net.UnknownHostException;
 import java.util.ArrayList;
 import java.util.List;
 
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.CommonConfigurationKeysPublic;
 import org.apache.hadoop.net.DNSToSwitchMapping;
@@ -30,9 +34,12 @@ import org.junit.Test;
 
 public class TestRackResolver {
 
+  private static Log LOG = LogFactory.getLog(TestRackResolver.class);
+
   public static final class MyResolver implements DNSToSwitchMapping {
 
     int numHost1 = 0;
+    public static String resolvedHost1 = "host1";
 
     @Override
     public List<String> resolve(List<String> hostList) {
@@ -43,7 +50,10 @@ public class TestRackResolver {
       if (hostList.isEmpty()) {
         return returnList;
       }
-      if (hostList.get(0).equals("host1")) {
+      LOG.info("Received resolve request for "
+          + hostList.get(0));
+      if (hostList.get(0).equals("host1")
+          || hostList.get(0).equals(resolvedHost1)) {
         numHost1++;
         returnList.add("/rack1");
       }
@@ -62,6 +72,12 @@ public class TestRackResolver {
       CommonConfigurationKeysPublic.NET_TOPOLOGY_NODE_SWITCH_MAPPING_IMPL_KEY,
       MyResolver.class, DNSToSwitchMapping.class);
     RackResolver.init(conf);
+    try {
+      InetAddress iaddr = InetAddress.getByName("host1");
+      MyResolver.resolvedHost1 = iaddr.getHostAddress();
+    } catch (UnknownHostException e) {
+      // Ignore if not found
+    }
     Node node = RackResolver.resolve("host1");
     Assert.assertEquals("/rack1", node.getNetworkLocation());
     node = RackResolver.resolve("host1");

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/ContainerExecutor.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/ContainerExecutor.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
---
hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/ContainerExecutor.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/ContainerExecutor.java Thu Mar  7 02:57:40 2013
@@ -37,6 +37,7 @@ import org.apache.hadoop.util.Shell.Shel
 import org.apache.hadoop.yarn.api.records.ContainerId;
 import org.apache.hadoop.yarn.server.nodemanager.containermanager.container.Container;
 import org.apache.hadoop.yarn.server.nodemanager.util.ProcessIdFileReader;
+import org.apache.hadoop.util.Shell;
 
 public abstract class ContainerExecutor implements Configurable {
 
@@ -182,6 +183,33 @@ public abstract class ContainerExecutor
       readLock.unlock();
     }
   }
+
+  /** Return a command to execute the given command in OS shell.
+   * On Windows, the passed in groupId can be used to launch
+   * and associate the given groupId in a process group. On
+   * non-Windows, groupId is ignored. */
+  protected static String[] getRunCommand(String command,
+      String groupId) {
+    if (Shell.WINDOWS) {
+      return new String[] { Shell.WINUTILS, "task", "create", groupId,
+          "cmd /c " + command };
+    } else {
+      return new String[] { "bash", "-c", command };
+    }
+  }
+
+  /** Return a command for determining if process with specified pid is alive. */
+  protected static String[] getCheckProcessIsAliveCommand(String pid) {
+    return Shell.WINDOWS ?
+        new String[] { Shell.WINUTILS, "task", "isAlive", pid } :
+        new String[] { "kill", "-0", pid };
+  }
+
+  /** Return a command to send a signal to a given pid */
+  protected static String[] getSignalKillCommand(int code, String pid) {
+    return Shell.WINDOWS ? new String[] { Shell.WINUTILS, "task", "kill", pid } :
+        new String[] { "kill", "-" + code, pid };
+  }
 
  /**
   * Is the container still active?
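The three helpers added above centralize the platform split: Linux containers run via `bash -c`, are probed with `kill -0` (signal 0 sends nothing, only reports whether the pid exists), and are signalled with `kill -<code>`, while Windows delegates all three to `winutils task`. A minimal standalone sketch of the same branching follows; `WINDOWS` and `WINUTILS` are plain stand-in fields for Hadoop's `Shell.WINDOWS` / `Shell.WINUTILS`, so the logic can be exercised without Hadoop on the classpath.

```java
import java.util.Arrays;

// Stand-alone sketch of the platform-switched command builders.
class RunCommandDemo {
    static boolean WINDOWS = false;          // stand-in for Shell.WINDOWS
    static String WINUTILS = "winutils.exe"; // stand-in for Shell.WINUTILS

    static String[] getRunCommand(String command, String groupId) {
        if (WINDOWS) {
            // groupId names the process group winutils creates for the task
            return new String[] { WINUTILS, "task", "create", groupId,
                "cmd /c " + command };
        }
        return new String[] { "bash", "-c", command };
    }

    static String[] getCheckProcessIsAliveCommand(String pid) {
        // kill -0 performs permission/existence checks without delivering a signal
        return WINDOWS ? new String[] { WINUTILS, "task", "isAlive", pid }
                       : new String[] { "kill", "-0", pid };
    }

    static String[] getSignalKillCommand(int code, String pid) {
        return WINDOWS ? new String[] { WINUTILS, "task", "kill", pid }
                       : new String[] { "kill", "-" + code, pid };
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(getRunCommand("echo hi", "g1")));
        System.out.println(Arrays.toString(getCheckProcessIsAliveCommand("42")));
    }
}
```

Note that the Windows kill path ignores the signal number: `winutils task kill` terminates the whole job object, which is why the patch can treat the container ID as the "pid" on that platform.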
@@ -253,6 +281,9 @@ public abstract class ContainerExecutor
   public static final boolean isSetsidAvailable = isSetsidSupported();
   private static boolean isSetsidSupported() {
+    if (Shell.WINDOWS) {
+      return true;
+    }
     ShellCommandExecutor shexec = null;
     boolean setsidSupported = true;
     try {

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java Thu Mar  7 02:57:40 2013
@@ -37,6 +37,8 @@ import org.apache.hadoop.fs.FileContext;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.fs.UnsupportedFileSystemException;
 import org.apache.hadoop.fs.permission.FsPermission;
+import org.apache.hadoop.io.IOUtils;
+import org.apache.hadoop.util.Shell;
 import org.apache.hadoop.util.Shell.ExitCodeException;
 import org.apache.hadoop.util.Shell.ShellCommandExecutor;
 import org.apache.hadoop.yarn.api.records.ContainerId;
@@ -53,10 +55,9 @@ public class DefaultContainerExecutor ex
   private static final Log LOG = LogFactory
       .getLog(DefaultContainerExecutor.class);
 
-  private final FileContext lfs;
+  private static final int WIN_MAX_PATH = 260;
 
-  private static final String WRAPPER_LAUNCH_SCRIPT =
-      "default_container_executor.sh";
+  private final FileContext lfs;
 
   public DefaultContainerExecutor() {
     try {
@@ -145,15 +146,24 @@ public class DefaultContainerExecutor ex
     lfs.util().copy(nmPrivateTokensPath, tokenDst);
 
     // Create new local launch wrapper script
-    Path wrapperScriptDst = new Path(containerWorkDir, WRAPPER_LAUNCH_SCRIPT);
-    DataOutputStream wrapperScriptOutStream =
-        lfs.create(wrapperScriptDst,
-            EnumSet.of(CREATE, OVERWRITE));
+    LocalWrapperScriptBuilder sb = Shell.WINDOWS ?
+        new WindowsLocalWrapperScriptBuilder(containerIdStr, containerWorkDir) :
+        new UnixLocalWrapperScriptBuilder(containerWorkDir);
+
+    // Fail fast if attempting to launch the wrapper script would fail due to
+    // Windows path length limitation.
+    if (Shell.WINDOWS &&
+        sb.getWrapperScriptPath().toString().length() > WIN_MAX_PATH) {
+      throw new IOException(String.format(
+          "Cannot launch container using script at path %s, because it exceeds " +
+          "the maximum supported path length of %d characters. Consider " +
+          "configuring shorter directories in %s.", sb.getWrapperScriptPath(),
+          WIN_MAX_PATH, YarnConfiguration.NM_LOCAL_DIRS));
+    }
 
     Path pidFile = getPidFilePath(containerId);
     if (pidFile != null) {
-      writeLocalWrapperScript(wrapperScriptOutStream, launchDst.toUri()
-          .getPath().toString(), pidFile.toString());
+      sb.writeLocalWrapperScript(launchDst, pidFile);
     } else {
       LOG.info("Container " + containerIdStr
           + " was marked as inactive. Returning terminated error");
@@ -166,12 +176,13 @@ public class DefaultContainerExecutor ex
     try {
       lfs.setPermission(launchDst,
           ContainerExecutor.TASK_LAUNCH_SCRIPT_PERMISSION);
-      lfs.setPermission(wrapperScriptDst,
+      lfs.setPermission(sb.getWrapperScriptPath(),
           ContainerExecutor.TASK_LAUNCH_SCRIPT_PERMISSION);
 
       // Setup command to run
-      String[] command = {"bash",
-          wrapperScriptDst.toUri().getPath().toString()};
+      String[] command = getRunCommand(sb.getWrapperScriptPath().toString(),
+          containerIdStr);
+
+      LOG.info("launchContainer: " + Arrays.toString(command));
       shExec = new ShellCommandExecutor(
           command,
@@ -202,28 +213,85 @@ public class DefaultContainerExecutor ex
     return 0;
   }
 
-  private void writeLocalWrapperScript(DataOutputStream out,
-      String launchScriptDst, String pidFilePath) throws IOException {
-    // We need to do a move as writing to a file is not atomic
-    // Process reading a file being written to may get garbled data
-    // hence write pid to tmp file first followed by a mv
-    StringBuilder sb = new StringBuilder("#!/bin/bash\n\n");
-    sb.append("echo $$ > " + pidFilePath + ".tmp\n");
-    sb.append("/bin/mv -f " + pidFilePath + ".tmp " + pidFilePath + "\n");
-    sb.append(ContainerExecutor.isSetsidAvailable? "exec setsid" : "exec");
-    sb.append(" /bin/bash ");
-    sb.append("\"");
-    sb.append(launchScriptDst);
-    sb.append("\"\n");
-    PrintStream pout = null;
-    try {
-      pout = new PrintStream(out);
-      pout.append(sb);
-    } finally {
-      if (out != null) {
-        out.close();
+  private abstract class LocalWrapperScriptBuilder {
+
+    private final Path wrapperScriptPath;
+
+    public Path getWrapperScriptPath() {
+      return wrapperScriptPath;
+    }
+
+    public void writeLocalWrapperScript(Path launchDst, Path pidFile) throws IOException {
+      DataOutputStream out = null;
+      PrintStream pout = null;
+
+      try {
+        out = lfs.create(wrapperScriptPath, EnumSet.of(CREATE, OVERWRITE));
+        pout = new PrintStream(out);
+        writeLocalWrapperScript(launchDst, pidFile, pout);
+      } finally {
+        IOUtils.cleanup(LOG, pout, out);
       }
     }
+
+    protected abstract void writeLocalWrapperScript(Path launchDst, Path pidFile,
+        PrintStream pout);
+
+    protected LocalWrapperScriptBuilder(Path wrapperScriptPath) {
+      this.wrapperScriptPath = wrapperScriptPath;
+    }
+  }
+
+  private final class UnixLocalWrapperScriptBuilder
+      extends LocalWrapperScriptBuilder {
+
+    public UnixLocalWrapperScriptBuilder(Path containerWorkDir) {
+      super(new Path(containerWorkDir, "default_container_executor.sh"));
+    }
+
+    @Override
+    public void writeLocalWrapperScript(Path launchDst, Path pidFile,
+        PrintStream pout) {
+
+      // We need to do a move as writing to a file is not atomic
+      // Process reading a file being written to may get garbled data
+      // hence write pid to tmp file first followed by a mv
+      pout.println("#!/bin/bash");
+      pout.println();
+      pout.println("echo $$ > " + pidFile.toString() + ".tmp");
+      pout.println("/bin/mv -f " + pidFile.toString() + ".tmp " + pidFile);
+      String exec = ContainerExecutor.isSetsidAvailable? "exec setsid" : "exec";
+      pout.println(exec + " /bin/bash -c \"" +
+          launchDst.toUri().getPath().toString() + "\"");
+    }
+  }
+
+  private final class WindowsLocalWrapperScriptBuilder
+      extends LocalWrapperScriptBuilder {
+
+    private final String containerIdStr;
+
+    public WindowsLocalWrapperScriptBuilder(String containerIdStr,
+        Path containerWorkDir) {
+
+      super(new Path(containerWorkDir, "default_container_executor.cmd"));
+      this.containerIdStr = containerIdStr;
+    }
+
+    @Override
+    public void writeLocalWrapperScript(Path launchDst, Path pidFile,
+        PrintStream pout) {
+
+      // On Windows, the pid is the container ID, so that it can also serve as
+      // the name of the job object created by winutils for task management.
+      // Write to temp file followed by atomic move.
+      String normalizedPidFile = new File(pidFile.toString()).getPath();
+      pout.println("@echo " + containerIdStr + " > " + normalizedPidFile +
+          ".tmp");
+      pout.println("@move /Y " + normalizedPidFile + ".tmp " +
+          normalizedPidFile);
+      pout.println("@call " + launchDst.toString());
+    }
   }
 
   @Override
@@ -234,17 +302,13 @@ public class DefaultContainerExecutor ex
         : pid;
     LOG.debug("Sending signal " + signal.getValue() + " to pid " + sigpid
         + " as user " + user);
-    try {
-      sendSignal(sigpid, Signal.NULL);
-    } catch (ExitCodeException e) {
+    if (!containerIsAlive(sigpid)) {
       return false;
     }
     try {
-      sendSignal(sigpid, signal);
+      killContainer(sigpid, signal);
     } catch (IOException e) {
-      try {
-        sendSignal(sigpid, Signal.NULL);
-      } catch (IOException ignore) {
+      if (!containerIsAlive(sigpid)) {
        return false;
      }
      throw e;
@@ -253,17 +317,33 @@ public class DefaultContainerExecutor ex
   }
 
   /**
+   * Returns true if the process with the specified pid is alive.
+ * + * @param pid String pid + * @return boolean true if the process is alive + */ + private boolean containerIsAlive(String pid) throws IOException { + try { + new ShellCommandExecutor(getCheckProcessIsAliveCommand(pid)).execute(); + // successful execution means process is alive + return true; + } + catch (ExitCodeException e) { + // failure (non-zero exit code) means process is not alive + return false; + } + } + + /** * Send a specified signal to the specified pid * * @param pid the pid of the process [group] to signal. * @param signal signal to send * (for logging). */ - protected void sendSignal(String pid, Signal signal) throws IOException { - ShellCommandExecutor shexec = null; - String[] arg = { "kill", "-" + signal.getValue(), pid }; - shexec = new ShellCommandExecutor(arg); - shexec.execute(); + private void killContainer(String pid, Signal signal) throws IOException { + new ShellCommandExecutor(getSignalKillCommand(signal.getValue(), pid)) + .execute(); } @Override Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/LocalDirsHandlerService.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/LocalDirsHandlerService.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/LocalDirsHandlerService.java (original) +++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/LocalDirsHandlerService.java Thu Mar 7 02:57:40 
2013 @@ -20,7 +20,6 @@ package org.apache.hadoop.yarn.server.no import java.io.IOException; import java.net.URI; -import java.net.URISyntaxException; import java.util.ArrayList; import java.util.List; import java.util.Timer; @@ -305,7 +304,7 @@ public class LocalDirsHandlerService ext ArrayList validPaths = new ArrayList(); for (int i = 0; i < paths.length; ++i) { try { - URI uriPath = new URI(paths[i]); + URI uriPath = (new Path(paths[i])).toUri(); if (uriPath.getScheme() == null || uriPath.getScheme().equals(FILE_SCHEME)) { validPaths.add(uriPath.getPath()); @@ -316,7 +315,7 @@ public class LocalDirsHandlerService ext + " is not a valid path. Path should be with " + FILE_SCHEME + " scheme or without scheme"); } - } catch (URISyntaxException e) { + } catch (IllegalArgumentException e) { LOG.warn(e.getMessage()); throw new YarnException(paths[i] + " is not a valid path. Path should be with " + FILE_SCHEME Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationEventType.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationEventType.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationEventType.java (original) +++ 
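The LocalDirsHandlerService change above swaps `new URI(paths[i])` for `(new Path(paths[i])).toUri()` (with `IllegalArgumentException` replacing `URISyntaxException`) because a raw local-dir string, especially a Windows one, need not be valid URI syntax, whereas Hadoop's `Path` normalizes it first. A small JDK-only illustration of why the direct `URI` constructor fails:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class RawPathVsUri {
    public static void main(String[] args) {
        // A raw Windows-style local path is not syntactically a URI: the
        // backslashes (and the space here) are illegal URI characters, so
        // new URI(...) throws rather than parsing a path component.
        try {
            new URI("C:\\hadoop\\local dirs");
            System.out.println("parsed");
        } catch (URISyntaxException e) {
            System.out.println("URISyntaxException");
        }
    }
}
```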
hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationEventType.java Thu Mar 7 02:57:40 2013
@@ -34,5 +34,6 @@ public enum ApplicationEventType {
 
   // Source: Log Handler
   APPLICATION_LOG_HANDLING_INITED,
-  APPLICATION_LOG_HANDLING_FINISHED
+  APPLICATION_LOG_HANDLING_FINISHED,
+  APPLICATION_LOG_HANDLING_FAILED
 }

Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationImpl.java
URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationImpl.java?rev=1453669&r1=1453668&r2=1453669&view=diff
==============================================================================
--- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationImpl.java (original)
+++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/application/ApplicationImpl.java Thu Mar 7 02:57:40 2013
@@ -149,6 +149,9 @@ public class ApplicationImpl implements
            .addTransition(ApplicationState.INITING, ApplicationState.INITING,
                ApplicationEventType.APPLICATION_LOG_HANDLING_INITED,
                new AppLogInitDoneTransition())
+           .addTransition(ApplicationState.INITING, ApplicationState.INITING,
+               ApplicationEventType.APPLICATION_LOG_HANDLING_FAILED,
+               new AppLogInitFailTransition())
            .addTransition(ApplicationState.INITING,
ApplicationState.RUNNING, ApplicationEventType.APPLICATION_INITED, new AppInitDoneTransition()) @@ -238,6 +241,26 @@ public class ApplicationImpl implements } /** + * Handles the APPLICATION_LOG_HANDLING_FAILED event that occurs after + * {@link LogAggregationService} has failed to initialize the log + * aggregation service + * + * In particular, this requests that the {@link ResourceLocalizationService} + * localize the application-scoped resources. + */ + @SuppressWarnings("unchecked") + static class AppLogInitFailTransition implements + SingleArcTransition { + @Override + public void transition(ApplicationImpl app, ApplicationEvent event) { + LOG.warn("Log Aggregation service failed to initialize, there will " + + "be no logs for this application"); + app.dispatcher.getEventHandler().handle( + new ApplicationLocalizationEvent( + LocalizationEventType.INIT_APPLICATION_RESOURCES, app)); + } + } + /** * Handles INIT_CONTAINER events which request that we launch a new * container. When we're still in the INITTING state, we simply * queue these up. 
When we're in the RUNNING state, we pass along Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java (original) +++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java Thu Mar 7 02:57:40 2013 @@ -23,6 +23,7 @@ import static org.apache.hadoop.fs.Creat import java.io.DataOutputStream; import java.io.IOException; +import java.io.File; import java.io.OutputStream; import java.io.PrintStream; import java.util.ArrayList; @@ -37,6 +38,7 @@ import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.FileContext; +import org.apache.hadoop.fs.FileUtil; import org.apache.hadoop.fs.LocalDirAllocator; import org.apache.hadoop.fs.Path; import org.apache.hadoop.io.IOUtils; @@ -69,7 +71,8 @@ public class ContainerLaunch implements private static final Log LOG = LogFactory.getLog(ContainerLaunch.class); - public static final String CONTAINER_SCRIPT = "launch_container.sh"; + public static final String CONTAINER_SCRIPT = Shell.WINDOWS ? 
+ "launch_container.cmd" : "launch_container.sh"; public static final String FINAL_CONTAINER_TOKENS_FILE = "container_tokens"; private static final String PID_FILE_NAME_FMT = "%s.pid"; @@ -130,7 +133,7 @@ public class ContainerLaunch implements for (String str : command) { // TODO: Should we instead work via symlinks without this grammar? newCmds.add(str.replace(ApplicationConstants.LOG_DIR_EXPANSION_VAR, - containerLogDir.toUri().getPath())); + containerLogDir.toString())); } launchContext.setCommands(newCmds); @@ -141,7 +144,7 @@ public class ContainerLaunch implements entry.setValue( value.replace( ApplicationConstants.LOG_DIR_EXPANSION_VAR, - containerLogDir.toUri().getPath()) + containerLogDir.toString()) ); } // /////////////////////////// End of variable expansion @@ -411,28 +414,17 @@ public class ContainerLaunch implements + appIdStr; } - private static class ShellScriptBuilder { - - private final StringBuilder sb; - - public ShellScriptBuilder() { - this(new StringBuilder("#!/bin/bash\n\n")); - } - - protected ShellScriptBuilder(StringBuilder sb) { - this.sb = sb; - } - - public ShellScriptBuilder env(String key, String value) { - line("export ", key, "=\"", value, "\""); - return this; - } - - public ShellScriptBuilder symlink(Path src, String dst) throws IOException { - return symlink(src, new Path(dst)); - } - - public ShellScriptBuilder symlink(Path src, Path dst) throws IOException { + private static abstract class ShellScriptBuilder { + + private static final String LINE_SEPARATOR = + System.getProperty("line.separator"); + private final StringBuilder sb = new StringBuilder(); + + public abstract void command(List command); + + public abstract void env(String key, String value); + + public final void symlink(Path src, Path dst) throws IOException { if (!src.isAbsolute()) { throw new IOException("Source must be absolute"); } @@ -440,28 +432,89 @@ public class ContainerLaunch implements throw new IOException("Destination must be relative"); } if 
(dst.toUri().getPath().indexOf('/') != -1) { - line("mkdir -p ", dst.getParent().toString()); + mkdir(dst.getParent()); } - line("ln -sf \"", src.toUri().getPath(), "\" \"", dst.toString(), "\""); - return this; + link(src, dst); } - - public void write(PrintStream out) throws IOException { + + @Override + public String toString() { + return sb.toString(); + } + + public final void write(PrintStream out) throws IOException { out.append(sb); } - - public void line(String... command) { + + protected final void line(String... command) { for (String s : command) { sb.append(s); } - sb.append("\n"); + sb.append(LINE_SEPARATOR); } - + + protected abstract void link(Path src, Path dst) throws IOException; + + protected abstract void mkdir(Path path); + } + + private static final class UnixShellScriptBuilder extends ShellScriptBuilder { + + public UnixShellScriptBuilder(){ + line("#!/bin/bash"); + line(); + } + @Override - public String toString() { - return sb.toString(); + public void command(List command) { + line("exec /bin/bash -c \"", StringUtils.join(" ", command), "\""); + } + + @Override + public void env(String key, String value) { + line("export ", key, "=\"", value, "\""); + } + + @Override + protected void link(Path src, Path dst) throws IOException { + line("ln -sf \"", src.toUri().getPath(), "\" \"", dst.toString(), "\""); + } + + @Override + protected void mkdir(Path path) { + line("mkdir -p ", path.toString()); + } + } + + private static final class WindowsShellScriptBuilder + extends ShellScriptBuilder { + + public WindowsShellScriptBuilder() { + line("@setlocal"); + line(); + } + + @Override + public void command(List command) { + line("@call ", StringUtils.join(" ", command)); + } + + @Override + public void env(String key, String value) { + line("@set ", key, "=", value); + } + + @Override + protected void link(Path src, Path dst) throws IOException { + line(String.format("@%s symlink \"%s\" \"%s\"", Shell.WINUTILS, + new 
File(dst.toString()).getPath(), + new File(src.toUri().getPath()).getPath())); } + @Override + protected void mkdir(Path path) { + line("@if not exist ", path.toString(), " mkdir ", path.toString()); + } } private static void putEnvIfNotNull( @@ -479,7 +532,7 @@ public class ContainerLaunch implements } public void sanitizeEnv(Map environment, - Path pwd, List appDirs) { + Path pwd, List appDirs) throws IOException { /** * Non-modifiable environment variables */ @@ -513,6 +566,14 @@ public class ContainerLaunch implements environment.put("JVM_PID", "$$"); } + // TODO: Remove Windows check and use this approach on all platforms after + // additional testing. See YARN-358. + if (Shell.WINDOWS) { + String inputClassPath = environment.get(Environment.CLASSPATH.name()); + environment.put(Environment.CLASSPATH.name(), + FileUtil.createJarWithClassPath(inputClassPath, pwd)); + } + /** * Modifiable environment variables */ @@ -537,7 +598,8 @@ public class ContainerLaunch implements Map environment, Map> resources, List command) throws IOException { - ShellScriptBuilder sb = new ShellScriptBuilder(); + ShellScriptBuilder sb = Shell.WINDOWS ? 
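The Windows-only block in `sanitizeEnv` (see the YARN-358 TODO above) works around Windows' command-line and environment length limits by folding a long `CLASSPATH` into an intermediate jar via `FileUtil.createJarWithClassPath`. The underlying trick, an otherwise-empty jar whose manifest `Class-Path` attribute carries the entries, can be sketched with the JDK alone (class and entry names hypothetical):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class ClassPathJar {
    // Build a jar containing only a manifest whose Class-Path attribute
    // lists the real classpath entries; the JVM expands it at load time,
    // keeping the -classpath argument itself tiny.
    static Path createClassPathJar(Path dir, String... entries) throws IOException {
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        mf.getMainAttributes().put(Attributes.Name.CLASS_PATH, String.join(" ", entries));
        Path jar = dir.resolve("classpath.jar");
        try (OutputStream out = Files.newOutputStream(jar);
             JarOutputStream jos = new JarOutputStream(out, mf)) {
            // intentionally empty: only the manifest matters
        }
        return jar;
    }

    public static void main(String[] args) throws IOException {
        Path jar = createClassPathJar(Files.createTempDirectory("cp"),
                "lib/a.jar", "lib/b.jar");
        try (JarFile jf = new JarFile(jar.toFile())) {
            System.out.println(jf.getManifest().getMainAttributes()
                    .getValue(Attributes.Name.CLASS_PATH));
        }
    }
}
```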
new WindowsShellScriptBuilder() : + new UnixShellScriptBuilder(); if (environment != null) { for (Map.Entry env : environment.entrySet()) { sb.env(env.getKey().toString(), env.getValue().toString()); @@ -546,21 +608,13 @@ public class ContainerLaunch implements if (resources != null) { for (Map.Entry> entry : resources.entrySet()) { for (String linkName : entry.getValue()) { - sb.symlink(entry.getKey(), linkName); + sb.symlink(entry.getKey(), new Path(linkName)); } } } - ArrayList cmd = new ArrayList(2 * command.size() + 5); - cmd.add("exec /bin/bash "); - cmd.add("-c "); - cmd.add("\""); - for (String cs : command) { - cmd.add(cs.toString()); - cmd.add(" "); - } - cmd.add("\""); - sb.line(cmd.toArray(new String[cmd.size()])); + sb.command(command); + PrintStream pout = null; try { pout = new PrintStream(out); Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService.java (original) +++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService.java Thu Mar 7 02:57:40 2013 @@ -659,25 +659,23 @@ public class 
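The refactoring in ContainerLaunch replaces the single bash-only `ShellScriptBuilder` with an abstract builder and two platform dialects selected via `Shell.WINDOWS`. A stripped-down sketch of that shape (illustrative names, not the Hadoop classes themselves):

```java
import java.util.List;

public class MiniScriptBuilder {
    // One abstract emitter, two platform dialects: each dialect owns its
    // header, env-export syntax, and command-invocation syntax.
    abstract static class Builder {
        final StringBuilder sb = new StringBuilder();
        void line(String s) { sb.append(s).append(System.lineSeparator()); }
        abstract void env(String key, String value);
        abstract void command(List<String> cmd);
        @Override public String toString() { return sb.toString(); }
    }

    static final class Unix extends Builder {
        Unix() { line("#!/bin/bash"); }
        @Override void env(String k, String v) { line("export " + k + "=\"" + v + "\""); }
        @Override void command(List<String> cmd) {
            line("exec /bin/bash -c \"" + String.join(" ", cmd) + "\"");
        }
    }

    static final class Windows extends Builder {
        Windows() { line("@setlocal"); }
        @Override void env(String k, String v) { line("@set " + k + "=" + v); }
        @Override void command(List<String> cmd) {
            line("@call " + String.join(" ", cmd));
        }
    }

    public static void main(String[] args) {
        Builder b = new Unix(); // the patch picks the dialect via Shell.WINDOWS
        b.env("JAVA_HOME", "/usr/lib/jvm/default");
        b.command(List.of("echo", "hello"));
        System.out.print(b);
    }
}
```

Keeping `symlink`/`mkdir`/`command` as abstract operations means callers like `writeLaunchEnv` stay platform-agnostic and only the dialect subclasses know any shell syntax.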
ResourceLocalizationService new ContainerResourceFailedEvent( assoc.getContext().getContainerId(), assoc.getResource().getRequest(), e.getCause())); + List reqs; synchronized (attempts) { LocalResourceRequest req = assoc.getResource().getRequest(); - List reqs = attempts.get(req); + reqs = attempts.get(req); if (null == reqs) { LOG.error("Missing pending list for " + req); return; } - if (reqs.isEmpty()) { - attempts.remove(req); - } - /* - * Do not retry for now. Once failed is failed! - * LocalizerResourceRequestEvent request = reqs.remove(0); - - pending.put(queue.submit(new FSDownload( - lfs, null, conf, publicDirs, - request.getResource().getRequest(), new Random())), - request); - */ } + attempts.remove(req); + } + // let the other containers know about the localization failure + for (LocalizerResourceRequestEvent reqEvent : reqs) { + dispatcher.getEventHandler().handle( + new ContainerResourceFailedEvent( + reqEvent.getContext().getContainerId(), + reqEvent.getResource().getRequest(), e.getCause())); + } } catch (CancellationException e) { // ignore; shutting down } Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/LogAggregationService.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/LogAggregationService.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/LogAggregationService.java (original) +++ 
hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/logaggregation/LogAggregationService.java Thu Mar 7 02:57:40 2013 @@ -300,8 +300,9 @@ public class LogAggregationService exten eventResponse = new ApplicationEvent(appId, ApplicationEventType.APPLICATION_LOG_HANDLING_INITED); } catch (YarnException e) { - eventResponse = new ApplicationFinishEvent(appId, - "Application failed to init aggregation: " + e.getMessage()); + LOG.warn("Application failed to init aggregation: " + e.getMessage()); + eventResponse = new ApplicationEvent(appId, + ApplicationEventType.APPLICATION_LOG_HANDLING_FAILED); } this.dispatcher.getEventHandler().handle(eventResponse); } Modified: hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/util/ProcessIdFileReader.java URL: http://svn.apache.org/viewvc/hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/util/ProcessIdFileReader.java?rev=1453669&r1=1453668&r2=1453669&view=diff ============================================================================== --- hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/util/ProcessIdFileReader.java (original) +++ hadoop/common/branches/HDFS-2802/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/util/ProcessIdFileReader.java Thu Mar 7 02:57:40 2013 @@ -25,6 +25,8 @@ import java.io.IOException; import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; import org.apache.hadoop.fs.Path; +import 
org.apache.hadoop.util.Shell;
+import org.apache.hadoop.yarn.util.ConverterUtils;
 
 /**
  * Helper functionality to read the pid from a file.
@@ -62,14 +64,28 @@ public class ProcessIdFileReader {
         }
         String temp = line.trim();
         if (!temp.isEmpty()) {
-          try {
-            Long pid = Long.valueOf(temp);
-            if (pid > 0) {
+          if (Shell.WINDOWS) {
+            // On Windows, pid is expected to be a container ID, so find first
+            // line that parses successfully as a container ID.
+            try {
+              ConverterUtils.toContainerId(temp);
               processId = temp;
               break;
+            } catch (Exception e) {
+              // do nothing
+            }
+          }
+          else {
+            // Otherwise, find first line containing a numeric pid.
+            try {
+              Long pid = Long.valueOf(temp);
+              if (pid > 0) {
+                processId = temp;
+                break;
+              }
+            } catch (Exception e) {
+              // do nothing
             }
-          } catch (Exception e) {
-            // do nothing
           }
         }
       }
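On the Unix side, the ProcessIdFileReader loop reduces to "first non-empty line that parses as a positive number". A self-contained sketch of that scan (class name hypothetical; the Hadoop reader additionally takes the Windows container-ID branch shown in the diff):

```java
import java.util.List;
import java.util.Optional;

public class PidLineScan {
    // Mirrors the Unix branch: return the first non-empty line that parses
    // as a positive number; anything else is silently skipped, as the
    // reader's empty catch blocks do.
    static Optional<String> firstPid(List<String> lines) {
        for (String line : lines) {
            String t = line.trim();
            if (t.isEmpty()) {
                continue;
            }
            try {
                if (Long.parseLong(t) > 0) {
                    return Optional.of(t);
                }
            } catch (NumberFormatException e) {
                // not a numeric line; keep scanning
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(firstPid(List.of("", "not-a-pid", "-5", "12345")).orElse("none"));
        System.out.println(firstPid(List.of("abc")).orElse("none"));
    }
}
```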