Return-Path:
Delivered-To: apmail-hadoop-common-dev-archive@www.apache.org
Received: (qmail 24297 invoked from network); 24 Nov 2010 05:21:30 -0000
Received: from unknown (HELO mail.apache.org) (140.211.11.3) by 140.211.11.9 with SMTP; 24 Nov 2010 05:21:30 -0000
Received: (qmail 66223 invoked by uid 500); 24 Nov 2010 05:22:01 -0000
Delivered-To: apmail-hadoop-common-dev-archive@hadoop.apache.org
Received: (qmail 65934 invoked by uid 500); 24 Nov 2010 05:21:59 -0000
Mailing-List: contact common-dev-help@hadoop.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: common-dev@hadoop.apache.org
Delivered-To: mailing list common-dev@hadoop.apache.org
Received: (qmail 65926 invoked by uid 99); 24 Nov 2010 05:21:58 -0000
Received: from athena.apache.org (HELO athena.apache.org) (140.211.11.136) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 24 Nov 2010 05:21:58 +0000
X-ASF-Spam-Status: No, hits=-2000.0 required=10.0 tests=ALL_TRUSTED
X-Spam-Check-By: apache.org
Received: from [140.211.11.8] (HELO aegis.apache.org) (140.211.11.8) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 24 Nov 2010 05:21:56 +0000
Received: from aegis (localhost [127.0.0.1]) by aegis.apache.org (Postfix) with ESMTP id CEEB6C0161 for ; Wed, 24 Nov 2010 05:21:35 +0000 (UTC)
Date: Wed, 24 Nov 2010 05:21:35 +0000 (UTC)
From: Apache Hudson Server
To: common-dev@hadoop.apache.org
Message-ID: <120866886.9081290576095794.JavaMail.hudson@aegis>
Subject: Build failed in Hudson: Hadoop-Common-trunk-Commit #444
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit

See Changes:

[dhruba] HADOOP-7001. Configuration changes can occur via the Reconfigurable interface. (Patrick Kline via dhruba)

------------------------------------------
[...truncated 1246 lines...]
AU        bin/rcc
AU        bin/hadoop
AU        bin/start-all.sh
A         README.txt
A         build.xml
U         .
At revision 1038485
A         commitBuild.sh
A         hudsonEnv.sh
AU        hudsonBuildHadoopNightly.sh
AU        hudsonBuildHadoopPatch.sh
AU        hudsonBuildHadoopRelease.sh
AU        processHadoopPatchEmailRemote.sh
AU        hudsonPatchQueueAdmin.sh
AU        processHadoopPatchEmail.sh
A         README.txt
A         test-patch
A         test-patch/test-patch.sh
At revision 1038485
no change for http://svn.apache.org/repos/asf/hadoop/nightly since the previous build
[Hadoop-Common-trunk-Commit] $ /bin/bash /tmp/hudson1399538409600007277.sh
======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================
Buildfile: build.xml

clean-contrib:

clean:

clean:
     [echo] contrib: failmon

clean:
     [echo] contrib: hod

clean-sign:

clean-fi:

clean:

BUILD SUCCESSFUL
Total time: 0 seconds
======================================================================
======================================================================
BUILD: ant mvn-deploy tar findbugs -Dversion=${VERSION} -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.c++=yes -Dcompile.native=true -Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================
Buildfile: build.xml

ant-task-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/maven/maven-ant-tasks/2.0.10/maven-ant-tasks-2.0.10.jar
      [get] To:

mvn-taskdef:

clover.setup:

clover.info:

clover:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To:

ivy-init-dirs:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file =

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file =

init:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [mkdir] Created dir:
    [touch] Creating /tmp/null935091532
   [delete] Deleting: /tmp/null935091532
     [copy] Copying 5 files to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
    [mkdir] Created dir:
     [copy] Copying 5 files to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying to
     [copy] Copying 1 file to
     [copy] Copying to

record-parser:

compile-rcc-compiler:
    [javac] Compiling 29 source files to
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

compile-core-classes:
    [javac] Compiling 398 source files to
    [javac] :33: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] extends Configured implements Reconfigurable {
    [javac]                               ^
    [javac] :68: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]   private Reconfigurable getReconfigurable(HttpServletRequest req) {
    [javac]           ^
    [javac] :93: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]   private void printConf(PrintWriter out, Reconfigurable reconf) {
    [javac]                                           ^
    [javac] :145: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]   private void applyChanges(PrintWriter out, Reconfigurable reconf,
    [javac]                                              ^
    [javac] :31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.Config;
    [javac]        ^
    [javac] :32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.KrbException;
    [javac]        ^
    [javac] :81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac]   private static Config kerbConf;
    [javac]                  ^
    [javac] :39: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.jgss.krb5.Krb5Util;
    [javac]        ^
    [javac] :40: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.Credentials;
    [javac]        ^
    [javac] :41: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.PrincipalName;
    [javac]        ^
    [javac] :61: method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] :88: method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] :97: method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] :72: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]     return (Reconfigurable)
    [javac]             ^
    [javac] :214: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]     Reconfigurable reconf = getReconfigurable(req);
    [javac]     ^
    [javac] :231: cannot find symbol
    [javac] symbol  : class Reconfigurable
    [javac] location: class org.apache.hadoop.conf.ReconfigurationServlet
    [javac]     Reconfigurable reconf = getReconfigurable(req);
    [javac]     ^
    [javac] :85: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac]       kerbConf = Config.getInstance();
    [javac]                  ^
    [javac] :87: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
    [javac]     } catch (KrbException ke) {
    [javac]              ^
    [javac] :120: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac]     Credentials serviceCred = null;
    [javac]     ^
    [javac] :122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]       PrincipalName principal = new PrincipalName(serviceName,
    [javac]       ^
    [javac] :122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]       PrincipalName principal = new PrincipalName(serviceName,
    [javac]                                     ^
    [javac] :123: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]           PrincipalName.KRB_NT_SRV_HST);
    [javac]           ^
    [javac] :125: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac]           .toString(), Krb5Util.ticketToCreds(getTgtFromSubject()));
    [javac]                        ^
    [javac] :124: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac]       serviceCred = Credentials.acquireServiceCreds(principal
    [javac]                     ^
    [javac] :134: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac]         .add(Krb5Util.credsToTicket(serviceCred));
    [javac]              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 10 errors
    [javac] 15 warnings

BUILD FAILED
:346: Compile failed; see the compiler error output for details.
Total time: 9 seconds
======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================
mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
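All ten compile errors above point at one missing type: the HADOOP-7001 commit makes org.apache.hadoop.conf.ReconfigurationServlet depend on a Reconfigurable interface, but no source file defining that interface is present in the checked-out revision, so javac reports "cannot find symbol" at every use site (and the three "@Override" errors follow, since the implementing class's overridden methods no longer have a supertype to override). As a hedged illustration only, a minimal Reconfigurable-style interface might look like the sketch below; the method names here are assumptions for illustration, not the actual HADOOP-7001 API.

```java
import java.util.Arrays;
import java.util.Collection;

// Hypothetical sketch of a runtime-reconfiguration interface: a servlet
// like ReconfigurationServlet would look up such an object and apply
// property changes through it without restarting the daemon.
interface Reconfigurable {
    // Apply a new value for a property at runtime; returns the value in effect.
    String reconfigureProperty(String property, String newValue);

    // Whether this property may be changed without a restart.
    boolean isPropertyReconfigurable(String property);

    // All properties that support runtime reconfiguration.
    Collection<String> getReconfigurableProperties();
}

public class ReconfigurableSketch implements Reconfigurable {
    private String logLevel = "INFO";

    @Override
    public String reconfigureProperty(String property, String newValue) {
        if (!isPropertyReconfigurable(property)) {
            throw new IllegalArgumentException(property + " is not reconfigurable");
        }
        logLevel = newValue;   // only "log.level" passes the guard above
        return logLevel;
    }

    @Override
    public boolean isPropertyReconfigurable(String property) {
        return "log.level".equals(property);
    }

    @Override
    public Collection<String> getReconfigurableProperties() {
        return Arrays.asList("log.level");
    }

    public static void main(String[] args) {
        Reconfigurable r = new ReconfigurableSketch();
        System.out.println(r.reconfigureProperty("log.level", "DEBUG"));
    }
}
```

Until the missing interface's source file is committed, every class that names the type will fail to compile exactly as the log shows; the fix is to add the file, not to change the servlet.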