Mailing-List: contact hdfs-dev-help@hadoop.apache.org; run by ezmlm
Reply-To: hdfs-dev@hadoop.apache.org
Date: Mon, 31 Oct 2011 23:19:51 +0000 (UTC)
From: Apache Jenkins Server
To: hdfs-dev@hadoop.apache.org
Message-ID: <623927699.351320103191601.JavaMail.hudson@aegis>
In-Reply-To: <1073126594.4251320065727871.JavaMail.hudson@aegis>
References: <1073126594.4251320065727871.JavaMail.hudson@aegis>
Subject: Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #58
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit

See

Changes:

[todd] HDFS-2512. Add textual error message to data transfer protocol responses. Contributed by Todd Lipcon.

[szetszwo] svn merge -c 1195656 from trunk for HDFS-2385.
[acmurthy] Merge -c 1195579 from trunk to branch-0.23 to fix MAPREDUCE-3275.

[acmurthy] Merge -c 1195575 from trunk to branch-0.23 to fix MAPREDUCE-3035.

[amarrk] MAPREDUCE-3241. [Rumen] Fix Rumen to ignore the AMStartedEvent. (amarrk)

[amarrk] MAPREDUCE-3166. [Rumen] Make Rumen use job history api instead of relying on current history file name format. (Ravi Gummadi via amarrk)

[amarrk] MAPREDUCE-3157. [Rumen] Fix TraceBuilder to handle 0.20 history file names also. (Ravi Gummadi via amarrk)

------------------------------------------
[...truncated 7787 lines...]
80 KB
81 KB
Downloaded: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/2.6/maven-checkstyle-plugin-2.6.jar (81 KB at 137.9 KB/sec)
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 0.23.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir:
Created dir:
[INFO] Compiling 8 JSP source files to
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.270
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.016
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 3 JSP source files to
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.021
[INFO]
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-hdfs ---
[INFO] Source directory: added.
[INFO]
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 328 source files to
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks
main:
     [copy] Copying 1 file to
     [copy] Copying 1 file to
     [copy] Copying 1 file to
     [copy] Copying 6 files to
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (compile) @ hadoop-hdfs ---
[INFO] Executing tasks
main:
     [copy] Copying 15 files to
     [copy] Copied 6 empty directories to 2 empty directories under
[INFO] Executed tasks
[INFO]
[INFO] --- make-maven-plugin:1.0-beta-1:autoreconf (compile) @ hadoop-hdfs ---
[INFO]
[INFO] --- make-maven-plugin:1.0-beta-1:configure (compile) @ hadoop-hdfs ---
[INFO] checking for a BSD-compatible install... /usr/bin/install -c
[INFO] checking whether build environment is sane... yes
[INFO] checking for a thread-safe mkdir -p... /bin/mkdir -p
[INFO] checking for gawk... no
[INFO] checking for mawk... mawk
[INFO] checking whether make sets $(MAKE)... yes
[INFO] checking build system type... x86_64-unknown-linux-gnu
[INFO] checking host system type... x86_64-unknown-linux-gnu
[INFO] checking for style of include used by make... GNU
[INFO] checking for gcc... gcc
[INFO] checking whether the C compiler works... yes
[INFO] checking for C compiler default output file name... a.out
[INFO] checking for suffix of executables...
[INFO] checking whether we are cross compiling... no
[INFO] checking for suffix of object files... o
[INFO] checking whether we are using the GNU C compiler... yes
[INFO] checking whether gcc accepts -g... yes
[INFO] checking for gcc option to accept ISO C89... none needed
[INFO] checking dependency style of gcc... gcc3
[INFO] checking for a sed that does not truncate output... /bin/sed
[INFO] checking for grep that handles long lines and -e... /bin/grep
[INFO] checking for egrep... /bin/grep -E
[INFO] checking for fgrep... /bin/grep -F
[INFO] checking for ld used by gcc... /usr/bin/ld
[INFO] checking if the linker (/usr/bin/ld) is GNU ld... yes
[INFO] checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
[INFO] checking the name lister (/usr/bin/nm -B) interface... BSD nm
[INFO] checking whether ln -s works... yes
[INFO] checking the maximum length of command line arguments... 1572864
[INFO] checking whether the shell understands some XSI constructs... yes
[INFO] checking whether the shell understands "+="... yes
[INFO] checking for /usr/bin/ld option to reload object files... -r
[INFO] checking for objdump... objdump
[INFO] checking how to recognize dependent libraries... pass_all
[INFO] checking for ar... ar
[INFO] checking for strip... strip
[INFO] checking for ranlib... ranlib
[INFO] checking command to parse /usr/bin/nm -B output from gcc object... ok
[INFO] checking how to run the C preprocessor... gcc -E
[INFO] checking for ANSI C header files... yes
[INFO] checking for sys/types.h... yes
[INFO] checking for sys/stat.h... yes
[INFO] checking for stdlib.h... yes
[INFO] checking for string.h... yes
[INFO] checking for memory.h... yes
[INFO] checking for strings.h... yes
[INFO] checking for inttypes.h... yes
[INFO] checking for stdint.h... yes
[INFO] checking for unistd.h... yes
[INFO] checking for dlfcn.h... yes
[INFO] checking for objdir... .libs
[INFO] checking if gcc supports -fno-rtti -fno-exceptions... no
[INFO] checking for gcc option to produce PIC... -fPIC -DPIC
[INFO] checking if gcc PIC flag -fPIC -DPIC works... yes
[INFO] checking if gcc static flag -static works... yes
[INFO] checking if gcc supports -c -o file.o... yes
[INFO] checking if gcc supports -c -o file.o... (cached) yes
[INFO] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[INFO] checking whether -lc should be explicitly linked in... no
[INFO] checking dynamic linker characteristics... GNU/Linux ld.so
[INFO] checking how to hardcode library paths into programs... immediate
[INFO] checking whether stripping libraries is possible... yes
[INFO] checking if libtool supports shared libraries... yes
[INFO] checking whether to build shared libraries... yes
[INFO] checking whether to build static libraries... yes
[INFO] *** Current host ***
[INFO] checking cached host system type... ok
[INFO] *** C-Language compilation tools ***
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for ranlib... (cached) ranlib
[INFO] *** Host support ***
[INFO] checking C flags dependant on host system type... ok
[INFO] *** Java compilation tools ***
[INFO] checking for sablevm... NONE
[INFO] checking for kaffe... NONE
[INFO] checking for javac... /home/jenkins/tools/java/latest/bin/javac
[INFO] /home/jenkins/tools/java/latest/bin/javac
[INFO] checking wether the Java compiler (/home/jenkins/tools/java/latest/bin/javac) works... yes
[INFO] checking for jar... /home/jenkins/tools/java/latest/bin/jar
[INFO] checking where on earth this jvm library is..... ohh u there ... /home/jenkins/tools/java/latest/jre/lib/i386/server
[INFO] VALUE OF JVM_ARCH IS :32
[INFO] gcc flags added
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for size_t... no
[INFO] checking for strdup... no
[INFO] checking for strerror... no
[INFO] checking for strtoul... no
[INFO] checking fcntl.h usability... no
[INFO] checking fcntl.h presence... yes
[INFO] configure: WARNING: fcntl.h: present but cannot be compiled
[INFO] configure: WARNING: fcntl.h: check for missing prerequisite headers?
[INFO] configure: WARNING: fcntl.h: see the Autoconf documentation
[INFO] configure: WARNING: fcntl.h: section "Present But Cannot Be Compiled"
[INFO] configure: WARNING: fcntl.h: proceeding with the compiler's result
[INFO] configure: WARNING: ## --------------------------------- ##
[INFO] configure: WARNING: ## Report this to omalley@apache.org ##
[INFO] configure: WARNING: ## --------------------------------- ##
[INFO] checking for fcntl.h... no
[INFO] checking for an ANSI C-conforming const... yes
[INFO] checking for working volatile... yes
[INFO] checking for stdbool.h that conforms to C99... yes
[INFO] checking for _Bool... no
[INFO] configure: creating ./config.status
[INFO] config.status: creating Makefile
[INFO] config.status: executing depfiles commands
[INFO] config.status: executing libtool commands
[INFO]
[INFO] --- make-maven-plugin:1.0-beta-1:make-install (compile) @ hadoop-hdfs ---
[INFO] /bin/bash ./libtool --tag=CC --mode=compile gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -Dsize_t=unsigned\ int -DHAVE_STDBOOL_H=1 -I. -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c -o hdfs.lo hdfs.c
[INFO] libtool: compile: gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" "-DPACKAGE_STRING=\"libhdfs 0.1.0\"" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" "-Dsize_t=unsigned int" -DHAVE_STDBOOL_H=1 -I. -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c hdfs.c -fPIC -DPIC -o .libs/hdfs.o
[INFO] In file included from /usr/include/features.h:378,
[INFO]                  from /usr/include/sys/types.h:27,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[INFO] In file included from /usr/include/sys/types.h:147,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: duplicate 'unsigned'
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: two or more data types in declaration specifiers
[INFO] make: *** [hdfs.lo] Error 1
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [22.162s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23.890s
[INFO] Finished at: Mon Oct 31 22:09:40 UTC 2011
[INFO] Final Memory: 27M/275M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:make-install (compile) on project hadoop-hdfs: make returned an exit value != 0. Aborting build; see command output above for more information. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
 does not exist.
	at org.apache.tools.ant.types.AbstractFileSet.getDirectoryScanner(AbstractFileSet.java:474)
	at hudson.FilePath$34.hasMatch(FilePath.java:1801)
	at hudson.FilePath$34.invoke(FilePath.java:1710)
	at hudson.FilePath$34.invoke(FilePath.java:1701)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1995)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:287)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-3157
Updating MAPREDUCE-3166
Updating MAPREDUCE-3035
Updating HDFS-2385
Updating MAPREDUCE-3275
Updating HDFS-2512
Updating MAPREDUCE-3241
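Editor's note on the make-install failure above: configure detected a 32-bit JVM (`VALUE OF JVM_ARCH IS :32`) and added `-m32`, but the x86_64 build host apparently lacks the 32-bit glibc development headers, hence `gnu/stubs-32.h: No such file or directory`; the `size_t`/`fcntl.h` misdetections earlier in the configure run look like downstream symptoms of the same broken `-m32` test compiles. A minimal sketch for checking a build host for this condition (file path and echo messages are illustrative, not from the build; Debian/Ubuntu package names shown, other distros differ):

```shell
# Probe whether this host can compile 32-bit binaries at all.
# If the -m32 compile fails, libhdfs will fail exactly as in the log above.
cat > /tmp/m32check.c <<'EOF'
int main(void) { return 0; }
EOF

if gcc -m32 /tmp/m32check.c -o /tmp/m32check 2>/dev/null; then
  echo "32-bit toolchain OK"
else
  echo "32-bit toolchain missing (e.g. install gcc-multilib / libc6-dev-i386)"
fi
```

An alternative would be to point the build at a 64-bit JDK so that configure does not add `-m32` in the first place.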