hadoop-hdfs-dev mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Build failed in Jenkins: Hadoop-Hdfs-trunk #1120
Date: Mon, 30 Jul 2012 13:00:14 GMT
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1120/>

------------------------------------------
[...truncated 12102 lines...]
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
2 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:32: warning: com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.OutputFormat;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:33: warning: com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.XMLSerializer;
[WARNING] ^
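
The two warnings above flag XmlEditsVisitor's direct use of JDK-internal serializer classes (com.sun.org.apache.xml.internal.serialize.*), which the JDK tooling notes may be removed in a future release. For reference only, here is a minimal sketch of the same XML-writing step done through the public JAXP API; the class and method names are hypothetical stand-ins for illustration, not the project's actual code or fix:

    // Sketch: serialize a DOM Document with the public javax.xml.transform API
    // instead of the internal com.sun...OutputFormat/XMLSerializer classes.
    import java.io.OutputStream;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;

    class XmlWriteSketch {                            // hypothetical helper, not XmlEditsVisitor itself
        static void write(Document doc, OutputStream out) throws Exception {
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.ENCODING, "UTF-8");
            t.setOutputProperty(OutputKeys.INDENT, "yes");   // roughly what OutputFormat.setIndenting(true) did
            t.transform(new DOMSource(doc), new StreamResult(out));
        }
    }
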
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs ---
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode/ha already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol/proto already added, skipping
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/datanode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode/ha already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol/proto already added, skipping
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/datanode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
2 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:32: warning: com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.OutputFormat;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:33: warning: com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.XMLSerializer;
[WARNING] ^
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs ---
[INFO] 
[INFO] There are 7403 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HttpFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-httpfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-httpfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-hdfs-httpfs ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes/mrapp-generated-classpath>'.
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-httpfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-httpfs ---
[INFO] Compiling 56 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs-httpfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes/webapp>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes/webapp>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-httpfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-httpfs ---
[INFO] Compiling 44 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12:test (default-test) @ hadoop-hdfs-httpfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.test.TestDirHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.048 sec
Running org.apache.hadoop.test.TestJettyHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.053 sec
Running org.apache.hadoop.test.TestHdfsHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec
Running org.apache.hadoop.test.TestHTestCase
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.197 sec
Running org.apache.hadoop.test.TestExceptionHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.047 sec
Running org.apache.hadoop.test.TestHFSTestCase
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.872 sec
Running org.apache.hadoop.lib.service.instrumentation.TestInstrumentationService
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.662 sec
Running org.apache.hadoop.lib.service.scheduler.TestSchedulerService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.212 sec
Running org.apache.hadoop.lib.service.security.TestProxyUserService
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.529 sec
Running org.apache.hadoop.lib.service.security.TestDelegationTokenManagerService
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.687 sec
Running org.apache.hadoop.lib.service.security.TestGroupsService
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.226 sec
Running org.apache.hadoop.lib.service.hadoop.TestFileSystemAccessService
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.952 sec
Running org.apache.hadoop.lib.server.TestServerConstructor
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.114 sec
Running org.apache.hadoop.lib.server.TestServer
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.5 sec
Running org.apache.hadoop.lib.server.TestBaseService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.269 sec
Running org.apache.hadoop.lib.lang.TestRunnableCallable
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.lib.lang.TestXException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.057 sec
Running org.apache.hadoop.lib.wsrs.TestParam
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.073 sec
Running org.apache.hadoop.lib.wsrs.TestInputStreamEntity
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.05 sec
Running org.apache.hadoop.lib.wsrs.TestJSONProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec
Running org.apache.hadoop.lib.wsrs.TestJSONMapProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 sec
Running org.apache.hadoop.lib.wsrs.TestUserProvider
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.255 sec
Running org.apache.hadoop.lib.servlet.TestServerWebApp
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.21 sec
Running org.apache.hadoop.lib.servlet.TestMDCFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.245 sec
Running org.apache.hadoop.lib.servlet.TestHostnameFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.226 sec
Running org.apache.hadoop.lib.util.TestCheck
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.086 sec
Running org.apache.hadoop.lib.util.TestConfigurationUtils
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.127 sec
Running org.apache.hadoop.fs.http.server.TestHttpFSServer
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.399 sec
Running org.apache.hadoop.fs.http.server.TestHttpFSKerberosAuthenticationHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.011 sec
Running org.apache.hadoop.fs.http.server.TestCheckUploadContentTypeFilter
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.269 sec
Running org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem
Tests run: 30, Failures: 0, Errors: 8, Skipped: 0, Time elapsed: 18.598 sec <<< FAILURE!
Running org.apache.hadoop.fs.http.client.TestHttpFSFileSystem
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.006 sec

Results :

Tests in error: 
  testOperation[1](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=200 != 307, op=OPEN, message=OK
  testOperationDoAs[1](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=200 != 307, op=OPEN, message=OK
  testOperation[2](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem)
  testOperationDoAs[2](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem)
  testOperation[3](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=400 != 200, op=APPEND, message=Data upload requests must have content-type set to 'application/octet-stream'
  testOperationDoAs[3](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=400 != 200, op=APPEND, message=Data upload requests must have content-type set to 'application/octet-stream'
  testOperation[13](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=200 != 307, op=GETFILECHECKSUM, message=OK
  testOperationDoAs[13](org.apache.hadoop.fs.http.client.TestWebhdfsFileSystem): Unexpected HTTP response: code=200 != 307, op=GETFILECHECKSUM, message=OK

Tests run: 238, Failures: 0, Errors: 8, Skipped: 0
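
All eight errors come from TestWebhdfsFileSystem and reflect the HttpFS/WebHDFS HTTP contract: data-upload operations such as APPEND are rejected with 400 unless the request carries Content-Type: application/octet-stream, while OPEN and GETFILECHECKSUM are expected to answer with a 307 redirect rather than a direct 200. As a rough illustration of the content-type rule only (host, port, path, and user below are placeholders, not values from this build), an APPEND request against HttpFS might look like:

    // Sketch: WebHDFS/HttpFS APPEND is an HTTP POST that must carry
    // Content-Type: application/octet-stream; otherwise the server answers 400,
    // as in the failures listed above.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    class AppendSketch {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://httpfs-host:14000/webhdfs/v1/tmp/demo.txt?op=APPEND&user.name=hdfs");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            OutputStream out = conn.getOutputStream();
            out.write("appended bytes\n".getBytes("UTF-8"));
            out.close();
            System.out.println("HTTP " + conn.getResponseCode());
        }
    }
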

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:23:33.623s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [1:12.467s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:24:46.855s
[INFO] Finished at: Mon Jul 30 12:59:47 UTC 2012
[INFO] Final Memory: 51M/778M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12:test (default-test) on project hadoop-hdfs-httpfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-httpfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
