hadoop-yarn-dev mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86
Date: Mon, 11 Sep 2017 16:07:20 GMT
For more details, see https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/

[Sep 11, 2017 4:58:10 AM] (sunilg) YARN-7163. RMContext need not to be injected to webapp
and other Always Running services.
[Sep 11, 2017 6:17:59 AM] (yufei) YARN-6799. Remove the duplicated code in CGroupsHandlerImp.java.




-1 overall


The following subsystems voted -1:
    findbugs unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
    cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime longer than 1h 0m 0s)
    unit


Specific tests:

    FindBugs :

       module:hadoop-hdfs-project/hadoop-hdfs 
       Format-string method String.format(String, Object[]) called with format string "File
%s could only be written to %d of the %d %s. There are %d datanode(s) running and %s node(s)
are excluded in this operation." wants 6 arguments but is given 7 in
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(String,
int, Node, Set, long, List, byte, BlockType, ErasureCodingPolicy, EnumSet)
At BlockManager.java:[line 2076]
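
       For reference, this is the FindBugs check for a format string whose conversion
       specifiers do not match the number of arguments passed to String.format: the
       string above contains 6 specifiers, but the call supplies 7 arguments. Below is
       a minimal, self-contained sketch of that pattern; the class name and argument
       values are made up for illustration and are not taken from BlockManager.

           // Hypothetical illustration of the warning above; NOT the actual BlockManager code.
           public class FormatStringExample {
               public static void main(String[] args) {
                   // The format string has 6 conversion specifiers: %s, %d, %d, %s, %d, %s.
                   String msg = String.format(
                       "File %s could only be written to %d of the %d %s. "
                           + "There are %d datanode(s) running and %s node(s) "
                           + "are excluded in this operation.",
                       "/user/foo/file",   // %s  (1)
                       1,                  // %d  (2)
                       3,                  // %d  (3)
                       "racks",            // %s  (4)
                       5,                  // %d  (5)
                       "2",                // %s  (6)
                       "extra");           // 7th argument: ignored at runtime, flagged by FindBugs
                   System.out.println(msg);
               }
           }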

    FindBugs :

       module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager

       Hard coded reference to an absolute pathname in
org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DockerLinuxContainerRuntime.launchContainer(ContainerRuntimeContext)
At DockerLinuxContainerRuntime.java:[line 490]
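
       For reference, this is the FindBugs check for an absolute pathname hard-coded
       into source. A minimal sketch of the pattern follows; the class, the
       "/usr/bin/docker" literal, and the "docker.binary" configuration key are
       hypothetical and are not taken from DockerLinuxContainerRuntime.

           // Hypothetical illustration of the warning above; NOT the actual
           // DockerLinuxContainerRuntime code.
           import java.io.File;

           public class HardcodedPathExample {

               // Literals like this can trigger the warning when used to build a File:
               static File flagged() {
                   return new File("/usr/bin/docker");   // hard-coded absolute path
               }

               // A common remedy is to take the location from configuration instead:
               static File preferred(java.util.Properties conf) {
                   // "docker.binary" is a made-up key for illustration only
                   return new File(conf.getProperty("docker.binary", "docker"));
               }
           }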

    Failed junit tests :

       hadoop.ha.TestZKFailoverController 
       hadoop.hdfs.TestReconstructStripedFile 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150 
       hadoop.hdfs.TestLeaseRecoveryStriped 
       hadoop.hdfs.TestClientProtocolForPipelineRecovery 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure050 
       hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithNodeGroup 
       hadoop.hdfs.server.datanode.TestDataNodeVolumeFailure 
       hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure180 
       hadoop.hdfs.TestFileAppendRestart 
       hadoop.hdfs.server.namenode.TestNamenodeCapacityReport 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure030 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure160 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure080 
       hadoop.hdfs.TestReadStripedFileWithMissingBlocks 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure040 
       hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureReporting 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure010 
       hadoop.hdfs.server.blockmanagement.TestBlockManager 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure120 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070 
       hadoop.hdfs.protocol.datatransfer.sasl.TestSaslDataTransfer 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure 
       hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000 
       hadoop.fs.http.client.TestHttpFSFWithWebhdfsFileSystem 
       hadoop.yarn.server.resourcemanager.scheduler.capacity.TestContainerAllocation 
       hadoop.yarn.server.resourcemanager.scheduler.TestAbstractYarnScheduler 
       hadoop.yarn.server.router.webapp.TestRouterWebServiceUtil 
       hadoop.mapreduce.v2.hs.webapp.TestHSWebApp 
       hadoop.yarn.sls.TestReservationSystemInvariants 
       hadoop.yarn.sls.TestSLSRunner 

    Timed out junit tests :

       org.apache.hadoop.hdfs.TestWriteReadStripedFile 
      

   cc:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-compile-cc-root.txt
 [4.0K]

   javac:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-compile-javac-root.txt
 [292K]

   checkstyle:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-checkstyle-root.txt
 [17M]

   pylint:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-patch-pylint.txt
 [20K]

   shellcheck:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-patch-shellcheck.txt
 [20K]

   shelldocs:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-patch-shelldocs.txt
 [12K]

   whitespace:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/whitespace-eol.txt
 [11M]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/whitespace-tabs.txt
 [1.2M]

   findbugs:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-warnings.html
 [8.0K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager-warnings.html
 [8.0K]

   javadoc:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/diff-javadoc-javadoc-root.txt
 [1.9M]

   unit:

       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
 [148K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 [1.7M]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt
 [16K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 [64K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt
 [484K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-hs.txt
 [16K]
       https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/520/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt
 [20K]

Powered by Apache Yetus 0.6.0-SNAPSHOT   http://yetus.apache.org
