From: Apache Jenkins Server
To: common-dev@hadoop.apache.org, hdfs-dev@hadoop.apache.org, mapreduce-dev@hadoop.apache.org, yarn-dev@hadoop.apache.org
Date: Wed, 31 May 2017 17:50:27 +0000 (UTC)
Subject: Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86
X-Jenkins-Job: hadoop-qbt-trunk-java8-linux-x86
X-Jenkins-Result: FAILURE

For more details, see https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/

[May 30, 2017 8:22:40 AM] (sunilg) YARN-6635. Refactor yarn-app pages in new YARN UI. Contributed by Akhil
[May 30, 2017 5:07:58 PM] (brahma) HADOOP-14456. Modifier 'static' is redundant for inner enums.
[May 30, 2017 6:10:12 PM] (lei) HDFS-11659. TestDataNodeHotSwapVolumes.testRemoveVolumeBeingWritten fail
[May 30, 2017 11:58:15 PM] (haibochen) YARN-6477. Dispatcher no longer needs the raw types suppression.
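Background for HADOOP-14456 above: a member enum in Java is implicitly static, so spelling out the modifier is redundant. A minimal illustration with a hypothetical class, not the actual Hadoop code:

    public class Outer {
        // "static" here is redundant: a member enum is implicitly static (JLS 8.9).
        static enum Redundant { A, B }

        // Equivalent, and the form such cleanups prefer.
        enum Preferred { C, D }
    }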
-1 overall

The following subsystems voted -1:
    findbugs unit

The following subsystems voted -1 but were configured to be filtered/ignored:
    cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace

The following subsystems are considered long running:
(runtime bigger than 1h 0m 0s)
    unit

Specific tests:

    FindBugs :

        module:hadoop-common-project/hadoop-minikdc
        Possible null pointer dereference in org.apache.hadoop.minikdc.MiniKdc.delete(File) due to return value of called method Dereferenced at MiniKdc.java:[line 368]

    FindBugs :

        module:hadoop-common-project/hadoop-auth
        org.apache.hadoop.security.authentication.server.MultiSchemeAuthenticationHandler.authenticate(HttpServletRequest, HttpServletResponse) makes inefficient use of keySet iterator instead of entrySet iterator At MultiSchemeAuthenticationHandler.java:[line 192]

    FindBugs :

        module:hadoop-common-project/hadoop-common
        org.apache.hadoop.crypto.CipherSuite.setUnknownValue(int) unconditionally sets the field unknownValue At CipherSuite.java:[line 44]
        org.apache.hadoop.crypto.CryptoProtocolVersion.setUnknownValue(int) unconditionally sets the field unknownValue At CryptoProtocolVersion.java:[line 67]
        Possible null pointer dereference in org.apache.hadoop.fs.FileUtil.fullyDeleteOnExit(File) due to return value of called method Dereferenced at FileUtil.java:[line 118]
        Possible null pointer dereference in org.apache.hadoop.fs.RawLocalFileSystem.handleEmptyDstDirectoryOnWindows(Path, File, Path, File) due to return value of called method Dereferenced at RawLocalFileSystem.java:[line 387]
        Return value of org.apache.hadoop.fs.permission.FsAction.or(FsAction) ignored, but method has no side effect At FTPFileSystem.java:[line 421]
        Useless condition: lazyPersist == true at this point At CommandWithDestination.java:[line 502]
        org.apache.hadoop.io.DoubleWritable.compareTo(DoubleWritable) incorrectly handles double value At DoubleWritable.java:[line 78]
        org.apache.hadoop.io.DoubleWritable$Comparator.compare(byte[], int, int, byte[], int, int) incorrectly handles double value At DoubleWritable.java:[line 97]
        org.apache.hadoop.io.FloatWritable.compareTo(FloatWritable) incorrectly handles float value At FloatWritable.java:[line 71]
        org.apache.hadoop.io.FloatWritable$Comparator.compare(byte[], int, int, byte[], int, int) incorrectly handles float value At FloatWritable.java:[line 89]
        Possible null pointer dereference in org.apache.hadoop.io.IOUtils.listDirectory(File, FilenameFilter) due to return value of called method Dereferenced at IOUtils.java:[line 350]
        org.apache.hadoop.io.erasurecode.ECSchema.toString() makes inefficient use of keySet iterator instead of entrySet iterator At ECSchema.java:[line 193]
        Possible bad parsing of shift operation in org.apache.hadoop.io.file.tfile.Utils$Version.hashCode() At Utils.java:[line 398]
        org.apache.hadoop.metrics2.lib.DefaultMetricsFactory.setInstance(MutableMetricsFactory) unconditionally sets the field mmfImpl At DefaultMetricsFactory.java:[line 49]
        org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.setMiniClusterMode(boolean) unconditionally sets the field miniClusterMode At DefaultMetricsSystem.java:[line 100]
        Useless object stored in variable seqOs of method org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager.addOrUpdateToken(AbstractDelegationTokenIdentifier, AbstractDelegationTokenSecretManager$DelegationTokenInformation, boolean) At ZKDelegationTokenSecretManager.java:[line 886]
        Bad comparison of nonnegative value with 0 in org.apache.hadoop.tracing.TraceAdmin.run(String[]) At TraceAdmin.java:[line 169]
        Inconsistent synchronization of org.apache.hadoop.util.SysInfoWindows.cpuUsage; locked 50% of time Unsynchronized access at SysInfoWindows.java:[line 201]
        Inconsistent synchronization of org.apache.hadoop.util.SysInfoWindows.numProcessors; locked 50% of time Unsynchronized access at SysInfoWindows.java:[line 174]
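Several of the "possible null pointer dereference ... due to return value of called method" warnings above follow the same FindBugs pattern: File#listFiles() (and File#list()) can return null, and the flagged methods use that return value without checking it. A minimal sketch of the guarded form, using a hypothetical helper rather than the flagged Hadoop methods:

    import java.io.File;
    import java.io.IOException;

    public final class SafeDelete {
        // Recursively delete a file or directory, guarding against the null
        // that File#listFiles() returns on I/O error.
        static void deleteRecursively(File f) throws IOException {
            if (f.isDirectory()) {
                File[] children = f.listFiles();   // may be null, not just empty
                if (children == null) {
                    throw new IOException("Could not list contents of " + f);
                }
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
            if (!f.delete()) {
                throw new IOException("Could not delete " + f);
            }
        }
    }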
    FindBugs :

        module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common
        Possible null pointer dereference in org.apache.hadoop.yarn.logaggregation.AggregatedLogFormat$LogValue.getPendingLogFilesToUpload(File) due to return value of called method Method invoked at AggregatedLogFormat.java:[line 318]

    FindBugs :

        module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager
        Useless object stored in variable removedNullContainers of method org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.removeOrTrackCompletedContainersFromContext(List) At NodeStatusUpdaterImpl.java:[line 642]
        org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.removeVeryOldStoppedContainersFromCache() makes inefficient use of keySet iterator instead of entrySet iterator At NodeStatusUpdaterImpl.java:[line 719]
        Hard coded reference to an absolute pathname in org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DockerLinuxContainerRuntime.launchContainer(ContainerRuntimeContext) At DockerLinuxContainerRuntime.java:[line 455]
        org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.createStatus() makes inefficient use of keySet iterator instead of entrySet iterator At ContainerLocalizer.java:[line 334]
        org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainerMetrics.usageMetrics is a mutable collection which should be package protected At ContainerMetrics.java:[line 134]
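The repeated "makes inefficient use of keySet iterator instead of entrySet iterator" findings above share one shape: the loop iterates keySet() and calls get(key) for every element, paying an extra lookup per entry. A minimal sketch with a hypothetical map, not the flagged classes:

    import java.util.HashMap;
    import java.util.Map;

    public final class MapIteration {
        public static void main(String[] args) {
            Map<String, Integer> counts = new HashMap<>();
            counts.put("a", 1);
            counts.put("b", 2);

            // Flagged pattern: one extra hash lookup per key inside the loop.
            for (String key : counts.keySet()) {
                System.out.println(key + "=" + counts.get(key));
            }

            // Preferred: entrySet() yields key and value together.
            for (Map.Entry<String, Integer> e : counts.entrySet()) {
                System.out.println(e.getKey() + "=" + e.getValue());
            }
        }
    }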
    Failed junit tests :

        hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
        hadoop.hdfs.TestErasureCodeBenchmarkThroughput
        hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090
        hadoop.hdfs.server.datanode.TestDataNodeVolumeFailure
        hadoop.hdfs.TestRollingUpgrade
        hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureReporting
        hadoop.yarn.server.resourcemanager.security.TestDelegationTokenRenewer
        hadoop.yarn.server.TestContainerManagerSecurity
        hadoop.yarn.server.TestDiskFailures
        hadoop.yarn.server.TestMiniYarnClusterNodeUtilization
        hadoop.yarn.client.api.impl.TestAMRMProxy
        hadoop.mapreduce.v2.app.TestRuntimeEstimators
        hadoop.yarn.sls.nodemanager.TestNMSimulator

    Timed out junit tests :

        org.apache.hadoop.yarn.server.resourcemanager.TestSubmitApplicationWithRMHA

    cc:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-compile-cc-root.txt [4.0K]

    javac:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-compile-javac-root.txt [184K]

    checkstyle:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-checkstyle-root.txt [17M]

    pylint:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-patch-pylint.txt [20K]

    shellcheck:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-patch-shellcheck.txt [20K]

    shelldocs:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-patch-shelldocs.txt [12K]

    whitespace:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/whitespace-eol.txt [12M]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/whitespace-tabs.txt [1.2M]

    findbugs:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/branch-findbugs-hadoop-common-project_hadoop-minikdc-warnings.html [8.0K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/branch-findbugs-hadoop-common-project_hadoop-auth-warnings.html [8.0K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/branch-findbugs-hadoop-common-project_hadoop-common-warnings.html [28K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-warnings.html [8.0K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager-warnings.html [12K]

    javadoc:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/diff-javadoc-javadoc-root.txt [2.2M]

    unit:
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt [356K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt [60K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-tests.txt [324K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-client.txt [16K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-app.txt [20K]
        https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/420/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt [8.0K]

Powered by Apache Yetus 0.5.0-SNAPSHOT   http://yetus.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org