Date: Thu, 19 Mar 2015 23:03:11 +0000 (UTC)
From: Apache Jenkins Server
To: builds@hbase.apache.org
Reply-To: builds@hbase.apache.org
Message-ID: <302715523.1332.1426806191722.JavaMail.jenkins@crius>
Subject: Build failed in Jenkins: HBase-0.98 #910
X-Jenkins-Job: HBase-0.98
X-Jenkins-Result: FAILURE

See Changes:

[matteo.bertozzi] HBASE-13285 Fix flaky getRegions() in TestAccessController.setUp()

------------------------------------------
[...truncated 2114 lines...]
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:693)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:443)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:447)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:357)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:332)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoFileContent(HRegionFileSystem.java:756)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoOnFilesystem(HRegionFileSystem.java:840)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoOnFilesystem(HRegionFileSystem.java:803)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.createRegionOnFileSystem(HRegionFileSystem.java:869)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4501)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4471)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4444)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4522)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4402)
    at org.apache.hadoop.hbase.HBaseTestingUtility.createTestRegion(HBaseTestingUtility.java:3439)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompactionInternals(TestCacheOnWrite.java:429)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction(TestCacheOnWrite.java:485)

testStoreFileCacheOnWrite[106](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)  Time elapsed: 0.091 sec  <<< ERROR!
java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:693)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:443)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:447)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
    at org.apache.hadoop.fs.FilterFileSystem.create(FilterFileSystem.java:174)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:357)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:332)
    at org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.createOutputStream(AbstractHFileWriter.java:266)
    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:302)
    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:755)
    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:706)
    at org.apache.hadoop.hbase.regionserver.StoreFile$WriterBuilder.build(StoreFile.java:644)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.writeStoreFile(TestCacheOnWrite.java:384)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWriteInternals(TestCacheOnWrite.java:262)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWrite(TestCacheOnWrite.java:479)

testNotCachingDataBlocksDuringCompaction[107](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)  Time elapsed: 0.093 sec  <<< ERROR!
java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:693)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:443)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:447)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:357)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:332)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoFileContent(HRegionFileSystem.java:756)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoOnFilesystem(HRegionFileSystem.java:840)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.writeRegionInfoOnFilesystem(HRegionFileSystem.java:803)
    at org.apache.hadoop.hbase.regionserver.HRegionFileSystem.createRegionOnFileSystem(HRegionFileSystem.java:869)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4501)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4471)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4444)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4522)
    at org.apache.hadoop.hbase.regionserver.HRegion.createHRegion(HRegion.java:4402)
    at org.apache.hadoop.hbase.HBaseTestingUtility.createTestRegion(HBaseTestingUtility.java:3439)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompactionInternals(TestCacheOnWrite.java:429)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction(TestCacheOnWrite.java:485)

testStoreFileCacheOnWrite[107](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)  Time elapsed: 0.085 sec  <<< ERROR!
java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:693)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:443)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:447)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
    at org.apache.hadoop.fs.FilterFileSystem.create(FilterFileSystem.java:174)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:357)
    at org.apache.hadoop.hbase.util.FSUtils.create(FSUtils.java:332)
    at org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.createOutputStream(AbstractHFileWriter.java:266)
    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:302)
    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:755)
    at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:706)
    at org.apache.hadoop.hbase.regionserver.StoreFile$WriterBuilder.build(StoreFile.java:644)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.writeStoreFile(TestCacheOnWrite.java:384)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWriteInternals(TestCacheOnWrite.java:262)
    at org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWrite(TestCacheOnWrite.java:479)

Running org.apache.hadoop.hbase.io.encoding.TestLoadAndSwitchEncodeOnDisk
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.422 sec - in org.apache.hadoop.hbase.io.encoding.TestLoadAndSwitchEncodeOnDisk
Running org.apache.hadoop.hbase.io.encoding.TestChangingEncoding
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.29 sec - in org.apache.hadoop.hbase.io.encoding.TestChangingEncoding
Running org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 91.123 sec - in org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers
Running org.apache.hadoop.hbase.io.encoding.TestDataBlockEncoders
Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 194.673 sec - in org.apache.hadoop.hbase.io.encoding.TestDataBlockEncoders
Running org.apache.hadoop.hbase.io.encoding.TestBufferedDataBlockEncoder
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.225 sec - in org.apache.hadoop.hbase.io.encoding.TestBufferedDataBlockEncoder
Running org.apache.hadoop.hbase.filter.TestFilterWithScanLimits
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.88 sec - in org.apache.hadoop.hbase.filter.TestFilterWithScanLimits
Running org.apache.hadoop.hbase.filter.TestFilterWrapper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.764 sec - in org.apache.hadoop.hbase.filter.TestFilterWrapper
Running org.apache.hadoop.hbase.filter.TestColumnRangeFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.695 sec - in org.apache.hadoop.hbase.filter.TestColumnRangeFilter
Running org.apache.hadoop.hbase.filter.TestFuzzyRowAndColumnRangeFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.695 sec - in org.apache.hadoop.hbase.filter.TestFuzzyRowAndColumnRangeFilter

Results :

Tests in error:
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:486->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:485->testNotCachingDataBlocksDuringCompactionInternals:429 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory
  TestCacheOnWrite.testStoreFileCacheOnWrite:479->testStoreFileCacheOnWriteInternals:262->writeStoreFile:384 » OutOfMemory

Tests run: 2227, Failures: 0, Errors: 16, Skipped: 22
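(Note, not part of the Jenkins output: every error above is java.lang.OutOfMemoryError: unable to create new native thread, which usually means the build host ran out of OS threads or per-user process slots rather than Java heap. A minimal diagnostic sketch for such a host; the commands are standard Linux tools chosen here as an illustration, not taken from this job.)

  # Per-user limit on processes/threads -- the usual culprit for this error.
  ulimit -u
  # System-wide ceiling on threads.
  cat /proc/sys/kernel/threads-max
  # Rough count of threads currently alive on the machine.
  ps -eLf | wc -l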
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] HBase ............................................. SUCCESS [3.057s]
[INFO] HBase - Checkstyle ................................ SUCCESS [0.654s]
[INFO] HBase - Annotations ............................... SUCCESS [0.848s]
[INFO] HBase - Common .................................... SUCCESS [47.492s]
[INFO] HBase - Protocol .................................. SUCCESS [9.154s]
[INFO] HBase - Client .................................... SUCCESS [50.757s]
[INFO] HBase - Hadoop Compatibility ...................... SUCCESS [6.945s]
[INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [5.995s]
[INFO] HBase - Prefix Tree ............................... SUCCESS [8.108s]
[INFO] HBase - Server .................................... FAILURE [3:29:15.452s]
[INFO] HBase - Testing Util .............................. SKIPPED
[INFO] HBase - Thrift .................................... SKIPPED
[INFO] HBase - Rest ...................................... SKIPPED
[INFO] HBase - Shell ..................................... SKIPPED
[INFO] HBase - Integration Tests ......................... SKIPPED
[INFO] HBase - Examples .................................. SKIPPED
[INFO] HBase - Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:31:29.287s
[INFO] Finished at: Thu Mar 19 23:00:34 UTC 2015
[INFO] Final Memory: 51M/683M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18:test (secondPartTestsExecution) on project hbase-server: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
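(Note, not part of the Jenkins output: a minimal sketch of how the failure might be chased locally. The module and test class names come from the log above; -pl, -rf and -Dtest are standard Maven/Surefire options, and "clean test" is only an example goal, so treat the exact invocations as assumptions.)

  # Run just the failing test class inside the hbase-server module.
  mvn test -pl hbase-server -Dtest=TestCacheOnWrite
  # Or, as the Maven message above suggests, resume the reactor from the failed
  # module once the underlying problem is corrected.
  mvn clean test -rf :hbase-server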
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : ZOMBIE_TESTS_COUNT=`jps | grep surefirebooter | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    #It seems sometimes the tests are not dying immediately. Let's give them 10s
    echo "Suspicious java process found - waiting 10s to see if there are just slow to stop"
    sleep 10
    ZOMBIE_TESTS_COUNT=`jps | grep surefirebooter | wc -l`
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
      echo "There are $ZOMBIE_TESTS_COUNT zombie tests, they should have been killed by surefire but survived"
      echo "************ BEGIN zombies jstack extract"
      ZB_STACK=`jps | grep surefirebooter | cut -d ' ' -f 1 | xargs -n 1 jstack | grep ".test" | grep "\.java"`
      jps | grep surefirebooter | cut -d ' ' -f 1 | xargs -n 1 jstack
      echo "************ END zombies jstack extract"
      JIRA_COMMENT="$JIRA_COMMENT {color:red}-1 core zombie tests{color}. There are ${ZOMBIE_TESTS_COUNT} zombie test(s): ${ZB_STACK}"
      BAD=1
      jps | grep surefirebooter | cut -d ' ' -f 1 | xargs kill -9
    else
      echo "We're ok: there is no zombie test, but some tests took some time to stop"
    fi
  else
    echo "We're ok: there is no zombie test"
  fi
[HBase-0.98] $ /bin/bash -xe /tmp/hudson6631607213455244425.sh
++ jps
++ grep surefirebooter
++ wc -l
+ ZOMBIE_TESTS_COUNT=1
+ [[ 1 != 0 ]]
+ echo 'Suspicious java process found - waiting 10s to see if there are just slow to stop'
Suspicious java process found - waiting 10s to see if there are just slow to stop
+ sleep 10
++ jps
++ grep surefirebooter
++ wc -l
+ ZOMBIE_TESTS_COUNT=1
+ [[ 1 != 0 ]]
+ echo 'There are 1 zombie tests, they should have been killed by surefire but survived'
There are 1 zombie tests, they should have been killed by surefire but survived
+ echo '************ BEGIN zombies jstack extract'
************ BEGIN zombies jstack extract
++ jps
++ grep surefirebooter
++ cut -d ' ' -f 1
++ xargs -n 1 jstack
++ grep .test
++ grep '\.java'
15292: Unable to open socket file: target process not responding or HotSpot VM not loaded
The -F option can be used when the target process is not responding
+ ZB_STACK=
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
Archiving artifacts
Sending artifact delta relative to HBase-0.98 #909
Archived 1703 artifacts
Archive block size is 32768
Received 19 blocks and 283866387 bytes
Compression is 0.2%
Took 1 min 56 sec
Recording test results
Updating HBASE-13285
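(Note, not part of the Jenkins output: the jstack attach above failed with "Unable to open socket file", and the tool itself points at the -F option. A minimal sketch of that fallback, reusing the same jps | grep surefirebooter pipeline as the post-build script; the loop is illustrative, not part of the job configuration.)

  for pid in $(jps | grep surefirebooter | cut -d ' ' -f 1); do
    # Try a normal attach first; force a dump with -F if the VM is not responding.
    jstack "$pid" || jstack -F "$pid"
  done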