From: Apache Jenkins Server
Reply-To: dev@atlas.apache.org
To: dev@atlas.apache.org
Date: Mon, 18 Sep 2017 19:27:34 +0000 (UTC)
Message-ID: <1579875104.681.1505762856171.JavaMail.jenkins@jenkins-master.apache.org>
Subject: Build failed in Jenkins: Atlas-0.8-IntegrationTests #65
X-Jenkins-Job: Atlas-0.8-IntegrationTests
X-Jenkins-Result: FAILURE

Changes:

[amestry] ATLAS-2145: Addressed build failure in dashboardv2 module. Removed

------------------------------------------
[...truncated 67.72 MB...]
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:507)
	at org.apache.atlas.hive.hook.HiveHookIT.testExportImportUnPartitionedTable(HiveHookIT.java:745)

testInsertIntoDFSDirPartitioned(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 15.019 sec  <<< FAILURE!
java.lang.AssertionError: Assertions failed. Failing after waiting for timeout 1000 msecs
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.testInsertIntoDFSDirPartitioned(HiveHookIT.java:656)
Caused by: java.lang.AssertionError: expected: but was:
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.testInsertIntoDFSDirPartitioned(HiveHookIT.java:656)

testInsertIntoTable(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 12.976 sec  <<< FAILURE!
java.lang.AssertionError: Assertions failed. Failing after waiting for timeout 1000 msecs
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.testInsertIntoTable(HiveHookIT.java:533)
Caused by: org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasBaseClient$APIInfo@ad00e8b failed with status 404 (Not Found) Response Body ({"error":"Instance hive_process with unique attribute {qualifiedName=QUERY:default.table5klqak804f@primary:1505762340000:default.tablelp5b1qefih@primary:1505762341000->:INSERT:default.tableyqpojwxsru@primary:1505762341000} does not exist"})
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.testInsertIntoTable(HiveHookIT.java:533)

testLineage(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 6.868 sec  <<< FAILURE!
java.lang.AssertionError: Assertions failed. Failing after waiting for timeout 1000 msecs
	at org.apache.atlas.hive.hook.HiveHookIT.testLineage(HiveHookIT.java:1831)
Caused by: org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasBaseClient$APIInfo@dbf2e6f failed with status 404 (Not Found) Response Body ({"error":"Instance hive_table with unique attribute {qualifiedName=default.table1fnglltscd@primary} does not exist"})
	at org.apache.atlas.hive.hook.HiveHookIT.testLineage(HiveHookIT.java:1831)

testLoadDFSPathPartitioned(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 5.512 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<40000>
	at org.apache.atlas.hive.hook.HiveHookIT.testLoadDFSPathPartitioned(HiveHookIT.java:452)

testLoadLocalPath(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 0.056 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<40000>
	at org.apache.atlas.hive.hook.HiveHookIT.testLoadLocalPath(HiveHookIT.java:428)

testLoadLocalPathIntoPartition(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 0.079 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<40000>
	at org.apache.atlas.hive.hook.HiveHookIT.testLoadLocalPathIntoPartition(HiveHookIT.java:439)

testTraitsPreservedOnColumnRename(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 7.234 sec  <<< FAILURE!
org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasBaseClient$APIInfo@1573f83b failed with status 500 (Internal Server Error) Response Body ({"error":"Could not commit transaction due to exception during persistence"})
	at org.apache.atlas.hive.hook.HiveHookIT.createTrait(HiveHookIT.java:950)
	at org.apache.atlas.hive.hook.HiveHookIT.testTraitsPreservedOnColumnRename(HiveHookIT.java:1256)

testUpdateProcess(org.apache.atlas.hive.hook.HiveHookIT)  Time elapsed: 23.298 sec  <<< FAILURE!
java.lang.AssertionError: Assertions failed.
Failing after waiting for timeout 1000 msecs
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:507)
	at org.apache.atlas.hive.hook.HiveHookIT.testUpdateProcess(HiveHookIT.java:614)
Caused by: java.lang.AssertionError: expected: but was:
	at org.apache.atlas.hive.hook.HiveHookIT.assertProcessIsRegistered(HiveHookIT.java:1735)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:487)
	at org.apache.atlas.hive.hook.HiveHookIT.validateProcess(HiveHookIT.java:507)
	at org.apache.atlas.hive.hook.HiveHookIT.testUpdateProcess(HiveHookIT.java:614)


Results :

Failed tests:
  HiveMetastoreBridgeIT.testCreateTableAndImport:41->HiveITBase.assertDatabaseIsRegistered:237->HiveITBase.assertDatabaseIsRegistered:243->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveMetastoreBridgeIT.testImportCreatedTable:77->HiveITBase.assertDatabaseIsRegistered:237->HiveITBase.assertDatabaseIsRegistered:243->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterDBOwner:1569->HiveITBase.assertDatabaseIsRegistered:237->HiveITBase.assertDatabaseIsRegistered:243->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterDBProperties:1589->testAlterProperties:1616->verifyEntityProperties:1652->HiveITBase.assertDatabaseIsRegistered:243->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableBucketingClusterSort:1356->runBucketSortQuery:1365->assertTableIsRegistered:1802->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableChangeColumn:1023->assertColumnIsRegistered:283->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableProperties:1596->testAlterProperties:1616->verifyEntityProperties:1644->assertTableIsRegistered:1802->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableRename:889->createTrait:950 » AtlasService Metadata s...
  HiveHookIT.testAlterTableRenameAliasRegistered:867->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableSerde:1403->runSerdePropsQuery:1550->verifyTableSdProperties:1663->assertTableIsRegistered:1802->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterTableWithoutHookConf:1243->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterViewAsSelect:397->assertProcessIsRegistered:1710->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testAlterViewProperties:1637->testAlterProperties:1616->verifyEntityProperties:1644->assertTableIsRegistered:1802->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testCreateDatabase:87->assertDBIsNotRegistered:1798->assertEntityIsNotRegistered:1806->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testCreateView:368->assertProcessIsRegistered:1710->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testDropAndRecreateCTASOutput:338->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testDropTable:1416->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testDropView:1526->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testEmptyStringAsValue:328->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testExportImportPartitionedTable:798->validateProcess:487->assertProcessIsRegistered:1735->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testExportImportUnPartitionedTable:745->validateProcess:507->validateProcess:487->assertProcessIsRegistered:1735->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testInsertIntoDFSDirPartitioned:656->validateProcess:487->assertProcessIsRegistered:1735->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testInsertIntoTable:533->validateProcess:487->assertProcessIsRegistered:1735->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testLineage:1831->HiveITBase.assertTableIsRegistered:146->HiveITBase.assertTableIsRegistered:152->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs
  HiveHookIT.testLoadDFSPathPartitioned:452->HiveITBase.runCommand:105->HiveITBase.runCommandWithDelay:113->HiveITBase.runCommandWithDelay:120 expected:<0> but was:<40000>
  HiveHookIT.testLoadLocalPath:428->HiveITBase.runCommand:105->HiveITBase.runCommandWithDelay:113->HiveITBase.runCommandWithDelay:120 expected:<0> but was:<40000>
  HiveHookIT.testLoadLocalPathIntoPartition:439->HiveITBase.runCommand:105->HiveITBase.runCommandWithDelay:113->HiveITBase.runCommandWithDelay:120 expected:<0> but was:<40000>
  HiveHookIT.testTraitsPreservedOnColumnRename:1256->createTrait:950 » AtlasService
  HiveHookIT.testUpdateProcess:614->validateProcess:507->validateProcess:487->assertProcessIsRegistered:1735->HiveITBase.assertEntityIsRegistered:158->HiveITBase.waitFor:202 Assertions failed.
Failing after waiting for timeout 1000 msecs

Tests run: 45, Failures: 29, Errors: 0, Skipped: 0

[INFO]
[INFO] --- jetty-maven-plugin:9.2.12.v20150709:stop (stop-jetty) @ hive-bridge ---
[INFO] Waiting 10 seconds for jetty to stop
[INFO] Stopped ServerConnector@516552dc{HTTP/1.1}{0.0.0.0:31000}
[INFO] Closing Spring root WebApplicationContext
2017-09-18 19:20:12,595 INFO  - [ShutdownMonitor:] ~ Stopping service org.apache.atlas.web.service.ActiveInstanceElectorService (Services:65)
2017-09-18 19:20:12,596 INFO  - [ShutdownMonitor:] ~ HA is not enabled, no need to stop leader election service (ActiveInstanceElectorService:124)
2017-09-18 19:20:12,596 INFO  - [ShutdownMonitor:] ~ Stopping service org.apache.atlas.kafka.KafkaNotification (Services:65)
2017-09-18 19:20:15,010 INFO  - [ShutdownMonitor:] ~ Stopping service org.apache.atlas.notification.NotificationHookConsumer (Services:65)
2017-09-18 19:20:15,011 INFO  - [ShutdownMonitor:] ~ ==> stopConsumerThreads() (NotificationHookConsumer:181)
2017-09-18 19:20:15,011 INFO  - [ShutdownMonitor:] ~ ==> HookConsumer shutdown() (NotificationHookConsumer$HookConsumer:485)
2017-09-18 19:20:15,012 INFO  - [ShutdownMonitor:] ~ [atlas-hook-consumer-thread], Shutting down (Logging$class:68)
2017-09-18 19:20:15,014 INFO  - [NotificationHookConsumer thread-0:] ~ closing NotificationConsumer (NotificationHookConsumer$HookConsumer:316)
2017-09-18 19:20:15,025 INFO  - [NotificationHookConsumer thread-0:] ~ <== HookConsumer doWork() (NotificationHookConsumer$HookConsumer:320)
2017-09-18 19:20:15,026 INFO  - [NotificationHookConsumer thread-0:] ~ [atlas-hook-consumer-thread], Stopped (Logging$class:68)
2017-09-18 19:20:15,029 INFO  - [ShutdownMonitor:] ~ [atlas-hook-consumer-thread], Shutdown completed (Logging$class:68)
2017-09-18 19:20:15,030 INFO  - [ShutdownMonitor:] ~ <== HookConsumer shutdown() (NotificationHookConsumer$HookConsumer:500)
2017-09-18 19:20:15,030 INFO  - [ShutdownMonitor:] ~ <== stopConsumerThreads() (NotificationHookConsumer:190)
2017-09-18 19:20:15,041 DEBUG - [ShutdownMonitor:] ~ ==> AtlasAuthorizationFilter destroy (AtlasAuthorizationFilter:78)
2017-09-18 19:20:15,041 DEBUG - [ShutdownMonitor:] ~ ==> +SimpleAtlasAuthorizer cleanUp (SimpleAtlasAuthorizer:327)
2017-09-18 19:20:15,041 DEBUG - [ShutdownMonitor:] ~ <== +SimpleAtlasAuthorizer cleanUp (SimpleAtlasAuthorizer:338)
2017-09-18 19:20:15,193 DEBUG - [main-SendThread(localhost:19026):] ~ ==> InMemoryJAASConfiguration.getAppConfigurationEntry(Client) (InMemoryJAASConfiguration:208)
2017-09-18 19:20:15,194 DEBUG - [main-SendThread(localhost:19026):] ~ <== InMemoryJAASConfiguration.getAppConfigurationEntry(Client): {} (InMemoryJAASConfiguration:238)
[INFO] Shutting down log4j
[INFO] Stopped o.e.j.m.p.JettyWebAppContext@79f40a1{/,
[INFO] Server reports itself as stopped
[INFO]
[INFO] --- maven-source-plugin:2.4:jar-no-fork (attach-sources) @ hive-bridge ---
[INFO] Building jar:
log4j:WARN No appenders could be found for logger (org.apache.atlas.security.InMemoryJAASConfiguration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[INFO]
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (attach-sources) @ hive-bridge ---
[INFO] Building jar:
[INFO]
[INFO] --- maven-failsafe-plugin:2.19.1:verify (verify) @ hive-bridge ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Atlas Server Build Tools .................... SUCCESS [  1.513 s]
[INFO] apache-atlas ....................................... SUCCESS [  9.360 s]
[INFO] Apache Atlas Integration ........................... SUCCESS [ 37.934 s]
[INFO] Apache Atlas Common ................................ SUCCESS [ 22.433 s]
[INFO] Apache Atlas Typesystem ............................ SUCCESS [01:33 min]
[INFO] Apache Atlas Client ................................ SUCCESS [ 23.781 s]
[INFO] Apache Atlas Server API ............................ SUCCESS [ 12.456 s]
[INFO] Apache Atlas Notification .......................... SUCCESS [ 22.998 s]
[INFO] Apache Atlas Graph Database Projects ............... SUCCESS [  0.721 s]
[INFO] Apache Atlas Graph Database API .................... SUCCESS [  8.306 s]
[INFO] Graph Database Common Code ......................... SUCCESS [  9.262 s]
[INFO] Apache Atlas Titan 1.0.0 GraphDB Impl .............. SUCCESS [ 39.551 s]
[INFO] Shaded version of Apache hbase client .............. SUCCESS [ 10.016 s]
[INFO] Apache Atlas Titan 0.5.4 Graph DB Impl ............. SUCCESS [ 55.844 s]
[INFO] Apache Atlas Graph Database Implementation Dependencies SUCCESS [  1.332 s]
[INFO] Shaded version of Apache hbase server .............. SUCCESS [ 27.039 s]
[INFO] Apache Atlas Repository ............................ SUCCESS [02:06 min]
[INFO] Apache Atlas Authorization ......................... SUCCESS [ 12.978 s]
[INFO] Apache Atlas Business Catalog ...................... SUCCESS [ 21.075 s]
[INFO] Apache Atlas UI .................................... SUCCESS [02:06 min]
[INFO] Apache Atlas Web Application ....................... SUCCESS [05:42 min]
[INFO] Apache Atlas Documentation ......................... SUCCESS [  9.335 s]
[INFO] Apache Atlas FileSystem Model ...................... SUCCESS [  4.651 s]
[INFO] Apache Atlas Plugin Classloader .................... SUCCESS [ 12.252 s]
[INFO] Apache Atlas Hive Bridge Shim ...................... SUCCESS [ 12.735 s]
[INFO] Apache Atlas Hive Bridge ........................... FAILURE [07:14 min]
[INFO] Apache Atlas Falcon Bridge Shim .................... SKIPPED
[INFO] Apache Atlas Falcon Bridge ......................... SKIPPED
[INFO] Apache Atlas Sqoop Bridge Shim ..................... SKIPPED
[INFO] Apache Atlas Sqoop Bridge .......................... SKIPPED
[INFO] Apache Atlas Storm Bridge Shim ..................... SKIPPED
[INFO] Apache Atlas Storm Bridge .......................... SKIPPED
[INFO] Apache Atlas Distribution .......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:42 min
[INFO] Finished at: 2017-09-18T19:20:15Z
[INFO] Final Memory: 557M/2179M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.19.1:verify (verify) on project hive-bridge: There are test failures.
[ERROR]
[ERROR] Please refer to for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hive-bridge
Build step 'Execute shell' marked build as failure
[CHECKSTYLE] Collecting checkstyle analysis files...
[CHECKSTYLE] Searching for all files in that match the pattern **/target/checkstyle-result.xml
[CHECKSTYLE] Parsing 24 files in
[CHECKSTYLE] Successfully parsed file with 0 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 15 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 482 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 2619 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 4257 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5465 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5465 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5465 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5734 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5921 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5921 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 5921 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 8306 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 9254 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 15686 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 16896 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 17009 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 35820 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 36199 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 36199 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 36199 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 36199 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 40417 unique warnings and 0 duplicates.
[CHECKSTYLE] Successfully parsed file with 48567 unique warnings and 0 duplicates.
Using GitBlamer to create author and commit information for all warnings.
GIT_COMMIT=6c2bddf78f295a47a51fcabe0221414210b688c0, workspace=
 > git rev-parse 6c2bddf78f295a47a51fcabe0221414210b688c0^{commit} # timeout=10
Skipping file no result found.
[FINDBUGS] Collecting findbugs analysis files...
[FINDBUGS] Searching for all files in that match the pattern **/target/findbugs.xml
[FINDBUGS] Parsing 15 files in
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file with 0 unique warnings and 0 duplicates.
Using GitBlamer to create author and commit information for all warnings.
GIT_COMMIT=6c2bddf78f295a47a51fcabe0221414210b688c0, workspace=
Archiving artifacts
[Fast Archiver] No prior successful build to compare, so performing full copy of artifacts
TestNG Reports Processing: START
Looking for TestNG results report in workspace using pattern: **/target/surefire-reports/testng-results.xml,**/target/failsafe-reports/testng-results.xml
Saving reports...
Processing '/x1/jenkins/jenkins-home/jobs/Atlas-0.8-IntegrationTests/builds/65/testng/testng-results-1.xml'
Processing '/x1/jenkins/jenkins-home/jobs/Atlas-0.8-IntegrationTests/builds/65/testng/testng-results.xml'
16.292135% of tests failed, which exceeded threshold of 0%. Marking build as UNSTABLE
TestNG Reports Processing: FINISH
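Editor's note: most of the 29 failures above end in "HiveITBase.waitFor:202 Assertions failed. Failing after waiting for timeout 1000 msecs" — the test polls the Atlas metadata service until an entity appears or a 1000 ms deadline passes, and the deadline expires before the asynchronous Hive-hook notification has been consumed. A minimal sketch of that poll-until-deadline pattern follows; the class and method names here are hypothetical illustrations, not the actual HiveITBase code.

```java
// Hypothetical sketch of a poll-until-deadline helper in the style of
// HiveITBase.waitFor(...). Names and signature are assumptions.
public class WaitUtil {
    /** A check that is retried until it succeeds or the deadline passes. */
    public interface Predicate {
        boolean evaluate() throws Exception;
    }

    /**
     * Polls the predicate every pollIntervalMs until it returns true or
     * timeoutMs elapses. Returns false on timeout -- the
     * "Failing after waiting for timeout N msecs" case in the log above.
     */
    public static boolean waitFor(long timeoutMs, long pollIntervalMs, Predicate predicate)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (true) {
            try {
                if (predicate.evaluate()) {
                    return true;  // condition met, e.g. entity is now registered
                }
            } catch (Exception transientFailure) {
                // e.g. a 404 from the metadata service while the hook
                // notification is still in flight; retry until the deadline
            }
            if (System.currentTimeMillis() >= deadline) {
                return false;     // timed out; the caller fails its assertion
            }
            Thread.sleep(pollIntervalMs);
        }
    }
}
```

Under this pattern, the 404 responses in the log ("Instance hive_process ... does not exist") are the expected transient state while waiting, which suggests the 1000 ms budget is simply too short for Kafka-based hook delivery on this build machine; whether to raise the timeout is a call for the Atlas developers, not something the log itself establishes.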