hama-dev mailing list archives

From "Edward J. Yoon" <edwardy...@apache.org>
Subject Re: Build failed in Jenkins: Hama-Nightly #585
Date Mon, 18 Jun 2012 07:30:20 GMT
I thought you had fixed it? I'm reopening HAMA-590.
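
The failure below comes down to a hostname mismatch: the task asks for a groom at "localhost", but the only registered groom is "vesta.apache.org", so the lookup returns null and scheduling dies with an NPE. A minimal sketch of what a more forgiving location-to-groom lookup could look like (all names here are illustrative, not Hama's actual API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: map a task's preferred location to a registered
// groom, falling back instead of returning null on a hostname mismatch.
public class GroomLookup {
    private final Map<String, String> groomsByHost = new HashMap<>();

    void register(String hostname, String rpcAddress) {
        groomsByHost.put(hostname, rpcAddress);
    }

    // If the requested location ("localhost") matches no registered
    // hostname, fall back to the single active groom rather than
    // returning null and triggering an NPE downstream.
    Optional<String> findGroom(String location) {
        String addr = groomsByHost.get(location);
        if (addr != null) {
            return Optional.of(addr);
        }
        if (groomsByHost.size() == 1) {
            return Optional.of(groomsByHost.values().iterator().next());
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        GroomLookup lookup = new GroomLookup();
        lookup.register("vesta.apache.org", "vesta.apache.org:59498");
        // "localhost" is what the input split reports; only vesta is up.
        System.out.println(lookup.findGroom("localhost").orElse("NONE"));
    }
}
```

The real fix discussed in the thread is presumably to make the groom register under the configured hostname (as DFS does) rather than to relax the lookup, but either change would avoid the null dereference in getTaskToRun.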

On Mon, Jun 18, 2012 at 2:52 PM, Thomas Jungblut
<thomas.jungblut@googlemail.com> wrote:
>>
>> 12/06/17 23:03:43 ERROR bsp.TaskInProgress: Could not find groom for
>> location: localhost ; active grooms: [vesta.apache.org]
>
>
> Okay, so DFS just uses what has been configured; we have to do the same for
> the groom.
> Does anyone want to file a JIRA for that and fix it?
>
> 2012/6/18 Apache Jenkins Server <jenkins@builds.apache.org>
>
>> See <https://builds.apache.org/job/Hama-Nightly/585/changes>
>>
>> Changes:
>>
>> [tjungblut] print the searched location, not the locations
>>
>> ------------------------------------------
>> [...truncated 2892 lines...]
>> 12/06/17 23:03:42 INFO server.NIOServerCnxn: Client attempting to
>> establish new session at /127.0.0.1:34595
>>
>> 12/06/17 23:03:42 INFO persistence.FileTxnLog: Creating new log file: log.1
>>
>> 12/06/17 23:03:42 INFO server.NIOServerCnxn: Established session
>> 0x137fcafff8b0000 with negotiated timeout 40000 for client /
>> 127.0.0.1:34595
>>
>> 12/06/17 23:03:42 INFO zookeeper.ClientCnxn: Session establishment
>> complete on server localhost/127.0.0.1:21810, sessionid =
>> 0x137fcafff8b0000, negotiated timeout = 40000
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server Responder: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server listener on 40000: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 0 on 40000: starting
>>
>> 12/06/17 23:03:42 INFO bsp.BSPMaster: Starting RUNNING
>>
>> 12/06/17 23:03:42 INFO hama.MiniBSPCluster: Waitin for GroomServer up.
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: groom start
>>
>> 12/06/17 23:03:42 INFO zookeeper.ZooKeeper: Initiating client connection,
>> connectString=localhost:21810 sessionTimeout=1200000
>> watcher=org.apache.hama.bsp.GroomServer@4c6320
>>
>> 12/06/17 23:03:42 INFO zookeeper.ClientCnxn: Opening socket connection to
>> server localhost/127.0.0.1:21810
>>
>> 12/06/17 23:03:42 INFO zookeeper.ClientCnxn: Socket connection established
>> to localhost/127.0.0.1:21810, initiating session
>>
>> 12/06/17 23:03:42 INFO server.NIOServerCnxn: Accepted socket connection
>> from /127.0.0.1:34911
>>
>> 12/06/17 23:03:42 INFO server.NIOServerCnxn: Client attempting to
>> establish new session at /127.0.0.1:34911
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: /tmp/hama-test
>>
>> 12/06/17 23:03:42 INFO ipc.Server: Starting SocketReader
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server Responder: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server listener on 59498: starting
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: Worker rpc server -->
>> vesta.apache.org:59498
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 0 on 59498: starting
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: starting webserver:
>> vesta.apache.org
>>
>> 12/06/17 23:03:42 INFO http.HttpServer: Port returned by
>> webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening
>> the listener on 40015
>>
>> 12/06/17 23:03:42 INFO http.HttpServer: listener.getLocalPort() returned
>> 40015 webServer.getConnectors()[0].getLocalPort() returned 40015
>>
>> 12/06/17 23:03:42 INFO http.HttpServer: Jetty bound to port 40015
>>
>> 12/06/17 23:03:42 INFO mortbay.log: jetty-6.1.14
>>
>> 12/06/17 23:03:42 INFO server.NIOServerCnxn: Established session
>> 0x137fcafff8b0001 with negotiated timeout 40000 for client /
>> 127.0.0.1:34911
>>
>> 12/06/17 23:03:42 INFO zookeeper.ClientCnxn: Session establishment
>> complete on server localhost/127.0.0.1:21810, sessionid =
>> 0x137fcafff8b0001, negotiated timeout = 40000
>>
>> 12/06/17 23:03:42 INFO mortbay.log: Extract jar:<
>> https://builds.apache.org/job/Hama-Nightly/ws/trunk/core/target/hama-core-0.5.0-SNAPSHOT.jar!/webapp/groomserver/>
>> to /tmp/Jetty_vesta_apache_org_40015_groomserver____.ql73sy/webapp
>>
>> 12/06/17 23:03:42 INFO mortbay.log: Started
>> SelectChannelConnector@vesta.apache.org:40015
>>
>> 12/06/17 23:03:42 INFO ipc.Server: Starting SocketReader
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server Responder: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server listener on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 0 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 1 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 2 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 3 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 4 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 5 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 6 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 7 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 8 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO ipc.Server: IPC Server handler 9 on 51744: starting
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: GroomServer up at: localhost/
>> 127.0.0.1:51744
>>
>> 12/06/17 23:03:42 INFO bsp.GroomServer: Starting groom:
>> vesta.apache.org:59498
>>
>> 12/06/17 23:03:42 INFO bsp.BSPMaster: groomd_vesta.apache.org_59498 is
>> added.
>>
>> 12/06/17 23:03:43 INFO bsp.TestBSPMasterGroomServer: Client finishes
>> execution job.
>>
>> 12/06/17 23:03:43 INFO bsp.FileInputFormat: Total input paths to process :
>> 1
>>
>> 12/06/17 23:03:43 INFO bsp.FileInputFormat: Total # of splits: 2
>>
>> 12/06/17 23:03:43 WARN bsp.BSPJobClient: No job jar file set.  User
>> classes may not be found. See BSPJob#setJar(String) or check Your jar file.
>>
>> 12/06/17 23:03:43 INFO bsp.JobInProgress: num BSPTasks: 2
>>
>> 12/06/17 23:03:43 WARN bsp.BSPMaster: Path did not start with /, adding
>> it: /job_201206172303_0001
>>
>> 12/06/17 23:03:43 INFO bsp.JobInProgress: Job is initialized.
>>
>> 12/06/17 23:03:43 ERROR bsp.TaskInProgress: Could not find groom for
>> location: localhost ; active grooms: [vesta.apache.org]
>>
>> 12/06/17 23:03:43 INFO bsp.BSPJobClient: Running job: job_201206172303_0001
>>
>> 12/06/17 23:03:43 ERROR bsp.SimpleTaskScheduler: Error submitting job
>> java.util.concurrent.ExecutionException: java.lang.NullPointerException
>>        at
>> java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
>>        at java.util.concurrent.FutureTask.get(FutureTask.java:83)
>>        at
>> org.apache.hama.bsp.SimpleTaskScheduler$JobProcessor.schedule(SimpleTaskScheduler.java:179)
>>        at
>> org.apache.hama.bsp.SimpleTaskScheduler$JobProcessor.run(SimpleTaskScheduler.java:157)
>> Caused by: java.lang.NullPointerException
>>        at
>> org.apache.hama.bsp.TaskInProgress.getTaskToRun(TaskInProgress.java:155)
>>        at
>> org.apache.hama.bsp.JobInProgress.obtainNewTask(JobInProgress.java:275)
>>        at
>> org.apache.hama.bsp.SimpleTaskScheduler$TaskWorker.call(SimpleTaskScheduler.java:233)
>>        at
>> org.apache.hama.bsp.SimpleTaskScheduler$TaskWorker.call(SimpleTaskScheduler.java:202)
>>        at
>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>        at
>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>        at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>        at java.lang.Thread.run(Thread.java:662)
>>
>> 12/06/17 23:03:43 ERROR bsp.SimpleTaskScheduler: Scheduling of job
>> Pagerank could not be done successfully. Killing it!
>>
>> 12/06/17 23:03:46 INFO bsp.BSPJobClient: Current supersteps number: 0
>>
>> 12/06/17 23:03:46 INFO bsp.BSPJobClient: Job failed.
>>
>> 12/06/17 23:03:46 INFO server.PrepRequestProcessor: Processed session
>> termination for sessionid: 0x137fcafff8b0000
>>
>> 12/06/17 23:03:46 INFO zookeeper.ZooKeeper: Session: 0x137fcafff8b0000
>> closed
>>
>> 12/06/17 23:03:46 INFO zookeeper.ClientCnxn: EventThread shut down
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping server on 40000
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server listener on 40000
>>
>> 12/06/17 23:03:46 INFO metrics.RpcInstrumentation: shut down
>>
>> 12/06/17 23:03:46 INFO bsp.BSPMaster: Stopped RPC Master server.
>>
>> 12/06/17 23:03:46 INFO server.NIOServerCnxn: Closed socket connection for
>> client /127.0.0.1:34595 which had sessionid 0x137fcafff8b0000
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 0 on 40000: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server Responder
>>
>> 12/06/17 23:03:46 INFO server.PrepRequestProcessor: Processed session
>> termination for sessionid: 0x137fcafff8b0001
>>
>> 12/06/17 23:03:46 INFO zookeeper.ZooKeeper: Session: 0x137fcafff8b0001
>> closed
>>
>> 12/06/17 23:03:46 INFO zookeeper.ClientCnxn: EventThread shut down
>>
>> 12/06/17 23:03:46 INFO server.NIOServerCnxn: Closed socket connection for
>> client /127.0.0.1:34911 which had sessionid 0x137fcafff8b0001
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping server on 59498
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 0 on 59498: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server Responder
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server listener on 59498
>>
>> 12/06/17 23:03:46 INFO metrics.RpcInstrumentation: shut down
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping server on 51744
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 0 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 2 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 5 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 7 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 4 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 6 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 9 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 8 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server Responder
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 3 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO ipc.Server: Stopping IPC Server listener on 51744
>>
>> 12/06/17 23:03:46 INFO metrics.RpcInstrumentation: shut down
>>
>> 12/06/17 23:03:46 INFO ipc.Server: IPC Server handler 1 on 51744: exiting
>>
>> 12/06/17 23:03:46 INFO server.NIOServerCnxn: NIOServerCnxn factory exited
>> run method
>>
>> 12/06/17 23:03:46 INFO server.PrepRequestProcessor: PrepRequestProcessor
>> exited loop!
>>
>> 12/06/17 23:03:46 INFO server.SyncRequestProcessor: SyncRequestProcessor
>> exited!
>>
>> 12/06/17 23:03:46 INFO server.FinalRequestProcessor: shutdown of request
>> processor complete
>>
>> Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.021 sec
>> <<< FAILURE!
>>
>> Results :
>>
>> Failed tests:
>>  testSubmitJob(org.apache.hama.graph.TestSubmitGraphJob)
>>
>> Tests run: 1, Failures: 1, Errors: 0, Skipped: 0
>>
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Apache Hama parent POM ............................ SUCCESS [8.097s]
>> [INFO] core .............................................. SUCCESS
>> [1:59.310s]
>> [INFO] graph ............................................. FAILURE [8.240s]
>> [INFO] examples .......................................... SKIPPED
>> [INFO] yarn .............................................. SKIPPED
>> [INFO] machine learning .................................. SKIPPED
>> [INFO] hama-dist ......................................... SKIPPED
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] BUILD FAILURE
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Total time: 2:16.470s
>> [INFO] Finished at: Sun Jun 17 23:03:47 UTC 2012
>> [INFO] Final Memory: 41M/247M
>> [INFO]
>> ------------------------------------------------------------------------
>> [ERROR] Failed to execute goal
>> org.apache.maven.plugins:maven-surefire-plugin:2.6:test (default-test) on
>> project hama-graph: There are test failures.
>> [ERROR]
>> [ERROR] Please refer to <
>> https://builds.apache.org/job/Hama-Nightly/ws/trunk/graph/target/surefire-reports>
>> for the individual test results.
>> [ERROR] -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>> [ERROR]
>> [ERROR] After correcting the problems, you can resume the build with the
>> command
>> [ERROR]   mvn <goals> -rf :hama-graph
>> Build step 'Invoke top-level Maven targets' marked build as failure
>> Archiving artifacts
>> Recording test results
>>
>
>
>
> --
> Thomas Jungblut
> Berlin <thomas.jungblut@gmail.com>



-- 
Best Regards, Edward J. Yoon
@eddieyoon
