giraph-user mailing list archives

From Jyoti Yadav <rao.jyoti26ya...@gmail.com>
Subject Re: Run time error
Date Tue, 31 Dec 2013 04:01:01 GMT
Hi Pushparaj,
Thanks for your reply. I checked the logs; they show the following error:


2013-12-30 22:44:21,297 INFO org.apache.giraph.master.MasterThread: masterThread: Coordination of superstep 97 took 26.572 seconds ended with state THIS_SUPERSTEP_DONE and is now on superstep 98
2013-12-30 22:44:21,690 ERROR org.apache.giraph.master.MasterThread: masterThread: Master algorithm failed with CountersExceededException
org.apache.hadoop.mapred.Counters$CountersExceededException: Error: Exceeded limits on number of counters - Counters=120 Limit=120
    at org.apache.hadoop.mapred.Counters$Group.getCounterForName(Counters.java:315)
    at org.apache.hadoop.mapred.Counters.findCounter(Counters.java:449)
    at org.apache.hadoop.mapred.Task$TaskReporter.getCounter(Task.java:559)
    at org.apache.hadoop.mapred.Task$TaskReporter.getCounter(Task.java:506)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.getCounter(TaskInputOutputContext.java:88)
    at org.apache.giraph.counters.HadoopCountersBase.getCounter(HadoopCountersBase.java:60)
    at org.apache.giraph.counters.GiraphTimers.getSuperstepMs(GiraphTimers.java:120)
    at org.apache.giraph.master.MasterThread.run(MasterThread.java:131)
2013-12-30 22:44:21,788 FATAL org.apache.giraph.graph.GraphMapper: uncaughtException: OverrideExceptionHandler on thread org.apache.giraph.master.MasterThread, msg = org.apache.hadoop.mapred.Counters$CountersExceededException: Error: Exceeded limits on number of counters - Counters=120 Limit=120, exiting...
java.lang.IllegalStateException: org.apache.hadoop.mapred.Counters$CountersExceededException: Error: Exceeded limits on number of counters - Counters=120 Limit=120
    at org.apache.giraph.master.MasterThread.run(MasterThread.java:185)
Caused by: org.apache.hadoop.mapred.Counters$CountersExceededException: Error: Exceeded limits on number of counters - Counters=120 Limit=120
    at org.apache.hadoop.mapred.Counters$Group.getCounterForName(Counters.java:315)
    at org.apache.hadoop.mapred.Counters.findCounter(Counters.java:449)
    at org.apache.hadoop.mapred.Task$TaskReporter.getCounter(Task.java:559)
    at org.apache.hadoop.mapred.Task$TaskReporter.getCounter(Task.java:506)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.getCounter(TaskInputOutputContext.java:88)
    at org.apache.giraph.counters.HadoopCountersBase.getCounter(HadoopCountersBase.java:60)
    at org.apache.giraph.counters.GiraphTimers.getSuperstepMs(GiraphTimers.java:120)
    at org.apache.giraph.master.MasterThread.run(MasterThread.java:131)
2013-12-30 22:44:22,510 INFO org.apache.giraph.zk.ZooKeeperManager: run:
Shutdown hook started.
2013-12-30 22:44:22,653 WARN org.apache.giraph.zk.ZooKeeperManager:
onlineZooKeeperServers: Forced a shutdown hook kill of the ZooKeeper
process.
2013-12-30 22:44:25,539 INFO org.apache.giraph.zk.ZooKeeperManager:
onlineZooKeeperServers: ZooKeeper process exited with 143 (note that 143
typically means killed).
2013-12-30 22:44:25,556 INFO org.apache.zookeeper.ClientCnxn: Unable to
read additional data from server sessionid 0x14344720b230000, likely server
has closed socket, closing socket connection and attempting reconnect
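[Editor's note] The trace above shows why the job dies: Giraph registers per-superstep timer counters (see GiraphTimers.getSuperstepMs in the stack), so by superstep 98 the job has exhausted Hadoop's default cap of 120 counters. One common workaround is to raise that cap when submitting the job. A hedged sketch follows; the jar name, runner arguments, and the value 500 are illustrative, and the property name depends on the Hadoop line (the old-API Counters class in this log suggests Hadoop 1.x):

```shell
# Sketch only: raise the MapReduce counter cap for this job.
# Hadoop 1.x property (matches the org.apache.hadoop.mapred API in the log):
#   mapreduce.job.counters.limit   (default 120)
# Hadoop 2.x+ equivalent:
#   mapreduce.job.counters.max     (default 120)
# Jar and class names below are placeholders for the original submit command.
hadoop jar giraph-examples.jar org.apache.giraph.GiraphRunner \
  -Dmapreduce.job.counters.limit=500 \
  MyComputationClass ...   # rest of the original job arguments
```

Setting the same property cluster-wide in mapred-site.xml also works, but a per-job -D override avoids touching cluster config.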

Any ideas?

Jyoti



On Mon, Dec 30, 2013 at 11:42 PM, Pushparaj Motamari
<pushparajxa@gmail.com> wrote:

> Check for errors in the program at runtime; check the logs.
>
>
> On Mon, Dec 30, 2013 at 10:46 PM, Jyoti Yadav <rao.jyoti26yadav@gmail.com> wrote:
>
>> Hi folks,
>> I am trying to execute a graph algorithm on a relatively large graph.
>> I installed Hadoop on a single system and am trying to run the
>> algorithm on Giraph.
>>
>> I got the following error:
>>
>> ERROR org.apache.giraph.graph.GraphTaskManager: run: Worker failure
>> failed on another RuntimeException, original expection will be rethrown
>> java.lang.IllegalStateException: deleteExt: Failed to delete ..
>>
>> Has anyone faced the same error before?
>> Thanks
>> Jyoti
>>
>
>
