mahout-user mailing list archives

From Vimal Mathew <>
Subject Re: new to hadoop
Date Tue, 04 May 2010 14:35:26 GMT
"kill -QUIT" (SIGQUIT) makes the JVM dump a stack trace for every thread
to its stdout (which Hadoop usually redirects to a log file). You can also try

jstack [java process ID]

to read the stack trace directly.

You can use the "jps" command to list Java processes running on a system.
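Putting the two together, a minimal sketch of the "dump a few times and compare" approach (assumptions: a JDK's jps/jstack are on PATH, and the stuck per-task JVM shows up as "Child" in jps output, as 0.20-era Hadoop task JVMs do -- adjust the pattern for your setup):

```shell
#!/bin/sh
# Sketch: dump a suspect JVM's thread stacks a few times, seconds apart.
# A thread sitting at the same frame in every dump suggests an infinite loop.
# Assumption: the stuck task JVM appears as "Child" in jps output.
if command -v jps >/dev/null 2>&1; then
  pid=$(jps 2>/dev/null | awk '/Child/ {print $1; exit}')
  if [ -n "$pid" ]; then
    for i in 1 2 3; do
      jstack "$pid" > "stack-$pid-$i.txt"   # one dump per file
      sleep 5
    done
    # then eyeball or diff the dumps for a thread that never moves
  fi
fi
result=done
```

Diffing the dump files (e.g. with `diff stack-*-1.txt stack-*-2.txt`) makes a stuck thread stand out quickly, since everything else will have moved.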

On Mon, May 3, 2010 at 7:26 PM, Sean Owen <> wrote:
> I think the infinite loop theory is good.
> As a crude way to debug, you can log on to a worker machine, locate
> the java process that may be stuck, and:
> kill -QUIT [java process ID]
> This just makes it dump its stack for each thread. Do that a few times
> and you may easily spot an infinite loop situation because it will
> just be in the same place over and over.
> On Tue, May 4, 2010 at 12:15 AM, Tamas Jambor <> wrote:
>> It should be OK, because the hosts are on a local network, properly set up
>> by the IT support.
>> I guess the conf files should be OK too, because it runs the first two jobs
>> without a problem and only fails on the third. And it runs other Hadoop
>> examples.
>> I will look into how to debug a Hadoop project; maybe I can track down the
>> problem that way.
