hadoop-common-user mailing list archives

From praveenesh kumar <praveen...@gmail.com>
Subject Killing hadoop jobs automatically
Date Mon, 30 Jan 2012 07:06:15 GMT
Is there any way to automatically kill Hadoop jobs that are taking
too long to execute?

What I want to achieve is: if a job has been running for longer than
"_some_predefined_timeout_limit_", it should be killed automatically.

Is it possible to achieve this through shell scripts, or some other way?
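One possible approach is a watchdog script run from cron: poll `hadoop job -list` and kill any job whose start time is older than the timeout. The sketch below is illustrative, not a tested production script; it assumes the classic `hadoop job -list` output, where each running-job line begins with an id like `job_201201300001_0001` and the third whitespace-separated field is the job's start time in milliseconds since the epoch. The `TIMEOUT` value and the `job_expired` helper are my own names, not anything built into Hadoop. (Note that `mapred.task.timeout` is not a substitute: it kills individual tasks that stop reporting progress, and does not cap total job runtime.)

```shell
#!/bin/bash
# Watchdog sketch: kill Hadoop jobs running longer than TIMEOUT seconds.
# Assumption: `hadoop job -list` prints running-job lines starting with
# "job_", with the start time (ms since epoch) in the third column.

TIMEOUT=${TIMEOUT:-3600}   # predefined timeout in seconds (example: 1 hour)

# Succeed (exit 0) when a job started at $1 (milliseconds since the epoch)
# has been running for more than $2 seconds.
job_expired() {
    local start_ms=$1 limit=$2
    local age=$(( $(date +%s) - start_ms / 1000 ))
    [ "$age" -gt "$limit" ]
}

# Extract "<jobid> <start_ms>" pairs and kill the ones past the limit.
hadoop job -list 2>/dev/null | awk '/^job_/ {print $1, $3}' |
while read -r jobid start_ms; do
    if job_expired "$start_ms" "$TIMEOUT"; then
        echo "killing $jobid (exceeded ${TIMEOUT}s)"
        hadoop job -kill "$jobid"
    fi
done
```

Scheduling it every few minutes via cron (e.g. `*/5 * * * * /path/to/watchdog.sh`) would give the automatic-kill behavior; the kill granularity is then the cron interval, not the exact timeout.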

