hadoop-common-dev mailing list archives

From "David Litster (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-4340) "hadoop jar" always returns exit code 0 (success) to the shell when jar throws a fatal exception
Date Fri, 03 Oct 2008 22:43:44 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-4340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

David Litster updated HADOOP-4340:
----------------------------------

    Description: 
Running "hadoop jar" always returns 0 (success) when the jar dies with a stack trace.  As
an example, run these commands:

/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/hadoop-0.18.1-examples.jar pi 10 10 2>&1; echo $?
exits with 0

/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/hadoop-0.18.1-examples.jar pi 2>&1; echo $?
exits with 255

/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/hadoop-0.18.1-examples.jar 2>&1; echo $?
exits with 0 

This seems to be expected behavior.  However, running:

/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/hadoop-0.18.1-examples.jar pi 10 badparam 2>&1; echo $?
java.lang.NumberFormatException: For input string: "badparam"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Long.parseLong(Long.java:403)
        at java.lang.Long.parseLong(Long.java:461)
        at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:241)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:252)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
exits with 0.

In my opinion, if a jar throws an exception that kills the program being run, and the developer doesn't catch the exception and exit with a sane exit code, Hadoop should at least exit with a non-zero exit code.
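In the meantime, a job driver can guarantee this on its own by catching everything itself and translating failure into an explicit exit status. The sketch below is a hypothetical illustration with invented names (ExitCodeDriver, runAndReportStatus), not Hadoop code:

```java
// Hypothetical driver sketch (not Hadoop code): turn any uncaught
// failure in the job body into a non-zero process exit status.
class ExitCodeDriver {

    // Runs the job body and reports 0 on success, 1 if it throws.
    public static int runAndReportStatus(Runnable job) {
        try {
            job.run();
            return 0;
        } catch (Throwable t) {
            t.printStackTrace();
            return 1;  // signal failure to the calling shell
        }
    }
}
```

A real main would end with System.exit(runAndReportStatus(job)), so "echo $?" sees 1 whenever the job body dies with an unhandled exception.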

As another example, when running a main class that explicitly exits with code 201, Hadoop preserves that exit code:

  public static void main(String[] args) throws Exception {
    System.exit(201);
  }

But when the main class deliberately triggers a null pointer exception, Hadoop exits with 0.

  public static void main(String[] args) throws Exception {
    Object o = null;
    o.toString();       // throws NullPointerException; the exit below is never reached
    System.exit(201);
  }
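The stack trace above hints at why this can happen: RunJar reaches the example's main through reflection (note the Method.invoke frames), so an unhandled NullPointerException comes back to the wrapper as an InvocationTargetException. If the wrapper only prints that exception and returns normally, the JVM finishes with status 0. The sketch below reproduces that swallowing behaviour with invented names (ReflectiveSwallowDemo, BadJob, invokeAndReport); it illustrates the suspected mechanism, not Hadoop's actual code:

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

class ReflectiveSwallowDemo {

    // Stand-in for a jar whose main dies with an unhandled NPE,
    // mirroring the example above.
    public static class BadJob {
        public static void main(String[] args) {
            Object o = null;
            o.toString();  // NullPointerException; exit code never set
        }
    }

    // Invokes jobClass.main(new String[0]) the way a RunJar-style
    // wrapper might, and returns the status it would report.
    public static int invokeAndReport(Class<?> jobClass) {
        try {
            Method main = jobClass.getMethod("main", String[].class);
            main.invoke(null, (Object) new String[0]);
            return 0;
        } catch (InvocationTargetException e) {
            // The job's NPE arrives wrapped here. Printing it and
            // falling through is exactly the swallowing described in
            // this report: the process still finishes with status 0.
            e.getCause().printStackTrace();
            return 0;
        } catch (ReflectiveOperationException e) {
            return 1;  // couldn't even find or call main
        }
    }
}
```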

This behaviour makes it very difficult, if not impossible, to use Hadoop programmatically with tools such as HOD or non-Java data processing frameworks: if a jar crashes with an unhandled exception, Hadoop doesn't inform the calling program in a well-behaved way (polling stderr for output is not a reliable way to detect application failure).

I'm not a Java programmer, so I don't know what the best way to signal failure would be.

Please let me know what other information I can include about my setup.

Thanks.





    Environment: Ubuntu 8.04 Server, 7 Hadoop nodes, GNU bash, version 3.2.39(1)-release (i486-pc-linux-gnu)
 (was: Ubuntu 8.04 Server, 7 Hadoop nodes,  )

> "hadoop jar" always returns exit code 0 (success) to the shell when jar throws a fatal exception
> ------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-4340
>                 URL: https://issues.apache.org/jira/browse/HADOOP-4340
>             Project: Hadoop Core
>          Issue Type: Bug
>    Affects Versions: 0.18.1
>         Environment: Ubuntu 8.04 Server, 7 Hadoop nodes, GNU bash, version 3.2.39(1)-release (i486-pc-linux-gnu)
>            Reporter: David Litster
>

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

