hadoop-mapreduce-issues mailing list archives

From "Hemanth Yamijala (JIRA)" <j...@apache.org>
Subject [jira] Commented: (MAPREDUCE-1086) hadoop commands in streaming tasks are trying to write to tasktracker's log
Date Fri, 16 Oct 2009 06:20:31 GMT

    [ https://issues.apache.org/jira/browse/MAPREDUCE-1086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12766436#action_12766436 ]

Hemanth Yamijala commented on MAPREDUCE-1086:
---------------------------------------------

Seems fine. Though not really related to this patch, the place where the localFS is being
initialized could avoid catching and ignoring the IOException, since the code will die with
a NullPointerException later anyway. As this is indeed an exceptional condition, I would
rather we throw the IOException. Looking at the code paths, the IOException actually seems
to be handled better in the child tasks than a NullPointerException is.
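
As a minimal sketch of this suggestion (the class and method names here are placeholders for
illustration, not the actual TaskTracker code):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

class LocalFsInitSketch {

  // Current pattern (illustrative): the IOException is swallowed, localFs may
  // stay null, and callers later fail with a NullPointerException.
  static FileSystem initLocalFsSwallowing(Configuration conf) {
    FileSystem localFs = null;
    try {
      localFs = FileSystem.getLocal(conf);
    } catch (IOException ignored) {
      // ignored here; dereferencing the null localFs fails much later
    }
    return localFs;
  }

  // Suggested pattern: surface the exceptional condition to the caller.
  static FileSystem initLocalFsThrowing(Configuration conf) throws IOException {
    return FileSystem.getLocal(conf);
  }
}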

Also, can you update the documentation of the getVMEnvironment API to include the new
parameters?
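
Something along these lines would do; note that every name below is a placeholder, since
this comment does not list the actual new parameters:

abstract class VMEnvironmentDocSketch {
  /**
   * Builds the environment exported to the task JVM (and inherited by any
   * processes it spawns, e.g. streaming children).
   *
   * @param existingParam placeholder for a parameter that is already documented
   * @param newParamOne   placeholder: describe the first newly added parameter
   * @param newParamTwo   placeholder: describe the second newly added parameter
   */
  abstract void getVMEnvironmentDocExample(Object existingParam,
                                           Object newParamOne,
                                           Object newParamTwo);
}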

Other than these changes, +1. Please make the new patch with these changes and mark it
Patch Available.

> hadoop commands in streaming tasks are trying to write to tasktracker's log
> ---------------------------------------------------------------------------
>
>                 Key: MAPREDUCE-1086
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-1086
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: tasktracker
>    Affects Versions: 0.20.1, 0.21.0, 0.22.0
>            Reporter: Ravi Gummadi
>            Assignee: Ravi Gummadi
>             Fix For: 0.20.2
>
>         Attachments: MR-1086.patch, MR-1086.v1.patch
>
>
> As HADOOP_ROOT_LOGGER is not set in the environment by the TaskTracker for its children,
> the children of the task JVM (in the case of streaming) try to write to the TaskTracker's
> log and hit the exception below. The jobs still succeed, but the issue should be resolved
> by having the TaskTracker set the environment variables for use by the children of the
> task JVM in the case of a streaming job (a minimal sketch of this idea follows the stack
> trace below).
> When streaming calls hadoop commands, they try to write to the TaskTracker's log file:
> log4j:ERROR setFile(null,true) call failed.
> java.io.FileNotFoundException: /a/b/tasktracker.log (Permission denied)
>         at java.io.FileOutputStream.openAppend(Native Method)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
>         at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
>         at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
>         at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
>         at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:97)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:689)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:138)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:57)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
> log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].
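
For reference, a minimal sketch of the kind of fix described above (the map name "env" and
the value "INFO,console" are assumptions for illustration, not the contents of the attached
patches):

import java.util.HashMap;
import java.util.Map;

class ChildEnvSketch {

  // Export HADOOP_ROOT_LOGGER into the task JVM's environment so that hadoop
  // commands launched by streaming tasks log to the console instead of trying
  // to append to the TaskTracker's own log file.
  static void addLoggerToChildEnv(Map<String, String> env) {
    env.put("HADOOP_ROOT_LOGGER", "INFO,console");
  }

  public static void main(String[] args) {
    Map<String, String> env = new HashMap<String, String>();
    addLoggerToChildEnv(env);
    System.out.println(env); // prints {HADOOP_ROOT_LOGGER=INFO,console}
  }
}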

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

