hadoop-mapreduce-issues mailing list archives

From "Hemanth Yamijala (JIRA)" <j...@apache.org>
Subject [jira] Commented: (MAPREDUCE-1086) hadoop commands in streaming tasks are trying to write to tasktracker's log
Date Thu, 15 Oct 2009 13:38:31 GMT

    [ https://issues.apache.org/jira/browse/MAPREDUCE-1086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12766056#action_12766056 ]

Hemanth Yamijala commented on MAPREDUCE-1086:
---------------------------------------------

Ravi, this looks mostly good.

I have very minor comments:

- I think it makes more sense to set the log parameters in HADOOP_CLIENT_OPTS rather than
HADOOP_OPTS. This makes sure they get appended at the end, which is more reliable. Further,
I think we can assume that the streaming tasks only use the hadoop command line as a client.
(A rough sketch of what I mean follows below this list.)
- To be consistent with what is set in Hadoop's scripts, I would recommend that the value
of HADOOP_CLIENT_OPTS be enclosed in double quotes rather than single quotes.
- We really don't need to convert taskId or the log file size to Strings explicitly, as Java
will do the conversion itself.
- In the test case, I suppose we can be a little stricter in the verification. Can we check
that the env contains 'INFO,TLA', as that's what we are setting? Likewise, we can also verify
that the log file size is set to a known value by setting the corresponding property in the
Configuration.
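
A rough sketch of the direction I mean for the first three points. This is purely
illustrative, not the patch: the class name, the method signature, and the env/conf/taskId
names are my own, and the mapred.userlog.limit.kb and hadoop.tasklog.* names are what I
recall from TaskLog/TaskLogAppender, so please double-check them against the patch.

    import java.util.Map;
    import org.apache.hadoop.conf.Configuration;

    // Illustrative sketch only: the TT-side code that builds the child task's
    // environment could point the hadoop command line (used as a client from a
    // streaming task) at the task log appender instead of the TT's own log file.
    public class ClientOptsSketch {
      static void setLogEnv(Map<String, String> env, Configuration conf, Object taskId) {
        // assumed property name for the per-task log size limit (KB -> bytes)
        long logSize = conf.getLong("mapred.userlog.limit.kb", 0) * 1024;
        // taskId and logSize are concatenated directly; Java does the String conversion
        String clientOpts = "-Dhadoop.root.logger=INFO,TLA"
            + " -Dhadoop.tasklog.taskid=" + taskId
            + " -Dhadoop.tasklog.totalLogFileSize=" + logSize;
        // double quotes around the value, to stay consistent with Hadoop's scripts
        env.put("HADOOP_CLIENT_OPTS", "\"" + clientOpts + "\"");
      }
    }

The test could then read HADOOP_CLIENT_OPTS back from the child's environment and assert that
it contains 'INFO,TLA' and the known log size value set in the Configuration.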

> hadoop commands in streaming tasks are trying to write to tasktracker's log
> ---------------------------------------------------------------------------
>
>                 Key: MAPREDUCE-1086
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-1086
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: tasktracker
>    Affects Versions: 0.20.1, 0.21.0, 0.22.0
>            Reporter: Ravi Gummadi
>            Assignee: Ravi Gummadi
>             Fix For: 0.20.2
>
>         Attachments: MR-1086.patch
>
>
> As HADOOP_ROOT_LOGGER is not set in the environment by the TT for its children, the children
of the task JVM (in the case of streaming) try to write to the TT's log and get the following
exception. The jobs still succeed, but the issue should be resolved by having the TT set the
environment variables for use by the children of the task JVM in the case of a streaming job.
> When streaming calls hadoop commands, they try to write to the TaskTracker's log file.
> log4j:ERROR setFile(null,true) call failed.
> java.io.FileNotFoundException:
> /a/b/tasktracker.log (Permission denied)
>         at java.io.FileOutputStream.openAppend(Native Method)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
>         at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
>         at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
>         at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
>         at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:97)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:689)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:138)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:57)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
> log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

