hadoop-mapreduce-issues mailing list archives

From "Hemanth Yamijala (JIRA)" <j...@apache.org>
Subject [jira] Updated: (MAPREDUCE-1086) hadoop commands in streaming tasks are trying to write to tasktracker's log
Date Fri, 16 Oct 2009 10:17:31 GMT

     [ https://issues.apache.org/jira/browse/MAPREDUCE-1086?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hemanth Yamijala updated MAPREDUCE-1086:
----------------------------------------

    Status: Open  (was: Patch Available)

Canceling the patch to fix the test failure. The test passes locally on my machine as well, but
I suspect the reason it is failing on Hudson *may* be that the streaming commands
are written without specifying a shell. In any case, it would be useful to add some log statements,
at a minimum to print the output of the streaming job.
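For what it's worth, a minimal sketch of running a command through an explicit shell and capturing its output for logging (the command string here is a stand-in, not the actual test's command):

```java
public class ShellExec {
    public static void main(String[] args) throws Exception {
        // Stand-in for the streaming command; the real test writes its own.
        String command = "echo streaming-ok";
        // Launch through an explicit /bin/sh rather than relying on how the
        // platform happens to interpret the command file.
        Process p = new ProcessBuilder("/bin/sh", "-c", command)
                .redirectErrorStream(true) // fold stderr into stdout for logging
                .start();
        String out = new String(p.getInputStream().readAllBytes());
        int rc = p.waitFor();
        // Print the job's output, as suggested above.
        System.out.print("exit=" + rc + " output=" + out);
    }
}
```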

Another issue could be that we are using getBytes() to convert the command String to bytes
before writing it to a file. Why not use the more standard BufferedWriter?
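A sketch of the BufferedWriter variant (the command and file name are illustrative, not taken from the patch):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WriteCommand {
    public static void main(String[] args) throws IOException {
        String command = "cat";                       // stand-in streaming command
        Path script = Path.of("streaming-cmd.sh");    // illustrative file name
        // BufferedWriter performs the String-to-bytes conversion itself,
        // instead of a manual getBytes() pushed through an OutputStream.
        try (BufferedWriter out = new BufferedWriter(new FileWriter(script.toFile()))) {
            out.write(command);
            out.newLine();
        }
        System.out.print(Files.readString(script));
        Files.delete(script);
    }
}
```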

> hadoop commands in streaming tasks are trying to write to tasktracker's log
> ---------------------------------------------------------------------------
>
>                 Key: MAPREDUCE-1086
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-1086
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: tasktracker
>    Affects Versions: 0.20.1, 0.21.0, 0.22.0
>            Reporter: Ravi Gummadi
>            Assignee: Ravi Gummadi
>             Fix For: 0.20.2
>
>         Attachments: MR-1086.patch, MR-1086.v1.1.patch, MR-1086.v1.patch
>
>
> Since HADOOP_ROOT_LOGGER is not set in the environment by the TaskTracker (TT) for its
children, the children of the task JVM (in the case of streaming) try to write to the TT's
log and hit the following exception. The jobs succeed anyway, but the issue should be resolved
by having the TT set the environment variables for use by the children of the task JVM in the
case of a streaming job.
> When streaming calls hadoop commands, it's trying to write to TaskTracker log file.
> log4j:ERROR setFile(null,true) call failed.
> java.io.FileNotFoundException:
> /a/b/tasktracker.log (Permission denied)
>         at java.io.FileOutputStream.openAppend(Native Method)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
>         at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
>         at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
>         at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
>         at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
>         at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
>         at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:97)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:689)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:138)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:57)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
> log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].
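A rough sketch of the direction the description suggests: have the parent set HADOOP_ROOT_LOGGER in the child's environment before launching it. The command and the value "INFO,TLA" are assumptions for illustration; the actual appender name depends on the task-side log4j configuration.

```java
import java.util.Map;

public class ChildEnv {
    public static void main(String[] args) {
        // Hypothetical child launch; the command itself is a placeholder.
        ProcessBuilder pb = new ProcessBuilder("hadoop", "fs", "-ls");
        Map<String, String> env = pb.environment();
        // Point the child's log4j root at a task-scoped appender instead of
        // letting it fall back to the TaskTracker's DRFA log file.
        env.put("HADOOP_ROOT_LOGGER", "INFO,TLA"); // appender name assumed
        System.out.println(env.get("HADOOP_ROOT_LOGGER"));
        // (the process is deliberately not started here)
    }
}
```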

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

