hadoop-hdfs-user mailing list archives

From "Joshi, Rekha" <Rekha_Jo...@intuit.com>
Subject Re: Log aggregation in Yarn
Date Mon, 10 Sep 2012 11:22:12 GMT
Hi Hemanth,

I am still getting my hands dirty with YARN, so this is preliminary: since the HDFS path
in AggregatedLogsBlock points to /tmp/logs, and you say the service is unable to read it,
could you check the permissions on that directory, or change the configuration in
yarn-site.xml and try again?
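For reference, a minimal sketch of the yarn-site.xml entries involved (the /tmp/logs path is the default remote log directory; adjust if your cluster overrides it):

```xml
<!-- Sketch of the relevant yarn-site.xml settings, assuming default paths.
     The directory below must be readable by the user the history server runs as. -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- HDFS directory where aggregated logs land; /tmp/logs is the default -->
  <name>yarn.nodemanager.remote-app-log-dir</name>
  <value>/tmp/logs</value>
</property>
```

You can inspect the permissions on that directory with `hadoop fs -ls /tmp/logs`.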


From: Hemanth Yamijala <yhemanth@thoughtworks.com>
Reply-To: <user@hadoop.apache.org>
Date: Mon, 10 Sep 2012 16:19:04 +0530
To: <user@hadoop.apache.org>
Subject: Log aggregation in Yarn


I enabled log aggregation in YARN and can see that files are getting created under the configured
directory on HDFS. I can access the files via the FS shell. However, when I try to retrieve the
logs via the history server, the request fails with the message:

Logs not available for attempt_1347261424213_0001_r_000000_0. Aggregation may not be complete,
Check back later or try the nodemanager.

This is with trunk. From the code, it looks like this happens because the service is unable to
access the file on HDFS. However, this is a single-node setup, with all processes running as the
same user, and security is off in HDFS. Is anything else wrong? Do I need to configure something
more?

