hadoop-hdfs-user mailing list archives

From rkevinbur...@charter.net
Subject Re: M/R Statistics
Date Fri, 26 Apr 2013 22:05:35 GMT

I was able to overcome the permission exception in the log by creating 
an HDFS tmp folder (hadoop fs -mkdir /tmp) and opening it up to the 
world (hadoop fs -chmod a+rwx /tmp). That got rid of the exception, but 
I am still unable to connect to port 50030 to see M/R status. More ideas?
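Before digging further into the logs, it may be worth confirming from the shell whether anything is listening on the JobTracker web port at all. A minimal sketch, assuming bash and that the JobTracker web UI runs on localhost:50030 (adjust host/port for your cluster):

```shell
# Returns exit status 0 if something is listening on host:port,
# non-zero otherwise (uses bash's /dev/tcp pseudo-device).
check_port() {
  local host="$1" port="$2"
  (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null
}

if check_port localhost 50030; then
  echo "JobTracker UI reachable"
else
  echo "JobTracker UI not reachable - check whether the JobTracker process is up"
fi
```

If the port is closed, the problem is the JobTracker process itself rather than the browser or the network.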

Even though the exception was missing from the logs of one server in the 
cluster, I looked on another server and found essentially the same 
permission problem:

2013-04-26 13:34:56,462 FATAL 
org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer: Error starting 
org.apache.hadoop.yarn.YarnException: Error creating done directory: 

. . . . .
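The "Error creating done directory" failure looks like the same class of permission problem that the /tmp fix addressed, so pre-creating the JobHistoryServer's "done" directory by hand may help. A hedged sketch; the path below is only an assumption, so check the actual value of mapreduce.jobhistory.done-dir (and the history staging dir) in your mapred-site.xml and substitute it:

```shell
# Assumed path - replace /mr-history/done with the configured
# mapreduce.jobhistory.done-dir value from mapred-site.xml.
hadoop fs -mkdir /mr-history
hadoop fs -mkdir /mr-history/done
# Opening permissions wide is a blunt diagnostic step, as with /tmp;
# tighten ownership to the history server user once it starts cleanly.
hadoop fs -chmod -R a+rwx /mr-history
```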

On Fri, Apr 26, 2013 at 10:37 AM, Rishi Yadav wrote:

   Do you see "retired jobs" on the job tracker page? There is also "job 
tracker history" at the bottom of the page.

something like this  http://nn.zettabyte.com:50030/jobtracker.jsp 
Thanks and Regards,
Rishi Yadav

On Fri, Apr 26, 2013 at 7:36 AM, <rkevinburton@charter.net> wrote:
When I submit a simple "Hello World" M/R job like WordCount it takes 
less than 5 seconds. The texts show numerous methods for monitoring M/R 
jobs as they are happening, but I have yet to see any that show 
statistics about a job after it has completed. Obviously, simple jobs 
that take a short amount of time don't allow time to fire up any web 
page or monitoring tool to see how the job progresses through the JobTracker 
and TaskTracker, or which node it is processed on. Any 
suggestions on how I could see this kind of data *after* a job has 
completed?
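For jobs that have already finished, the MR1 command-line tools can report counters and task timings after the fact; a minimal sketch, where the output path is a made-up placeholder for your own job's output directory:

```shell
# List all jobs the JobTracker knows about, including completed ones.
hadoop job -list all

# Print a post-mortem summary (counters, per-task timings, failed tasks)
# from the job's history file, which is kept under the job output directory.
# /user/kevin/wordcount-output is a hypothetical path - use your own.
hadoop job -history /user/kevin/wordcount-output
```

This avoids the race against the web UI entirely, since the history file persists after the job ends.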
