hadoop-mapreduce-user mailing list archives

From Binglin Chang <decst...@gmail.com>
Subject Re: Tools for extracting data from hadoop logs
Date Tue, 30 Oct 2012 03:24:13 GMT

I think you want to analyze the Hadoop job logs in the JobTracker history folder?
These logs already sit in a centralized folder, so you don't need tools like
Flume or Scribe to gather them.
I once wrote a simple Python script to parse those log files and generate
CSV/JSON reports. Basically you can use it to get the execution time,
counters, and status of jobs, tasks, and attempts, and you can modify it to
meet your needs.
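To give an idea of what such a script looks like, here is a minimal sketch of a parser for the pre-YARN plain-text JobTracker history format, where each line looks like `Task TASKID="task_..." TASK_TYPE="MAP" START_TIME="1351..." .`. The event names and attribute keys (`Task`, `TASKID`, `START_TIME`, `FINISH_TIME`) are assumptions based on that old format; check a sample file from your own Hadoop version and adjust them accordingly.

```python
import re

# One history line is an event name followed by KEY="value" pairs.
KV_RE = re.compile(r'(\w+)="([^"]*)"')

def parse_history_line(line):
    """Split a history line into (event_type, {key: value})."""
    event, _, rest = line.partition(" ")
    return event, dict(KV_RE.findall(rest))

def task_runtimes(lines):
    """Yield (task_id, task_type, runtime_ms) for each finished task.

    Matches a task's START_TIME event with its later FINISH_TIME event
    and emits the difference in milliseconds.
    """
    starts = {}
    for line in lines:
        event, attrs = parse_history_line(line)
        if event != "Task":
            continue
        tid = attrs.get("TASKID")
        if "START_TIME" in attrs:
            starts[tid] = (attrs.get("TASK_TYPE"), int(attrs["START_TIME"]))
        elif "FINISH_TIME" in attrs and tid in starts:
            ttype, start = starts.pop(tid)
            yield tid, ttype, int(attrs["FINISH_TIME"]) - start
```

From the `(task_id, task_type, runtime_ms)` tuples it is then a few more lines with the `csv` or `json` modules to write the report, and the same key/value parsing extends naturally to `MapAttempt`/`ReduceAttempt` events, whose hostname attribute would give the per-node task counts asked about below.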


On Tue, Oct 30, 2012 at 9:48 AM, bharath vissapragada <
bharathvissapragada1990@gmail.com> wrote:

> Hi list,
> Are there any tools for parsing and extracting data from Hadoop's job logs?
> I want to do stuff like:
> 1. Getting the run time of each map/reduce task
> 2. Total map/reduce tasks run on a particular node in that job, and some
> similar stuff
> Any suggestions?
> Thanks
