hadoop-mapreduce-user mailing list archives

From Jason Venner <jason.had...@gmail.com>
Subject Re: Why doesnt my mapper speak to me :(
Date Tue, 17 Nov 2009 06:27:35 GMT
Your log messages to stdout, stderr, and syslog will end up in the
logs/userlogs directory of the tasktracker that ran the task.

If the job is still visible via the web UI of the jobtracker host (usually
port 50030), you can select the individual tasks that were run for your job,
and if you click through enough screens you will find a link to the per-task
log data on the far right side of the screen.
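To illustrate where each kind of output ends up, here is a minimal sketch of a mapper that logs both ways (the class name and messages are made up; Hadoop's "new" `org.apache.hadoop.mapreduce` API is assumed). Anything written to System.out/System.err from inside a task lands in the stdout/stderr files under logs/userlogs on the node that ran the attempt, not on the machine that submitted the job, and commons-logging/log4j output goes to the per-task syslog file.

```java
import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Illustrative class name; not from the original thread.
public class DebugMapper extends Mapper<LongWritable, Text, Text, Text> {

  // Hadoop itself logs through commons-logging backed by log4j,
  // so this output is captured in the task's "syslog" log file.
  private static final Log LOG = LogFactory.getLog(DebugMapper.class);

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Goes to the per-task stderr file under logs/userlogs/<attempt-id>/
    System.err.println("map() called with key " + key);

    // Goes to the per-task syslog file under the same directory
    LOG.info("map() processing value of length " + value.getLength());

    context.write(new Text("debug"), value);
  }
}
```

Note that a log4j logger obtained directly via `Logger.getLogger(...)` only produces output if the task JVM's log4j configuration has an appender for it; routing through commons-logging as above uses the appender Hadoop already configures.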

On Sat, Nov 14, 2009 at 12:03 PM, Max Heimel <mheimel@googlemail.com> wrote:

> Hey all,
> I am currently trying to write a simple map job that dumps a file to
> HDFS. I see that hadoop creates 3 instances of my mapper class (I
> expected 3, so this is fine :) but then things get a bit confusing for
> me...
> First the files don't show up in the file system, which is probably
> due to some misconfiguration ;) So I tried to let the mapper print
> some debug messages so I see where it actually fails. But: I just
> don't seem to be able to get any debug messages out of my mapper.
> I tried printing to System.out/System.err and using
> (org.apache.log4j.)Logger.getLogger(MyMapper.class), but none of my
> debug output ever appears in any of the logs created by hadoop: the
> logged stdout/stderr files for the mappers are just empty. The log4j
> log file isn't even created :(
> Any ideas what I am doing wrong here?
> Thanks
> Max
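For the file-dumping part of the question, a rough sketch of writing a side file to HDFS from inside a task might look like the following (the helper class, path, and method names are all hypothetical, not from the thread). The key point is that `FileSystem.get(conf)` resolves the filesystem from the job's `fs.default.name`, so the file lands in HDFS rather than on the task node's local disk, which is a common reason dumped files seem to vanish.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper for illustration only.
public class HdfsDumper {

  public static void dump(Configuration conf, String name, byte[] data)
      throws IOException {
    // Resolves to HDFS when fs.default.name points at the namenode;
    // with the default local configuration this writes to local disk instead.
    FileSystem fs = FileSystem.get(conf);

    // Illustrative output location.
    Path out = new Path("/tmp/debug/" + name);

    // Second argument: overwrite an existing file of the same name.
    FSDataOutputStream stream = fs.create(out, true);
    try {
      stream.write(data);
    } finally {
      stream.close();
    }
  }
}
```

Inside a mapper this would be called with the job's configuration, e.g. `HdfsDumper.dump(context.getConfiguration(), "part-debug", bytes)`; if each of the three mapper instances writes to the same path, they will overwrite one another, so a per-task filename is usually needed.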

Pro Hadoop, a book to guide you from beginner to hadoop mastery,
www.prohadoopbook.com a community for Hadoop Professionals
