hadoop-common-user mailing list archives

From Aaron Kimball <aa...@cloudera.com>
Subject Re: unable to see anything in stdout
Date Thu, 30 Apr 2009 14:23:06 GMT
The first thing I would do is run the job with the local job runner (as a single
process on your local machine, without involving the cluster):

JobConf conf = .....
// set other params, mapper, etc. here
conf.set("mapred.job.tracker", "local"); // use localjobrunner
conf.set("fs.default.name", "file:///"); // read from local hard disk
instead of hdfs

JobClient.runJob(conf);


Running locally prints stdout, stderr, etc. directly to your terminal. Try it
on a single input file; that will let you confirm that the job does, in fact,
write to stdout.
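
If it helps, here is a minimal, self-contained sketch along the same lines. The
class names StdoutDebugJob and EchoMapper are just placeholders, and it assumes
the old org.apache.hadoop.mapred API used in the snippet above:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class StdoutDebugJob {

  // Placeholder mapper that prints every record it sees to stdout.
  public static class EchoMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, LongWritable, Text> {
    public void map(LongWritable key, Text value,
        OutputCollector<LongWritable, Text> output, Reporter reporter)
        throws IOException {
      System.out.println("map saw: " + value); // shows up in the local terminal
      output.collect(key, value);
    }
  }

  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf(StdoutDebugJob.class);
    conf.setJobName("stdout-debug");
    conf.setMapperClass(EchoMapper.class);
    conf.setNumReduceTasks(0);               // map-only keeps the test simple
    conf.setOutputKeyClass(LongWritable.class);
    conf.setOutputValueClass(Text.class);
    conf.set("mapred.job.tracker", "local"); // use the local job runner
    conf.set("fs.default.name", "file:///"); // read from the local disk instead of HDFS
    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));
    JobClient.runJob(conf);
  }
}

Run it with something like "hadoop jar yourjob.jar StdoutDebugJob
/some/local/input/file /tmp/debug-out" (the jar name and paths are only
examples) and the println output should appear right in the same terminal.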

- Aaron

On Thu, Apr 30, 2009 at 9:00 AM, Asim <linkasim@gmail.com> wrote:

> Hi,
>
> I am not able to see any job output in userlogs/<task_id>/stdout. It
> remains empty even though I have many println statements. Are there
> any steps to debug this problem?
>
> Regards,
> Asim
>
