hadoop-common-user mailing list archives

From Asim <linka...@gmail.com>
Subject Re: unable to see anything in stdout
Date Sat, 02 May 2009 01:09:29 GMT
Thanks Aaron. That worked! However, when I run everything in local mode, it
executes much faster than it does on the single node. Is there any reason
for this?

-Asim

On Thu, Apr 30, 2009 at 9:23 AM, Aaron Kimball <aaron@cloudera.com> wrote:
> First thing I would do is to run the job in the local jobrunner (as a single
> process on your local machine without involving the cluster):
>
> JobConf conf = .....
> // set other params, mapper, etc. here
> conf.set("mapred.job.tracker", "local"); // use localjobrunner
> conf.set("fs.default.name", "file:///"); // read from local hard disk
> instead of hdfs
>
> JobClient.runJob(conf);
>
>
> This will actually print stdout, stderr, etc. to your local terminal. Try
> this on a single input file. This will let you confirm that it does, in
> fact, write to stdout.
>
> - Aaron
>
> On Thu, Apr 30, 2009 at 9:00 AM, Asim <linkasim@gmail.com> wrote:
>
>> Hi,
>>
>> I am not able to see any job output in userlogs/<task_id>/stdout. It
>> remains empty even though I have many println statements. Are there
>> any steps to debug this problem?
>>
>> Regards,
>> Asim
>>
>
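
For reference, a self-contained sketch of a driver wired up this way for the
local job runner might look like the following. The class name StdoutCheck, the
map-only print mapper, and the command-line input/output paths are illustrative
assumptions, not something from this thread:

// Sketch of a driver that runs entirely in-process with the local job runner,
// so println output from the mapper appears directly in the terminal instead
// of userlogs/<task_id>/stdout. Names and paths here are illustrative.
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class StdoutCheck {

  // Mapper that prints each input line before passing it through unchanged.
  public static class PrintMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, LongWritable, Text> {
    public void map(LongWritable key, Text value,
        OutputCollector<LongWritable, Text> output, Reporter reporter)
        throws IOException {
      System.out.println("map saw: " + value);
      output.collect(key, value);
    }
  }

  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf(StdoutCheck.class);
    conf.setJobName("stdout-check");

    conf.set("mapred.job.tracker", "local"); // use the local job runner
    conf.set("fs.default.name", "file:///"); // read from the local filesystem instead of HDFS

    conf.setMapperClass(PrintMapper.class);
    conf.setNumReduceTasks(0);               // map-only job; map output is written directly
    conf.setOutputKeyClass(LongWritable.class);
    conf.setOutputValueClass(Text.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
  }
}

Run against a small local input, e.g. something like
"hadoop jar myjob.jar StdoutCheck ./input ./output", and the println output
should show up directly in the terminal.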
