hadoop-mapreduce-user mailing list archives

From Joris Poort <gpo...@gmail.com>
Subject Execution directory for child process within mapper
Date Mon, 26 Sep 2011 17:50:37 GMT
As part of my Java mapper I have a command that executes some standalone
code on a local slave node. When I run the code it executes fine, unless
it tries to access some local files, in which case I get an error
saying it cannot locate those files.

Digging a little deeper it seems to be executing from the following directory:

    /data/hadoop/mapred/local/taskTracker/{user}/jobcache/job_201109261253_0023/attempt_201109261253_0023_m_000001_0/work

But I am intending to execute from a local directory where the
relevant files are located:

    /home/users/{user}/input/jobname

Is there a way in Java/Hadoop to force execution from the local
directory, instead of the jobcache directory that Hadoop creates
automatically?
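
What I'm hoping for is something along the lines of setting the child
process's working directory explicitly before launching it, e.g. via
ProcessBuilder (just a sketch; the path and command here are
hypothetical placeholders, not my actual job):

```java
import java.io.File;

public class LaunchFromDir {

    // Run an external command with its working directory forced to `dir`,
    // instead of inheriting the task attempt's jobcache work directory.
    public static String runIn(String dir, String... command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.directory(new File(dir));       // set the child's working directory
        pb.redirectErrorStream(true);      // merge stderr into stdout
        Process p = pb.start();
        String output = new String(p.getInputStream().readAllBytes()).trim();
        p.waitFor();
        return output;
    }

    public static void main(String[] args) throws Exception {
        // The child prints its own working directory, confirming it ran
        // from the directory we set rather than the task's cwd.
        System.out.println(runIn("/tmp", "pwd"));
    }
}
```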

Is there perhaps a better way to go about this?

Any help on this would be greatly appreciated!

Cheers,

Joris
