hadoop-mapreduce-user mailing list archives

From Yang <teddyyyy...@gmail.com>
Subject Re: why does hadoop creates /tmp/hadoop-user/hadoop-unjar-xxxx/ dir and unjar my fat jar?
Date Sat, 25 Oct 2014 09:47:19 GMT
I thought this might be because hadoop wants to pack everything
(including the -files dfs cache files) into one single jar, so I removed
the -files options I had.

But it still extracts the jar. This is rather confusing.

On Fri, Oct 24, 2014 at 11:51 AM, Yang <teddyyyy123@gmail.com> wrote:

> I just noticed that when I run "hadoop jar
> my-fat-jar-with-all-dependencies.jar", it unjars the job jar into
> /tmp/hadoop-username/hadoop-unjar-xxxx/ and extracts all the classes
> there.
> The fat jar is pretty big, so the extraction took up a lot of space
> (particularly inodes) and I ran out of quota.
> I wonder why we have to unjar these classes on the **client node**?
> The jar won't even be accessed until it reaches the compute nodes, right?
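For anyone hitting the same quota problem: the client-side extraction goes
into `${java.io.tmpdir}/hadoop-unjar*`, so one workaround is to point
`java.io.tmpdir` at a filesystem with more free space and inodes. This is
only a sketch, assuming your Hadoop version honors `HADOOP_CLIENT_OPTS`
(2.x and later do); `/data/tmp` is a hypothetical path, substitute your own.

```shell
# Redirect the client-side unjar directory (assumption: the Hadoop client
# reads HADOOP_CLIENT_OPTS; /data/tmp is a placeholder path).
mkdir -p /data/tmp
export HADOOP_CLIENT_OPTS="-Djava.io.tmpdir=/data/tmp $HADOOP_CLIENT_OPTS"
hadoop jar my-fat-jar-with-all-dependencies.jar
```

Note this only relocates the extraction; it does not prevent it.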
