hadoop-general mailing list archives

From Wojciech Langiewicz <wlangiew...@gmail.com>
Subject Re: cleaning up 'hadoop.tmp.dir' ?
Date Tue, 09 Nov 2010 14:04:01 GMT
Thanks for the answer, but what about HDFS data?

In the tmp dir I have two more directories: dfs and mapred.
The 'dfs' directory seems to contain blocks from HDFS; is it safe to
delete them?
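
For reference, here is a quick way to see which of those subtrees is
actually holding the space (just a sketch; /tmp/hadoop stands in for
whatever hadoop.tmp.dir points to on your nodes):

    # summarize disk usage per subdirectory of hadoop.tmp.dir
    du -sh /tmp/hadoop/*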

On 09.11.2010 14:46, Segel, Mike wrote:
> Assuming you didn't change anything with the job history and you don't
> have jobs that run for more than 24 hours...
> You should be able to remove anything that hasn't been touched for the
> past 24 hours.
>
> So you could create a cron job to remove the files.
>
> HTH
>
> -Mike
>
>
> -----Original Message-----
> From: Wojciech Langiewicz [mailto:wlangiewicz@gmail.com]
> Sent: Tuesday, November 09, 2010 6:39 AM
> To: general@hadoop.apache.org
> Subject: cleaning up 'hadoop.tmp.dir' ?
>
> Hello,
> I have a problem with my tmp dir. Right now it exceeds 4 TB on each of
> my 4 machines, so 16 TB in total.
>
> I'm running out of space on the servers, so I would like to know if it
> is safe to clean this directory completely, or if there is a
> configuration option that would do this for me.
>
> Any help will be appreciated.
>
> --
> Wojciech Langiewicz
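
A minimal sketch of the cron-driven cleanup Mike suggests above (purely
illustrative: it assumes hadoop.tmp.dir is /tmp/hadoop, the script name
is made up, and it prunes only the mapred/ tree, leaving dfs/ alone
because those blocks belong to HDFS):

    # /etc/cron.daily/hadoop-tmp-cleanup  (hypothetical script name)
    # Remove MapReduce temp files under hadoop.tmp.dir that have not been
    # modified for more than 24 hours (1440 minutes); dfs/ is not touched.
    find /tmp/hadoop/mapred -type f -mmin +1440 -delete
    # Clean up directories that are now empty
    find /tmp/hadoop/mapred -mindepth 1 -type d -empty -delete

The dfs/ directory is deliberately excluded here because, as noted above,
it appears to hold HDFS block data rather than transient job files.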

