hadoop-common-user mailing list archives

From Prashant Sharma <prashant.ii...@gmail.com>
Subject Re: How to delete files older than X days in HDFS/Hadoop
Date Sat, 26 Nov 2011 16:35:00 GMT
It won't be that easy, but it's possible to script. I did something like this:

$HADOOP_HOME/bin/hadoop fs -rmr `$HADOOP_HOME/bin/hadoop fs -ls | grep '.*2011.11.1[1-8].*' | cut -f 19 -d \ `

Note the space after the backslash in -d \<SPACE>: it escapes a literal space as the delimiter for cut.
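A variant of the same idea, sketched here as an assumption rather than something from the thread: instead of grepping for a date embedded in the file name, filter on the modification-date column of the `hadoop fs -ls` output (field 6 in the classic 8-column listing), which makes the "older than X days" condition explicit. The helper name `select_older` is invented for illustration.

```shell
# Hypothetical helper, not from the original mail: read `hadoop fs -ls`
# output on stdin and print the path (field 8) of every entry whose
# modification date (field 6, YYYY-MM-DD) sorts before the cutoff.
# ISO dates compare correctly as strings, so plain awk is enough; the
# NF check skips the "Found N items" header line.
select_older() {
  awk -v cutoff="$1" 'NF >= 8 && $6 < cutoff { print $8 }'
}

# Usage sketch (assumes GNU date for the relative cutoff):
#   $HADOOP_HOME/bin/hadoop fs -ls /datafolder \
#     | select_older "$(date -d '7 days ago' +%F)" \
#     | xargs -r $HADOOP_HOME/bin/hadoop fs -rmr
```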

-P

On Sat, Nov 26, 2011 at 8:46 PM, Uma Maheswara Rao G
<maheswara@huawei.com>wrote:

> AFAIK, there is no facility like this in HDFS through the command line.
> One option is to write a small client program that collects the files from
> the root based on your condition and invokes delete on them.
>
> Regards,
> Uma
> ________________________________________
> From: Raimon Bosch [raimon.bosch@gmail.com]
> Sent: Saturday, November 26, 2011 8:31 PM
> To: common-user@hadoop.apache.org
> Subject: How to delete files older than X days in HDFS/Hadoop
>
> Hi,
>
> I'm wondering how to delete files older than X days with HDFS/Hadoop. On
> Linux we can do it with the following command:
>
> find ~/datafolder/* -mtime +7 -exec rm {} \;
>
> Any ideas?
>

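The "small client program" Uma suggests could also be approximated in the shell, closer in spirit to the `find -mtime +7` one-liner. The sketch below is an assumption, not anything posted in the thread: the helper `is_older_than` and the driver loop are invented names, and GNU date is assumed for parsing timestamps.

```shell
# Hypothetical sketch, not from the thread: emulate `find -mtime +7`
# against HDFS by comparing each entry's modification timestamp with a
# cutoff computed by GNU date.
is_older_than() {
  # is_older_than DAYS "YYYY-MM-DD HH:MM" -> success if the timestamp
  # is more than DAYS days in the past.
  local cutoff ts
  cutoff=$(date -d "$1 days ago" +%s)
  ts=$(date -d "$2" +%s)
  [ "$ts" -lt "$cutoff" ]
}

# Driver sketch (assumes the classic 8-column `hadoop fs -ls` output):
#   $HADOOP_HOME/bin/hadoop fs -ls /datafolder | \
#   while read -r perm rep owner group size d t path; do
#     [ -n "$path" ] && is_older_than 7 "$d $t" \
#       && $HADOOP_HOME/bin/hadoop fs -rmr "$path"
#   done
```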