hadoop-common-user mailing list archives

From Buyung Bahari <buyung.bah...@detik.com>
Subject Problem with a lot of small jobs.
Date Mon, 06 Jul 2009 08:18:38 GMT
Dear Hadoop experts,

I have a question: I have a lot of small jobs that I submit all in parallel, so they
queue up in Hadoop, and the batch process runs every 15 minutes.

After each 15-minute run I erase all the output and the data, yet after a few
days of running HDFS runs out of space. Why is that? I have tried running the
balancer and erasing the temporary job files, but it doesn't help.
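
To illustrate, the cleanup I run after each batch is roughly equivalent to the
sketch below, using the FileSystem API (the output path here is just an example,
not my real layout):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CleanupBatchOutput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Example output directory produced by one 15-minute batch.
        Path batchOutput = new Path("/user/hadoop/batch-output");

        if (fs.exists(batchOutput)) {
            // Recursive delete; returns false if the path could not be removed.
            boolean deleted = fs.delete(batchOutput, true);
            System.out.println("Deleted " + batchOutput + ": " + deleted);
        }
        fs.close();
    }
}

One thing I am not sure about: if fs.trash.interval is set, would data removed
with the shell's "hadoop fs -rmr" still occupy space under /user/<user>/.Trash
until the trash is expunged?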

My second question: how do I manage the job.xml files in the Hadoop logs directory?
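
For example, would a simple housekeeping job like the sketch below be a
reasonable approach, or is there a built-in setting for this? (The log
directory path, the one-week retention, and the job_*.xml naming pattern are
just assumptions on my side.)

import java.io.File;

public class PruneJobXml {
    public static void main(String[] args) {
        File logDir = new File("/opt/hadoop/logs");          // assumed log location
        long cutoff = System.currentTimeMillis()
                - 7L * 24 * 60 * 60 * 1000;                  // keep one week

        File[] files = logDir.listFiles();
        if (files == null) {
            return;
        }
        for (File f : files) {
            // Assumed naming: per-job configuration copies match job_*.xml.
            if (f.getName().startsWith("job_") && f.getName().endsWith(".xml")
                    && f.lastModified() < cutoff) {
                System.out.println((f.delete() ? "removed " : "could not remove ") + f);
            }
        }
    }
}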

I'm using Hadoop 0.18.3 and Pig for map and reduce.

Thanks for the advice.




Best Regards


Buyung Bahari

