hadoop-mapreduce-user mailing list archives

From Himawan Mahardianto <mahardia...@ugm.ac.id>
Subject Question about Block size configuration
Date Tue, 12 May 2015 03:08:12 GMT
Hi guys, I have a couple of questions about HDFS block size:

What will happen if I set my HDFS block size down from the default 64 MB to 2 MB per block?
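For context, this is how I would set it (a sketch; I'm assuming the `dfs.blocksize` property name from Hadoop 2.x, where the older `dfs.block.size` is deprecated):

```xml
<!-- hdfs-site.xml: cluster-wide default block size for newly written files -->
<property>
  <name>dfs.blocksize</name>
  <value>2097152</value> <!-- 2 * 1024 * 1024 bytes; size suffixes like 2m also work -->
</property>
```

Or, instead of changing the cluster default, overriding it per file at write time:

```shell
hdfs dfs -D dfs.blocksize=2m -put image.png /images/
```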

I want to decrease the block size because I plan to store image files (jpeg, png, etc.) that are about 4 MB each. What is your opinion or suggestion?

What will happen if I don't change the default block size and then store a 4 MB image file? Will Hadoop use a full 64 MB block, or will it create a 4 MB block instead?

How much RAM is used to store the metadata for each block if my block size is 64 MB, versus 4 MB?

Does anyone have experience with this? Any suggestions are welcome.
Thank you
