hadoop-user mailing list archives

From Stephen Fritz <steph...@cloudera.com>
Subject Re: Metadata size for 1 TB HDFS data?
Date Thu, 20 Dec 2012 14:40:22 GMT
Each block, file, and directory is an object in the namenode's heap, so the
total depends on how you're storing your data.  You may need to account for
all of those objects in your calculation.
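As a rough back-of-envelope sketch (not part of the original thread): if every file, directory, and block costs on the order of 150 bytes of namenode heap, a figure often quoted as a ballpark and assumed here, then the same 1 TB of data can produce very different metadata footprints depending on file layout. The helper below is hypothetical and the per-object cost varies with Hadoop version and object type.

```python
import math

BLOCK_SIZE_MB = 64       # block size from the question
BYTES_PER_OBJECT = 150   # assumed rough per-object heap cost (ballpark only)

def metadata_bytes(num_files, num_dirs, file_size_mb):
    """Estimate namenode heap use: one object per file, directory, and block."""
    blocks = num_files * math.ceil(file_size_mb / BLOCK_SIZE_MB)
    objects = num_files + num_dirs + blocks
    return objects * BYTES_PER_OBJECT

# 1 TB stored as a single large file: ~16K blocks, a few MB of metadata.
one_big_file = metadata_bytes(num_files=1, num_dirs=1, file_size_mb=1024 * 1024)

# 1 TB stored as a million 1 MB files: a million blocks too, ~300 MB of metadata.
many_small_files = metadata_bytes(num_files=1_000_000, num_dirs=1, file_size_mb=1)

print(one_big_file)      # ~2.5 MB
print(many_small_files)  # ~300 MB
```

The two scenarios hold the data volume fixed at 1 TB, which is why the answer to "how big is the metadata?" has to start with how the data is laid out in files.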


On Thu, Dec 20, 2012 at 7:01 AM, Mohammad Tariq <dontariq@gmail.com> wrote:

> Hello group,
>
>         What could be the approx. size of the metadata if I have 1 TB of
> data in my HDFS? I am not doing anything additional, just a simple put.
> Will it be ((1*1024*1024)/64)*200 Bytes?
> *Keeping 64M as the block size.
>
> Is my understanding right? Please correct me if I'm wrong.
>
> Many thanks.
>
> Best Regards,
> Tariq
> +91-9741563634
> https://mtariq.jux.com/
>
