hadoop-common-user mailing list archives

From Steve Loughran <ste...@apache.org>
Subject Re: NN memory consumption on 0.20/0.21 with compressed pointers/
Date Mon, 24 Aug 2009 10:42:26 GMT
Raghu Angadi wrote:
> 
> Suresh had made a spreadsheet for memory consumption.. will check.
> 
> A large portion of NN memory is taken up by references. I would expect 
> the memory savings to be very substantial (the same as going from 64-bit 
> to 32-bit), possibly on the order of 40%.
> 
> The last I heard from Sun was that compressed pointers will be in a very 
> near-future JVM (certainly JDK 1.6_x). It can use compressed pointers 
> for heaps of up to 32GB.

It's in JDK 1.6u14. Looking at the source and reading the specs implies 
there are savings, but we need to experiment to see how much. I now know 
how to do sizeof() in Java (via the Instrumentation API), so these 
experiments are possible.
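
For anyone who wants to repeat the experiment, here is a minimal sketch of 
the sizeof() approach using a java.lang.instrument premain agent. The class 
and method names are mine, not anything in Hadoop:

// SizeOfAgent.java -- minimal instrumentation agent for measuring
// shallow object sizes; names are illustrative only.
import java.lang.instrument.Instrumentation;

public class SizeOfAgent {
    private static volatile Instrumentation inst;

    // Called by the JVM before main() when loaded via -javaagent.
    public static void premain(String args, Instrumentation instrumentation) {
        inst = instrumentation;
    }

    // Shallow size of one object, in bytes, as reported by the JVM.
    public static long sizeOf(Object o) {
        if (inst == null) {
            throw new IllegalStateException("agent not loaded via -javaagent");
        }
        return inst.getObjectSize(o);
    }
}

Package that in a jar whose manifest declares Premain-Class: SizeOfAgent, 
run with -javaagent:sizeof.jar, and compare the sizes reported for 
reference-heavy NN objects with and without -XX:+UseCompressedOops.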

> 
> I would expect the runtime overhead on the NN to be minimal in practice.


I think there's a small extra dereference cost, but it's very minimal: one 
logical shift left (by 3 bits, to restore 8-byte object alignment), possibly 
plus an addition of the heap base. Both of these run at CPU speed, not 
main-memory bus rates.
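
To make that concrete, decoding a compressed oop amounts to roughly the 
following (an illustrative sketch, not actual HotSpot code):

// Illustrative only: turning a 32-bit compressed oop back into a 64-bit
// address. heapBase may be zero, in which case the add disappears.
static long decodeOop(long heapBase, int narrowOop) {
    // shift left by 3 restores 8-byte object alignment
    return heapBase + ((narrowOop & 0xFFFFFFFFL) << 3);
}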
