hama-user mailing list archives

From 庄克琛 (Zhuang Kechen) <zhuangkec...@gmail.com>
Subject out of memory problem...
Date Fri, 14 Sep 2012 05:48:47 GMT
Hi everyone:
I am using hama-0.5.0 with hadoop-1.0.3 to do some large-graph analysis.
When I test the PageRank example as described at
http://wiki.apache.org/hama/WriteHamaGraphFile, I download the graph data
and run the PageRank job on a small distributed cluster, but the job always
fails with an out-of-memory error: supersteps 0, 1 and 2 complete fine, and
then the tasks run out of memory. (Each machine has 2 GB of memory.) When I
test with a small graph, everything works well.
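
Is the per-task heap the limiting factor here? I was thinking of raising the
child JVM heap before submitting the job, roughly along these lines (just a
rough sketch; the property name "bsp.child.java.opts" and the -Xmx value are
my assumptions, modeled on the Hadoop-style child opts, so please correct me
if that is not the right knob):

  import org.apache.hama.HamaConfiguration;

  public class PageRankHeapSketch {
    public static void main(String[] args) {
      // Rough sketch only: give the BSP child tasks a larger heap than the
      // default before building and submitting the PageRank GraphJob.
      // "bsp.child.java.opts" is assumed here; each machine has 2 GB, so
      // 1536m leaves some room for the groom server and the OS.
      HamaConfiguration conf = new HamaConfiguration();
      conf.set("bsp.child.java.opts", "-Xmx1536m");
      // ... set up and submit the PageRank graph job with this conf ...
    }
  }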
I also tried the trunk version
(https://builds.apache.org/job/Hama-Nightly/672/changes#detail3), replacing
my hama-0.5.0 with hama-0.6.0-SNAPSHOT, but I get the same result.
Does anyone have a better idea?

Thanks!

--
Zhuang Kechen