hadoop-common-user mailing list archives

From Kunsheng Chen <ke...@yahoo.com>
Subject Program crashes when volume of data gets large
Date Wed, 23 Sep 2009 12:50:36 GMT
Hi everyone,

I am running two map-reduce programs. They were working fine, but when the data grew to around
900MB (50,000+ files), weird things started happening, with errors like the one below:

'Communication problem with server: java.net.SocketTimeoutException: timed out waiting for
rpc response'

There are also other errors, such as "fail to allocate memory".

The strange thing is that the program keeps running and still reports map and reduce percentages
after those errors... it seems to be progressing, just at a slow pace.
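For what it's worth, I saw that the DFS socket timeouts are configurable, so one thing I am
considering is raising them in hdfs-site.xml. A sketch of that change, assuming the
`dfs.socket.timeout` and `dfs.datanode.socket.write.timeout` property names apply to this
Hadoop version (values are in milliseconds; the stock defaults are 60000 and 480000):

```xml
<!-- hdfs-site.xml: raise DFS socket timeouts.
     Property names assumed from the 0.x/1.x-era configuration;
     values below triple/double the stock defaults. -->
<property>
  <name>dfs.socket.timeout</name>
  <value>180000</value>
</property>
<property>
  <name>dfs.datanode.socket.write.timeout</name>
  <value>960000</value>
</property>
```

I am not sure whether the timeout is the root cause or just a symptom of the 50,000+ small
files overloading the NameNode, though.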

Does anyone have any ideas?



