hadoop-common-user mailing list archives

From barry.kap...@gmail.com
Subject Can't close file copied to DFS
Date Mon, 19 Jun 2006 22:34:13 GMT
Hi All,

I just recently upgraded from Hadoop 0.2 to Hadoop 0.3.2 and now I can't
get a file into the DFS. When we go to upload the CRC file (which happens
before the actual file is uploaded), closing the stream fails. The error
is 'java.io.IOException: failure closing block of file', caused by a
'java.net.SocketTimeoutException: Read timed out'. I am not sure why this
would be happening; this is a test environment where everything is on a
single local box.
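For reference, the copy itself is nothing exotic; it boils down to
roughly the following (the paths are made up, and I'm going off the
public FileSystem API, so the exact 0.3.2 method names may differ
slightly):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DfsCopy {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Talks to the NameNode named by fs.default.name in the config.
            FileSystem fs = FileSystem.get(conf);
            // Made-up paths, just to show the shape of the call.
            fs.copyFromLocalFile(new Path("/tmp/input.dat"),
                                 new Path("/user/barry/input.dat"));
            fs.close();
        }
    }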

I can see through debugging that:

1) The FileSystem is created by asking the NameNode for a DataNode
2) The file bytes are copied to the stream
3) The CRC bytes are created from the file bytes
4) The CRC bytes are copied to the stream
5) close() is called on the CRC stream, which simply hangs for a long
time and eventually results in the socket/IO exception (sketched below)
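In code terms the hang looks roughly like this (my paraphrase of what
the debugger shows, not the exact DFSClient internals; it reuses the fs
handle from the sketch above, crcBytes stands in for the checksum data
from step 3, and the '.input.dat.crc' name is just how I understand the
checksum files to be named):

    // Paraphrase of the failing sequence, not the real internals.
    java.io.OutputStream crcOut =
        fs.create(new Path("/user/barry/.input.dat.crc"));
    crcOut.write(crcBytes);   // step 4: CRC bytes copied to the stream
    crcOut.close();           // step 5: hangs, then "Read timed out"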

Any help would be much appreciated.

-Barry

P.S. If this is a repeat message, my apologies; Gmail and my web browser
seem to be at odds.
