hadoop-common-user mailing list archives

From "Habermaas, William" <William.Haberm...@fatwire.com>
Subject RE: Bad connection to FS. command aborted
Date Wed, 11 May 2011 13:37:09 GMT
The Hadoop IPC protocols are version-specific.  That is done to prevent an older version from
talking to a newer one.  Even if nothing has changed in the internal protocols, the version
check is still enforced.  Make sure the new hadoop-core.jar built from your modification is on
the classpath used by the hadoop shell script.
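
For illustration, here is a minimal sketch of the kind of header check behind a "got version X
expected version Y" warning. This is not the actual org.apache.hadoop.ipc.Server code; the class
and constant names are made up:

    import java.io.DataInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class VersionCheckSketch {
        // Hypothetical expected protocol version compiled into the server jar.
        static final int EXPECTED_VERSION = 3;

        static void readConnectionHeader(InputStream in) throws IOException {
            DataInputStream dis = new DataInputStream(in);
            // The first byte of the connection header carries the client's version.
            int clientVersion = dis.readByte();
            if (clientVersion != EXPECTED_VERSION) {
                // A mismatched jar on either side trips this check even if the
                // wire format itself has not changed.
                throw new IOException("Incorrect header or version mismatch: got version "
                        + clientVersion + " expected version " + EXPECTED_VERSION);
            }
            // ... continue reading the rest of the header ...
        }
    }

The practical point: if the shell script picks up a jar built against a different RPC version
than the running NameNode, connections are rejected at this check and the DFS commands abort
with "Bad connection to FS".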

Bill

-----Original Message-----
From: Matthew John [mailto:tmatthewjohn1988@gmail.com] 
Sent: Wednesday, May 11, 2011 9:27 AM
To: common-user
Subject: Bad connection to FS. command aborted

Hi all!

I have been trying to figure out why I'm getting this error!

All that I did was:
1) Used a single-node cluster
2) Made some modifications in the core (in some MapRed modules) and compiled it successfully
3) Ran bin/start-dfs.sh alone.

All the required daemons (NN and DN) are up.
The NameNode and DataNode logs are not showing any errors/exceptions.

The only interesting thing I found was:
WARN org.apache.hadoop.ipc.Server: Incorrect header or version mismatch
from 10.72.147.109:40048 got version 94 expected version 3
in the NameNode logs.

Someone please help me out of this!

Matthew