hadoop-common-user mailing list archives

From chenf...@post.tau.ac.il
Subject problem getting started with hadoop
Date Sun, 02 Sep 2007 13:14:25 GMT
I've tried setting up Hadoop on a single computer, and I'm
experiencing a problem with the datanode. When I run the start-all.sh
script it seems to run smoothly, including starting the datanode.
The problem occurs when I try to use HDFS, for example by running
"bin/hadoop dfs -put <localsrc> <dst>".
It gives me the following error:

put: java.io.IOException: Failed to create file  
/user/chenfren/mytest/.slaves.crc on client because there  
were not enough datanodes available. Found 0 datanodes but  
MIN_REPLICATION for the cluster is configured to be 1.
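
Since the error reports zero live datanodes, a first diagnostic step (sketched here assuming a 0.x-era Hadoop and the default install layout; exact log filenames depend on the hostname) would be to ask the namenode for a cluster report and to read the datanode's own log for the reason it failed to register:

```shell
# Ask the namenode how many datanodes it currently sees
bin/hadoop dfsadmin -report

# The datanode log usually records why it exited or could not
# connect to the namenode (filename varies with user/hostname)
tail -50 logs/hadoop-*-datanode-*.log
```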

I'm not sure whether "/user/chenfren/mytest/" refers to a path inside
HDFS or not. If not, then "/user/chenfren" doesn't exist, and I don't
have write permission under /usr/ anyway. So if this is the case, how
do I change this directory?
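
If I understand the docs correctly, paths passed to "bin/hadoop dfs" commands refer to HDFS rather than the local filesystem, and a relative path resolves under /user/<username> in HDFS; a sketch (assuming that behavior) of creating and inspecting the target directory:

```shell
# "mytest" is an HDFS path, resolved to /user/<username>/mytest
bin/hadoop dfs -mkdir mytest
bin/hadoop dfs -ls /user
```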
This is the hadoop-site.xml I use:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

</configuration>
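For comparison, a minimal single-node (pseudo-distributed) hadoop-site.xml from that era would look something like the following; the host/port values here are illustrative assumptions, not settings taken from the file above:

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <!-- illustrative values for a single-node setup -->
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```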
Can anyone advise?
