hadoop-common-user mailing list archives

From "elangovan anbalahan" <amazing.e...@gmail.com>
Subject Re: Bad connection to FS. command aborted.
Date Thu, 04 Dec 2008 18:43:53 GMT
I tried that, but nothing happened.

bash-3.2$ bin/hadoop dfs -put urll urll
put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)
bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
bash-3.2$ bin/hadoop dfs -ls urls
Found 0 items
bash-3.2$ bin/hadoop dfs -ls urll
Found 0 items
bash-3.2$ bin/hadoop dfs -ls
Found 2 items
/user/nutch/$    <dir>
/user/nutch/urll    <dir>


How do I get rid of the following error?

put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)

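(A rough sketch of what to check, assuming a standard single-node setup run from
the Hadoop install directory: this error typically means the NameNode has no live
DataNodes it can write the file to, so verify the daemons are running, restart
them if they are not, remove the empty urll directory left behind by the failed
put, and retry. Exact commands may differ slightly between Hadoop releases.)

bash-3.2$ jps                            # NameNode, DataNode, etc. should be listed
bash-3.2$ bin/hadoop dfsadmin -report    # should report at least one live DataNode
bash-3.2$ bin/stop-all.sh                # if not, restart the daemons
bash-3.2$ bin/start-all.sh
bash-3.2$ bin/hadoop dfs -rmr urll       # remove the empty directory from the failed put
bash-3.2$ bin/hadoop dfs -put urls urls  # then retry the upload
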
On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
<elia.mazzawi@casalemedia.com> wrote:

>
> You didn't say what the error was.
>
> But you can try this; it should do the same thing:
>
> bin/hadoop dfs -cat urls/part-0* > urls
>
>
> elangovan anbalahan wrote:
>
>> I'm getting this error message when I run:
>>
>> bash-3.2$ bin/hadoop dfs -put urls urls
>>
>>
>> Please let me know the resolution; I have a project submission in a few
>> hours.
