hadoop-common-user mailing list archives

From "elangovan anbalahan" <amazing.e...@gmail.com>
Subject Re: Bad connection to FS. command aborted.
Date Thu, 04 Dec 2008 19:50:01 GMT
Please tell me why I am getting this error.
It is becoming hard for me to find a solution.

*put: java.io.IOException: failed to create file
/user/nutch/urls/urls/.urllist.txt.crc on client 127.0.0.1 because
target-length is 0, below MIN_REPLICATION (1)*

I am getting this when I do
bin/hadoop dfs -put urls urls


bash-3.2$ bin/start-all.sh
starting namenode, logging to
/nutch/search/logs/hadoop-nutch-namenode-elan.out
localhost: starting datanode, logging to
/nutch/search/logs/hadoop-nutch-datanode-elan.out
cat: /nutch/search/bin/../conf/masters: No such file or directory
starting jobtracker, logging to
/nutch/search/logs/hadoop-nutch-jobtracker-elan.out
localhost: starting tasktracker, logging to
/nutch/search/logs/hadoop-nutch-tasktracker-elan.out
bash-3.2$ mkdir urls
bash-3.2$ vi urls/urllist.txt
bash-3.2$ bin/hadoop dfs -put urls urls
put: java.io.IOException: failed to create file
/user/nutch/urls/.urllist.txt.crc on client 127.0.0.1 because target-length
is 0, below MIN_REPLICATION (1)
bash-3.2$ bin/hadoop dfs -put urls urls
put: java.io.IOException: failed to create file
/user/nutch/urls/urls/.urllist.txt.crc on client 127.0.0.1 because
target-length is 0, below MIN_REPLICATION (1)



On Thu, Dec 4, 2008 at 2:10 PM, elangovan anbalahan
<amazing.elan@gmail.com> wrote:

> Hadoop 0.12.2
>
>
>
> On Thu, Dec 4, 2008 at 1:54 PM, Sagar Naik <snaik@attributor.com> wrote:
>
>> Hadoop version?
>> Command: bin/hadoop version
>>
>> -Sagar
>>
>>
>>
>> elangovan anbalahan wrote:
>>
>>> I tried that, but nothing happened.
>>>
>>> bash-3.2$ bin/hadoop dfs -put urll urll
>>> put: java.io.IOException: failed to create file
>>> /user/nutch/urll/.urls.crc
>>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>>> (1)
>>> bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
>>> bash-3.2$ bin/hadoop dfs -ls urls
>>> Found 0 items
>>> bash-3.2$ bin/hadoop dfs -ls urll
>>> Found 0 items
>>> bash-3.2$ bin/hadoop dfs -ls
>>> Found 2 items
>>> /user/nutch/$    <dir>
>>> /user/nutch/urll    <dir>
>>>
>>>
>>> How do I get rid of the following error:
>>> *put: java.io.IOException: failed to create file
>>> /user/nutch/urll/.urls.crc
>>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>>> (1)*
>>>
>>> On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
>>> <elia.mazzawi@casalemedia.com> wrote:
>>>
>>>
>>>
>>>> You didn't say what the error was.
>>>>
>>>> But you can try this; it should do the same thing:
>>>>
>>>> bin/hadoop dfs -cat urls/part-0* > urls
>>>>
>>>>
>>>> elangovan anbalahan wrote:
>>>>
>>>>
>>>>
>>>>> I'm getting this error message when I do
>>>>>
>>>>> *bash-3.2$ bin/hadoop dfs -put urls urls*
>>>>>
>>>>>
>>>>> Please let me know the resolution; I have a project submission in a few
>>>>> hours.
>
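
For reference, the "cat: /nutch/search/bin/../conf/masters: No such file or
directory" line in the start-all.sh output above suggests conf/masters is
missing. A minimal sketch of a fix for a single-node setup (an assumption on
my part, not something confirmed in this thread):

echo localhost > /nutch/search/conf/masters    # recreate the missing masters file
bin/stop-all.sh                                # restart all daemons so they read the same conf
bin/start-all.sh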
