hadoop-user mailing list archives

From sandeep vura <sandeepv...@gmail.com>
Subject Re: Unable to load file from local to HDFS cluster
Date Thu, 09 Apr 2015 05:15:38 GMT
Can anyone give a solution for my issue?

On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sandeepvura@gmail.com> wrote:

> Exactly, but every time it picks one randomly. Our datanodes are
> 192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, 192.168.2.85
>
> Namenode: 192.168.2.80
>
> If I restart the cluster, the next time it will show 192.168.2.81:50010
> connection closed.
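>
> A quick way to see which datanodes the namenode currently considers
> live or dead (assuming a standard shell setup; on Hadoop 2.x the same
> report comes from hdfs dfsadmin -report) is:
>
>   hadoop dfsadmin -report
>
> The live/dead node counts in that output show whether the node whose
> connection closed ever registered with the namenode at all.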
>
> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca>
> wrote:
>
>>  You cannot start 192.168.2.84:50010 …. closed by ((192.168.2.x
>> -datanode))
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 2:39 PM
>>
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>>
>>
>> We have been using this setup for a very long time. We were able to run
>> all the jobs successfully, but something suddenly went wrong with the
>> namenode.
>>
>>
>>
>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sandeepvura@gmail.com>
>> wrote:
>>
>> I have also noticed another issue when starting the Hadoop cluster with
>> the start-all.sh command.
>>
>>
>>
>> The namenode and datanode daemons are starting, but sometimes one of the
>> datanodes drops its connection and shows the message connection closed
>> by ((192.168.2.x -datanode)). Every time I restart the Hadoop cluster,
>> the affected datanode keeps changing.
>>
>>
>>
>> For example, the 1st time I start the Hadoop cluster: 192.168.2.1 -
>> connection closed.
>>
>> The 2nd time I start the Hadoop cluster: 192.168.2.2 - connection
>> closed. At this point 192.168.2.1 starts successfully without any errors.
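>>
>> When a datanode drops like this, its own log usually says why. Assuming
>> a default layout where logs land under $HADOOP_HOME/logs (that path is
>> a guess; adjust to your install), running something like this on the
>> affected node right after a failed start should show the real exception
>> behind the "connection closed" message:
>>
>>   tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log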
>>
>>
>>
>> I couldn't figure out the issue exactly. Is the issue related to the
>> network or to the Hadoop configuration?
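>>
>> One way to separate the two is to probe the datanode port directly from
>> the namenode box (50010 is the default data transfer port, matching the
>> messages above) and confirm the daemon is up with jps on that node:
>>
>>   nc -vz 192.168.2.81 50010
>>   jps
>>
>> If the port answers while the DataNode process is listed, the network
>> path is fine and the problem is more likely configuration.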
>>
>>
>>
>>
>>
>>
>>
>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca>
>> wrote:
>>
>> hadoop fs -put <source> <destination> copies from the local filesystem
>> to HDFS.
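>>
>> For example (local path and target directory are only placeholders):
>>
>>   hadoop fs -mkdir /sales_dept
>>   hadoop fs -put /home/user/sales.txt /sales_dept
>>   hadoop fs -ls /sales_dept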
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 2:24 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>>
>>
>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>
>>
>>
>> Regards,
>>
>> Sandeep.V
>>
>>
>>
>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca>
>> wrote:
>>
>> Should be hadoop dfs -put
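>>
>> (On recent releases hadoop dfs is a deprecated alias for the same shell,
>> so either form should behave identically here:)
>>
>>   hadoop dfs -put sales.txt /sales_dept
>>   hadoop fs -put sales.txt /sales_dept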
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>> *Sent:* April 8, 2015 1:53 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Unable to load file from local to HDFS cluster
>>
>>
>>
>> Hi,
>>
>>
>>
>> When loading a file from the local filesystem to the HDFS cluster using
>> the command below:
>>
>>
>>
>> hadoop fs -put sales.txt /sales_dept.
>>
>>
>>
>> I am getting the following exception. Please let me know how to resolve
>> this issue ASAP. Attached are the logs displayed on the namenode.
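>>
>> Two generic checks that can narrow this down before the logs are read
>> (a guess at common causes, not a diagnosis) are:
>>
>>   hadoop fs -ls /
>>   hadoop fsck /
>>
>> which confirm the namenode is answering and report overall HDFS health
>> before the -put is retried.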
>>
>>
>>
>> Regards,
>>
>> Sandeep.v
>>
>>
>>
>>
>>
>>
>>
>
>
