hadoop-hdfs-user mailing list archives

From sandeep vura <sandeepv...@gmail.com>
Subject Re: Unable to load file from local to HDFS cluster
Date Tue, 14 Apr 2015 05:07:02 GMT
It's not a conflicting port; our network team had changed settings in the
core switch of the VLAN.

On Sun, Apr 12, 2015 at 8:26 AM, 杨浩 <yanghaogn@gmail.com> wrote:

> Oh, I see. Is it that you had configured a conflicting port before?
>
> 2015-04-09 18:36 GMT+08:00 sandeep vura <sandeepvura@gmail.com>:
>
>> Hi Yanghaogn,
>>
>> Sure. We couldn't load the file from local to HDFS; it was throwing a
>> DFSOutputStream connection refused exception, which means packets were not
>> being received properly from the namenode to the datanodes. Moreover, when
>> we started the cluster, our datanodes were not starting properly and were
>> getting a connection closed exception.
>>
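A minimal way to check this kind of connectivity from the namenode host (a
sketch, assuming the default datanode transfer port 50010 and the
192.168.2.x addresses used in this cluster; the namenode RPC port comes from
fs.defaultFS in core-site.xml, so 9000 below is only an assumption):

    # verify each datanode's transfer port is reachable from the namenode
    for ip in 192.168.2.81 192.168.2.82 192.168.2.83 192.168.2.84 192.168.2.85; do
        nc -vz "$ip" 50010
    done

    # verify the namenode RPC port is reachable from a datanode (9000 is assumed)
    nc -vz 192.168.2.80 9000

A "connection refused" from nc points at the network or a firewall rather
than at Hadoop itself.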
>> Our Hadoop web UI was also opening very slowly, and SSH connections were
>> very slow as well. We finally changed our network ports, checked the
>> performance of the cluster, and it now works well.
>>
>> The issue was fixed at the namenode server's network port.
>>
>> Regards,
>> Sandeep.v
>>
>>
>> On Thu, Apr 9, 2015 at 12:30 PM, 杨浩 <yanghaogn@gmail.com> wrote:
>>
>>> Root cause: a network-related issue?
>>> Can you tell us in more detail? Thank you.
>>>
>>> 2015-04-09 13:51 GMT+08:00 sandeep vura <sandeepvura@gmail.com>:
>>>
>>>> Our issue has been resolved.
>>>>
>>>> Root cause: Network related issue.
>>>>
>>>> Thanks to everyone who spent some time and replied to my questions.
>>>>
>>>> Regards,
>>>> Sandeep.v
>>>>
>>>> On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura <sandeepvura@gmail.com>
>>>> wrote:
>>>>
>>>>> Can anyone give a solution for my issue?
>>>>>
>>>>> On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <sandeepvura@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Exactly, but every time it picks one at random. Our datanodes are
>>>>>> 192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, and 192.168.2.85.
>>>>>>
>>>>>> Namenode: 192.168.2.80
>>>>>>
>>>>>> If I restart the cluster, the next time it will show 192.168.2.81:50010
>>>>>> connection closed.
>>>>>>
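A quick way to confirm which datanodes actually registered with the namenode
after a restart (a sketch; assumes the hdfs client script is on the PATH of
the namenode host):

    # lists live and dead datanodes as the namenode currently sees them
    hdfs dfsadmin -report

    # or just the registered datanode addresses
    hdfs dfsadmin -report | grep "Name:"

Comparing this output across restarts shows which of the five datanodes
dropped.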
>>>>>> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca> wrote:
>>>>>>
>>>>>>> You cannot start 192.168.2.84:50010…. closed by ((192.168.2.x - datanode))
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 2:39 PM
>>>>>>>
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> We have been using this setup for a very long time. We were able to run
>>>>>>> all the jobs successfully, but suddenly something went wrong with the
>>>>>>> namenode.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <sandeepvura@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> I have also noticed another issue when starting the Hadoop cluster with
>>>>>>> the start-all.sh command.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> The namenode and datanode daemons are starting, but sometimes one of the
>>>>>>> datanodes drops the connection and shows the message connection closed by
>>>>>>> ((192.168.2.x - datanode)). Every time the Hadoop cluster is restarted,
>>>>>>> the affected datanode keeps changing.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> For example, the first time I start the Hadoop cluster: 192.168.2.1 -
>>>>>>> connection closed.
>>>>>>>
>>>>>>> The second time I start the Hadoop cluster: 192.168.2.2 - connection
>>>>>>> closed. At this point 192.168.2.1 starts successfully without any
>>>>>>> errors.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I haven't been able to figure out the issue exactly. Is the issue
>>>>>>> related to the network or to the Hadoop configuration?
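When a datanode drops like this, its own log usually records why the
connection closed. A minimal sketch, assuming logs live in the default
$HADOOP_HOME/logs location (the real directory is whatever HADOOP_LOG_DIR
points to):

    # on the affected datanode, e.g. 192.168.2.1
    tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log

    # surface recent exceptions recorded by the datanode
    grep -i exception $HADOOP_HOME/logs/hadoop-*-datanode-*.log | tail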
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca> wrote:
>>>>>>>
>>>>>>> hadoop fs -put <source> <destination> copies a file from the local
>>>>>>> filesystem to HDFS.
>>>>>>>
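A concrete usage sketch (the file and directory names here are only
illustrative):

    # create the target directory in HDFS, then copy the local file into it
    hadoop fs -mkdir -p /sales_dept
    hadoop fs -put sales.txt /sales_dept/
    hadoop fs -ls /sales_dept

If the namenode is unreachable, this is the point where the put fails with a
connection refused error.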
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 2:24 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sandeep.V
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <Huat.Liaw@ontario.ca> wrote:
>>>>>>>
>>>>>>> Should be hadoop dfs -put
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> *From:* sandeep vura [mailto:sandeepvura@gmail.com]
>>>>>>> *Sent:* April 8, 2015 1:53 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Unable to load file from local to HDFS cluster
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> When I load a file from local to the HDFS cluster using the command
>>>>>>> below:
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> hadoop fs -put sales.txt /sales_dept.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I get the following exception. Please let me know how to resolve this
>>>>>>> issue as soon as possible. Attached are the logs displayed on the
>>>>>>> namenode.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Regards,
>>>>>>>
>>>>>>> Sandeep.v
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
