hadoop-user mailing list archives

From surendra lilhore <surendra.lilh...@huawei.com>
Subject Re: Data streamer java exception
Date Thu, 24 Aug 2017 17:10:17 GMT

I suggest you use the shell commands for accessing cluster info instead of the curl command.

For HDFS shell commands you can refer to the HDFS commands documentation.

For YARN shell commands you can refer to the YARN commands documentation.
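As a quick illustration (assuming the Hadoop 3.x command-line tools are on the PATH of the cluster nodes), the same cluster information can be pulled directly from the shell:

    # report HDFS capacity and the list of live/dead datanodes
    hdfs dfsadmin -report

    # list the NodeManagers registered with the ResourceManager
    yarn node -list

    # show running YARN applications
    yarn application -list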


From: Atul Rajan
To: surendra lilhore
Date: 2017-08-24 21:43:30
Subject: Re: Data streamer java exception

Hello Team,

I resolved this issue by allowing the firewall (iptables) entries for the specific ports
used by the namenode as well as the datanodes. Now HDFS is running and the cluster is up.
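For reference, a minimal sketch of how the firewall rules might look on RHEL 7 with firewalld; the port numbers below are stock Hadoop 3.x defaults and must match whatever fs.defaultFS, dfs.datanode.address and dfs.datanode.ipc.address are set to in your configuration:

    # NameNode RPC port (taken from fs.defaultFS, e.g. 8020)
    firewall-cmd --permanent --add-port=8020/tcp
    # DataNode data-transfer and IPC ports (dfs.datanode.address / dfs.datanode.ipc.address)
    firewall-cmd --permanent --add-port=9866/tcp
    firewall-cmd --permanent --add-port=9867/tcp
    firewall-cmd --reload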

Thanks a lot for the suggestion. Now I have another issue with the interface: since I am running
RHEL in console view, is there any way I can connect to the web interface by URL so that the
cluster and job details are visible?
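For what it's worth, with the stock Hadoop 3.x ports the web UIs are reachable from any browser that can see the cluster nodes (the hostnames below are placeholders, not values from this thread):

    http://<namenode-host>:9870/             # NameNode web UI (HDFS overview, datanodes, file browser)
    http://<resourcemanager-host>:8088/      # ResourceManager web UI (applications and job details)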

On 24 August 2017 at 19:18, surendra lilhore <surendra.lilhore@huawei.com> wrote:
Hi Atul,

Can you please share the datanode exception logs? Check whether the namenode and datanode hostname
mappings in /etc/hosts are correct.
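For illustration only (the hostnames and addresses below are placeholders, not values from this thread), every node's /etc/hosts should resolve every other node by the same names used in the Hadoop configuration:

    192.168.1.10   namenode1.example.com   namenode1
    192.168.1.11   datanode1.example.com   datanode1
    192.168.1.12   datanode2.example.com   datanode2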

The put operation is failing because the datanodes are not connected to the namenode.


From: Atul Rajan [mailto:atul.rajan29@gmail.com]
Sent: 24 August 2017 09:32
To: user@hadoop.apache.org
Subject: Data streamer java exception

Hello Team,

I am setting up a Hadoop 3.0 alpha cluster of 4 nodes on RHEL 7.2. Everything is set up: namenode,
datanode, ResourceManager, and NodeManager, but the datanode is not able to connect to the namenode
and I am getting retrying logs in the datanode.
Also, when copying files from local to HDFS, DataStreamer Java exceptions are being thrown.
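For context, a minimal sketch of the kind of copy that hits the DataStreamer path (the local file and HDFS directory names are placeholders):

    hdfs dfs -mkdir -p /user/atul
    hdfs dfs -put sample.txt /user/atul/
    # when no datanodes are registered with the namenode, the put typically fails
    # in DataStreamer with an error such as "could only be replicated to 0 nodes
    # instead of minReplication (=1)"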

Can you please help me out here?
Thanks and Regards
Atul Rajan

-Sent from my iPhone

Best Regards
Atul Rajan
