hadoop-user mailing list archives

From Atul Rajan <atul.raja...@gmail.com>
Subject Re: Data streamer java exception
Date Thu, 24 Aug 2017 16:13:24 GMT
Hello Team,

I resolved this issue by adding iptables rules that allow the specific ports
used by the namenode and datanode. HDFS is now running and the cluster is up.
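For reference, the firewall fix described above can be sketched with firewalld on RHEL 7. The port numbers below are the Hadoop 3 defaults; the actual values depend on core-site.xml and hdfs-site.xml, so treat them as assumptions to verify before applying:

```shell
# Open the Hadoop RPC/data-transfer ports on every node (run as root).
# Ports shown are Hadoop 3 defaults -- check fs.defaultFS in core-site.xml
# and the dfs.datanode.* settings in hdfs-site.xml for your cluster.
firewall-cmd --permanent --add-port=8020/tcp   # NameNode RPC
firewall-cmd --permanent --add-port=9866/tcp   # DataNode data transfer
firewall-cmd --permanent --add-port=9867/tcp   # DataNode IPC
firewall-cmd --reload                          # apply the new rules
```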

Thanks a lot for the suggestion. Now I have another issue: I am running RHEL
in console mode. Is there any way to connect to the web interface by URL so
that the interface and job details are visible?
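In Hadoop 3 the daemons expose web UIs over plain HTTP, so a browser (or curl) on another machine can reach them by URL, assuming the ports are left at their defaults and are open in the firewall. The hostnames below are placeholders for the actual cluster hosts:

```shell
# Default Hadoop 3 web UI endpoints -- replace "namenode-host" and
# "rm-host" with the real hostnames from the cluster.
curl -s http://namenode-host:9870/   # NameNode web UI (was 50070 in Hadoop 2)
curl -s http://rm-host:8088/         # ResourceManager UI (running jobs)
```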

On 24 August 2017 at 19:18, surendra lilhore <surendra.lilhore@huawei.com>
wrote:

> Hi Atul,
>
>
>
> Can you please share the datanode exception logs? Also check whether the
> namenode and datanode hostname mappings in /etc/hosts are correct.
>
>
>
> The put operation is failing because the datanodes are not connected to the
> namenode.
>
>
>
> -Surendra
>
>
>
> *From:* Atul Rajan [mailto:atul.rajan29@gmail.com]
> *Sent:* 24 August 2017 09:32
> *To:* user@hadoop.apache.org
> *Subject:* Data streamer java exception
>
>
>
> Hello Team,
>
>
>
> I am setting up a Hadoop 3.0 alpha cluster of 4 nodes on RHEL 7.2.
> Everything is set up (namenode, datanode, resource manager, node manager),
> but the datanodes are not able to connect to the namenode; I am getting
> retry messages in the datanode logs.
>
> Also, when copying files from the local filesystem to HDFS, DataStreamer
> Java exceptions are thrown.
>
>
>
> Can you please help me out here?
>
> Thanks and Regards
>
> Atul Rajan
>
>
>
> -Sent from my iPhone
>
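The /etc/hosts check Surendra suggests above might look like this on each node; the hostnames and addresses here are purely illustrative:

```
# /etc/hosts -- identical on every node; each daemon hostname must
# resolve to the node's real (non-loopback) address, not 127.0.0.1.
192.168.1.10   namenode-host
192.168.1.11   datanode1
192.168.1.12   datanode2
192.168.1.13   datanode3
```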



-- 
*Best Regards*
*Atul Rajan*
