hadoop-common-user mailing list archives

From Rakesh Radhakrishnan <rake...@apache.org>
Subject Re: Start client side daemon
Date Fri, 22 Jul 2016 14:50:08 GMT
Hi Kun,

HDFS does not start any client-side object (for example,
DistributedFileSystem) by itself. The HDFS client is a library: user
applications access the file system through it, and it exports the HDFS file
system interface. You may also want to look at the API docs:
https://hadoop.apache.org/docs/r2.6.1/api/org/apache/hadoop/fs/FileSystem.html
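
To illustrate, here is a minimal sketch of such a user application (my own
example, not from any Hadoop doc; "hdfs://namenode-host:8020" and the file
path are placeholders). Note that nothing here is a daemon -- the FileSystem
object lives inside the application's JVM and opens RPC connections to the
NameNode on demand:

import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode RPC address; normally taken from core-site.xml.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode-host:8020"), conf);

        // Write a small file: the client library asks the NameNode for
        // metadata and streams the data to DataNodes.
        try (OutputStream out = fs.create(new Path("/tmp/hello.txt"))) {
            out.write("hello hdfs".getBytes("UTF-8"));
        }
        fs.close();
    }
}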

The NameNode has an RPC server that listens to requests from DataNodes and
clients. The DataNode also has an RPC server, which is used for inter-DataNode
communication. I think it's worth reading the following to get more
information:
https://en.wikipedia.org/wiki/Apache_Hadoop
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/HdfsDesign.html#The_Communication_Protocols
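
Regarding where the client can run: it only needs to know the NameNode's RPC
address, so it can run on any machine that can reach that address; it does not
have to be co-located with the NameNode or a DataNode. A small sketch of my
own (the address is again a placeholder):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class ClientFromAnywhere {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode RPC address; normally read from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Connected to " + fs.getUri());
        fs.close();
    }
}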

Regards,
Rakesh
Intel

On Fri, Jul 22, 2016 at 7:21 PM, Kun Ren <ren.hdfs@gmail.com> wrote:

> Hi Genius,
>
> I understand that we use the command to start the NameNode and DataNode, but I
> don't know how HDFS starts the client side and creates the client-side
> objects (like DistributedFileSystem) and the client-side RPC server. Could you
> please point out how HDFS starts the client-side daemon?
> If the client side uses the same RPC server as the server side, can I
> understand that the client side has to be located at either the NameNode or
> a DataNode?
>
> Thanks so much.
> Kun
>
