hadoop-common-user mailing list archives

From: Sugandha Naolekar <sugandha....@gmail.com>
Subject: Few Queries!!
Date: Sat, 26 Sep 2009 04:40:02 GMT
Hello!

    I have a 4-node cluster and one remote machine (a 3rd-party app) which is
not part of the Hadoop cluster (master-slave configuration).

    Now, I want to dump data from this remote machine into the Hadoop
cluster, and the dumping has to happen dynamically. For example, say there is
a location /home/X on the remote machine. I will have to write code that
fetches the data dynamically from that location and dumps it into the
cluster. This code is to be triggered from my machine (the NN); i.e. I want
to kick it off while sitting on the local host. For this purpose, I have
thought of using RMI. Now, there are two issues:

1) Generally, if data is to be transferred from any of the machines (whether
local or remote), I have to log in to that particular system and do the
transfer from there, either through the standard Hadoop commands or by
writing Java code that uses the Hadoop API (see the sketch after these two
points).

2) Now, the code would be written for the remote machine, and all the tasks
would be performed by it, but the execution would be triggered through the
server, i.e. the NN machine. Can this be done?

RMI's structure says that the client only specifies the methods or operations
to be performed, and the server is the one that actually executes them. So,
in this case, who would be the server and who would be the client?
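To make the question concrete, this is the arrangement I am imagining, where
the remote machine that owns /home/X would be the RMI server (it does the
actual dumping) and the NN machine would be the RMI client that only triggers
it. Again just a sketch: the interface name, bind name "dumper", registry
port 1099 and host name "remote-host" are all placeholders.

import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Remote interface shared by both sides.
interface DataDumper extends Remote {
    void dump(String localDir, String hdfsDir) throws RemoteException;
}

// Runs on the remote machine (the RMI server, owner of /home/X).
class DumperServer implements DataDumper {
    public void dump(String localDir, String hdfsDir) throws RemoteException {
        // Here it would run the FileSystem copy shown above,
        // reading localDir and writing it into hdfsDir.
        System.out.println("dumping " + localDir + " to " + hdfsDir);
    }

    public static void main(String[] args) throws Exception {
        DataDumper stub =
            (DataDumper) UnicastRemoteObject.exportObject(new DumperServer(), 0);
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("dumper", stub);
        System.out.println("DumperServer ready");
    }
}

// Runs on the NN machine (the RMI client that only triggers the work).
class DumperClient {
    public static void main(String[] args) throws Exception {
        Registry registry = LocateRegistry.getRegistry("remote-host", 1099);
        DataDumper dumper = (DataDumper) registry.lookup("dumper");
        dumper.dump("/home/X", "/user/hadoop/X");
    }
}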
