hadoop-common-user mailing list archives

From Joe Greenawalt <joe.greenaw...@gmail.com>
Subject Re: remotely downloading file
Date Tue, 07 Jun 2011 17:37:33 GMT
Thanks for the reply. Is there a resource available that shows the correct
way to connect remotely? I seem to be seeing conflicting approaches.

I'm looking at:

But the examples I'm seeing use the Configuration class, and I don't see it
being used in those classes.
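For reference, here is a minimal sketch of what the remote-client approach looks like, along the lines Bill describes below: the Configuration object points the HDFS client at the remote namenode, and FileSystem does the rest. The hostname, port, and file paths here are made-up placeholders, and this assumes the Hadoop core jar is on the classpath; it is a sketch, not a tested setup.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.io.FileOutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsDownload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote namenode; nothing needs to run
        // on this machine besides the Hadoop client classes.
        // (namenode.example.com:8020 is a placeholder.)
        conf.set("fs.default.name", "hdfs://namenode.example.com:8020");

        FileSystem fs = FileSystem.get(conf);

        // Copy a file out of HDFS to the local disk (placeholder paths).
        InputStream in = fs.open(new Path("/data/large-file.bin"));
        OutputStream out = new FileOutputStream("large-file.bin");
        // The final 'true' closes both streams when the copy finishes.
        IOUtils.copyBytes(in, out, 4096, true);

        fs.close();
    }
}
```

Running this from an app server would pull the file over the network from the datanodes directly, which is the paradigm you describe.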

Thanks again,


On Fri, Jun 3, 2011 at 5:05 PM, Habermaas, William <
William.Habermaas@fatwire.com> wrote:

> You can access HDFS for reading and writing from other machines. The API
> works through the HDFS client, which can be anywhere on the network, not
> just on the namenode. You just need the Hadoop core jar alongside your
> application wherever it runs.
> Bill
> -----Original Message-----
> From: Joe Greenawalt [mailto:joe.greenawalt@gmail.com]
> Sent: Friday, June 03, 2011 4:55 PM
> To: common-user@hadoop.apache.org
> Subject: remotely downloading file
> Hi,
> We're interested in using HDFS to store several large file sets to be
> available for download by our customers in the following paradigm:
> Customer  <-  | APPSERVER-CLUSTER {app1,app2,app3} |  <-  | HDFS |
> After browsing the documentation, I had assumed that pulling the file from
> HDFS to the APPSERVER-CLUSTER could be done programmatically from a remote
> machine. But after reading the API, it seems that Java code interfacing
> with HDFS needs to run locally? Is that correct?
> If so, what is the best/recommended way to deliver downloadables to the
> APPSERVERS (and vice versa), which are hosted on the same network but on
> different machines?
> Thanks,
> Joe
