hadoop-common-dev mailing list archives

From "Doug Cutting (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-3246) FTP client over HDFS
Date Fri, 23 May 2008 17:09:56 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-3246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12599433#action_12599433 ]

Doug Cutting commented on HADOOP-3246:
--------------------------------------

> I am just thinking whether in the testFtp() method we should check for just one or all of the classes

I think one is enough. If all jars are present, as when building from Subversion, the tests will run correctly. If no jars are present, as in a release, the tests will be skipped. If only some jars are present (an unsupported configuration), the tests will crash. That seems fine to me.
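A minimal sketch of the kind of guard being discussed, in JUnit 3 style; the probe class name and the test class name below are only illustrative, not necessarily what the attached patch uses:

import java.io.IOException;
import junit.framework.TestCase;

public class TestFtpOverHdfs extends TestCase {

  // Probe for one of the optional FTP jars by trying to load a class from it.
  // org.apache.ftpserver.ftplet.Ftplet is an assumed example, not necessarily
  // the class the patch actually checks for.
  private static boolean ftpJarsPresent() {
    try {
      Class.forName("org.apache.ftpserver.ftplet.Ftplet");
      return true;
    } catch (ClassNotFoundException e) {
      return false;
    }
  }

  public void testFtp() throws IOException {
    if (!ftpJarsPresent()) {
      System.out.println("FTP jars not on the classpath; skipping testFtp()");
      return;   // skipped, as when running the tests from a release
    }
    // ... start an embedded FTP server and exercise the client here ...
  }
}

Checking a single class is enough because the supported configurations are "all jars present" or "no jars present"; a partial classpath simply fails loudly.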


> FTP client over HDFS
> --------------------
>
>                 Key: HADOOP-3246
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3246
>             Project: Hadoop Core
>          Issue Type: New Feature
>          Components: util
>    Affects Versions: 0.16.3
>            Reporter: Ankur
>             Fix For: 0.18.0
>
>         Attachments: Hadoop-3246-latest.patch, Hadoop-3246-Required-Jars.tar.gz, HADOOP-3246.patch
>
>
> An FTP client that writes content directly into HDFS allows data from FTP
> servers to be stored in HDFS without first copying it to local disk and then
> uploading it. The benefit is apparent from an administrative perspective:
> large datasets can be pulled from FTP servers with minimal human intervention.
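The idea can be illustrated with a small, hypothetical sketch that streams a remote file from an FTP server straight into HDFS using Apache Commons Net and the Hadoop FileSystem API; the host, paths and credentials below are placeholders, and this is not the attached patch's actual implementation:

import java.io.IOException;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FtpToHdfsCopy {
  public static void main(String[] args) throws IOException {
    FTPClient ftp = new FTPClient();
    ftp.connect("ftp.example.com");              // placeholder host
    ftp.login("anonymous", "user@example.com");  // placeholder credentials
    ftp.enterLocalPassiveMode();
    ftp.setFileType(FTP.BINARY_FILE_TYPE);

    FileSystem fs = FileSystem.get(new Configuration());
    FSDataOutputStream out = fs.create(new Path("/data/incoming/file.bin"));
    try {
      // retrieveFile streams the remote file into the HDFS output stream,
      // so no local copy is ever written to disk.
      if (!ftp.retrieveFile("/pub/file.bin", out)) {
        throw new IOException("FTP transfer failed: " + ftp.getReplyString());
      }
    } finally {
      out.close();
      ftp.logout();
      ftp.disconnect();
    }
  }
}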

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

