hadoop-hdfs-issues mailing list archives

From "Allen Wittenauer (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HDFS-201) Spring and OSGi support
Date Mon, 21 Jul 2014 16:58:40 GMT

     [ https://issues.apache.org/jira/browse/HDFS-201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Allen Wittenauer resolved HDFS-201.

    Resolution: Duplicate

> Spring and OSGi support
> -----------------------
>                 Key: HDFS-201
>                 URL: https://issues.apache.org/jira/browse/HDFS-201
>             Project: Hadoop HDFS
>          Issue Type: New Feature
>            Reporter: Jon Brisbin
>            Assignee: Jean-Baptiste Onofré
>         Attachments: HDFS-201.patch
> I was able to compile 0.18.2 in Eclipse into a new OSGi bundle using the Eclipse PDE. Using
> Spring to control the HDFS nodes, however, seems out of the question for the time being because
> of inter-dependencies between packages that should be separate OSGi bundles (for example,
> SecondaryNameNode includes direct references to StatusHttpServer, which should live in a bundle
> with a "web" personality, separate from Hadoop Core). Looking through the code that
> starts the daemons, it seems code changes are necessary to allow components to be
> dependency-injected. Rather than instantiating a StatusHttpServer inside the SecondaryNameNode,
> that reference should (at the very least) be injectable (for example, as
> an OSGi service from another bundle). Adding setters for infoServer would allow the reference
> to be injected by Spring. This is just one example of the changes that would be needed
> to get Hadoop to live happily inside an OSGi container.
> As a starting point, it would be nice if Hadoop core could be split into a client
> bundle, deployable into OSGi containers, that provides client-only access to
> HDFS clusters.
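The setter-injection change the reporter describes can be sketched as below. This is a minimal illustration, not the actual Hadoop 0.18 code: the names `HttpServerFacade`, `SecondaryNameNodeSketch`, and `InjectionDemo` are hypothetical stand-ins for StatusHttpServer and SecondaryNameNode, used only to show how a setter would let Spring (or an OSGi service binder) supply the server instead of the node constructing it itself.

```java
/**
 * Stand-in for StatusHttpServer; in the proposal this type would live in a
 * separate OSGi bundle with a "web" personality. (Hypothetical name.)
 */
interface HttpServerFacade {
    void start();
}

/**
 * Simplified SecondaryNameNode that no longer instantiates its own HTTP
 * server; the reference is injected through a setter instead.
 */
class SecondaryNameNodeSketch {
    private HttpServerFacade infoServer;

    /** Setter-injection point for Spring or an OSGi service from another bundle. */
    public void setInfoServer(HttpServerFacade server) {
        this.infoServer = server;
    }

    public void startup() {
        if (infoServer == null) {
            throw new IllegalStateException("infoServer was not injected");
        }
        infoServer.start();
    }
}

public class InjectionDemo {
    public static void main(String[] args) {
        SecondaryNameNodeSketch node = new SecondaryNameNodeSketch();
        // In Spring this wiring would come from the application context;
        // here we inject by hand just to show the decoupling.
        node.setInfoServer(() -> System.out.println("info server started"));
        node.startup();
    }
}
```

With this shape, the node depends only on an interface, so the web bundle can be swapped or omitted (e.g. in the proposed client-only bundle) without code changes to the node itself.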

This message was sent by Atlassian JIRA
