hadoop-hdfs-issues mailing list archives

From "George Wong (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HDFS-6819) make HDFS fault injection framework working with maven
Date Wed, 06 Aug 2014 01:16:13 GMT

    [ https://issues.apache.org/jira/browse/HDFS-6819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14087070#comment-14087070 ]

George Wong commented on HDFS-6819:

I propose using a separate project to build the AOP-based fault injection (FI) framework for HDFS.
We can put the project under "hadoop-hdfs-project" or under "hadoop-hdfs-project/hadoop-hdfs/src/contrib/".

The HDFS-FI project depends on the hadoop-hdfs project. When HDFS-FI is built, the AspectJ
plugin weaves the aspect advice into the HDFS jars (e.g. hadoop-hdfs-2.4.0.jar).
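A minimal sketch of how such a module's POM might wire this up, assuming the aspectj-maven-plugin's weaveDependencies support is used for binary weaving; the artifact IDs, versions, and module name (hadoop-hdfs-fi) are illustrative, not the actual HDFS-FI build:

```xml
<!-- Hypothetical hadoop-hdfs-fi module POM fragment (names and versions are illustrative). -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-project</artifactId>
    <version>2.4.0</version>
  </parent>
  <artifactId>hadoop-hdfs-fi</artifactId>

  <dependencies>
    <!-- The jar whose classes the aspects are woven into. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.4.0</version>
    </dependency>
    <!-- AspectJ runtime needed by the woven classes. -->
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjrt</artifactId>
      <version>1.8.1</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>aspectj-maven-plugin</artifactId>
        <version>1.6</version>
        <configuration>
          <!-- Weave the advice into the hadoop-hdfs dependency's classes,
               producing a separate, woven artifact in this module. -->
          <weaveDependencies>
            <weaveDependency>
              <groupId>org.apache.hadoop</groupId>
              <artifactId>hadoop-hdfs</artifactId>
            </weaveDependency>
          </weaveDependencies>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

With this layout, `mvn package` in the FI module would emit a woven jar without touching the artifacts produced by the normal hadoop-hdfs build.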

This proposal has several pros:
# The fault injection code and unit tests are isolated from the normal HDFS code and unit tests.
# The aspect advice is woven into copies of the hdfs jars built in hadoop-hdfs-project, so the
woven jars and the normal jars remain separate artifacts.
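To illustrate what the woven advice amounts to at runtime, here is a plain-Java stand-in (no weaver involved) for a probabilistic "before" advice at a DataNode join point; the names (FiConfig, receiveBlock) and the 100% failure rate are purely illustrative and are not actual HDFS or HDFS-FI code:

```java
import java.io.IOException;

public class FaultInjectionDemo {
    // Stands in for the FI framework's probability configuration.
    static class FiConfig {
        static float failureRate = 1.0f; // always inject, for the demo
    }

    // What the aspect's before-advice would do at the join point.
    static void beforeAdvice() throws IOException {
        if (Math.random() < FiConfig.failureRate) {
            throw new IOException("FI: injected fault");
        }
    }

    // Stands in for a DataNode method targeted by a pointcut.
    static void receiveBlock() throws IOException {
        beforeAdvice(); // with real FI, the AspectJ weaver inserts this call
        // ... normal block-receiving logic would run here ...
    }

    public static void main(String[] args) {
        try {
            receiveBlock();
            System.out.println("no fault");
        } catch (IOException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

Because the advice lives only in the woven jar, tests that load the normal jar see unmodified behavior.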

What are your opinions?

> make HDFS fault injection framework working with maven
> ------------------------------------------------------
>                 Key: HDFS-6819
>                 URL: https://issues.apache.org/jira/browse/HDFS-6819
>             Project: Hadoop HDFS
>          Issue Type: Task
>            Reporter: George Wong
>            Assignee: George Wong
> In the current trunk code repo, the FI framework does not work, because the Maven build process
does not execute the AspectJ weaving.
> Since FI is very useful for testing and reproducing bugs, it would be better to make the FI
framework work in the trunk code.

This message was sent by Atlassian JIRA
