hadoop-hdfs-issues mailing list archives

From "Konstantin Boudnik (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HDFS-703) Replace current fault injection implementation with one from Common
Date Wed, 04 Nov 2009 21:37:32 GMT

    [ https://issues.apache.org/jira/browse/HDFS-703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12773648#action_12773648 ]

Konstantin Boudnik commented on HDFS-703:

Here's what I just did:

  % svn co https://svn.apache.org/repos/asf/hadoop/hdfs/trunk new-hdfs-workspace
Fetching external item into '123/src/test/aop/build'
A    new-hdfs-workspace/src/test/aop/build/aop.xml
Checked out external at revision 832876.
As you can see, the referenced {{aop.xml}} file is imported from the Common project through {{svn:externals}}.
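For anyone unfamiliar with the mechanism: an {{svn:externals}} property on a directory tells Subversion to pull part of another repository into the working copy at checkout time. A sketch of what such a definition might look like here (the exact Common URL and the directory the property is set on are assumptions for illustration, not copied from the project):

```
# Hypothetical svn:externals definition, set on src/test/aop:
# maps the local build/ subdirectory to the aop build files kept in Common.
build https://svn.apache.org/repos/asf/hadoop/common/trunk/src/test/aop/build
```

The property can be inspected on a working copy with {{svn propget svn:externals src/test/aop}}.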
  % cd new-hdfs-workspace
  % ls -l src/test/aop/build
  % ant injectfaults
     [echo] Start weaving aspects in place
     [echo] Weaving of aspects is finished

Total time: 1 minute 11 seconds

So everything seems to be working properly. Hope this helps resolve the issue.
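For context on what the woven aspects do: the shared fault injection framework from Common fires faults probabilistically at injection points. The class below is only a minimal sketch of that idea under assumed names; it is not the actual framework API from HADOOP-6204.

```java
import java.util.Random;

// Illustrative sketch of probability-driven fault injection; all names
// here are assumptions, not the real fi framework classes.
public class FaultProbability {
    private final Random random;
    private final float probability; // chance in [0.0, 1.0] that a fault fires

    public FaultProbability(float probability, long seed) {
        this.probability = probability;
        this.random = new Random(seed);
    }

    // Returns true when an injected fault should be triggered at this point.
    public boolean shouldFire() {
        // nextFloat() is uniform in [0.0, 1.0), so probability 0.0 never
        // fires and probability 1.0 always fires.
        return random.nextFloat() < probability;
    }

    public static void main(String[] args) {
        FaultProbability never = new FaultProbability(0.0f, 42L);
        FaultProbability always = new FaultProbability(1.0f, 42L);
        if (never.shouldFire() || !always.shouldFire()) {
            throw new AssertionError("unexpected fault-injection behavior");
        }
        System.out.println("ok");
    }
}
```

In an AspectJ-based setup like the one {{ant injectfaults}} weaves, a check of this kind would typically sit inside an advice body so faults only trigger on a configured fraction of calls.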

> Replace current fault injection implementation with one from Common
> -------------------------------------------------------------------
>                 Key: HDFS-703
>                 URL: https://issues.apache.org/jira/browse/HDFS-703
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>          Components: build
>    Affects Versions: 0.22.0
>            Reporter: Konstantin Boudnik
>            Assignee: Konstantin Boudnik
>             Fix For: 0.22.0
>         Attachments: HDFS-703.patch, HDFS-703.patch, HDFS-703.patch
> After HADOOP-6204 has been implemented, HDFS no longer needs its own implementation of the fault injection framework; instead, it should reuse the one from Common.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
