hadoop-hdfs-user mailing list archives

From Thanh Do <than...@cs.wisc.edu>
Subject Re: How to run Fault injection in HDFS
Date Mon, 12 Oct 2009 01:29:33 GMT
Thanks for your very useful advice. I am able to play with fault injection
now.

On Thu, Oct 8, 2009 at 11:41 AM, Konstantin Boudnik <cos@yahoo-inc.com> wrote:

> Thanks for looking into fault injection - it's a very interesting and useful
> technique based on AspectJ.
>
> Currently, it is fully integrated into HDFS only. There's a JIRA
> (HADOOP-6204) which tracks the same effort for Common; once that is done,
> all of Hadoop's components will have injection (fault injection included) in
> place. This JIRA should be committed in a matter of a couple of weeks.
>
> For the immediate purpose you don't need to patch anything or do any
> tweaking of the code: the fault injection framework is already in and ready
> to work.
>
> For your current needs: to be able to run HDFS with instrumented code you
> need to run a special build. To do so:
>  - % ant injectfaults - similar to a 'normal' build, but instruments the
> code with the aspects located under src/test/aop/** (see the sketch after
> this list)
>  - % ant jar-fault-inject - similar to a 'normal' jar creation, but
> instrumented
>  - % ant jar-test-fault-inject - similar to a 'normal' jar-test creation,
> but instrumented
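>
> To give an idea of what "instrumenting the code with aspects" means, here
> is a minimal sketch of a fault-injection aspect. The aspect name, pointcut,
> and failure rate below are made up for illustration; the real aspects
> shipped with the framework live under src/test/aop/**:
>
>   // DiskErrorAspect.aj - a hypothetical fault-injection aspect (AspectJ)
>   package org.apache.hadoop.fi;
>
>   import java.io.IOException;
>
>   public aspect DiskErrorAspect {
>     // Intercept executions of the datanode's block-receiving method.
>     pointcut blockWrite() :
>       execution(* org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(..));
>
>     // Fail roughly 1% of the intercepted calls with a simulated disk error.
>     before() throws IOException : blockWrite() {
>       if (Math.random() < 0.01) {
>         throw new IOException("Injected fault: simulated disk error");
>       }
>     }
>   }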
>
> Now, if you have the rest of the sub-projects built, you need to put the
> instrumented jar files on top of the 'normal' files in your installation
> directory. Please note that some renaming has to be done: the injected jar
> files carry a '-fi' suffix in their names, and the normal jar files don't.
> So for now you'll have to rename the injected jars so that they look like
> normal ones and get picked up by the configured classpath.
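>
> For example, assuming the injected build produced a jar named
> hadoop-hdfs-0.21.0-dev-fi.jar (the exact name depends on the version
> string of your checkout), the swap could look something like:
>
>   % cd $HADOOP_HOME
>   % mv hadoop-hdfs-0.21.0-dev.jar hadoop-hdfs-0.21.0-dev.jar.orig
>   % cp /path/to/hdfs/build/hadoop-hdfs-0.21.0-dev-fi.jar hadoop-hdfs-0.21.0-dev.jar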
>
> At this point you're all set: you have a production-quality Hadoop with
> injected HDFS. As soon as the aforementioned JIRA is ready and committed,
> we'll be able to provide an injected version of Hadoop through the build
> itself, rather than through renaming and manual intervention.
>
> Also, if you need to read more about fault injection (FI) in HDFS, you can
> find the FI-framework documentation in the current HDFS trunk (it isn't on
> the web yet because version 0.21 hasn't been released). Because building
> the documentation requires some extra effort and additional software to be
> installed, you can simply download and read the PDF attached to the
> FI-framework JIRA:
>
>
> https://issues.apache.org/jira/secure/attachment/12414225/Fault+injection+development+guide+and+Framework+HowTo.pdf
>
> Hope it helps,
>  Cos
>
>
> On 10/8/09 8:10 AM, Thanh Do wrote:
>
>> Thank you so much, Jakob.
>>
>> Could you please explain the fault injection running procedure in detail?
>>
>> My goal is to run HDFS in a cluster (with a namenode and several
>> datanodes) and see how fault injection techniques affect HDFS's
>> behavior. Also, I would like to define some new aspects/faults to test
>> the system.
>>
>> What I did was:
>> 1) I checked out hadoop-common-trunk, but this package doesn't
>> contain the HDFS classes. I eventually noticed that the FI framework is
>> currently integrated with HDFS only.
>>
>> 2) So I checked out hdfs-trunk. The build.xml contains an injectfaults
>> target and several other related things. I was able to build those
>> targets (injectfaults, run-test-hdfs-fault-inject, etc.). At this
>> point I got stuck, because I found no scripts to help me start-dfs,
>> stop-dfs...
>> I copied the bin folder from common/core into the HDFS project folder
>> and ran the script:
>>
>> bin/start-dfs.sh
>>
>> but there was an exception:
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/commons/logging/LogFactory
>> I guess the reason is that I ran HDFS without any of the common classes.
>> How do I get around this?
>>
>> 3) I also tried a third way: I downloaded a Hadoop release (containing
>> everything: core, hdfs, mapred) and used Eclipse to create a project
>> from the existing code. I was able to build this project, and the bin
>> scripts worked well, but I found no FI-related classes. I then tried to
>> apply the patch (HADOOP-6003.patch) using Eclipse's patch command
>> (Team | Apply Patch), but the patching procedure failed.
>>
>> In summary, I would like to run a real HDFS with fault injection. I am
>> not very familiar with ant. Could you please show me some more details,
>> so that I can get past this?
>>
>> On Thu, Oct 8, 2009 at 12:19 AM, Jakob Homan <jhoman@yahoo-inc.com> wrote:
>>
>>    Thanh-
>>    If you would like to run the tests that have been
>>    instrumented to use the fault injection framework, the ant target is
>>    run-test-hdfs-fault-inject.  These were used extensively in the
>>    recent append work, and there are quite a few append-related tests.
>>    Was there something more specific you were looking for?
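>>
>>    For example, assuming the -Dtestcase property that the other run-test
>>    targets accept also applies to this target (and using an illustrative
>>    test class name), a single instrumented test could be run as:
>>
>>        % ant run-test-hdfs-fault-inject -Dtestcase=TestFiDataTransferProtocol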
>>
>>    Thanks,
>>    Jakob
>>    Hadoop at Yahoo!
>>
>>
>>    Thanh Do wrote:
>>
>>        Hi everyone,
>>
>>        Could anybody show me how to run the fault injection framework
>>        mentioned in the following links?
>>
>>        http://issues.apache.org/jira/browse/HDFS-435
>>
>>        and
>>
>>        https://issues.apache.org/jira/browse/HDFS-436
>>
>>        Thanks,
>>        Thanh
>>
>>
>> --
>> T
>>
>
> --
> With best regards,
>        Konstantin Boudnik (aka Cos)
>
>        Yahoo! Grid Computing
>        +1 (408) 349-4049
>
> 2CAC 8312 4870 D885 8616  6115 220F 6980 1F27 E622
> Attention! Streams of consciousness are disallowed
>
>


-- 
thanh
