hadoop-hdfs-user mailing list archives

From Konstantin Boudnik <...@yahoo-inc.com>
Subject Re: How to run Fault injection in HDFS
Date Tue, 24 Nov 2009 17:02:42 GMT
Oh, I see. Right, the injection framework was introduced early in 0.21 and has
never been backported to 0.20.
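
One quick way to confirm what is actually on your classpath (just a sketch;
substitute whatever Common/core jar your scripts really pick up):

  javap -classpath hadoop-0.20.1-core.jar \
      org.apache.hadoop.conf.Configuration | grep addDeprecation

On a 0.20 jar this prints nothing, which is exactly why the NameNode dies
with the NoSuchMethodError quoted below.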

On 11/23/09 19:38, Thanh Do wrote:
> The reason I changed build.xml is that the build.xml in the hadoop-common
> trunk release (0.20.1) does not contain the injectfaults target (I want to
> use AspectJ in the Hadoop release that contains both hdfs and mapred). I
> just added the following two targets:
>
> <target name="compile-fault-inject" depends="compile-hdfs-classes">
>   <!-- AspectJ task definition -->
>   <taskdef
>       resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties">
>     <classpath>
>       <pathelement location="${common.ivy.lib.dir}/aspectjtools-1.6.4.jar"/>
>     </classpath>
>   </taskdef>
>   <echo message="Start weaving aspects in place"/>
>   <iajc
>       encoding="${build.encoding}"
>       srcdir="${hdfs.src.dir};${build.src}"
>       includes="org/apache/hadoop/**/*.java,
>                 org/apache/hadoop/myaspect/**/*.aj"
>       destDir="${build.classes}"
>       debug="${javac.debug}"
>       target="${javac.version}"
>       source="${javac.version}"
>       deprecation="${javac.deprecation}">
>     <classpath refid="test.classpath"/>
>   </iajc>
>   <echo message="Weaving of aspects is finished"/>
> </target>
>
> <target name="injectfaults"
>         description="Instrument HDFS classes with faults and other AOP advices">
>   <subant buildpath="${basedir}" target="compile-fault-inject">
>     <property name="build.dir" value="${build.dir}"/>
>   </subant>
> </target>
>
> That way, when I want to weave my aspects, I only need to type:
>
> ant injectfaults
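>
> For reference, here is a minimal sketch of the kind of aspect such a target
> weaves. It is adapted from the BlockReceiverAspects example in the
> FI-framework guide; the aspect name is made up, and it sits in the datanode
> package (not under myaspect) because BlockReceiver is package-private, so
> the includes pattern above would need to cover that path as well:
>
> package org.apache.hadoop.hdfs.server.datanode;
>
> import java.io.IOException;
> import org.apache.hadoop.fi.ProbabilityModel;
>
> public aspect MyFaultAspect {
>   // Fire on entry to BlockReceiver.receivePacket().
>   pointcut onReceivePacket() :
>     execution(* BlockReceiver.receivePacket(..));
>
>   // Inject an IOException with the probability configured for this class
>   // (via the fi.* system properties read by ProbabilityModel).
>   before() throws IOException : onReceivePacket() {
>     if (ProbabilityModel.injectCriteria("BlockReceiver")) {
>       throw new IOException("FI: injected fault at "
>           + thisJoinPoint.getStaticPart().getSourceLocation());
>     }
>   }
> }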
>
>
> On Fri, Nov 20, 2009 at 3:21 PM, Konstantin Boudnik <cos@yahoo-inc.com> wrote:
>
>     Generally, the idea was that what the current build.xml has in Common
>     and HDFS provides everything needed for injection. Would you mind
>     sharing what extra changes you needed, and why?
>
>     Cos
>
>
>     On 11/20/09 12:32, Thanh Do wrote:
>
>         Thank you folks!
>
>         Finally, I am able (really) to run FI with HADOOP. I added some
>         aspects
>         into the source code, changed the build.xml, and that's it.
>
>         AspectJ is awesome!
>
>         Have a nice weekend!
>
>         On Fri, Nov 20, 2009 at 1:08 PM, Konstantin Boudnik
>         <cos@yahoo-inc.com> wrote:
>
>             Hi Thanh.
>
>             Hmm, it sounds like you have some issue with the compilation of
>             your code.
>
>             addDeprecation() was added to Configuration in 0.21, I believe.
>             And it is there no matter how you compile your code (with FI or
>             without).
>
>             Cos
>
>
>             On 11/19/09 10:12, Thanh Do wrote:
>
>                 Sorry to dig up this thread again!
>
>                 I am waiting for the 0.21 release so that I don't have to
>                 manually play around with the AspectJ FI any more.
>
>                 I still have a problem running HDFS with instrumented code
>                 (with aspects).
>
>                 Here is what I did. In the root directory of HDFS:
>
>                 $ ant injectfaults
>                 $ ant jar-fault-inject
>
>                 At this point, I have a jar file containing the HDFS
>                 classes, namely hadoop-hdfs-0.22.0-dev-fi.jar, located in
>                 the build-fi folder.
>
>                 Now I go to the HADOOP folder (which contains the run
>                 scripts in the bin directory) and do the following:
>
>                 $ ant compile-core-classes
>
>                 (Now I need the additional HDFS classes to be able to run
>                 start-dfs.sh, right?)
>
>                 What I did is copy
>                 $HDFS/build-fi/hadoop-hdfs-0.22.0-dev-fi.jar to
>                 $HADOOP/hadoop-hdfs-fi-core.jar (I need to add the suffix
>                 "core" since the script includes all hadoop-*-core.jar
>                 files in the classpath).
>
>                 $ bin/start-dfs.sh
>
>                 and got the error message:
>
>                 2009-11-19 11:52:57,479 ERROR
>                 org.apache.hadoop.hdfs.server.namenode.NameNode:
>                 java.lang.NoSuchMethodError:
>                 org.apache.hadoop.conf.Configuration.addDeprecation(Ljava/lang/String;[Ljava/lang/String;)V
>                     at org.apache.hadoop.hdfs.HdfsConfiguration.deprecate(HdfsConfiguration.java:44)
>                     at org.apache.hadoop.hdfs.HdfsConfiguration.addDeprecatedKeys(HdfsConfiguration.java:48)
>                     at org.apache.hadoop.hdfs.HdfsConfiguration.<clinit>(HdfsConfiguration.java:28)
>                     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1169)
>                     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1199)
>
>                 2009-11-19 11:52:57,480 INFO
>                 org.apache.hadoop.hdfs.server.namenode.NameNode:
>         SHUTDOWN_MSG:
>
>                 Could anyone tell me how to solve this problem?
>
>                 Thank you so much.
>
>
>                 On Thu, Oct 8, 2009 at 10:41 AM, Konstantin Boudnik
>                 <cos@yahoo-inc.com> wrote:
>
>                     Thanks for looking into fault injection - it's a very
>                     interesting and useful technique based on AspectJ.
>
>                     Currently, it is fully integrated into HDFS only.
>                     There's a JIRA (HADOOP-6204) which tracks the same
>                     effort for Common; after that, all of Hadoop's
>                     components will have injection (as well as fault
>                     injection) in place. This JIRA should be committed in a
>                     matter of a couple of weeks.
>
>                     For the immediate purpose you don't need to patch
>                     anything or do any tweaking of the code: the fault
>                     injection framework is already in place and ready to
>                     work.
>
>                     For your current needs: to be able to run HDFS with
>                     instrumented code you need to run a special build. To
>                     do so:
>                       - % ant injectfaults - similar to a 'normal' build,
>                     but instruments the code with the aspects located under
>                     src/test/aop/**
>                       - % ant jar-fault-inject - similar to a 'normal' jar
>                     creation, but instrumented
>                       - % ant jar-test-fault-inject - similar to a 'normal'
>                     jar-test creation, but instrumented
>
>                     Now, if you have the rest of the sub-projects built,
>                     you need to move the instrumented jar files on top of
>                     the 'normal' files in your installation directory.
>                     Please note that some renaming has to be done: injected
>                     jar files have a '-fi' suffix in their names and normal
>                     jar files don't. Thus, currently you'll have to rename
>                     the injected jars so that they look like the normal
>                     ones picked up by the configured classpath.
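>
>                     For example (the jar names and $HADOOP_HOME here are
>                     only illustrative; they depend on the version you build
>                     and where your installation lives):
>
>                       % ant injectfaults
>                       % ant jar-fault-inject
>                       % cp build-fi/hadoop-hdfs-0.22.0-dev-fi.jar \
>                            $HADOOP_HOME/hadoop-hdfs-0.22.0-dev.jar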
>
>                     At this point you're all set: you have a
>                     production-quality Hadoop with injected HDFS. As soon
>                     as the aforementioned JIRA is ready and committed,
>                     we'll be able to provide an injected Hadoop by the
>                     build's own means, rather than through renaming and
>                     manual intervention.
>
>                     Also, if you want to read more about fault injection
>                     (FI) in HDFS, you can find the FI-framework
>                     documentation in the current HDFS trunk (it isn't on
>                     the web yet, because version 0.21 hasn't been
>                     released). Because building the documentation requires
>                     some extra effort and additional software to be
>                     installed, you can simply download and read the PDF
>                     from the FI-framework JIRA:
>
>         https://issues.apache.org/jira/secure/attachment/12414225/Fault+injection+development+guide+and+Framework+HowTo.pdf
>
>                     Hope it helps,
>                       Cos
>
>
>                     On 10/8/09 8:10 AM, Thanh Do wrote:
>
>                         Thank you so much, Jakob.
>
>                         Could you please explain the fault injection
>                         running procedure in detail?
>
>                         My goal is to run HDFS in a cluster (with a
>                         namenode and several datanodes) and to see how
>                         fault injection techniques affect HDFS's behavior.
>                         Also, I would like to define some new
>                         aspects/faults to test the system.
>
>                         What I did was:
>                         1) I checked out hadoop-common-trunk, but this
>                         package doesn't contain the HDFS classes. I finally
>                         noticed that the FI framework is currently
>                         integrated with HDFS only.
>
>                         2) So, I checked out hdfs-trunk. Its build.xml
>                         contains the injectfaults target and several other
>                         related things. I was able to build those targets
>                         (injectfaults, run-test-hdfs-fault-inject, etc.).
>                         At this point I got stuck, because I found no
>                         script to help me start-dfs, stop-dfs, and so on.
>                         I copied the bin folder from common/core to the
>                         HDFS project folder and ran the script:
>
>                         bin/start-dfs.sh
>
>
>                         but there is an exception:
>
>                         Exception in thread "main"
>                         java.lang.NoClassDefFoundError:
>                         org/apache/commons/logging/LogFactory
>
>                         I guess the reason is that I ran HDFS without any
>                         Common classes on the classpath. How do I get
>                         around this?
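>
>                         (Presumably putting the Common build output on the
>                         classpath the scripts read would work around it,
>                         e.g. something like
>
>                         export HADOOP_CLASSPATH=$COMMON/build/hadoop-common-0.22.0-dev.jar
>
>                         where $COMMON stands for my Common checkout, but I
>                         am not sure that is the intended setup.)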
>
>                         3) I also tried a third way: downloading the Hadoop
>                         release (containing everything: core, hdfs, mapred)
>                         and using Eclipse to create a project from the
>                         existing code. I was able to build this project.
>                         The bin scripts worked well, but I found no
>                         FI-related classes. What I did was apply the patch
>                         (HADOOP-6003.patch) using the Eclipse patch command
>                         (Team | Apply Patch), but the patching procedure
>                         failed.
>
>                         In summary, I would like to run a real HDFS with
>                         fault injection, but I am not very familiar with
>                         Ant. Could you please give me some more details so
>                         that I can get around this?
>
>                         On Thu, Oct 8, 2009 at 12:19 AM, Jakob Homan
>                         <jhoman@yahoo-inc.com> wrote:
>
>                             Thanh-
>                             If you would like to run the tests that have
>                             been instrumented to use the fault injection
>                             framework, the ant target is
>                             run-test-hdfs-fault-inject. These were used
>                             extensively in the recent append work, and
>                             there are quite a few append-related tests.
>                             Was there something more specific you were
>                             looking for?
>
>                             Thanks,
>                             Jakob
>                             Hadoop at Yahoo!
>
>
>                             Thanh Do wrote:
>
>                                 Hi everyone,
>
>                                 Could anybody show me how to run the fault
>                                 injection framework mentioned in the
>                                 following links?
>
>         http://issues.apache.org/jira/browse/HDFS-435
>
>                                 and
>
>         https://issues.apache.org/jira/browse/HDFS-436
>
>                                 Thanks,
>                                 Thanh
>
>
>
>
>
>
>                         --
>                         T
>
>
>                     --
>                     With best regards,
>                             Konstantin Boudnik (aka Cos)
>
>                             Yahoo! Grid Computing
>                             +1 (408) 349-4049
>
>                     2CAC 8312 4870 D885 8616  6115 220F 6980 1F27 E622
>                     Attention! Streams of consciousness are disallowed
>
>
>
>
>                 --
>                 thanh
>
>
>
>
>         --
>         thanh
>
>
>
>
> --
> thanh
