hadoop-hdfs-dev mailing list archives

From Mahadev Konar <maha...@hortonworks.com>
Subject Re: Failing trunk builds for HDFS.
Date Tue, 16 Aug 2011 03:17:17 GMT
Thanks Alejandro. 

I'll leave the HDFS nightly alone now :).

BTW, the last error (https://builds.apache.org/job/Hadoop-Hdfs-trunk/752/console), which Todd
also mentioned, went away when I re-triggered the build. So there is definitely an issue with
pulling the common artifacts for HDFS.

thanks
mahadev

On Aug 15, 2011, at 8:03 PM, Alejandro Abdelnur wrote:

> Mahadev,
> 
> AOP stuff is not wired yet in the common Mavenization, so the instrumented JAR is
> not being created/deployed.
> 
> From some of the messages, it seems related to that.
> 
> HDFS Mavenization (HDFS-2096), which has been +1'd, does not attempt to run
> the AOP stuff either, so it would 'fix' this build failure for now. Later,
> when AOP is wired into the Mavenization, the fault-injection tests will be
> back in the build.
> 
> At the moment, per Arun's request, we are holding off on HDFS-2096 until
> MR-279 goes in.
> 
> Thanks.
> 
> Alejandro
> 
> 
> On Mon, Aug 15, 2011 at 7:54 PM, Mahadev Konar <mahadev@hortonworks.com> wrote:
> 
>> I just tried the hdfs build on Apache Jenkins. It's still failing:
>> 
>> https://builds.apache.org/job/Hadoop-Hdfs-trunk/752/console
>> 
>> Looks like the issue you were noticing, Todd.
>> 
>> Also, is https://issues.apache.org/jira/browse/HDFS-2261 an issue for
>> the builds?
>> 
>> thanks
>> mahadev
>> On Aug 15, 2011, at 4:58 PM, Todd Lipcon wrote:
>> 
>>> I'm having some related issues locally... it seems every time Hudson
>>> publishes a new build, my local build breaks with something like this:
>>> 
>>> 
>>> todd@todd-w510:~/git/hadoop-common/hdfs$ ant clean test
>>> Buildfile: /home/todd/git/hadoop-common/hdfs/build.xml
>>> 
>>> clean-contrib:
>>> 
>>> clean:
>>> 
>>> clean:
>>>    [echo] contrib: fuse-dfs
>>> 
>>> clean-fi:
>>> 
>>> clean-sign:
>>> 
>>> clean:
>>>  [delete] Deleting directory /home/todd/git/hadoop-common/hdfs/build
>>> 
>>> ivy-download:
>>>     [get] Getting:
>>> 
>> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0-rc1/ivy-2.2.0-rc1.jar
>>>     [get] To: /home/todd/git/hadoop-common/hdfs/ivy/ivy-2.2.0-rc1.jar
>>>     [get] Not modified - so not downloaded
>>> 
>>> ivy-init-dirs:
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/ivy
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/ivy/lib
>>>   [mkdir] Created dir:
>> /home/todd/git/hadoop-common/hdfs/build/ivy/report
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/ivy/maven
>>> 
>>> ivy-probe-antlib:
>>> 
>>> ivy-init-antlib:
>>> 
>>> ivy-init:
>>> [ivy:configure] :: Ivy 2.2.0-rc1 - 20100629224905 ::
>>> http://ant.apache.org/ivy/ ::
>>> [ivy:configure] :: loading settings :: file =
>>> /home/todd/git/hadoop-common/hdfs/ivy/ivysettings.xml
>>> 
>>> ivy-resolve-common:
>>> [ivy:resolve] downloading
>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-annotations/0.23.0-SNAPSHOT/hadoop-annotations-0.23.0-20110808.090045-15.jar
>>> ...
>>> [ivy:resolve] ... (14kB)
>>> [ivy:resolve] .. (0kB)
>>> [ivy:resolve]   [SUCCESSFUL ]
>>> 
>> org.apache.hadoop#hadoop-annotations;0.23.0-SNAPSHOT!hadoop-annotations.jar
>>> (458ms)
>>> [ivy:resolve] downloading
>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-20110815.225725-267.jar
>>> ...
>>> [ivy:resolve]
>> .....................................................................................................................
>>> [ivy:resolve]
>> ....................................................................................................
>>> (1667kB)
>>> [ivy:resolve] .. (0kB)
>>> [ivy:resolve]   [SUCCESSFUL ]
>>> org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar
>>> (2796ms)
>>> 
>>> ivy-retrieve-common:
>>> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
>>> 'ivy.settings.file' instead
>>> [ivy:cachepath] :: loading settings :: file =
>>> /home/todd/git/hadoop-common/hdfs/ivy/ivysettings.xml
>>> 
>>> ivy-resolve-hdfs:
>>> [ivy:resolve] downloading
>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-annotations/0.23.0-SNAPSHOT/hadoop-annotations-0.23.0-20110808.090045-15.jar
>>> ...
>>> [ivy:resolve] ... (14kB)
>>> [ivy:resolve] .. (0kB)
>>> [ivy:resolve]   [SUCCESSFUL ]
>>> 
>> org.apache.hadoop#hadoop-annotations;0.23.0-SNAPSHOT!hadoop-annotations.jar
>>> (193ms)
>>> 
>>> ivy-retrieve-hdfs:
>>> 
>>> ivy-resolve-test:
>>> 
>>> ivy-retrieve-test:
>>> 
>>> init:
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/classes
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/src
>>>   [mkdir] Created dir:
>>> /home/todd/git/hadoop-common/hdfs/build/web/webapps/hdfs/WEB-INF
>>>   [mkdir] Created dir:
>>> /home/todd/git/hadoop-common/hdfs/build/web/webapps/datanode/WEB-INF
>>>   [mkdir] Created dir:
>>> /home/todd/git/hadoop-common/hdfs/build/web/webapps/secondary/WEB-INF
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/ant
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/c++
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/test
>>>   [mkdir] Created dir:
>>> /home/todd/git/hadoop-common/hdfs/build/test/hdfs/classes
>>>   [mkdir] Created dir:
>> /home/todd/git/hadoop-common/hdfs/build/test/extraconf
>>>   [touch] Creating /tmp/null1539574669
>>>  [delete] Deleting: /tmp/null1539574669
>>>    [copy] Copying 6 files to
>>> /home/todd/git/hadoop-common/hdfs/build/web/webapps
>>>    [copy] Copying 1 file to /home/todd/git/hadoop-common/hdfs/conf
>>>    [copy] Copying
>>> /home/todd/git/hadoop-common/hdfs/conf/hdfs-site.xml.template to
>>> /home/todd/git/hadoop-common/hdfs/conf/hdfs-site.xml
>>>   [mkdir] Created dir: /home/todd/git/hadoop-common/hdfs/build/test/conf
>>>    [copy] Copying 1 file to
>> /home/todd/git/hadoop-common/hdfs/build/test/conf
>>>    [copy] Copying
>>> /home/todd/git/hadoop-common/hdfs/conf/hdfs-site.xml.template to
>>> /home/todd/git/hadoop-common/hdfs/build/test/conf/hdfs-site.xml
>>>    [copy] Copying 2 files to
>> /home/todd/git/hadoop-common/hdfs/build/test/conf
>>>    [copy] Copying
>>> /home/todd/git/hadoop-common/hdfs/conf/hadoop-metrics2.properties to
>>> 
>> /home/todd/git/hadoop-common/hdfs/build/test/conf/hadoop-metrics2.properties
>>>    [copy] Copying
>>> /home/todd/git/hadoop-common/hdfs/conf/log4j.properties to
>>> /home/todd/git/hadoop-common/hdfs/build/test/conf/log4j.properties
>>> 
>>> check-libhdfs-makefile:
>>> 
>>> create-libhdfs-makefile:
>>> 
>>> compile-c++-libhdfs:
>>> 
>>> clover.setup:
>>> 
>>> clover.info:
>>>    [echo]
>>>    [echo]      Clover not found. Code coverage reports disabled.
>>>    [echo]
>>> 
>>> clover:
>>> 
>>> compile-hdfs-classes:
>>>   [javac] /home/todd/git/hadoop-common/hdfs/build.xml:370: warning:
>>> 'includeantruntime' was not set, defaulting to
>>> build.sysclasspath=last; set to false for repeatable builds
>>>   [javac] Compiling 282 source files to
>>> /home/todd/git/hadoop-common/hdfs/build/classes
>>>   [javac]
>> /home/todd/git/hadoop-common/hdfs/src/java/org/apache/hadoop/fs/Hdfs.java:33:
>>> package org.apache.hadoop.conf does not exist
>>>   [javac] import org.apache.hadoop.conf.Configuration;
>>> 
>>> ... [lots more errors where o.a.h.* from common can't be found] ...
>>> 
>>> and yet:
>>> 
>>> todd@todd-w510:~/git/hadoop-common/hdfs$ find . -name \*common\*23\*jar
>>> ./build/ivy/lib/hadoop-hdfs/common/hadoop-common-0.23.0-SNAPSHOT.jar
>>> ./build/ivy/lib/hadoop-hdfs/test/hadoop-common-0.23.0-SNAPSHOT-tests.jar
>>> 
>>> and then, when I run the exact same build command again, I see:
>>> 
>>> ivy-resolve-common:
>>> [ivy:resolve] downloading
>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-20110815.225725-267.jar
>>> ...
>>> [ivy:resolve]
>> ...............................................................................................................................
>>> [ivy:resolve]
>> .........................................................................................
>>> (1667kB)
>>> [ivy:resolve] .. (0kB)
>>> [ivy:resolve]   [SUCCESSFUL ]
>>> org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar
>>> (2661ms)
>>> 
>>> and it builds fine.
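>>>
>>> [Editor's note: a possible workaround for the stale-resolve behavior Todd
>>> describes. This is only a sketch; the cache location (~/.ivy2/cache by
>>> default) and the helper name are assumptions, not something established
>>> in this thread.]

```shell
# Hypothetical helper: delete cached Hadoop SNAPSHOT artifacts so Ivy's next
# resolve is forced to re-download the freshly published snapshot instead of
# compiling against a half-stale mix of old and new jars.
purge_hadoop_snapshots() {
  # Cache root defaults to Ivy's standard location; override with $1.
  local cache="${1:-$HOME/.ivy2/cache}"
  # -print shows what gets removed; stderr is silenced in case the
  # org.apache.hadoop directory does not exist in this cache yet.
  find "$cache/org.apache.hadoop" -name '*SNAPSHOT*' -print -delete 2>/dev/null
}
```

Afterwards, re-running `ant clean test` should pull the snapshot jars cleanly on the first attempt.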
>>> 
>>> 
>>> On Mon, Aug 15, 2011 at 4:08 PM, Giridharan Kesavan
>>> <gkesavan@hortonworks.com> wrote:
>>>> On Mon, Aug 15, 2011 at 3:52 PM, Eli Collins <eli@cloudera.com> wrote:
>>>>> On Mon, Aug 15, 2011 at 3:45 PM, Giridharan Kesavan
>>>>> <gkesavan@hortonworks.com> wrote:
>>>>>> Hi Eli,
>>>>>> 
>>>>>> you are right, I'm talking about the Apache Jenkins HDFS build failures.
>>>>>> 
>>>>>> I'm pretty sure HDFS is picking up the latest hadoop-common jars. I
>>>>>> verified this with the apache repo as well.
>>>>> 
>>>>> How are you building? The target that it's claiming doesn't exist
>>>>> definitely does.
>>>> 
>>>> The target doesn't exist in the build.xml; it's part of the fault-injection
>>>> framework and is imported from "trunk/hdfs/src/test/aop/build/aop.xml".
>>>> 
>>>> you can see this import in the build.xml file
>>>> <import file="${test.src.dir}/aop/build/aop.xml"/>
>>>> 
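>>>>
>>>> [Editor's note: a minimal, hypothetical illustration of the <import>
>>>> mechanism Giri describes; only the aop.xml path comes from the thread,
>>>> the project and target names here are invented.]

```xml
<!-- build.xml: note that no fault-injection target is declared here. -->
<project name="example" default="jar">
  <!-- Targets defined in the imported file become callable as if they were
       declared in this file. That is why `ant run-test-hdfs-fault-inject`
       works even though build.xml itself never defines that target. -->
  <import file="${test.src.dir}/aop/build/aop.xml"/>
  <target name="jar"/>
</project>
```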
>>>>> 
>>>>> The following works on trunk, so I think it's an issue with how
>>>>> Jenkins is running it.
>>>>> 
>>>>> hadoop-common $ mvn clean
>>>>> hadoop-common $ mvn install -DskipTests
>>>>> hadoop-common $ pushd ../hdfs
>>>>> hdfs $ ant clean
>>>>> hdfs $ ant -Dresolvers=internal jar
>>>>> hdfs $ ant run-test-hdfs-fault-inject
>>>> 
>>>> I think you should pass "-Dresolvers=internal" to the
>>>> run-test-hdfs-fault-inject target as well.
>>>> 
>>>> Thanks,
>>>> giri
>>>> 
>>>>> 
>>>>> Thanks,
>>>>> Eli
>>>>> 
>>>>>> 
>>>>>> 
>> https://repository.apache.org/content/groups/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT
>>>>>> 
>>>>>> hadoop-common-0.23.0-20110815.215733-266-tests.jar
>>>>>> 
>>>>>> hadoop-common-0.23.0-20110815.215733-266.jar
>>>>>> 
>>>>>> 
>> https://builds.apache.org/view/G-L/view/Hadoop/job/Hadoop-Hdfs-trunk-Commit/837/console
>>>>>> 
>>>>>> [ivy:resolve] .. (0kB)
>>>>>> 
>>>>>> [ivy:resolve]   [SUCCESSFUL ] org.apache.hadoop#avro;1.3.2!avro.jar
>> (1011ms)
>>>>>> [ivy:resolve] downloading
>>>>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-20110815.215733-266.jar
>>>>>> ...
>>>>>> 
>>>>>> [ivy:resolve]
>> ........................................................................................................................................................................................................................
>>>>>> (1667kB)
>>>>>> [ivy:resolve] .. (0kB)
>>>>>> [ivy:resolve]   [SUCCESSFUL ]
>>>>>> org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar
>>>>>> (1549ms)
>>>>>> 
>>>>>> ivy-retrieve-common:
>>>>>> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
>>>>>> 'ivy.settings.file' instead
>>>>>> [ivy:cachepath] :: loading settings :: file =
>>>>>> 
>> /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/ivy/ivysettings.xml
>>>>>> 
>>>>>> ivy-resolve-hdfs:
>>>>>> 
>>>>>> ivy-retrieve-hdfs:
>>>>>> 
>>>>>> ivy-resolve-test:
>>>>>> 
>>>>>> [ivy:resolve] downloading
>>>>>> 
>> https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-20110815.215733-266-tests.jar
>>>>>> ...
>>>>>> [ivy:resolve]
>> ........................................................................................................................
>>>>>> (876kB)
>>>>>> 
>>>>>> [ivy:resolve] .. (0kB)
>>>>>> [ivy:resolve]   [SUCCESSFUL ]
>>>>>> 
>> org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar(tests)
>>>>>> (875ms)
>>>>>> 
>>>>>> 
>>>>>> On Mon, Aug 15, 2011 at 3:33 PM, Eli Collins <eli@cloudera.com>
>> wrote:
>>>>>>> 
>>>>>>> Hey Giri,
>>>>>>> 
>>>>>>> This looks like a similar issue to what was hitting the main Jenkins
>>>>>>> job; the HDFS job isn't picking up the latest bits from common.
>>>>>>> 
>>>>>>> Thanks,
>>>>>>> Eli
>>>>>>> 
>>>>>>> On Mon, Aug 15, 2011 at 3:27 PM, Giridharan Kesavan
>>>>>>> <gkesavan@hortonworks.com> wrote:
>>>>>>>> Todd,
>>>>>>>> 
>>>>>>>> Could you please take a look at this ?
>>>>>>>> 
>>>>>>>> https://issues.apache.org/jira/browse/HDFS-2261
>>>>>>>> 
>>>>>>>> 
>>>>>>>> -Giri
>>>>>>>> On Mon, Aug 15, 2011 at 3:24 PM, Todd Lipcon <todd@cloudera.com>
>> wrote:
>>>>>>>> 
>>>>>>>>> Seems like some of it is a build issue where it can't find ant.
>>>>>>>>> 
>>>>>>>>> The other is the following:
>>>>>>>>> https://issues.apache.org/jira/browse/HADOOP-7545
>>>>>>>>> Please review.
>>>>>>>>> 
>>>>>>>>> Thanks
>>>>>>>>> -Todd
>>>>>>>>> 
>>>>>>>>> On Mon, Aug 15, 2011 at 2:54 PM, Mahadev Konar <
>> mahadev@hortonworks.com>
>>>>>>>>> wrote:
>>>>>>>>>> Hi folks,
>>>>>>>>>> Can anyone take a look at the hdfs builds? They seem to be failing:
>>>>>>>>>> 
>>>>>>>>>> https://builds.apache.org/job/Hadoop-Hdfs-trunk/
>>>>>>>>>> 
>>>>>>>>>> thanks
>>>>>>>>>> mahadev
>>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> --
>>>>>>>>> Todd Lipcon
>>>>>>>>> Software Engineer, Cloudera
>>>>>>>>> 
>>>>>>>> 
>>>>>> 
>>>>> 
>>>> 
>>> 
>>> 
>>> 
>>> --
>>> Todd Lipcon
>>> Software Engineer, Cloudera
>> 
>> 

