hadoop-general mailing list archives

From Ian Holsman <had...@holsman.net>
Subject Re: Hadoop-common-trunk-Commit is failing since 01/19/2011
Date Tue, 08 Feb 2011 19:31:12 GMT
Hi Nige,
I've subscribed it now, as it wasn't on the subscriber list. We do have hudson@lucene.zones.apache.org,
but I haven't seen any moderation notices for it either, so I'm not sure it is generating emails.

On Feb 9, 2011, at 3:32 AM, Nigel Daley wrote:

> Hmm, I haven't seen Hudson post build failures to the common-dev list lately.
> 
> Ian, can you check that hudson@hudson.apache.org is still subscribed to common-dev@? If not, please add it.
> 
> Thx,
> Nige
> 
> 
> On Feb 1, 2011, at 11:27 AM, Giridharan Kesavan wrote:
> 
>> Konstantin,
>> 
>> trunk/artifacts gets populated when the jar and the tar ant targets succeed.
>> 
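>> (A rough sketch of that sequence, assuming the Hudson job simply invokes
>> those targets from the common trunk checkout:
>> 
>>     ant clean jar tar
>> 
>> On success the jars and tarball land under build/, which the job then
>> archives as trunk/artifacts.)
>> 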
>> The main reason for the build failures so far was the build abort time configuration; it was set to 30 minutes.
>> I have increased the build abort time, and the builds are now running fine:
>> https://hudson.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-Common-trunk-Commit
>> 
>> 
>> Thanks,
>> Giri
>> 
>> On Feb 1, 2011, at 12:40 AM, Konstantin Shvachko wrote:
>> 
>>> Giri,
>>> 
>>> Looking at the configuration of Hadoop-Common-trunk-Commit,
>>> there seem to be errors in the Post-build Actions.
>>> It is complaining that
>>> 'trunk' exists but not 'trunk/artifacts/...'
>>> Is it possible that this misconfiguration is the reason for the failures?
>>> 
>>> --Konstantin
>>> 
>>> 
>>> On Mon, Jan 31, 2011 at 4:40 PM, Giridharan Kesavan
>>> <gkesavan@yahoo-inc.com> wrote:
>>> 
>>>> Konstantin,
>>>> 
>>>> I think I need to restart the slave which is running the commit build. For
>>>> now I have published the common artifact manually from the command line.
>>>> 
>>>> Thanks,
>>>> Giri
>>>> 
>>>> On Jan 31, 2011, at 4:27 PM, Konstantin Shvachko wrote:
>>>> 
>>>>> Giri
>>>>> looks like the last run you started failed the same way as previous ones.
>>>>> Any thoughts on what's going on?
>>>>> Thanks,
>>>>> --Konstantin
>>>>> 
>>>>> On Mon, Jan 31, 2011 at 3:33 PM, Giridharan Kesavan
>>>>> <gkesavan@yahoo-inc.com> wrote:
>>>>> 
>>>>>> ant mvn-deploy publishes the snapshot artifact to the Apache maven
>>>>>> repository, as long as you have the right credentials in ~/.m2/settings.xml.
>>>>>> 
>>>>>> For a settings.xml template, please see
>>>>>> http://wiki.apache.org/hadoop/HowToRelease
>>>>>> 
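>>>>>> (For reference, a minimal sketch of the relevant part of that file; the
>>>>>> server id below is an assumption, the wiki page has the authoritative
>>>>>> template:
>>>>>> 
>>>>>>     <settings>
>>>>>>       <servers>
>>>>>>         <server>
>>>>>>           <id>apache.snapshots.https</id>
>>>>>>           <username>your-apache-id</username>
>>>>>>           <password>your-apache-password</password>
>>>>>>         </server>
>>>>>>       </servers>
>>>>>>     </settings>
>>>>>> 
>>>>>> ant mvn-deploy then picks up these credentials when publishing.)
>>>>>> 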
>>>>>> I'm pushing the latest common artifacts now.
>>>>>> 
>>>>>> -Giri
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Jan 31, 2011, at 3:11 PM, Jakob Homan wrote:
>>>>>> 
>>>>>>> By manually installing a new core jar into the cache, I can compile
>>>>>>> trunk.  Looks like we just need to kick a new Core into maven.  Are
>>>>>>> there instructions somewhere for committers to do this?  I know Nigel
>>>>>>> and Owen know how, but I don't know if the knowledge is diffused past
>>>>>>> them.
>>>>>>> -Jakob
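>>>>>>> 
>>>>>>> (A hedged sketch of that kind of manual install into the local cache;
>>>>>>> the coordinates and version below are placeholders, not the real ones:
>>>>>>> 
>>>>>>>     mvn install:install-file -Dpackaging=jar \
>>>>>>>       -DgroupId=org.apache.hadoop -DartifactId=hadoop-common \
>>>>>>>       -Dversion=X.Y-SNAPSHOT -Dfile=build/hadoop-common-X.Y-SNAPSHOT.jar
>>>>>>> 
>>>>>>> That drops the jar into ~/.m2 so downstream builds resolve against it.)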
>>>>>>> 
>>>>>>> 
>>>>>>> On Mon, Jan 31, 2011 at 1:57 PM, Konstantin Shvachko
>>>>>>> <shv.hadoop@gmail.com> wrote:
>>>>>>>> The current trunks for HDFS and MapReduce are not compiling at the moment.
>>>>>>>> Try to build trunk.
>>>>>>>> This is because the changes to the common API introduced by HADOOP-6904
>>>>>>>> are not promoted to the HDFS and MR trunks.
>>>>>>>> HDFS-1335 and MAPREDUCE-2263 depend on these changes.
>>>>>>>> 
>>>>>>>> Common is not promoted to HDFS and MR because the Hadoop-Common-trunk-Commit
>>>>>>>> build is broken. See here:
>>>>>>>> 
>>>>>>>> https://hudson.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-Common-trunk-Commit/
>>>>>>>> 
>>>>>>>> As far as I can see, the last successful build was on 01/19, which integrated
>>>>>>>> HADOOP-6864.
>>>>>>>> I think this is when the JNI changes were introduced, which Hudson has not
>>>>>>>> been able to digest since then.
>>>>>>>> 
>>>>>>>> Could anybody with gcc available please verify whether the problem is caused
>>>>>>>> by HADOOP-6864?
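>>>>>>>> 
>>>>>>>> (Roughly, something like the following on a box with gcc; the property
>>>>>>>> and target names here are assumptions, check build.xml for the exact ones:
>>>>>>>> 
>>>>>>>>     cd common/trunk
>>>>>>>>     ant -Dcompile.native=true compile-native
>>>>>>>> 
>>>>>>>> and see whether the native configure step hangs the same way it does on
>>>>>>>> Hudson.)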
>>>>>>>> 
>>>>>>>> Thanks,
>>>>>>>> --Konstantin
>>>>>>>> 
>>>>>>>> On Mon, Jan 31, 2011 at 1:36 PM, Ted Dunning <tdunning@maprtech.com> wrote:
>>>>>>>> 
>>>>>>>>> There has been a problem with more than one build failing (Mahout is the one
>>>>>>>>> that I saw first) due to a change in Maven version, which meant that the
>>>>>>>>> Clover license isn't being found properly.  At least, that is the tale I
>>>>>>>>> heard from infra.
>>>>>>>>> 
>>>>>>>>> On Mon, Jan 31, 2011 at 1:31 PM, Eli Collins <eli@cloudera.com> wrote:
>>>>>>>>> 
>>>>>>>>>> Hey Konstantin,
>>>>>>>>>> 
>>>>>>>>>> The only build breakage I saw from HADOOP-6904 is MAPREDUCE-2290,
>>>>>>>>>> which was fixed.  Trees from trunk are compiling against each other
>>>>>>>>>> for me (e.g. each installed to a local maven repo); perhaps the upstream
>>>>>>>>>> maven repo hasn't been updated with the latest bits yet.
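>>>>>>>>>> 
>>>>>>>>>> (A rough sketch of that workflow; the target, property and path names
>>>>>>>>>> are assumptions:
>>>>>>>>>> 
>>>>>>>>>>     cd common/trunk && ant mvn-install       # install common into ~/.m2
>>>>>>>>>>     cd ../../hdfs/trunk && ant -Dresolvers=internal clean compile
>>>>>>>>>> 
>>>>>>>>>> i.e. hdfs and mapreduce pick up the locally installed common rather
>>>>>>>>>> than the upstream snapshot.)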
>>>>>>>>>> 
>>>>>>>>>> Thanks,
>>>>>>>>>> Eli
>>>>>>>>>> 
>>>>>>>>>> On Mon, Jan 31, 2011 at 12:14 PM, Konstantin Shvachko
>>>>>>>>>> <shv.hadoop@gmail.com> wrote:
>>>>>>>>>>> Sending this to general to attract urgent attention.
>>>>>>>>>>> Both HDFS and MapReduce have not been compiling since
>>>>>>>>>>> HADOOP-6904 and its HDFS and MR counterparts were committed.
>>>>>>>>>>> The problem is not with this patch, as described below, but I think those
>>>>>>>>>>> commits should be reverted if the Common integration build cannot be
>>>>>>>>>>> restored promptly.
>>>>>>>>>>> 
>>>>>>>>>>> Thanks,
>>>>>>>>>>> --Konstantin
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> On Fri, Jan 28, 2011 at 5:53 PM, Konstantin Shvachko
>>>>>>>>>>> <shv.hadoop@gmail.com> wrote:
>>>>>>>>>>> 
>>>>>>>>>>>> I see Hadoop-common-trunk-Commit is failing and not sending any emails.
>>>>>>>>>>>> It times out on native compilation and aborts.
>>>>>>>>>>>> Therefore changes are not integrated, and this has now led to both hdfs and
>>>>>>>>>>>> mapreduce failing to compile.
>>>>>>>>>>>> Can somebody please take a look at this?
>>>>>>>>>>>> The last few lines of the build are below.
>>>>>>>>>>>> 
>>>>>>>>>>>> Thanks
>>>>>>>>>>>> --Konstantin
>>>>>>>>>>>> 
>>>>>>>>>>>> [javah] [Loaded /grid/0/hudson/hudson-slave/workspace/Hadoop-Common-trunk-Commit/trunk/build/classes/org/apache/hadoop/security/JniBasedUnixGroupsMapping.class]
>>>>>>>>>>>> 
>>>>>>>>>>>> [javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
>>>>>>>>>>>> [javah] [Forcefully writing file /grid/0/hudson/hudson-slave/workspace/Hadoop-Common-trunk-Commit/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsNetgroupMapping.h]
>>>>>>>>>>>> 
>>>>>>>>>>>>  [exec] checking for gcc... gcc
>>>>>>>>>>>>  [exec] checking whether the C compiler works... yes
>>>>>>>>>>>>  [exec] checking for C compiler default output file name... a.out
>>>>>>>>>>>>  [exec] checking for suffix of executables...
>>>>>>>>>>>> 
>>>>>>>>>>>> Build timed out. Aborting
>>>>>>>>>>>> Build was aborted
>>>>>>>>>>>> [FINDBUGS] Skipping publisher since build result is ABORTED
>>>>>>>>>>>> Publishing Javadoc
>>>>>>>>>>>> Archiving artifacts
>>>>>>>>>>>> Recording test results
>>>>>>>>>>>> No test report files were found. Configuration error?
>>>>>>>>>>>> 
>>>>>>>>>>>> Recording fingerprints
>>>>>>>>>>>>  [exec] Terminated
>>>>>>>>>>>> Publishing Clover coverage report...
>>>>>>>>>>>> No Clover report will be published due to a Build Failure
>>>>>>>>>>>> No emails were triggered.
>>>>>>>>>>>> Finished: ABORTED
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>>> 
>>>>>>>> 
>>>>>> 
>>>>>> 
>>>> 
>>>> 
>> 
> 

