From: Eric Payne
To: Eli Collins
Cc: hdfs-dev@hadoop.apache.org, Tom White
Date: Fri, 12 Aug 2011 11:26:58 -0700
Subject: RE: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Thanks Eli.

I have resolvers=internal in my $HOME/build.properties file. Is that enough, or should I also put -Dresolvers=internal on the command line?

Thanks,
-Eric

-----Original Message-----
From: Eli Collins [mailto:eli@cloudera.com]
Sent: Friday, August 12, 2011 12:06 PM
To: Eric Payne
Cc: hdfs-dev@hadoop.apache.org; Tom White
Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

You need to build hdfs with -Dresolvers=internal after running mvn
install -DskipTests in common.

On Fri, Aug 12, 2011 at 9:51 AM, Eric Payne wrote:
> I'm seeing this error when I try to build a fresh checkout.
>
> I can get around it by removing the .m2 directory in my $HOME directory and then running 'mvn install -DskipTests' again in the trunk root.
>
> However, test-patch still gets the error and fails the 'system test framework' build.
>
> -Eric
>
> -----Original Message-----
> From: Alejandro Abdelnur [mailto:tucu@cloudera.com]
> Sent: Friday, August 12, 2011 12:41 AM
> To: Eli Collins
> Cc: hdfs-dev@hadoop.apache.org; Tom White
> Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
>
> Eli,
>
> I think you are right; I'm pretty sure it is picking up the latest
> deployed snapshot.
>
> I'll discuss with Tom tomorrow morning how to take care of this (once
> HDFS is Mavenized we can easily build/use the latest bits from all
> modules, though some tricks will still be needed to avoid running
> every module's tests).
>
> Thxs.
>
> Alejandro
>
> On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins wrote:
>
>> Tucu and co - does hdfs build the latest common, or does it try to
>> resolve against the latest deployed common artifact?
>> Looks like hudson-test-patch doesn't pick up on the latest common build.
>>
>> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server wrote:
>> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
>> >
>> > ###################################################################################
>> > ########################## LAST 60 LINES OF THE CONSOLE ###########################
>> > [...truncated 1273 lines...]
>> >     [iajc]               ^^^^^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
>> >     [iajc]                                 ^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
>> >     [iajc]
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
>> >     [iajc]                                 ^^^^^^^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
>> >     [iajc]                                     ^^^^^^^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
>> >     [iajc]                                 ^^^^^^^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
>> >     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
>> >     [iajc]                                    ^^^^^
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
>> >     [iajc]
>> >     [iajc]
>> >     [iajc] 18 errors, 4 warnings
>> >
>> > BUILD FAILED
>> > /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
>> > /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
>> > /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18
>> >
>> > Total time: 55 seconds
>> >
>> > ======================================================================
>> > ======================================================================
>> > STORE: saving artifacts
>> > ======================================================================
>> > ======================================================================
>> >
>> > mv: cannot stat `build/*.tar.gz': No such file or directory
>> > mv: cannot stat `build/test/findbugs': No such file or directory
>> > mv: cannot stat `build/docs/api': No such file or directory
>> > Build Failed
>> > [FINDBUGS] Skipping publisher since build result is FAILURE
>> > Archiving artifacts
>> > Publishing Clover coverage report...
>> > No Clover report will be published due to a Build Failure
>> > Recording test results
>> > Publishing Javadoc
>> > Recording fingerprints
>> > Updating HDFS-2235
>> > Email was triggered for: Failure
>> > Sending email for trigger: Failure
>> >
>> >
>> > ###################################################################################
>> > ############################## FAILED TESTS (if any) ##############################
>> > No tests ran.
>> >
>>
>
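
[Editor's note] The fix discussed in this thread has two moving parts: publishing common's SNAPSHOT to the local Maven repository, and pointing the HDFS Ant/Ivy build at the "internal" resolver so it picks that SNAPSHOT up instead of the last deployed artifact. A minimal sketch of the configuration, as described by the participants (the `ant` target name below is illustrative, not taken from the thread):

```
# $HOME/build.properties -- picked up by the Ant/Ivy build, so it covers
# every invocation without passing the flag each time (Eric's setup):
resolvers=internal

# Equivalent per-invocation form (Eli's instructions):
#   cd common && mvn install -DskipTests        # publish common to ~/.m2
#   cd ../hdfs && ant -Dresolvers=internal jar  # resolve common locally
```

Either form should suffice on its own; the command-line -D flag simply overrides or duplicates the property from build.properties. Note Eric's caveat that automated jobs such as test-patch do not read a developer's $HOME/build.properties, which is why the command-line form matters for Jenkins.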