Subject: Re: Maven build failed (Spark master)
From: Todd Nist
To: Ted Yu
Cc: Kayode Odeyemi, Yana Kadiyska, Sean Owen, user@spark.apache.org
Date: Tue, 27 Oct 2015 12:39:58 -0400

I issued the same basic command and it worked fine.

RADTech-MBP:spark $ ./make-distribution.sh --name hadoop-2.6 --tgz -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests

Which created: spark-1.6.0-SNAPSHOT-bin-hadoop-2.6.tgz in the root directory of the project.
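To confirm that a run actually produced a distribution, a quick sanity check can be run from the Spark source root. This is a hypothetical helper, not part of make-distribution.sh; the tarball naming pattern is taken from the examples in this thread:

```shell
#!/bin/sh
# Look for the distribution tarball that make-distribution.sh drops in the
# source root. The glob matches names like spark-1.6.0-SNAPSHOT-bin-hadoop-2.6.tgz.
tarball=$(ls spark-*-bin-*.tgz 2>/dev/null | head -n 1)
if [ -n "$tarball" ]; then
    echo "found distribution: $tarball"
else
    echo "no distribution tarball found in $(pwd)"
fi
```

If no tarball shows up even though the Maven phase succeeded, the failure is in the packaging step, which is exactly the situation discussed below.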
FWIW, the environment was an MBP with OS X 10.10.5 and Java:

java version "1.8.0_51"
Java(TM) SE Runtime Environment (build 1.8.0_51-b16)
Java HotSpot(TM) 64-Bit Server VM (build 25.51-b03, mixed mode)

-Todd

On Tue, Oct 27, 2015 at 12:17 PM, Ted Yu wrote:

> I used the following command:
> make-distribution.sh --name custom-spark --tgz -Phadoop-2.4 -Phive
> -Phive-thriftserver -Pyarn
>
> spark-1.6.0-SNAPSHOT-bin-custom-spark.tgz was generated (with the patch from
> SPARK-11348)
>
> Can you try the above command?
>
> Thanks
>
> On Tue, Oct 27, 2015 at 7:03 AM, Kayode Odeyemi wrote:
>
>> Ted, I switched to this:
>>
>> ./make-distribution.sh --name spark-latest --tgz -Dhadoop.version=2.6.0
>> -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package -U
>>
>> Same error. No .gz file. Here's the bottom of the output log:
>>
>> + rm -rf /home/emperor/javaprojects/spark/dist
>> + mkdir -p /home/emperor/javaprojects/spark/dist/lib
>> + echo 'Spark [WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin (git revision 3689beb) built for Hadoop [WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Pl
>> + echo 'Build flags: -Dhadoop.version=2.6.0' -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package -U
>> + cp /home/emperor/javaprojects/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.6.0.jar /home/emperor/javaprojects/spark/dist/lib/
>> + cp /home/emperor/javaprojects/spark/examples/target/scala-2.10/spark-examples-1.6.0-SNAPSHOT-hadoop2.6.0.jar /home/emperor/javaprojects/spark/dist/lib/
>> + cp /home/emperor/javaprojects/spark/network/yarn/target/scala-2.10/spark-1.6.0-SNAPSHOT-yarn-shuffle.jar /home/emperor/javaprojects/spark/dist/lib/
>> + mkdir -p /home/emperor/javaprojects/spark/dist/examples/src/main
>> + cp -r /home/emperor/javaprojects/spark/examples/src/main /home/emperor/javaprojects/spark/dist/examples/src/
>> + '[' 1 == 1 ']'
>> + cp
>> /home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar
>> /home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-core-3.2.10.jar
>> /home/emperor/javaprojects
>> ed/jars/datanucleus-rdbms-3.2.9.jar
>> /home/emperor/javaprojects/spark/dist/lib/
>> + cp /home/emperor/javaprojects/spark/LICENSE /home/emperor/javaprojects/spark/dist
>> + cp -r /home/emperor/javaprojects/spark/licenses /home/emperor/javaprojects/spark/dist
>> + cp /home/emperor/javaprojects/spark/NOTICE /home/emperor/javaprojects/spark/dist
>> + '[' -e /home/emperor/javaprojects/spark/CHANGES.txt ']'
>> + cp -r /home/emperor/javaprojects/spark/data /home/emperor/javaprojects/spark/dist
>> + mkdir /home/emperor/javaprojects/spark/dist/conf
>> + cp /home/emperor/javaprojects/spark/conf/docker.properties.template /home/emperor/javaprojects/spark/conf/fairscheduler.xml.template /home/emperor/javaprojects/spark/conf/log4j.properties
>> emperor/javaprojects/spark/conf/metrics.properties.template /home/emperor/javaprojects/spark/conf/slaves.template /home/emperor/javaprojects/spark/conf/spark-defaults.conf.template /home/em
>> ts/spark/conf/spark-env.sh.template /home/emperor/javaprojects/spark/dist/conf
>> + cp /home/emperor/javaprojects/spark/README.md /home/emperor/javaprojects/spark/dist
>> + cp -r /home/emperor/javaprojects/spark/bin /home/emperor/javaprojects/spark/dist
>> + cp -r /home/emperor/javaprojects/spark/python /home/emperor/javaprojects/spark/dist
>> + cp -r /home/emperor/javaprojects/spark/sbin /home/emperor/javaprojects/spark/dist
>> + cp -r /home/emperor/javaprojects/spark/ec2 /home/emperor/javaprojects/spark/dist
>> + '[' -d /home/emperor/javaprojects/spark/R/lib/SparkR ']'
>> + '[' false == true ']'
>> + '[' true == true ']'
>> + TARDIR_NAME='spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
>> + TARDIR='/home/emperor/javaprojects/spark/spark-[WARNING] See
>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
>> + rm -rf '/home/emperor/javaprojects/spark/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
>> + cp -r /home/emperor/javaprojects/spark/dist '/home/emperor/javaprojects/spark/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
>> cp: cannot create directory `/home/emperor/javaprojects/spark/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest': No such file or directory
>>
>> On Tue, Oct 27, 2015 at 2:14 PM, Ted Yu wrote:
>>
>>> Can you try the same command shown in the pull request?
>>>
>>> Thanks
>>>
>>> On Oct 27, 2015, at 12:40 AM, Kayode Odeyemi wrote:
>>>
>>> Thank you.
>>>
>>> But I'm getting the same warnings, and it's still preventing the archive from
>>> being generated.
>>>
>>> I've run this on both OS X Lion and Ubuntu 12. Same error. No .gz file.
>>>
>>> On Mon, Oct 26, 2015 at 9:10 PM, Ted Yu wrote:
>>>
>>>> Looks like '-Pyarn' was missing in your command.
>>>>
>>>> On Mon, Oct 26, 2015 at 12:06 PM, Kayode Odeyemi wrote:
>>>>
>>>>> I used this command, which is synonymous with what you have:
>>>>>
>>>>> ./make-distribution.sh --name spark-latest --tgz --mvn mvn
>>>>> -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
>>>>> clean package -U
>>>>>
>>>>> But I still see WARNINGs like this in the output and no .gz file created:
>>>>>
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc: No such file or directory
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet: No such file or directory
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9: No such file or directory
>>>>> cp: /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9: unable to copy extended attributes to /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9: No such file or directory
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: No such file or directory
>>>>> cp: /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: unable to copy extended attributes to /usr/local/spark-latest/spark-[WARNING] See
>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: No such file or directory
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc: No such file or directory
>>>>>
>>>>> On Mon, Oct 26, 2015 at 8:58 PM, Ted Yu wrote:
>>>>>
>>>>>> If you use the command shown in:
>>>>>> https://github.com/apache/spark/pull/9281
>>>>>>
>>>>>> you should have got the following:
>>>>>>
>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet
>>>>>>
>>>>>> On Mon, Oct 26, 2015 at 11:47 AM, Kayode Odeyemi wrote:
>>>>>>
>>>>>>> I see a lot of output like this after a successful maven build:
>>>>>>>
>>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet: No such file or directory
>>>>>>>
>>>>>>> It seems it fails when it tries to package the build as an archive.
>>>>>>>
>>>>>>> I'm using the latest code on GitHub master.
>>>>>>>
>>>>>>> Any ideas please?
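The mangled `spark-[WARNING] See …` paths above are consistent with how make-distribution.sh derives the version string: it runs `mvn help:evaluate -Dexpression=project.version`, strips only `INFO` lines with `grep -v INFO`, and takes the last remaining line, so a trailing Maven WARNING (or ERROR) line becomes the "version" and gets spliced into the target directory name. A minimal sketch of the failure mode, using a hypothetical captured Maven output; the stricter filter at the end is an illustrative workaround, not the project's actual patch:

```shell
#!/bin/sh
# Simulated output of `mvn help:evaluate -Dexpression=project.version`
# on a build where Maven prints a shade-plugin warning last
# (hypothetical sample text, mirroring the symptom in this thread).
mvn_output='[INFO] Scanning for projects...
1.6.0-SNAPSHOT
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin'

# What the script effectively does: drop INFO lines, keep the last line.
# Any trailing WARNING/ERROR line survives and is taken as the version.
fragile=$(printf '%s\n' "$mvn_output" | grep -v INFO | tail -n 1)
echo "fragile: spark-${fragile}-bin-custom"

# A stricter filter: keep only lines that start like a version number.
strict=$(printf '%s\n' "$mvn_output" | grep -E '^[0-9]+\.' | tail -n 1)
echo "strict:  spark-${strict}-bin-custom"
```

With the sample above, the fragile pipeline yields a directory name beginning `spark-[WARNING] See http://…`, which is exactly the broken path the `cp` errors complain about, while the stricter filter recovers `spark-1.6.0-SNAPSHOT-bin-custom`.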
>>>>>>>
>>>>>>> On Mon, Oct 26, 2015 at 6:20 PM, Yana Kadiyska <yana.kadiyska@gmail.com> wrote:
>>>>>>>
>>>>>>>> In 1.4, ./make-distribution.sh produces a .tgz file in the root
>>>>>>>> directory (the same directory that make-distribution.sh is in).
>>>>>>>>
>>>>>>>> On Mon, Oct 26, 2015 at 8:46 AM, Kayode Odeyemi wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> The ./make-distribution.sh task completed. However, I can't seem to
>>>>>>>>> locate the .tar.gz file.
>>>>>>>>>
>>>>>>>>> Where does Spark save this? Or should I just work with the dist
>>>>>>>>> directory?
>>>>>>>>>
>>>>>>>>> On Fri, Oct 23, 2015 at 4:23 PM, Kayode Odeyemi wrote:
>>>>>>>>>
>>>>>>>>>> I saw this when I tested manually (without ./make-distribution):
>>>>>>>>>>
>>>>>>>>>> Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.
>>>>>>>>>>
>>>>>>>>>> So I simply upgraded Maven to 3.3.3.
>>>>>>>>>>
>>>>>>>>>> Resolved. Thanks
>>>>>>>>>>
>>>>>>>>>> On Fri, Oct 23, 2015 at 3:17 PM, Sean Owen wrote:
>>>>>>>>>>
>>>>>>>>>>> This doesn't show the actual error output from Maven. I have a strong
>>>>>>>>>>> guess that you haven't set MAVEN_OPTS to increase the memory Maven can
>>>>>>>>>>> use.
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Oct 23, 2015 at 6:14 AM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>>>>>>>>>> > Hi,
>>>>>>>>>>> >
>>>>>>>>>>> > I can't seem to get a successful maven build. Please see command output
>>>>>>>>>>> > below:
>>>>>>>>>>> >
>>>>>>>>>>> > bash-3.2$ ./make-distribution.sh --name spark-latest --tgz --mvn mvn
>>>>>>>>>>> > -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests
>>>>>>>>>>> > clean package
>>>>>>>>>>> > +++ dirname ./make-distribution.sh
>>>>>>>>>>> > ++ cd .
>>>>>>>>>>> > ++ pwd
>>>>>>>>>>> > + SPARK_HOME=/usr/local/spark-latest
>>>>>>>>>>> > + DISTDIR=/usr/local/spark-latest/dist
>>>>>>>>>>> > + SPARK_TACHYON=false
>>>>>>>>>>> > + TACHYON_VERSION=0.7.1
>>>>>>>>>>> > + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
>>>>>>>>>>> > + TACHYON_URL=https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
>>>>>>>>>>> > + MAKE_TGZ=false
>>>>>>>>>>> > + NAME=none
>>>>>>>>>>> > + MVN=/usr/local/spark-latest/build/mvn
>>>>>>>>>>> > + (( 12 ))
>>>>>>>>>>> > + case $1 in
>>>>>>>>>>> > + NAME=spark-latest
>>>>>>>>>>> > + shift
>>>>>>>>>>> > + shift
>>>>>>>>>>> > + (( 10 ))
>>>>>>>>>>> > + case $1 in
>>>>>>>>>>> > + MAKE_TGZ=true
>>>>>>>>>>> > + shift
>>>>>>>>>>> > + (( 9 ))
>>>>>>>>>>> > + case $1 in
>>>>>>>>>>> > + MVN=mvn
>>>>>>>>>>> > + shift
>>>>>>>>>>> > + shift
>>>>>>>>>>> > + (( 7 ))
>>>>>>>>>>> > + case $1 in
>>>>>>>>>>> > + break
>>>>>>>>>>> > + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>>> > + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>>> > ++ command -v git
>>>>>>>>>>> > + '[' /usr/bin/git ']'
>>>>>>>>>>> > ++ git rev-parse --short HEAD
>>>>>>>>>>> > + GITREV=487d409
>>>>>>>>>>> > + '[' '!' -z 487d409 ']'
>>>>>>>>>>> > + GITREVSTRING=' (git revision 487d409)'
>>>>>>>>>>> > + unset GITREV
>>>>>>>>>>> > ++ command -v mvn
>>>>>>>>>>> > + '[' '!' /usr/bin/mvn ']'
>>>>>>>>>>> > ++ mvn help:evaluate -Dexpression=project.version -Dhadoop.version=2.7.0
>>>>>>>>>>> > -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package
>>>>>>>>>>> > ++ grep -v INFO
>>>>>>>>>>> > ++ tail -n 1
>>>>>>>>>>> > + VERSION='[ERROR] [Help 1]
>>>>>>>>>>> > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'
>>>>>>>>>>> >
>>>>>>>>>>> > Same output error with JDK 7
>>>>>>>>>>> >
>>>>>>>>>>> > Appreciate your help.
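Sean Owen's point about MAVEN_OPTS is worth spelling out: Spark's Maven build needs more memory than the JVM defaults, and without it the build can die with out-of-memory errors before producing anything. A typical setting follows; the exact sizes here are illustrative, so check Spark's "Building Spark" documentation for the currently recommended values:

```shell
#!/bin/sh
# Give Maven extra heap and code cache before building Spark; without
# this the build can fail with out-of-memory errors. Sizes illustrative.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
echo "MAVEN_OPTS=$MAVEN_OPTS"
# Then rerun the distribution build, e.g.:
#   ./make-distribution.sh --name spark-latest --tgz -Pyarn -Phadoop-2.6 \
#       -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests
```

Note that MAVEN_OPTS only affects a `mvn` launched from the same shell session, so it must be exported before invoking make-distribution.sh.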