Can you try the same command shown in the pull request?
Thanks
> On Oct 27, 2015, at 12:40 AM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>
> Thank you.
>
> But I'm getting the same warnings, and they're still preventing the archive from being generated.
>
> I've run this on both OS X Lion and Ubuntu 12. Same error. No .gz file.
>
>> On Mon, Oct 26, 2015 at 9:10 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>> Looks like '-Pyarn' was missing in your command.
>>
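For concreteness, that would be the command below, i.e. the earlier command with -Pyarn added (an approximation; the authoritative command is in the pull request itself):

./make-distribution.sh --name spark-latest --tgz --mvn mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests clean package -U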
>>> On Mon, Oct 26, 2015 at 12:06 PM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>> I used this command, which is equivalent to what you have:
>>>
>>> ./make-distribution.sh --name spark-latest --tgz --mvn mvn -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package -U
>>>
>>> But I still see warnings like this in the output, and no .gz file is created:
>>>
>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc: No such file or directory
>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet: No such file or directory
>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9: No such file or directory
>>> cp: /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9: unable to copy extended attributes to /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9: No such file or directory
>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: No such file or directory
>>> cp: /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: unable to copy extended attributes to /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1: No such file or directory
>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc: No such file or directory
>>>
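Side note on the "spark-[WARNING] ...-bin-spark-latest" fragment in those destination paths: make-distribution.sh names its staging directory "spark-$VERSION-bin-$NAME" and scrapes $VERSION out of Maven's output with something roughly like the pipeline below (a sketch of the idea, not the exact script text). If Maven's last non-INFO line is a WARNING or ERROR, that text becomes $VERSION, and every cp into the staging directory then targets a nonexistent path.

VERSION=$("$MVN" help:evaluate -Dexpression=project.version 2>/dev/null | grep -v "INFO" | tail -n 1)
DISTNAME="spark-$VERSION-bin-$NAME"   # variable name illustrative; the bogus "spark-[WARNING] ..." path comes from here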
>>>> On Mon, Oct 26, 2015 at 8:58 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>>>> If you use the command shown in:
>>>> https://github.com/apache/spark/pull/9281
>>>>
>>>> You should get the following:
>>>>
>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet
>>>>
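A quick way to confirm those files ended up in the distribution once the build finishes (an illustrative check, not a step from the thread):

find ./dist/python/test_support/sql/parquet_partitioned -name '*.gz.parquet'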
>>>>> On Mon, Oct 26, 2015 at 11:47 AM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>>>> I see a lot of output like this after a successful Maven build:
>>>>>
>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet: No such file or directory
>>>>>
>>>>> It seems to fail when it tries to package the build as an archive.
>>>>>
>>>>> I'm using the latest code from GitHub master.
>>>>>
>>>>> Any ideas please?
>>>>>
>>>>>> On Mon, Oct 26, 2015 at 6:20 PM, Yana Kadiyska <yana.kadiyska@gmail.com> wrote:
>>>>>> In 1.4, ./make-distribution.sh produces a .tgz file in the root directory (the same directory make-distribution.sh is in).
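So once the archive step succeeds, something like the following should show it (the pattern is illustrative; the exact file name depends on the Spark version and the --name value):

ls /usr/local/spark-latest/spark-*-bin-spark-latest.tgz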
>>>>>>
>>>>>>
>>>>>>
>>>>>>> On Mon, Oct 26, 2015 at 8:46 AM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>>>>>> Hi,
>>>>>>>
>>>>>>> The ./make-distribution.sh task completed. However, I can't seem to locate the .tar.gz file.
>>>>>>>
>>>>>>> Where does Spark save this? Or should I just work with the dist directory?
>>>>>>>
>>>>>>>> On Fri, Oct 23, 2015 at 4:23 PM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>>>>>>> I saw this when I tested manually (without ./make-distribution.sh):
>>>>>>>>
>>>>>>>> Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.
>>>>>>>>
>>>>>>>> So I simply upgraded Maven to 3.3.3.
>>>>>>>>
>>>>>>>> Resolved. Thanks
>>>>>>>>
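An alternative to upgrading the system Maven: drop the "--mvn mvn" flag and let the script fall back to its bundled launcher (the trace below shows the default MVN=/usr/local/spark-latest/build/mvn), which can fetch a Maven that satisfies the enforced version range on its own. A sketch, reusing the same flags as before:

./make-distribution.sh --name spark-latest --tgz -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package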
>>>>>>>>> On Fri, Oct 23, 2015 at 3:17 PM, Sean Owen <sowen@cloudera.com> wrote:
>>>>>>>>> This doesn't show the actual error output from Maven. I have a strong guess that you haven't set MAVEN_OPTS to increase the memory Maven can use.
>>>>>>>>>
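For reference, the Spark building docs around that time suggested something along these lines before invoking Maven (the values are the docs' suggestion, not from this thread; -XX:MaxPermSize only applies on Java 7):

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"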
>>>>>>>>> On Fri, Oct 23, 2015 at 6:14 AM, Kayode Odeyemi <dreyemi@gmail.com> wrote:
>>>>>>>>> > Hi,
>>>>>>>>> >
>>>>>>>>> > I can't seem to get a successful Maven build. Please see the command output below:
>>>>>>>>> >
>>>>>>>>> > bash-3.2$ ./make-distribution.sh --name spark-latest --tgz --mvn mvn -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package
>>>>>>>>> > +++ dirname ./make-distribution.sh
>>>>>>>>> > ++ cd .
>>>>>>>>> > ++ pwd
>>>>>>>>> > + SPARK_HOME=/usr/local/spark-latest
>>>>>>>>> > + DISTDIR=/usr/local/spark-latest/dist
>>>>>>>>> > + SPARK_TACHYON=false
>>>>>>>>> > + TACHYON_VERSION=0.7.1
>>>>>>>>> > + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
>>>>>>>>> > + TACHYON_URL=https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
>>>>>>>>> > + MAKE_TGZ=false
>>>>>>>>> > + NAME=none
>>>>>>>>> > + MVN=/usr/local/spark-latest/build/mvn
>>>>>>>>> > + (( 12 ))
>>>>>>>>> > + case $1 in
>>>>>>>>> > + NAME=spark-latest
>>>>>>>>> > + shift
>>>>>>>>> > + shift
>>>>>>>>> > + (( 10 ))
>>>>>>>>> > + case $1 in
>>>>>>>>> > + MAKE_TGZ=true
>>>>>>>>> > + shift
>>>>>>>>> > + (( 9 ))
>>>>>>>>> > + case $1 in
>>>>>>>>> > + MVN=mvn
>>>>>>>>> > + shift
>>>>>>>>> > + shift
>>>>>>>>> > + (( 7 ))
>>>>>>>>> > + case $1 in
>>>>>>>>> > + break
>>>>>>>>> > + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>> > + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>> > ++ command -v git
>>>>>>>>> > + '[' /usr/bin/git ']'
>>>>>>>>> > ++ git rev-parse --short HEAD
>>>>>>>>> > + GITREV=487d409
>>>>>>>>> > + '[' '!' -z 487d409 ']'
>>>>>>>>> > + GITREVSTRING=' (git revision 487d409)'
>>>>>>>>> > + unset GITREV
>>>>>>>>> > ++ command -v mvn
>>>>>>>>> > + '[' '!' /usr/bin/mvn ']'
>>>>>>>>> > ++ mvn help:evaluate -Dexpression=project.version -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package
>>>>>>>>> > ++ grep -v INFO
>>>>>>>>> > ++ tail -n 1
>>>>>>>>> > + VERSION='[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'
>>>>>>>>> >
>>>>>>>>> > Same error output with JDK 7.
>>>>>>>>> >
>>>>>>>>> > Appreciate your help.
>>>>>>>>> >
>>>>>>>>> >
>>>>>>>
>>>
>
>