spark-user mailing list archives

From "Yiming (John) Zhang" <sdi...@gmail.com>
Subject Re: How to incrementally compile spark examples using mvn
Date Thu, 20 Nov 2014 01:35:15 GMT
Hi Sean,

Thank you for your reply. I was wondering whether there is a way to reuse locally-built
components without installing them. That is, if I have successfully built the Spark project
as a whole, how should I configure Maven so that I can incrementally build (only) the
"spark-examples" sub-project, without any downloading or installing?
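In other words, the edit-and-rebuild loop I am hoping for is roughly the following (module name taken from Marcelo's earlier mail; whether this can work without downloads is exactly my question):

```shell
# After one full build of the Spark tree, edit an example source file...
vim examples/src/main/scala/org/apache/spark/examples/SparkPi.scala

# ...then rebuild only the examples module, with no re-downloading
# of the sibling Spark modules already built on this machine:
mvn -pl :spark-examples_2.10 compile
```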

Thank you!

Cheers,
Yiming

-----Original Message-----
From: Sean Owen [mailto:sowen@cloudera.com]
Sent: November 17, 2014 17:40
To: yiming zhang
Cc: Marcelo Vanzin; user@spark.apache.org
Subject: Re: How to incrementally compile spark examples using mvn

The downloads only happen once, so this is not a problem.

If you are building just one module in a project, it needs a compiled copy of the other
modules. It will either use your locally-built and locally-installed artifacts, or download
them from the repo if possible.

This isn't needed if you are compiling all modules at once. If you want to compile everything
and reuse the local artifacts later, you need 'install', not 'package'.
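The advice above can be sketched as a concrete sequence (module name and flags as used elsewhere in this thread; `-o` is Maven's standard offline switch):

```shell
# One-time: build everything and install the artifacts into the
# local repository (~/.m2/repository). Note 'install', not 'package'.
mvn -DskipTests install

# Afterwards: rebuild only the examples module. Maven resolves
# spark-core, spark-streaming, etc. from the local repository
# instead of downloading them.
mvn -pl :spark-examples_2.10 compile

# Optionally add -o (offline) to guarantee nothing is downloaded;
# the build fails instead if an artifact is missing locally.
mvn -o -pl :spark-examples_2.10 compile
```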

On Mon, Nov 17, 2014 at 12:27 AM, Yiming (John) Zhang <sdiris@gmail.com> wrote:
> Thank you Marcelo. I tried your suggestion (# mvn -pl :spark-examples_2.10 compile),
> but it required downloading many Spark components (as listed below), which I have already
> compiled on my server.
>
> Downloading: https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.pom
> ...
> Downloading: https://repo1.maven.org/maven2/org/apache/spark/spark-streaming_2.10/1.1.0/spark-streaming_2.10-1.1.0.pom
> ...
> Downloading: https://repository.jboss.org/nexus/content/repositories/releases/org/apache/spark/spark-hive_2.10/1.1.0/spark-hive_2.10-1.1.0.pom
> ...
>
> This problem didn't happen when I compiled the whole project using "mvn -DskipTests
> package". I guess some configuration is needed to tell mvn that the dependencies are local.
> Any ideas?
>
> Thank you for your help!
>
> Cheers,
> Yiming
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: November 16, 2014 10:26
> To: sdiris@gmail.com
> Cc: user@spark.apache.org
> Subject: Re: How to incrementally compile spark examples using mvn
>
> I haven't tried scala:cc, but you can ask Maven to build just a particular sub-project.
> For example:
>
>   mvn -pl :spark-examples_2.10 compile
>
> On Sat, Nov 15, 2014 at 5:31 PM, Yiming (John) Zhang <sdiris@gmail.com> wrote:
>> Hi,
>>
>>
>>
>> I have already successfully compiled and run the Spark examples. My
>> problem is that if I make some modifications (e.g., to SparkPi.scala or
>> LogQuery.scala), I have to use "mvn -DskipTests package" to rebuild
>> the whole Spark project and wait a relatively long time.
>>
>>
>>
>> I also tried "mvn scala:cc" as described in
>> http://spark.apache.org/docs/latest/building-with-maven.html, but it
>> just hung indefinitely at:
>>
>> [INFO] --- scala-maven-plugin:3.2.0:cc (default-cli) @ spark-parent ---
>>
>> [INFO] wait for files to compile...
>>
>>
>>
>> Is there a way to incrementally compile the examples using mvn?
>> Thank you!
>>
>>
>>
>> Cheers,
>>
>> Yiming
>
>
>
> --
> Marcelo
>
>


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

