livy-user mailing list archives

From Stefan Miklosovic <mikloso...@gmail.com>
Subject Re: How to submit the job via REST ?
Date Fri, 01 Dec 2017 16:43:15 GMT
I don't understand what "LivyClient jar" means in your context. You are just
using that Livy API when you have the dependency on the classpath. You are
not bundling anything anywhere, unless you make a fat standalone jar with
everything you need to use LivyClient, through which you would upload the job
class. But you don't upload that fat jar itself.
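
To make that concrete, the client API typically comes in via a plain build
dependency rather than anything you upload. A minimal Maven sketch might look
like the following (the coordinates are for the HTTP client implementation;
the version shown is illustrative, so match it to your Livy server's release):

```xml
<!-- HTTP implementation of the Livy client API; provides LivyClientBuilder -->
<dependency>
  <groupId>org.apache.livy</groupId>
  <artifactId>livy-client-http</artifactId>
  <version>0.4.0-incubating</version> <!-- assumed; use your server's version -->
</dependency>
```

With that on the classpath, your own code can build a LivyClient; only the
separate job jar ever gets uploaded to the server.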

On 1 Dec 2017 17:38, "kant kodali" <kanth909@gmail.com> wrote:

> @Stefan Sorry if I wasn't clear; I wasn't talking about uploading. I can
> have a LivyClient jar and a PiJob jar, and I can use the LivyClient jar to run on my
> machine, which internally uploads the PiJob jar to the Livy server. Correct?
>
> On Fri, Dec 1, 2017 at 8:34 AM, Stefan Miklosovic <miklosovic@gmail.com>
> wrote:
>
>> They are separate. The Livy client is part of the Livy distribution, and it
>> is what you use to talk to the Livy server. There is no need to upload it to the Livy server. Why
>> would you do that ...
>>
>> I don't know about the other approach you show. If it suits your case, feel
>> free to use it.
>>
>> On 1 Dec 2017 17:11, "kant kodali" <kanth909@gmail.com> wrote:
>>
>>> can we just do sparkConf.setJars(JavaSparkContext.jarOfClass(PiJob.class));
>>> ?
>>>
>>> On Fri, Dec 1, 2017 at 7:51 AM, kant kodali <kanth909@gmail.com> wrote:
>>>
>>>>
>>>> If I understand this correctly: are LivyClient and PiJob separate
>>>> classes within the same jar, such that LivyClient has the main
>>>> method? Or are LivyClient and PiJob separate classes in separate jars? If
>>>> so, which class has the main method?
>>>>
>>>> I believe Spark also does this, but in a much easier fashion:
>>>>
>>>> public static String[] jarOfClass(Class<?> cls)
>>>>
>>>> https://spark.apache.org/docs/2.1.1/api/java/index.html?org/apache/spark/api/java/JavaSparkContext.html
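
As a rough illustration of what a jarOfClass-style lookup does under the
hood, here is a plain-JVM sketch that resolves a class's .class resource URL
and extracts the enclosing jar path when there is one. This is not Spark's
exact implementation, just the general mechanism it relies on:

```java
import java.net.URL;

public class JarOfClass {

    // Returns the filesystem path of the jar containing cls, or null if the
    // class is not loaded from a jar (e.g. a directory of .class files, or
    // the JDK's own runtime image).
    static String jarOf(Class<?> cls) {
        // Resolve the class file as a resource relative to its own package.
        URL url = cls.getResource(cls.getSimpleName() + ".class");
        if (url == null) {
            return null;
        }
        String s = url.toString();
        if (s.startsWith("jar:file:")) {
            // Form: jar:file:/path/to/foo.jar!/com/example/Foo.class
            return s.substring("jar:".length(), s.indexOf('!'));
        }
        return null; // not packaged in a jar
    }

    public static void main(String[] args) {
        // Typically null here, since this class is usually loaded from a
        // directory of .class files rather than a jar.
        System.out.println(jarOf(JarOfClass.class));
    }
}
```

In practice you would feed the returned path to something like
sparkConf.setJars or, with Livy, to client.uploadJar.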
>>>>
>>>>
>>>> On Fri, Dec 1, 2017 at 7:45 AM, Stefan Miklosovic <miklosovic@gmail.com
>>>> > wrote:
>>>>
>>>>> The last paragraph holds if your job class is in src/test/java, as mine
>>>>> is; I am using Livy to submit my jobs programmatically as part
>>>>> of mvn test.
>>>>>
>>>>> On Fri, Dec 1, 2017 at 4:42 PM, Stefan Miklosovic <
>>>>> miklosovic@gmail.com> wrote:
>>>>> > As a former Red Hatter, I am using ShrinkWrap and ShrinkWrap Maven
>>>>> > resolver for it.
>>>>> >
>>>>> > It basically looks like this:
>>>>> >
>>>>> >     public static final class JarBuilder {
>>>>> >
>>>>> >         public File buildJobJar(File destinationJar) {
>>>>> >
>>>>> >             final JavaArchive javaArchive =
>>>>> > ShrinkWrap.create(MavenImporter.class)
>>>>> >                 .loadPomFromFile("pom.xml")
>>>>> >                 .importBuildOutput()
>>>>> >                 .as(JavaArchive.class)
>>>>> >                 .addClass(MyLivyJob.class);
>>>>> >
>>>>> >             logger.info(javaArchive.toString(true));
>>>>> >
>>>>> >             javaArchive.as(ZipExporter.class).exportTo(destinationJar, true);
>>>>> >
>>>>> >             return destinationJar;
>>>>> >         }
>>>>> >
>>>>> >
>>>>> >         // for every other dependency in the runtime which is not
>>>>> >         // bundled into the built jar, you would need to do
>>>>> >         // something similar, as below
>>>>> >
>>>>> >         public File buildHttpClientJar() {
>>>>> >             return Maven.configureResolver()
>>>>> >                 .withMavenCentralRepo(true)
>>>>> >                 .resolve(HTTP_CLIENT_COORDINATES)
>>>>> >                 .withoutTransitivity()
>>>>> >                 .asSingleFile();
>>>>> >         }
>>>>> >
>>>>> >         public File buildJacksonDatatypeJodaJar() {
>>>>> >             return Maven.configureResolver()
>>>>> >                 .withMavenCentralRepo(true)
>>>>> >                 .resolve(JACKSON_DATATYPE_JODA_COORDINATES)
>>>>> >                 .withoutTransitivity()
>>>>> >                 .asSingleFile();
>>>>> >         }
>>>>> >     }
>>>>> >
>>>>> > After that, you just call uploadJar on the Livy client instance, and
>>>>> > then you can submit your job via the submit method on the Livy client.
>>>>> >
>>>>> > Notice that I made a jar and added the job class into it. If you
>>>>> > don't add it there before uploading, the server will not know how to
>>>>> > deserialize the job, because the compiled class would not be there.
>>>>> >
>>>>> > On Fri, Dec 1, 2017 at 3:54 PM, Meisam Fathi <meisam.fathi@gmail.com> wrote:
>>>>> >> You should compile and package the PiJob jar before running this code
>>>>> >> snippet. It does not need to be a separate app/project. You can put the
>>>>> >> PiJob code right next to the code snippet to run it. Maven/sbt/Gradle
>>>>> >> can create the jar for you, and I assume there is a way to call them
>>>>> >> programmatically, but that is not needed. You can use the path to the
>>>>> >> jar file as piJar.
>>>>> >>
>>>>> >> I hope this answers your question.
>>>>> >>
>>>>> >> Thanks,
>>>>> >> Meisam
>>>>> >>
>>>>> >> import java.util.ArrayList;
>>>>> >> import java.util.List;
>>>>> >>
>>>>> >> import org.apache.spark.api.java.function.*;
>>>>> >>
>>>>> >> import org.apache.livy.*;
>>>>> >>
>>>>> >> public class PiJob implements Job<Double>, Function<Integer, Integer>,
>>>>> >>         Function2<Integer, Integer, Integer> {
>>>>> >>
>>>>> >>   private final int samples;
>>>>> >>
>>>>> >>   public PiJob(int samples) {
>>>>> >>     this.samples = samples;
>>>>> >>   }
>>>>> >>
>>>>> >>   @Override
>>>>> >>   public Double call(JobContext ctx) throws Exception {
>>>>> >>     List<Integer> sampleList = new ArrayList<Integer>();
>>>>> >>     for (int i = 0; i < samples; i++) {
>>>>> >>       sampleList.add(i + 1);
>>>>> >>     }
>>>>> >>
>>>>> >>     return 4.0d * ctx.sc().parallelize(sampleList).map(this).reduce(this)
>>>>> >>         / samples;
>>>>> >>   }
>>>>> >>
>>>>> >>   @Override
>>>>> >>   public Integer call(Integer v1) {
>>>>> >>     double x = Math.random();
>>>>> >>     double y = Math.random();
>>>>> >>     return (x*x + y*y < 1) ? 1 : 0;
>>>>> >>   }
>>>>> >>
>>>>> >>   @Override
>>>>> >>   public Integer call(Integer v1, Integer v2) {
>>>>> >>     return v1 + v2;
>>>>> >>   }
>>>>> >>
>>>>> >> }
>>>>> >>
>>>>> >>
>>>>> >> On Fri, Dec 1, 2017 at 1:09 AM kant kodali <kanth909@gmail.com> wrote:
>>>>> >>>
>>>>> >>> Hi All,
>>>>> >>>
>>>>> >>> I am looking at the following snippet of code, and I wonder where
>>>>> >>> and how I create piJar. Can I create it programmatically, and if so,
>>>>> >>> how? Is there a complete hello world example somewhere where I can
>>>>> >>> follow the steps and see how this works?
>>>>> >>>
>>>>> >>> Concerning the line:
>>>>> >>>
>>>>> >>> client.uploadJar(new File(piJar)).get();
>>>>> >>>
>>>>> >>>
>>>>> >>>
>>>>> >>> Code snippet
>>>>> >>>
>>>>> >>> LivyClient client = new LivyClientBuilder()
>>>>> >>>   .setURI(new URI(livyUrl))
>>>>> >>>   .build();
>>>>> >>>
>>>>> >>> try {
>>>>> >>>   System.err.printf("Uploading %s to the Spark context...\n", piJar);
>>>>> >>>   client.uploadJar(new File(piJar)).get();
>>>>> >>>
>>>>> >>>   System.err.printf("Running PiJob with %d samples...\n", samples);
>>>>> >>>   double pi = client.submit(new PiJob(samples)).get();
>>>>> >>>
>>>>> >>>   System.out.println("Pi is roughly: " + pi);
>>>>> >>> } finally {
>>>>> >>>   client.stop(true);
>>>>> >>> }
>>>>> >
>>>>> >
>>>>> >
>>>>> > --
>>>>> > Stefan Miklosovic
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Stefan Miklosovic
>>>>>
>>>>
>>>>
>>>
>
