From: Yaniv Rodenski
Date: Wed, 1 Nov 2017 18:44:24 +1100
Subject: Re: Unable to run the sample job
To: dev@amaterasu.apache.org

Hi Shad,

While the plan is to have better support for running Spark jars in future versions, we currently have a workaround using the deps/jars.yml file. You can see an example in the error-handling branch here:
https://github.com/shintoio/amaterasu-job-sample/tree/error-handling

The main thing to note is that the error.scala file uses an external lib, which is pulled from Maven Central (this is defined in deps/jars.yml). In order for this to work with Spark programs, there are a couple of caveats to note:

- Don't create a Spark session/context in your jar; instead, use the one defined in AmaContext
- Exports will not work in this release

Other than those you should be OK. Have a look at my slides from Dataworks Summit here:
https://www.slideshare.net/Hadoop_Summit/introduction-to-apache-amaterasu-incubating-cd-framework-for-your-big-data-pipelines
and specifically the slide on the planned 0.3.0 DSL to see where the current thinking is at.

Cheers,
Yaniv

On Tue, Oct 31, 2017 at 9:36 PM, Shad Amez wrote:

> Thanks Yaniv for the reply. The job now starts, but it completed with some
> errors.
>
> I wanted to know if the current implementation has the ability to trigger
> the job from a Spark batch jar file, instead of reading from a GitHub repo,
> or should this be considered a feature request?
>
> PFB the error logs.
>
> ===> Executor 0000000002-efce1eff-b351-4bc3-9bb2-776af04d3b3b registered
> ===> a provider for group spark was created
> ===> launching task: 0000000002
> ===> ================= started action start =================
> ===> val data = 1 to 1000
> ===> val rdd = sc.parallelize(data)
> ===> val odd = rdd.filter(n => n%2 != 0).toDF("number")
> ===> ================= finished action start =================
> ===> complete task: 0000000002
> ===> launching task: 0000000003
> ===> ================= started action start =================
> ===> val highNoDf = AmaContext.getDataFrame("start", "odd").where("number > 100")
> ===> highNoDf.write.mode(SaveMode.Overwrite).json(s"${env.outputRootPath}/dtatframe_res")
> ===> org.apache.spark.SparkException: Job aborted.
> ===> Executor 0000000003-ada561dd-f38b-4741-9ca5-eeb20780bbcf registered
> ===> a provider for group spark was created
> ===> launching task: 0000000003
> ===> ================= started action step2 =================
> ===> val highNoDf = AmaContext.getDataFrame("start", "odd").where("number > 100")
> ===> highNoDf.write.mode(SaveMode.Overwrite).json(s"${env.outputRootPath}/dtatframe_res")
> ===> org.apache.spark.SparkException: Job aborted.
> ===> Executor 0000000003-c65bcfd5-215b-46dc-9bd8-b38d31f50bb1 registered
> ===> a provider for group spark was created
> ===> launching task: 0000000003
> ===> ================= started action step2 =================
> ===> val highNoDf = AmaContext.getDataFrame("start", "odd").where("number > 100")
> ===> highNoDf.write.mode(SaveMode.Overwrite).json(s"${env.outputRootPath}/dtatframe_res")
> ===> org.apache.spark.SparkException: Job aborted.
> ===> moving to err action null
> 2017-10-31 10:53:37.730:INFO:oejs.ServerConnector:Thread-73: Stopped ServerConnector@58ea606c{HTTP/1.1}{0.0.0.0:8000}
> 2017-10-31 10:53:37.732:INFO:oejsh.ContextHandler:Thread-73: Stopped o.e.j.s.ServletContextHandler@8c3b9d{/,file:/home/shad/apps/apache-amaterasu-0.2.0-incubating/dist/,UNAVAILABLE}
> I1031 10:53:37.733896 36249 sched.cpp:2021] Asked to stop the driver
> I1031 10:53:37.734133 36249 sched.cpp:1203] Stopping framework afd772c2-509b-4782-a4c6-4cd9a2985cc2-0001
>
> W00t amaterasu job is finished!!!
>
> Thanks,
> Shad
>
> On Tue, Oct 31, 2017 at 5:02 AM, Yaniv Rodenski wrote:
>
> > Hi Shad,
> >
> > Sorry about that, there was indeed an issue with the job definition.
> > It should be fine now.
> >
> > Cheers,
> > Yaniv
> >
> > On Mon, Oct 30, 2017 at 9:02 PM, Shad Amez wrote:
> >
> > > Hi All,
> > >
> > > I started tinkering with the source code of Amaterasu and just wanted
> > > to confirm if I am missing any step.
> > >
> > > Here are the steps that I followed:
> > >
> > > 1. Installed a single-node Mesos cluster (version 1.4) on Ubuntu 16.04
> > > 2. Generated the Amaterasu tar file after building the project from source
> > > 3. Tested Mesos by executing the following command:
> > >    mesos-execute --master=$MASTER --name="cluster-test" --command="sleep 5"
> > > 4. Ran the command for deploying the Spark job using Amaterasu:
> > >    ./ama-start.sh --repo="https://github.com/shintoio/amaterasu-job-sample.git"
> > >    --branch="master" --env="test" --report="code"
> > >
> > > Following is the error log:
> > >
> > > repo: https://github.com/shintoio/amaterasu-job-sample.git
> > > 2017-10-30 14:00:09.761:INFO::main: Logging initialized @430ms
> > > 2017-10-30 14:00:09.829:INFO:oejs.Server:main: jetty-9.2.z-SNAPSHOT
> > > 2017-10-30 14:00:09.864:INFO:oejsh.ContextHandler:main: Started o.e.j.s.ServletContextHandler@8c3b9d{/,file:/home/shad/apps/apache-amaterasu-0.2.0-incubating/dist/,AVAILABLE}
> > > 2017-10-30 14:00:09.882:INFO:oejs.ServerConnector:main: Started ServerConnector@58ea606c{HTTP/1.1}{0.0.0.0:8000}
> > > 2017-10-30 14:00:09.882:INFO:oejs.Server:main: Started @553ms
> > > SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> > > SLF4J: Defaulting to no-operation (NOP) logger implementation
> > > SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> > > I1030 14:00:10.085204 16586 sched.cpp:232] Version: 1.4.0
> > > I1030 14:00:10.088812 16630 sched.cpp:336] New master detected at master@127.0.1.1:5050
> > > I1030 14:00:10.089205 16630 sched.cpp:352] No credentials provided. Attempting to register without authentication
> > > I1030 14:00:10.090991 16624 sched.cpp:759] Framework registered with e72b9609-4b7f-4509-b1a4-bd2055d674aa-0002
> > > Exception in thread "Thread-13" while scanning for the next token
> > > found character '\t(TAB)' that cannot start any token. (Do not use \t(TAB) for indentation)
> > >  in 'reader', line 2, column 1:
> > >     "name":"test",
> > >     ^
> > >
> > > at org.yaml.snakeyaml.scanner.ScannerImpl.fetchMoreTokens(ScannerImpl.java:420)
> > > at org.yaml.snakeyaml.scanner.ScannerImpl.checkToken(ScannerImpl.java:226)
> > > at org.yaml.snakeyaml.parser.ParserImpl$ParseImplicitDocumentStart.produce(ParserImpl.java:194)
> > > at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:157)
> > > at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:147)
> > > at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:104)
> > > at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:122)
> > > at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:505)
> > > at org.yaml.snakeyaml.Yaml.load(Yaml.java:436)
> > > at org.apache.amaterasu.leader.utilities.DataLoader$.yamlToMap(DataLoader.scala:87)
> > > at org.apache.amaterasu.leader.utilities.DataLoader$$anonfun$3.apply(DataLoader.scala:68)
> > > at org.apache.amaterasu.leader.utilities.DataLoader$$anonfun$3.apply(DataLoader.scala:68)
> > > at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> > > at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> > > at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> > > at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
> > > at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
> > > at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
> > > at org.apache.amaterasu.leader.utilities.DataLoader$.getExecutorData(DataLoader.scala:68)
> > > at org.apache.amaterasu.leader.mesos.schedulers.JobScheduler$$anonfun$resourceOffers$1.apply(JobScheduler.scala:163)
> > > at org.apache.amaterasu.leader.mesos.schedulers.JobScheduler$$anonfun$resourceOffers$1.apply(JobScheduler.scala:128)
> > > at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> > > at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> > > at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> > > at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> > > at org.apache.amaterasu.leader.mesos.schedulers.JobScheduler.resourceOffers(JobScheduler.scala:128)
> > > I1030 14:00:12.991056 16624 sched.cpp:2055] Asked to abort the driver
> > > I1030 14:00:12.991195 16624 sched.cpp:1233] Aborting framework e72b9609-4b7f-4509-b1a4-bd2055d674aa-0002
> > >
> > > Thanks,
> > > Shad
> >
> > --
> > Yaniv Rodenski

--
Yaniv Rodenski
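[Archive note] For readers following the deps/jars.yml workaround discussed above: the file declares Maven coordinates for external libraries that Amaterasu resolves from Maven Central at launch. The exact schema is defined by the error-handling branch linked in the thread; the sketch below only illustrates the idea, and the field names and the example artifact (joda-time) are assumptions, not taken from that branch. Note also the cause of the stack trace above: YAML forbids TAB characters for indentation, so any file like this must be indented with spaces only.

```yaml
# Hypothetical deps/jars.yml sketch: a list of Maven coordinates to be
# pulled from Maven Central. Field names are assumed for illustration;
# check the error-handling branch of amaterasu-job-sample for the real format.
# Indent with spaces, never tabs - SnakeYAML rejects TAB indentation.
- groupId: joda-time
  artifactId: joda-time
  version: "2.9.9"
```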