Date: Mon, 11 Aug 2014 02:00:20 -0700 (PDT)
From: chutium
To: dev@spark.incubator.apache.org
Subject: Re: spark-shell is broken!
(bad option: '--master')

One issue, three to four PRs: the Spark dev community is really active :)

It seems that spark-shell currently accepts only some SUBMISSION_OPTS, but no APPLICATION_OPTS. Do you have plans to add some APPLICATION_OPTS or CLI_OPTS, like hive -e, hive -f, hive -hivevar? Then we could use our Scala code as scripts and run them directly via spark-shell, without compiling, building, packaging and so on...

Those APPLICATION_OPTS should be arguments for org.apache.spark.repl.Main, right?

--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/spark-shell-is-broken-bad-option-master-tp7778p7798.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
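In the meantime, a rough workaround is possible without any new CLI options: since spark-shell is a REPL reading from stdin, a Scala script can be piped into it. This is only a sketch; `myscript.scala` is a hypothetical example file, and the invocation is guarded so it is a no-op on machines without Spark on the PATH:

```shell
# Write a small Scala script that uses the SparkContext (sc) the shell provides.
# myscript.scala is a hypothetical example, not part of the Spark distribution.
cat > myscript.scala <<'EOF'
val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
println(s"even numbers: $evens")
EOF

# Feed the script on stdin; the REPL evaluates each line and exits at
# end-of-input. Guarded so the snippet does nothing where Spark is absent.
if command -v spark-shell >/dev/null 2>&1; then
  spark-shell --master 'local[2]' < myscript.scala
fi
```

This avoids compiling and packaging a jar, at the cost of REPL-style line-by-line evaluation rather than whole-file compilation.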