Subject: Re: bad substitution for [hdp.version] Error in spark on YARN job
From: Jeetendra Gangele
To: user@spark.apache.org
Date: Wed, 9 Sep 2015 23:34:25 +0530

Finally it worked out. I solved it by modifying mapred-site.xml: from the YARN application master property I removed the HDP version entries.

On 9 September 2015 at 17:44, Jeetendra Gangele wrote:
> Hi,
> I am getting the below error when running a Spark job on YARN with an HDP
> cluster. I have installed Spark and YARN from Ambari, and I am using Spark
> 1.3.1 with HDP version HDP-2.3.0.0-2557.
>
> My spark-defaults.conf has the correct entries:
>
> spark.driver.extraJavaOptions -Dhdp.version=2.3.0.0-2557
> spark.yarn.am.extraJavaOptions -Dhdp.version=2.3.0.0-2557
>
> Can anybody from HDP reply on this? I am not sure why hdp.version is not
> getting passed even though it is set up correctly in the conf file. I
> tried passing the same to spark-submit with --conf
> "hdp.version=2.3.0.0-2557"; same issue, no luck.
>
> I am running my job with spark-submit from the spark-client machine.
>
> Exit code: 1
> Exception message:
> /hadoop/yarn/local/usercache/hdfs/appcache/application_1441798371988_0002/container_e08_1441798371988_0002_01_000005/launch_container.sh:
> line 22:
> $PWD:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
> bad substitution
>
> Stack trace: ExitCodeException exitCode=1:
> [same launch_container.sh line 22 classpath error as above]: bad substitution
>
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
> at org.apache.hadoop.util.Shell.run(Shell.java:456)
> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
> at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> Container exited with a non-zero exit code 1
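A sketch of the kind of mapred-site.xml change described in the reply above, assuming the HDP-style application master property is the one involved; the property name and value here are illustrative of HDP defaults, not a verbatim copy of this cluster's config:

```xml
<!-- Hedged sketch: HDP ships mapred-site.xml entries that reference the
     ${hdp.version} placeholder. Hard-coding the concrete version (or
     removing the placeholder) keeps the unexpanded ${hdp.version} from
     ever reaching launch_container.sh. -->
<property>
  <name>yarn.app.mapreduce.am.admin-command-opts</name>
  <!-- was: -Dhdp.version=${hdp.version} -->
  <value>-Dhdp.version=2.3.0.0-2557</value>
</property>
```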
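The "bad substitution" in the log above is a bash error, not a Spark one: a dot is not a legal character in a shell parameter name, so when the unexpanded `${hdp.version}` placeholder ends up in `launch_container.sh`, bash rejects the expansion outright. A minimal reproduction (assuming any ordinary bash):

```shell
# Reproduce the error from launch_container.sh: bash cannot expand
# ${hdp.version} because parameter names may not contain a dot, so it
# aborts with a "bad substitution" error instead of printing the path.
bash -c 'echo "/usr/hdp/${hdp.version}/hadoop/lib"' 2>&1 || true
```

Note also that `--conf "hdp.version=2.3.0.0-2557"` as tried above is not a Spark property: `--conf` keys outside the `spark.*` namespace are ignored, so the command-line equivalent of the spark-defaults entries would be `--conf spark.driver.extraJavaOptions=-Dhdp.version=2.3.0.0-2557` (and likewise for `spark.yarn.am.extraJavaOptions`).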