Subject: Re: YARN: LocalResources and file distribution
From: Krishna Kishore Bonagiri <write2kishore@gmail.com>
To: user@hadoop.apache.org
Date: Fri, 6 Dec 2013 12:10:32 +0530

Hi Arun,

  I have copied a shell script to HDFS and am trying to execute it on
containers. How do I specify my shell script's path in the setCommands()
call on ContainerLaunchContext? I am doing it this way:

      String shellScriptPath = "hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh";
      commands.add(shellScriptPath);

But my container execution is failing, saying there is no such file or
directory:
org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash:
hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh: No such file or directory
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)

I could see this file with the "hadoop fs" command, and I also saw messages
in the Node Manager's log saying that the resource was downloaded and
localized. So, how do I run the downloaded shell script on a container?

Thanks,
Kishore


On Tue, Dec 3, 2013 at 4:57 AM, Arun C Murthy <acm@hortonworks.com> wrote:

> Robert,
>
>  YARN, by default, will only download *resources* from a shared namespace
> (e.g. HDFS).
>
>  If /home/hadoop/robert/large_jar.jar is available on each node then you
> can specify the path as file:///home/hadoop/robert/large_jar.jar and it
> should work.
>
>  Else, you'll need to copy /home/hadoop/robert/large_jar.jar to HDFS and
> then specify hdfs://host:port/path/to/large_jar.jar.
>
> hth,
> Arun
>
> On Dec 1, 2013, at 12:03 PM, Robert Metzger <metrobert@gmail.com> wrote:
>
> Hello,
>
> I'm currently writing code to run my application using Yarn (Hadoop 2.2.0).
> I used this code as a skeleton:
> https://github.com/hortonworks/simple-yarn-app
>
> Everything works fine on my local machine or on a cluster with shared
> directories, but when I want to access resources outside of commonly
> accessible locations, my application fails.
>
> I have my application in a large jar file, containing everything
> (Submission Client, Application Master, and Workers).
> The submission client registers the large jar file as a local resource for
> the Application Master's context.
>
> In my understanding, Yarn takes care of transferring the client-local
> resources to the Application Master's container.
> This is also stated here:
> http://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/WritingYarnApplications.html
>
>> You can use the LocalResource to add resources to your application
>> request. This will cause YARN to distribute the resource to the
>> ApplicationMaster node.
>
> If I'm starting my jar from the dir "/home/hadoop/robert/large_jar.jar",
> I'll get the following error from the NodeManager (another node in the
> cluster):
>
>> 2013-12-01 20:13:00,810 INFO
>> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService:
>> Failed to download rsrc { { file:/home/hadoop/robert/large_jar.jar, ..
>
> So it seems as if this node tries to access the file from its local file
> system.
>
> Do I have to use another "protocol" for the file, something like
> "file://host:port/home/blabla" ?
>
> Is it true that Yarn is able to distribute files (not using HDFS,
> obviously)?
>
> The distributedshell example suggests that I have to use HDFS:
> https://github.com/apache/hadoop-common/blob/50f0de14e377091c308c3a74ed089a7e4a7f0bfe/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-distributedshell/src/main/java/org/apache/hadoop/yarn/applications/distributedshell/Client.java
>
> Sincerely,
> Robert
>
>
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
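
As a concrete illustration of what Arun describes above (and of what the
distributedshell Client does): the HDFS file is registered as a LocalResource
on the ContainerLaunchContext, and the launch command then invokes it by the
link name it is localized under in the container's working directory, not by
its hdfs:// URI. The following is only a minimal sketch against the Hadoop 2.2
YARN API; the resource key "list.ksh", the interpreter, and the paths are
reused from the thread purely for illustration.

import java.util.Collections;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.yarn.api.ApplicationConstants;
import org.apache.hadoop.yarn.api.records.ContainerLaunchContext;
import org.apache.hadoop.yarn.api.records.LocalResource;
import org.apache.hadoop.yarn.api.records.LocalResourceType;
import org.apache.hadoop.yarn.api.records.LocalResourceVisibility;
import org.apache.hadoop.yarn.util.ConverterUtils;
import org.apache.hadoop.yarn.util.Records;

public class ScriptLaunchSketch {

  // Builds a launch context that localizes an HDFS script into the
  // container's working directory and runs the localized copy.
  public static ContainerLaunchContext buildContext(Configuration conf) throws Exception {
    Path script = new Path("hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh");
    FileStatus stat = FileSystem.get(script.toUri(), conf).getFileStatus(script);

    // Describe the resource; size and timestamp must match the file in HDFS.
    LocalResource rsrc = Records.newRecord(LocalResource.class);
    rsrc.setResource(ConverterUtils.getYarnUrlFromPath(script));
    rsrc.setSize(stat.getLen());
    rsrc.setTimestamp(stat.getModificationTime());
    rsrc.setType(LocalResourceType.FILE);
    rsrc.setVisibility(LocalResourceVisibility.APPLICATION);

    ContainerLaunchContext ctx = Records.newRecord(ContainerLaunchContext.class);
    // The map key ("list.ksh") is the name the file appears under in the
    // container's working directory.
    ctx.setLocalResources(Collections.singletonMap("list.ksh", rsrc));

    // Invoke the localized link name, not the hdfs:// URI.
    ctx.setCommands(Collections.singletonList(
        "/bin/ksh ./list.ksh"
            + " 1>" + ApplicationConstants.LOG_DIR_EXPANSION_VAR + "/stdout"
            + " 2>" + ApplicationConstants.LOG_DIR_EXPANSION_VAR + "/stderr"));
    return ctx;
  }
}

With the resource registered this way, the NodeManager downloads list.ksh into
the container's working directory before the command runs, which is consistent
with the localization messages Kishore already sees in the Node Manager log.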
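
For the case Robert raises, where the large jar exists only on the client
machine, Arun's second option amounts to staging the jar into HDFS first and
then pointing the LocalResource at the HDFS copy so that every NodeManager can
fetch it. A rough sketch under the same assumptions; the staging directory and
file names here are only illustrative.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.yarn.api.records.LocalResource;
import org.apache.hadoop.yarn.api.records.LocalResourceType;
import org.apache.hadoop.yarn.api.records.LocalResourceVisibility;
import org.apache.hadoop.yarn.util.ConverterUtils;
import org.apache.hadoop.yarn.util.Records;

public class JarStagingSketch {

  // Copies a client-local jar into HDFS and returns a LocalResource pointing
  // at the HDFS copy, so every NodeManager can download it.
  public static LocalResource stageJar(Configuration conf, String localJar,
      String appId) throws Exception {
    FileSystem fs = FileSystem.get(conf);

    // Illustrative staging location under the submitter's HDFS home directory.
    Path dst = new Path(fs.getHomeDirectory(), appId + "/large_jar.jar");
    fs.copyFromLocalFile(new Path(localJar), dst);

    FileStatus stat = fs.getFileStatus(dst);
    LocalResource jar = Records.newRecord(LocalResource.class);
    jar.setResource(ConverterUtils.getYarnUrlFromPath(dst));
    jar.setSize(stat.getLen());
    jar.setTimestamp(stat.getModificationTime());
    jar.setType(LocalResourceType.FILE);
    jar.setVisibility(LocalResourceVisibility.APPLICATION);
    return jar;
  }
}

The key under which the resource is added to the ContainerLaunchContext's
local-resources map (for example "app.jar") is the file name that appears in
the container's working directory, so the Application Master's classpath or
launch command should refer to that name.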